swlabr

joined 2 years ago
[–] [email protected] 8 points 1 month ago (1 children)

Yeah exactly. Loving the dude's mental gymnastics to avoid the simplest answer and instead spin it into moralising about promptfondling more good

[–] [email protected] 2 points 1 month ago

OpenAI should be opening Thanatoria any day now, with Sora^TM^ to generate visual^[Soon with audio! ] content to comfort you on your hemlock shuffle off the mortal ~~buffalo~~ coil.

[–] [email protected] 6 points 1 month ago

Swatting as a service

[–] [email protected] 9 points 1 month ago

Sorry, as mentioned elsewhere in the thread I can’t open links. Looks like froztbyte explained it though, thanks!

[–] [email protected] 5 points 1 month ago (1 children)

Actually I’m finding this quite useful. Do you mind posting more of the article? I can’t open links on my phone for some reason

[–] [email protected] 8 points 1 month ago (4 children)

what is this “alignment” you speak of? I’ve never heard of this before

[–] [email protected] 5 points 1 month ago

video events:

Ah you see, this is proof that FSD is actually AGI. Elon told the FSD that it needs to maximise tesla profits. The FSD accessed a camera pointing at a tesla earnings report and realised that it could increase the value of tesla’s carbon credit scheming by taking out trees, hence the events of the video

[–] [email protected] 9 points 1 month ago (4 children)

In the current chapter of “I go looking on linkedin for sneer-bait and not jobs, oh hey literally the first thing I see is a pile of shit”

text in image:

Can ChatGPT pick every 3rd letter in "umbrella"?

You'd expect "b" and "l". Easy, right?

Nope. It will get it wrong.

Why? Because it doesn't see letters the way we do.

We see:

u-m-b-r-e-l-l-a

ChatGPT sees something like:

"umb" | "rell" | "a"

These are tokens — chunks of text that aren't always full words or letters.

So when you ask for "every 3rd letter," it has to decode the prompt, map it to tokens, simulate how you might count, and then guess what you really meant.

Spoiler: if it's not given a chance to decode tokens into individual letters as a separate step, it will stumble.
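The quoted post's point can be sketched in a few lines of Python. Note the token split `"umb" | "rell" | "a"` is the post's illustrative example, not output from any real tokenizer:

```python
word = "umbrella"

# Character view: every 3rd letter (the 3rd, 6th, ...) is plain slicing.
every_third = word[2::3]
print(every_third)  # -> "bl"

# Token view (the post's hypothetical split): letter positions are
# hidden inside chunks, so counting letters requires an explicit
# decode step that joins the tokens back into characters first.
tokens = ["umb", "rell", "a"]
decoded = "".join(tokens)
assert decoded == word
assert decoded[2::3] == every_third
```

The slice does in one step what the post says the model must do implicitly: flatten chunks back into a character sequence before counting.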

Why does this matter?

Because the better we understand how LLMs think, the better results we'll get.

[–] [email protected] 24 points 1 month ago (1 children)

MFs are boiling the oceans to reinvent cold reading

[–] [email protected] 7 points 1 month ago* (last edited 1 month ago)

A real modest ~~brunch~~ bunch

[–] [email protected] 14 points 1 month ago (4 children)

Just thinking about how I watched “Soylent Green” in high school and thought the idea of a future where technology just doesn’t work anymore was impossible. Then LLMs come and the first thing people want to do with them is to turn working code into garbage, and then the immediate next thing is to kill living knowledge by normalising people relying on LLMs for operational knowledge. Soon, the oceans will boil, agricultural industries will collapse and we’ll be forced to eat recycled human. How the fuck did they get it so right?

[–] [email protected] 5 points 1 month ago* (last edited 1 month ago)

If I had my druthers I’d make my own hosting and call it “UnaGit”, and pretend it’s unagi/eel themed, when it is actually teddy K themed
