dgerard

joined 2 years ago
[–] [email protected] 7 points 1 week ago (1 children)

look, i didn't say liar,

[–] [email protected] 4 points 1 week ago (1 children)

Transformers do way better transcription, buuuuuut yeah you gotta check it

[–] [email protected] 13 points 1 week ago

even the Nazis can't stand these fucks

[–] [email protected] 16 points 2 weeks ago

because they cribbed their ideas from deathnote, and they're stupid

[–] [email protected] 11 points 2 weeks ago

oh yeah that was obvious when you see who they are and what they do. also, one of the large open source projects was the lesswrong site lololol

i'm surprised it's as well constructed a study as it is even given that

[–] [email protected] 4 points 2 weeks ago

wonder if that's why they tried and failed to change it to "squiggle maximizing"

[–] [email protected] 6 points 2 weeks ago

the comments are wild: "yes good post, having my brain taken over by a superintelligent autocomplete is a reasonable concern"

[–] [email protected] 12 points 2 weeks ago (2 children)

it should be, but he's been actively promoting the stuff

 

‘an expensive slot machine that outputs slop 98% of the time’

podcast and blog post tomorrow, i am ill

yes that's a clothes hanger at the back i forgot to put in the hall, we only reveal our clean laundry here

[–] [email protected] 5 points 2 weeks ago

all the stock sites are. use case: an image that's almost perfect but you wanna tweak it

LEARN PAINT YOU GHOULS

[–] [email protected] 4 points 3 weeks ago

you should read up on the gospel of @fasterandworse, who never shuts up about "Hooked"

[–] [email protected] 6 points 3 weeks ago (1 children)

the psychology of loot boxes?

yep! https://awful.systems/post/4568900

the book is "Hooked" and it's Don't Build The Torment Nexus I'm Now Providing You A Detailed Blueprint Of

[–] [email protected] 6 points 3 weeks ago

AI->cocaine filter: Cocaine isn’t going to replace you. Someone using cocaine is going to replace you.

 

this podcast guy talks like sephiroth but the title is perfect and the content is largely correct

10 points | submitted 3 weeks ago (last edited 3 weeks ago) by [email protected] to c/[email protected]
 

if you ever wonder how I write Pivot, it's a bit like this. The thing below is not a written text, it's a script for me to simulate spontaneity, so don't worry about the grammar or wording. But how are the ideas? And what have I missed?

(Imagine the text below with links to previous Pivots where I said a lotta this stuff.)


When some huge and stupid public AI disaster hits the news, AI pumpers will dive in to say stuff like “you have to admit, AI is here to stay.”

Well, no, I don’t. Not unless you say what you actually mean when you say that. Like, what is the claim you’re making? Herpes is here to stay too, but you probably wouldn’t brag about it.

We’re talking about the generative AI stuff here. Chatbots. Image slop generators. That sorta thing.

What they’re really saying is: give in. AI as it is right now, in the bubble, is a permanent force that will reshape society in its image, so we have to give in now and do what the AI pumpers say. You know that’s what they really mean.

We get stuff like this egregious example from the Washington State school system. It starts with “AI is here to stay”, then there’s a list of AI stuff to force on the kids, assuming all of it works forever just like the biggest hype in the bubble. And that’s not true! [OSPI SLIDE]

If you ask why AI’s here to stay, they'll just recite promotional talking points. So ask them some really pointy questions about details.

Remember that a lot of people are super convinced by one really impressive demo. We have computers you can just talk to naturally now and have a conversation! That’s legit amazing, actually! The whole field of natural language processing is 80% solved! The other 20% is where it’s a lying idiot and probably can’t be fixed. That’s a bit of a problem in practice. Generative AI is all like that: impressive demos with unfixable problems.

Sometimes they’ll claim chatbots are forever because machine learning works for X-ray scans. If they say that, they don’t know enough to make a coherent claim, and you’re wasting your time.

Grifters will try to use gotchas. Photoshop has AI in it, so you should let me post image slop! Office 365 has AI in it, so if you use Word you might as well be using AI! Spell check’s a kind of AI! These are all real examples. These guys are lying weasels and the correct answer is “go away”. At the least.

Are they saying the technology will surely get better because all technology improves? [WAVE HANDS] Will the hallucinating stop? Then they need evidence for that, cos it sure looks like the tech of generative AI is stuck at the top of its S-curve at 80% useful and hasn’t made any major breakthroughs in a couple of years. [o1 GRAPH] It’s an impressive demo, but the guy saying this will have to bring actual evidence it’s gonna make it to reliable product status. And we have no reason to think so.

Are they saying that OpenAI and its friends, all setting money on fire, will be around forever? Ha, no. That is not economically possible. Look through Ed Zitron’s numbers if you need a bludgeon. [ED Z SLIDE] [Ed Zitron]

These AI companies are machines for taking money from venture capitalists and setting it on fire. The chatbots are just the excuse to do that. The companies just are not sustainable businesses. Maybe after the collapse there’ll be a company that buys the name “OpenAI” and dances around wearing it like a skin.

Are they saying there’s a market for generative AI, so it’ll keep going when the bubble pops? Sure, maybe there’ll be a market. But I’ve been saying for a while now: the prices will be 5x or 10x what they are now if the thing has to pay its way as a business.
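To make that concrete, here’s a back-of-the-envelope sketch of where a multiple like that comes from. The revenue and cost figures are invented placeholders, not any company’s real financials; the point is just that the price multiple falls straight out of the cost-to-revenue ratio.

```python
# Back-of-the-envelope sketch: what price multiple does a chatbot business
# need if it has to cover its own costs? All figures below are invented
# placeholders for illustration, not real financials for any company.

def required_price_multiple(annual_cost: float, annual_revenue: float,
                            target_margin: float = 0.0) -> float:
    """Factor to multiply current prices by so revenue covers cost plus margin,
    naively assuming customers don't leave when prices go up (they would)."""
    return annual_cost * (1.0 + target_margin) / annual_revenue

# Hypothetical numbers: $2B of revenue against $10B of costs.
print(required_price_multiple(10e9, 2e9))        # 5.0x just to break even
print(required_price_multiple(10e9, 2e9, 0.25))  # 6.25x with a 25% margin
```

And that naive version is the generous one: raise prices 5x and a chunk of the customers leave, so the real multiple needed is higher still.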

Are they saying you can always run a local model at home? Sure, and about 0.0% of chatbot users do that. In 2025, the home models are painfully slow even on a high-end box. No normal people are going to do this.
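For what “run a local model at home” actually looks like, here’s a minimal sketch assuming the llama-cpp-python bindings and a GGUF model file you’ve already downloaded; the file name and settings are placeholders, not recommendations.

```python
# Minimal sketch of local inference, assuming the llama-cpp-python bindings
# (pip install llama-cpp-python) and a GGUF model file already on disk.
# The model path and parameters are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example-7b.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=4096,     # context window
    n_threads=8,    # CPU threads; this is the painfully slow part at home
)

result = llm("Summarise why local inference is slow on consumer hardware.",
             max_tokens=200)
print(result["choices"][0]["text"])
```

Which is exactly the problem: it’s a pip install, a multi-gigabyte download, and then you wait. Normal people will not do this.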

I’ve seen claims that the tools will still exist. I mean sure, the transformer architecture is actually useful for stuff. But mere existence isn’t much of a claim either.

Ya know, technologies linger forever. Crypto is still around serving the all-important “crime is legal” market, but it’s radioactive for normal people. If you search for “AI is here to stay” on Twitter, you’ll see the guys who still have Bored Ape NFT avatars. Generative AI has a good chance of becoming as radioactive to the general public as crypto is. They’ll have to start calling the stuff that works “machine learning” again.

So. If someone says "AI is here to stay," nail them down on what the heck the precise claim they're making actually is. Details. Numbers. What do you mean by "being here"? What would failure mean? Get them to make their claim properly.

I mean, they won’t answer. They never answer. They never have a claim they can back up. They were just saying promotional mouth noises.

Now, I’ll make a prediction for you, to give you an example. When, not if, the venture capitalists and their money pipeline go home and the chatbot prices multiply by ten, the market will collapse. There will be some small providers left. It will technically be not dead yet!! But the bubble will be extremely over. The number of people running an LLM at home will still be negligible.

It’s possible there will be something left after the bubble pops. AI boosters like saying it’s JUST LIKE the dot-com bubble!!! But I haven’t really been convinced by the argument “Amazon lost money for years, so if OpenAI just sets money on fire then it must be Amazon.”

Will inference costs (80-90% of the compute load) come down? Sure, they’ll come down eventually. Will it be soon enough? Well, Nvidia’s Blackwell hasn’t been a good chip generation, so they’re putting out more of their old-generation chips while they try to get Blackwell production up. So it won’t be very soon.

So there you go. If you wanna say “but AI is here to stay!” tell us what you mean in detail. Stick your neck out. Give your reasons.
