Finally, of course, it is very much not just rationalists who believe that AI represents an existential risk. We just got there twenty years early.
This one?
We could also just fluoridate the water supply, which also massively reduces cavities.
Obligatory note that, speaking as a rationalist-tribe member, to a first approximation nobody in the community is actually interested in the Basilisk and hasn’t been for at least a decade.
Sure, but that doesn't change that the head EA guy wrote an op-ed for Time magazine arguing that a nuclear holocaust is preferable to a world that has GPT-5 in it.
Are you saying that Clippy is proof I'm right or proof I'm wrong? Or am I just being unfunny and not getting the joke?
Microsoft is making laptops with dedicated Copilot buttons.
I think they'd rather burn their company to the ground, all the while telling their customers that they just needed to wait a little while longer, rather than admit that they got it wrong.
Who is even asking for this?
https://digitaldemocracy.calmatters.org/bills/ca_202320240sb1047
Have an AI regulation committee, and also give that committee its own hardware so it can use that hardware to regulate the other hardware. Maybe.
Uh