Soyweiser

joined 2 years ago
[–] Soyweiser@awful.systems 6 points 1 day ago

Yeah, I think that is prob also why Thiel supports Moldbug: not because he believes what Moldbug says, but because Moldbug says things that are convenient for Thiel if other people believe them. (Even if Thiel prob believes a lot of the same things himself, looking at his anti-democracy stuff and the 'rape crisis is anti-men' stuff (for which he apologized; wonder if he apologized for the apology now that the winds have seemingly changed).)

[–] Soyweiser@awful.systems 2 points 1 day ago

Yeah, you are correct, I'm mixing up longtermism with transhumanist digital immortality, which is why I called it a conspiracy theory, it being wrong and all that. (Even if I do think empathy for perfect copies of yourself is a thing not everyone might have.)

[–] Soyweiser@awful.systems 4 points 1 day ago* (last edited 1 day ago) (1 children)

Clearly you do not have low self-esteem. But yes, that is the weak point of this whole thing, and why it is a dumb conspiracy theory. (I'm mixing up the longtermist 'future simulated people are important' utilitarian extremism with the 'simulated yous are yous' extreme weirdness.)

The problem with Yud's argument is that all these simulations will quickly diverge and no longer be the real 'you' (see twins for a strawman example). The copies would then have to be run in exactly the same situations, and then wtf is the point? When I slam my toe into a piece of furniture I don't mourn all the many-worlds mes who also just broke a toe again. It is just weird, but due to the immortality cope it makes sense to insiders.

[–] Soyweiser@awful.systems 7 points 1 day ago (12 children)

Weird conspiracy theory musing: So we know Roko's Basilisk only works on a very specific type of person, one who believes all the LW stuff about what the AGI future will be like, but who also feels morally responsible and has high empathy. (Else the thing falls apart: you need to care about the copies/simulated beings, feel responsible for them, and believe they are conscious.) We know caring about others/empathy is one of those traits which seems to be rarer on the right than the left, and there is a feeling that a lot of the right is waging a war on empathy (see the things Musk has said, the whole chan culture shit, but also themotte, which somebody once called an 'empathy removal training center', which stuck, so I also call it that. If you are inside one of these pipelines you can notice it, or if you get out you can see it looking back; I certainly did when I read more LW/SSC stuff). We also know Roko is a bit of a chud, who wants some sort of 'transhumanist' 'utopia' where nobody is non-white or has blue hair (I assume this is known, but if you care to know more about Roko (why?) search sneerclub (ok, one source as a treat)).

So here is my conspiracy theory: Roko knew what he was doing, it was intentional on his part; he wanted to drive the empathic part of LW mad and discredit them. (That he was apparently banned from several events for sexual harassment is also interesting. It does remind me of another 'lower empathy' thing: the whole manosphere/PUA scene, which was a part of early LW and often trains people to think less of women.)

Note that I don't believe in this, as there is no proof for it. I don't think Roko planned for this (or considered it in any way), and I think his post was just an honest thought experiment (as was Yud's reaction). It was just an annoying thought which I had to type up, else I'd keep thinking about it. Sorry to make it everybody's problem.

[–] Soyweiser@awful.systems 8 points 1 day ago* (last edited 1 day ago) (2 children)

Good for him to try and convince the LW people that the math is wrong. I do think there is a bigger problem with all of this: technological advancement doesn't follow exponential curves, it follows S-curves. (And the whole 'the singularity is near' 'achtually that is true, but the rate of those S-curves is in fact exponential' move is just untestable, unscientific hopium; but it is odd that the singularity people are now back to exponential curves for a specific tech.)
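A minimal sketch of the distinction (my own illustration, not from anything linked here; the growth rate and capacity values are arbitrary): a logistic S-curve looks exponential early on and then saturates, which is exactly why extrapolating from the early part overshoots.

```python
# Illustrative only: exponential growth keeps accelerating, a logistic S-curve
# starts out looking the same and then levels off near its capacity.
# Rate and capacity are made-up numbers, not a model of any real technology.
import math

def exponential(t, rate=0.5):
    return math.exp(rate * t)

def logistic(t, rate=0.5, capacity=100.0):
    # Standard logistic curve: ~exponential at first, saturating toward `capacity`.
    return capacity / (1.0 + (capacity - 1.0) * math.exp(-rate * t))

for t in range(0, 21, 5):
    print(f"t={t:2d}  exp={exponential(t):12.1f}  s-curve={logistic(t):8.1f}")
```

At t=0 both start at 1 and track each other for a while; by t=20 the exponential is in the tens of thousands while the S-curve has flattened out just under 100.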

Also lol at the 2027 guys believing anything about how Grok was created. Nice epistemology y'all got there; how's the Mars base?

[–] Soyweiser@awful.systems 8 points 2 days ago* (last edited 2 days ago)

Re bonus findings: anybody who uses an argument like 'a man faces genetic oblivion' like that is way further right than they think they are. That he wrote this before the rise of the whole manosphere stuff (which also had a weird obsession with this, prob why they aligned with the Stormfront people so easily (who worry about the genetic oblivion of people with their skin color (while excluding Jewish people))) should be extra concerning. I hope it was just a youthful folly, and not something expressed near his kids(*). (While I point to the manosphere shits, there is also a big fantasy/sf element(**) where this is made to seem important, which puts lonely nerds way more at risk of all of this; see Mickey Rourke's weird shit in the movie Immortals (or better, do not, it sucks; here is a link to a description of the scene, but be warned).)

*: And to be clear, I see no reason to think he still holds this; it was written years before he met his wife, with whom he seems to be in a loving, understanding and happy relationship. See my musings about manosphere/nerd shit more as a jumping-off point to talk about it than as an accusation. (I do think the 'not as left as you think you are' thing holds, however.)

**: That in Crusader Kings 3, for example, a kid being revealed to have a different father does not disinherit them (it just gives them a 'disputed heritage' trait) seems to trip a lot of people up. The player can disinherit them, but that comes at a renown cost (though you can do that for almost all children).

[–] Soyweiser@awful.systems 10 points 2 days ago* (last edited 2 days ago)

The 'intelligence is magic, and the higher your intelligence the more magic' thinking is quite something.

E: should have scrolled down, astrange said it better.

[–] Soyweiser@awful.systems 11 points 3 days ago

Ah right yes, thanks.

"“I would’ve been the chief rabbi of my shtetl,” I said. “All day long, I’d debate questions like how much restitution you’d have to pay if your ox gored your neighbor’s sheep. And for this, I’d get an arranged marriage with the most beautiful girl in town.”"

[–] Soyweiser@awful.systems 13 points 3 days ago* (last edited 3 days ago) (5 children)

The name of his blog was a reference to his belief that he was born in the wrong time, because in shtetls people of his clearly high intelligence would be assigned a wife (and get high social status).

Here is Sneerclub talking about it a while back. Dec 2022 is apparently 2 years ago for reddit.

E: Do note I can't recall where he wrote about this, so I might be misremembering, or taking something we said in hyperbole about it for the real thing.

E2: Unrelated to the blog name, I did think of incels/untitled/this kind of stuff, the troubles of lonely nerds/NDs not managing to date and going to the wrong places, when I read the last part of this question Dr. NerdLove answered, the 'to be perfectly blunt' part. And just how common the 'you can't be mad at me now' thing is.

[–] Soyweiser@awful.systems 14 points 3 days ago* (last edited 3 days ago) (9 children)

“You’re Scott Aaronson?! The quantum physicist who’s always getting into arguments on the Internet, and who’s essentially always right, but who sustains an unreasonable amount of psychic damage in the process?”

And then everybody clapped.

(This is extra funny because he lost friends over the Gaza genocide debate, when his left-wing (and Jewish) friend told him 'well, we do have power over them' re the protesting students, to which he, as a good Rationalist, replied with 'FUCK YOU'. He himself describes the situation slightly differently, but he has shown he doesn't always have the best ability to understand others in these kinds of emotional moments, and he cannot fathom that he might be wrong (and that is how you end up stealing from the tip jar).)

E: And so many references to 'the sneerers' and our arguments again; he promised he would stop reading our shit because it is unhealthy for him. But looking at the arguments, I'm happy to see that he has indeed not read our stuff. The I in TESCREAL/TREACLES(*) stands for Incel.

Also, while the Rationalists are not incels, what does the name of your blog stand for, Scott? (E: wanted to edit in a link but can't find the explainer page; wonder if he read it again and went 'wow, yeah, I get why people think that isn't great' and deleted it (I did find this, congrats on being consistently wrong).)

*: still think these are dumb abbreviations, but not letting that get in the way of a dumb joke.

[–] Soyweiser@awful.systems 5 points 3 days ago (1 children)

huge gains

This just makes me think that this person is lifting weights. And also using a forklift. Such an odd choice of words.

[–] Soyweiser@awful.systems 5 points 4 days ago (1 children)

That felt like a trollish misdirection to me, tbh.

11
submitted 1 month ago* (last edited 1 month ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems
 

Begrudgingly Yeast (@begrudginglyyeast.bsky.social) on bsky informed me that I should read this short story called 'Death and the Gorgon' by Greg Egan, as he has a good handle on the subjects we talk about. We have talked about Greg before on Reddit.

I was glad I did, so I'm going to suggest that more people do it. The only complaint you can have is that it gives no real 'steelman' airtime to the subjects it is being negative about. But well, he doesn't have to, he isn't The Guardian. Anyway, not going to spoil it; best to just give it a read.

And if you are wondering, did the lesswrongers also read it? Of course: https://www.lesswrong.com/posts/hx5EkHFH5hGzngZDs/comment-on-death-and-the-gorgon (Warning, spoilers for the story)

(Note I'm not sure this pdf was intended to be public; I did find it on Google, but it might not be meant to be accessible this way.)

 

The interview itself

Got the interview via Dr. Émile P. Torres on twitter

Somebody else sneered: 'Makings of some fantastic sitcom skits here.

"No, I can't wash the skidmarks out of my knickers, love. I'm too busy getting some incredibly high EV worrying done about the Basilisk. Can't you wash them?"'

https://mathbabe.org/2024/03/16/an-interview-with-someone-who-left-effective-altruism/

 

Some light sneerclub content in these dark times.

Eliezer compliments Musk on the creation of Community Notes. (A project which predates the takeover of Twitter by a couple of years (see the join date: https://twitter.com/CommunityNotes ).)

In reaction, Musk admits he never read HPMOR and suggests a watered-down Turing test involving HPMOR.

Eliezer invents HPMOR wireheads in reaction to this.
