370 Comments
Satisficer:

This feels like BARPod Christmas!

Edit: Having listened to the episode, I feel like Trace is downplaying just how batshit "mainstream" rationalists like Eliezer Yudkowsky, LessWrong, and even early-period Scott Alexander were. I know he's sympathetic to them, but from my perspective as someone who encountered the rationalists about a decade ago and bounced off hard, they seemed to have an obsession with tossing out all previous knowledge and reinventing epistemology from scratch, which is of course a good way to get yourself stuck in mental cul-de-sacs. Yudkowsky in particular displayed definite cult-leader tendencies, claiming that most of his followers were incapable of understanding his esoteric knowledge and asking them for money because he was the only person capable of averting the AI apocalypse. It's also worth noting that when these guys talk about "AI safety" and especially "alignment," that's a totally different field from traditional AI research. The rationalists made up this field themselves, and imo it should be classified as a branch of philosophy rather than anything to do with computer science.

Re: Roko's Basilisk - For the people who take it seriously, the reason they believe the AI can reach back in time to torture you, and the reason the torture is eternal, is that they believe a perfect simulation of you *is* you. The way I understand it, there's a constant veil-of-ignorance situation going on: if there are multiple exact copies of you in existence, your experience could reside in any one of them. So if you know about the Basilisk but don't dedicate your life to bringing it into existence, once it does appear it will use its superintelligence to make copies of you that it will torture forever. Trace is right that most rationalists don't take it seriously, but it does continue to break new rationalists' brains on occasion.

dollarsandsense:

Yes, I’m annoyed that any of these ideas are treated seriously. Most of it is dehumanizing bullshit clothed in fake logic.

Penguin/Mom:

Absolutely. I hate this sociopathic tech bullshit.

Martin Blank:

I don't think it is fake logic so much as ever so slightly over-confident logic, which is honestly even more dangerous.

Martin Blank:

My big problem with them is something like the following:

EY and his acolytes take a bunch of correct suppositions but stack them in epistemologically irresponsible ways.

They use logic, reason, and well-tested thinking about rationalism and decision theory to come up with premises, each of which might individually be, say, 96% likely to be true. But then their philosophy builds a giant contingent Jenga tower out of these that does not itself come close to that level of epistemic likelihood/rigor.

Say your philosophy has premises 1-10, each of which has about a 96% chance of being true depending on how you feel about various contentious issues in epistemology, and let's stack on top of that 20 more assertions/supposed deductions, each of which also has a 96% chance of being true...

Your synthetic conclusion is not something that is 96% likely to be true, but something that is only about 30% likely.

Yet they treat that conclusion like it is 100% true. Each step of the argument is extremely plausible, but extremely plausible steps can lead you somewhere very implausible when chained together.
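
To put rough numbers on that (a minimal sketch, assuming the thirty claims are independent and each 96% likely):

    # Confidence in a conclusion that depends on every one of n
    # independent claims, each true with probability p.
    def chained_confidence(p: float, n: int) -> float:
        return p ** n

    print(round(chained_confidence(0.96, 10), 2))  # 10 premises -> 0.66
    print(round(chained_confidence(0.96, 30), 2))  # 30 claims   -> 0.29

Confidence decays geometrically with each added link, which is the whole problem.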

fillups44:

As was said in Love & Death, “To love is to suffer. To avoid suffering one must not love. But then one suffers from not loving. Therefore, to love is to suffer; not to love is to suffer; to suffer is to suffer. To be happy is to love. To be happy, then, is to suffer, but suffering makes one unhappy. Therefore, to be happy one must love or love to suffer or suffer from too much happiness.”

That’s exactly the way I feel when I read a lot of these arguments, except the conclusions are even more absurd. I’m shocked when people who seem smart end up getting their heads messed up by these bizarre thought experiments.

tropical depression:

the roko’s basilisk thing feels to me like the setup for an elaborate Aristocrats-style joke...

There’s a group of people who believe in a future supercomputer that doesn’t yet exist, but which can read your mind from the future, has hurt feelings because you aren’t supporting it, and thus will torture you forever unless you do what it wants, and they believe this because they made it up in a blog comment... and the name of this group?

The Rationalists.

Pam Param:

Jesse at one point says you can’t call all rationalists dumb because they were ahead of the curve on AI and crypto, and, like... crypto is a Ponzi scheme with no legal real-world utility. AI is real (but probably overvalued), but rationalists get absolutely no credit for it: none of them wrote code or developed machine-learning tech, that all came from mainstream academia and industry, and ‘AI alignment’ in practice has meant making LLMs sound like agreeable woke HR managers, not anything Yudkowsky et al. produced. They’ve sat on the sidelines building up their own brand and are trying to stamp it on stuff they had nothing to do with.

Penguin/Mom:

Yeah, it all sounds a bit grifty and self-aggrandizing.

Jane:

I like a lot of Scott Alexander's work and can understand why someone like Trace would be a fan, too. There are a lot of different Rationalists, and I don't think the existence of one odd Rationalist cult need vitiate the entire movement.

Yudkowsky always seemed weird and speculative enough to me that I didn't bother with him, so I will neither defend nor attack him. Thanks for your take.

Satisficer:

To be clear, I like a lot of Scott Alexander's stuff too. Fortunately he's mellowed and developed a sense of humor around a number of topics he used to be genuinely angry about.

I also think the merger with effective altruism has been pretty healthy for the movement (for rationalism, I mean; not sure about the effect on EA). Scott seems more aligned with EA than rationalism these days, which is something I can respect a lot more.

PNWGirl:

I'm curious: why is Trace sympathetic to the rationalists?

Satisficer:

He said in the episode that he has a lot of friends in the rationalist community and maybe considers himself rationalist-adjacent, which is not at all surprising to me considering the era of the internet he came up in (more or less the same one as me, though I was way less online than Trace was lol). In the early 2010s a whole lot of people online who considered themselves smart freethinkers and had the attention span to read through massive walls of text ended up drawn to the rationalists.

ZEK:

Yudkowsky was the Harry Potter fanfiction guy, right? Strangely enough, he wasn’t even the first fanfic writer in that fandom to form a cult.

MindTheEels:

wait, please elaborate

ennui:

EY wrote a 1,500-page Harry Potter fan fiction, available here: https://hpmor.com/

It starts off with a kind of promise of unveiling EY's rational philosophy within the confines of that universe, but do not be fooled; like any fan fiction, it's a self-serving adventure fantasy: "what if autistic nerd geniuses like me went to Hogwarts." To be fair, though, it's decently written for a fanfic.

MindTheEels:

661,000 words!!!!! oh my god. as a fandom gal myself i almost respect it — or at least feel much more normal by comparison.

jquinby:

"Atlas Snored"

Penguin/Mom:

What you say about this community being a little borderline from the start confirms what I was able to gather from the episode, other comments here, and a few internet searches on the topic. Something strikes me about many of these intellectual endeavors and the prominent figures of the movement: how detached from basic human decency, even sociopathic, they appear. And especially how all of this tech/nerdy shit is almost systematically presented as unavoidable. What the hell does that mean, other than a total depoliticization of these ideas as a means to impose them upon us? This is both economically and institutionally supported propaganda, a self-fulfilling prophecy.

Kathleen:

Doesn’t Yudkowsky claim that infanticide is morally defensible, or am I thinking of some other nutter?

dollarsandsense:

Peter Singer makes that argument.

It's Complicated:

Maybe you are thinking of Peter Singer?

Kathleen:

Yes, I think you’re right.

Satisficer:

He wrote a little sci-fi story featuring (unsympathetic) aliens that practiced infanticide, but that's the only thing I remember linking him to the topic.

Edit: Oh, but I think in that story or a different one he suggested that rape was morally defensible? I could be confused, though; it was a long time ago.

Skull:

Yeah, but how many of them over there take Roko's Basilisk seriously? This reads like you're only commenting on their weirdest takes, which would make any group look bad. One of their most recent discussions was about whether classical architecture looks as good as modern architecture. What weirdos and freaks!
