
To be fair, the “how to deal with your many cognitive biases” part is not what the rationalist movement is generally hated for.


> what the rationalist movement is generally hated for

Out of genuine curiosity, what causes hatred/resentment towards this community?


They are accused of being basically a doomsday cult for intellectuals with an extreme fanatical focus on AI doomerism.

They also have a huge number of really unusual social norms: using their own ultra-nerdy lingo, full of obscure fiction references, in regular life; widespread polyamory, often paired with vocal disapproval of monogamy; outspoken rejection of sexual norms, including group sex parties and raising kids communally in polyamorous group homes; rejection of all political correctness; and a willingness to discuss normally taboo topics in casual conversation.

In a lot of ways it reminds me of the beatniks: they're basically throwing out all of the existing culture and trying to create something entirely new by trial and error, sometimes with quite bad results.

I've learned an awful lot of good ideas from the community that I've applied directly in my career as a scientist, and I've attended a few events in person, but I personally wasn't able to connect with the people; I always felt like an outsider, and I found their blatant rejection and reinvention of virtually all social norms somewhat disturbing in person. There are also a lot of really kind, open-minded, and brilliant people in the community, and I personally think most of their concerns over AI are well founded, but not everyone agrees.


It sounds like your primary critique here is that "rationalist" communities overlap in membership with other communities that have other, not strictly related, inclinations. That may be so, but I'm not sure it's relevant -- if a bowling league's membership consists primarily of Mormons, I still wouldn't interpret criticism of Mormon theology as being relevant to discussions of bowling.


I wouldn't characterize the rationalists as just a loose-knit online community with a common interest in rational thinking that happens to overlap with some other unusual interests, but as a real-life community and culture, centered around a particular group of people, mostly in the Northern California "East Bay Area," with a very unusual lifestyle and social norms they've collectively invented within the movement - one that includes everything I mentioned as central aspects. It's a broad social experiment in trying to reinvent everything "rationally" instead of just doing what their culture or parents taught them.

See for example: https://putanumonit.com/2019/10/16/polyamory-is-rational/ "The Rationalist community isn’t just a sex cult, they do other great things too!"

I find that post hilarious, because polling your friends and running statistics on the results is even more stereotypically rationalist than polyamory itself, but they conclude from the poll data that most of the rationalists came to polyamory from within the movement itself, not from an existing or outside interest in it.

There is a larger international group of people that participate remotely and don't relocate or adopt the full lifestyle, but it would be a mistake to think of that as something that exists entirely separately, or would exist at all without that core community.


I'm confused then -- if you aren't construing the larger community of people following these ideas and participating remotely as being separate from the "core" group, then how do the more unusual lifestyles that only the "core" group follow describe the entirety of it?

The way you're describing it seems similar to looking at the lifestyles of monastic orders within the Catholic church as indicative of the way Catholics live generally.


I was actually thinking of the exact same analogy - a monastic order and lay people with varying levels of commitment - but didn't put it in my reply because I couldn't think of a clear way to avoid overusing the analogy.

Nobody would say the Catholics are a group of lay religious people who also happen, for some reason, to overlap in membership with another unrelated group that enjoys monastic lifestyles. The monastic lifestyle is a central, key part of the religion, even if it isn't what every Catholic chooses to do, and it doesn't describe the entirety of the religion either. Both the core group that follows the full lifestyle together in person and the more distant or less involved participants together make up the same movement - for both the Catholics and the Rationalists.

Importantly, when one criticizes the actions of Catholic monastics, it is considered relevant as criticism of the entire organization and religion, unlike in the bowling example you gave. People do rightfully blame the Catholics for things like the Spanish Inquisition, and for protecting child abusers and rapists in their monastic communities, even if the average lay person had no involvement in these beyond supporting the religion financially and socially.

One could be a Mormon and fundamentally disapprove of bowling, even if a lot of other Mormons do it, but you probably aren't going to make it as a Catholic if you think monastic lifestyles are immoral or harmful. You probably won't make it as a rationalist either if you think things like utilitarian ethics and nonmonogamy are immoral or harmful.


You've made a good argument here -- I'll have to consider it further.

I suppose I'm trying to separate the "rationalist" ideas, interpreted as a methodology of reasoning, from the normative positions that some communities advancing those methods have converged upon, even where the application of that reasoning methodology might have been involved in forming those other positions.

I do think that devotion to AI eschatology, nonmonogamy, and utilitarianism does not necessarily proceed merely from rational inquiry, and requires additional normative or empirical precepts as inputs, many of which may have circulated in those communities in parallel with the discourse on reasoning. So that's sort of what the Mormon/bowling analogy was getting at.


"The rationalists" don't own rationality. I don't think the specific community of people I'm talking about that call themselves rationalists have a monopoly on actually teaching practical rational thinking, although they do have some very good materials that explain a lot of valuable ideas and concepts, which I am grateful for.

Their own philosophy claims that "rationality is systematized winning," and yet everyone I've known who decided to focus their life around any of the three things you mentioned above had consequences that were close to the exact opposite of "systematized winning."


It's worth noting that basically every major founding member of the "Rationalist" community was, in fact, part of these other communities. While I don't normally consider criticism of Mormon theology relevant to bowling, it does seem relevant to critiquing the Mormon Bowling League of Utah.

I personally find all those norms refreshing, mind you. I'm just saying this is a place where they're really intractably interwoven. I'd assume if bowling was invented by Mormons, there'd be a lot of people thrown off when the word "Jesus" shows up in the section on evaluating strikes. Similarly, many people are thrown off when reading about statistics and it suddenly concludes God is dead and you should be polyamorous.


Mostly it’s way too full of itself. “Here’s how to think to be less wrong” (to borrow the name of one of the main sites) gives way to “since we know how to think, we’re smarter than everyone else.” Techniques like Bayesian inference get used to put a mathematical veneer on total guesswork or rationalize what the person wants to do anyway.

Take longtermism, for example. This is a segment of the rationalist community that focuses on doing the most good for humanity in the very long term. The basic idea goes: if humanity is able to get off this planet and go colonize the galaxy, there are untold quintillions of additional lives that would be lived. But that future is uncertain. Something that increases the chances of it happening by 0.1% would have an expected value of saving quadrillions of lives. If you can increase these chances by one in a trillion, that’s worth orders of magnitude more than saving a child’s life right now.
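To put rough numbers on that (these are just the illustrative figures from the paragraph above, not real estimates), the expected-value arithmetic looks something like this:

    # Illustrative only: every input here is a made-up number from the argument above.
    future_lives = 10**18             # "untold quintillions" of potential future lives
    p_increase = 0.001                # a 0.1% increase in the odds that future happens
    print(future_lives * p_increase)  # 1e15, i.e. quadrillions of expected lives

    p_tiny = 1e-12                    # a one-in-a-trillion improvement
    print(future_lives * p_tiny)      # 1e6, still dwarfing one life saved today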

This is sound thinking so far, as a fun little thought experiment. The problem is that you can’t rigorously apply it in practice. Predicting the future of humanity is hard, and the probabilities assigned to various events aren’t rigorous. In practice, this mindset leads either to fairly obvious conclusions, like that it’s important to fight climate change, or to off-the-wall stuff like being obsessed with AI safety. And the veneer of math produces an attitude that anyone who disagrees is not only wrong, but provably wrong in a mathematical fashion, which doesn’t tend to endear them to others.


The rationalist idea of doing morality as math, via utilitarian consequentialism, always seemed dangerous and like a big mistake to me. It makes it easy to rationalize things that are obviously awful or absurd by common sense, and not meaningfully consistent with normal human experience or with human brains and motivations. SBF, for example, justified all of his crimes with rationalist logic.

I'm not going to walk past a drowning kid in a lake so I can urgently go to a nerd meeting planning to save a quintillion imaginary sci-fi distant future kids - even if some made up math says the expected value of the meeting is a thousand times higher.

Fundamentally, I do have deontological ethics - I think the ancient Stoics basically had morality/ethics right, and I admire people who take a Socrates-like stand on doing what is right on principle, even in the face of manipulative people trying to control them by creating bad consequences.


It’s not just dangerous, but plainly incorrect in most cases.

It’s the usual GIGO (garbage in, garbage out) problem. These arguments almost always start with a bunch of completely made-up numbers. It doesn’t matter how good the math is; the results will be useless.

It can work. When a government regulator decides whether to mandate some new safety equipment, and after rigorous technical analysis concludes that it would result in net lives lost and so doesn’t require it, that’s sensible. But that’s not what happens here.

I occasionally see this problem acknowledged, but even then, the given error bars are way too small and then it’s just full steam ahead anyway.

It could be dangerous anyway, but this makes it even more so.


Yeah, I think it is literally provably 'optimal' if you can execute it correctly with informative data, don't forget or omit any important considerations, and aren't just making up BS - all of which are almost always impossible for regular humans in real life, no matter how much 'rationality training' they've had. It makes sense both for a hypothetical superintelligent AI realizing its own goals efficiently, and for something like a government weighing the pros and cons of a difficult regulatory choice with well-defined short-term consequences - neither of which is anything like the everyday moral decisions humans make.


I take an even more sour view of this thought process. I don't actually think that SBF did the math and concluded that rationality justified his crimes. I think that he wanted to do those crimes and then, consciously or unconsciously, spread the veneer of rationality over them as a form of self-justification.

I think that a community that engages in brute math with unbounded values for priors to justify action would be worrisome. By choosing the right priors you can conclude almost anything. But I actually think that it is just roughly the same decision making that the rest of the world makes, with an unusual post-facto justification that also feeds one's ego.


It seems like that to me as well - the whole thing can be a manipulative way to make what you wanted to do anyway seem somehow objectively correct. That's basically the postmodernist criticism of any attempt to use logic or science for anything, and in some cases it's valid.


I imagine that the leader of the Singularity Institute (now Center for Applied Rationality) using the workshops as a recruitment ground for his personal psychedelic drug cult, and his followers then killing a bunch of innocent people, did not do any favors to their perception among the wider public [1].

(This is about the SF Bay Area subculture dubbed the rationalists, obviously not related to the philosophical movement of the same name, critiqued by Kant in the late 18th century)

[1] https://medium.com/@sefashapiro/a-community-warning-about-zi...


This is incredible reading. It's hard to believe, but I believe it.



