I grew up with some friends who were deep into the early roots of online rationalism, even slightly before LessWrong came online. I've been around long enough to recognize the rhetorical devices used in rationalist writings:
> Aren't these the people who started the trend of writing things like "epistemic status: mostly speculation" on their blog posts? And writing essays about the dangers of overconfidence? And measuring how often their predictions turn out wrong? And maintaining webpages titled "list of things I was wrong about"?
There's a lot of in-group signaling in rationalist circles, like the "epistemic status" taglines, posting predictions, and putting your humility on show.
This has come full-circle, though, and now rationalist writings are generally pre-baked with hedging, both-sides takes, escape hatches, and other writing tricks that make it easier to claim they weren't entirely wrong in the future.
A perfect example is the recent "AI 2027" doomsday scenario, which predicts a rapid escalation of AI superpowers followed by disaster in only a couple of years: https://ai-2027.com/
If you read the backstory and supporting blog posts from the authors, they are filled to the brim with hedges and escape hatches. Scott Alexander wrote that it was something like "the 80th percentile of their fast scenario", which means that when it fails to come true he can simply say it wasn't actually his median prediction anyway and that they were writing about the fast scenario. I can already predict that the "We were wrong" article will be more about what they got right, with a heavy emphasis on the fact that it wasn't their real median prediction anyway.
I think this group relies heavily on the faux-humility and hedging because they've recognized how powerful it is to get people to trust them. Even the comment above is implying that because they say and do these things, they must be immune from the criticism delivered above. That's exactly why they wrap their posts in these signals, before going on to do whatever they were going to do anyway.
Yes, I do think that these hedging statements make them immune from the specific criticism that I quoted.
If you want to say their humility is not genuine, fine. I'm not sure I agree with it, but you are entitled to that view. But to simultaneously be attacking the same community for not ever showing a sense of maybe being wrong or uncertain, and also for expressing it so often it's become an in-group signal, is just too much cognitive dissonance.
> Yes, I do think that these hedging statements make them immune from the specific criticism that I quoted.
That's my point: Their rhetorical style is interpreted by the in-group as a sort of weird infallibility. Like they've covered both sides and therefore the work is technically correct in all cases. Once they go through the hedging dance, they can put forth the opinion-based point they're trying to make in a very persuasive way, falling back to the hedging in the future if it turns out to be completely wrong.
The writing style looks different depending on where you stand: read forward, the main point feels very likely; read backward, after things have shaken out, you notice the hedging and conclude they were technically correct all along. Yet at the time, the rationalist community attaches itself to the position being pushed.
> But to simultaneously be attacking the same community for not ever showing a sense of maybe being wrong or uncertain, and also for expressing it so often it's become an in-group signal, is just too much cognitive dissonance.
That's a strawman argument. At no point did I "attack the community for not ever showing a sense of maybe being wrong or uncertain".
> They don't really ever show a sense of "hey, I've got a thought, maybe I haven't considered all angles to it, maybe I'm wrong - but here it is". The type of people that would be embarrassed to not have an opinion on a topic or say "I don't know"
edit: my apologies, that was someone else in the thread. I do feel, though, that between the two comments there is a "damned if you do, damned if you don't". (The original quote above I found absurd upon reading it.)
Haha, my thoughts exactly. This HN thread is simultaneously criticizing them for being too assured and not considering other possibilities, and for hedging that they may not be right and that other outcomes are plausible.
This is right, but doesn't actually cover all the options. It's damned if you [write confidently about something and] do or don't [hedge with a probability or "epistemic status"].
But the other option, which is the one the vast majority of people choose, is to not write confidently about everything.
It's fine, there are far worse sins than writing persuasively about tons of stuff and inevitably getting lots of it wrong. But it's absolutely reasonable to criticize this choice, regardless of the level of hedging.
Well, on a meta level, I think their community has decided that in general it's better to post (and subsequently be able to discuss) ideas that one is not yet very confident about, and ideally that's what the "epistemic status" markers are supposed to indicate to the reader.
They can't really be blamed for the fact that others go on to take the ideas more seriously than they intended.
(If anything, I think that at least in person, most rationalists are far less confident and far less persuasive than the typical person in proportion to the amount of knowledge/expertise/effort they have on a given topic, particularly in a professional setting, and they would all be well-served to do at least a normal human amount of "write and explain persuasively rather than as a mechanical report of the facts as you see them".)
(Also, with all communities there will be the more serious and dedicated core of the people, and then those who sort of cargo-cult or who defer much, or at least some, of their thinking to members with more status. This is sort of unavoidable on multiple levels-- for one, it's quite a reasonable thing to do with the amount of information out there, and for another, communities are always composed of people with varying levels of seriousness, sincere people and grifters, careful thinkers and less careful thinkers, etc. (see mobs-geeks-sociopaths))
(Obviously even with these caveats there are exceptions to this statement, because society is complex and something about propaganda and consequentialism.)
Alternately, I wonder if you think there might be a better way of "writing unconfidently", like, other than not writing at all.
Yeah I think you're getting at what my skepticism stems from: The article with the 55% certain epistemic status and the article with the 95% certain epistemic status are both written with equal persuasive oomph.
In most writing, people write less persuasively on topics they have less conviction in.
> That's a strawman argument. At no point did I "attack the community for not ever showing a sense of maybe being wrong or uncertain".
Ok, let's scroll up the thread. When I refer to "the specific criticism that I quoted", and when you say "implying that because they say and do these things, they must be immune from the criticism delivered above": what do you think was the "criticism delivered above"? Because I thought we were talking about contrarian1234's claim to exactly this "strawman", and so far you have not appeared to disagree with me that this criticism was invalid.
If you're putting up evidence about how people were wrong in their predictions, I suggest actually pointing at predictions that were wrong, rather than at recent predictions about the future whose resolution you disagree over. If you're putting up evidence about how people make excuses for failed predictions, I suggest actually showing them doing so, rather than projecting that they will do so and blaming them for your projection.
It's been a while since I've engaged in rationalist debates, so I forgot about the slightly condescending, lecturing tone that comes out when you disagree with rationalist figureheads. :) You could simply ask "Can you provide examples" instead of the "If you ____ then I suggest ____" form.
My point wasn't to nit-pick individual predictions, it was a general explanation of how the game is played.
Since Scott Alexander comes up a lot, a few randomly selected predictions that didn't come true:
- He predicted at least $250 million in damages from Black Lives Matter protests.
- He predicted Andrew Yang would win the 2021 NYC mayoral race with 80% certainty (he came in 4th place)
- He gave a 70% chance to Vitamin D being generally recognized as a good COVID treatment
It's also noteworthy that a lot of his predictions are about his personal life, his own blogging actions, or [redacted] things. These all get mixed in with a small number of geopolitical, economic, and medical predictions, with the net result of bringing his overall accuracy up.
> He predicted at least $250 million in damages from Black Lives Matter protests.
He says
> 5. At least $250 million in damage from BLM protests this year: 30%
which, by my reading, means he assigns greater-than-even odds that _less_ than $250 million in damages happened (I don't know whether that's actually the case, but my reading of your post suggests you believe it was).
You say
> He gave a 70% chance to Vitamin D being generally recognized as a good COVID treatment
while he says
> Vitamin D is _not_ generally recognized (eg NICE, UpToDate) as effective COVID treatment: 70%
(emphasis mine)
For what it's worth, your comments in this thread have been very good descriptions of things I became frustrated with after once being quite interested / enthralled with this community / movement!
(I feel like you're probably getting upvotes from people who feel similarly, but sometimes I feel like nobody ever writes "I agree with you" comments, so the impression is that there's only disagreement with some point being made.)
Thanks for sharing. You summed it up well: The community feels like a hidden gem when you first discover it. It feels like there's an energy of intelligence buzzing about interesting topics and sharing new findings.
Then you start encountering the weirder parts. For me, it was the groupthink and hero worship. I just wanted to read interesting takes on new topics, but if you deviated from the popular narrative associated with the heroes (Scott Alexander, Yudkowsky, Cowen, Aaronson, etc.) it felt like the community's immune system identified you as an intruder and started attacking.
I think a lot of people get drawn into the idea of it being a community where they finally belong. Especially on Twitter (where the latest iteration is "TPOT") it's extraordinarily clique-ish and defensive. It feels like high school level social dynamics at play, except the players are equipped with deep reserves of rhetoric and seemingly endless free time to dunk on people and send their followers after people who disagree. It's a very weird contrast to the ideals claimed by the community.
Well nobody sent me; instead I had the strange experience of waking up this morning, seeing an interesting post about Scott Aaronson identifying as a rationalist, and when I check the discussion it's like half of HN has decided it's a good opportunity to espouse everything they dislike about this group of people.
Since when is that what we do here? If he'd written that he'd decided to become vegetarian, would we all be out here talking about how vegetarians are so annoying and one of them even spat on my hamburger one time?
And then of these uncalled-for takedowns, several -- including yours -- don't even seem to be engaging in good-faith discourse, and seem happy to pile on to attacks even when they're completely at odds with their own arguments.
I'm sorry to say it but the one who decided to use their free time to leer at people unprovoked over the internet seems to be you.
Seems like a perfectly reasonable thing to discuss in the comments on this article, and I don't know which other "takedowns" you're referring to, but this person's comments on it have not been in bad faith at all.
(Indeed, I think it's in worse faith to try to guilt trip people who are just expressing critical opinions. It's fine - good, even! - to disagree with those people, but this particular comment has a very "how dare you criticize something!" tone that I don't think is constructive.)
That sounds an awful lot like other people making stuff up to oppress me by sticking a "condescending" label on me without me having any way to contest it.
That sounds an awful lot like a victimhood complex.
"Are you being condescending" is a subjective judgement that other people will make up their own minds about. You can't control what people think about things you say and do, and they aren't "oppressing" you by making up their own minds about that.
> it was a general explanation of how the game is played.
You seem to be trying to insinuate that Alexander et al. are pretending to know how things will turn out and then hiding behind probabilities when they don't turn out that way. This is missing the point completely. The point is that when Alexander assigns an 80% probability to many different outcomes, about 80% of them should occur, and it should not be clear to anyone (including Alexander) ahead of time which 80%.
> He predicted at least $250 million in damages from Black Lives Matter protests.
Edit: I see that the prediction relates to 2021 specifically. In the wake of 2020, I think it was perfectly reasonable to make such a prediction at that confidence level, even if it didn't actually turn out that way.
> He predicted Andrew Yang would win the 2021 NYC mayoral race with 80% certainty (he came in 4th place)
> He gave a 70% chance to Vitamin D being generally recognized as a good COVID treatment
If you make many predictions at 70-80% confidence, as he does, you should expect 20-30% of them not to come true. It would in fact be a failure (underconfidence) if they all came true. You are in fact citing a blog post that is exactly about a self-assessment of those confidence levels.
Also, he gave a 70% chance to Vitamin D not being generally recognized as a good COVID treatment.
> These all get mixed in with a small number of geopolitical, economic, and medical predictions with the net result of bringing his overall accuracy up.
The point is not "overall accuracy", but overall calibration - i.e., whether his assigned probabilities end up making sense and being statistically validated.
You have done nothing to establish any correlation between the category of a prediction and his accuracy on it.
I genuinely don't understand how you can point to someone's calibration curve where they've broadly done well, cherry-pick the failed predictions they made, and use this not just to claim that they're making bad predictions but that they're slimy about admitting error. What more could you possibly want from someone than a tally of their prediction record graded against the probability they explicitly assigned to it?
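(To make "graded against the probability they explicitly assigned" concrete, here's a minimal sketch of how such a tally works. The prediction data is invented purely for illustration, not anyone's actual record: bucket predictions by assigned probability, then check what fraction in each bucket came true.)

```python
# Minimal calibration-grading sketch. The (probability, outcome)
# pairs below are made up for illustration only.
from collections import defaultdict

predictions = [
    (0.5, True), (0.5, False),
    (0.7, True), (0.7, True), (0.7, False),
    (0.8, True), (0.8, True), (0.8, True), (0.8, False),
    (0.9, True), (0.9, True), (0.9, True),
]

# Group outcomes by the probability the forecaster assigned.
buckets = defaultdict(list)
for prob, happened in predictions:
    buckets[prob].append(happened)

for prob in sorted(buckets):
    outcomes = buckets[prob]
    hit_rate = sum(outcomes) / len(outcomes)
    # Well-calibrated: hit_rate tracks prob. Note that a 70% bucket
    # where everything came true would itself be a miss
    # (underconfidence), not a success.
    print(f"assigned {prob:.0%}: {hit_rate:.0%} came true (n={len(outcomes)})")
```

The output is exactly the kind of tally a calibration post reports: for each confidence level, how often those predictions actually resolved true.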
lol, what? That was a civil comment. This seems like an excellent example of the point being made. Replying to a perfectly reasonable but critical comment with "please be civil" is super condescending.
So is stuff like "one man's modus ponens".
Look, we get it, you're talking to people who found this stuff smart and interesting in the past. But we got tired of it. For me, I realized after a while that the people I most admired in real life were pretty much the opposite of the people I was reading the most on the internet. None of the smartest people I know talk in this snooty online iamsosmart style.
> This isn’t about me being an expert on these topics and getting them exactly right, it’s about me calibrating my ability to tell how much I know about things and how certain I am.
> At least $250 million in damage from BLM protests this year: 30%
Aurornis:
> I forgot about the slightly condescending, lecturing tone that comes out when you disagree with rationalist figureheads.
> Since Scott Alexander comes up a lot, a few randomly selected predictions that didn't come true:
> He predicted at least $250 million in damages from Black Lives Matter protests.
Is this a "perfectly reasonable but critical comment"?
Am I condescending if I say that predicting a 30% chance that something happens means predicting a 70% chance that it won't happen... so the fact that it didn't happen probably shouldn't be used as "gotcha!"?
(I did waffle upon re-reading my comment and thinking it could have been more civil. But then decided that this person is also being very thin skinned. So I think you're right that we're both right.)