I've been involved in (or at least followed) the rationalist and EA communities since the beginning. I call it a cult somewhat tongue in cheek, but it certainly has more cult-like aspects than just "pretty concerned about AI risk." I mean, they have a charismatic leader whose unusual ideas about everything from AI risk to sexuality are more or less carbon copied and practiced in group houses, etc., in some pretty creepy ways.
Literally anyone from the outside could easily be convinced it was pretty much your standard apocalyptic sex cult, just from an accurate description of it.
I don't really care whether people on EA forums all agree with SBF, but his type of thinking is standard if you use utilitarianism to make decisions in the real world, and it leads to some pretty horrific stuff. Consequentialism and utilitarianism are widely accepted in EA, and taken to the extreme they can justify things like this.
They don't. Everyone in EA (AFAICT) has been pretty clear about this. Lying and undermining trust and institutions do tremendous lasting harm.
I am also tired of "people are very concerned about X and think that it's important, so they're basically a cult".