The rationalist community as a whole is remarkably functional. Like any subculture, it is rife with gossip, personality conflicts, and drama that is utterly incomprehensible to outsiders. But overall, the community’s activities are less drinking the Kool-Aid and more mutual support and vegan-inclusive summer barbeques.
Nevertheless, some groups within the community have wound up wildly dysfunctional (a term I’m using to sidestep definitional arguments about what is and isn’t a cult). And some of the blame can be put on the rationalist community’s marketing.
The Sequences make certain implicit promises. There is an art of thinking better, and we’ve figured it out. If you learn it, you can solve all your problems, become brilliant and hardworking and successful and happy, and be one of the small elite shaping not only society but the entire future of humanity.
This is, not to put too fine a point on it, not true.
Multiple interviewees remarked that the Sequences create the raw material for a cult. To his credit, their author, Eliezer Yudkowsky, shows little interest in running one. He has consistently been distant from and uninvolved in rationalist community-building efforts, from Benton House (the first rationalist group house) to today’s Lightcone Infrastructure (which hosts LessWrong, an online forum, and Lighthaven, a conference center). He surrounds himself with people who disagree with him, discourages social isolation, and rarely directs his fans to do anything other than read his BDSM-themed fanfiction.
But people who are drawn to the rationalist community by the Sequences often want to be in a cult. To be sure, no one wants to be exploited or traumatized. But they want some trustworthy authority to change the way they think until they become perfect, and then to assign them to their role in the grand plan to save humanity. They’re disappointed to discover a community made of mere mortals, with no brain tricks you can’t get from Statistics 101 and a good CBT workbook, whose approach to world problems involves a lot fewer grand plans and a lot more muddling through.
Black Lotus used a number of shared frameworks, including the roleplaying game Mage: The Ascension, which they believed would allow them to cut through social norms and exercise true agency over their lives. Brent supposedly had the most insight into the framework, and so had a lot of control over the members of Black Lotus — control he was unable to use wisely.
However, if Brent hadn’t been there, Black Lotus would have been fine. One interviewee said that, when Brent wasn’t around, Black Lotus led to beautiful peak experiences that he still cherishes: “Brent surrounded himself with people who built the thing he yearned for, missed, and couldn’t have.”
But in other cases — as at Leverage Research — the toxic dynamics emerged from the bottom up. Interviewees with experience at Leverage Research were clear that there was no single wrongdoer. Leverage was fractured into many smaller research groups, which did everything from writing articles about the grand scope of human history to incubating a cryptocurrency. Some research groups stayed basically normal to the end; others spiraled into self-perpetuating cycles of abuse. In those research groups, everyone was a victim and everyone was a perpetrator. The trainer who broke you down in a marathon six-hour debugging session was unable to sleep because of the panic attacks caused by her own sessions.
Worse, the promise of the Sequences is most appealing to people who have very serious life problems they desperately need to solve. While some members of dysfunctional rationalist groups are rich, stable, and as neurotypical as rationalists ever get, most are in precarious life positions: mentally ill (sometimes severely), traumatized, survivors of abuse, unemployed, barely able to scrape together enough money to find a place to sleep at night in the notoriously high-rent Bay Area. Members of dysfunctional rationalist groups are particularly likely to be transgender: transgender people are often cut off by their families and may have a difficult time finding friends who accept them as they are. The dysfunctional group can feel like a safe haven from the transphobic world.
People in vulnerable positions are both more likely to wind up mistreated and less likely to be able to leave. Elizabeth Van Nostrand, who knows many members of dysfunctional groups both rationalist and non-rationalist, said, “I know people who've had very good experiences in organizations where other people had very bad ones. Sometimes different people come out of the same group with very different experiences, and one of the major differences is whether they feel secure enough to push back or leave if they need to. There isn't a substitute for a good BATNA.” 1
Still, vulnerability alone can’t explain why some members of the rationalist community end up in abusive groups. Mike Blume was a member of Benton House, which was intended to recruit talented young rationalists. He said, “I was totally coming out of a super depressive and dysfunctional phase in my life, and this was a big upswing in my mood and ability to do things. We were doing something really important. In retrospect, I feel like this is the sort of thing you can't do forever. You burn out on it eventually. But I would wake up in the morning and I'd be a little bit tired and trying to get out of bed and I'd be like, well, you know, the lightcone 2 depends on me getting out of bed and going to sleep and learning how to program. So I'd better get on that.”
Mike Blume was depressed, lonely, and unemployed when he entered the rationalist community, and he sincerely believed in both the art of rationality and the importance of the cause. The difference wasn’t his vulnerability. The difference was that his community helped him use his idealism and belief in the cause to learn real skills, become less depressed, and get to a more stable place.
One interviewee observed that the early rationalist community had been more supportive of less functional rationalists, perhaps because it was smaller. While it wasn’t capable of transforming them into a superhumanly rational elite (no one can do that), it helped them learn useful skills and become independent. This interviewee said that, once the early rationalists became functional, they pulled the ladder up behind them. They (understandably) only wanted to hang out with people who already had their shit together. But without the support of more successful people, less functional new rationalists can be easy prey for anyone willing to offer help.
I’m not sure I agree. The early rationalist community had a number of success stories; it also had a guy whom multiple people brought up, sighed about, and said, “that wasn’t a cult, he just did too many whippets.” The rationalist community I see provides a lot of support to many people who are neurodivergent, traumatized, or transgender; it also fails a lot of people.