How did Effective Altruism get so big?

anon_nahe said in #2015 1y ago: received

I disagree with many of the views behind EA, but I've been fascinated by how they managed to build institutions, create a scene, start companies and NGOs, get funding, etc.

I know very smart people who dropped out of school to join an EA organization - and it was not a "risky" move at all. It was a great move for their careers.

The only blog post that I've found that talks about this is Nadia's blog on "idea machines" - https://nadia.xyz/idea-machines.

Maybe it was the ideology that convinced people, and it served as the gravity well that EA formed around. Maybe we just need a compelling ideology that an entire ecosystem can be built around.

But maybe it has to be something different...

Moldbug got me into the author Gaetano Mosca, who mentions that every regime has four classes: priests, merchants, warriors, and peasants.

It's clear to me that EA was, in a way, the techies (merchants) building themselves a priestly class. Maybe building up a priestly class is already mined out - maybe what we need to do is build up a peasant class or a warrior class?

anon_dany said in #2016 1y ago: received

Good post. My view is that EA's greatest strength is how it plays on human psychology.

Many people want to make huge amounts of money.
Many people feel guilty about this.
Many people want deeper meaning in their life.

EA is an ideology that legitimizes the first through the third, and thus eliminates the second.

The meaning in question is also basically the lowest common denominator you can come up with: altruism. Everyone wants to do "good," of course.

It uses this structure to funnel money from the rest of the economy into its own coffers. EA is just a Church stripped bare of any esoteric or ancient sacred symbolism. A religion for a sterile and autistic age, and especially the tech community.

referenced by: >>2017

anon_nahe said in #2017 1y ago: received

>>2016
You might be right that it's an ideology that allows people to cloak their ambition with altruism, but that doesn't explain why they were able to build an entire ecosystem around them.

"Let's make a lot of money so we can improve the world " --- (broken link) ---> "give *us* a lot of money and human capital to improve the world in the specific way we think it's the right one"

anon_ruvu said in #2018 1y ago: received

> that doesn't explain why they were able to build an entire ecosystem around them

I think it's less that they were able to build an entire ecosystem around themselves than that the financialization refined over the course of the 20th century created the conditions for such an ideology to become the preferred legitimizing ideology of the elites.

> Maybe building up a priest class is already mined out - maybe what we need to do is build up a peasant class or a warrior class?

The warrior class could be the class capable of producing industrial warfare equipment (i.e. drones). But the need for the priestly class remains. A new priestly class would ideally absorb the old one, leaving it as the middle stratum of the priesthood. That is, by the new priestly class's redefinition of "good", the old priestly class now serves this new definition of "good" instead of the previous utilitarian definition established by the financial world order.

anon_hodu said in #2019 1y ago: received

Yeah this is a very important question. People already touched on the precondition of a lot of money flooding into tech and the death of religion and alternate forms of meaning and community.

I think another crucial factor is that effective altruism, as an ideology, *encapsulates* the idea that giving money is how you make a difference, thus overcoming the money-brain issues that Wolf talked about in "Quit Your Job." And if financial sacrifice is built into the ideology, then it's a minimal change to redirect money from bednets to evangelism and sinecures, which allows you to build up institutions and spread rapidly.

Basically I think "that's a nice ideology, how many FTEs have you got?" is an underrated metric of success. Ideas are common; modern urban/Californian flakiness - both in time and willingness to commit money for no financial return - is perhaps *the* key challenge to overcome for any movement, and EA is inherently shaped to overcome it.

anon_lyse said in #2021 1y ago: received

I'm writing a longer article about this but my basic analysis is that the EAs have one big virtue they are fairly unique on: they treat money as a strategic resource to be disposed of rapidly for effective impact, whereas everyone else treats money like a transcendent private good. That alone almost explains their success. It's table stakes for building a powerful movement and getting things done.

Giving up your money, and going above and beyond to get more money to give, to your friends' ambitious social projects is a direct recipe for a powerful movement. It's also how you get rich, because you build up a bunch of great machinery of legitimacy and operation that demands funding, and you create a powerful network of people with an ethic of helping each other up in the world (see how EAs talk about "EAs").

Imagine for comparison if "based" people behaved anywhere in the same universe as "EA" people on just this one point. Club Tropical Excellent would be putting up gymnasium-libraries in every town in America, there would be powerful organizations going for the end of animal cruelty and a renewal of conservationism, there would be many cutting-edge think tanks and publications with major success, there would be major lawfare organizations correcting legal corruption, literary movements, art movements, and mutual aid fraternities full of social power for normal Americans. This is just a straightforward scale-up of what people in the sphere are already doing, with the money the sphere already has, just directing that money towards "the cause" rather than "muh grill".

Of course this raises a bunch of related secondary elements of their success. Effective altruism demanded aristocratic behavior (personal disregard for money, public consciousness, taking your ideas seriously, and noblesse oblige) and so scooped up a large fraction of the aristocratic-minded youth. They are simply better people than your average American, who is a stunted, grubbing creature held back by greed and self-defeating ideology. Meanwhile a bunch of EAs were the type of kids who were going to get rich anyway, now armed with a great reason to get rich and a great friend network.

On the less admirable front, effective altruism is a straightforward extrapolation or radicalization of our society's financialized ethos and messianic thirdworldism. It's minimally challenging to existing bourgeois methods and priorities and the system is highly lubricated for something of approximately this shape. It gets easy legitimacy and easy operational success this way.

The later capture by the rationalist AI doom cult is just because rationalists were even better effective altruists who had a clearer, but different, vision of what the best cause was. EA was founded partially as a question: what's the most effective way to spend money for good? The P(doom) rationalists were there with an ideologically very close by but very radical answer.

So my takeaway is that the one big lesson one should learn from the success of EA is that the earn to give approach to money is a political superpower.

referenced by: >>2033 >>4677 >>4680

anon_qohu said in #2023 1y ago: received

Ideologically, EA holds three inherent advantages:

1. The core tenet is simple enough for a child to understand - "the fewer humans suffer, the better". In practice it's more complex, but many ideologies require a lengthy lecture to grasp even the basics. As the median attention span shrinks, adherents increasingly prefer simpler ideologies.

2. The rise of STEM at the expense of the humanities means a winning ideology should appeal to STEM bros. Translating all of morality into a utility function is practically the Holy Grail to them, so EA wins again.

3. By attempting to quantify everything, EA makes the world more legible, thereby attracting the powerful. Not only do they possess the resources to support this ideology, but even being associated with them brings prestige.

anon_gipi said in #2024 1y ago: received

Some good stuff here on the ideological and intellectual content of Effective Altruism, and how that package fit the zeitgeist. That's half the story.

The other half is the Karnofsky-Moskovitz alliance. The combination of Karnofsky's institution-building skill and Moskovitz's billions is what built Effective Altruism from a group of fringe intellectuals in Oxford dorm rooms and Berkeley group houses, organized around email lists, into a formidable patronage network enmeshed with DC elites and a career track suitable for the best and the brightest, organized around the Open Philanthropy Project and the plethora of small NGOs they fund. It was not a foregone conclusion that figures like those two would hitch their horses to the Effective Altruist wagon. Without them it would be much smaller, weirder, and less shiny.

referenced by: >>2025

anon_twwe said in #2025 1y ago: received

>>2024
This is a good point. It's also worth bringing up the role of Laura Arrillaga-Andreessen (pmarca's wife) in that alliance. She was running a project to get all of philanthropy rebuilt on quantitative-transparent foundations, run by people trained in that way of thinking (maybe by her programs?). I forget if she had any relation to Holden's GiveWell, but they were operating on the same ideology. She had various big-money philanthropy players in her crew who were moving hundreds of millions to billions of dollars around to each other's programs. One of her acolytes was Cari Tuna, who (iirc) she introduced to Dustin Moskovitz. They then got married, and I believe Dustin has said it was Cari Tuna who gave him the idea to give away all the money.

There are a bunch of characters like this in the early ecosystem. The whole thing ended up being an alliance between weird but high-potential intellectual stuff like what was going on at Oxford and LessWrong, and these kinds of players. Or another way to look at it is that a few of these live-player types took over the energy of the early movement. I still remember when Holden Karnofsky was considered EA by the EAs, but was carefully avoiding any expression of association in the other direction. The Karnofsky machine was quite independent of EA, with an obviously different set of inspirations and loyalties if you looked closely.

referenced by: >>2027

anon_gipi said in #2027 1y ago: received

>>2025
Yeah. Arrillaga-Andreessen was never formally affiliated with Open Philanthropy etc., but the intellectual descent is extremely clear. According to the rumors I heard at the time, Tuna was very important in setting up Moskovitz's charity empire in the early days and making the alliance with Karnofsky. I've heard little about her in recent years, so my guess is that she stepped back as Karnofsky stepped up.

anon_nahe said in #2033 1y ago: received

>>2021
Is there a way I can read that article?

referenced by: >>2035

anon_lyse said in #2035 1y ago: received

>>2033
Yes, when it is done.

anon_wefe said in #2096 1y ago: received

I'll caveat that I haven't done much reading of the EA philosophers, but I have known quite a few EAs over the last ten years. My take is that EA offers talented young people the ability to 'make an impact', i.e. to have power, very quickly. Almost any type of activity can be classified as 'effective' with just the right intellectual scaffolding. You can see this in EAs being mostly focused on malaria bednets in the early days and quickly branching out to other activities, including political donations and funding AGI research. It turns out that the most important ingredient in becoming 'Effective' is power. EA ideology does not discourage this power-seeking at all, nor is it very self-aware about it, which provides fertile ground for the power-seeking aspects of our psyche to gravitate towards it.

referenced by: >>2099

anon_lyly said in #2099 1y ago: received

>>2096
I've been in the EA scene from the beginning on the xrisk side, and I think you're mostly right. The subjective experience of 'impact' is having a bunch of smart comrades selflessly pouring their effort and resources into outcomes they think are important and ambitious but achievable. It's glorious, frankly, specific content aside.

What popped me out is ultimately I was radicalized through and beyond rationality into philosophy. The whole xrisk transhumanist AI narrative is a fantastic provocation to think very hard about some very uncorrelated thoughts. Taking that provocation seriously though, it undermines a lot of the simplistic utilitarian marginalism present in those crowds. You end up doing a deeper kind of epistemology than Bayesian, and a more serious form of perspectival morality than "shut up and multiply".

Anyways I got bored intellectually, and then EA turned into a recruiting platform for boring lobbying careers, which didn't appeal to me. The weird social behaviors (eg polyamory, deliberate neoteny, etc) were also eventually a big turn-off. Oh and of course I met some of the top people like Holden, who revealed themselves to primarily think of themselves as Machiavellian technocratic democrats in the Big Foundations tradition more than anything else, even EA. This was clear from the decor of the Open Philanthropy/GiveWell offices. Holden told us something like "we can't say this in public but I want to flood America with as many low-skill immigrants as possible, preferably Haitian" with an Israeli flag and a bunch of other flags from fake American-empire countries (eg Kosovo) displayed prominently behind him while he said this. It was like something out of a /pol/ meme. To be fair, the logic of this was something about the marginal net quality-of-life and net economic improvement being highest from moving people from the worst countries into the best countries. So there's utilitarianism revealed in its essence: a great leveling with utter disregard for the generators of quality. I don't even agree with the calculation within the utilitarian framework, but the moral perspective itself is entirely wrong, so I don't hang out with those circles any more.

referenced by: >>2102

xenophon said in #2102 1y ago: received

>>2099
> Taking that provocation seriously though, it undermines a lot of the simplistic utilitarian marginalism present in those crowds. You end up doing a deeper kind of epistemology than Bayesian ...

Right, you can't seriously probe and weigh these questions without getting into questions of ends that are ultimately not utilitarian.

> ... people like Holden who revealed themselves to primarily think of themselves as machiavellian technocratic democrats in the Big Foundations tradition ...

This is an instance of the broader phenomenon of utilitarianism unconsciously serving as cover for psychological motives that are very human but not necessarily well-considered or good.

anon_nahe said in #4677 4d ago: received

>>2021
Can I read that article you said you were writing?

anon_nedi said in #4678 4d ago: received

EA convinced one billionaire to give away all his money to them. Once that was accomplished, they had enough money to scale up the rest of their ecosystem. EA is a minnow compared to progressive philanthropy writ large, and everything else is a minnow compared to EA.

The reason EA and not some "based" cause wins is that EAs find some random EA blogger or online commentator and immediately figure out ways to give him a salary and control over an empire of other salaried full-time EAs, whereas the "based" cause by revealed preference thinks its own adherents are not worth more than a follow and maybe $20/mo for the newsletter.

If someone commits $10 billion to your cause, it's practically impossible not to turn that cause into a major social movement that generates political power and additional follow-on businesses and institutions, and practically impossible not to attract a large cohort of ambitious young elites.

You can debate all these other reasons why EA and not some other weird cause persuaded this one billionaire, but note how underdetermined and contingent this outcome was. It mostly just takes one guy!

For comparison, look at how Patrick Collison has turned "Progress Studies" into a thing by giving ~1% of the money that EA has given. You can have all these debates but it ultimately just comes down to how much money one or two billionaires are willing to spend on turning a weird ideology into a powerful social institution.

referenced by: >>4679

anon_nedi said in #4679 4d ago: received

>>4678
It just cannot be emphasized enough how the difference between being powerful and being a disempowered critic is just psychological. 99% of the people whining about EA, leftism, etc. are spending 99% of their time on making money rather than building powerful social networks and institutions, and 99.9% of their net worths are not deployed towards any kind of non-financial goal. The EAs give money to their friends to spend more time on EA; their critics do not.

Just imagine the battle in the marketplace of ideas. Side A has its arguments, proponents, and adherents, and so does Side B. However Side A decides they are going to pay dozens of smart people hefty salaries to spend 100% of their time making arguments, propounding them, and converting adherents. Side B decides that's all a waste of money, they are only going to devote $20/mo each and contribute the level of intellectual firepower that can be mustered when someone is tweeting on the toilet between work breaks.

It does not matter a whit how much more money Side B accumulates in its portfolios, or whatever, if the % deployed to politics always remains infinitesimal compared to its opponents'. It does not matter a whit how good Side B's ideas are, or how bad Side A's ideas are, if Side B's ideas are presented as memes thought up on the toilet, while Side A's ideas are presented as fully-fleshed-out moral, intellectual, and aesthetic visions that young elites can immerse themselves in entirely, picking up an entire network of habits, friends, and eventually financial opportunities.

I just cannot get over how blind to this obvious reality the wealthy people on "the right" or "anti-woke" side are. You could take the absolute stupidest, worst, most obviously terrible set of ideas ever heard of, say Aztec Cannibal Suicide Cultism, but if one rich guy dumps $10 billion on Aztec Cannibal Suicide Cultism and provides fun and fulfilling careers for cool and ambitious young elites to develop and promote Aztec Cannibal Suicide Cultism, it does not matter one fucking whit if every other idea in the "marketplace of ideas" is better, if every other idea gets no more firepower behind it than toilet tweets.

referenced by: >>4680

anon_dekw said in #4680 3d ago: received

>>4679
>>2021
While I agree that the funding environment and the level of organizational sophistication behind EA are genuinely excellent, and really off the charts for any advocacy group or ecosystem, I don't think that people in the thread are thinking deeply enough about what sets EA apart.

The core tenets of effective altruism are basically just common sense factual statements about charitable donation (if you are donating to accomplish some goal, you should do that as effectively as you reasonably can) paired with solid ethical statements that are really just restatements of the bedrock principles of modern liberal democratic societies: all people have equal human dignity, suffering is bad, and happiness is good. Sure in a deeper philosophical sense all of these principles can and should be challenged, but the strict arguments, like the ones you see in Peter Singer's *Famine, Affluence, and Morality*, are, to a modern audience, intensely compelling.

It's not enough when starting a movement to just have money or even a social organizational structure. You need a legitimizing ideology for your movement, and the strength of this legitimizing ideology is absolutely vital. Communism worked so well in so many places because it appealed directly to the disjunctions in power and economic status between different classes in modern developed and developing societies. Napoleon was able to conquer most of Europe because he was able to harness the ideological strength of nationalism in the emerging French state to rally fanatical armies of bourgeois and proletarian Frenchmen. Ideology isn't marginal. It's the equivalent of tens of thousands of men or tens of billions of dollars.

The EA ideology not only legitimizes donation to ideological organizations, but grants legitimacy to the people donating and, above all, to the people working within these organizations. Young people working in EA get the social and moral status that comes from pursuing the highest values of modern American liberal democratic society. Social status is the single most important motivator in social movements, and EA is able to fully harness the background ideological energy of the American ruling class. Additionally, the overlap with rationalism, utilitarianism, and academic philosophy generally means that there is a strong intellectual core to the philosophy that allows it to recruit extremely high levels of human capital. The recruitment of extremely intelligent individuals not only helps bolster EA organizations but, far more importantly, allows for the recruitment of those with existing power and wealth. Just as Christianity was able to co-opt the Roman civil infrastructure by converting members of the elite and ultimately the imperial house, EA is able to convert members of the tech industry and co-opt their logistical networks and bank accounts.

referenced by: >>4682

anon_dekw said in #4681 3d ago: received

Compare this to the modern right. Not only does the right have no ruling ideology, instead consisting of a mass of rival factions either in uneasy coexistence or outright civil war, but every expression of American right-wing ideology runs directly counter to the dominant ethos of post-war Great Society American liberal democracy. American right-wingers cannot harness the energy of a truly radical right-wing agenda, because their ideology repels the brainwashed "normie" Americans exactly to the degree to which it is truly right-wing. Without a wholesale collapse of the progressive media-educational project, this is unlikely to change.

Whereas EAs can recruit young graduates and immediately grant them status with their peers, strangers, and probably even their family networks (who can really oppose doing good effectively?), when right-wingers recruit for conservative organizations the recruits know that they will be despised by considerable portions of the population and exiled to an ideological ghetto. Of course, the right wing generally doesn't even have the opportunity to create wide ideological-organizational networks in the first place, because of the constant infighting and the lack of any unifying ideological framework.

anon_nedi said in #4682 3d ago: received

>>4680
EA is fundamentally not a revolutionary ideology at all, but a mutation of the already-existing ruling ideology, that is true. In this sense EA is very similar to the DSA. Since anyone posting on this site is more interested in applying the lessons of EA to revolutionary causes, this facet of EA should definitely be noted and kept in mind, but it is unfortunately not something that can hold lessons for the rest of us.

referenced by: >>4683

anon_dekw said in #4683 3d ago: received

>>4682
I agree that EA is definitely "a mutation of the already-existing ruling ideology," like DSA. But I think the difference between DSA and EA is illustrative: DSA is a largely incompetent, fragmented organization that, despite massive membership and political clout, has trouble actually wielding any power on an organizational level. They're also always on the verge of going bankrupt.

While the DSA comes out of the revolutionary Marxist-continental-Frankfurt-anticolonial-whatever lineage of Western Hegemonic Thought, EA comes more purely out of Enlightenment rationalization, and inherits some much more eusocial genes. Thus the shockingly low levels of fragmentation (for a political group) and the high levels of organizational effectiveness.

referenced by: >>4689

anon_nedi said in #4689 2d ago: received

>>4683
True, however both are extremely influential and have made lots of careers. Did you miss who won the mayoralty of New York City yesterday?
