Sofie Channel

Anonymous 0x318
said (7mo ago #2015):

How did Effective Altruism get so big?

I disagree with many of the views behind EA, but I've been fascinated by how they managed to make institutions, build a scene, start companies, NGOs, get funding, etc.

I know very smart people who dropped out of school to join an EA organization - and it was not a "risky" move at all. It was a great move for their careers.

The only blog post that I've found that talks about this is Nadia's blog on "idea machines" - https://nadia.xyz/idea-machines.

Maybe it was the ideology that convinced people, and it served as the gravity well for EA to form around. Maybe we just need a compelling ideology that allows us to build an entire ecosystem around ourselves.

But maybe it has to be something different...

Moldbug got me into the author Gaetano Mosca, who argues that every regime has four classes: priests, merchants, warriors, and peasants.

It's clear to me that EA was in some way a means for the techies (merchants) to build up a priestly class. Maybe building up a priestly class is already mined out - maybe what we need to do is build up a peasant class or a warrior class?

Anonymous 0x319
said (7mo ago #2016), referenced by >>2017:

Good post. My view is that EA's greatest strength is how it plays on human psychology.

Many people want to make huge amounts of money.
Many people feel guilty about this.
Many people want deeper meaning in their life.

EA is an ideology that legitimizes the first through the third, and thus eliminates the second.

The meaning in question is also basically the lowest common denominator you can come up with: altruism. Everyone wants to do "good," of course.

It uses this structure to funnel money from the rest of the economy into its own coffers. EA is just a Church stripped bare of any esoteric or ancient sacred symbolism. A religion for a sterile and autistic age, and especially the tech community.

Anonymous 0x318
said (7mo ago #2017):

>>2016
You might be right that it's an ideology that allows people to cloak their ambition with altruism, but that doesn't explain why they were able to build an entire ecosystem around them.

"Let's make a lot of money so we can improve the world " --- (broken link) ---> "give *us* a lot of money and human capital to improve the world in the specific way we think it's the right one"

Anonymous 0x31a
said (7mo ago #2018):

> that doesn't explain why they were able to build an entire ecosystem around them

I think it's less that they were able to build an entire ecosystem around themselves than that the financialization refined over the course of the 20th century created the conditions for such an ideology to become the preferred legitimizing ideology of the elites.

> Maybe building up a priest class is already mined out - maybe what we need to do is build up a peasant class or a warrior class?

The warrior class could be the class capable of producing industrial warfare equipment (i.e. drones). But the need for a priestly class remains. A new priestly class would ideally absorb the old one, leaving the old priestly class as the middle tier of the priesthood. That is, through the new priestly class's redefinition of "good", the old priestly class would come to serve this new definition of "good" instead of the previous utilitarian definition of good as defined in the financial world order.

Anonymous 0x31b
said (7mo ago #2019):

Yeah this is a very important question. People already touched on the precondition of a lot of money flooding into tech and the death of religion and alternate forms of meaning and community.

I think another crucial factor is that effective altruism, as an ideology, *encapsulates* the idea that giving money is how you make a difference, thus overcoming the money-brain issues that Wolf talked about in "Quit Your Job." And if financial sacrifice is built into the ideology, then it's a minimal change to redirect money from bednets to evangelism and sinecures, which allows you to build up institutions and spread rapidly.

Basically I think "that's a nice ideology, how many FTEs have you got?" is an underrated metric of success. Ideas are common; modern urban/Californian flakiness - both in time and willingness to commit money for no financial return - is perhaps *the* key challenge to overcome for any movement, and EA is inherently shaped to overcome it.

Anonymous 0x31d
said (7mo ago #2021), referenced by >>2033:

I'm writing a longer article about this but my basic analysis is that the EAs have one big virtue they are fairly unique on: they treat money as a strategic resource to be disposed of rapidly for effective impact, whereas everyone else treats money like a transcendent private good. That alone almost explains their success. It's table stakes for building a powerful movement and getting things done.

Giving up your money, and going above and beyond to get more money to give, to your friends' ambitious social projects is a direct recipe for a powerful movement. It's also how you get rich, because you build up a great machinery of legitimacy and operation that demands funding, and it creates a powerful network of people with an ethic of helping each other up in the world (see how EAs talk about "EAs").

Imagine for comparison if "based" people behaved anywhere in the same universe as "EA" people on just this one point. Club tropical excellent would be putting up gymnasium-libraries in every town in America; there would be powerful organizations going for the end of animal cruelty and a renewal of conservationism; there would be many cutting-edge think tanks and publications with major success, major lawfare organizations correcting legal corruption, literary movements, art movements, and mutual aid fraternities full of social power for normal Americans. This is just a straightforward scale-up of what people in the sphere are already doing, with the money the sphere already has, just directing that money towards "the cause" rather than "muh grill".

Of course this raises a bunch of related secondary elements to their success. Effective altruism demanded aristocratic behavior (personal disregard for money, public consciousness, taking your ideas seriously, and noblesse oblige) and so scooped up a large fraction of the aristocratic-minded youth. They are simply better people than your average American, who is a stunted grabbling creature held back by greed and self-defeating ideology. Meanwhile a bunch of EAs were the type of kids who were going to get rich, now armed with a great reason to get rich and a great friend network.

On the less admirable front, effective altruism is a straightforward extrapolation or radicalization of our society's financialized ethos and messianic thirdworldism. It's minimally challenging to existing bourgeois methods and priorities and the system is highly lubricated for something of approximately this shape. It gets easy legitimacy and easy operational success this way.

The later capture by the rationalist AI doom cult happened simply because the rationalists were even better effective altruists, with a clearer but different vision of what the best cause was. EA was founded partially as a question: what's the most effective way to spend money for good? The P(doom) rationalists were there with an ideologically adjacent but very radical answer.

So my takeaway is that the one big lesson one should learn from the success of EA is that the earn to give approach to money is a political superpower.

Anonymous 0x31f
said (7mo ago #2023):

Ideologically, EA holds three inherent advantages:

1. The core tenet is simple enough for a child to understand - "the less humans suffer, the better". In practice it's more complex, but many ideologies require a lengthy lecture to grasp even the basics. As the median attention span shrinks, adherents increasingly prefer simpler ideologies.

2. The rise of STEM at the expense of the humanities means a winning ideology must appeal to STEM bros. Translating all morality into a utility function is practically a Holy Grail to them, so EA wins again.

3. By attempting to quantify everything, EA makes the world more legible, thereby attracting the powerful. Not only do the powerful possess the resources to support the ideology, but even being associated with them brings prestige.

Anonymous 0x320
said (7mo ago #2024), referenced by >>2025:

Some good stuff here on the ideological and intellectual content of Effective Altruism, and how that package fit the zeitgeist. That's half the story.

The other half is the Karnofsky-Moskovitz alliance. The combination of Karnofsky's institution-building skill and Moskovitz's billions is what built Effective Altruism from a group of fringe intellectuals in Oxford dorm rooms and Berkeley group houses, organized around email lists, into a formidable patronage network enmeshed with DC elites and a career track suitable for the best and the brightest, organized around the Open Philanthropy Project and the plethora of small NGOs they fund. It was not a foregone conclusion that figures like those two would hitch their horses to the Effective Altruist wagon. Without them it would be much smaller, weirder, and less shiny.

Anonymous 0x321
said (7mo ago #2025), referenced by >>2027:

>>2024
This is a good point. It's also worth bringing up the role of Laura Arillaga-Andreessen (pmarca's wife) in that alliance. She was running a project to get all of philanthropy rebuilt on quantitative-transparent foundations, run by people trained in that way of thinking (maybe by her programs?). I forget if she had any relation to Holden's givewell, but they were operating on the same ideology. She had various big money philanthropy players in her crew that were moving hundreds of millions to billions of dollars around to each other's programs. One of her acolytes was Cari Tuna, who (iirc) she introduced to Dustin Muskovitz. They then got married, and I believe Dustin has said it was Cari Tuna who gave him the idea to give away all the money.

There are a bunch of characters like this in the early ecosystem. The whole thing ended up being an alliance between weird but high-potential intellectual stuff, like what was going on at Oxford and LessWrong, and these kinds of players. Or another way to look at it is that a few of these live-player types took over the energy of the early movement. I still remember when Holden Karnofsky was considered EA by the EAs but was carefully avoiding any expression of association the other way. The Karnofsky machine was quite independent of EA, with an obviously different set of inspirations and loyalties if you looked closely.

Anonymous 0x320
said (7mo ago #2027):

>>2025
Yeah. Arillaga-Andreessen was never formally affiliated with Open Philanthropy etc, but the intellectual descent is extremely clear. According to the rumors I heard at the time, Tuna was very important to setting up Moskovitz's charity empire in the early days and making the alliance with Karnofsky. I've heard little about her in recent years, so my guess is that she stepped back as Karnofsky stepped up.

Anonymous 0x318
said (7mo ago #2033), referenced by >>2035:

>>2021
Is there a way I can read that article?

Anonymous 0x31d
said (7mo ago #2035):

>>2033
Yes, when it is done.

Anonymous 0x34f
said (6mo ago #2096), referenced by >>2099:

I'll caveat that I haven't done much reading of the EA philosophers, but I have known quite a few EAs over the last ten years. My take is that EA offers talented young people the ability to 'make an impact', i.e. have power, very quickly. Almost any type of activity can be classified as 'effective' with just the right intellectual scaffolding. You can see this in how EAs were mostly focused on malaria bednets in the early days and quickly branched out to other activities, including political donations and funding AGI research. It turns out that the most important ingredient in becoming 'effective' is power. EA ideology does not discourage this power-seeking at all, nor is it very self-aware about it, which provides fertile ground for the power-seeking aspects of our psyche to gravitate towards it.

Anonymous 0x350
said (6mo ago #2099), referenced by >>2102:

>>2096
I've been in the EA scene from the beginning on the xrisk side, and I think you're mostly right. The subjective experience of 'impact' is having a bunch of smart comrades selflessly pouring their effort and resources into outcomes they think are important and ambitious but achievable. It's glorious, frankly, specific content aside.

What popped me out is that ultimately I was radicalized through and beyond rationality into philosophy. The whole xrisk transhumanist AI narrative is a fantastic provocation to think very hard about some very uncorrelated thoughts. Taking that provocation seriously, though, undermines a lot of the simplistic utilitarian marginalism present in those crowds. You end up doing a deeper kind of epistemology than the Bayesian one, and a more serious form of perspectival morality than "shut up and multiply".

Anyways, I got bored intellectually, and then EA turned into a recruiting platform for boring lobbying careers, which didn't appeal to me. The weird social behaviors (e.g. polyamory, deliberate neoteny, etc.) were also eventually a big turn-off. Oh, and of course I met some of the top people like Holden, who revealed themselves to think of themselves primarily as Machiavellian technocratic democrats in the Big Foundations tradition, more than anything else, even EA. This was clear from the decor of the Open Philanthropy/GiveWell offices. Holden told us something like "we can't say this in public but I want to flood America with as many low-skill immigrants as possible, preferably Haitian" with an Israeli flag and a bunch of other flags from fake American-empire countries (e.g. Kosovo) displayed prominently behind him while he said it. It was like something out of a /pol/ meme. To be fair, the logic of this was something about the marginal net quality-of-life and net economic improvement being highest when moving people from the worst countries into the best countries. So there's utilitarianism revealed in its essence: a great leveling with utter disregard for the generators of quality. I don't even agree with the calculation within the utilitarian framework, but the moral perspective itself is entirely wrong, so I don't hang out with those circles any more.

Anonymous 0x352
said (6mo ago #2102):

>>2099
> Taking that provocation seriously though, it undermines a lot of the simplistic utilitarian marginalism present in those crowds. You end up doing a deeper kind of epistemology than Bayesian ...

Right, you can't seriously probe and weigh these questions without getting into questions of ends that are ultimately not utilitarian.

> ... people like Holden, who revealed themselves to think of themselves primarily as Machiavellian technocratic democrats in the Big Foundations tradition ...

This is an instance of the broader phenomenon of utilitarianism unconsciously serving as cover for psychological motives that are very human but not necessarily well-considered or good.
