"Effective Altruism", something I find myself aligned with but not to the extremes taken by others.
Effective Altruism is such an interesting title. Almost no one views their Altruism as ineffective. The differentiator is what makes their flavor of Altruism effective, but that's not in the title. It would be like calling the movement "real Altruism" or "good Altruism".
A better name might be "rational Altruism", since in practice these people come from the rationalist movement and are doing Altruism, or what they feel is Altruism. But the "rationalist" title suffers from similar problems.
I suppose in the beginning, it was about finding ways to measure how effective different altruistic approaches actually are and focusing your efforts on the most effective ones. Effective then essentially means how much impact you are achieving per dollar spent. One of the more convincing ways of doing this is looking at different charitable foundations and determining how much of each dollar you donate to them actually ends up being used to fix some problem and how much ends up being absorbed by the charitable foundation itself (salaries etc.) with nothing to show for it.
They might have lost the plot somewhere along the line, but the effective altruism movement had some good ideas.
> One of the more convincing ways of doing this is looking at different charitable foundations and determining how much of each dollar you donate to them actually ends up being used to fix some problem and how much ends up being absorbed by the charitable foundation itself (salaries etc.) with nothing to show for it.
Color me unconvinced. This will work for some situations. At this point, it's well known enough that it's a target that has ceased to be a good measure (Goodhart's Law).
The usual way to look at this is to look at the percentage of donations spent on administrative costs. This makes two large assumptions: (1) administrative costs have zero benefit, and (2) non-administrative costs have 100% benefit. Both are wildly wrong.
A simple counterexample: you're going to solve hunger. So you take donations, skim 0.0000001% off the top for your time because "I'm maximizing benefit, baby!", and use the rest to purchase bananas. You dump those bananas in a pile in the middle of a homeless encampment.
There are so many problems with this, but I'll stick with the simplest: in two weeks, you have a pile of rotten bananas and everyone is starving again. It would have been better to store some of the bananas and give them out over time, which requires space and maybe even cooling to hold inventory. That costs money, and that money is not directly fixing the problem.
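The contrast between the two metrics can be sketched with made-up numbers. Everything here (the charity figures, the "people helped" counts) is hypothetical, purely to show how the naive overhead ratio can rank a zero-administration charity as "best" even when its actual impact per dollar is far worse:

```python
def overhead_ratio(admin_cost, total_donations):
    """The naive metric: fraction of donations absorbed by administration."""
    return admin_cost / total_donations

def outcome_per_dollar(people_helped, total_donations):
    """Impact actually delivered per dollar donated."""
    return people_helped / total_donations

# Charity A: spends nothing on administration, but poor logistics waste aid
# (the banana-pile approach).
a = {"donations": 100_000, "admin": 0, "helped": 500}
# Charity B: spends 20% on administration (storage, cooling, staff),
# so the aid actually lands.
b = {"donations": 100_000, "admin": 20_000, "helped": 2_000}

print(overhead_ratio(a["admin"], a["donations"]))       # 0.0  -> "best" by the naive metric
print(overhead_ratio(b["admin"], b["donations"]))       # 0.2
print(outcome_per_dollar(a["helped"], a["donations"]))  # 0.005
print(outcome_per_dollar(b["helped"], b["donations"]))  # 0.02 -> 4x the impact per dollar
```

By the overhead ratio, Charity A looks unbeatable; by impact per dollar, it is the clear loser.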
There are so many examples of feel-good world-saving that end up destroying communities and cultures, fostering dependence, promoting corruption, propping up the institutions that cause the problem, etc.
Another analogy: you make a billion dollars and put it in a trust for your grandchild to inherit the full sum when they turn 16. Your efficiency measure is at 100%! What could possibly go wrong? Could someone improve the outcome by, you know, administering the trust for you?
Smart administration can (but does not have to) increase effectiveness. Using this magical "how much of each dollar... ends up being used to fix some problem" metric is going to encourage ineffective charities and deceptive accounting.
This is a super fair summary and has shifted my thinking on this a bit, thanks.
>Almost no one views their Altruism as ineffective
As someone who has occasionally given money to charities for homelessness and the like, I don't really expect it to fix much. It's more the thought that counts.
The vast majority of non-EA charity givers do not expend effort on trying to find the most dollar-efficient charities (or indeed push for quantification at all), which makes their altruism ineffectual in a world with strong competition between charities (where the winners are inevitably those who spend the most on acquiring donations).
Do you really think all altruism is effective? Caring about the immediate well-being of others is not as effective as thinking in the long term. The altruism you are describing is misguided altruism, which ultimately hurts more than it helps, while effective altruism goes beyond the surface-level help in ways that don't enable self-destructing behaviours or that don't perpetuate the problem.
No, I think almost all people doing altruism at least think what they are doing is effective. I totally get that the EA people believe they have found the one true way, but so do others. Even if EA is correct, it just makes talking about it confusing. Imagine if Darwin had called his theory "correct biology".
Technically lesswrong is about rationalists not effective altruists, but you're right in a sense that it's the same breed.
They think that the key to scientific thinking is to forego the moral limitations, not to study and learn. As soon as you're free from the shackles of tradition you become 100% rational and therefore 100% correct.
Approximately no one in the community thinks this. If you can go two days in a rationalist space without hearing about "Chesterton's Fence", I'll be impressed. No one thinks they're 100% rational, nor that this is a reasonable aspiration. Traditions are generally regarded as sufficiently important that a not-small amount of effort has gone into trying to build new ones.

Not only is it the case that no one thinks that anyone, including themselves, is 100% correct, but the community norm is to express credence in probabilities and convert those probabilities into bets when possible. People in the rationalist community constantly, loudly, and proudly disagree with each other, to the point that this can make it difficult to coordinate on anything. And everyone is obsessed with studying and learning, and constantly trying to come up with ways to do this more effectively.
Like, I'm sure there are people who approximately match the description you're giving here. But I've spent a lot of time around flesh-and-blood rationalists and EAs, and they violently diverge from the account you give here.
So much vitriol. I understand it's cool to hate on EA after the SBF fiasco, but this is just smearing.
The key to scientific thinking is empiricism and rationalism. Some people in EA and lesswrong extend this to moral reasoning, but utilitarianism is not a pillar of these communities.
Empiricism and rationalism both tempered by a heavy dose of skepticism.
On the other hand, maybe that is some kind of fallacy itself. I almost want to say that "scientific thinking" should be called something else. The main issue being the lack of experiment. Using the word "science" without experiment leads to all sorts of nonsense.
A word that means "scientific thinking as much as possible without experiment" would at least embed a dose of skepticism in the process.
The Achilles heel of rationalism is the descent into modeling complete nonsense. I should give lesswrong another chance I suppose because that would sum up my experience so far, empirically.
EA to me seems like obvious self serving nonsense. Hiding something in the obvious to avoid detection.
That community is basically the "r/iamverysmart" types bringing their baggage into adulthood. Almost everything I've read in that sphere is basically Dunning–Kruger to the nth degree.
Note that these people often condescendingly refer to themselves as "rationalists," as if they've unlocked some higher level of intellectual enlightenment which the rest of us are incapable of achieving.
In reality, they're simply lay people who synthesize a lot of garbage they find on the Internet into overly verbose pseudo-intellectual blog posts filled with both the factual inaccuracies of their source material and new factual inaccuracies that they invent from whole cloth.