Five theses on the dangers of moral category errors
If you have writer’s anxiety, it serves the world really well: this is the most readable text I have ever read on the subject (and I'm not even an EA enthusiast).
This was an extremely good and insightful article about a topic I've read a ton about. Really excited for this blog.
I agree there's a lot of overlap between conservative libertarianism and long-termism, though few of the latter seem to realize it.
For example, long-termists should favor high drug prices to the degree they incentivize medical R&D (a point Tyler Cowen has made repeatedly), yet I would be surprised if the average EA were not in favor of drug price controls.
This doesn't ring very true with my impressions of EA. A lot of these statements are extreme characterizations. But maybe I don't know enough EA people, despite considering myself one.
The truest characterization was the tendency within EA to debate abstract concepts that, in the end, really aren't that important. To some degree this is just ineffectiveness, but to some degree it's because others keep trotting out topics like the "Trolley Problem" as somehow useful analogies for why we should ignore logic, and EAs can't help but feel the need to defend themselves without full capitulation.
But in the end, it's pretty much useless to all sides. Yes, if you truly believe the premise of the trolley problem and don't allow for any other complexity of the world, you should pull that lever. But none of that applies in any way to the real world, and it's a fool's game to be trapped by it. The EAs I know think institutions, laws, common sense, and psychology are important, and that respect for them is also important, contrary to the logical conclusion of the naïve utilitarian trope.
It's pretty obvious this is true, but it's ignored pretty consistently whenever someone wants to bash EA. The reason is often no more than that someone wants a counterargument, and the best way to get one is to ignore that fact. I have compassion for those who get caught in this snare. Of course they're wrong: the right answer to the trolley problem isn't to display, to those looking for a reason to dismiss you, the very thing they want; it's to note that the problem itself is flawed in about a dozen ways.
One of the main ones is that the trolley problem requires you to put aside reservations about the "what if I don't?" and the sense of hubris attached to our own actions in order to choose to pull that lever. But just because someone can engage with that unrealistic scenario and play the game laid out for them, rather than bringing in the baggage others apply, doesn't in any way mean they lack that baggage, or that they are unwilling to use it in contexts where it's appropriate. A wariness of hubris is a great thing, and I think you'd have a solid case that SBF had not much of it. But this is hardly a good reason to assume all EAs are similar, especially given that such a lack has been spotted in the wild, associated with all kinds of tragedies similar to FTX that have nothing to do with EA.
Anyhow, I do see the irony that in my defense here I've somewhat been caught responding to the very traps I warn against. But then again, I did say it would happen, and you did set the trap, whether you realized it or not.
I'd end by asking you whether you really believe the mass of people who label themselves as EAs, or who have EA at the core of some efforts they are making, deserve the labeling you've applied here. Your title makes it sound like you're splitting hairs, but the words are harsh.
So, you're not (exactly) an Effective Altruist as you describe it. But then again, who is?
> Korea stands out for its threadbare pension system and thus high rate of elder poverty. You may not like it, but this is what peak longtermism looks like
> My own contribution to this debate is to argue that, contra the growth-equity trade-off, robust social insurance programs are both a condition and accelerant of sustainable economic growth.
So is Korea's growth hobbled by its threadbare pension system?