This is the weekly visible open thread. Post about anything you want, ask random questions, whatever. ACX has an unofficial subreddit, Discord, and bulletin board, and in-person meetups around the world. 95% of content is free, but for the remaining 5% you can subscribe here. In other news:

1: Like many of you, I’ve been following the FTX disaster. My thoughts go first of all to all the depositors who lost money, and second of all to the people of the Bahamas who have reason to worry their economy will suffer. But the only category I have any special insight into is the charities that were reliant on FTX funding. FTX kind of went around to a bunch of charities saying “What could you do with twice as much money? With ten times as much money? Do it! We’ll give you the cash!” and then the charities did it, and now it looks like they will not get the cash. A lot of people in nonprofits are going to lose their jobs, and some people are worried they’re going to have to go into debt giving back funding that they’ve already spent (if this might be you, see here but also here). I hope everyone involved is okay, or as okay as it’s possible to be in situations like these.

2: The past year has been a terrible time to be a charitable funder, since FTX ate every opportunity so quickly that everyone else had trouble finding good not-yet-funded projects. But right now is a great time to be a charitable funder: there are lots of really great charities on the verge of collapse who just need a little bit of funding to get them through. I’m trying to coordinate with some of the people involved. I haven’t really succeeded yet, I think because they’re all hiding under their beds gibbering - but probably they’ll have to come out eventually, if only for food and water. If you’re a potential charitable funder interested in helping, and not already connected to this project, please email me at scott@slatestarcodex.com. I don’t want any affected charities to get their hopes up, because I don’t expect this to fill more than a few percent of the hole, but maybe we can make the triage process slightly less of a disaster.

3: I have no idea what’s going to happen with ACX Grants now. Some of the infrastructure I was hoping to use was being funded by the FTX Foundation and may no longer exist. It might or might not be more important to use all available funding to rescue charities that are about to go under after losing FTX support. I still want to do something, because of the increased need and urgency mentioned above, but give me a while to hide under my bed and gibber before I sort out specifics.

4: None of last year’s ACX Grants were funded by the FTX Foundation or anyone else linked to FTX, so if this is you, don’t worry.

5: In light of recent events, some people have asked if effective altruism approves of doing unethical things to make money, as long as the money goes to good causes. I think the movement has been pretty unanimous in saying no. Obviously everyone is condemning it super-strongly now. But people have also been condemning it, consistently, since the beginning of the movement. For example, Will MacAskill is as close as EA has to a leader, and he wrote in 2017:

We believe that in the vast majority of cases, it’s a mistake to pursue a career in which the direct effects of the work are seriously harmful, even if the overall benefits of that work seem greater than the harms.

Eliezer Yudkowsky has also been writing eloquently about this for over a decade, including in Ends Don’t Justify Means (Among Humans):

“The end does not justify the means” is just consequentialist reasoning at one meta-level up. If a human starts thinking on the object level that the end justifies the means, this has awful consequences given our untrustworthy brains; therefore a human shouldn’t think this way.

And I tried to make the same point in Axiology, Morality, Law (2017), where I said:

In [just societies], the universally-agreed priority is that law trumps [personal] morality, and morality trumps axiology [ie consequentialist reasoning]. First, because you can’t keep your obligations to your community from jail, and you can’t work to make the world a better place when you’re a universally-loathed social outcast. But also, because you can’t work to build strong communities and relationships in the middle of a civil war, and you can’t work to make the world a better place from within a low-trust defect-defect equilibrium. But also, because in a just society, axiology wants you to be moral (because morality is just a more-effective implementation of axiology), and morality wants you to be law-abiding (because law is just a more-effective way of coordinating morality). So first you do your legal duty, then your moral duty, and then if you have energy left over, you try to make the world a better place.

I later discussed whether there could be exceptions to this (like “what if by giving one person a paper cut you could end all disease forever?”) but I hope that it’s still clear that in most normal situations following the rules is the way to go. This isn’t super-advanced esoteric stuff. This is just rule utilitarianism, which has been part of every discussion of utilitarianism since John Stuart Mill in the 1800s. See also The Dark Rule Utilitarian Argument For Science Piracy.

6: Some other people are asking whether this happened because utilitarians should have infinite appetite for risk, eg the St. Petersburg paradox. I think a good first answer would be something like “you shouldn’t even consider this question unless you are following the deontological rules and avoiding unethical externalities, which seems really hard to do as the amount that you’re risking gets higher”. But here are two specific points beyond that.

First, it’s true that $20K buys twice as many bed nets as $10K. But most of what FTX was funding wasn’t bed nets - it was things like medical research, or lobbying, or AI research labs. The effectiveness of these things probably follows a power law distribution - your first dollar funds an amazing lobbying organization run by superstars, your hundred millionth dollar funds a so-so lobbying organization scraping the bottom of the barrel, and your ten billionth dollar funds a hobo with the word “LABIYIST” scrawled on his shirt. I think FTX’s money was already getting near this point, at least for the short term; I would have preferred a 100% chance that the charitable ecosystem kept the FTX money it had to a 50% chance of 10x more and a 50% chance of zero.
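
To make the diminishing-returns point concrete, here’s a toy model - mine, with made-up numbers; the exponent and dollar figure below are pure illustration, not estimates of anything real - showing how a sure thing can beat a gamble with five times the expected dollars:

```python
# Toy model with made-up numbers: suppose charitable impact grows as a
# power law in money, impact ~ money**ALPHA with ALPHA < 1 (diminishing
# returns). ALPHA = 0.25 and the $100M figure are illustrative only.

ALPHA = 0.25

def impact(money: float) -> float:
    """Power-law model of how much good a given amount of money does."""
    return money ** ALPHA

x = 100_000_000  # hypothetical amount already committed to charity

sure_thing = impact(x)                            # 100% chance of keeping x
gamble = 0.5 * impact(10 * x) + 0.5 * impact(0)   # 50% of 10x, 50% of zero

print(f"sure thing: {sure_thing:.1f}")   # 100.0
print(f"gamble EV : {gamble:.1f}")       # ~88.9
# 0.5 * 10**0.25 is about 0.89, so the gamble produces ~11% less expected
# impact than the sure thing, despite having 5x the expected dollars.
```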

Second, if you St. Petersburg yourself a bunch of times and lose everything, it’s going to be really hard to pat yourself on the back for a job counterfactually well done and walk away. More likely you’re going to panic and start grasping for unethical schemes that let you escape doom. So real-world St. Petersburg isn’t “50% chance of doubling your money, 50% chance of zero”, it’s “50% chance of doubling your money, 50% chance of getting put in a psychologically toxic situation where you’ll face almost irresistible pressure to do crazy things that will have vast negative impact.” And the only really effective way to resist temptation is to avoid getting into situations where you’re strongly tempted to do bad things. I wouldn’t have thought about it this way before recent events, but now that they’ve happened it seems obviously true. This is what Eliezer means by “running on corrupted hardware” - either you follow the deontological rules without knowing exactly why they apply in your particular case, or you try doing the seemingly-reasonable act-utilitarian thing and get to learn why it was wrong after you’ve destroyed everything.
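
The repeated-gamble point can also be simulated with made-up numbers: give the bet a small positive edge (51% here, purely illustrative) and take it over and over. The expected value grows every round, but almost every run ends at zero.

```python
import random

# Toy simulation with a made-up 51% edge: a favorable double-or-nothing
# bet, taken repeatedly, almost surely ends in ruin even though the
# expected value grows every round.

P_WIN, ROUNDS, TRIALS = 0.51, 10, 100_000

def play(bankroll: float) -> float:
    """Bet everything ROUNDS times; return the final bankroll."""
    for _ in range(ROUNDS):
        if random.random() < P_WIN:
            bankroll *= 2   # win: the stake doubles
        else:
            return 0.0      # lose: everything is gone
    return bankroll

results = [play(1.0) for _ in range(TRIALS)]

print(f"mean payoff    : {sum(results) / TRIALS:.2f}")  # ~1.02**10 ≈ 1.22
print(f"fraction ruined: {sum(r == 0.0 for r in results) / TRIALS:.2%}")
# Roughly 1 - 0.51**10 ≈ 99.88% of runs end with nothing.
```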

Still, I’m reluctant to center the St. Petersburg narrative here. Hundreds of other crypto projects have proven fraudulent and gone bust without us needing to appeal to exotic branches of philosophy. SBF is a semi-mythical figure. It would feel appropriate if his downfall were for properly mythical reasons, like a deep commitment to literal St. Petersburg. But I think in the end it will probably have at least as much to do with the normal human vices that we all have to struggle against.

7: Some people are asking whether people who accepted FTX money should have “seen the red flags” or “done more due diligence”. Sometimes this is from outsider critics of effective altruism. More often it’s been effective altruists themselves, obsessively beating themselves up over dumb things like “I met an FTX employee once and he seemed to be frowning, why didn’t I realize that this meant they were all committing fraud?!” Listen: there’s a word for the activity of figuring out which financial entities are better or worse at their business than everyone else thinks, maximizing your exposure to the good ones, and minimizing your exposure to the bad ones. That word is “finance”. If you think you’re better at it than all the VCs, billionaires, and traders who trusted FTX - and better than all the competitors and hostile media outlets who tried to attack FTX on unrelated things while missing the actual disaster lurking below the surface - then please start a company, make $10 billion, and donate it to the victims of the last group of EAs who thought they were better at finance than everyone else in the world. Otherwise, please chill.

True, there are other people outside of finance who are supposed to look out for this kind of thing. Investigative reporters. Congress. The SEC. But the leading US investigative reporting group took $5 million from SBF. Congressional Democrats took $40 million from SBF in midterm election money. The SEC was in the process of allying with SBF to anoint him as the face of legitimate well-regulated crypto in America. You, a random AI researcher who tried Googling “who are these people and why are they giving me money” before accepting a $5,000 FTX grant, don’t need to feel guilty for not singlehandedly blowing the lid off a conspiracy that all these people missed. This is true even if a bunch of pundits who fawned over FTX on its way up have pivoted to posting screenshots of every sketchy thing they ever did and saying “Look at all the red flags!”

8: Disclosure of all my own FTX connections and conflicts-of-interest just so it doesn’t look like I’m hiding anything: I’ve never taken money from FTX as an organization. In January 2021, a few FTX employees including SBF bought subscriptions to this blog at way above sticker price; I will earmark that money for the rescuing-collapsing-charities project (I don’t generally dox people who subscribe to this blog without asking them first, but in this case it was public knowledge). I did some work helping the FTX Foundation find good charities to donate to; I was offered compensation but declined.

My emotional conflict of interest here is that I’m really f#%king devastated. I never met or communicated with SBF, but I was friendly with another FTX/Alameda higher-up around 2018, before they moved abroad. At the time they seemed like a remarkably kind, decent, and thoughtful person, and I liked them a lot. I desperately want to believe they didn’t know about the fraud, but it seems really implausible. If they did, then I genuinely have no idea what happened, and I hope the investigation finds some reasonable explanation, like that they were doing so many stimulants and psychedelics that the DMT entities were piloting their body like an anime mech. I probably shouldn’t exactly say “I hope they’re okay” when there are so many victims who deserve okayness more. But I hope there’s some other world-branch where they never got involved in any of this and they’re living their best life and doing lots of good, and I hope the version of me in that world branch is giving them the support and reassurance that I can’t give them here.

More generally, I trusted and looked up to the FTX/Alameda people. I didn’t actually keep money in FTX, but I would have if there had been any reason to; I didn’t actually tell other people they should trust FTX, but I would have if those other people had asked. Lower your opinion of me accordingly.

9: The past few days I’ve been thinking a lot of thoughts along the lines of “how can I ever trust anybody again?”. So I was pleased when Nathan Young figured out the obvious solution: make a list of everyone I’ve ever trusted or considered trusting, make prediction markets about whether any of them are committing fraud, then pre-emptively be emotionally dead to anybody who goes above a certain threshold. You can find some preliminary markets here, although I have nitpicks about the exact questions. If anyone ever goes above 33% on one of those markets in anything other than a short-term blip, I’ll either sever all ties with them, or at least write a public post presenting my explanation for why I’m not doing that despite the risk. [UPDATE: Some people took this too seriously so Nathan deleted it]

10: Sorry for cramming all of this into an Open Thread. I’m not really sure why I’m doing it this way, except maybe feeling like if it isn’t a real post then it’s not real and I can continue mashing the “DENIAL” button on my subconscious. This is still an Open Thread. Talk about whatever you want. This isn’t Challenge Mode. I’ll continue to experiment with doing that on Wednesdays only.