Thanks to Chris Kavanagh, who wrote an extremely kind and reasonable comment in response to my Contra Kavanagh on Fideism and made me feel bad for yelling at him. I’m sorry for my tone, even though I’m never going to get a proper beef at this rate.

Now that I’m calmed down, do I disagree with anything I wrote when I was angrier?

Chris was too nice to really defend himself, but a few other people posted what I think of as partial arguments for the position I mocked as “fideism”. For example, Scott Aaronson:

This is a great post that contains a lot of truth. And yet … I also see a grain of truth in Kavanagh’s position. Like, I get emails every single day from P=NP crackpots and quantum mechanics crackpots and now AI crackpots too. Some of them probably would be better off never trying to think for themselves again, and just Trusting Science and Trusting the Experts. Sure, the experts are sometimes confidently wrong, but not as consistently so as the crackpots are! And for my part, I can’t possibly write 25,000 words to explain why each and every crackpot is wrong. As a matter of survival, I have to adopt a Kavanagh-like heuristic: “this person seems like an idiot.”

Alexander:

I liked your posts on ivermectin, but I do think there is a genuine cost to posts like them, which Kavanagh seems to me to be at least hinting at.

When you take conspiracy theorists’ arguments seriously, it implies a higher prior on conspiracy theories than when you dismiss them out of hand. This can lead to your readers (consciously or not) increasing their priors on conspiracy theories and being more likely to believe future conspiracy theories they come across.

If their prior on conspiracies were not previously too low, this is a relevant cost.

Maybe I’m being too charitable when I mentally translate the statements ‘pro-mainstream-anti-conspiracy people’ make as pointing at this issue, but I do think the issue is real.

Not saying the benefit isn’t worth the cost. Not saying the ‘pro-mainstream-anti-conspiracy people’ do a good job of pointing out that cost or doing any sort of cost-benefit analysis.

Just saying the cost exists, and is not entirely irrelevant. There probably exist some conspiracy theories it would be actively harmful for you to publicly take seriously because that cost would outweigh the benefit of practicing forming opinions.

I.

In thinking about these kinds of questions, I find it helpful to consider three reflexive naive positions towards conspiracy theories (and cults, and misinformation, and general false ideas). All of these are caricatures, but hopefully they’ll help refine the borders of the debate:

Idiocy: Conspiracy theories are a thing dumb people sometimes fall for. If you understand that facts require evidence, and you’re not a Nazi trying to explain why the Jews caused 9/11, then there’s basically no chance you’ll believe one. You mostly have to stay away from outright lies - for example, someone making up a story about a Jew confessing to causing 9/11 - which is easy to do, because you can just fact-check these.

Intellect: There is no difference between conspiracy theories and any other theories, except that the conspiracy theories are worse. There are some theories that the smartest experts give 50-50 odds of being true, like “high wages caused the Industrial Revolution”. There are some theories that the smartest experts give 10-90 odds of being true, like “endocrine disruptors are a major cause of rising LGBTQ identification”. And there are some theories that the smartest experts give 0.001-99.999 odds of being true, like “the Illuminati singlehandedly caused the French Revolution”. All of these theories should be treated in approximately the same way: by intellectuals discussing difficult questions - and sometimes, if they’re not smart enough to be up to the task, coming to the wrong answer.

Infohazard: Conspiracy theories are deadly traps that lie in wait for you, even if you’re smart. If you stumble on one unprepared, it will eat you up, turn you into an adherent, and leave you and society worse off. You should exercise standard infohazard precautions around them, like putting wax in your ears if you’re passing through somewhere you might hear them discussed, or tying yourself to the mast if you’re out of wax. If you have neither wax nor a mast, you can usually come out unscathed by reciting “trust experts . . . trust experts . . . trust experts” over and over as a mantra.

One advantage of the Idiocy perspective is that it makes conspiracy theories low status. Most people don’t want to seem like idiots; if their friends think anyone who believes in conspiracy theories is an idiot, they’ll take extra care to stay away from them.

But a disadvantage - one I find overwhelming - is that when you do come across a conspiracy theory, you’re totally blindsided by it. Since you “know” conspiracy theories only sound convincing to idiots, and you “know” you’re not an idiot, this convincing thing you just heard can’t be a conspiracy theory! It must be a legitimately true thing that Big Pharma is suppressing! Everyone knows Big Pharma sometimes suppresses stuff, that’s not a . . .

This is why I stress, again and again, that good conspiracy theories have lots of convincing-sounding evidence in their favor, and may sound totally plausible to a smart person reasoning normally. When people shrug off conspiracy theories easily, it’s either because the conspiracy theory isn’t aimed at them - the equivalent of an English speaker feeling smug for rejecting a sales pitch given entirely in Chinese - or because they’re biased against the conspiracy theory with a level of bias which would also be sufficient to reject true theories. Sure, everything went well this time - they were able to resist believing the theory - but one day they’ll encounter a sales pitch in English, on a topic where it accords with their biases. Then they’ll be extra-double-doomed because of how sure they are that they’re immune to propaganda.

When people criticize me, they act like I’m 100% taking the Intellect perspective. I admit I have some sympathies in that direction. Ivermectin is an especially clear case: for a while, most doctors and epidemiologists suspected that it worked, because there were impressive studies in its favor. Then those impressive studies were gradually found to be flawed or fraudulent, better studies gradually came out showing that it didn’t work, and the experts gradually shifted to doubting it. At what point in this process - which second of which day - did it switch from plausible-but-false scientific theory to conspiracy theory? Obviously there’s no single moment (cf. philosophy of science’s long failure to solve the demarcation problem). So the difference between a good scientific theory and a conspiracy theory lies on a spectrum.

But I think this meshes just fine with the Infohazard perspective. There are many arguments, very closely resembling correct arguments, that play on various biases and subtle errors of reasoning, and end up being unfairly convincing. I like to call biases “cognitive illusions”, by analogy to optical illusions, which can also be unfairly convincing:

[image: the checker-shadow illusion, drawn with two chess sets]

This is my favorite illusion. The top and bottom chess sets are the same color, and only look black vs. white because of contrast effects. This one is harmless, because it affects everyone equally, nobody cares about it too much, and you can easily check via Paint or Photoshop or something. The Infohazard perspective claims conspiracy theories are potentially this convincing, but in a much more pernicious way: they only hit some people (not necessarily the dumb ones!), and they subvert the checking process so that it appears to give pro-conspiracy results (see Trapped Priors).

All factual claims can become the basis for emotional/social coalitions. I wrote here about how an extremely pointless question - whether Abu Bakr or Ali should have been political leader of the Arabian empire in 632 AD - produced the Sunni/Shia split, whose different sides went on to develop different political systems, aesthetics, and philosophies, and to hate each other even today. It’s easy for a scissor statement like “is the chess set black or white?” to become the basis for a social/political movement, which then evolves the anti-epistemology necessary to protect its own existence (I’m still in awe of the way ivermectin advocates have made “small studies are more trustworthy than big studies” sound like a completely reasonable and naturally-arrived-at position).

I agree that everyone (including smart people) needs to be constantly vigilant against this possibility, and that any suggestion otherwise risks placing a stumbling block before the blind.

II.

Where I differ from Alexander is something like - quick analogy - there used to be a thing where some therapists would avoid asking patients if they were suicidal, because they didn’t want to “plant the idea” in their heads. People would argue that you shouldn’t talk at length about the reasons for and against suicide, because that was highlighting it as an option, or dignifying it with a response. Most studies have since weighed in against this perspective. Depressed people aren’t idiots. They are aware that committing suicide is an option. You will never be able to suppress all knowledge of suicide’s existence, and “suddenly triggering the latent knowledge” isn’t a thing. Talking about it openly just means . . . it can be talked about openly.

We currently live in a world where:

  • There are big studies in prestigious journals finding that ivermectin works.

  • There are open letters from well-respected critical care doctors saying that ivermectin works.

  • The medical guidelines of several countries recommend that doctors in those countries use ivermectin.

  • Several of the top universities and hospitals in the world have done studies on whether or not ivermectin works.

  • Experts have testified before Congress saying that ivermectin works.

  • Several United States Senators have stated, on national television, that ivermectin works.

Consider the possibility that the cat is already out of the bag, and that me writing a negative article against ivermectin on ACX isn’t going to extract the cat any further. “C’mon, bro, just one more chance, bro, denying it oxygen will totally work this time, bro, please, just one more chance!” At some point, you have to acknowledge that people who want to hold up examples of people taking ivermectin seriously can already point to the critical care doctors and senators and guideline-makers, and that maybe the time has come to start arguing against it in some way.

Eliezer Yudkowsky’s position is Let Them Debate College Students. I’m not a college student, but I’m not Anthony Fauci either, and I am known for blogging about extremely dignified ideas like the possibility that the terrible Harry Potter fanfiction My Immortal is secretly an alchemical allegory. I haven’t seen ivermectin advocates using “Scott takes this seriously enough to argue against it!” as an argument, and I have seen them getting angry about it and writing long responses trying to prove me wrong. Sometimes they have used me getting some points wrong as a positive argument, and I would be open to the claim that I failed by not arguing well enough to deny them that opening - but nobody has been making that argument, and if they did, it would imply that people smarter than me should take over the job, which I endorse.

III.

I worry Scott Aaronson thinks I’m saying you shouldn’t trust the experts, and instead you should always think for yourself. I’m definitely not trying to say that.

I’ve tried to be pretty clear that I think experts are right remarkably often, by some standards basically 100% of the time - I realize how crazy that sounds, and “by some standards” is doing a lot of the work there, but see Learning To Love Scientific Consensus for more. Bounded Distrust also helps explain what I mean here.

I also try to be pretty clear that reasoning is extremely hard, that it’s very easy to get everything wrong, and that if you try it the default outcome is to get everything wrong and humiliate yourself. I describe that happening to me here, and presumably it also happens to other people sometimes.

What I do think is that “trust the experts” is an extremely exploitable heuristic, which leads everyone to put up a veneer of “being the experts” and demand that you trust them.

I come back to this example again and again, but only because it’s so blatant: the New York Times ran an article saying that only 36% of economists supported school vouchers, with a strong implication that the profession was majority against. If you checked their sources, you would find that actually, it was 36% in favor, 19% against, 46% unsure or not responding - among the economists who actually took a position, supporters outnumbered opponents nearly two to one. If you are too quick to seek epistemic closure because “you have to trust the experts”, you will be easy prey to people misrepresenting what the experts are saying.

I come back to this example less often, because it could get me in trouble, but when people do formal anonymous surveys of IQ scientists, they find that most of them believe different races have different IQs and that a substantial portion of the difference is genetic. I don’t think most New York Times readers would identify this as the scientific consensus. So either the surveys - which are pretty official and published in peer-reviewed journals - have managed to compellingly misrepresent expert consensus, or the impressions people get from the media have, or “expert consensus” is extremely variable and complicated and can’t be reflected by a single number or position.

And I genuinely think this is part of why ivermectin conspiracies took off in the first place. We say “trust science” and “trust experts”. But there were lots of studies that showed ivermectin worked - aren’t those science? And Pierre Kory MD, a specialist in severe respiratory illnesses who wrote a well-regarded textbook, supports it - isn’t he an expert? Isn’t it plausible that the science and the experts are right, and the media and the government and Big Pharma are wrong? This is part of what happens when people reify the mantras instead of using them as pointers to more complicated concepts like “reasoning is hard” and “here are the 28,491 rules you need to keep in mind when reading a scientific study.”

IV.

All of this still feels rambly and like it’s failing to connect. Instead, let me try describing exactly what advice I would give young people opening an Internet connection for the first time:

You are not immune to conspiracy theories. You have probably developed a false sense of security by encountering many dumb conspiracy theories and feeling no temptation to believe them. These theories were designed to trap people very different from you; others will be aimed in your direction. The more certain you are of your own infallibility, the less alert you will be, and the worse your chances. The ones that get you won’t look like conspiracy theories to you (though they might to other people).

When you run into conspiracy theories you don’t believe, feel free to ignore them. If you decide to engage, don’t mock them or feel superior. Think “there, but for the grace of God, go I.” Get a sense of what the arguments for the conspiracy theory look like - not from skeptics trying to mock them, but from the horse’s mouth - so you know what convincing false arguments look like. Ask yourself what habits of mind it would have taken the people affected by the theory to successfully resist it. Ask yourself if you have those habits of mind. Yes? ARE YOU SURE?

To a first approximation, trust experts over your own judgment. If people are trying to confuse you about who the experts are, then to a second approximation trust prestigious people and big institutions, including professors at top colleges, journalists at major newspapers, professional groups with names like the American __ Association, and the government.

You might ask: Don’t governments and other big institutions have biases? Won’t they sometimes be wrong or deceptive? And even if you’ve lucked into the one country and historical era where the government 100% tells the truth and the intellectuals have no biases, doesn’t someone need to keep the flame of suspicion alive so that it’s available to people in other, less fortunate countries and eras?

The answer is: absolutely, yes, but also this is how conspiracy theories get you. They will claim that they are the special case where you need to take up the mantle of Galileo and Frederick Douglass and Jane Jacobs and all those people who stood up to the intellectual authorities and power structures of their own time. The whole point of “you are not immune to conspiracy theories” is that the evidence for them can sound convincing because something like it is sort of true. This is equally so for second-level claims like “prestigious institutions are fallible and biased”. Probably something like “make a principled precommitment never to disagree with prestigious institutions until you are at least 30 and have a graduate degree in at least one subject” would be good advice, but nobody would take that advice, and taking it too seriously might crush some kind of important human spirit, so I won’t assert this. But always have in the back of your mind that you live in a world where it’s sort of good advice.

If you feel tempted to believe something that has red flags for being a conspiracy theory, at least keep track of the Inside vs. Outside View. Say “on the Inside View, this feels like the evidence is overwhelming; on the Outside View, it sounds like a classic conspiracy theory”. You don’t necessarily have to resolve this discomfort right away. You can walk around with an annoying knot in your beliefs, even if it’s not fun. Look for the strongest evidence against the idea. Keep in mind important possibilities like:

  • Is it possible that everyone who disagrees with the idea is a bad mean cruel stupid person, but also, the idea really is false?

  • Is it possible that most of the standard arguments against the idea are dumb and flawed, but the idea really is false?

  • Is it possible that people are exaggerating the degree to which the idea is false, but when you strip away all those exaggerations, it’s still mostly false?

  • Is it possible that there’s a core of truth to the idea, but that core isn’t the part people are talking about when they say it’s false?

If none of this rings true, figure out whether you really need to have an opinion. Nobody needs to be sure whether Kennedy was assassinated by a lone gunman or not. If you find yourself compelled to speak out, consider whether this means that believing it fulfills some psychological need; if yes, be extra suspicious. If no, there’s no need to resolve the knot immediately; just admit it’s an awkward riddle for you and hope that it makes more sense later. Sometimes it will! This is how I treated my Atlantis worries - I never waved protest signs at archaeology conventions, I just went around with a knot in my belief structure, which I eventually settled with minimal embarrassment to myself. The number one way to gain useful skills for wrestling with conspiracies is to wrestle with conspiracies; I don’t recommend it as a deliberate tactic, but it’s a silver lining if you can’t avoid it.

All advice along the lines of “don’t do X unless you’re smart and sophisticated” is useless, because everyone believes themselves smart and sophisticated. Still, at some point, after a lot of experience and a few crises of faith, you might develop a skill something like Bounded Distrust, at which point it’s not necessarily instant epistemic suicide to suspend the second approximation.

To a first approximation, you should never suspend the first approximation.

I hope something like this is more useful than any of the three naive positions I mentioned earlier.