

Two Master Fallacies
In informal logic books and classes, you can find long lists of standard fallacies, such as “appeal to authority”, “argument ad hominem”, and “begging the question”. But these lists are too long, and anyway most of the items on them are not things you would often do if you’re not stupid.
I’ve tried to identify the two most important errors that lead to false beliefs about all kinds of things, even among non-stupid people. I don’t know whether these really deserve the name “fallacy”, but they’re important errors to attend to.
They’re a little trivial, but talking about them might be helpful, because it can induce people to be more on guard against falling into them.
1. Assumption
Most human beings are hair-trigger belief-formers (when it comes to topics that they care about). When they get some tiny hint that A might be true, they leap to assuming A. Sometimes they do it with no apparent justification at all.
a. Assumptions in History
This is my read on why the history of science and philosophy is so full of error. Most theories about the world that people have held have been 100% wrong: the theory of the four elements, the medieval theory of the four bodily humors, the theory that diseases can be caused by evil spirits or being “cursed”, traditional theories about the origin of humans and the Earth (generally involving gods), primitive theories of the structure of the cosmos, etc.
One reason why so many beliefs are wrong is the base rate: Almost all possible theories (that aren’t purely negative) are false. If we were rational, we would take that into account and start with very low initial credences in most theories, requiring a good deal of evidence to take them seriously.
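This base-rate point can be made concrete with a toy Bayesian calculation. (The numbers below are invented purely for illustration; the post itself gives no figures.)

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# A theory plucked from a large space of rivals should start with a
# low prior -- say 1%.
prior = 0.01

# Even evidence ten times likelier if the theory is true than if it
# is false only lifts the rational credence to about 9%.
print(round(posterior(prior, 0.80, 0.08), 3))  # 0.092
```

The moral of the sketch: with a realistic prior, one suggestive observation should leave you far from confident, which is the opposite of hair-trigger belief-formation.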
Instead, people seem ready to adopt these sorts of beliefs on practically no basis. E.g., medieval doctors saw that some people were sick, guessed that maybe illness could be caused by imbalances of bodily fluids, then tried guessing what sorts of imbalances would lead to particular symptoms. There was some logic to that part (e.g., a fever is caused by an excess of blood), but the starting idea that diseases had to do with fluid imbalances was just a guess that seemed sort of vaguely plausible. Or so I would guess.
b. Motivated Guessing
Often, our assumptions have obvious emotional motivations. E.g., the widespread belief that Biden stole the 2020 election is just an assumption. There is no evidence for it of the standard sort (say, evidence that you could present in court and not be laughed at). The real basis for the belief is desire: one believes that Trump won because one wants to believe that. No person who actually wanted to know whether it was true believes it.
To take an example from the other side of insanity, why did Derek Chauvin kill George Floyd? Let’s assume it was racism. And since we’ve seen a few stories of black people being killed by police, let’s assume that there’s an epidemic of racist murders. Since we haven’t seen any news stories about white people being killed by police, let’s assume that that hardly ever happens.
When we want to believe something, our desire gives us a positive emotional reaction to the proposition. We confuse that emotional reaction with the thing’s being plausible. We then find any contrary idea “implausible”.
The habit of mere assumption is not limited to stupid extremists. What causes a person to turn to a life of crime? Many intelligent people just assume that the answer is “a bad upbringing”. More broadly, it is extremely common to assume that upbringing has a huge influence on people’s character.
c. Mundane Examples
When we observe people’s behavior in our day-to-day lives, we commonly assume their motives and attitudes. If someone is short with you one day, you assume that they don’t like you, when the truth may be that the person just got a traffic ticket that morning. Or maybe they weren’t in fact annoyed at all and you misperceived that.
If someone cuts you off in traffic, you assume that it’s “because they’re an asshole”, and not that they need to get somewhere quickly for some important reason.
When someone explains an unfamiliar idea to us, we assume that it’s closely related to some idea that we’ve heard about, or something we ourselves have thought, or the first free association we form with the name of the idea. Because of these sorts of erroneous assumptions, it often takes a long time for two people to actually engage with one another’s ideas.
d. What’s wrong with assuming?
Nearly all assumptions are false. That’s because nearly all (positive) propositions are false, the world is highly complicated, and human guesses are affected by all sorts of purely subjective factors that have no connection to the external reality being considered.
e. What should we do?
Instead of merely assuming stuff, we should:
Consider alternative possibilities. Don’t just rest with the first theory that comes into your head. Spend time trying to think of alternative theories.
Look for objections. Most people only spend time asking why their theory might be right. You should also ask yourself, “What are the main reasons why this might be wrong?”
If applicable, look for empirical tests of a theory. And note that a genuine test of a theory must be something that could yield evidence refuting the theory (or at least rendering it improbable) if it were false.
Listen to people. When forming a belief about something that has been much discussed, look for intelligent people with importantly different perspectives on the subject, and listen to them.
Be unsure. When thinking about a controversial subject, you probably shouldn’t have extreme (close to 1 or 0) credences. Admit that the truth may be something you haven’t thought of.
2. Dogmatism
Most human beings are highly resistant to revising beliefs. Though we may have adopted the belief that P based on a nearly random guess, once we adopt it, we demand overwhelming evidence against P before we’ll consider changing it. Some people adopt standards such that they would basically never revise the beliefs that they cherish.
a. Questions to Consider
If you hold a controversial belief, here are some good things to ask yourself.
a. Is this an empirical belief or an a priori belief? (Hint: Most beliefs, especially about politics and society, are empirical.)
b. What could convince me that this is false? If the answer is “nothing”, then you’re most likely an irrational dogmatist. (Note: But some a priori beliefs are in fact irrefutable or nearly irrefutable.)
c. If this belief were false, what would my evidence look like? If the answer is “just the way it now looks,” then think hard about why you hold the belief.
d. Is my belief what an outside observer would predict based on my biases? E.g., if you hold the belief that serves your interests, is promoted in your social group, or would be expected based on your personality traits, then maybe it’s a bias.
b. Am I being dogmatic?
When a person is being dogmatic, they do not normally feel dogmatic. I suppose the way they feel is that they’re just saying something completely obvious. (But this is also the way it feels when you are in fact saying something completely obvious.)
There is no foolproof algorithm for detecting dogmatism. But there are signs. If you hold a view, and a large number of smart, well-informed people disagree with that view, and if you can’t at all see where they are coming from and it just seems to you that they are being stupid or evil, then there’s a good chance that you’re actually being dogmatic. Because the base rate of people being dogmatic is a lot higher than the base rate of large numbers of experts being stupid or evil. (Of course it is possible that a lot of experts are actually being stupid or evil, but this is rare.)
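The base-rate comparison in that last step can be spelled out as a back-of-the-envelope odds calculation. (All numbers here are hypothetical assumptions, chosen only to exhibit the structure of the inference, not estimates from the post.)

```python
# Two candidate explanations for why many smart, well-informed people
# reject your view:
#   H1: you are being dogmatic.
#   H2: the experts are being stupid or evil.
# Hypothetical base rates (assumptions for illustration only):
p_h1 = 0.30   # people are often dogmatic about cherished beliefs
p_h2 = 0.02   # mass expert stupidity/evil is rare

# Suppose the observed disagreement is equally likely under either
# hypothesis; the odds then reduce to the ratio of the base rates.
p_obs_given_h1 = 0.90
p_obs_given_h2 = 0.90

odds_h1_over_h2 = (p_obs_given_h1 * p_h1) / (p_obs_given_h2 * p_h2)
print(round(odds_h1_over_h2))  # 15
```

On these made-up numbers, "I'm being dogmatic" is about fifteen times the better explanation, which is why the disagreement of many experts should at least give you pause.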
c. Dogmatic techniques
As Jonathan Haidt says, when we consider a belief we like, we ask ourselves, “May I believe it?”; when we consider a belief we dislike, we ask ourselves, “Must I believe it?” That is, in the first case, we look for any excuse to hold the belief; in the second case, we look for any way that the belief could be avoided.
Here are some techniques for dogmatically clinging to your beliefs:
(a) Ignore sources that might present counter-evidence. If you’re left-leaning, avoid right-wing authors and news sources; if you’re right-leaning, avoid left-wing sources. If you do this, it is highly unlikely that your beliefs will be corrected even if they are in fact false.
(b) Posit conspiracies to explain away counter-evidence. In a recent video, Jordan Klepper talks to some Trump-supporters about January 6 (https://youtu.be/Il4Cp74XRFI).
They are convinced that the election was stolen from Trump. Klepper plays a video for one Trump-supporter, showing Bill Barr (Trump’s own Attorney General) saying that the stolen election claims were bullshit. The Trump supporter hypothesizes that someone got to Barr and made him say that. In another exchange, Klepper plays a video of Ivanka Trump saying that she was convinced by Bill Barr. Another Trump supporter hypothesizes that the video is a deep fake.
Notice that this is basically maximal dogmatism. I.e., these individuals leave no way for their beliefs to be corrected if those beliefs are false. There is no evidence that anyone could possibly present to them that they couldn’t simply dismiss as faked or produced by a conspiracy.
(c) Appeal to the rest of your belief system. If someone challenges one controversial element of your belief system, defend it by appealing to other controversial parts of your belief system, rather than trying to cite ideologically neutral evidence.
Why is this a tool of dogmatism? Because you’re basically saying that you won’t alter your belief system unless someone simultaneously refutes the entire system, which of course is for all practical purposes impossible.
(d) Allude to other evidence. Similar to (c): If some piece of evidence that you cited turns out to be false, say that it doesn’t matter because the other side hasn’t refuted all the other evidence for your belief system. Then don’t revise your credences at all.
Example: You cite the gender pay gap as evidence of America’s rampant sexism. Someone points out that the pay gap statistic fails to control for occupation and other pay-relevant features of one’s job, and that after controlling for these things, the gap virtually disappears (https://fakenous.substack.com/p/the-gender-pay-gap-empirical-facts). In response, you can claim that this doesn’t matter because it doesn’t refute all the other things that you claim are rampant sexism. Then don’t alter your credences about sexism at all.
This makes the belief in rampant sexism practically unrevisable, because no one can decisively refute the completely general claim that “sexism is rampant”. They can only investigate particular alleged instances, and no one is going to be able to address all instances that anyone might claim are sexism.
(e) Selectively scrutinize evidence. When someone cites evidence against your beliefs, subject it to incredibly demanding standards. E.g., if there’s a scientific paper, demand that it be replicated, scrutinize it for any possible flaws, etc. But when you see evidence supporting your beliefs, don’t do any of this; just accept it at face value.
This, again, is a tool of dogmatism, because basically every piece of empirical evidence is imperfect in some way. So you’ll pretty much always have an excuse to discount any evidence against your beliefs.
d. What should we do?
In brief, what you should do is consciously strive to avoid all the dogmatic techniques listed above. But come on. We both know you’re not going to do that. You’ll probably just criticize other people for using them but keep on using them yourself.