Here, I explain why “critical thinking” is epistemically irrational.*
[* Based on: “Is Critical Thinking Epistemically Responsible?”, Metaphilosophy 36 (2005): 522-31.]
The Critical Thinking Philosophy
Obviously, by “critical thinking” here, I don’t just mean thinking rationally, avoiding fallacies, etc. I mean a particular thing that students are often advised to do in “critical thinking” classes: thinking issues through for yourself.
I have in mind controversial issues that have been publicly discussed, e.g., abortion, gun control, global warming. When approaching such issues, you could:
a. Trust the experts. In case the experts disagree, you could try to figure out what most experts think, or what the best experts think, or something like that.
b. Just withhold judgment. Or,
c. Think for yourself. That is, rather than relying on the experts, you could review the primary evidence that the experts themselves would be basing their judgment on for yourself, and form a belief based on your assessment of that evidence.
(c) is what I’m calling “critical thinking”. Many books tell students that that’s what they should do. But I think this is typically irrational, since either (a) or (b) is obviously better.
Critical Thinking Is Unreliable
Obvious point: experts tend to be better than non-experts at correctly assessing difficult issues, due to their obvious cognitive advantages over lay people. E.g., the experts typically have greater than average intelligence, much greater knowledge about the issue in question, and have also spent more time thinking about the issue than you. That’s why they’re called “experts”.
E.g., say you’re an ordinary person who wants to form an opinion about the wisdom of gun control laws. Well, there are smart people who have devoted many years to studying that. Do you suppose they learned anything during those years? How could they possibly fail to be more reliable than you? Are we going to say that intelligence has no effect on ability to figure out the truth? Are we going to say that knowledge of the subject also has no effect? If so, then our advice to students should be, absurdly, that there’s no point in their bothering to learn about a subject before forming a judgment about it.
Why not just learn the same information yourself? Well sure, if you want to spend many years, and if you’re smart enough, you could learn the same information and thus become an expert yourself. But hardly anyone is going to do that, and everybody knows that. What people are going to do when told to think critically is read about the subject for a few hours or weeks, then pronounce a judgment based on what seems right to them at that point. It’s easy to see how this would be less reliable, and very hard to see how it could be more reliable, than deferring to the judgments of experts who have done a lot more study.
The Critical Thinking Philosophy Is Incoherent
Hypothetical: Suppose you would like to form an opinion about moral realism, a topic which you have not yet had time to study carefully and have no firm opinions on. A student from your university, who took a critical thinking class and got an ‘A’ in it, sincerely tells you that he has recently studied the subject, applying all the critical thinking lessons he learned, and he has come to the conclusion that non-cognitivism is the correct metaethical theory.
Q: Should you now believe non-cognitivism?
Suppose you answer “yes, believe non-cognitivism”. Then you’re giving up the critical thinking philosophy right away, since you yourself would be violating its central advice.
Anyway, the answer is obviously no, and everyone knows it. No one would adopt a controversial philosophical view on such flimsy grounds. Notice that this means that we’re saying that the student’s judgment is not strong evidence of the truth of non-cognitivism. Indeed, it is almost no evidence at all (you should change your credence in non-cognitivism barely, if at all). And notice that that means that we’re saying the judgment of a student in such a situation is not reliable. In other words: critical thinking is unreliable. So why do we (professors of philosophy) tell people to do it? If you wouldn’t rely on that student’s judgment, how can you advise the student to rely on it?
The ‘Critical’ Patient
Interestingly, there are other contexts in which almost no one supports critical thinking. E.g., if you have a medical condition, almost no one tells you that you should diagnose and treat yourself, without bothering with doctors. If we taught students that, we would be extremely irresponsible, and we’d probably wind up with some dead students eventually. That’s because most people do not have the expertise to diagnose and treat themselves.
Practical note: This doesn’t mean that you should believe whatever your doctor says. If a doctor recommends an expensive or dangerous treatment, or you feel skeptical about a diagnosis, you should get a second opinion. Doctors are in fact often wrong (probably way more often than most people think). Nevertheless, lay people are rather obviously much worse. That’s why the smart advice is “get a second opinion”, not “diagnose yourself”.
Why do people have such different attitudes about medicine as compared to politics and philosophy? Maybe because people know that medicine is complicated, but they foolishly assume that political and philosophical issues are simple?
Objections
1. “But how can we judge who the experts are? Do we use more experts for that, or do we use critical thinking?”
Reply: Don’t be silly. For most controversial issues, it is a lot easier to judge who is an expert on X than it is to judge the truth of X itself. That’s why you don’t need meta-experts to tell you who the ordinary experts are. E.g., you could look for people who have PhDs and have written books and articles on a subject, as part of their academic research. If you want, you could call that “critical thinking about who is an expert”, but that’s quite different from judging the actual controversial issues yourself, as the critical thinking books usually tell you to do.
2. “But we have to teach critical thinking in order to train the next generation of experts.”
Reply:
a) Only a tiny fraction of students will or should try to become experts. It doesn’t make sense to gear teaching toward a tiny fraction, rather than the vast majority of students.
b) Given the state of the academic job market, I really don’t think we’re in a position of worrying about a future shortage of experts.
c) Anyway, my main question was about what is epistemically rational, not what is socially useful.
3. “But I can think of 3 cases where experts were wrong!”
Reply: Notice that “experts are often wrong” does not imply “non-experts are more reliable”. Notice that the latter conclusion is still obviously false. But that’s what you’d have to maintain to defend critical thinking.
It’s perfectly possible that experts are unreliable – on many subjects, that’s clearly the case. But the conclusion from there must be that no one is reliable, not that non-experts must be reliable. Thus, the rational approach would be to suspend judgment in such subjects, not to try to figure them out on your own.
The Time for Critical Thinking
Having said all that, there are still some possible times when critical thinking is appropriate. One is when you’re addressing an issue (e.g., a decision in your personal life) which experts have not studied. Another might be when experts have some systematic bias. E.g., say you’re wondering what the best policy is for regulating some industry, and the main experts on that question are themselves industry insiders. It could be reasonable to think they are less reliable than a smart layperson, despite the experts’ greater knowledge, because the experts have an obvious conflict of interest.
However, I want to add that I think accusations of bias are too often used to rationalize ignoring people with opinions we don’t like, while ignoring our own biases. True, experts are often biased. But most of the time, lay people are just as biased as the experts. You probably don’t think you’re biased about any given issue, but, well, you’re probably just biased about the question of your bias. And critical thinking classes probably don’t do much to help with that. People probably mostly use their “critical thinking” lessons to come up with ways of discounting counter-arguments that they don’t like and thereby sticking to the opinions they prefer.
I think this presents a false dichotomy between evaluating every claim from the ground up and not using your own thinking about the issue at all when evaluating the alternatives.
I don't think anyone is suggesting that someone who wants to make a decision about a medical diagnosis should start from scratch and evaluate every argument afresh. In the usual case, however, there are multiple positions, often each defended by experts, and one can usually find presentations of the disagreement where you can at least compare the arguments the two sides make to some degree.
I'd argue that one should think of it like probabilistic proof verification. You can't hope to evaluate the complete arguments presented by each side, but you can look at selections of the arguments they've chosen as the most convincing and check how strongly the various steps support the next intermediate conclusion. If one side's arguments seem to have weaker links when sampled at random (assuming you correct for the length of the arguments), that's good statistical evidence that they have the weaker overall claim.
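The spot-checking idea above can be illustrated with a toy simulation (all the numbers here are hypothetical, chosen just to make the statistics visible): model each side's case as a chain of argument links, each of which is sound with some probability, and have a layperson check a random sample of links from each side.

```python
import random

random.seed(0)

def sample_flaw_rate(link_soundness, n_links, n_samples):
    """Spot-check n_samples randomly chosen links of an argument chain.
    Each link is independently sound with probability link_soundness.
    Returns the fraction of sampled links found to be flawed."""
    chain = [random.random() < link_soundness for _ in range(n_links)]
    sampled = random.sample(range(n_links), n_samples)
    return sum(1 for i in sampled if not chain[i]) / n_samples

# Hypothetical numbers: side A's links are sound 95% of the time,
# side B's only 70%. A layperson checks 20 of 100 links per side.
flaws_a = sample_flaw_rate(0.95, n_links=100, n_samples=20)
flaws_b = sample_flaw_rate(0.70, n_links=100, n_samples=20)

# The side whose sampled links fail more often is, statistically,
# the side more likely to have the weaker overall case.
print(f"observed flaw rate, side A: {flaws_a:.2f}")
print(f"observed flaw rate, side B: {flaws_b:.2f}")
```

In any single run the sample can mislead, which is why the comment's caveat about correcting for argument length (and, implicitly, sample size) matters; averaged over many checks, the weaker side's higher flaw rate shows through.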
Also, when that fails, I'd argue that comparing how the various camps have done on prior predictions counts as "thinking for yourself".
Besides, if you don't think for yourself, what other option do you have for deciding between competing expert views? Suppose you have to decide whether to get a surgery: some doctors are big proponents, others are detractors, and both sides are quite credible (none of the easy ways you can use to discredit, say, global warming deniers apply). So, other than at least getting a sense of what the arguments are and why each side holds them, and then trying to analyze some of the links in their evidentiary chains, what else can you do? Flip a coin?
I think that many people default to the heuristic "Defer to the person with high status," even though that person may not be an expert. A famous athlete appearing in a commercial for a financial service is one example. "Think for yourself" may be an admonition against the heuristic of believing the (non-expert) high-status individual, such as your favorite politician.