Appearances and Inclinations
I think I might have identified the biggest mistake in belief-formation: confusing inclinations to believe with appearances.
Let me back up. In epistemology, I argue that appearances are the source of justified belief. When something seems to one to be the case, and one has no reason for doubting it, then one has foundational justification for believing that thing. In fact, I think this is the only foundational source of justification that anyone ever has or could conceivably have for anything.
But what are these "seeming" states? In my view, they are a distinctive type of experience, one with propositional content but different from belief. Some philosophers, however, reject this notion. They say there is no distinctive type of experience called a "seeming". Instead, they say that such expressions as "It seems to me that P" merely report a belief or an inclination to believe P. People who say this also generally say that this sort of thing couldn't be a source of justification (you couldn't be justified in believing P merely because you're inclined to believe P).
Of course, I think the "inclination to believe" theory is false. The main problem: There are many reasons why you could be inclined to believe P that have nothing to do with P's seeming true. Say you feel inclined to believe P because you want P to be true. Or because you've been taught that it's morally virtuous to believe P (like how religious people are taught that "faith" is a virtue). Or because tough guys believe P. Or, to refer to a previous post (https://fakenous.net/?p=2083), you might be inclined toward P because that's the slogan of your preferred tribe, and you have to believe it in order to express loyalty to the tribe. All of these things really happen to people (the human mind is not utterly rational, it turns out). Notice that all of these things are importantly different from being inclined to believe P because it actually seems true.
It occurred to me that if philosophers have mistaken appearances for mere inclinations to believe, maybe ordinary people make that mistake too -- only not on a theoretical level, but on a concrete level. Maybe they confuse their particular inclinations to believe things with appearances, and thence form beliefs based on those inclinations.
And maybe, come to think of it, that is the primary error in belief-formation. Unjustified belief is belief not properly related to the appearances. This could be because one disregards some appearances that are relevant (as when one has acquired appearances that cast doubt on one's previous belief, but one fails to revise the belief in light of that). But more often, it is because something else, something that's not a genuine appearance, is influencing one's beliefs. That other thing is generally going to be some other form of inclination to believe.
And maybe part of why it's easy for people to make that kind of mistake is that these other inclinations to believe feel kind of similar to an appearance, so if you're not careful and reflective, you might confuse them.
(You might doubt that this "confusion" theory is needed. As long as you have an inclination to believe P, that's ipso facto going to incline you to believe P, even without your confusing it with anything else. Sure. But if the distinction between appearances and (other) inclinations to believe were clearer in people's minds, I assume that most people would recognize that one rationally shouldn't form beliefs based on non-appearance inclinations.)
Examples
The plausibility of this theory might not be obvious, because maybe the sort of "inclinations to believe" that I mentioned above don't sound very similar to appearances. It doesn't sound that easy to confuse, for example, wanting to believe P because it's morally virtuous to believe it with P's seeming to be true.
But many cases are more ambiguous, and we don't always know why we are drawn to a belief.
Explaining crime
Let's say the question is: Why do people commit crimes? You hear two people give theories about this:
A: "Some people are just born bad, that's all."
B: "People commit crimes because society failed them -- they were abused, or they weren't properly educated, or they lacked opportunities."
You think about those theories, and you have some sort of vaguely positive feeling toward one of them (which one depends on your personality -- see previous post on how ideology is about personality, https://fakenous.net/?p=2083). You imagine saying "Some people are just born bad", and it feels good -- or you imagine saying it and feel yourself recoiling. And that's how you decide to believe it or not believe it.
You don't exactly know why it felt good, or why you recoiled. If you're not highly reflective, you're likely to just say, "It sounded right" or "it sounded wrong". But "right" or "wrong" in what sense? Maybe what was the case was not exactly that it sounded true or false, but that it sounded like a thing you'd like to say or wouldn't like to say. If you're high in the "agreeableness" trait, maybe statement (A) feels like a wrong (i.e., inappropriate, not-nice) thing to say. If you're low in agreeableness, then it feels fine (hey, we're talking about criminals -- fuck those assholes. Who cares if we offend them?). We confuse these emotional reactions to theories with appearances, i.e., with the theories' actually appearing true or false.
Conspiracy theories
Another type of example: conspiracy theories. I've had a couple of students who, after hearing all my attacks on government, didn't understand why I'm not a 9/11 Truther. The government is bad -- why not say they planned the 9/11 terrorist attacks?
People hear a conspiracy story, and they have an emotional reaction to it. At least for some people, there's a kind of pleasure in "exposing" how the high-status people (the powerful, the rich, the famous) are doing some secret, incredibly evil thing. Those who experience this kind of pleasure are liable to become conspiracy theorists. It may not be obvious to them that their belief is based on that feeling of pleasure. When they hear a conspiracy story, they just feel attracted to it. They also think they have evidence for it. But all those "evidence" claims are the same way: they accept E as evidence for the theory because they feel a kind of pleasure when they entertain the proposition, [E is evidence of this conspiracy].
Morality
"Sexual promiscuity is bad for women, but good (or at least less bad) for men." Does that seem true?
I don't think it seems true, but I think an enormous number of people believe it. I think they believe it because they feel a certain kind of self-righteous pleasure when they sit in judgment on promiscuous women, which they don't feel when they think about criticizing promiscuous men. I think that's pretty close to the entire basis for traditional sexual morality.
Being rational
I suggest that P's seeming true isn't the same as your feeling attracted to the belief that P. But it might be hard to make that distinction introspectively. In most cases, maybe the best thing is just to focus on the quality of your evidence.
In the "what causes crime" example, surely you couldn't expect to be reliable about that question without evidence. So set aside the question of which theory "sounds right", and ask the questions, "what's my evidence about this?", "what would be good evidence about this?", and "what are the alternative explanations of the evidence?"
Sounds obvious, but in fact I think people rarely ask themselves those things. My impression is that a lot of people merely assume one or the other of the theories ("some people are born bad" or "society failed them"). Then maybe they cite one or two anecdotes (a criminal who had a bad childhood, or one who didn't).
The reason these questions about "evidence" are helpful is that it's harder for your answers to them to be skewed by your non-cognitive inclinations. E.g., even if you're emotionally attracted to the "some people are born bad" theory, that's probably not going to stop you from seeing that the appropriate evidence for such a theory would be statistical, and there are general norms about statistical evidence that you probably know about, independent of this case.
* * *
I want to stress that all this isn't just a minor, occasional problem. It doesn't just affect a few isolated questions like "what causes crime?" I think this practice of treating an inclination as an appearance, or adopting a belief because it feels good, is incredibly common. When we consider a political question, we just do this automatically. It may be the source of the vast majority of ideologies, and religions, and philosophies.