15 Comments

Regarding "P but I don't believe P", maybe it's relevant to consider some more complex examples.

To me it seems equally weird to say "P and Q but I don't believe P" or "P or (P and Q) but I don't believe P". However, if we increase the complexity of the left side enough, it no longer seems weird; e.g. "The Riemann hypothesis implies P but I don't believe P" seems fine even if it turns out that the Riemann hypothesis is a logical truth (yes, I know some people think mathematical truths aren't logical, but, if necessary, we can make the sentence say: if the Riemann hypothesis is provable in ZFC, then P).

I feel like this suggests there is at least an element of consideration of the complexity of the inference involved (and I think the same trick works on the belief side, e.g. "P but I don't believe P or Q" is still weird).
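To make the pattern explicit, the variants above can be written with a belief operator (B for "I believe that", RH for the Riemann hypothesis; the notation is mine, not the commenter's):

```latex
% Moorean variants, schematically (B = "I believe that", RH = Riemann hypothesis)
\begin{align*}
& P \wedge \neg B P                           && \text{weird (the original)} \\
& (P \wedge Q) \wedge \neg B P                && \text{equally weird} \\
& (P \vee (P \wedge Q)) \wedge \neg B P       && \text{equally weird} \\
& (\mathit{RH} \rightarrow P) \wedge \neg B P && \text{seems fine, even if } \mathit{RH} \text{ is a theorem}
\end{align*}
```

The first three left-hand conjuncts entail P by a trivial inference; the last does so only via a (possibly undiscovered) theorem, which is what makes the complexity of the inference plausible as the relevant variable.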


(Looks like Substack ate this the first time.) Regarding the last puzzle, does it raise a problem for your solution that "it's raining but there is only a 90% chance that's true" also seems weird?

My analysis of this is that the problem comes from the implied tension. To my ear it seems fine to assert, "It's raining and I'm only 90% sure." I think the difference is that we make the background assumption that there is some implied level of certainty required to make assertions in this context; so if you don't think the claim reached that level, why did you assert it, and if you did, why did you imply a tension?

I'd like to extend this to the certainty case by suggesting that saying "I'm not certain" in this context carries the Gricean implicature that there is enough reason to doubt that you should take notice. But if that's so, then asserting the claim in the first place violated the norm of helpfulness (if you knew I needed to know with probability at least 99% that it was raining, why would you say it's raining when you only have probability 90%?).
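The helpfulness point can be put in a simple threshold model (the 90%/99% figures are the commenter's example; the threshold notation is my own sketch):

```latex
% Threshold model of flat-out assertion: let \tau be the credence the
% conversational context demands. The proposed norm is then
\text{Assert } p \text{ flat-out only if } \Pr(p) \ge \tau .
% With \tau = 0.99 and \Pr(\text{rain}) = 0.9, the bare assertion
% "it's raining" presupposes \Pr(\text{rain}) \ge \tau, while the rider
% "only a 90\% chance" asserts \Pr(\text{rain}) = 0.9 < \tau :
% that clash is the implied tension.
```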


They Know What Is What But They Don't Know What Is What They just strut


Mark 9:24


When we say "It is raining" and *mean* that it is raining, then we are expressing our belief that it is raining. And expressing your belief that it is raining means expressing *that* you believe that it is raining. - Doesn't that sound plausible? And isn't it sufficient to explain the *defectiveness* of Moore-paradoxical statements? (Take your first example: Here you express that you believe that it is raining by saying that it is raining, and express that you don't believe that it is raining by saying that you don't believe that.)

Jan 1, 2023·edited Jan 1, 2023

I'm pretty sure the Moorean paradox of "P, but I don't know P" is really just a result of us using the word "knowledge" in many different ways in ordinary contexts. For instance, psychologists sometimes talk about "knowing P" when discussing expectation effects (e.g. placebo and nocebo effects), even though it's clearly confident belief that matters for such effects and not knowledge.

However, if you want to take the Moorean paradox seriously, isn't the much more obvious solution just to say that there is a norm of thought which requires something like "Don't think P without qualification unless you rationally believe that P is true with high confidence"? If so, then you avoid the messiness of making knowledge the norm of belief while still solving all the problems:

1. "It is raining, but I don’t believe that" violates the norm of thought because you can't rationally believe that P is true with high confidence while believing that you don't believe that P.

(In order to rationally believe that P is true with high confidence you must be aware of the grounds of your belief that P is true and the relation between that belief and its grounds, since some such relations will undermine justification and thus high confidence. Yet, if you are aware of the relation between your belief and its grounds, then you have sufficient grounds for rationally believing that you believe P. If you have sufficient grounds for rationally believing that you believe P and instead choose to believe that you don't believe P, then you are being irrational in believing that "I don't believe that" and thus violating the norm of thought.)

2. "It is raining, but that’s not true" violates the norm of thought because you can't rationally believe that P is true while believing that P is not true.

3. "It is raining, but there’s no reason to think that" violates the norm of thought because you can't rationally believe that P is true while believing that there is no reason to believe that P.

4. "It is raining, but I based that belief on a false premise" violates the norm of thought because you can't rationally believe that P is true while believing that this belief is based on a false premise which would make P not true.

(If you remove the last bit about making P not true, then I don't see the paradox. People base beliefs on approximate truths which are false premises all the time. For that matter, you can know things based on approximate truths, too, so you need the last bit for your solution to work as well.)

5. "It is raining, but my way of finding that out is unreliable" violates the norm of thought because you can't rationally believe that P is true while believing that this belief was formed through an unreliable method.

6. "It is raining, but my justification for that is defeated" violates the norm of thought because you can't rationally believe that P is true while believing that the justification for this belief is (all-things-considered) defeated.

7. "It is raining, but I don’t know that" violates the norm of thought because you can't rationally believe that P is true with high confidence while believing that you don't know that P.

(If you rationally believe that P is true with high confidence, then it is irrational to form the belief that you don't know that P because in order to rationally have sufficiently high confidence you must have and be aware of sufficient grounds for rationally believing that you know P. If you have sufficient grounds for rationally believing that you know P and instead choose to believe that you don't know P, then you are being irrational in believing that "I don't know that" and thus violating the norm of thought.)

Why would the norm of thought say "Don't think P without qualification unless you rationally believe that P is true with high confidence"? Because thinking that P without qualification only seems appropriate if you have sufficiently high confidence that P is true. If you don't have such confidence, you should add qualifications (e.g. "P seems probably true" or "It seems to me that P") instead of thinking to yourself "P" without any qualifications.
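The proposed norm, and the way cases #1-#7 violate it, can be rendered schematically (my rendering, assuming a belief operator B and writing RB(P) for the norm's antecedent):

```latex
% Norm of thought: flat-out thinking that P is permitted only if
%   RB(P) = "you rationally believe, with high confidence, that P".
\text{Think}(P) \;\Rightarrow\; RB(P).
% Case 1 then fails because the conjunction is rationally unsatisfiable:
%   RB(P) \wedge B(\neg B P),
% given the introspection step argued for above: RB(P) supplies grounds
% for believing B(P), so believing \neg B(P) is irrational. Cases 2-7
% substitute the relevant second conjunct (e.g. B(\neg P) for case 2).
```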

Although this norm of thought is also a norm of assertion (you should add qualifications if you aren't confident when asserting "P"), it's a norm of thought because it applies to thoughts without assertion. For example, if you are politically tribal, you might sometimes find yourself thinking some very strong thoughts on a recent political event without qualification and without sufficient grounds for rational high confidence, and that is inappropriate. You should qualify your own thoughts in such circumstances.

An additional benefit of this solution (besides avoiding all the problems with making knowledge the norm of belief) is that it can explain why properly qualified assertions and thoughts avoid the Moorean paradox. For example, there is no paradox in saying that "P is probably true, but I don't know P for sure" or that "P seems to me to be true, but that seeming is unreliable." Some Moore paradoxes still arise for improperly qualified thoughts, but can be explained using the same sorts of explanations as I did with #1-#7. For example, "I think P is true, but there’s no reason to think that" is paradoxical for the same reason that #3 above is.


At first I wanted to quibble about contingencies that might make "it's raining" true, false, or ambiguous. But that's not the point; we could replace "it's raining" with something unambiguous like "the sun is up."

If I believe P, but I include a contingency in my plan in case P turns out to be false, is that a contradiction? I’ve been thinking that belief in P is willingness to act on P, including willingness to disbelieve new propositions if they contradict P.

Maybe contingency plans are more about “if this step fails for whatever reason” and less about “if this specific belief turns out to be wrong.”


I don't buy that the norm for assertion or belief is *knowledge*, because it seems to me that people assert and believe things all the time that they'd admit they don't *know*. One example would be religious faith: some religious people will say they know that God exists (based on personal religious experience, perhaps), but other religious people will freely admit that they don't know it, because their belief is based on faith rather than evidence. Yet they might still genuinely believe, and assert, that God exists. More generally, if you ask people what it takes to count as genuinely knowing something, many will have a fairly high standard; there will be lots of things they believe to be true that don't meet that standard.

Instead, it seems to me that the constitutive norm must be *truth*: To count as believing at all, you need to be guided by the norm of only believing what's actually true; to count as (sincerely) asserting, you need to be guided by the norm of only asserting what's actually true. But someone can sincerely believe that X is true (and believe that they believe it, etc.) without taking themselves to know that X is true.
