Regarding "P but I don't believe P": maybe it's relevant to consider some more complex examples.
To me it seems equally weird to say "P and Q but I don't believe P" or "P or (P and Q) but I don't believe P". However, if we increase the complexity of the left side enough it no longer seems weird, e.g., "The Riemann hypothesis implies P but I don't believe P" seems fine even if it turns out that the Riemann hypothesis is a logical truth (yes, I know some people think mathematical truths aren't logical, but, if necessary, we can make the sentence say "if the Riemann hypothesis is provable in ZFC, then P").
I feel like this suggests there is at least an element of consideration of the complexity of the inference involved (and I think the same trick works on the belief side; e.g., "I don't believe P or Q" is still weird).
On reflection, maybe this isn't that helpful. Of course, that's just a generic feature of speaking.
(Looks like Substack ate this the first time.) Regarding the last puzzle: does it raise a problem for your solution that "it's raining but there is only a 90% chance that's true" also seems weird?
My analysis is that the problem comes from the implied tension. To my ear it seems fine to assert, "It's raining and I'm only 90% sure." I think the difference is that we make the background assumption that there is some implied level of certainty required to make assertions in this context: if you don't think the claim reached that level, why did you assert it, and if you do, why did you imply a tension?
I'd like to extend this to the certainty case by suggesting that saying "I'm not certain" in this context has the Gricean implicature that there is enough reason to doubt that you should take notice. But if that's so, then asserting the claim in the first place violated the norm of helpfulness (if you knew I needed to know with at least 99% probability that it was raining, why would you say it's raining when you only have 90%?).
Indeed, I think this can be extended to a general solution. I agree on the metacoherence part but am not quite convinced of the rest of your solution. I'd instead argue that the problem is that the Gricean norm of relevance means that we wouldn't assert the qualifying half of the sentence if it weren't relevant to the person/context we are talking to.
However, the level of confidence needed to assert something fluctuates with context and is itself governed by the Gricean norm of helpfulness. So what's weird about these cases is that the first half suggests we have sufficient confidence in the claim that you can treat it as true, which means the second half violates the relevance norm.
I think we can test this by noting that none of these statements (except the one directly about belief, covered by metacoherence) seem weird if we imagine that, instead of coming in one sentence, someone asks something like "do you have any reason to doubt that" or "are you positive" in between.
But maybe you had something like this in mind too and I'm misunderstanding.
They know what is what, but they don't know what is what. They just strut.
Mark 9:24
When we say "It is raining" and *mean* that it is raining, then we are expressing our belief that it is raining. And expressing your belief that it is raining means expressing *that* you believe that it is raining. - Doesn't that sound plausible? And isn't it sufficient to explain the *defectiveness* of Moore-paradoxical statements? (Take your first example: Here you express that you believe that it is raining by saying that it is raining, and express that you don't believe that it is raining by saying that you don't believe that.)
I'm pretty sure the Moorean paradox of "P, but I don't know P" is really just a result of us using the word "knowledge" in many different ways in ordinary contexts. For instance, psychologists sometimes talk about "knowing P" when discussing expectation effects (e.g. placebo and nocebo effects), even though it's clearly confident belief that matters for such effects and not knowledge.
However, if you want to take the Moorean paradox seriously, isn't the much more obvious solution just to say that there is a norm of thought which requires something like "Don't think P without qualification unless you rationally believe that P is true with high confidence"? If so, then you avoid the messiness of making knowledge the norm of belief while still solving all the problems:
1. "It is raining, but I don’t believe that" violates the norm of thought because you can't rationally believe that P is true with high confidence while believing that you don't believe that P.
(In order to rationally believe that P is true with high confidence you must be aware of the grounds of your belief that P is true and the relation between that belief and its grounds, since some such relations will undermine justification and thus high confidence. Yet, if you are aware of the relation between your belief and its grounds, then you have sufficient grounds for rationally believing that you believe P. If you have sufficient grounds for rationally believing that you believe P and instead choose to believe that you don't believe P, then you are being irrational in believing that "I don't believe that" and thus violating the norm of thought.)
2. "It is raining, but that’s not true" violates the norm of thought because you can't rationally believe that P is true while believing that P is not true.
3. "It is raining, but there’s no reason to think that" violates the norm of thought because you can't rationally believe that P is true while believing that there is no reason to believe that P.
4. "It is raining, but I based that belief on a false premise" violates the norm of thought because you can't rationally believe that P is true while believing that this belief is based on a false premise which would make P not true.
(If you remove the last bit about making P not true, then I don't see the paradox. People base beliefs on approximate truths which are false premises all the time. For that matter, you can know things based on approximate truths, too, so you need the last bit for your solution to work as well.)
5. "It is raining, but my way of finding that out is unreliable" violates the norm of thought because you can't rationally believe that P is true while believing that this belief was formed through an unreliable method.
6. "It is raining, but my justification for that is defeated" violates the norm of thought because you can't rationally believe that P is true while believing that the justification for this belief is (all-things-considered) defeated.
7. "It is raining, but I don’t know that" violates the norm of thought because you can't rationally believe that P is true with high confidence while believing that you don't know that P.
(If you rationally believe that P is true with high confidence, then it is irrational to form the belief that you don't know that P because in order to rationally have sufficiently high confidence you must have and be aware of sufficient grounds for rationally believing that you know P. If you have sufficient grounds for rationally believing that you know P and instead choose to believe that you don't know P, then you are being irrational in believing that "I don't know that" and thus violating the norm of thought.)
Why would the norm of thought say "Don't think P without qualification unless you rationally believe that P is true with high confidence"? Because thinking that P without qualification only seems appropriate if you have sufficiently high confidence that P is true. If you don't have such confidence, you should add qualifications (e.g. "P seems probably true" or "It seems to me that P") instead of thinking to yourself "P" without any qualifications.
Although this norm of thought is also a norm of assertion (you should add qualifications if you aren't confident when asserting "P"), it's a norm of thought because it applies to thoughts without assertion. For example, if you are politically tribal, you might sometimes find yourself thinking some very strong thoughts on a recent political event without qualification and without sufficient grounds for rational high confidence, and that is inappropriate. You should qualify your own thoughts in such circumstances.
An additional benefit of this solution (besides avoiding all the problems with making knowledge the norm of belief) is that it can explain why properly qualified assertions and thoughts avoid the Moorean paradox. For example, there is no paradox in saying that "P is probably true, but I don't know P for sure" or that "P seems to me to be true, but that seeming is unreliable." Some Moore paradoxes still arise for improperly qualified thoughts, but can be explained using the same sorts of explanations as I did with #1-#7. For example, "I think P is true, but there’s no reason to think that" is paradoxical for the same reason that #3 above is.
I didn't follow this: "In order to rationally believe that P is true with high confidence you must be aware of the grounds of your belief that P is true and the relation between that belief and its grounds, since some such relations will undermine justification". I think a person can rationally believe P with high confidence provided that they have strong grounds for it, and they have no defeaters. Granted, some relations between the belief and its grounds would be defeaters, so no such relations can obtain. It doesn't follow from this that the person must, in addition, have beliefs about their belief. That sounds like a level confusion, like the claim that to be justified in believing P, you have to be justified in believing that you're justified in believing P.
Re: "P is probably true, but I don't know P for sure" or "P seems to me to be true, but that seeming is unreliable.":
These are fine by the knowledge norm. The speaker never asserts P in either case; he only asserts (a) that P *is probable*, or (b) that P *seems to him* to be true. And both of those are things he knows.
Regarding the part you didn't follow: Awareness doesn't require belief. Awareness is just meant to mean here something like "internally accessible", available in the way that internalists want (I thought this was semi-standard usage of the word "aware" in this context, but I might be wrong). So, you can be rational in believing that P with high confidence without having the belief that you have strong grounds for P, the belief that your belief that P is based on those strong grounds, or the belief that your belief that P is rational.
I would say that you can only rationally believe P with high confidence provided that you have strong grounds for it, the belief is based on those strong grounds, you are aware of those strong grounds, and you are aware that the belief is based on those strong grounds. Is there some compelling counterexample to that? I think most analyses of such internal accessibility should still work in the explanation above of #1.
(If I recall, you have a paper on internalism where you present your phenomenal conservatism as one of the ways of understanding such internal accessibility, but I forget the details of how that would work, so I'm not sure if some issue arises when we substitute the awareness in my explanation of #1 with your analysis of it.)
Regarding qualified thoughts and assertions: What you say here makes sense. It wasn't as obvious to me how this would work when I wrote my original response above for some reason, though what you say here seems pretty straightforward and plausible. Still, I think making knowledge the norm of belief leads to other problems (which you seem to acknowledge in your blog post).
At first I wanted to quibble about contingencies that might make “it's raining” true, false, or ambiguous. But that's not the point; we could replace “it's raining” with something unambiguous like “the sun is up.”
If I believe P, but I include a contingency in my plan in case P turns out to be false, is that a contradiction? I’ve been thinking that belief in P is willingness to act on P, including willingness to disbelieve new propositions if they contradict P.
Maybe contingency plans are more about “if this step fails for whatever reason” and less about “if this specific belief turns out to be wrong.”
I don't buy that the norm for assertion or belief is *knowledge*, because it seems to me that people assert and believe things all the time that they'd admit they don't *know*. One example would be religious faith: some religious people will say they know that God exists (based on personal religious experience, perhaps), but other religious people will freely admit that they don't know it, because their belief is based on faith rather than evidence. Yet they might still genuinely believe, and assert, that God exists. More generally, if you ask people what it takes to count as genuinely knowing something, many will have a fairly high standard; there will be lots of things they believe to be true that don't meet that standard.
Instead, it seems to me that the constitutive norm must be *truth*: To count as believing at all, you need to be guided by the norm of only believing what's actually true; to count as (sincerely) asserting, you need to be guided by the norm of only asserting what's actually true. But someone can sincerely believe that X is true (and believe that they believe it, etc.) without taking themselves to know that X is true.
So, if you asked people, "Is there a God?", they would say, "I don't know, but there is."?
I feel like there are two reasons to worry that knowledge and assertion come apart.
1) Knowledge may be a less contextually flexible standard than assertion. I think the obvious case here is opinions on who is going to win some sports game. In that context we think it's OK to assert claims with relatively low confidence; e.g., I might be willing to assert "the Bears will win the game tomorrow" even if I only assign 55% probability to that outcome (even if it's clear from context it's not pure puffery). Yet in those contexts I might well say that I don't know.
2) The separation between knowledge and JTB (justified true belief).
I could imagine a contrived case set up to ensure you had strong reason to think you had a true belief in some claim, but in which some defeater that prevents knowledge would be present, and I suspect that wouldn't stop you from asserting the claim.
Of course, the situation would have to be really weird. Maybe it would literally be impossible.
However, at least in principle, I want to claim that our attitude is: don't assert X unless your belief in X is justified at the appropriate degree of confidence, not: don't assert X unless you know X. This holds whether or not it's metaphysically possible to create a case where you expect that you have JTB in X but not knowledge.
Well, no, no one would say that in that way. But I can imagine saying, "There is a God, but I don't *know* there's a God." So that might have more to do with Gricean implicatures or...something.
Okay, on reflection, I think the case for my view is a lot stronger for believing than for asserting. I can see a case for thinking that to be entitled to *assert* something (to assert not "I think there's a God" but just "There is a God"), I need to take myself to have knowledge, not merely belief. I don't actually think even that is right; I think that's a norm people *should* follow, but not a constitutive norm, because people failing to be guided by that norm still genuinely count as asserting. But I can at least see the case against me.
But I definitely don't think *believing* has a constitutive norm of knowledge, because people can easily say/think "I believe there's a God but I don't know there is," without the slightest contradiction.