(Untitled)

Jun 14, 2006 22:55


sisyphus June 15 2006, 03:28:15 UTC
I've asked almost exactly this same thing here, though with less insight and such.

Since I've learned to stop worrying and love nihilism, I just think that any statement that says I 'ought' to do something is false, so it doesn't worry me any more. The only justification I can think of for preferring truth to falsehood comes from very uninteresting instrumentalist cases that don't take us very far at all, like 'believing that you will die if you jump off of the Eiffel Tower will keep you alive.'

paulhope June 15 2006, 13:51:05 UTC
Well...at some point I may give up hope and agree with you.

But, to draw from another domain, if I understand you correctly I think your views on what constitutes knowledge are stricter than mine. So when you say you're a skeptic, and when I say I'm not, we are agreeing on all matters of substance and differing only in what we label things.

I suspect there might be something similar going on here. There are some strict definitions of morality under which I would probably classify myself as a nihilist without much hope of turning back. But I am willing to relax the requirements of such terms if I can get something interesting out of them.

As far as 'uninteresting' instrumentalist cases go, I think the interesting part of this is the proper selection of one's goals, which might be relatively broad and therefore more interesting (at least philosophically).


jalden June 15 2006, 04:47:56 UTC
Bernard Williams' Truth and Truthfulness attempts to answer this question with a genealogy explaining how truth came to be something of value. He argues that truth became intrinsically valuable when linked with the virtues of sincerity and accuracy, defined respectively as the desire to tell and know the truth.

I read the book two years ago; that's about all I can remember while lacking sleep and sobriety.

paulhope June 15 2006, 13:53:47 UTC
Does Williams intend to get anything normative out of that historical genealogy? (Does he believe that the linking of truth with sincerity and accuracy is legitimate? Does he think sincerity and accuracy have intrinsic worth, as virtues? etc.)

jalden June 15 2006, 21:58:44 UTC
Yes, he calls them the "virtues of truth." But I believe he also believes that we are responsible for the generation of the virtues of accuracy and sincerity and the value of truth.

paulhope June 15 2006, 22:35:54 UTC
Does he elaborate at all on the mechanism by which we generate our virtues?


philosophyjeff June 15 2006, 05:13:21 UTC
"To illustrate the point of the last three paragraphs, consider the case where believing what, e.g. corresponds with reality results in all cases in disorientation, frustration, hopelessness, despair, and suicide. Could we really claim that in such a world we ought to believe the truth?"

Ernest Sosa's "The Raft and the Pyramid" (or some such name) covers that kind of issue. His idea there seems to be that there is a merit that certain beliefs can have purely on epistemic grounds. Well-justified, true beliefs are epistemically superior to unjustified, false beliefs. Contrast that with the case of what it is prudent to believe. It may be good to believe you can jump the chasm if believing so will help you succeed. But that kind of value isn't epistemic and should be distinguished from the epistemic values. Most epistemologists seem to agree. Most papers in the epistemology literature presuppose such a distinction.

paulhope June 15 2006, 14:52:03 UTC
Well-justified, true beliefs are epistemically superior to unjustified, false beliefs. ... Most epistemologists seem to agree. Most papers in the epistemology literature presuppose such a distinction.

I think it makes sense for epistemologists to have this as a methodological assumption, but I'm taking a step back out of epistemology here and asking for justification for this point--why are justified/true beliefs superior to unjustified/false ones?

Unfortunately, I've lost my copy of the Raft/Pyramid article and don't remember much of it. What is Sosa's defense of this claim?

Also, out of curiosity: which does Sosa prefer--true and unjustified beliefs or false and justified ones?

Finally, I think that prudence, with all its connotations, is only a subset of the broader set of consequentialist virtues. As I've said elsewhere, I don't want to commit myself to anything approaching a crudely utilitarian instrumentalism here. Perhaps the illustration you quoted was poorly chosen--what I meant to raise is the possibility that ( ... )

philosophyjeff June 15 2006, 15:15:01 UTC
I don't think I'm making my point clearly. Let me try again.

There are many ways in which a belief can be, broadly, a good one or a bad one to have. Sometimes having a belief can be good according to one criterion but bad according to another. Epistemic value is, in this sense, distinct from any ethical value, aesthetic value, or practical value. What Ernie wanted to do in that part of the paper was to isolate that issue and remove it from discussion. Epistemologists usually want to talk about epistemic things, not those other things.

But leaving that aside, Timothy Williamson argues in Knowledge and Its Limits that knowing is more likely to result in successful action than merely justifiedly truly believing. That's a quick characterization; if you want more, we can discuss it.

paulhope June 15 2006, 18:00:39 UTC
To Ernie:

How do we justify holding something to be epistemically valuable?

To Williamson:

Hearing more would be great.


lukifer June 15 2006, 07:51:04 UTC
My problem with this would be that you seem to presuppose that the normative value of knowing the truth or of believing justified beliefs can only subsist if it can be somehow 'grounded' in some other normative value. That is, you only have a 'motivation' (how ironic) to adopt an account of truth that 'grounds' the value of truth on the assumption that 1) if you do, that value will be solid and comfortable and have a nice home, and 2) if you don't, it will be kicked out onto the streets and die alone in the gutter: it makes a difference whether you ground it or not ( ... )

4. paulhope June 15 2006, 18:20:27 UTC
I don't understand the first paragraph of this, although I don't think I'm presupposing what you're saying I'm presupposing. Could you rephrase, maybe?

Essentially, what I'm asking is 'given your implicit rejection of the idea of there just being values to things like truth, how can you support this "foundationalist" system?'

The short answer is that I don't support the foundationalist system. I think there have to be other ways to beat or accommodate the infinite regress. I'm not saying that such a thing is trivial, or not prone to all sorts of other problems, but I think we should be investigating in its direction ( ... )

Re: 4. lukifer June 15 2006, 21:21:52 UTC
Could you rephrase, maybe?
I think it's basically the same as what you've understood, about the infinite regress and intrinsic values.

I think there have to be other ways to beat or accommodate the infinite regress. I'm not saying that such a thing is trivial, or not prone to all sorts of other problems, but I think we should be investigating in its direction.

Which is all very good, but it seems to me that you're putting the cart before the horse with this post. Here, you seem to be saying 'we ought to adopt an account of truth which grounds its value, and this is instrumentalism (or any other suggestion that does the same thing)'. Now it seems to me we only have a reason to adopt such an account if and when you can show us that it actually does ground the value of truth. It's like I'm living in my correspondence-coherence-or-whatever house on the seaside, and you come in and yell 'the tsunami of ungrounded values is coming! You must come with me to my house of instrumental truth where you will be safe.' Now, if your house looks ( ... )

Re: 4. paulhope June 16 2006, 15:53:23 UTC
If you can show a coherent way to ground the practical values that are meant to ground truth, then the appeal to come over to the instrumentalism hut will be persuasive.

You are, of course, absolutely right. I wrote this post more to try and motivate the inquiry into instrumentalism (by showing that there was something unsatisfying about the hut you're in) than to actually convince anybody that it was a good idea. My intention is to write more posts in the future looking into some of the problems with instrumentalism, and, for now, to have something to point back to when people say "But why are you trying to save instrumentalism anyway? It's stupid!"

I can only value certain things

I think this is a question that needs to be answered empirically. But my impression of people's capacity to value is that they often confuse what is instrumentally valuable with what is intrinsically valuable. Or, rather, whatever capacity we have to value doesn't always keep track of which things were valuable first. It turns into a big tangled mess ( ... )


truth and belief have little in common redslime June 15 2006, 15:27:12 UTC

The subject says it. It would be nice if our beliefs have some correspondence with the truth. It appears to enhance survival, etc. However, both historical and personal experience ought to tell us that there is no ought there. There are beliefs and there is truth. It seems to me that we ought not believe anything, as experience tells us that beliefs are most often wrong. We ought to create models that appear to predict the world of experience, and we ought to remember that they are just models.

Re: truth and belief have little in common paulhope June 15 2006, 18:10:31 UTC
I don't know how much we have to disagree about.

It would be nice if our beliefs have some correspondence with the truth.

Why? Was that what "It appears to enhance survival, etc." refers to? If so, why do you think having beliefs that correspond with the truth enhances survival?

However, both historical and personal experience ought to tell us that there is no ought there.

How so?

It seems to me that we ought not believe anything, as experience tells us that beliefs are most often wrong.

That sounds like getting rid of the baby with the bathwater to me. It also makes your position self-refuting, I think. If I accept your position, I can't believe in it. But where does that get me?

We ought to create models that appear to predict the world of experience, and we ought to remember that they are just models.

I agree. But I don't think that goes against anything I was saying. I mean, we would still have to, for example, believe (or what else do you mean by 'remember'?) that those models are just models.

Re: truth and belief have little in common redslime June 15 2006, 18:41:11 UTC

First, I'm not at all sure that having beliefs that correspond with truth enhances survival. It is just something many of us seem to believe. But yes, that is what "survival" referred to.

How so?

Doesn't your experience tell you that most of your beliefs were in fact wrong?

That sounds like getting rid of the baby with the bathwater to me.

Nothing is being thrown out at all. You just have a belief that says you must believe something. Does a dog need to have beliefs to bark at the mailman? Does a cow need beliefs to munch on the grass? Beliefs are not required to operate in the world.

remember?

Remembering does not require belief. When you remember a dream, do you believe it is true?

Re: truth and belief have little in common paulhope June 15 2006, 20:19:39 UTC

Does a dog need to have beliefs to bark at the mailman? Does a cow need beliefs to munch on the grass?

Yes and yes? I think this raises good questions about what a belief is (or, I'd rather think of it as "What is it to ascribe truth-value to?"), but I think dogs and cows have things that are approximately beliefs.

Even if those seem far-fetched, I think there's a stronger case for all people believing something, and for those beliefs being pretty much essential to operating in the world. I mean, again, are you saying you don't believe the stuff you just asserted (that beliefs aren't necessary for operating in the world)?

Remembering does not require belief. When you remember a dream, do you believe it is true?

No. But when I say that I remember that P, where P is a proposition, there's a presupposition there that P is true. And what we normally really mean when we say things presupposing our correctness is that we believe those things very strongly.
