And then I found that my naive theory of intelligence didn't hold water: intelligent people were just as prone as less intelligent people to believing in obviously absurd superstitions. Only their superstitions would be much more complex, elaborate, rich, and far-reaching than an inferior mind's.
Intelligent people more often (though not always) prefer to hold beliefs supported by argument. But they're also more capable of constructing arguments to support their absurd beliefs, if they're so inclined.
I do think that you're overstating Jeff's arguments here, though. In particular, I think that your statement that Jeff "concludes that the ways a value system grounded on happiness differ from my intuitions are problems with my intuitions" is a misleading quote. The full sentence is, "But when happiness comes so close to fitting I have to consider that it may be right and the ways a value system grounded on happiness differ from my intuitions are problems with my intuitions," and that "I have to consider that it may be right" is significant. He's saying that he "has to consider" that the difference may be a problem with his intuitions; you say he "concludes" that it is.
As a side-note, Yudkowsky's arguments regarding Coherent Extrapolated Volition seem to be trying to save preference utilitarianism from exactly the sort of dead-ends Jeff's arguments indicate.