So, I'm writing about a girl named Ada and a robot named Galatea and an ornery old barkeeper named Deuce X. McKenna, and I'm doing research on strong A.I. on Wikipedia to make sure my facts are at least believable if not right, and I start reading about the Turing test and such, and then I get into the
"Chinese Room" thought experiment, John Searle's argument against strong A.I.
I was actually saying the OP seems to mash all this up in one big heap called "A.I.".
It can't learn and adapt.
I anticipated that someone would bring this up ... but not you. Learning theory opens up that nasty bag of worms called psychology. That's what the article touches on but is too timid to engage directly. Did the man learn to read? This is what you and the OP are talking about. 'How we know' is the most direct inquiry. Is reading essentially behaviorism? Some say learning is mimicking the behavior of others. Would a reading comprehension test for A.I. be merely accessing some expected response? This final thought is what the OP jumped to, and it is where I demand that we define intelligence before we go off looking for it (or, heaven forbid, go off making it).
This is where you stabbed (at my heart!): what a true intelligence, and any intelligence that passes the Turing Test, can do is improvise.
The implications of this definition for the mentally or physically challenged are devastating. Imagine one of those aliens from Star Trek who is essentially energy. Would they not argue that humans aren't intelligent because we are corporeally bound? It's not just that we are physically challenged from their perspective; the alien could also mean that we don't know how to transcend our bodies, and because spacetime morphs solid states (as changing forms), we can't improvise to match universal reality. In other words, we die (in the terminal sense, where there is no afterlife). This may sound off on a tangent, but your definition rattled a nerve from educational psychology.
Still, I totally concur that the "number of instructions would be [unbound]". (Brackets mine)
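The worry above, that a machine might pass a comprehension test by "merely accessing some expected response," can be sketched concretely. A Chinese-Room-style responder built as a lookup table only handles inputs someone anticipated, which is exactly why the number of instructions would have to be unbounded to cover open-ended conversation. A minimal illustration (the table entries here are hypothetical, not from any real system):

```python
# A Chinese-Room-style responder: a fixed rulebook of anticipated
# inputs mapped to canned replies. It "converses" without understanding.
RULEBOOK = {
    "hello": "Hello! How are you today?",
    "how are you?": "I'm fine, thank you.",
}

def respond(message: str) -> str:
    """Return the canned reply for an anticipated input."""
    key = message.strip().lower()
    # No entry means no answer: the table cannot improvise, so covering
    # every possible input would require unboundedly many rules.
    return RULEBOOK.get(key, "I don't understand.")
```

Any input outside the rulebook exposes the trick; improvisation is precisely what this design lacks.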
It doesn't necessarily have an inherent knowledge of self.
I think that only matters to (some) humans. There are some brilliant humans -- who score high on our intelligence tests -- who act without regard to other selves. The tyrannical narcissists from history come to mind. A single instance of unawareness of some other self meets my universal standard for lacking total awareness; in other words, it's silly to talk about A.I. if the machine is only aware of itself (because it wouldn't give a damn about our conclusions, i.e. we wouldn't exist to it). So self-awareness isn't necessarily a piece of the A.I. puzzle. It is a piece that people want to see in A.I. only because a consciousness could drive humane morals and prevent machines from taking over the world and such.
Reply
And yes, this does strike a nerve with educational psychology. But again, the late hours proved detrimental to my word choice. (Though I do understand your point about the energy beings -- claiming that an autistic person is less than human because of their mental limits is something I do not agree with.) When I said that even a computer which passes the Turing test "doesn't necessarily have an inherent knowledge of self," I meant it more in the sense of consciousness of self, of conscience. A computer does not consider the moral implications of what it does, potentially not even one which passes the Turing Test. Neither does a rabbit. But complete narcissism isn't necessarily a lack of morality; it's just a lack of communal morality -- favoring the self first is a completely valid moral standpoint, though a personally distasteful one.

Of course, you are right: getting into discussions of morality is slightly outside the original topic. While I understand your position on the relevance of morality to true A.I., I generally disagree, and not because I am afraid of Skynet. I just expect a true artificial intelligence to be able to respond with more than a rigidly programmed design (or even a flexible and adaptable design).
I don't think we'll ever have anything at the level which will actually pass the Turing Test, let alone an emotional or moral test.