It is not possible to have a conscious AI

Apr 25, 2016 11:12


The Chinese Room demonstrates that the outward appearance of understanding meaning is by no means an actual indicator of understanding.
Bona fide understanding is a main feature of conscious thinking. If something is not conscious, it cannot understand.
What goes on inside the Chinese Room is an analog of programming. AI is ( Read more... )


Conflating two concepts dragonlord66 May 12 2016, 06:38:06 UTC
I believe that you are conflating two concepts.

1) the question of whether a computer could actually be sentient

2) a thought experiment that shows that something can seem intelligent while actually operating according to a set of rules.
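The rule-following point can be sketched in a few lines of code: a hypothetical lookup-table "room" that returns fluent-looking Chinese replies by pure symbol matching, with nothing in the program that understands Chinese (the rulebook entries are invented for illustration).

```python
# A minimal sketch of the Chinese Room as pure rule-following.
# The rulebook entries below are invented for this example.
RULEBOOK = {
    "你好吗?": "我很好, 谢谢。",      # "How are you?" -> "I'm fine, thanks."
    "你叫什么名字?": "我叫小明。",    # "What's your name?" -> "My name is Xiaoming."
}

def chinese_room(symbols: str) -> str:
    """Return a reply by symbol lookup alone; nothing here understands Chinese."""
    return RULEBOOK.get(symbols, "对不起, 我不明白。")  # fallback: "Sorry, I don't understand."

print(chinese_room("你好吗?"))  # looks fluent, but it is only table lookup
```

To an outside observer the replies look competent, yet the program is operating purely on a fixed set of rules, which is the distinction point 2 is making.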

An implication of what you're saying is that as soon as we understand how the brain works to a sufficient level, humans will stop being sentient, because they will be (theoretically) deterministic.

What the Chinese Room does show is that we can't prove sentience, just as the brain-in-a-jar thought experiment shows we can't prove the world is real. This means we have to take it on faith that we are sentient and that the world is real. By extension, that everyone else is real is also an act of faith.


Programming AI does not entail an understanding of the brain nanikore May 14 2016, 00:37:55 UTC
There is no equivalence between programming and "a sufficient understanding of the brain", whatever that means. In another thread with forum member esl, I questioned this notion of understanding by raising the issues of underdetermination and exhaustiveness.

The Chinese Room is a demonstration against what Searle dubs "strong AI": http://www.iep.utm.edu/chineser/


Re: Programming AI does not entail an understanding of the brain dragonlord66 May 14 2016, 09:16:58 UTC
The brain works by massively parallel computing of the inputs supplied from the body. Memories are formed by neurons making permanent paths that reproduce the processing that occurred. Sentience arises out of this increasing complexity (less complex brains are less sentient).

Therefore any computer program that becomes sufficiently complex will start to show signs of sentience.

Take the Chinese Room to its logical extent. We (outside observers) know that there is a person in that room, and we also know that people can learn languages. Therefore it is not outside the realms of possibility that the person will learn written Chinese, especially if they start playing around with the responses.

That's before we start getting into things like true AI programming, which uses analogues of brain neurons. These systems are trained, not programmed, and are already showing sufficient complexity that we can't identify how they arrived at certain answers.
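The "trained, not programmed" distinction can be illustrated with a minimal sketch: a single artificial neuron learns the AND function from examples via the perceptron learning rule, and no AND rule is ever written into the code (the learning rate and epoch count are arbitrary choices for this toy example).

```python
# "Trained, not programmed": a single perceptron learns AND from examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # (inputs, target)
w = [0.0, 0.0]  # weights, adjusted by training, never set by hand
b = 0.0         # bias
lr = 0.1        # learning rate (arbitrary for this toy example)

def predict(x):
    """Fire (1) if the weighted sum of inputs exceeds the threshold."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Perceptron learning rule: nudge weights toward each training example.
for _ in range(20):
    for x, target in data:
        err = target - predict(x)
        w[0] += lr * err * x[0]
        w[1] += lr * err * x[1]
        b += lr * err

print([predict(x) for x, _ in data])  # learned behavior: [0, 0, 0, 1]
```

The AND behavior emerges from the training loop adjusting weights, which is the sense in which such systems are trained rather than explicitly programmed.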

Which brings us back to the brain in a jar: how can you prove that you're not just a computer?


Re: Programming AI does not entail an understanding of the brain nanikore May 14 2016, 10:51:25 UTC
If complexity gives rise to consciousness, then cell phones today would be conscious ( ... )


Re: Programming AI does not entail an understanding of the brain dragonlord66 May 14 2016, 12:04:48 UTC
1) A transistor is not a circuit. There are now dedicated neuron chips, but a transistor is a component in a gate; think of it as a digital protein. Also, the circuits in a traditional computer are not actually that complex compared to a brain. While they may have a similar number of discrete components, each component connects to only one or two other components, whereas a brain has orders of magnitude more connections between components ( ... )


Re: Programming AI does not entail an understanding of the brain nanikore May 15 2016, 02:36:02 UTC
Okay. It seems that the contention has been narrowed down to one single issue: that of complexity ( ... )


Re: Programming AI does not entail an understanding of the brain dragonlord66 May 15 2016, 10:15:57 UTC
At this point you're going to have to define "conscious", as I don't know what definition you're working from. To my mind, a rat isn't conscious in a sentience sense. I believe that there are tests for self-awareness already (such as recognising yourself in a mirror ( ... )


Re: Programming AI does not entail an understanding of the brain nanikore May 15 2016, 11:08:20 UTC

Again, you're focusing on function, being a card-carrying functionalist.

Consciousness is not a function. Something that handles syntax without comprehension is not understanding anything. This goes back to my original post. The argument hasn't changed one bit.

You've repeatedly asked the same questions in your subthread as another user asked and this time it's no different.

As I've said to user esl in the other subthread,

A conscious entity, i.e. a mind, must possess

1. Intentionality
2. Qualia

Exactly which logical errors does the p-zombie argument contain? P-zombies provide indistinguishable output just as Chinese Rooms do. I call them "consciousness rooms".
There is no full simulation, because it would be simulacra, as per my previous reply. Not only that, a supposed full account contains no guarantees against underdetermination.
The brain in a jar is a red herring for this topic, for reasons I've stated in an earlier reply.
Whether something "learns" is irrelevant. Machine learning programs are not conscious; to say that they are would be relying ( ... )


Re: Programming AI does not entail an understanding of the brain dragonlord66 May 15 2016, 12:49:40 UTC
How do you know you have intentionality or qualia? How do you know the people around you do ( ... )


Re: Programming AI does not entail an understanding of the brain nanikore May 16 2016, 08:28:24 UTC
Do you know exactly what it's like to be me? If you do, then you can deny subjective experience ( ... )


Re: Programming AI does not entail an understanding of the brain dragonlord66 May 16 2016, 09:39:33 UTC
We are talking about orders of magnitude more complexity than even the Space Launch System. Each neuron can make tens of thousands of connections in the brain, and there are more than 200 billion neurons in the human brain (source: https://www.sciencedaily.com/releases/2010/11/101117121803.htm... )
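The scale claim can be checked with a back-of-envelope calculation. The brain figures come from the comment above; the CPU transistor count and fan-out are rough assumptions added for illustration, not from the thread.

```python
# Back-of-envelope comparison of connection counts.
# Brain figures: from the comment above. CPU figures: rough assumptions.
neurons = 200e9            # ~200 billion neurons
synapses_per_neuron = 1e4  # "tens of thousands" of connections each
brain_connections = neurons * synapses_per_neuron

transistors = 5e9          # assumed transistor count for a modern CPU
fanout = 3                 # each component connects to only a few others
cpu_connections = transistors * fanout

print(f"brain: ~{brain_connections:.2e} connections")  # ~2.00e+15
print(f"cpu:   ~{cpu_connections:.2e} connections")    # ~1.50e+10
print(f"ratio: ~{brain_connections / cpu_connections:.0f}x")
```

Under these assumptions the brain has on the order of 100,000 times more connections, which is the "orders of magnitude" gap the comment is pointing at.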


Re: Programming AI does not entail an understanding of the brain nanikore May 18 2016, 06:40:27 UTC
Where does the bottom stop? Billions? Tens of thousands? There is no clear delineation anywhere. In contrast, based on the definition of what makes an organism belong to the animal kingdom, we can make delineations. Lack of delineation would make a definition incoherent ( ... )


Re: Programming AI does not entail an understanding of the brain dragonlord66 May 18 2016, 09:12:08 UTC
Logical assertion is how most of philosophy works; the thought experiments are just ways to provide a mental playground to test certain ideas ( ... )


Re: Programming AI does not entail an understanding of the brain nanikore May 19 2016, 06:32:54 UTC
Logical assertion is a step in a series of steps in a logical proof. What you offered was a flat assertion in the form of ( ... )


