Singing and dancing entertainment bots.
Last week, I read an article on my news page about how people hope, in the near future, to have robots help care for the elderly. This sort of freaks me out.
As it is, I already hate talking machines. I can't stand them. Most people think I do not get angry often. This is true with human beings, but I get very angry very quickly with automated phone menus or talking ATMs. "I hate synthetic conversations."
But more than this silly annoyance of mine, I have read enough science fiction and studied enough psychology and philosophy to worry about the concept of AI.
See, what I don't get is why people freak out so much about cloning, yet seem not to care at all about the creation of artificial intelligence. In cloning, you aren't creating anything. You aren't even fully copying anything. Why is this considered "playing God" (a term I think is totally misused, but still) more so than creating AI is? If "playing God" strikes you as bad, that implies you think there is an area of life that only God should mess with. Would not that area be the creation of a soul? As I see it, in cloning, you are taking natural materials just as you do in any other physical endeavor. God would then make the soul. He would be the only one really creating anything at all. But with AI, the programmers are seeking to create a soul. This soul is inanimate; it is software. Isn't that more so "playing God" than cloning? We hardly understand the workings of our own souls.
Psychology is constantly changing as a science. It tries very hard to be a science, but much of what it wants to know about the soul cannot be observed directly, and science requires observation. (Note, I am not trying to put down psychology; there are plenty of ways of gathering knowledge unrelated to science. I am just saying that the scientific method is not always the best one for psych.) What of this? Only that before psychology has enlightened us on the soul's workings, we are already trying to create souls of our own with AI.
Maybe the lack of concern comes from a disbelief that AI will ever advance to the point of books such as I, Robot or movies such as Ghost in the Shell. I think it is not at all far-fetched that it will. When a machine has self-awareness, what then? But even if it ends up not being possible, is the pursuit of it still ok? Is it ok to try for something inappropriate even though it will never occur? Most people opposed to cloning humans do not really care whether or not it will ever be possible to do so.
Maybe the lack of concern comes from the idea that a soul is an entirely separate thing from matter. I think it is. But why then the concern over cloning? If every clone is going to have a unique soul anyhow, what is the big deal?
It just seems inconsistent to me.
I think that cloning and AI will cause the same or similar societal problems.
But it starts slowly. Soon, we will have robo-nurses. Is this a good thing? I don't think I want a non-emoting nurse looking after me. I do not want to listen to a robot play music. These things require soul. There is a connection between two human beings when one cares for another, when one plays music for another. It is communication of more than just words; it is a communication of emotion between souls. How will a mechanized society, with fake this and that everywhere, be good for the human soul? We need real human interaction, not almost-but-not-quite machines.
And even if we do succeed at creating AI that can truly care, is that what we want either? It just seems to be slowly opening a can of worms.