The Automatic Sweetheart Has Her Own Feelings, You Know

Feb 01, 2017 09:31

Re-reading Sam Brinson's Are We Destined To Fall In Love With Androids?, and my response to it, I noticed a pattern among the stories to which I linked, the ones in which I showed how much the "literature of the future" (which is, in fact, really about the present, and ways to address the present) has addressed the question of "human / cyborg relations" (to use fussy C-3PO's term). One of the overriding questions asked in these stories, one elided in 2001 and addressed directly if awkwardly in 2010, was this:

What is our moral obligation to the robots we create?

In a lot of ways, science fiction writers use this as a metaphor for the question of our moral obligation to our children and our progeny, but as experience with actual AI starts to get real, we (science fiction writers) are already starting to ask questions about our moral obligations to our creations. This isn't a new problem. The very first "artificial life" story, Frankenstein, addresses the issue head-on in the last dialogues between Victor and the Monster, and later between Walton and the Monster.

If you, like me, believe that consciousness is the story we tell ourselves about ourselves, a way of maintaining a continuity of self in a world of endless stimuli and the epiphenomenal means by which we turn our actions into grist for the decisions we make in the future, then maybe there will never be conscious robots, only p-zombie machines indistinguishable from the real thing, William James' automatic sweetheart.

But if we want our robots to have the full range of human experiences, to be lovable on the inside as much as we are, then we're going to have to give them an analogous capacity to reason, to tell themselves stories that model what might happen, and what might result, and therefore we have to ask ourselves what moral obligations we have toward people who are not entirely like us, or whose desires are marshalled in a way that suits us entirely.

My own take has been rather blunt: we are obligated to treat actually existing conscious beings as moral creatures, and they have the rights and responsibilities of all moral creatures. At the same time, the ability to relieve them of the anxieties and neuroses of human beings, our own vague impulses shaped by evolutionary contingency that make us miserable (and they do: happy people lack ambition; they do not build empires), may make them more moral than we are. (Asimov addressed this a lot; in many ways he was far ahead of his time.)

science fiction, philosophy, writing, sex
