kmo

The Doomer Dies a Thousand Times

Jan 15, 2017 21:06


Sam Harris has a rap about how people who like to think about the future of artificial intelligence seem unable to attach the appropriate emotional affect to the dystopian possibilities they envision around AI. Loss of human freedom? Bad in theory, but the details are fascinating. Tweak a variable and see if things get better or worse. Which is more interesting? That's not a fair question; peace and prosperity are pleasurable to live through but boring to contemplate.

The AI catastrophe is also fun to think about because it assumes that the present will continue in its current mode. If you're working in the tech sector and feeling no real risk of unemployment or displacement in the short term, the basic working premise of an AI catastrophe in future decades rests on the assumption that you personally will continue to do well in the here and now.

I almost always contrast the techno-visionary mentality, the one that gravitates to the possibilities around developments in artificial intelligence, both utopian and dystopian, with the Doomer mentality. The Doomer is drawn to scenarios in which the current state of civilization is knocked back on its heels, perhaps permanently. She is less likely to have a secure and lucrative job in tech, though the Doomer mentality wins converts even in the nests of insular privilege. The Doomer is more likely to resent those who like to think about robots and advanced computation, and for her, the upside of impending collapse is that it will take down the smug and exalted ones as it grinds everyone into the dirt.

When it comes to emotional affect, the Doomer is not the mirror image of the technophile. When the Doomer contemplates the seizing up of the technology-enabled processes that support billions of people on a planet suited to sustaining half a billion without industrial agriculture, she is quite likely to imagine, in vivid detail, the grief, the sense of loss, and the despair that would be the common experience of humanity during the dieback.

The worst case scenario for the Doomer is the crash that crushes everyone EXCEPT the elite. Somehow, their plans for underground luxury bunkers, or mountaintop resort strongholds, or libertarian seastead paradises work out, and they do quite well while everyone else suffers. In this scenario, the Randroid Lords of Silicon Valley hold self-congratulatory colloquies on their artificial islands and cluck in false pity for the poor, clueless masses.

"They were always stupid, vulgar, and lazy, but it's too bad they had to die in such abject misery and barbarism. And it's tragic that the last of them have prolonged their misery for so long. It would be so much easier for everyone if they'd just get on with it, cease their interminable, pointless strivings, and make way for the rightful inheritors of the Earth."

In my own skin, I don't worry about the big picture much. My genuine fears are mostly personal, and sometimes, admittedly, quite irrational. I fear for my sons, for the economic landscape in which they will be expected to carve out a living, for their likely encounters with police, with opiates, with self-sabotage and low expectations. Those fears, I think, are rational. The irrational comes when I'm alone in the dark and feel a tingle of supernatural dread. I turn on the lights, even though my rational mind knows that I am alone. My rational mind provides poor comfort.

ai, pessimism, technology
