kmo

The Wrong Kind of Collapse

Jan 05, 2017 00:09


As a former starry-eyed techno-utopian and also a (mostly) former peak oil doomer, I can switch-hit when it comes to participating in conversations about both rapid advances in artificial intelligence and industrial civilization's crucial dependence on fossil fuels. There are others with a foot in both camps, but not many.

It seems as though the majority of players in either arena have no stomach for the conversation going on in the other. Sam Harris put his finger on this phenomenon in his TED talk on the dangers of artificial intelligence. He described his lack of appropriate emotional affect when talking about the potential existential risks posed by artificial super-intelligence. Even if you take seriously the possibility that godlike AI might destroy humanity either out of malice or, more likely, out of sheer indifference to our needs and interests, it's hard to feel any sense of fear around that scenario. If you're thinking seriously about the topic, you probably enjoy talking about technology and spinning wild "what if" scenarios about the future.

We don't fear death by science fiction because we think SF is cool, even SF stories about human extinction.

To which I would add, if you don't think SF is cool, then you probably have no patience for talk of the dangers posed by artificial intelligence. You simply will not countenance any discussion of the topic, and if someone manages to drag you kicking and screaming into such a discussion, your observations are likely to be embarrassingly superficial, even if you've demonstrated a capacity for subtle analysis on other topics.

It seems like doomers who revel in spinning scenarios of industrial civilization falling prey to its key dependency on petroleum or to financial schemes run amok in an era of toothless and clueless regulatory regimes should be all over the idea that software is eating the world, starting with routine cognitive labor and climbing the skill ladder. You can't have a consumer economy without jobs, right? That should be right up the alley of forecasters who say the day is coming when the whole system will grind to a halt.

But no. The idea that artificial intelligence will play any part in the run-up to the inevitable day of reckoning is a non-starter for most of the doomers I encounter on social media. They just won't have it. Moore's Law is for geeks and fantasists. It has no place in any adult conversation about the potential shelf-life of industrial civilization. If it involves computers, machine learning or rapidly advancing technology, they reason, it must be a techno-utopian wet dream, even if the end result is economic or environmental collapse.

The take-home message for me is that technophiles and technophobes alike pick their future scenarios based on their subcultural allegiances and their emotional needs. Once they select a scenario, they can bring intelligence, scholarship and great creativity to elaborating and supporting it, but the actual selection process is pre-rational.

Exactly 500 words!

peak oil, ai, collapse
