Global warming and the Singularity

Sep 21, 2009 20:23

Last night I made a serious strategic error: I dared to suggest to some Less Wrongers that unFriendly transcendent AI was not the most pressing danger facing Humanity.

In particular, I made the following claims:
  1. That runaway anthropogenic climate change, while unlikely to cause Humanity's extinction, was very likely (with a probability of the order …)

doomed, ai, environmentalism, grim meathook future


pozorvlak September 21 2009, 22:01:26 UTC
Aren't you assuming a population comparable to that of today?

Good point. I suppose I was tacitly assuming that a high population is essential for progress, particularly in a pre-industrial world. But that assumption bears examining.

Of course, we can do a lot with very little energy now - my favourite example is solar water pasteurisers, which are incredibly simple devices that produce drinkable water using only the heat of the Sun, a square metre of tinfoil, and 200 years of advances in microbiological understanding. Whether we'd retain said understanding through a Collapse is another question :-(
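The claim that a square metre of reflector is enough is easy to sanity-check. Here's a rough back-of-envelope sketch in Python; the insolation, efficiency, and pasteurisation figures are assumed round numbers, not measurements from any particular device:

```python
# Back-of-envelope check: can ~1 m^2 of sunlight pasteurise a useful
# amount of water? All figures below are rough assumptions.

SOLAR_FLUX = 1000.0      # W/m^2, peak insolation on a clear day (assumed)
AREA = 1.0               # m^2 of reflector
EFFICIENCY = 0.3         # fraction of flux actually heating the water (assumed)
SPECIFIC_HEAT = 4186.0   # J/(kg*K), water
T_START, T_PASTEURISE = 20.0, 65.0  # deg C; ~65 C held briefly kills most pathogens

def litres_per_hour():
    power = SOLAR_FLUX * AREA * EFFICIENCY               # watts into the water
    joules_per_litre = SPECIFIC_HEAT * (T_PASTEURISE - T_START)
    return power * 3600 / joules_per_litre

print(f"{litres_per_hour():.1f} L/h")  # -> 5.7 L/h under these assumptions
```

Even with a pessimistic 30% efficiency, that's several litres of drinkable water per hour of good sun, which is why the devices are worth building at all.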

And once civilisation has reached a point a few centuries from where we are now, with asteroid mining, resource exhaustion is essentially an irrelevance.

We've got to get there first, which is why a 21st century ACC-inspired Collapse is such a serious problem. Give us another hundred years of industrial civilisation and we'd get stable fusion generators and could run all the carbon scrubbers we like, but by then the damage would be done.

Without knowing too many of the details, would it not be possible to essentially 'switch off' any AI system based on current technology?

You've got to be fast enough, and you've got to realise what's happening. If you're going to build a human-or-greater-level general AI and potentially expose Humanity to extinction, you must assume that the AI can become smarter, faster, and sneakier than you are without your noticing it. Google for "AI Box".



