Global warming and the Singularity

Sep 21, 2009 20:23

Last night I made a serious strategic error: I dared to suggest to some Less Wrongers that unFriendly transcendent AI was not the most pressing danger facing Humanity.

In particular, I made the following claims:
  1. That runaway anthropogenic climate change, while unlikely to cause Humanity's extinction, was very likely (with a probability of the order ( Read more... )

doomed, ai, environmentalism, grim meathook future


Comments 44

(The comment has been removed)

pozorvlak September 21 2009, 22:01:26 UTC
Aren't you assuming a population comparable to that of today?

Good point. I suppose I was tacitly assuming that a high population is essential for progress, particularly in a pre-industrial world. But that assumption bears examining.

Of course, we can do a lot with very little energy now - my favourite example is solar water pasteurisers, which are incredibly simple devices that produce drinkable water using only the heat of the Sun, a square metre of tinfoil, and 200 years of advances in microbiological understanding. Whether we'd retain said understanding through a Collapse is another question :-(

And once civilisation has reached a point a few centuries from where we are now, with asteroid mining, resource exhaustion is essentially an irrelevance. We've got to get there first, which is why a 21st-century ACC-inspired Collapse is such a serious problem. Give us another hundred years of industrial civilisation and we'd get stable fusion generators and could run all the carbon scrubbers we like, but by then the damage would be done ( ... )



necaris September 21 2009, 22:36:10 UTC
Your links scare me all the more because I know most of what they say is true. It makes me wonder, a bit, whether it'll be possible for me to go home without a boat in twenty years' time.



ext_207808 September 22 2009, 00:17:40 UTC
"You can't run an industrial civilisation on wood and charcoal ( ... )


pozorvlak September 22 2009, 07:57:35 UTC
So, it seems that you could, in fact, run a civilization at the 1800 level of industrial energy use by making use of bio-fuels like ethanol.

OK, then: you can't run an industrial civilisation on biofuels without an 85% die-off. Happy? :-)

Of course, it's more complicated than that: you could support many more people (or provide a higher standard of living) on that amount of energy using modern technology, which is much more efficient than 1800 technology. But I'm explicitly talking about a post-Collapse economy, which is hard to predict in detail but would probably have more in common with 1800 tech than with 2000 tech.


ext_207808 September 22 2009, 14:24:11 UTC
But if you are in favour of creating energy whilst emitting less CO2, you must believe that there are ways of sustaining our lifestyles and our population using renewables such as wind turbines, solar Stirling engines and hydroelectric.

If we conclude that post-AGW the human race wouldn't go extinct, but rather would slip back to a 1600-level civilization and have to re-do the Industrial Revolution, and that biofuels would allow industry as fossil fuels did in the first Industrial Revolution, then presumably you think that that new civilization would develop renewable sources of energy like solar stirling engines and hydroelectric dams, and would eventually rise to something like the level we see today.

A lot of people would die prematurely in this process. The lives lost would be on the order of 5 billion - most of the population. Many people would suffer. Most of those people would be people who don't even exist yet, because they haven't been born yet.
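(Aside: the arithmetic behind "on the order of 5 billion" can be sanity-checked in a couple of lines. The 6.8 billion world-population figure for 2009 is an assumed round number for illustration, not one quoted in the thread; the 85% die-off fraction is the figure from the exchange above.)

```python
# Back-of-envelope check of the die-off arithmetic.
population_2009 = 6.8e9   # assumed round figure for world population, 2009
die_off_fraction = 0.85   # fraction quoted upthread

deaths = population_2009 * die_off_fraction
survivors = population_2009 - deaths

print(f"deaths:    {deaths / 1e9:.1f} billion")    # 5.8 billion
print(f"survivors: {survivors / 1e9:.1f} billion") # 1.0 billion
```

So an 85% die-off of a 2009-sized population is indeed "on the order of 5 billion" dead, with roughly a billion survivors.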


pozorvlak September 22 2009, 19:19:36 UTC
But if you are in favour of creating energy whilst emitting less CO2, you must believe that there are ways of sustaining our lifestyles and our population using renewables such as wind turbines, solar Stirling engines and hydroelectric.

That doesn't strictly follow; but since I do in fact believe those things, I'll let it slide.

presumably you think that that new civilization would develop renewable sources of energy like solar stirling engines and hydroelectric dams, and would eventually rise to something like the level we see today.

Not necessarily. A lot of renewable energy sources rely on modern materials science and computation to attain acceptable efficiency - things that we only developed by going through the Oil Age. A rebooting civilisation wouldn't have this luxury. Coal might do as a bootstrap fuel, but I'm not convinced of that - it takes a lot of energy to liquefy, it's much dirtier, it's harder to extract, and so on.



Preventing death ext_207808 September 22 2009, 00:34:52 UTC
Miles said ( ... )


Re: Preventing death half_of_monty September 22 2009, 03:58:52 UTC
How many deaths is climate change responsible for per year at the moment?

About 300 thousand. But that's only with - what are we on now? - 0.6 degrees of warming.


Re: Preventing death necaris September 22 2009, 06:38:05 UTC
It should also be noted that Wikipedia is a) potentially BS, and b) evidently very privileged. How many deaths occur from TB every year? Aren't they preventable? (Plus, 2.5 million from *obesity*? I am incredibly wary of figures "due to obesity").


Re: Preventing death pozorvlak September 22 2009, 10:11:31 UTC
I can't find that table anywhere in the paper it's supposedly cited from. The tables I can find, however, do mention TB. Apologies for lack of formatting.

Low-and-middle-income countries
Cause Deaths (millions) % of total deaths
1 Ischaemic heart disease 5·70 11·8%
2 Cerebrovascular disease 4·61 9·5%
3 Lower respiratory infections 3·41 7·0%
4 HIV/AIDS 2·55 5·3%
5 Perinatal conditions 2·49 5·1%
6 Chronic obstructive pulmonary disease 2·38 4·9%
7 Diarrhoeal diseases 1·78 3·7%
8 Tuberculosis 1·59 3·3%
9 Malaria 1·21 2·5%
10 Road traffic accidents 1·07 2·2%

High-income countries
Cause Deaths (millions) % of total deaths
1 Ischaemic heart disease 1·36 17·3%
2 Cerebrovascular disease 0·78 9·9%
3 Trachea, bronchus, lung cancers 0·46 5·8%
4 Lower respiratory infections 0·34 4·4%
5 Chronic obstructive pulmonary disease 0·30 3·8%
6 Colon and rectum cancers 0·26 3·3%
7 Alzheimer's disease and other dementias 0·21 2·6%
8 Diabetes mellitus 0·20 2·6%
9 Breast cancer 0·16 2·0%
10 Stomach cancer 0·15 1·9%



Expected utility vs. probability ext_207808 September 22 2009, 00:59:08 UTC
Miles said: "I think the onus is on you to explain why it's so overwhelmingly probable that we'll be saved at the last minute by a deus ex machina."

- You don't have to think that it is overwhelmingly probable. A 1% probability of destruction of the human race by superintelligence might be a greater expected utility loss than certainty of moderate climate change damage.

This is especially true if you consider an action that also produces a 1% increase in a benevolent AI scenario. The utility of a benevolent AI scenario, if you are prepared to care equally about all the humans in the world, dwarfs the utility gain from preventing poverty caused by climate change. All the people dying of Hypertension, smoking, old age. All the people living lives that are hard and unrewarding. All the people who suffer terribly. All the cancer patients and their relatives. Benevolent AI cures all that, if it is indeed possible.
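(The structure of this argument is easy to make concrete. Every magnitude below is a placeholder assumption chosen purely to illustrate the shape of the comparison; none of them comes from the thread.)

```python
# Expected-utility comparison sketch: a small probability of an enormous
# loss versus certainty of a moderate one. Utility units are arbitrary.
p_uFAI_extinction = 0.01    # assumed 1% chance of unFriendly AI disaster
loss_extinction   = 1e12    # assumed utility loss of extinction
p_climate_damage  = 1.0     # treated as certain for this sketch
loss_climate      = 1e9     # assumed utility loss of "moderate" damage

eu_loss_ai      = p_uFAI_extinction * loss_extinction   # 1e10
eu_loss_climate = p_climate_damage * loss_climate       # 1e9

# Under these (entirely assumed) numbers, the low-probability AI risk
# carries ten times the expected loss of the certain climate damage.
print(eu_loss_ai / eu_loss_climate)  # 10.0
```

The point is only that a small probability does not by itself settle the comparison; it depends entirely on the assumed loss magnitudes.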


Re: Expected utility vs. probability half_of_monty September 22 2009, 03:56:59 UTC
a 1% probability of destruction of the human race by superintelligence might be a greater expected utility loss than certainty of moderate climate change damage

Yes, but we do not have certainty of moderate climate damage. [Actually, the world needs to work pretty damn hard at Copenhagen to get the slightest chance of only having moderate climate damage.] But anyway, these same issues are discussed in the context of climate change here.

There is much debate about the PDF (probability density function) of climate sensitivity: one prestigious recent paper shows its variance to be infinite (in which case far less extreme damage functions would suffice to prove the point Weitzman makes here); others claim not. But, as pozorvlak says, even the most optimistic of these studies puts P(s > 4.5) at around 5%. And recall again: climate sensitivity is the equilibrium temperature change you get if you manage to restrict atmospheric concentrations to double the pre-industrial level, which would be hard (and if you manage to somehow maintain the Earth's current albedo ( ... )
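(To see what a right-skewed sensitivity distribution with P(s > 4.5) ≈ 5% looks like, here is a small Monte Carlo sketch. The lognormal with a 3-degree median is an illustrative stand-in calibrated to that 5% figure, not any particular published PDF; a genuinely fat-tailed distribution, as in Weitzman's argument, would be worse.)

```python
import math
import random

# Illustrative climate-sensitivity distribution: lognormal, calibrated so
# the median is 3 degrees C and P(S > 4.5) is about 5% (the optimistic
# figure quoted in the thread). Assumed for illustration only.
random.seed(0)
median = 3.0
sigma = math.log(4.5 / median) / 1.645  # 1.645 = 95th-percentile z-score

samples = [random.lognormvariate(math.log(median), sigma)
           for _ in range(200_000)]

p_over_45 = sum(s > 4.5 for s in samples) / len(samples)
print(f"P(S > 4.5 C) ~ {p_over_45:.3f}")  # roughly 0.05
```

Even this comparatively thin-tailed stand-in leaves a few-percent chance of sensitivities well above 4.5 degrees; distributions with infinite variance concentrate far more of the expected damage in that tail.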


Re: Expected utility vs. probability pozorvlak September 22 2009, 08:20:35 UTC
I know little about unfriendly AI. But I would be surprised if the numbers were comparable.

It's an article of faith (or at least a common belief) among Singularity types that Friendly AI could bring about infinite life, either through consciousness upload or through cryonic revival, and hence that the payoff of building one would be infinite. All rather Pascal's Wager, if you ask me.


Re: Expected utility vs. probability ext_207808 September 22 2009, 12:39:42 UTC
"hence that the payoff of building one would be infinite."

You don't have to assign infinite positive utility to building an FAI. It is enough that it cures aging and prevents all other extinction risks, which it probably would.

Anyone who assigns infinite utility to an outcome is a bit odd; they would sacrifice anything else for any positive chance of that outcome occurring.
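(That "sacrifice anything for any positive chance" failure mode falls straight out of the arithmetic: any nonzero probability times an infinite payoff swamps every finite alternative. A two-line illustration, with arbitrary made-up magnitudes:)

```python
import math

# Pascal's Wager failure mode: an infinite-utility outcome dominates
# every finite expected utility, no matter how small its probability.
p_tiny = 1e-12                        # arbitrarily small positive chance
eu_infinite_bet = p_tiny * math.inf   # still infinite
eu_finite_good  = 0.99 * 1e15         # enormous but finite alternative

print(eu_infinite_bet)                   # inf
print(eu_infinite_bet > eu_finite_good)  # True
```

This is why expected-utility comparisons only stay meaningful if all the payoffs involved are finite.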



