Every watt is not equal

Sep 09, 2007 23:03

Having spent the past few months working at a renewable energy organisation, I've learned more than most people probably want to know about the workings of power grids. I find the whole thing quite fascinating, and I'll try to explain a few issues that really matter for reducing pollution from electricity generation. All the specifics will be with reference to Puget Sound, but the issues are fairly general.

In the US, utilities include a "fuel mix disclosure" with electricity bills. [I just noticed that the same is now true in the EU; it wasn't when I lived there]. This is a breakdown of what percentage of the electricity delivered by that utility was bought from which source, over the past month or year (I forget the details). In practice, this is only an approximation of the proportions of your individual electricity usage, and explaining why will take some time, so bear with me.

This has ended up rather jumbled because I wrote it in several sessions; I hope it makes sense.

Demand satisfaction

Storage of electricity is inherently inefficient and power grids tend to have few, if any, resources to store already-generated electricity. So there's a huge premium on the ability to generate precisely as much electricity as is needed at any given moment. Generate too much, and some ends up simply being sent to ground as pure waste ("dump load" - sometimes this has to be done because generation can't be shut down fast enough without damaging hardware). Generate slightly too little, and customers experience dropouts; interruptions to power so momentary that most users wouldn't notice, but which can cause serious damage to sensitive, unprotected equipment. Fall a little further short, and customers get brownouts, which are longer voltage reductions that can damage a lot of electrical hardware. Fall further short still, and power cuts become necessary, in order to at least deliver a usable voltage to some customers.

Shortfalls can be caused by simply not having enough total generation capacity, as is common in developing countries, and was behind California's rolling blackouts of a few years ago. But more often, they are to do with generation assets not being able to respond fast enough to demand that was higher than predicted. One consequence of this is that utilities put a great deal of resources into trying to forecast demand so they can be ready for it. This is why so much is known about things like the enormous power spikes during breaks in sports games - if the utilities fail to see that coming, the result will be a huge blackout.
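To make the balancing problem concrete, here's a toy sketch. The outcome names follow the paragraph above, but the thresholds are entirely my own invention, not real utility practice:

```python
def grid_outcome(generated_mw, demand_mw):
    """Classify what happens for a given supply/demand gap.
    Thresholds are illustrative only, not real grid numbers."""
    gap = generated_mw - demand_mw
    if gap > 0:
        return "dump load"        # surplus sent to ground as waste
    if gap == 0:
        return "balanced"
    shortfall = -gap / demand_mw  # fractional shortfall
    if shortfall < 0.01:
        return "dropouts"         # momentary interruptions
    if shortfall < 0.05:
        return "brownout"         # sustained voltage reduction
    return "power cuts"           # shed load to keep voltage usable

print(grid_outcome(1010, 1000))   # → dump load
print(grid_outcome(995, 1000))    # → dropouts
print(grid_outcome(900, 1000))    # → power cuts
```

The asymmetry is the point: a surplus is merely wasteful, while even a small deficit hurts customers, which is why forecasting errs on the side of over-generation.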

Base load

Since there is always some demand on a typical power grid, and the cheapest ways to generate electricity are all very slow to change their output, there is usually a base load set of generators that are cheap to run but inflexible. Examples of this would be coal-fired generation (China; much of the US, east of the Rockies), nuclear power (a proportion of Britain's and the US's base load; I think France is the only country where it dominates), geothermal (Iceland; a share in NZ) or hydro (main source for NZ; 75% of all power generation in the US Pacific Northwest).

Hydropower is an odd one in that it's about the only thing that can be used as base load or in response to demand (which I'll come to below). It's suitable for base load because although next year's hydro availability can't necessarily be forecast, tomorrow's can with great accuracy, and almost all the cost of a hydro facility is front-loaded in the construction.

If tidal power hits the mainstream [no pun intended] it could potentially supply some base load, because its output is totally predictable. A technology that somehow smooths over the uneven availability (most energy halfway between high and low tides, none at all the moment the tide is turning) could provide an entire base load, but even without that it could take a lot of load away from fuel-burning or hydroelectric power. Many power plants that can't be switched on and off abruptly are perfectly capable of modulating their output smoothly to cope with a predictable variable like the flow of the tide.
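The tidal availability pattern described above can be sketched numerically. This is a deliberate simplification (real tidal power depends on the cube of current speed and on local geography; I'm just using the absolute value of a sine over the roughly 12.4-hour semidiurnal cycle as a stand-in):

```python
import math

TIDAL_PERIOD_H = 12.42  # approximate semidiurnal tidal cycle, hours

def tidal_availability(t_hours):
    """Fraction of rated output available at time t (t=0 at high tide).
    Simplified model: output peaks halfway between high and low tide
    and drops to zero at the moment the tide turns."""
    return abs(math.sin(2 * math.pi * t_hours / TIDAL_PERIOD_H))

print(round(tidal_availability(0), 3))                  # slack at high tide
print(round(tidal_availability(TIDAL_PERIOD_H / 4), 3)) # mid-tide maximum
```

The useful property is that the whole curve is known years in advance, so a plant that can modulate output smoothly can plan around it.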

Unpredictably available power

I don't recall if there's a proper technical term for these (there must be, really), but I'm referring to the majority of renewable energy sources, with which generation is effectively free, but availability depends on environmental conditions. In current use, wind and solar power are the key examples here: all of their costs, both financial and environmental, come from the installation of the hardware, so it makes sense to always use as much power as they can provide, but that quantity is inherently unpredictable and usually only a small fraction of their potential output.

In practice, this limits how large a proportion of electricity demand can be supplied by wind or solar power, without some kind of on-demand generation available to make up the gaps. It also means that building enough wind turbines (or solar arrays) to theoretically provide 100% of power demand most of the time would be massively wasteful, because on a windy day there would be a large dump load through most of the day.
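A quick toy calculation shows the overbuilding problem. Suppose wind capacity is sized so that a typical day's output covers demand; the hourly figures below are made up purely for illustration:

```python
# Made-up hourly figures (MW): demand, and output on a windy day from a
# fleet sized to cover demand on an average day.
demand       = [600, 550, 700, 900, 800, 650]
windy_output = [850, 900, 950, 920, 880, 900]

# Energy actually used vs energy dumped, hour by hour
used   = sum(min(d, w) for d, w in zip(demand, windy_output))
dumped = sum(max(0, w - d) for d, w in zip(demand, windy_output))

print(f"used {used} MWh, dumped {dumped} MWh")
```

With these (invented) numbers, over a fifth of the windy day's generation is dump load, and that waste gets worse the closer the fleet is sized to covering all demand in average conditions.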

Peaking power

For at least part of a typical day, and certainly any time the grid is operating close to its maximum capacity, the base load can't satisfy demand. Because these peaks are not perfectly predictable, and sometimes even when they are the base load generators can't respond fast enough, peaking power plants are needed. In theory hydro plants can provide peaking power because just opening a sluice gate provides an almost instant increase in output; in practice I think this is rather rare because the fluctuations in river flow downstream would be dangerous, and hydro power is a much safer base load if the management of the reservoir can be planned in advance. Everywhere that I know of, peaking power comes from natural gas fired turbines, which respond quickly to demand, don't cost a great deal to build and maintain, but use a fuel which is both expensive and unpredictably priced.
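The dispatch logic described above can be sketched as a simple merit order: cheap, inflexible base load runs first, and gas peakers fill whatever remains. Plant names and capacities here are invented for illustration:

```python
# Plants in dispatch order: base load first, peakers last.
plants = [
    ("hydro base",   2000),  # MW
    ("coal base",    1500),
    ("gas peaker A",  400),  # only runs when demand exceeds base load
    ("gas peaker B",  400),
]

def dispatch(demand_mw):
    """Return (plant, MW) pairs meeting demand in merit order."""
    schedule = []
    remaining = demand_mw
    for name, capacity in plants:
        take = min(capacity, remaining)
        if take > 0:
            schedule.append((name, take))
        remaining -= take
    return schedule

print(dispatch(3700))  # a peak: all base load plus 200 MW of gas
```

The last plant to be partially dispatched is the one that responds to any small change in demand, which is what the "marginal capacity" section below is about.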

Peak load

Peak load is the maximum demand placed on a grid. In most places, utilities know in general terms when it will happen and roughly how big it will be, but the details depend on the weather plus some noise. In Seattle, peak loads are in the winter because of the combination of electric heating, relatively low use of air conditioning, and days that get short enough for people to be getting home and turning lights on before workplaces have closed for the day. In Arizona, peak loads are late afternoon on the hottest summer days, because air conditioning accounts for the biggest share of demand.

This is important because it determines how useful a given technology is. For instance, solar power is the best suited renewable energy source for most [probably all, but I don't have the data to hand] of Arizona, and also the Puget Sound area [I'm not joking - our winds are far too unreliable, as anyone who's tried to sail round here will appreciate]. But it's much more useful in Arizona, because its highest output is not far off correlating with peak demand, whereas round here solar power provides no output whatsoever at the usual peaks in demand.

Generation efficiency

No form of electricity 'generation' actually creates energy, as you probably realise. It's always a matter of converting some sort of natural energy store (originating more or less directly with the sun) into electricity. For most forms of power plant, there are several steps to this conversion; for instance a gas-fired plant burns the gas, which heats some water till it evaporates, then the steam drives a turbine which rotates some magnets, and the magnets induce a current in wires by a process that I only understand in general terms.

The important thing is that at every stage of conversion, some energy is lost. Not all the heat from the fuel goes into the water, not all the energy of the steam ends up turning the turbine, the turbine has some friction, and so on. How much is lost depends on the design and build quality of the generator, plus there are generally economies of scale and trade-offs between responsiveness (the ability to increase or decrease output quickly) and efficiency. The upshot of this is that base load generation tends to be a lot more efficient than the peaking plants.
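Since the losses compound, the overall efficiency is just the product of the stage efficiencies. The stage figures below are illustrative guesses, not measurements of any real plant:

```python
# Fraction of energy surviving each conversion stage (invented figures)
stages = {
    "combustion -> steam":  0.85,
    "steam -> turbine":     0.55,
    "turbine -> generator": 0.95,
}

# Overall efficiency is the product of the stages
overall = 1.0
for stage, eff in stages.items():
    overall *= eff

print(f"overall efficiency: {overall:.1%}")
```

Even with each stage looking respectable on its own, the chain as a whole delivers well under half the fuel's energy as electricity, which is roughly the right order of magnitude for thermal plants.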

Transmission losses

Transmitting electricity always results in some loss. There are many factors affecting this, but all other things being equal, the longer the distance it is transmitted over the more will be lost. A city's power delivery would be most efficient if the electricity were generated in the centre of the city, but this has obvious drawbacks so it's not the norm.
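The distance effect comes from resistive loss: power lost is the square of the current times the line's resistance, and resistance grows with length. A rough sketch (the line parameters are illustrative; real lines vary widely, and I'm ignoring per-phase details):

```python
def line_loss_mw(power_mw, voltage_kv, ohms_per_km, length_km):
    """Resistive loss in MW: P_loss = I^2 * R, with R proportional to length."""
    current_ka = power_mw / voltage_kv     # I = P / V
    resistance = ohms_per_km * length_km   # R grows linearly with distance
    return (current_ka ** 2) * resistance  # kA^2 * ohm = MW

short = line_loss_mw(500, 500, 0.03, 50)
longer = line_loss_mw(500, 500, 0.03, 500)
print(short, longer)  # ten times the distance, ten times the loss
```

This is also why transmission uses very high voltages: the same power at higher voltage means lower current, and the loss falls with the square of the current.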

Transmission bottlenecks

A given transmission line has a limited capacity. I don't exactly know if this is an absolute limit, or a matter of efficiency decreasing as the load goes up, but either way there's a practical limit to how much can go through a given line. Additional lines are costly to build and maintain, so there are usually bottlenecks in a power grid. Around Seattle, there are significant bottlenecks for power coming in from the south or east. There are only two large power line corridors across the Cascades; they follow I-90 and US-2, but what really matters is that there are only two.

Some maps make clear the significance to Seattle of transmission losses and bottlenecks:
  • Washington wind power resources. Classes 4 and above are generally considered worth developing commercially; naturally a class 4 or above resource at the peak of Mount St. Helens is less so than one on a plain east of Yakima. Notice how all the commercially worthwhile wind resources are east of the Cascades, while the vast majority of Washington's population is west of them.
  • Dams on the Columbia River and its tributaries. Notice how all are to the south and east of Seattle, and the closest ones are the other side of the Cascades bottlenecks.
  • Location of Washington's only nuclear power plant. Again, it's across one of the Cascades transmission bottlenecks.
  • I can't find an easily linkable solar map, but basically it's a similar pattern - the best resources are in the SE of the state.
Apart from meaning that when the big one hits we are all but guaranteed to lose our electricity supply for a while, this creates a major practical problem for efforts to supply more of Puget Sound's electricity from renewable sources, as the best places to put the generating facilities are separated from all the population by transmission bottlenecks that are already at capacity during peak load.

Marginal capacity

All of the rambling behind the cut is principally to justify one key point: the source of marginal capacity. This is the capacity that will actually be switched on or off in response to a change in demand. Visualise it like this: my computer is drawing around 100 watts; the marginal capacity is the 100 watts that would actually be saved if I turned this computer off. The source of this marginal capacity is not the same as the overall fuel mix, and depends on all the other factors in the grid.

For instance, if the grid is currently running on nuclear-generated base load alone, saving that 100 watts transiently probably won't make the slightest bit of difference, because the base load can't be dialled back in response, so it will just be dumped instead. On the other hand, if that 100 watts is consistently saved, over time the output of the generators will be adjusted, so over the long term it actually would bring about some reduction in the amount of power generated, which in turn would mean less spent nuclear fuel for our children's children's children (×1000) to deal with.

On the other hand, if I turn this computer off at the moment of peak load, the consequences are quite different. If the grid is really overloaded, energy savings can prevent brownouts. If it's operating anywhere between base load and its upper limit, energy savings will come from peaking power, and the higher the peak, the less efficient the source involved is likely to be.
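Putting the two paragraphs together: the source of the marginal watt depends on where current demand sits relative to the base load. A sketch, with capacities invented for illustration:

```python
BASE_LOAD_MW = 3500  # inflexible base load (hydro, nuclear, coal)
PEAKER_MW = 800      # gas turbines available on demand

def marginal_source(demand_mw):
    """What a small (transient) change in demand actually affects."""
    if demand_mw <= BASE_LOAD_MW:
        return "none (surplus dumped; only sustained savings matter)"
    if demand_mw <= BASE_LOAD_MW + PEAKER_MW:
        return "gas peaker"
    return "shortfall (brownout risk; savings prevent it)"

print(marginal_source(3000))  # off-peak
print(marginal_source(3900))  # peak hours
```

Note that the answer has nothing to do with the overall fuel mix percentages; it's entirely about which plant happens to be on the margin at that moment.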

In Seattle, the vast majority of our power comes from hydroelectric generation. But because of all the complications around transmission and peaking, marginal changes are likely to affect the output from natural gas fired power stations, even though they only account for 1.1% of the total. This has some interesting implications:
  • When you consume electricity is significant. Choosing to run a high power draw appliance (e.g. a washing machine) at a low demand time (e.g. in the morning) is likely to reduce greenhouse gas emissions, even though it consumes exactly the same amount of power.
  • Consequently, we'd have a less polluting system if more people were on a variable tariff that actually bills differently according to the time of day, giving them an incentive to put machines on timers.
  • Because peak load is heating-related, the efficiency of an electric home heating system makes more difference to greenhouse gas emissions than the efficiency of home cooling; the opposite is true in Arizona.
  • I have heard suggestions that Seattleites should heat their homes with electric rather than gas systems, to reduce greenhouse gases, but this is poor advice. Although in theory an electric heater in Seattle is using 89.8% hydro power, in practice someone switching to electric heat adds to the marginal load at peak times, which means that the new electricity demand is actually satisfied by gas burning. And it's much more efficient to burn the gas in the place that is being heated than to go through all the steps of conversion and transmission.
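The arithmetic behind that last point is worth spelling out. The efficiency figures below are rough illustrative guesses, not measured data for any particular furnace or plant:

```python
# Option 1: burn the gas at home
furnace_efficiency = 0.90        # modern gas furnace, fuel -> heat in the room

# Option 2: burn the gas at a power station, then heat electrically
plant_efficiency = 0.40          # gas plant, fuel -> electricity
transmission_efficiency = 0.93   # grid losses on the way to the house
heater_efficiency = 1.00         # resistance heating, electricity -> heat

electric_chain = plant_efficiency * transmission_efficiency * heater_efficiency
print(f"gas furnace: {furnace_efficiency:.0%}, electric chain: {electric_chain:.0%}")
```

With numbers anywhere in this neighbourhood, the furnace wins by a wide margin: the conversion and transmission steps throw away more than half the fuel's energy before it ever reaches the heater.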
Of course, that list is Seattle-specific, and in some ways our system is quite peculiar, but a lot of the principles generalise, and the main take-home message is that fuel mix disclosures concentrate on the finger and miss all that heavenly complexity.

Oh, and I haven't even touched on the economics that apply to this problem.

science, electricity, environment
