Two interesting articles on the “alternative energy” front yesterday, the first from our cousins south of the 49th. Back in the winter months, I briefly discussed the energy situation in the Lone Star State (“Braking Wind in Texas”, 3 February 2011), noting, among other things, that ice storms had caused damage to transmission lines and gas-fired backup power plants, leading to rolling blackouts during periods of high heating demand. Part of the problem is that about 8% of the total generating capacity of the state’s various electricity producers is provided by wind turbines, and as Britons found out this past winter, when their wind farms consumed more power than they produced in order to keep turbine gearboxes from freezing up, the wind often doesn’t blow when it’s really cold.
Well, as I remarked a few weeks back during the early July heat wave in Ontario, the wind tends not to blow when it’s really hot, either, which of course is when electrical demand climbs due to air conditioning requirements. This poses something of a meta-question for the whole wind power industry: if the systems are least effective when power demands are highest, are they really the sort of generation systems we should be subsidizing to the tune of billions of taxpayer dollars? Anyway, not surprisingly, the same thing is happening in Texas right now as happened back in February, although at the other end of the thermometer. Back then, persistently cold weather increased demand and reduced supply; now, persistently hot weather is increasing demand and reducing supply.
Figures provided by ERCOT (the Electric Reliability Council of Texas, Inc., an Orwellian title given that they are currently spending more time explaining why electricity is not reliable than providing reliable electricity), the operator of the grid covering most of the state (roughly the counterpart of the IESO here in Ontario), illustrate the nature of the problem.[Note A] On Wednesday this week, the demand on ERCOT’s generating capacity reached 68,294 MW. That’s just 50 MW shy of the system’s total generating capacity. At roughly 2 MW apiece, 50 MW is 25 wind turbines - so much of Texas was only two dozen bird-blenders away from rolling blackouts.

And there lies the heart of the problem. The installed capacity of ERCOT’s wind turbines is 9,000 MW, but on Wednesday they were generating only 2,000 MW - because the wind tends not to blow when it’s really hot. That’s a capacity factor of 22.2%. In other words, for every 1 MW of wind generation you plan to get, you have to install 4.5 MW of wind generation capacity. Compare that to nuclear, gas and coal-fired plants, where for every 9 MW you need, you have to install 10 MW of capacity. This is not calculus, people, it’s basic arithmetic.

And here’s some more: the standard production price in Texas runs about $110 per MWh. At peak periods, producers can earn $300-$400 per MWh. But when the supply-demand margin goes razor-thin, as it did on Wednesday, prices go completely nuts; generators chugging out the last few MW to meet skyrocketing demand can earn up to $3,000 per MWh, the state cap on production prices. Think about that - it’s nearly thirty times the going rate, and those increases get passed on to consumers through higher rates. Imagine what gasoline would be like if, instead of fluctuating by 5-10% in response to per-bbl price changes, the pump price changed by a factor of thirty, and on the Friday before a long weekend you had to pay $36/litre to tank up. Clearly there is some financial benefit to generators in keeping capacity right at the breaking point, which wouldn’t happen if supply were designed to meet peak demand. Of course, running right at that razor edge is pretty hard when you have no idea what proportion of your 9,000 MW worth of wind turbines is going to be spinning, and how many are just going to be sitting out there like modern art sculptures. When the difference between state-wide power and rolling blackouts comes down to 25 turbines out of the 1,000 or so (of 4,500 installed) that are actually running, you’re really gambling on the breeze.
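Since it really is basic arithmetic, here’s the whole thing as a few lines of Python, using only the figures quoted above (a sketch, not live ERCOT data; note that $3,000 against $110 works out to roughly 27 times the going rate, hence “nearly thirty”):

    # Back-of-envelope check of the ERCOT figures quoted above.
    installed_wind_mw = 9000   # nameplate wind capacity on the ERCOT grid
    actual_wind_mw = 2000      # what the turbines delivered on Wednesday

    capacity_factor = actual_wind_mw / installed_wind_mw
    print(f"Capacity factor: {capacity_factor:.1%}")                    # 22.2%
    print(f"MW installed per MW delivered: {1 / capacity_factor:.1f}")  # 4.5

    # The price spread between the normal rate and the state cap.
    normal_price = 110    # typical production price, $/MWh
    capped_price = 3000   # state cap, reached when the margin goes razor-thin
    print(f"Peak premium: {capped_price / normal_price:.0f}x normal")   # 27x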
One of the ERCOT representatives interviewed argued that capacity shouldn’t be designed with “extreme” situations in mind. Kent Saathoff, ERCOT’s VP of system planning and operations, said, “You have to determine if it is worth spending millions or billions to avoid a one in 10-year event.” [Note A] That’s generally a fairly sound businessman’s argument, and I’ve made it many times myself in design discussions. But when you’re talking about state-wide power generation, there are a few problems with it.

First, when 8% of your generators cannot be relied on to produce at least 90% of their nameplate capacity, your margin of error is a lot bigger, and your available supply therefore a lot less predictable. If Texas had suffered the same sort of wide-area heat dome effect that eastern Ontario suffered back in July, the 2,000 MW of generation they were getting on Wednesday (out of the theoretical 9,000 MW of installed wind capacity) could easily have dropped by 90%. It’s a lot harder to make up 1,800 MW of lost generation with gas-fired emergency turbines - in fact, at that scale it’s pretty much impossible. That’s the better part of two nuclear reactors’ worth of electricity - reactors that Texas hasn’t got to spare.

Second, if you’re structuring your capacity around a “one in ten-year event” with heavy reliance on intermittent sources like solar and wind power, you’ve either got to be prepared with reliable (which means fossil-fuelled) backup systems, or you’ve simply got to accept that there will be blackouts when demand goes too high. This is what the UK grid chief, Steve Holliday, meant when he said last March that one of the consequences of the UK government’s decision to “go green” was that “the era of constant electricity at home is ending” and that “families will have to get used to only using power when it is available.” [Note B]
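To put numbers on that first problem, here’s the wide-area-lull scenario sketched in Python. The generation figures are the article’s; the 50 MW peaker-unit size is an assumed round number, for illustration only:

    # The "wide-area lull" scenario: Wednesday's wind output drops 90%.
    wind_output_mw = 2000                    # what wind was actually delivering
    lost_mw = wind_output_mw * 0.90          # 1,800 MW vanishes
    remaining_mw = wind_output_mw - lost_mw  # 200 MW left

    # Assume fast-start gas peakers of about 50 MW each (a round figure
    # for illustration) and see how many must spin up at once.
    peaker_unit_mw = 50
    print(f"Lost generation: {lost_mw:.0f} MW")
    print(f"Peakers needed at {peaker_unit_mw} MW each: {lost_mw / peaker_unit_mw:.0f}")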
Remember when electrification was one of the triumphs of the modern world? Remember when we used to mock the Soviets for being unable to provide continuous electricity to their citizens? Remember last decade, when certain politicians in the US decried every brownout in Baghdad as evidence of the failure of post-war reconstruction efforts? When exactly did not having electricity become virtuous?
The third problem, of course, is the breathtaking inconsistency of these arguments. The whole reason for “going green” is to lower the carbon (dioxide, everybody forgets the dioxide) emissions that are allegedly killing the planet. Global warming, we are continuously being told, is now inevitable, and extreme heat waves are going to become more common. In other words, according to the most ardent proponents of “green energy”, heat waves like the one in Texas this week, and the one in Ontario last week, are not going to be “one in 10-year events”; they are going to become routine. If these folks actually believed their rhetoric, they would logically be structuring electrical generating capacity to deal with heat waves as occurrences that are inevitably going to become more frequent. And yet they’re not. They’re acting as though either (a) they expect major heat waves to remain a rare occurrence; or (b) they expect heat waves to become more common, but think that the end of the “era of constant electricity at home” is a good thing. They’re either hypocrites or Luddites.
So much for the late, great state of Texas. The second interesting article was also about alternative energy, but has nothing to do with wind turbines, blackouts or the Alamo. Anybody remember who Pons and Fleischmann were? They were the chaps who, back in 1989, announced that they had discovered “cold fusion”. Basically, they reported anomalous heat production and nuclear reaction by-products in an electrolytic cell in which heavy water (water whose hydrogen is largely replaced by deuterium) was being electrolysed at a palladium electrode. Their experiment could not be replicated, and “cold fusion”, like Piltdown Man, became a running joke for any bogus scientific claim. Well, fast-forward twenty-odd years. For the past few months there have been reports coming out (not via the MSM, which doesn’t seem to have noticed the situation) from a private research lab in Italy, where a scientist/entrepreneur named Andrea Rossi has been working on a similar concept using nickel substrates instead of palladium. An experiment back in January consumed 400 watts of power and produced 12,000 watts of heat, a result that would have been virtually impossible absent some nuclear-level effect. While Rossi hasn’t published his process (presumably because he hopes to make money from patenting it) or allowed observers to deconstruct his apparatus, objective monitoring and evaluation of his setup by physicists suggests that there is no other explanation for the amount of energy his “energy catalyzer” produces.
Yes, it sounds like something between black magic, alchemy and phlogiston. Yes, it has yet to be independently verified. However, Rossi is currently working on a 1 MW commercial version of his reactor, based on ganging up modules of the sort used in the January demo, which is expected to produce output water at 500 degrees C (see Note C; I won’t try to summarize the article here, as it’s already an excellent summary of the project to date). Such a device could be used for a wide variety of commercial (and military) applications. For large-scale power plants, for example, it could be used to pre-heat the input water to a final-stage boiler, providing - at almost no cost - the first 500 of the 600 degrees generally needed by steam turbines, and thereby cutting fuel requirements for electrical generation by more than three-quarters (500 of 600 degrees is five-sixths of the temperature rise).
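For the curious, here’s a crude back-of-envelope check of that “first 500 degrees” figure in Python. It uses textbook atmospheric-pressure constants rather than real high-pressure boiler steam tables, so treat it as a sanity check rather than a design calculation:

    # Approximate heat (kJ/kg) to turn 25 C feedwater into steam at t_c,
    # using rough 1-atm constants - not a real boiler-cycle model.
    CP_WATER = 4.18     # kJ/(kg*K), liquid water (approx.)
    LATENT_HEAT = 2257  # kJ/kg, vaporization at 100 C (approx.)
    CP_STEAM = 2.0      # kJ/(kg*K), superheated steam (approx.)

    def heat_to_reach(t_c, t_feed=25.0):
        q = CP_WATER * (100 - t_feed)  # warm the liquid to boiling
        q += LATENT_HEAT               # boil it
        q += CP_STEAM * (t_c - 100)    # superheat the steam
        return q

    fraction = heat_to_reach(500) / heat_to_reach(600)
    print(f"Share of heat input covered by the first 500 C: {fraction:.0%}")  # 94%

On this crude basis the first 500 degrees covers something like 94% of the heat input - boiling the water does most of the work - so “more than three-quarters” is, if anything, conservative.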
Is it true? Maybe. From a scientific perspective, we have at present a small amount of confirmatory evidence, and no falsifying evidence. If Rossi produces and sells a 1 MW variant, it’ll be testable, and the evidence, pro or con, will pile up rapidly. In the meantime, though, what I find very interesting about this is that it came pretty much out of left field. In his famous 2003 speech entitled “Aliens Cause Global Warming”, Michael Crichton opined on how it was impossible to predict the future. Looking back at the year 1900, he asked how it would have been possible for citizens of that era to predict what the world would be like in 2000. His survey of the technological and sociological changes alone is worth a read:
Let’s think back to people in 1900 in, say, New York. If they worried about people in 2000, what would they worry about? Probably: Where would people get enough horses? And what would they do about all the horseshit? Horse pollution was bad in 1900, think how much worse it would be a century later, with so many more people riding horses?
But of course, within a few years, nobody rode horses except for sport. And in 2000, France was getting 80% of its power from an energy source that was unknown in 1900. Germany, Switzerland, Belgium and Japan were getting more than 30% from this source, unknown in 1900. Remember, people in 1900 didn’t know what an atom was. They didn’t know its structure. They also didn’t know what a radio was, or an airport, or a movie, or a television, or a computer, or a cell phone, or a jet, an antibiotic, a rocket, a satellite, an MRI, ICU, IUD, IBM, IRA, ERA, EEG, EPA, IRS, DOD, PCP, HTML, internet, interferon, instant replay, remote sensing, remote control, speed dialing, gene therapy, gene splicing, genes, spot welding, heat-seeking, bipolar, prozac, leotards, lap dancing, email, tape recorder, CDs, airbags, plastic explosive, plastic, robots, cars, liposuction, transduction, superconduction, dish antennas, step aerobics, smoothies, twelve-step, ultrasound, nylon, rayon, teflon, fiber optics, carpal tunnel, laser surgery, laparoscopy, corneal transplant, kidney transplant, AIDS... None of this would have meant anything to a person in the year 1900. They wouldn’t know what you are talking about.
Now. You tell me you can predict the world of 2100. Tell me it’s even worth thinking about. Our models just carry the present into the future. They’re bound to be wrong. Everybody who gives a moment’s thought knows it. (Note D)
I particularly like the second sentence of the second paragraph - “in 2000, France was getting 80% of its power from an energy source that was unknown in 1900.” In 1900, we didn’t know what an atom was, and today nuclear power is prevalent worldwide. Is Rossi’s “energy catalyzer” one of the things that nobody saw coming even 10 years ago, and that 20 years from now could be an important source of worldwide energy? Maybe, maybe not. Maybe it’s a load of bushwah. Maybe we’ll be getting the bulk of our power from something entirely different - perhaps a scientific development for which the basic science still has to be done. But if Rossi’s “E-Cat” is real, it will render discussions of “peak oil” and the nuclear willies of the hand-wringing post-Fukushima panickers entirely moot. If it’s real, wind turbines will be quickly abandoned as the costly and ineffective white elephants they are (or used for pumping water on farms, as they have been for centuries). And if, like oil, low-energy ‘fusion’ does end up changing the world, then we - like the 19th-century prophets of the horse-manure apocalypse - won’t have seen it coming, because, as Crichton warns, you can’t predict where technology will go; all our models do is project current trends into the future.
Here’s hoping that Rossi’s got something, that it will be commercializable in the near term, and that it will contribute to electrical generation. Because the 1,323-page “Cross-State Air Pollution Rule” that the Environmental Protection Agency finalized in July, and that is due to take effect on 1 January 2012, will force Texas to close 18 power plants that provide about 11,000 MW of coal-fired generation capacity. Imagine what the past week would have been like for Texans without that electrical power.
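The subtraction is straightforward, using the article’s own figures (the total-capacity number assumes Wednesday’s “50 MW shy” margin):

    # Wednesday's peak, minus the coal capacity the new rule would idle.
    peak_demand_mw = 68294
    total_capacity_mw = peak_demand_mw + 50   # "just 50 MW shy" of capacity
    coal_retired_mw = 11000                   # capacity affected by the rule

    shortfall_mw = peak_demand_mw - (total_capacity_mw - coal_retired_mw)
    print(f"Shortfall at Wednesday's peak: {shortfall_mw:,} MW")  # 10,950 MW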
Looks like the “end of the era of constant electricity at home” isn’t going to just be a British phenomenon.