Saturday, December 15, 2012

9 February 2012 – Where’s the Yorktown?


The most costly of all follies is to believe passionately in the palpably not true.  - H.L. Mencken
 

Colleagues,

The other day I was reading an article from the July 2011 edition of the Journal of Military History - "Some Myths of World War II", by Gerhard Weinberg - when one of the passages leapt out at me.  I'll cite it in its entirety for you in a moment, but I wanted to explain first what it was that had caught my eye. 

As those of you who read a line or two into these missives before pressing the delete key have no doubt noticed, I'm a strong proponent of evidence as the core and foundation of science.  Evidence, I've argued in the past, plays the key role in every step in a rigorous scientific endeavour.  The first step in science - observation - involves noticing some sort of data, recording it, and wondering what caused it.  The second step, hypothesization, requires coming up with a thesis that explains and is consistent with one's initial observations.  The third step, experimentation, is an organized search for more evidence - especially evidence that tends to undermine your hypothesis; and the fourth, synthesis, requires the scientist to adapt his hypothesis so that it explains and is consistent with not only the original observed evidence, but also all of the evidence garnered through research and experimentation.  This last criterion is key to the scientific method: if one's hypothesis is not consistent with observed data, it must be changed; and if it cannot be changed to account for observations, it must be discarded.  Science is like evolution in that regard; only the fittest hypotheses survive the remorseless winnowing process, and progress requires enormous amounts of time, and enormous amounts of death.  The graveyards of science are littered with extinct theories.

But there's a fifth stage to science, one that tends to be overlooked because it's technically not part of the "holy quaternity" I've outlined above.  The fifth stage is communicating one's conclusions - not only to one's fellow seekers-of-fact, but also to those on whose behalf one's efforts are engaged.  This stage requires telling the whole story of the endeavour, from start to finish, including all of the errors and pitfalls.  Publicizing one's misapprehensions and mistakes is a scientific duty, for two reasons: first, because it improves the efficiency of science by helping one's colleagues avoid the errors of observation and analysis that you've already made (in army terms, "There's no substitute for recce"); but more importantly, because total disclosure of data and methods, and frank, transparent discussion of errors of fact and judgement, are what give science its reliability and rigour.  Transparency and disclosure are the keystone of the scientific method, and are the reason that science works - and the reason, I might add, that science has produced steady progress since the Enlightenment, and has come to be trusted as the only reliable means of understanding the natural world.

The thing is, disclosure of data and methods, and frank discussion of errors of fact and judgement, might be strengths in the scientific field, but in other fields of endeavour, they are often interpreted as weaknesses.  In a competitive business environment, for example, disclosure of data and methods makes it easier for competitors to appropriate or undermine your ideas.  In such a cut-throat atmosphere, the individual who conceals his activities, who refuses to discuss with colleagues (whom he likely views as competitors) what he is doing, and how, and why, will enjoy a comparative advantage vis-à-vis those who are more open about their data, methods and conclusions.  As for discussion of errors of fact and judgement, in a non-scientific atmosphere, the admission that one has made errors in experimental design or in data collection, or the confession that one may have misinterpreted observations or the results of an experiment, may be taken as a sign of incompetence or lack of ability, rather than as an indication of the type of honest transparency that is the hallmark of good science.

Acknowledgement of error, of course, is also key to correcting an erring course of action - and this is true in every area of activity in life, not merely in science.  The first step, as they say, is admitting that you have a problem - and this leads me back to the citation from the article that prompted this line of thought in the first place.  In the article I mentioned above, Weinberg - a distinguished scholar who has been writing about the Second World War for the better part of 70 years - discusses a variety of "myths" about some of the well-known leaders of World War II.  When he comes to Admiral Isoroku Yamamoto, one of his more trenchant questions pertains to why the famous naval commander, in preparing for the pre-emptive attack on Pearl Harbour, was "so absolutely insistent on a project with defects that were so readily foreseeable."  Why, for example, would you attack a fleet in a shallow, mud-bottomed harbour, where damaged vessels could be so easily raised and repaired (18 of the 21 ships "severely damaged" at Pearl Harbour eventually returned to service)?  Weinberg suggests that it may have been just as well for Yamamoto's peace of mind that he was dead by the time of the Battle of Surigao Strait in October 1944, "when a substantial portion of the Japanese Navy was demolished by six American battleships of which two had allegedly been sunk and three had been badly damaged" three years earlier at Pearl Harbour.  Weinberg speculates that Yamamoto may have been so "personally invested" in his tactical plan for the Pearl Harbour attack that he was "simply unable and unwilling to analyze the matter objectively":

There is an intriguing aspect of the last paper exercise of the plan that he conducted in September 1941, about a month before the naval staff in Tokyo finally agreed to his demand that his plan replace theirs.  In that exercise, it was determined that among other American ships sunk at their moorings would be the aircraft carriers including the Yorktown.  Not a single officer in the room had the moral courage to say, “But your Excellency, how can we sink the ‘Yorktown’ in Pearl Harbor when we know that it is with the American fleet in the Atlantic?”  None of these officers lacked physical courage: they were all prepared to die for the Emperor, and many of them did.  But none had the backbone to challenge a commander whose mind was so firmly made up that none dared prize it open with a touch of reality.[Note A]

Where do we lay the blame for this error of fact?  On Yamamoto, for coming up with a scenario whose fundamental assumptions were known to be wrong?  On his subordinates, who accepted those assumptions, knowing them to be wrong?  Remember, this was not a matter of a difference of opinion between experts; it wasn't a question of differing interpretations of equivocal results.  Nor was this a notional, peace-time exercise; this was an exercise that took place in the work-up stage of what was intended to be the very definition of a critical strategic operation - a knock-out blow that would curtail the ability of the US to interfere with Japan's plans for the Pacific.  In such circumstances, one would think that accuracy of data would be at a premium, and that fundamental errors of fact would be corrected as soon as they were noticed.  But this didn't happen.  This is a very telling anecdote, and Weinberg distils from it a devastatingly important observation: that men who had no fear of death, and who would (and did) fight to their last breath, nonetheless somehow lacked the moral courage to disagree with their superior on a matter of indisputable fact: that the exercise scenario did not reflect the real world.  Where the USS Yorktown actually was (it was conducting neutrality patrols between Bermuda and Newfoundland throughout the summer and autumn of 1941) was in fact very important - and what is more, Japanese intelligence had that information.  Yet no one stepped up to correct the scenario.  No one challenged the planning assumptions when actual data showed those assumptions to be wrong. 

Yorktown's location should have mattered to Yamamoto.  When the Japanese Navy struck Pearl Harbour on 7 December 1941, USS Yorktown was in port in Norfolk, Virginia.  She left Norfolk on 16 December, and was in San Diego by 30 December.  She was later involved in launching the early 1942 raids against the Marshall and Gilbert Islands.  Her air wing accounted for numerous Japanese warships and landing craft at the Battle of the Coral Sea. 
 
 
She was also at Midway, where her air wing destroyed the Japanese carriers Soryu and Hiryu, the latter while flying from USS Enterprise after Yorktown was disabled. 
 
 
In less than six months, that one ship accounted for 25% of Japan's carrier strength. 
 
 
Where she was in the fall of 1941 mattered, because none of these things would have happened if Yorktown had been badly damaged or sunk at Pearl Harbour.
 
Why did Yamamoto's exercise plan assume that Yorktown would be among the ships tied up in Battleship Row, when he knew it was on the other side of the world?  Why did no one challenge the scenario?  It's easy, as many historians have done (and not without justification), to write off staff subservience to a commander as the inevitable consequence of centuries of deification of the Emperor, the warrior philosophy of bushido, and decades of Imperial Japanese militarism.  But how can we simultaneously accept that such an organizational entity is at once both a dangerously innovative and adaptive enemy, and yet so rigidly hierarchical that the thought processes of the senior leadership are utterly impervious even to objective fact?  The two qualities, it seems to me, are mutually exclusive.  One cannot be both intellectually flexible and intellectually hidebound.

Yamamoto's refusal to incorporate known facts into his exercise plan, and his subordinates' refusal to insist on their incorporation, resulted in flawed planning assumptions.  Over time, the impact of these flawed assumptions came back to bite them in their collective posteriors, in some cases personally.  But they bit the Empire, too.  In her 1984 treatise on folly, Barbara Tuchman spends most of her time castigating governmental elites for long-term, dogged pursuit of policy contrary to the interests of the polity being governed; but she also investigates the sources of folly.  One of these, upon which she elaborates with characteristic rhetorical verve, is the refusal of those in positions of authority to incorporate new information, including changes in objective facts, into their plans:

Wooden-headedness, the source of self-deception, is a factor that plays a remarkably large role in government.  It consists in assessing a situation in terms of preconceived fixed notions while ignoring or rejecting any contrary signs.  It is acting according to wish while not allowing oneself to be deflected by the facts.  It is epitomized in a historian’s statement about Philip II of Spain, the surpassing wooden-head of all sovereigns: ‘No experience of the failure of his policy could shake his belief in its essential excellence’. [Note B]

Or to put it more simply, as economist John Maynard Keynes is reputed to have said when he was criticized for changing his position on monetary policy during the Great Depression, "When the facts change, I change my mind.  What do you do, sir?"

I could go on about this at great length, citing innumerable examples (as I expect we all could), but I thought that just one would do - surprisingly, from the field of climate science.  In its 4th Assessment Report, issued in 2007, the IPCC claimed that Himalayan glaciers were receding faster than glaciers anywhere else in the world, and that they were likely to "disappear altogether by 2035 if not sooner."  In 2010, the IPCC researchers who wrote that part of the report confessed that they had based it on a news story in the New Scientist, published in 1999 and derived from an interview with an Indian researcher - who, upon being contacted, stated that his glacial melt estimates were based on speculation, rather than formal research.  Now, according to new, peer-reviewed research based on the results of satellite gravimetry - i.e., measured data rather than speculation - the actual measured ice loss from glaciers worldwide is "much less" than has previously been estimated. 

As for the Himalayas - well, it turns out they haven't lost any ice mass at all for the past ten years.(Note C)  As one of the researchers put it, the ice loss from high mountain Asia was found to be "not significantly different from zero." 

Amongst other things, this might explain why satellite measurements show that sea levels, far from rising (as the IPCC said they would), have been stable for a decade, and now appear to be declining.  After all, the amount of Aitch-Two-Oh on Earth is finite, so if the glaciers aren't melting, that would explain why the sea levels aren't rising, non?

Remember, according to the IPCC AR4, under the most optimistic scenario for GHG reductions (i.e., massive reductions in GHG emissions, which certainly haven't happened), sea level would increase by 18-38 cm by the year 2099, which equates to an absolute minimum increase of 1.8 mm per year (see the AR4 SPM, pp 7-8, Figure SPM.5 and Table SPM.1).  According to the IPCC's model projections, over the past decade sea levels should have risen by at least 1.8 cm (on the above chart, that's half the Y-axis, which spans 35 mm from top to bottom; in other words, the right-hand end of the curve should be well above the top of the chart).  But according to the most sensitive instruments we've yet devised, sea levels haven't risen at all over the past decade, and for the past three years have been steadily falling.  This isn't a model output; it's not a theory.  Like the non-melting of Himalayan glacier ice, it's observed data.
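
The arithmetic is worth checking for yourself.  Here's a quick sanity check in Python, using only the AR4 figures cited above (the 2000-2099 projection window is my own reading of the table):

```python
# Convert AR4's most optimistic projection (18 cm rise, roughly
# 2000-2099) into an annual rate, then into one decade's worth of rise.
rise_cm = 18.0               # low end of the 18-38 cm projection
years = 2099 - 2000          # assumed projection window

rate_mm_per_year = rise_cm * 10.0 / years
decade_rise_mm = rate_mm_per_year * 10.0

print(f"Minimum projected rate: {rate_mm_per_year:.2f} mm/yr")       # ~1.8 mm/yr
print(f"Expected rise over a decade: {decade_rise_mm:.0f} mm")       # ~18 mm = 1.8 cm
```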

Why point this out?  Well, our current suite of planning assumptions states that, amongst other things, Himalayan glaciers (along with glaciers all over the world) are melting, and that as a consequence, sea levels are rising.  "Rising sea levels and melting glaciers are expected to increase the possibility of land loss and, owing to saline incursions and contamination, reduce access to fresh water resources, which will in turn affect agricultural productivity." (Note D)  Those assumptions were based on secondary source analysis - basically, the IPCC reports, which were themselves largely based on secondary source analysis (and, as shown above, occasionally on sources with no basis in empirical data) - but when verified against primary research and measured data, they are demonstrably wrong.  In essence, stating that defence planning and operations are likely to be impacted by "rising sea levels" or "melting glaciers" is precisely as accurate as Yamamoto planning to sink the USS Yorktown at Pearl Harbour when everyone knew that it was cruising the Atlantic.

A big part of folly consists of ignoring facts that don't conform to one's preconceptions.  Actual planning or policy decisions may be above our pay grade as mere scientists - but what isn't above our pay grade is our moral and professional obligation to point out the facts, and to remind those who pay us that when facts conflict with theories or assumptions, it's the theories or assumptions that have to change. 

So when we see people basing important decisions on inflammatory rhetoric from news reports – for example, a CBC piece hyping an already-hyped “scientific” study arguing that sea level rises have been “underestimated”...


Some scientists at an international symposium in Vancouver warn most estimates for a rise in the sea level are too conservative and several B.C. communities will be vulnerable to flooding unless drastic action is taken.

The gathering of the American Association for the Advancement of Science heard Sunday the sea level could rise by as little as 30 centimetres or as much as one metre in the next century.

But SFU geology professor John Clague, who studies the effect of the rising sea on the B.C. coast, says a rise of about one metre is more likely.

...it behooves us, as scientists, to look at the data.  And what do the data say in this case? Well, they say that over the past century, the sea level at Vancouver hasn’t risen at all.
 

The average sea level at Vancouver for the past century has been 7100 mm.  If, as the above-cited report claims, Vancouver's sea level had risen by 30 cm, then the reading at the right-hand end of the chart should be 7400 mm, close to the top.  But it's not; it's still 7100 mm.  The trend is flat.  Absolutely flat.
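
Checking a tide-gauge trend for yourself is a three-line exercise.  A minimal sketch, assuming PSMSL-style monthly mean data in a two-column file (the file name is a placeholder, not a real dataset):

```python
import numpy as np

# 'vancouver_monthly.csv' is a placeholder; PSMSL-style records give
# decimal year in column 0 and monthly mean sea level (mm) in column 1.
data = np.loadtxt("vancouver_monthly.csv", delimiter=",")
years, level_mm = data[:, 0], data[:, 1]

slope = np.polyfit(years, level_mm, 1)[0]   # linear trend, mm per year
print(f"Trend: {slope:+.2f} mm/yr")
# A slope indistinguishable from zero is a flat curve: the 7100 mm
# century average stays at 7100 mm.
```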

So, as far as one of the supposed "key indicators" of anthropogenic climate change is concerned, nothing is happening. If sea levels are the USS Yorktown, the Yorktown isn't at Pearl Harbour at all. She's nowhere near Hawaii. She's not even in the Pacific; she's on the other side of the world, off cruising the Atlantic. But nobody has the moral courage to say so. All of the planning assumptions in use today are firmly fixated on the idea that the Yorktown is tied up at Pearl.

Well, assume away, folks. But no matter how many dive bombers you send to Battleship Row, you are simply not going to find and sink the Yorktown, because the facts don't give a tinker's damn about your assumptions. She is just...not...there.

Of course, finding out the facts – the observed data – is only the first part of our duty as scientists. The next and even more important part is...what do we do with them?  Unlike Yamamoto's juniors, whose ethical code condemned them to obedience, silence and eventually death, our ethical code requires us to sound the tocsin - to speak up when decisions are being made on a basis of assumptions that are provably flawed. And more than that; we shouldn't just be reporting contrarian evidence when we stumble across it, we should be actively looking for it.  A scientific theory can never be conclusively proved; it can only be disproved, and it only takes a single inexplicable fact to disprove it.  Competent scientists, therefore, hunt for facts to disprove their theories - because identifying and reconciling potential disproofs is the only legitimate way to strengthen an hypothesis.  We have to look for things that tend to undermine the assumptions that are being used as the foundation for so much time, effort, and expenditure, because that's the only way to test whether the assumptions are still valid. If we don't do that - if we simply accept the assumptions that we are handed instead of examining them to see whether they correspond to known data, and challenging them if they don't - then we're not doing the one thing that we're being paid to do that makes us different from the vast swaths of non-scientists that surround us. 

Playing the stoic samurai when important, enduring, expensive and potentially life-threatening decisions are being based on demonstrably flawed assumptions is fine if your career plan involves slamming your modified Zero into the side of a battleship and calling it a day.  But if you'd prefer to see your team come out on top, sometimes you've got to call BS.  That should be easier for scientists than it is for most folks, because it's not just what we're paid to do; it's our professional and ethical obligation to speak out for the results of rigorous analysis based on empirical fact.  
 
So when you think about it, "Where's the Yorktown?" is in many ways the defining question that we've all got to ask ourselves: is she where our assumptions, our doctrine, last year's exercise plan, the conventional wisdom, the pseudo-scientific flavour of the month, or "the boss" says she ought to be? 

Or is she where, based on observed data, we know her to be?
How we answer that question when it's posed to us determines whether we are scientists...or something else.

Cheers,

//Don//

Post Scriptum
Today, the USS Yorktown lies in 3 miles of water off Midway Island.  She was rediscovered by Robert Ballard on 19 May 1998, 56 years after she was lost.


For the record, the average sea level at Midway Atoll has increased by about 5 cm since 1947 (65 years). This amounts to 0.74 mm per year, which is less than half the alleged minimum rate of sea level increase expected due to "climate change", according to the IPCC. And even that tiny increase is due not to "anthropogenic climate change", but rather to isostatic adjustment as the weight of the coral atoll depresses the Earth's crust, and - according to this recent study - to tidal surges associated with storminess that correlates with the increasing Pacific Decadal Oscillation over the past sixty years.

It's all about the evidence, friends. You either have it or you don't. And if you have it, you either try to explain it - like a real scientist - or you ignore it. Ignoring the evidence didn't turn out well at all for Yamamoto, the Imperial Japanese Navy, or in the long run, Imperial Japan.

Notes
Photos courtesy the US Naval History and Heritage Command.

A) Gerhard L. Weinberg, “Some Myths of World War II – The 2011 George C. Marshall Lecture in Military History”, Journal of Military History 75 (July 2011), 716.

B) Barbara Tuchman, The March of Folly: From Troy to Vietnam (New York: Ballantine Books, 1984), 7.

C) http://www.guardian.co.uk/environment/2012/feb/08/glaciers-mountains

D) The Future Security Environment 2008-2030 - Part 1: Current and Emerging Trends, 36.

Friday, December 14, 2012

Apollo 17 - 40 years, and still waiting

Colleagues,

40 years ago today, Gene Cernan became the last human to walk on another world.



It had only been 11 years since Alan Shepard had flown into space for a grand total of fifteen minutes, atop a glorified bottle rocket. Cernan and Harrison Schmitt spent three days on the Moon, while their crewmate Ron Evans kept station in lunar orbit.

To quote Tom Hanks (as Apollo 13 commander Jim Lovell), "It's not a miracle.  We just decided to go."

Man's reach must exceed his grasp, else what's a heaven for?

Monday, December 10, 2012

Bad Gas

Colleagues,

Last year, when the EPA announced its new fuel efficiency standards for automobiles, I penned this little analysis. I thought I'd post it now, seeing as how Canada, courtesy the irrepressible bureaucrats at Environment Canada, has decided to follow Obama down the rabbit hole.

--------------------------------------------



Colleagues,

Just a short note today to signal Wednesday's announcement by the White House of new EPA regulations that will require automobiles, by 2025, to achieve a fuel efficiency standard of 54.5 miles per gallon (Note A).

…yeah, okay then.  For the record, the auto industry, and the internal combustion engine that is its principal component, is over 100 years old.  Automakers have been attempting to improve fuel efficiency for pretty much the whole of that period.  How far, you ask, have they gotten?  Well, according to Ford, the top fuel efficiency of the Model-T - the Flivver, the infamous Tin Lizzie, mass production of which began in September of 1908, a little over 104 years ago - was 25 MPG.  The highway fuel efficiency rating of a 2012 Ford Fusion AWD is…25 MPG.  The latter is naturally a little more comfortable, what with air conditioning, a CD player, and cushioned as opposed to wooden seats, but the fuel efficiency is pretty much the same.

Think I'm joking?  Let's go to the data.  In 1978, a 6-cylinder Jeep CJ got 18.1 MPG on the highway.  In 2011, a Jeep Patriot 2WD, after 33 years of gas crunches, climbing gas prices, and EPA efficiency targets, got 29 MPG on the highway.  Not interested in SUVs?  In 1978, the top fuel efficiency for a production car was the Chevy Chevette's 35.1 MPG on the highway (it beat the famously fulminatory Pinto by half-a-MPG).  The Chevette, for those of you who never got to see Star Wars in a theatre, was a two-door hatchback weighing in at less than a ton.  A comparable vehicle today?  How about a Honda Civic?  It gets 36 MPG.

I expect you see the pattern.  Let's make it graphical.  Here's US Government data on the MPG characteristics of the current suite of 2012 production vehicles on offer.  Along the Y-axis we have highway fuel efficiency in MPG; along the bottom, vehicular class, as follows: 1=2-seaters, 2=minicompact, 3=subcompact, 4=compact, 5=midsize cars, 6=large cars, 7=small station wagons, 8=midsize station wagons, 9=large station wagons, 10=small 2WD pickups, 11=small 4WD pickups, 12=standard 2WD pickups, 13=standard 4WD pickups, 14=cargo vans, 15=passenger vans, 17=2WD special purpose vehicles, 20=2WD minivans, 21=4WD minivans, 22=2WD SUVs, and 23=4WD SUVs.
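
If you want to rebuild the chart, the recipe is straightforward.  A sketch, assuming the EPA data has been exported to a CSV - the file name and column names here are my own placeholders, not the EPA's:

```python
import csv
import matplotlib.pyplot as plt

# 'epa_2012_vehicles.csv' and its columns are hypothetical - a
# hand-rolled export of the US Government fuel economy data.
classes, mpgs = [], []
with open("epa_2012_vehicles.csv", newline="") as f:
    for row in csv.DictReader(f):
        classes.append(int(row["class"]))
        mpgs.append(float(row["highway_mpg"]))

plt.scatter(classes, mpgs, s=12)
plt.xlabel("EPA vehicle class (1 = 2-seater ... 23 = 4WD SUV)")
plt.ylabel("Highway fuel efficiency (MPG)")
plt.title("2012 production vehicles")
plt.show()
```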

2015 is three years away.  Which cars meet the EPA's mandated fuel efficiency targets for that year?  Well, using composite city-highway mileage, there are 8 that do (they don't all show up because of overlap on the graph).  Here they are: the Toyota Prius, the Prius v wagon, the Honda Civic Hybrid, the Lexus CT 200h, the Ford Fusion Hybrid, the Lincoln MKZ Hybrid, the Chevy Volt, and the Scion iQ.

Guess what they all have in common?  That's right.  They're all hybrids.  Notice something else?  They're all small.  Even the one that nominally rates as a wagon - the Prius v - isn't exactly a battleship.  In other words, it's impossible for today's automakers to make a car that meets the EPA's 2015 fuel efficiency standard without making it both small and a hybrid.

What if we look just UNDER the EPA standard?  Well, the Volkswagen Passat gets 35 MPG combined (it gets 43 MPG on the highway, which is better than every hybrid vehicle except the Prius and the Civic hybrid).  Moreover, it's the same size as the Prius (bigger than the Volt or the Civic).  So what's the difference?  It's in the MSRP, amigos.

 

2012 Volkswagen Passat Sedan - base price $19,995

2012 Chevy Volt - base price $41,545

 

With that kind of price differential, who'd buy a Volt?  Well, would it change your mind if you knew that Dalton McGuinty would give you back $10,000 if you did? (Note B)  That still leaves you paying a $12,000 premium for a smaller vehicle.  At a difference in fuel efficiency of only 5.4% (combined; the Passat is actually 7% more fuel-efficient than the smaller Volt in highway driving), it's going to take a long time to make up $12K.
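
How long, exactly?  Here's a rough payback calculation.  The annual distance and pump price are illustrative assumptions of mine, and the comparison deliberately uses only the combined MPG figures cited above - i.e., it ignores the Volt's plug-in range entirely:

```python
# Years to recoup the Volt's price premium from fuel savings alone.
# Assumptions (mine): 20,000 km/yr and $1.30/L gas; MPG as cited above.
KM_PER_YEAR = 20_000
GAS_PRICE = 1.30                  # CAD per litre
MPG_TO_L_PER_100KM = 235.2        # conversion: L/100km = 235.2 / MPG

def annual_fuel_cost(mpg):
    litres = KM_PER_YEAR / 100.0 * (MPG_TO_L_PER_100KM / mpg)
    return litres * GAS_PRICE

premium = 41_545 - 19_995 - 10_000   # Volt minus Passat, after the rebate
saving = annual_fuel_cost(35.0) - annual_fuel_cost(37.0)   # Passat vs Volt

print(f"Annual fuel saving: ${saving:,.0f}")
print(f"Payback period: {premium / saving:,.0f} years")
```

On those assumptions, the fuel saving is on the order of $100 a year, and the payback period runs over a century.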

Now, bearing in mind that the fuel efficiency of the Model-T 100 years ago was 25 MPG, take a look at how many production vehicles currently meet the EPA fuel efficiency target for 2025.  The answer is "none".  The only one that even comes close is the Prius, and it's still 10% shy of the gold standard.  In fact, the only vehicles that presently meet that target are the plug-in Volt (in all-electric mode) and the all-electric Tesla - vehicles that burn little or no fuel at all.

NOW do you get the picture?  The purpose of the EPA's 2025 target is to eliminate the internal combustion engine.

Here's my question: just how likely do you think that is?
 
//Don//
 
Notes

A) [http://www.nationalreview.com/planet-gore/283363/epa-2025-pigs-will-fly-henry-payne]
B) [http://gm-volt.com/2009/07/17/canadian-government-gives-10000-chevy-volt-subsidy-angers-toyota/]

---------------------------------------------

Just to drive the point home (so to speak), the average increase in fuel efficiency for cars over the past century has been zero. The Ford Fusion gets the same MPG as a Model T. But Obama's EPA - and now Environment Canada - expect auto manufacturers to achieve a more than 100% increase in fuel efficiency by 2025. That's thirteen years away. The laws of physics haven't changed. The only possible conclusion - the ONLY conclusion - is that this is an ideologically-driven attempt to regulate the internal combustion engine out of existence.

And why? As an auto-da-fé; an act of faith to propitiate the Gods of Carbon Dioxide.  As a little reminder, there has been no statistically significant warming for 16 years despite a 10% increase in atmospheric carbon dioxide concentrations. There is no justification in science or reason for the anthropogenic global warming thesis - and thus no justification in science or reason for costly, pointless government regulations aimed at outlawing one of the fundamental technologies that drive the western world.

A few more words on the Canadian take on this little endeavour. Here's how the government plans to sell the standards change to Canadians.

"These new regulations improve fuel efficiency so that by 2025 new cars will consume 50% less fuel and emit 50% less GHGs than a similar 2008 model, leading to significant savings at the pump," said Environment Minister Peter Kent. "At today's gas prices, a Canadian driving a model year 2025 vehicle would pay, on average, around $900 less per year compared to driving today's new vehicles."
 
Awesome. So at a savings of $900 per annum, it'll only take the average Canadian family a little over thirteen years to pay off the extra $12,000 that their Volt will cost them. Assuming, of course, that they didn't finance the difference, and assuming that the Ontario government can afford to continue subsidizing every Volt that's sold to the tune of $10,000 a pop.

Ontarians buy 45,000-50,000 new passenger cars per month. That's 540,000 - 600,000 new cars per year. If everyone buys Volts - and once these new regulations are in force, that's all that anyone in Ontario will be allowed to buy - then the Ontario government is going to be on the hook for $10,000 x 550,000 = $5,500,000,000 in hybrid car subsidies every year.

Ontario's budget deficit in 2012 was already $15.2 billion. These subsidies would bump that up by a third.  All to support sales of a car that can go at most 80 km on a 10-hour charge, provided it's not too cold.
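
The arithmetic, for the record (using the low end of the monthly sales figure):

```python
# Ontario's exposure if every new passenger car sold draws the rebate.
cars_per_year = 45_000 * 12              # low end of monthly sales cited above
subsidy_total = cars_per_year * 10_000   # CAD

deficit_2012 = 15.2e9
print(f"{cars_per_year:,} cars/yr -> ${subsidy_total / 1e9:.1f}B in subsidies")
print(f"Increase on the 2012 deficit: {subsidy_total / deficit_2012:.0%}")
```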

Can't politicians do arithmetic?

Thursday, December 6, 2012

Doha, dear (A repeat from December 2011)


Colleagues,

In honour of the UN's annual climate alarmapalooza at Doha last week, I thought I'd reprise a little piece I wrote a year ago, after the 17th Climate Kerfuffle at Durban, South Africa - because since then, the only thing that's changed is that there is more empirical evidence than ever that the Sun controls terrestrial climate, and still no empirical evidence whatsoever that carbon dioxide - let alone the 5% of atmospheric CO2 attributable to human activities - plays any role in twisting the planetary thermostat.

Enjoy.

---------------------------------------
The UN's annual climate conference (the 17th Conference of the Parties, or COP17) at Durban wrapped up this past weekend.  I'll spare you all the gory details; basically, delegates kicked the can down the road, agreeing to hammer out a new binding framework agreement by 2015 with provisions for greenhouse gas reductions that would kick in around 2020.  Negotiators painted this as a victory for diplomacy, climate skeptics as a victory for skepticism, and environmental crusaders as a guarantee of apocalypse.

I'm not going to rant and rail about this; I'm simply going to offer 3000 words on it in the form of three pictures.  Charts, actually.  Together, these explain the outcome at Durban.

The first one explains the problem with climate science:

 
The AGW thesis asserts that temperature responses will scale linearly with forcings, the most important of which, according to the IPCC, is anthropogenic carbon dioxide emissions.  However, anthropogenic carbon dioxide emissions account for less than 5% of atmospheric carbon dioxide - and as this chart shows, according to measured data, there is no correlation over the course of the past decade between atmospheric carbon dioxide concentration and average global temperature.  All of the arguments underlying the Kyoto Protocol, and all conceivable successor agreements, are based on linking anthropogenic CO2 to temperature - and yet temperature does not appear to respond to CO2 at all.  That's the first problem.
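
As it happens, the correlation claim is easy to test for yourself.  A minimal sketch, assuming you have monthly CO2 concentrations (say, Mauna Loa) and a monthly global temperature anomaly series aligned on the same months - the file names are placeholders:

```python
import numpy as np

# Placeholder files: one value per month, same months in each series.
co2 = np.loadtxt("co2_monthly_ppm.txt")
temp = np.loadtxt("temp_anomaly_monthly.txt")

assert co2.shape == temp.shape
r = np.corrcoef(co2, temp)[0, 1]
print(f"Pearson r = {r:+.3f}")
# The AGW thesis implies a strongly positive r over the window; the
# chart above is the claim that, for the past decade, r is near zero.
```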

Here's the second problem:



The data are from British Petroleum.  That's a chart showing the CO2 emissions of the world's top 14 emitters - every country that currently puts out more than 400 million tonnes of CO2 per year.  Only three of them matter: the US (dark blue), whose emissions are declining; China (scarlet), whose emissions are skyrocketing; and India (beige), whose emissions are climbing slowly but accelerating.  Every other major emitter is lower than these three, and is showing stagnation or decline.  Accordingly, any emissions control or reduction agreement that requires reductions from the US or any other western state, but does not require significant (i.e. massively disproportionate) reductions from China and India, is utterly pointless - as is talk of any "carbon debt" owed to the "Third World".

Incidentally, activists at Durban made much of Canada's alleged "delinquency" on the carbon file.  Take a look at the data and explain to me why Canada (a delightful fuchsia) is a bigger problem for the environment than, say, China.  Or India.  Or the US.  Or Japan.  Or Germany.  Or for that matter South Africa, whose emissions are at the same level as ours.  And don't be fooled by arguments about "per capita" emissions; that matters to us, but Gaia doesn't care how many people live in your country; she only cares how much CO2 you pump out.

Speaking of "carbon debts", here's the third and last chart:

This is data from Japan's Ibuki satellite, which measures carbon dioxide flux.  What it shows is the daily rate of change in CO2 flux over various parts of the planet, in the four different seasons.  As we all know, most of the world's human-produced CO2 is produced by northern hemisphere countries, because that's where most of the industry is (the only state in the above list that isn't in the northern hemisphere is South Africa).  And we see from this data that the CO2 flux changes massively from season to season.  In high summer, the northern hemisphere is a massive carbon sink.  In autumn, with the exception of the northeastern US states and parts of Siberia, the northern hemisphere is a net CO2 producer.  In the depths of winter, the northern hemisphere - except for the Middle East, the Subcontinent, and Central Asia - is a net CO2 producer; and in the spring, the northern hemisphere, except for Europe and Northeastern Canada, is a net CO2 producer.  Looks pretty simple, doesn't it?  After all, we burn more fuel to stay warm in the autumn, winter, and spring, don't we?

Well, it's not that simple. Take a look at China.  It's the only country that doesn't change colour.  Throughout the year, China is a net CO2 producer.  The oceanic bands are interesting, too; in the winter, when the northern hemisphere is producing a lot of CO2, the North Atlantic is absorbing CO2 like crazy.  This is because cold water can hold more dissolved gas than warm water.  It's also worth noting that the temperate southern hemisphere oceanic bands seem to be absorbing a lot of CO2.

Why does this matter?  Well, take another look at Canada.  We're supposed to be climate criminals - and yet it's the seasons, not human activity, that are the key determinant of our status as a net carbon emitter or consumer.  It's tempting to blame the change on the use of heating fuel, but that explanation just doesn't hold water; if you look at fuel consumption patterns for the US, fuel oil consumption peaks in the winter, when gasoline consumption troughs, and vice-versa - except that US consumers use twice as many barrels of gasoline as they do of fuel oil at any given time.  In other words, you would expect CO2 emissions to be higher in the summer time, when gasoline consumption is high, and fuel oil consumption is low; but summer is when CO2 flux over North America is at its lowest. 

The answer lies in the forests, and in photosynthesis.  When it's summer, the northern hemisphere - the boreal and temperate forest regions - becomes biologically more active and consumes more CO2 than it produces.  In the winter, photosynthetic activity declines, and CO2 emissions climb.  CO2 concentrations, in other words, seem to respond less to fuel consumption patterns than to seasonal variation leading to change in the metabolic rates of plants.

Bottom line is this: as these satellite observations illustrate, we understand an awful lot less about where CO2 comes from and where it goes than we purport to understand. Maybe the caliphs of carbon at Durban [and Doha! -ed.] ought to consider taxing states that don't grow enough trees per capita.

Yeah - can you imagine anyone from Qatar agreeing to that?

Cheers,

//Don//

Tuesday, December 4, 2012

3 February 2012 – That’s no moon…

Colleagues,

You may recall that shortly before Christmas I sent around a message in which I discussed the design, developmental work and testing that had been done on the Vought SLAM - the nuclear ramjet-powered, H-bomb sowing flying leviathan that was one of many unbelievable but terrifyingly realistic weapons systems dreamed up by atomic eggheads in the 1950s and 1960s.  Not surprisingly, this little trot down memory lane sparked a good many comments, most of them concerning the sheer lunacy of creating something that carried a belly-full of nuclear weapons, irradiated anything it flew over, and had a virtually unlimited range.  In one subsequent conversation, however, the point came up that, with such maniacal inventions cluttering up our collective history, there didn't seem to be much point in unleashing speculation in an attempt to posit the sorts of innovations ("disruptive technologies", if you like) that might pop up in the future.  This is not to suggest that speculation isn't fun, just that there isn't much point in it - particularly when there's no way to predict where technology will go, and especially when we're so woefully ignorant about our own past, and haven't figured out how to deal with things that we ourselves invented half a century ago, but just somehow didn't get around to putting into production.  We don't have to go to the history of the space race for such examples; we only need to look into our own archives.

For example, we're all familiar with HEAT rounds - they've been around since WWII, and are fairly simple in concept.

The British PIAT - Projector, Infantry, Anti-Tank - relied on a HEAT warhead to (occasionally) penetrate enemy armour.  A HEAT warhead consists of an explosive charge with a conical well in the centre, lined with metal (usually copper).  The charge is initiated by a base fuze.  In the case of the PIAT warhead (below), the projectile is fired at a target; the extended probe on the nose fuze ("transit plug") transmits the shock of impact to the fuze at the base of the HE charge.  When the HE charge detonates, the shock wave compresses the copper cone into a jet of molten metal travelling at the speed of the explosion - roughly 7000 m/s in the case of a conventional TNT or Composition B fill.  The liquid metal jet penetrates the armour of the target and does corresponding damage to the interior of the vehicle, and its crew.  HEAT rounds are very effective, which is probably why they continue to constitute part of the basic load (along with kinetic penetration munitions, like APFSDS) of main battle tanks and armoured fighting vehicles even today.  They also continue to make fantastic infantry AT weapons; virtually all current light, medium and heavy AT missiles and rockets, from the venerable RPG-7 to the modern TOW2 missile use HEAT warheads.
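
As an aside for the physically inclined: the classic first-order model of shaped-charge penetration is the hydrodynamic (Bernoulli) approximation, in which penetration depth scales as jet length times the square root of the jet-to-target density ratio.  A toy calculation - the jet length here is an assumption of mine, not a figure from the trials literature:

```python
from math import sqrt

# Hydrodynamic (Bernoulli) approximation for jet penetration:
#   P ~ L * sqrt(rho_jet / rho_target)
rho_copper = 8960.0      # kg/m^3, liner/jet material
rho_steel = 7850.0       # kg/m^3, armour plate
jet_length = 0.20        # m, assumed effective jet length

penetration = jet_length * sqrt(rho_copper / rho_steel)
print(f"Estimated penetration: {penetration * 100:.0f} cm of steel")
# Real penetration also depends on standoff, jet stretching and
# break-up, liner quality, and target hardness.
```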


Defending against HEAT rounds requires different strategies.  First, you can keep the jet away from your armour plate.  That means hanging something on your vehicle to make the incoming round detonate further away.  Second, you can keep the jet from forming; one way of doing so being explosive reactive armour panels, which detonate when the HEAT round strikes them, destroying the round as the jet is forming.  Third, you can thicken up your armour (bearing in mind the requirement that the vehicle still has to be able to move and carry stuff).  And fourth, you can try to disrupt the jet and prevent it from penetrating all the way through to the interior of the vehicle (which led to layered armour, with various materials sandwiched between plates to disperse the jet horizontally).
Research in the 1980s and later took the HEAT concept somewhat further, into explosively formed projectiles (EFPs, also known as self-forging fragment projectiles).  During my first visit to Suffield as a staff officer back in the early '90s, I was shown test fragments and videos from trials on a new type of experimental munition: a scaled-up version of an EFP.  By thickening the conical well liner in the explosive charge, or by changing to a different, tougher metal than copper (e.g., iron), the charge, when detonated, would - instead of forming a liquid metal jet - compress the metal cone into a slug moving at very high speed.  The slug would not be affected by stand-off detonation mechanisms or explosive-reactive armour panels, and layering armour to disperse a metal jet horizontally wouldn't be much help. 
Moreover, you could make the slug big.  Really, really big.
 

THIS big.  That's from a test at Suffield back in the '90s.  I recall handling something like this during a visit.  It was more than a foot long and weighed about 30 pounds.  Imagine that thing coming at you at several thousand metres per second.  And the creation of them, by the way, is an exercise in perfect machining backed up by mathematics.  Here's an image from a DRES paper from 1995 (by one of our own colleagues - see note A) on modelling EFPs:

Note the similarities - and the caption which states that the mathematical models were confirmed by experimentation.  That hunk of metal started out looking like a wok about an inch thick, and after being whapped with a couple dozen kilos of HE, ended up looking like the lawn dart from Hell.  Math is awesome.
There's no point in going into too much more detail on EFPs, because that isn't what I really wanted to talk about in this message anyway.  I simply wanted to emphasize the fact that this technology is now old - so old that the Iraqi insurgents, al Qaeda, and other jihadist adversaries have adapted self-forging fragment technology to off-road mines and IEDs, and we're still having a heck of a time dealing with it.  What I'm getting at is that we don't need to invent science-fictiony "future" threats like "tunable weapons" and "gray goo nanobots" and "hyper-empowered individuals" if we're already facing things invented decades ago, but that have got us completely boggled.
Which takes me to today's topic - the Death Star.  Or at least the Soviet equivalent, Polyus. 
 

A few years back I penned a tech note looking at the arms control implications of space testing missions, specifically the October 2009 LCROSS experiment in which NASA slammed a rocket body into the Moon as part of its search for water on the lunar surface. The paper attracted its fair share of mocking laughter due to the title, which I wrote in jest ("Bombing the Moon"), but anyone who'd taken a moment to read the thing - it wasn't long - would have realized that I was trying to point out the implications of arms control treaties, agreements and regimes for otherwise legitimate space exploration and testing exercises, and vice versa. More knowledgeable individuals with a higher security clearance who read that note would have recognized that I was trying to discuss in synecdoche a much more profound incident with significant legislative implications.

Of course, these days most folks don't seem to go in for specialized knowledge, and those who do often seem to lack the security clearance (or the simple interest) to delve deeper into important, paradigm-altering problems that actually impact us on a daily basis. People styling themselves "scientists" seem to prefer to fiddle with models rather than data and evidence, blathering on in bland, meaningless generalities devoid of any linkage to the real world rather than grappling with current problems. I guess that's easier and safer. Whether it's anything more than a complete and utter waste of time and taxpayer money, on the other hand...that's for other folks to decide.

But I digress. In the course of that tech note, I discussed the arms control prohibitions against space-based weapons, briefly mentioning the 1987 launch of an 80-tonne orbital object by the USSR.  This vehicle, it has been suggested, was to have been the forerunner of a series of orbital battle platforms intended to neutralize the US Strategic Defence Initiative systems (which of course were never deployed). 

Polyus failed to achieve orbit and ended up in the Pacific Ocean, and the Soviets never tried again - but the point is, they tried once.  Polyus wasn't some postulated "disruptive technology" or theorized "future threat". It was very real. 

And yes, I know the above picture has "MIR" on the side of the big black thing; according to the official article on Polyus from the Buran website, MIR space station modules were used in its construction.  Here's a pic of the vehicle on the launch pad at Baikonur in 1987; the "Polyus" name is clearly visible on the side (you can sort of see it in the colour pic above, too):


For the sake of reference, the Polyus vehicle in the image above is 40 m long, about 4 m in diameter, and weighed 80 tons.  The Space Shuttle Orbiter is 37 m long, and weighs about 70 tons empty (its gross liftoff weight is about 109 tons).  So this was no mere firecracker.

How real was all this?  Well, real enough that Mikhail Gorbachev showed up at Baikonur on 11 May 1987 to see the thing shortly before it was launched.


According to one news report, one of the purposes of Gorby's visit was to confirm that Polyus was not carrying any weapons.  Did it, or didn't it?  The actual story is a little hard to get a grip on; there are numerous pictures available from different archives, and various articles published by project personnel over the years tell different stories.  Schematics abound on the innerwebz:


 If your Russian is as good as mine, you won't have gotten any of that.  Here's an alleged translation, according to an article penned by one Ed Grondine:

There isn't much in the way of empirical support for Grondine's assertions.  Most of what is available in the public domain about Polyus comes from official websites (for example, Buran), which don't mention self-defence armaments, much less "nuclear space mines".  A lot of the funkier stuff comes from a 2005 article by Konstantin Lantratov, a former press officer in the Russian space industry, entitled "Star Wars That Didn't Happen".  According to the Buran website, for example, the Polyus vehicle carried ten separate scientific experiments - the first of which was testing the USSR's ability to orbit super-heavy packages...like Polyus.  Interestingly, the site contains dozens of photos of the spacecraft in the assembly stages; it appears to have been cobbled together out of spare parts:

The service block looked like a "Salyut" slightly modified for this task and was made up from parts of the ships "Cosmos-929, -1267, -1443, -1668" and from modules of MIR-2 station. In this block took place the management systems and on-board displacement, the telemetric control, the radiocommunication, the heating system, the antennas and finally the scientific installations. All the apparatuses wich not supporting the vacuum were installed in the hermetic section. The part of the engines made up of 4 propulsion engines, 20 auxiliary engines for stabilization and the orientation, 16 precision engines, as well as tanks and pneumo-hydraulics conduits. Lastly, the production of electricity was made by solar panels which were spread when Polyus was into working orbit. (Note B)

The size, design and components of the vehicle - not to mention the secrecy with which it was fabricated and launched (which was not at all uncommon during the Cold War, remember) - would naturally spark all manner of conspiracy theories.  The vehicle, according to Buran, contained large quantities (420 kg) of xenon and krypton in 42 cylinders of 32 L capacity, with an injector to squirt the gas into the upper atmosphere to "generate ionized signals with long waves".  Grondine argues that the purpose of this was to produce light by fluorescence, in order to signify that a container (possibly holding a nuclear space mine?) had been launched without generating radio energy, which could be tracked.  According to other sources (e.g., the always infallible Wikipedia), the gases were intended to be used to test, with the appearance of innocence, the venting apparatus for a zero-torque exhaust system for a 1 megawatt carbon dioxide laser intended to damage Strategic Defence Initiative satellites.
Grondine also commented, as many others did, on the "optically black shroud" covering the whole thing.  Painting a space object black is one way to make it more difficult to see via reflected light - although the point of doing so when you've got huge solar panels sticking out of the sides of the thing escapes me.  Also, painting it black would tend to make it hot, as it would absorb rather than reflect solar radiation; and unless the thing incorporated stealth technology, it would still be easily visible by radar, which is how SpaceCom tracks large orbital objects anyway.
The fate of Polyus was in any event not a happy one.  It was launched on 15 May 1987, two days after Gorby's visit to Baikonur ended.  The Buran website has a comical description of why the GenSec missed the launch:

The first launch of Energia and Polyus was so important for the direction of the party that the General Secretary of the Central Committee of the Communist Party itself, Mikhaïl Sergeevich Gorbatchev, went. However, it is well-known that any apparatus, so simple is it, have a strong probability of breaking down during a demonstration or in the presence of VIPs, this is why the Management committee had decided (on May 8) to delay the departure on May 15, under pretext of technical problems, knowing that M.S. Gorbatchev could not remain because it had a voyage to the head office of UNO at New York.(Note B)

Their precautions turned out to be well-founded.  Because the Energiya had been designed with hang-points for the Buran space shuttle system, Polyus had to use the same connection mechanisms.  This led to it being mounted backwards, i.e. with the main thruster engines facing forward, resulting in a complicated mission profile.  In order to achieve orbit, after about 8 minutes into the flight program, at an altitude of about 110 km, the Polyus would jettison its engine shroud, separate from the Energiya booster, and execute a 180 degree turn using its thrusters.  Once this was complete, about 15 minutes into the flight and at an altitude of about 155 km, it would fire its main engines periodically to level the craft, and eventually achieve a stable orbit at 280 km altitude, by about 30 minutes after launch.
That's not what happened.  Only one of the positioning thrusters functioned, and the Polyus, instead of making a 180 degree rotation, made a full 360, leaving the main engines pointing forward.  Instead of accelerating the craft into orbit, the engines decelerated it, and Polyus fell back into the Pacific Ocean, reportedly landing in water that was several kilometres deep.  According to open sources, the spacecraft was never retrieved.
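
The physics of the failure is straightforward: at separation the craft was still suborbital, so the insertion burn had to add speed, and fired the wrong way round, it subtracted speed instead.  A toy illustration - the separation speed and burn delta-v below are assumed for the sake of the example, not taken from the flight records:

```python
from math import sqrt

MU = 3.986e14        # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6    # mean Earth radius, m

def v_circular(alt_m):
    """Circular orbital speed at a given altitude."""
    return sqrt(MU / (R_EARTH + alt_m))

v_needed = v_circular(280e3)   # ~7.74 km/s for a 280 km orbit
v_at_sep = 7.55e3              # assumed suborbital speed at separation
burn_dv = 300.0                # assumed insertion burn, m/s

print(f"Needed for a 280 km orbit: {v_needed / 1e3:.2f} km/s")
print(f"Engines aft (correct):    {(v_at_sep + burn_dv) / 1e3:.2f} km/s -> orbit")
print(f"Engines forward (actual): {(v_at_sep - burn_dv) / 1e3:.2f} km/s -> Pacific")
```
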
So, Polyus was real.  The Soviets really built it, and they really launched it.  Did they arm it? Was it supposed to be the first real space battle station?  Would it have worked?  A 1-megawatt laser isn't much in atmosphere, where blooming and attenuation quickly destroy beam coherence; but in space, it might be fairly effective over a reasonably long range.  Could it also have carried "nuclear space mines", presumably for use against US orbital assets?  I think a more important question is, could it have carried nuclear warheads as part of a fractional orbital bombardment system, or FOBS?  That was one of the big worries of the 1960s, and it was one of the key reasons that the US and USSR negotiated the 1967 Outer Space Treaty, which prohibited placing "nuclear weapons or other weapons of mass destruction" either in orbit or on celestial bodies.(Note C - and here we are back at the "Bombing the Moon" technical note again. Funny how this arms control nonsense keeps coming back to haunt us. Almost like it was relevant or something.) 
Would the Soviets have broken the OST? Well, when you can't figure out why somebody's doing what they're doing, or whether they're likely to be doing something they shouldn't, you've got two choices: pull a guess out of your nether regions (the preferred option for "analysts" who don't know anything about anything and think that history is "stuff that's in books"); or use actual evidence. In such cases, the only evidence we have to go on is historical precedent - i.e., what have the suspects done in the past, and why.  Would the USSR have abrogated the 1967 OST by placing nuclear weapons in orbit?  Well, they signed the 1972 Biological and Toxin Weapons Convention, which prohibited producing biological weapons...and then went on to build the biggest biological weapons complex in the world, churning out weaponized anthrax, smallpox, and a host of other pathogens literally by the metric tonne.  By the late 1970s, the USSR was consuming 400,000 fresh eggs per week simply to incubate the weaponized India-1 strain of Variola Major, and had developed refrigerated, heat-dissipating ICBM warheads specifically designed to keep viral and bacterial agents alive during re-entry.
So you could say that, when it comes to the former USSR and its adherence to non-proliferation, arms control and disarmament conventions, there are some legitimate trust issues.
 
Soviet-era fermenters in Building 221 at Stepnogorsk, Kazakh SSR.  Fool me once, shame on you.  Fool me twice...

I guess the final take-away from this is that when it comes to trying to figure out what a potential enemy might be able to do in the near future, one of the best guides is knowing what they've done to you in the near past.  If nothing else, the existence of things like the Vought SLAM and the Polyus Space Battle Station should give us a smidgeon of perspective on some of the prerequisites and challenges involved in creating massive and potentially threatening items of military hardware.  In other words, if we want to figure out whether somebody might put an orbital battle station in Low Earth Orbit and use it to dazzle or destroy our satellites or FOB a nuke onto one of our cities, the first thing we should do is make a list of folks who (a) can build space stations, (b) have a heavy-lift rocket capability, and (c) don't like us.  The intersection in that Venn diagram is where we ought to start looking. 
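
In code, that shortlist is literally a set intersection.  A toy sketch (the membership of these sets is purely illustrative, not an assessment):

```python
# Threat shortlist as a set intersection: capability AND lift AND intent.
can_build_stations = {"State A", "State B", "State C"}
heavy_lift_capable = {"State A", "State C", "State D"}
dont_like_us = {"State C", "State E"}

shortlist = can_build_stations & heavy_lift_capable & dont_like_us
print(shortlist if shortlist else "Empty set - nothing to look for.")
```
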
And if the intersection is empty, maybe we shouldn't waste our time making up non-existent things to fill it.
Anyway, if anyone wants to read Lantratov's article and feels like slogging through 28 pages of "Google-translated" grammar, just let me know.  He gives all the details about cannons, targets, gas generators, and mentions that the black finish on the vehicle was to help maintain working temperature by absorbing solar energy.  It's a cornucopia of awesome, and by the time you're finished reading it you'll be muttering "Commence primary ignition!" under your breath.
Cheers - and may the farce be with you!

//Don//

Notes
A) The paper is available from the DRDC online archive.
B) http://www.buran-energia.com/polious/polious-desc.php
C) http://www.unoosa.org/oosa/SpaceLaw/outerspt.html.  The OST also prohibits laying claim to celestial terrain.