As noted in a few other spots this week, your humble correspondent was in Washington to cover the most recent (Third) International Conference on Climate Change, which was organized by the Heartland Institute.
For those of you who have asked, “The last one was in March – are these quarterly now?”…. the answer is, no. Given the recent effort to steamroller “cap-and-trade” legislation through Congress at flank speed – on the grounds that “climate change” is accelerating catastrophically (sic) – a decision was made to quickly have an event in Washington to get some other views onto the table, literally within sight of the Capitol dome.
In contrast to its two-day predecessors, this was a one-day event that ran on a single track. The content was also more strongly oriented toward the political aspects of the issue – which was the intent, given the venue and recent happenings.
My notes are rather extensive, so I thought I’d adopt the strategy that Jay Nordlinger uses when he covers events like Davos for NRO – a serialization in reasonably-sized parts until the supply of material runs out.
So that’s what I’ll do – and Part I begins below the fold.
At the opening breakfast on Tuesday morning, I walked into the main room and sat down at one of the tables appropriately labeled “Reserved for Media” (this is my equivalent of traveling incognito). There were some other media types there, and – to make a longer story short – there was some kvetching that seemed to be based on the notion that the media types would all take a different view than most of the attendees.
One gripe was that “everyone here is in complete agreement – there is no diversity of views.” This in and of itself was very amusing – since in New York back in March the usual media suspects were complaining that the absence of complete agreement and a “single message” meant that the attendees should not be taken seriously. It was left to me to rudely point out that the “complete agreement” canard was blatantly false for two reasons. First, there is a very fundamental (and politely held) difference of viewpoint about whether we need to find a “forcing function” for temperature changes, or whether natural variability is sufficient to explain things (I lean toward the latter view myself). This kind of complaint is always strange, since in “science” people meet precisely to discuss different viewpoints on the same data and to identify methods of possibly resolving the differences between the competing explanations. And second, in sharp contrast to the big AGW-scare-promotion events, the leading alarmists (including Mr. Gore and Mr. Hansen) have thrice been invited to speak and have thrice refused. (I still wait for the media priesthood to come down on Mr. Gore for blatantly and pro-actively restricting media access to and reporting on his speeches – but that’s another story.)
The second gripe was that the event concentrated too much on politics and not enough on “science.” It was left to me (again) to point out several things: there was a considerable amount of science on the program, and there had been more in New York three months earlier; this event was deliberately tilted toward the political because it had been put in Washington in response to the complete politicization of the topic; and the enormous economic consequences of the suggested “remediation” methods needed to be presented more clearly to the taxpayers and citizens who will be expected to bear the implied burden.
At this point, further discussion of this topic ended, since it had been made clear that I wasn’t part of the snob-school-of-thought on this topic. But it’s pretty clear that the alarmists and their anointed stenographers will continue to play games with this (even in self-contradiction) for as long as they think that they can get mileage out of it.
Heartland Institute President Joe Bast opened the formal proceedings with a few informal remarks of note.
He noted that the past is littered with a number of occasions when populaces have found themselves in the grip of popular delusions – and that all of these situations had many things in common…. and those things can be seen in the present “AGW” panic. A most notable aspect is that these panics inevitably reward interest groups, and this creates a feedback effect that causes the interest groups to feed the panic.
In the present situation, a key problem is that the interest circle now extends to include business interests; when the purported “regulation” of the situation comes into play, business interests will try to use their influence to see that the impending regulatory regime is (as much as possible) tilted in their favor.
A final problem is that the political posturing now going on has reached a very advanced stage, particularly in Washington. Since politicians are regularly (and deservedly) rounded upon for focusing only on the short term (usually their own prospects in the next election), this subject provides an easy way to posture about having concern for the longer term. It also allows many politicians to act like they know something about science (Mr. Waxman’s recent remarks notwithstanding) – and of course serves as a convenient vehicle for increased influence for both themselves and their like-minded colleagues.
The opening keynote talk was given by Prof. Richard Lindzen, who holds a chaired professorship in meteorology at MIT. Prof. Lindzen’s talk was less technical than the one he delivered in New York back in March, but of course this was a Washington DC venue.
He noted that by his reckoning, one of the sadder consequences of the entire “global warming”/”climate change” panic is that it has served to thoroughly corrupt “climate science” over the past 20 years. The situation with weather forecasters and state climatologists has been better – but there has been a recent ominous trend of the forcing-out of state climatologists for their non-belief in climate-change-hysteria. (The situation surrounding Pat Michaels, the State of Virginia, and the University of Virginia remains a swirl of controversy – as a search will clearly show – but it seems clear that Dr. Michaels was the target of retribution for refusing to sign on to the designated political orthodoxy with regard to “climate change.”)
Dr. Lindzen noted a three-stage process by which climate science has been (ab)used to support alarmism:
1) Triage; this basically involves figuring out what “they” want to hear and then telling “them” that.
2) Opportunism of the weak; weak science is molded to fit the “consensus,” and is used as a vehicle by weak scientists to gain unwarranted authority.
3) Relate “your specialty” to “climate change;” this has become a common occurrence (note the wide variety of things that are now abstractly blamed on “global warming” or “climate change”) – it is a nuisance, but it serves to help maintain the sense of alarm.
I found this triplet (three-headed monster?) to be a bit stunning since – sadly – it is not confined to “climate change.” This kind of stuff has become all too common in my own world (risk capital and technology start-ups) – where the financial side of the aisle has become so weak on scientific and technical issues as to be ready meat for the same sort of abuse. (Find out what the finance people dearly want to hear, carefully mold your tangible results to indicate that you have the magic “open-sesame” tricks, and link it all to some presently-faddish issue of the day. Here, we are at the “please don’t get me started” point so I’ll stop there.)
One of the extraordinary aspects of the “climate change” panic is that it appears to be confined almost entirely within Washington itself; issue polling continues to show that in the hierarchy of worries, Americans continue to rank “climate change” dead last among their concerns. (And perhaps the alarmists in Washington are starting to notice this – since as I noted yesterday, as of this week (post-Memorial-Day recess) the “climate change legislation” is now being rebranded as an “energy bill.”) The climate panic meme chiefly appeals to the educated classes on the east and west coasts – and this clearly feeds into an old sociological story, since at its core the meme is authoritarian and based on belief.
Focusing a bit more on the scientific details, Prof. Lindzen noted that since the beginning of the Industrial Age, the concentration of atmospheric carbon dioxide has increased from some 280 ppmv (parts-per-million by volume) to 380 ppmv. While that may represent an apparently large increase, the reality is that carbon dioxide is (and remains) a very trace gas in the Earth’s atmosphere. The claimed global temperature anomaly during that period amounts to some 0.5 – 0.8 degrees Celsius – so even if one accepts that quantitative result, it remains problematic to claim that such a change in the Earth’s apparent temperature can be attributed entirely to such a tiny amount of carbon dioxide. Such changes are entirely within the realm of natural variations, and it is natural variations that provide the simplest explanation for the observed results. (As noted above, on this matter I find that my own analysis of this problem agrees with Prof. Lindzen’s analysis; to a trained eye-and-mind, an even-cursory analysis of the data and the “system” indicates that natural variability is very large in this system – and such large variability is an overwhelmingly-powerful determinant of specific outcomes.)
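As a back-of-the-envelope check on the concentrations cited above (the 280 and 380 ppmv figures are from the talk as reported; the script itself is just illustrative arithmetic):

```python
# Back-of-the-envelope arithmetic for the CO2 concentrations cited above.
PRE_INDUSTRIAL_PPMV = 280   # concentration at the start of the Industrial Age
MODERN_PPMV = 380           # concentration cited in the talk

# CO2 as a fraction of the whole atmosphere, by volume
modern_fraction = MODERN_PPMV / 1_000_000
# Relative rise since pre-industrial times
relative_increase = MODERN_PPMV / PRE_INDUSTRIAL_PPMV - 1
# Absolute rise, as a fraction of the whole atmosphere
absolute_increase = (MODERN_PPMV - PRE_INDUSTRIAL_PPMV) / 1_000_000

print(f"CO2 today: {modern_fraction:.4%} of the atmosphere")
print(f"Relative increase: {relative_increase:.1%}")
print(f"Absolute increase: {absolute_increase:.4%} of the atmosphere")
```

In other words, the same change can be framed as a roughly 36% relative increase or as a shift of one hundredth of one percent of the atmosphere – which framing one reaches for is very much part of the argument.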
As such, the narrative which makes much noise about qualitative “directions” of possible trends is grossly insufficient – the critical information that needs more focus is the quantitative inquiry regarding “How much?” If the small magnitude of claimed changes is not properly grasped, then we risk conflating the trivial with the serious.
Prof. Lindzen also noted that the IPCC’s climate models do not produce known patterns of natural behavior – such as the El Niño-Southern Oscillation (ENSO – more popularly known simply as “El Niño”), the Pacific Decadal Oscillation (PDO), and the Atlantic Multidecadal Oscillation (AMO). Somewhat perversely, the inability of the models to produce these phenomena is used to argue that “human forcing” must be the missing factor.
While the IPCC models cheerfully predict runaway global warming, the IPCC’s own data do not support this outcome; connecting these things points to an ongoing problem – there is not enough emphasis on testing the various models.
I can understand this situation from my own background in mathematical modeling for more applied engineering purposes. In an ideal world (and this is a world that seems to be assumed implicitly in the “marketing” of “climate models” to the general public), all of the knowledge and physics would be fully embodied in the mathematical structure of the “model” itself – while the specific details can be included by providing specific numerical values for specific fully-physical parameters included in that fundamental model. Unfortunately, this situation simply cannot exist for anything but the most trivial physical systems; of necessity, some assumptions and approximations must enter into any mathematical model. In even modestly-complex systems, detailed knowledge is imperfect at best – some degree of semi-empirical or empirical structure must be included; even in cases where the physics is modestly well-known, the resulting mathematics are often poorly suited for use in iterative computation, and must be modified to make them suitable for that task.
This leads to two structural problems. First, compensation for a deficiency of knowledge is often attained by fitting mathematical functions to the available data. Second, the modeling challenge is often broken down into numerous attacks on very small pieces of the problem – and the small pieces are then stitched together into a larger skein on the assumption (which is dangerous in the extreme) that all the pieces will simply “play nice” together and exhibit proper behavior.
In these situations, the best a “model” can do is reproduce the data that was fed into it; this can be useful when the objective is to create a reasonably-predictive “between the points” analytical description that can be used to predict outputs for previously non-tabulated inputs. However, whenever you use that “model” beyond the range bounded by the input data tabulation…. all bets are off. In those situations, about the worst thing you can do is see some crazy behavior and tell yourself that it has basic meaning and/or that you’ve “discovered” something remarkable. A few years back, I watched a fiasco of this sort – where a “model” was showing negative differential capacitances that a) had never been expected before, and b) made absolutely no sense at all from the viewpoint of fundamental physics. The correct notion to get from such a “result” is that the model has serious limitations and cannot be regarded as predictive; instead, for nearly two years the claim kept being made that somehow this mathematical model was generating a discovery of some “new physics” that for some reason no one had noticed before (and which no one could actually measure).
Ultimately, that’s what’s happening now with many of these climate models. Data is going in showing very little happening…. and an output is produced showing a spectacular runaway in the near future. The way to understand this situation is quite simple: The model is behaving with such spectacular strangeness that it tells us that the “prediction” is simply impossible and irrelevant. The model itself is telling us that it is inherently limited, and is being used far outside its range of capability.
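The "all bets are off outside the fitted range" point is easy to demonstrate with a toy example (this is a generic curve-fitting illustration, not any actual climate model): fit a high-order polynomial to gently varying data, and it tracks the data well inside the fitted interval but can swing wildly just beyond it.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Observed" data: a gently varying signal with a little noise,
# available only on the interval [0, 10].
x = np.linspace(0.0, 10.0, 25)
y = 0.05 * x + 0.2 * np.sin(x) + rng.normal(0.0, 0.05, x.size)

# Fit a high-order polynomial -- it reproduces the data well in-range...
fit = np.poly1d(np.polyfit(x, y, deg=9))
in_range_err = np.max(np.abs(fit(x) - y))  # small: the "model" looks fine here

# ...but evaluate it only 50% past the end of the data and the
# high-order terms, which the data never constrained, take over.
extrapolated = fit(15.0)

print(f"max in-range error: {in_range_err:.3f}")
print(f"value at x=15 (outside the data): {extrapolated:.1f}")
```

The in-range fit says nothing about the out-of-range behavior; the dramatic excursion at x=15 is an artifact of the fitting procedure, not a "discovery" about the underlying system – which is exactly the mistake described above.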
But to return to Prof. Lindzen’s narrative, he closed his talk with one final point about the inherent problem with the IPCC models and their predictions. The notion that an increase of the atmospheric CO2 concentration will increase the radiative forcing is a relatively simple one; as any chemistry (or physics) undergraduate knows, carbon dioxide has a very strong absorption peak in the infrared (infrared photons are absorbed by excitation of the vibrational states of the molecule). Since atmospheric CO2 absorbs infrared energy non-isotropically (from the “ground” direction) but re-radiates it isotropically, there is some net capture of infrared radiation that would otherwise have escaped into space. However, as noted earlier, the atmospheric concentration of CO2 is very small – and the asymmetry between non-isotropic absorption and isotropic re-radiation affects only a small portion of the ground-emitted infrared radiation. This tiny amount of extra-captured infrared radiation is far too small to lead to any notable sort of atmospheric warming.
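For numerical context on this direct effect, the approximation commonly used in the IPCC literature (Myhre et al.) gives the no-feedback forcing from a CO2 change as ΔF = 5.35·ln(C/C₀) W/m². This formula is not from the talk – I add it only to put a number on the "direct effect" that both sides agree is modest; the entire dispute, as the next point makes clear, is over the feedbacks layered on top of it.

```python
import math

# Logarithmic approximation for direct CO2 radiative forcing
# (Myhre et al. 1998, as used in IPCC reports); not from the talk itself.
ALPHA = 5.35  # W/m^2 per natural log of the concentration ratio

def co2_forcing(c_ppmv: float, c0_ppmv: float = 280.0) -> float:
    """Direct (no-feedback) radiative forcing, in W/m^2, for a change
    in CO2 concentration from c0_ppmv to c_ppmv."""
    return ALPHA * math.log(c_ppmv / c0_ppmv)

print(f"280 -> 380 ppmv: {co2_forcing(380.0):.2f} W/m^2")
print(f"280 -> 560 ppmv (a doubling): {co2_forcing(560.0):.2f} W/m^2")
```

Note the logarithmic shape: each additional increment of CO2 buys less forcing than the last, which is why the argument turns on amplification rather than on this term.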
Where the models go next is to where any complex system goes – to feedback mechanisms. Feedback of course is used by intent in circuit theory (if you have a radio that works, you’re using that idea). Natural systems are filled with feedback mechanisms; most of these mechanisms are (not surprisingly) negative feedback mechanisms – since negative feedback basically underlies all of the self-regulating natural systems that we see around ourselves every day. Not surprisingly, all of the data in the meteorological realm show negative feedback; however, all of the IPCC models include positive feedback in their mathematics. It is this suspect positive feedback that is necessary for these “models” to make their apocalyptic, non-data-based predictions of a catastrophic, runaway “greenhouse” event.
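The negative-vs-positive distinction in the paragraph above can be captured with the standard gain formula from control theory (a generic illustration with made-up numbers, not any specific climate model): an open-loop response ΔT₀ becomes ΔT = ΔT₀/(1 − f), where f is the feedback factor.

```python
def feedback_response(delta_t0: float, f: float) -> float:
    """Closed-loop response for an open-loop change delta_t0 and
    feedback factor f (standard control-theory gain formula).

    f < 0      : negative feedback damps the response (self-regulation)
    0 < f < 1  : positive feedback amplifies it
    f -> 1     : the response diverges (the 'runaway' case)
    """
    if f >= 1.0:
        raise ValueError("f >= 1 means an unstable (runaway) system")
    return delta_t0 / (1.0 - f)

base = 1.0  # an illustrative open-loop response, in degrees C
print(f"negative feedback (f = -0.5):  {feedback_response(base, -0.5):.2f} C")
print(f"no feedback       (f =  0.0):  {feedback_response(base, 0.0):.2f} C")
print(f"positive feedback (f = 0.75):  {feedback_response(base, 0.75):.2f} C")
```

The sign of f is the entire argument in miniature: with f negative the system shrugs off perturbations, while the apocalyptic scenarios require f to be both positive and uncomfortably close to 1.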
This is a rather remarkable situation, since – in Prof. Lindzen’s words – it clearly shows that the foundation of “global warming theory” is…. wrong.
(Snide aside: Nature is filled with negative feedback mechanisms, while politics is filled with positive feedback mechanisms. Hmm….)
Well, I’ll stop there, since that’s enough for today. Stay tuned for Part II, in which we’ll discuss the release of the massive report by the Non-Governmental International Panel on Climate Change (NIPCC), and get some gems of wisdom from Anthony Watts, Fred Singer, Willie Soon, and Harrison Schmitt.