Stalking the uncertainty monster

by Judith Curry
It's time to check in with the Climate Uncertainty Monster.

The occasion for this post is an invitation to present a keynote talk at the 2nd International Workshop on Econometric Applications in Climatology.  The Workshop website is [here].
For those of you who are new to Climate Etc., the concept of the ‘climate uncertainty monster’ seeded my inaugural posts at Climate Etc. in 2010 (Tag Uncertainty for the entire series; see especially the earlier posts).
New presentation
My new ppt presentation can be downloaded here [uncertainty].  Check out the presentation; lots of good monster cartoons. Below is the text of my prepared remarks (I rambled on at the end including some material from my recent testimony that isn’t included in these remarks):
I’ve long been concerned about how the IPCC treats uncertainty, and in 2003 I started gathering my thoughts on this. A seminal event in my thinking on this subject occurred in 2010, when I attended the Royal Society Meeting on Scientific Uncertainty.
Let me start by describing the uncertainty monster, in the context of the debate on climate change.  The “monster” is a metaphor used in analysis of the response of the scientific community to uncertainties at the climate science-policy interface. Confusion and ambiguity are associated with:

  • knowledge versus ignorance
  • objectivity versus subjectivity
  • facts versus values
  • prediction versus speculation
  • science versus policy

The climate uncertainty monster has its roots in philosophy and sociology.  Monster theory regards monsters as symbolic expressions of cultural unease that pervade a society and shape its collective behavior.  Dutch philosopher Martijntje Smits articulated the monster as the co-existence of public fascination and discomfort with newer technologies.  Dutch social scientist Jeroen van der Sluijs articulated the ‘uncertainty monster’ in terms of the ways in which the scientific community responds to the monstrous uncertainties associated with environmental problems.
By way of introduction to this topic, I’m going to go through some uncertainty monster coping strategies that are in evidence at the interface between climate science and policy.
Uncertainty monster hiding or the “never admit error” strategy can be motivated by a political agenda or because of fear that uncertain science will be judged as poor science by the outside world.  Apart from the ethical issues of monster hiding, the monster may be too big to hide and uncertainty hiding enrages the monster.
Ignoring the monster is typified by this statement from President Obama’s web page: “Call out the Climate Deniers – 97% of scientists agree.”  A dubious paper that found a 97% consensus on fairly trivial aspects of climate change was thus morphed into “97% of scientists agree that human-caused climate change is dangerous.”
Monster simplifiers attempt to transform the monster by subjectively quantifying and simplifying the assessment of uncertainty. Monster simplification is formalized in the IPCC  by guidelines for characterizing uncertainty in a consensus approach consisting of expert judgment in the context of a subjective Bayesian analysis.
The uncertainty monster exorcist focuses on reducing the uncertainty through advocating for more research. In the 1990s, a growing sense of the infeasibility of reducing uncertainties in global climate modeling emerged in response to the continued discovery of unforeseen complexities and sources of uncertainty.  For each head that climate science chops off the uncertainty monster, several new monster heads tend to pop up.
The first type of uncertainty monster detective is the scientist who challenges existing theses and works to extend knowledge frontiers.  A second type is the watchdog auditor, whose main concern is accountability, quality control and transparency of the science. A third type distorts and magnifies uncertainties as an excuse for inaction for financial or ideological reasons.
Monster assimilation is about learning to live with the monster and giving uncertainty an explicit place in the contemplation and management of environmental risks.  Assessment and communication of uncertainty and ignorance, along with extended peer communities, are essential in monster assimilation. The challenge to monster assimilation is the ever-changing nature of the monster and the birth of new monsters.
The IPCC faces a daunting challenge with regards to characterizing and reasoning about uncertainty, assessing the quality of evidence, linking the evidence into arguments, identifying areas of ignorance, and assessing confidence levels. The IPCC uses a common vocabulary to express quantitative levels of confidence based on the amount of evidence (number of sources of information) and the degree of agreement (consensus) among experts.   Because of the difficulties of objective uncertainty assessments, the IPCC relies primarily on expert judgment in the context of a subjective Bayesian analysis.  A quantitative likelihood scale represents ‘a probabilistic assessment of some well-defined outcome having occurred or occurring in the future.’
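The quantitative likelihood scale amounts to a mapping from probabilities to calibrated language. The sketch below uses the thresholds from the IPCC’s published uncertainty guidance; the function itself is my own illustration (note that the official scale allows overlapping ranges, e.g. ‘likely’ formally covers 66–100%, which a simple mapping like this flattens):

```python
def ipcc_likelihood_term(p: float) -> str:
    """Map a probability p (0..1) to IPCC calibrated likelihood language.

    Thresholds follow the IPCC uncertainty guidance note; the function
    itself is illustrative, since the official ranges overlap.
    """
    if p > 0.99:
        return "virtually certain"
    if p > 0.90:
        return "very likely"
    if p > 0.66:
        return "likely"
    if p >= 0.33:
        return "about as likely as not"
    if p >= 0.10:
        return "unlikely"
    if p >= 0.01:
        return "very unlikely"
    return "exceptionally unlikely"

# e.g. a subjective probability of 0.95 would be reported as "very likely"
```

The point of the sketch is how much is hidden by the mapping: a single expert-judged number gets translated into a word, and the reader of the assessment never sees the reasoning behind the number.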
The IPCC characterization of uncertainty is based upon a consensus building process that is an exercise in collective judgment in areas of uncertain knowledge. The general reasoning underlying the IPCC’s arguments for anthropogenic climate change combines a compilation of evidence with subjective Bayesian reasoning. A ‘consilience of evidence’ argument consists of independent lines of evidence that are explained by the same theoretical account.
In my assessment, the IPCC has institutionalized overconfidence. Scientists disagree for several reasons:

  • Insufficient observational evidence
  • Disagreement about the value of different classes of evidence (e.g. models)
  • Disagreement about the appropriate logical framework for linking and assessing the evidence
  • Assessments of areas of ambiguity and ignorance
  • Belief polarization as a result of politicization of the science

The climate debate is unfortunately characterized by competing certainties, illustrated by the cartoon of the two guys hitting each other over the head. If uncertainty and ignorance are adequately acknowledged, then the competing certainties disappear. Disagreement then becomes a basis for focusing research in a particular area, and so moves the science forward.
About 5 years ago, following Climategate in fact, I became acutely concerned that climate scientists were focused on uncertainty hiding and simplification, which I regarded as a very unhealthy state of affairs for climate science. I began writing about this problem from multiple perspectives, including mathematics, philosophy, engineering applications, regulatory science, and even social psychology. I was seeking some new ideas for overcoming scientists’ bias about this topic and for employing more objective methods for understanding, characterizing and communicating uncertainty.

  • Curry, JA 2011: Reasoning about climate uncertainty. Climatic Change
  • Curry, JA and Webster, PJ 2011: Climate science and the uncertainty monster. Bull. Amer. Meteorol. Soc.
  • Curry, JA 2011: Nullifying the climate null hypothesis. WIRES Climate Change
  • Curry JA, 2013: Climate change: No consensus on consensus. CAB Review

My main concern has been the overconfident conclusions put forward by the IPCC:

  • Consensus building process introduces biases
  • Ignorance and ambiguity are unaccounted for
  • Politicization acts to marginalize skeptical perspectives
  • Leads to overconfident conclusions

Symptoms of an enraged uncertainty monster include increased levels of confusion, ambiguity, discomfort and doubt.
Politicization of the issue of climate change has introduced huge biases into the science. When a scientific issue becomes politicized, and scientists attempt to speak consensus to power, a scientific discussion of uncertainties is regarded as a political act.  There is an ideology that many climate scientists subscribe to, which I’ve termed the UNFCCC/IPCC ideology:

  1. Anthropogenic climate change is real
  2. Anthropogenic climate change is dangerous
  3. Action is needed to prevent dangerous climate change
  4. Deniers are attacking climate science and scientists
  5. Deniers and fossil fuel industry are delaying UNFCCC CO2 stabilization policies

The problem with scientists subscribing to this ideology is that there is a tendency toward absence of doubt, intolerance of debate, appeal to authority, a desire to convince others of the ideological truth, and a willingness to punish those who don’t concur.
Given the enormous biases that ‘expert judgment’ and ideology introduce into climate science, I have been pondering the feasibility of some more objective ways of understanding, characterizing and communicating uncertainty.
The bar on the bottom provides a good illustration of the different levels of uncertainty, starting on the left with determinism (implying no uncertainty). Statistical uncertainty is when we have formulated a robust, well-defended PDF. The next level of uncertainty is when we don’t know the full PDF, but we have some well-defended percentile bounds. Scenario uncertainty means that we can enumerate plausible outcomes, with at best a rough estimate of their likelihood. The next level of uncertainty is when we have confidence only in the sign or trend. With greater uncertainty than that, we head into the territory of ignorance. Personally, I would assess 20th century climate attribution and 21st century climate projections at the 4.2 level (encompassing elements of scenario uncertainty and recognized ignorance).
In the previous slide, we discussed the uncertainty level. This diagram also illustrates the nature of uncertainty. Epistemic uncertainty means that we have limited knowledge or information – this is the type of uncertainty that can in principle be reduced. Ontic or aleatory uncertainty is irreducible; it relates to unavoidable unpredictability.
Another key factor to include in uncertainty assessments is quality of evidence.

  • High quality –  Further research is very unlikely to change our confidence in the estimate of effect
  • Moderate quality – Further research is likely to have an important impact on our confidence in the estimate of effect and may change the estimate.
  • Low quality – Further research is very likely to have an important impact on our confidence in the estimate of effect and may change the estimate
  • Very low quality –  Any estimate of effect is very uncertain.

As an example, I would argue that the quality of the historical surface temperature record is moderate to high quality. I suspect that paleoclimate estimates of global surface temperature are very low to low quality.
One of my biggest concerns about reasoning about climate uncertainty is that Bayesian methods have trouble dealing with true ignorance. In classical two-valued logic, unknowns are undifferentiated, which may lead to false assertions. Evidence-based three-valued logic, the so-called Italian flag, is more honest about unknowns and allows for a better analysis of uncertainty.
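A minimal sketch of the Italian-flag idea (my own toy illustration, not a published implementation): evidence is partitioned into ‘green’ (support), ‘red’ (refutation) and ‘white’ (uncommitted), so ignorance is carried explicitly rather than being forced into one of two truth values:

```python
from dataclasses import dataclass

@dataclass
class ItalianFlag:
    """Toy three-valued 'Italian flag' evidence measure.

    green: proportion of evidence supporting a proposition
    red:   proportion of evidence refuting it
    white: the remainder, left explicitly uncommitted (ignorance)
    """
    green: float
    red: float

    @property
    def white(self) -> float:
        # Whatever is neither support nor refutation stays uncommitted,
        # instead of being redistributed between 'for' and 'against'.
        return 1.0 - self.green - self.red

# Half the evidence supports the claim, a fifth refutes it;
# the remaining 0.3 is declared as ignorance, not hidden.
attribution = ItalianFlag(green=0.5, red=0.2)
```

Contrast this with a single Bayesian probability, where that 0.3 of genuine ignorance would have to be absorbed into the prior and silently split between belief and disbelief.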
So here is the problem as I see it.  The drive to reduce scientific uncertainty in support of precautionary and optimal decision making strategies regarding CO2 mitigation has arguably resulted in:

  • unwarranted high confidence in assessments of climate change attribution, sensitivity and projections
  • relative neglect of black swans and dragon kings
  • relative neglect of decadal and longer scale modes of natural climate variability
  • conflicting “certainties” and policy inaction

The current focus on the precautionary principle and optimal decision making is driving climate policy to a position between a rock and a hard place.  Motivated by the precautionary principle, emissions targets are being set based on highly uncertain climate model simulations.  Classical decision analysis can suggest statistically optimal strategies for decision makers when uncertainty is well characterized and model structure is well known.  Optimal decision making is a poor fit for the climate change problem.
The reason that we find ourselves between a rock and a hard place on the climate change issue is that policy makers have mistaken climate change for a tame problem. Climate change is better viewed as a ‘wicked mess’. A wicked problem is complex with dimensions that are difficult to define and changing with time. A mess is characterized by the complexity of interrelated issues, with suboptimal solutions that create additional problems.
When confronted with deep uncertainties surrounding a complex wicked problem, better decision analytic frameworks include:

  • Enlarge the knowledge base for decisions
  • Adaptive management
  • Build a resilient society

In closing, I leave you with this quotation by Bruce Beck:
“Being open about uncertainty should be celebrated: in illuminating where our explanations and predictions can be trusted and in proceeding, then, in the cycle of things, to amending their flaws and blemishes.”
JC reflections
In the 5 years since I started stalking the uncertainty monster, we’ve seen a lot of intellectual progress on how to frame and approach this issue.  It is becoming easier for scientists to do and publish research that challenges the consensus.  That’s the good news.
The bad news is that the interface between climate science and policy remains badly broken.  Many politicians seem to have become uncertainty deniers, with President Obama leading the pack.  The UNFCCC/IPCC is on a collision course with reality; it will be interesting to see how the Paris meeting goes next December, and how the IPCC AR6 will proceed.  But science seems less and less relevant to what is going on in the policy arena.  Which is fine; please get out of our way and let us do our science, so that we can try to figure all this out by exploring the knowledge frontiers rather than pledging allegiance to the consensus.