Embracing uncertainty in climate change policy (!)

We argue for a redesign of climate change mitigation policies to be ‘anti-fragile’ with respect to scientific uncertainty. – Otto et al.

A very interesting new article in Nature Climate Change:
Embracing uncertainty in climate change policy
Friederike E. L. Otto, David J. Frame, Alexander Otto and Myles R. Allen
Abstract. The ‘pledge and review’ approach to reducing greenhouse-gas emissions presents an opportunity to link mitigation goals explicitly to the evolving climate response. This seems desirable because the progression from the Intergovernmental Panel on Climate Change’s fourth to fifth assessment reports has seen little reduction in uncertainty. A common reaction to persistent uncertainties is to advocate mitigation policies that are robust even under worst-case scenarios, thereby focusing attention on upper extremes of both the climate response and the costs of impacts and mitigation, all of which are highly contestable. Here we ask whether those contributing to the formation of climate policies can learn from ‘adaptive management’ techniques. Recognizing that long-lived greenhouse gas emissions have to be net zero by the time temperatures reach a target stabilization level, such as 2 °C above pre-industrial levels, and anchoring commitments to an agreed index of attributable anthropogenic warming would provide a transparent approach to meeting such a temperature goal without prior consensus on the climate response.
Published in Nature Climate Change [link]
Extended excerpts, with embedded comments:
The primary reasons for the slow progress in global mitigation policy are not scientific. They are strategic — economic and political barriers to action arising from weak incentives to mitigate and strong incentives to free-ride on the efforts of others, internationally and inter-generationally. To be successful, a climate change mitigation policy not only has to overcome those economic and political barriers, but also has to withstand and adapt to other external pressures that originate from shifts in the economy (for example, ‘austerity’) and political interests (for example, ‘climate scepticism’). Attempts have been made to design policies that are more robust to these external pressures, for example, by attempting to find ways for regulators to credibly commit both themselves and their successors in an environment of changing power structures, locking in certain policies through institutional design, capitalizing on emergent government structures and self-reinforcing effects of certain policies.
Connecting these lines of thought to those of adaptive management and the governance of complex systems, here we argue for a redesign of climate change mitigation policies to be ‘anti-fragile’ with respect to scientific uncertainty. Anti-fragile means that uncertainty and changes in scientific knowledge make the policy more successful by allowing for trial and error at low societal costs. Hence, anti-fragile re-design allows the incorporation of a wider range of risks of concern to policymakers, potentially allowing more successful mitigation policies.  Arguably, a key pre-requisite for an anti-fragile climate policy is an index not beholden to high scientific uncertainty. Here we suggest ‘attributable anthropogenic warming’ as an anti-fragile index against which pledges could be reviewed, independent of the details of individual countries’ mitigation policies.
Precautionary mitigation policies
The predominant approach to the design of climate mitigation policies refers to the precautionary principle, embedded in the United Nations Framework Convention on Climate Change (UNFCCC). As any climate policy has the joint goals of enabling continued human development while staying within the boundaries posed by the limitations of the climate system, a trade-off between these goals has to be struck.
In the context of climate change, by far the most discussed structure pertaining to the cost-effectiveness approach is the 2°C goal adopted by the UNFCCC in Cancun. A substantial body of research into how we might achieve successful management of the climate change problem has focused on meeting the agreed 2°C target — an approach that is especially common among physical climate scientists researching the problem.
JC comment:  My concerns with the 2°C target are discussed in Politics of the 2C target and Challenging the 2C target.
First, minimizing the risk of high damages requires a huge and immediate mitigation effort that is too demanding of communities with multiple priorities. Second, by focusing on the upper tail of the distribution of possible future warming, the required mitigation effort for meeting the climate target becomes very sensitive to the upper bound of the climate-system response, which is badly constrained by observations, and hence easily contested by different interest groups.
Policies invoking this interpretation of the precautionary principle can, therefore, lead to high and uncertain mitigation costs to guard against potentially high but equally uncertain impacts. They are, therefore, ‘fragile’ in the sense that uncertainty in both mitigation costs and impacts make it more difficult for any policy to be adopted, providing a strong incentive to defer decisions until these uncertainties are resolved. Yet, this could mean a recipe for indefinite procrastination: some uncertainties, including the costs of mitigation and the speed at which temperatures respond to falling emissions, may only be resolved after substantial mitigation efforts are already under way. The potentially paralysing impact of uncertainty becomes particularly acute if rational fears of over-mitigating combine with the politics of special interests to create additional pressures on negotiations.
In view of these issues, we argue that an approach that (a) is less beholden to the contestable tails of climate distributions, (b) more fully accounts for the set of risks governments care about, and (c) is less dependent on a globally binding mandate, may be a better way of preserving flexibility in climate mitigation. There are many currents of thought associated with adaptive management, resilience and more recently ‘anti-fragility’ that argue for a more iterative approach to the management of complex problems. Although these approaches are usually associated with environmental or natural resource management or, when in the field of climate change, responses to climate change and adaptation strategies, we argue that some of this thinking could be constructively used in mitigation strategies, too. Basing strategy on more robust statistical properties, such as median estimates of both climate impacts and mitigation costs, reduces dependence on contestable tails of these distributions.
JC comment:  I agree with their general framing of this.  However, I don’t think that focusing on the median versus the tail really helps, given the uncertainties.
To be credible, however, such a policy must also adapt to new scientific findings in a predictable way that itself minimizes the risk of unacceptable outcomes, such as a sudden and precipitate revision in mitigation pathway, and avoids placing an intolerable burden on future decision-makers. Simply stating that policies will be revised in the light of new evidence is insufficient: some constraints are needed on the scale of these revisions if policies are to be used as a basis for investment.
Flexible policies have been advocated before that internalize costs of emission-externalities contingent on observed climate states and thus adjust to new information about the uncertain climate response to emissions. Policies that automatically adjust expenditures or efforts on the basis of some numerical parameter (usually consumer prices) are commonplace. Indexing makes it easier for politicians to commit to long-term stability than might otherwise be the case if explicit assent were required for every policy adjustment. It can also help create a normative aura around policies if they are seen to reflect an underlying fairness in the indexing.
An index of anthropogenic warming
A number of features are desirable in the index variable: first, it should be clearly relevant to the overall policy goal; second, it should evolve predictably to minimize short-term policy volatility; and third, it should be simple to calculate and update regularly. Since governments have already adopted the goal of limiting global average warming above pre-industrial temperatures to 2°C, and recognizing that the majority of climate impacts scale more closely with this than any other readily accessible variable, an index based on global average near-surface temperature is a logical starting point.
Global temperature itself, however, is subject to natural inter-annual and interdecadal variability that would significantly increase the risks of indexing climate policy on this variable alone. Investments in energy infrastructure mature over timescales of decades. If a global carbon tax were anchored to global temperature, as proposed by McKitrick, then a large volcano or an upward fluctuation in the Pacific Decadal Oscillation could depress or inflate carbon prices for a decade or more. Neither is relevant to the long-term goal of limiting anthropogenic warming, but could unnecessarily bankrupt investors in either renewable or fossil energy supplies, respectively.
A more predictable variable that is also more closely tied to the overall policy goal would be an index of warming attributable to human influence:  this has been defined in terms of a weighted least-squares fit between observed temperatures and the expected temperature responses to anthropogenic and natural factors.
Estimates of attributable warming are traditionally updated in the scientific literature when new statistical methods or new simulations of anthropogenic and natural warming become available, and assessed every few years by the IPCC. This would be inappropriate for an index variable: the method of calculating the index should be subject to scientific scrutiny, but if the value of the index itself were directly dependent on scientific judgement, this would place undue pressure on the scientists making the assessment. Fortunately, when the target is net anthropogenic warming, very simple approaches based solely on global mean temperature and radiative forcing time-series give results that are statistically indistinguishable from the most complex statistical and modelling tools available.
JC comment:  This index assumes that all climate change on relevant time scales (multi-decadal to centuries) is externally forced.  I believe this assumption to be incorrect (this issue is at the heart of the attribution argument), as it ignores internal variability (e.g. ocean oscillations on multi-decadal and longer timescales).
This index of anthropogenic warming requires no complex model calculations and can be updated as soon as new figures for annual mean temperatures and radiative forcing are released. It would have been assigned a value of 0.54°C in 1992, and has since monotonically increased by 0.37°C. The rate of increase slowed slightly after 2000 in response to the so-called hiatus in observed warming, showing how this index responds to evolving observations, but it does so sufficiently slowly that it would not compromise its use as a policy index. A plot of regression residuals shows nothing unprecedented about the past two decades.
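To make the mechanics concrete, here is a minimal sketch of the kind of regression-based index the excerpt describes: observed temperatures are regressed on the expected anthropogenic and natural responses, and the attributable warming is the scaled anthropogenic response in the latest year. This is an unweighted ordinary-least-squares stand-in for the weighted fit the paper refers to, and all data below are synthetic, chosen purely for illustration.

```python
import numpy as np

def attributable_warming(t_obs, x_anth, x_nat):
    """Estimate warming attributable to human influence by regressing
    observed temperature anomalies on the expected anthropogenic and
    natural response time-series (OLS here; the paper describes a
    weighted least-squares fit). Returns the scaled anthropogenic
    response in the final year, in the same units as t_obs."""
    X = np.column_stack([x_anth, x_nat])
    beta, *_ = np.linalg.lstsq(X, t_obs, rcond=None)
    return beta[0] * x_anth[-1]

# Synthetic illustration (not real data): a smooth anthropogenic
# trend, a small oscillatory natural component, and observation noise.
years = np.arange(1900, 2015)
x_anth = 0.008 * (years - 1900)                           # assumed forced response, degC
x_nat = 0.05 * np.sin(2 * np.pi * (years - 1900) / 11.0)  # assumed natural response
rng = np.random.default_rng(0)
t_obs = 1.1 * x_anth + 0.5 * x_nat + rng.normal(0.0, 0.08, years.size)

print(round(attributable_warming(t_obs, x_anth, x_nat), 2))
```

Because the regression averages over the whole record, a single volcanic year or decadal fluctuation moves the index only slowly, which is exactly the smoothness property the excerpt highlights.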
JC comment:  I like the idea of an index variable, especially one that does not rely on global climate models.  The two choices on the table are global temperature (McKitrick’s idea) and the Otto et al. index related to external forcing. McKitrick’s index is perhaps too sensitive to natural variability while Otto et al. is too insensitive to natural variability.  Perhaps split the difference here?
Anti-fragile policies
Given the burgeoning uptake of adaptive management techniques in the climate adaptation and natural resource management domains, their absence from mitigation discussions is striking. Using the index described above (or a variant of it), a range of automatically indexed policies could be explored: here we simply outline some illustrative examples reflecting the goal of limiting anthropogenic warming to 2°C and the recognition that net emissions of long-lived greenhouse gases, including carbon dioxide and nitrous oxide, have to reach zero to stabilize temperatures. The simplest policy would be indexed emission reductions: countries could commit to reduce their emissions from a predetermined baseline by a fraction proportional to anthropogenic warming from the time the policy is adopted, rising to 100% when this warming reaches 2°C.
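The indexed-reduction rule in that excerpt is simple enough to write down directly. The sketch below is one plausible reading (the function name and the linear form between adoption and the 2°C target are my assumptions, not the authors' specification), using the index values quoted earlier in the post: 0.54°C at adoption in 1992, 0.37°C higher since.

```python
def indexed_reduction_fraction(warming_now, warming_at_adoption, target=2.0):
    """Fraction by which emissions must fall below the predetermined
    baseline: proportional to attributable warming since the policy was
    adopted, reaching 100% when anthropogenic warming hits the target.
    A linear schedule, clipped to [0, 1], is assumed for illustration."""
    frac = (warming_now - warming_at_adoption) / (target - warming_at_adoption)
    return min(max(frac, 0.0), 1.0)

# With the figures quoted above (0.54 degC in 1992, +0.37 degC since),
# the implied reduction today is 0.37 / 1.46, roughly a quarter.
print(indexed_reduction_fraction(0.54 + 0.37, 0.54))  # ~0.25
```

The appeal, as the paper argues, is that the schedule tightens only as attributable warming actually accrues, so the pace of mitigation is set by the observed climate response rather than by a contested forecast of it.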
Indexing to attributable anthropogenic warming allows a transparent link between the policy instrument and the policy goal. It renders the policy ‘anti-fragile’ or ‘adaptive’ in the sense that disputes over the climate response are no longer an impediment to policy adoption. In fact, such disputes make the policy easier to adopt, as stakeholders who are convinced that future anthropogenic warming will be slower than current models predict will be reassured that the policy will ‘bite’ correspondingly more slowly, while the converse is also true for those concerned about unexpectedly rapid warming in the future. Even if climate policies directly indexed to attributable anthropogenic warming are not adopted formally, this concept provides a simple and natural way of monitoring the overall consistency between the evolving climate change signal, individual countries’ emission ‘pledges’ and the overall goal of achieving net zero emissions of long-lived greenhouse gases by the time anthropogenic warming reaches 2°C. Annual updates of anthropogenic warming, based on a simple and transparent algorithm, should be as much a part of a full suite of climate services as an annual update of global temperature.
JC comment:  I like the idea of framing climate policy in terms of anti-fragility; I raised this issue in a previous post, Bouncing forward (not back). However, ‘anti-fragile’ should not be equated with ‘adaptive’.  While adaptive is one component of anti-fragile, anti-fragility has an element of getting stronger through stress (making black swans work for you).
JC reflections
There is much to like about this paper:

  • a realistic perspective on the challenges of rapid and deep mitigation of CO2
  • emphasis on uncertainty of climate response, dangers, and social factors
  • adaptive framework whereby CO2 reductions are linked to the evolution of the climate (attributable warming)
  • framing of the climate response challenge in context of anti-fragility

The devil is of course in the details:

  • determining a credible index of anthropogenic warming that fully accounts for multidecadal and longer internal variability and solar indirect effects on the attribution of warming
  • a more sensible analysis of ‘dangerous’ climate change, beyond the arbitrary 2°C target.  This requires a better assessment of the impacts of current warming, and a better understanding of the causes of paleo sea level rise.

Otto et al. have proposed a new decision-analytic framework, with many similarities to that proposed previously by McKitrick.  The current decision-analytic framework has focused scientific research on climate sensitivity, with little scientific progress over the past several decades in narrowing the uncertainty.
This new decision-analytic framework would focus research on detection and attribution, and a better assessment of ‘dangerous’ in context of societal vulnerabilities.  This way lies progress.

Filed under: Attribution, Policy