IPCC and treatment of uncertainties

by Judith Curry
A new review paper on the IPCC and treatment of uncertainties.

IPCC and treatment of uncertainties: topics and sources of dissensus
Carolina Adler and Gertrude Hirsch Hadorn
Abstract. Characterizing uncertainty in the assessment of evidence is common practice when communicating science to users, a prominent example being the Intergovernmental Panel on Climate Change (IPCC) Assessment Reports (ARs). The IPCC guidance note is designed to assist authors in the assessment process by assuring consistent treatment of uncertainties across working groups (WGs). However, debate on this approach has surfaced among scholars on whether applying the guidance note indeed yields the desired consistent treatment of uncertainties, thus facilitating effective communication of findings to users. The IPCC guidance note is therefore a paradigmatic case for reviewing concerns regarding treatment of uncertainties for policy. We reviewed published literature that outlines disagreement or dissensus on the guidance note in the IPCC assessment process, structured as three distinct topics. First, whether the procedure is reliable and leads to robust results. Second, whether the broad scope of diverse problems, epistemic approaches, and user perspectives allows for consistent and appropriate application. Third, whether the guidance note is adequate for the purpose of communicating clear and relevant information to users. Overall, we find greater emphasis placed on problems arising from the procedure and purpose of the assessment, rather than the scope of application. Since a procedure needs to be appropriate for its purpose and scope, a way forward entails not only making deliberative processes more transparent to control biases. It also entails developing differentiated instruments to account for the diversity and complexity of problems, approaches, and perspectives, treating sources of uncertainty as relevant information to users.
The paper is published as an Advanced Review in WIREs; it is publicly available online for one month [link].
This is a very interesting paper, and it refers to some relevant papers that I haven’t come across previously (see esp. Table 2; my papers Uncertainty Monster, Reasoning about climate uncertainty, and No consensus on consensus are referenced). The review is framed in the following context:
Two questions guided our appraisal of concerns raised: What are the topics of dissensus, and what reasons are given for dissensus?
A paper on the IPCC that focuses on dissensus rather than consensus – gotta love it.
From Debate on Topics and Related Sources of Dissensus:
The idea of a systematic treatment of uncertainty consists of assembling and aggregating relevant information and related uncertainties, then translating the aggregated result into a degree of certainty about this result.
First, the position that a degree of certainty is precise if it is attributable on a quantitative basis, as in the case of the likelihood scale, has been criticized. The basis for this criticism stems from a lack of traceability as to what evidence is included or left out. Furthermore, given the inevitable role of value judgment, the understanding of terms such as ‘very likely’ is ambiguous. In addition, the attribution of a certain degree of likelihood or confidence is not without doubt, thus giving rise to inconsistent characterization of uncertainties. Degree of certainty may differ depending on whether or not probabilistic methods were used, on the types of methods used to calculate consistency of ensemble modeling results, or on the choice of spatial aggregation scale. Biases or inconsistencies in expert judgments may relate to characteristics of the events described, or they may be due to group dynamics and heterogeneity of contributing authors.
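To make the ambiguity point concrete: the AR5 guidance note ties its calibrated likelihood terms to probability ranges, and because the ranges overlap, the same assessed probability can legitimately be reported under several different terms, while a single term such as ‘likely’ spans a wide range of probabilities. Here is a minimal sketch of that mapping (my own illustration, not from the Adler and Hadorn paper):

```python
# Rough illustration (not from the paper): the AR5 guidance note's calibrated
# likelihood terms correspond to probability ranges, so a single term such as
# "likely" is compatible with very different assessed probabilities.
LIKELIHOOD_SCALE = {
    "virtually certain":       (0.99, 1.00),
    "very likely":             (0.90, 1.00),
    "likely":                  (0.66, 1.00),
    "about as likely as not":  (0.33, 0.66),
    "unlikely":                (0.00, 0.33),
    "very unlikely":           (0.00, 0.10),
    "exceptionally unlikely":  (0.00, 0.01),
}

def terms_for(probability: float) -> list[str]:
    """Return every calibrated term whose range contains the given probability."""
    return [term for term, (lo, hi) in LIKELIHOOD_SCALE.items() if lo <= probability <= hi]

# Two authors assessing probabilities of 0.67 and 0.99 can both report "likely",
# which is one source of the ambiguity discussed above.
print(terms_for(0.67))  # ['likely']
print(terms_for(0.99))  # ['virtually certain', 'very likely', 'likely']
```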
Second, criticism has been raised on how both the quantitative likelihood scale and the qualitative confidence scale are constructed and expected to be used. The construction of the qualitative scale is a source of dissensus, given that it is not clear whether and how to conceptually distinguish between agreement, evidence, and consistency, or whether these concepts are appropriate for the purpose. Furthermore, both scales are criticized for being incomplete, since they do not systematically circumscribe areas of ignorance or controversy. Ignorance may be due, for instance, to a lack of resolution on relevant scales or to controversy on hypotheses among experts who build on different assumptions or theoretical perspectives. If no degree of certainty can be attributed, information that is nevertheless relevant may be excluded or given minor attention. However, ignoring information which does not meet the conditions for attributing a degree of certainty may further lead to overestimation or underestimation of events.
Third, arguments have been put forward to show that explicit distinctions between different sorts of uncertainties are important to avoid misinterpretations of uncertainty characterizations. This is exemplified with regard to structural model uncertainty, i.e., assumptions that enter modeling and scenarios. In this context, it is worth noting that robust results, i.e., cases where different models or methods lead to similar results, do not provide a sufficient basis for high probability, because robustness might be due to models which are not independent from each other or do not account for important variables or parameters.
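The robustness point is easy to see with a toy calculation. In the sketch below (my own, with made-up numbers, not from the paper), ten ‘models’ share a common structural bias: their results agree closely with one another and so look robust, yet the ensemble mean is well off the true value:

```python
import random

# Hypothetical sketch (not from the paper): an ensemble of models that share
# a common structural assumption can agree closely with each other ("robust"
# results) while all being biased away from the true value.
random.seed(0)

TRUE_VALUE = 3.0     # the quantity the models try to estimate
SHARED_BIAS = 0.8    # error common to all models (shared structural assumption)
N_MODELS = 10

estimates = [TRUE_VALUE + SHARED_BIAS + random.gauss(0, 0.1) for _ in range(N_MODELS)]

spread = max(estimates) - min(estimates)
mean_error = sum(estimates) / N_MODELS - TRUE_VALUE

print(f"ensemble spread:        {spread:.2f}")      # small -> looks 'robust'
print(f"error of ensemble mean: {mean_error:.2f}")  # large -> robustness misleads
```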
Purpose deals with a general contention toward the linear model of expertise enshrined in the IPCC process. In this linear model, the IPCC is said to privilege scientific knowledge as the authority on climate change, thereby constraining political deliberation on whether to react to this information and address problems on the ground. In this context, the contested issue rests on how adequate the linear model of communication is at delivering perceived relevant knowledge, versus how the receiver understands this information. Reaffirming this point, Hulme and Mahoney state that knowledge ‘claimed by its producers to have universal authority is received and interpreted very differently in different political and cultural settings. [Therefore], revealing the local and situated characteristics of climate change knowledge … becomes central for understanding both the acceptance and resistance that is shown towards the knowledge claims of the IPCC’. The content of information that is communicated, and how it is interpreted by users, manifest as sources of dissensus.
Interpretation of information refers to users’ perception and understanding of the assessment findings in IPCC ARs. The assumption here is that simplified communication of facts, highlighting consensus, raises confidence in users that they have suitable and unambiguous information for policy formulation. However, arguments presented in the papers reviewed disagree with this assumption on two fronts.
First, simplification of complex information on degrees of certainty masks and downplays the importance of nuanced and fine-grained reporting of diverse value judgements on evidence, a point also raised by users who support reporting on dissensus to better interpret the evidence presented. Vasileiadou et al. affirm that the logic of reasoning that underpins the assessment of evidence, including where and how disagreements manifest, is not sufficiently transparent in the IPCC assessment process. Improving this transparency would better serve deliberation, interpretation, and reflexivity on the evidence presented, among scientists and users alike, and would improve confidence in the quality of the IPCC ARs. According to van der Sluijs, assessment findings that lack details on deliberation make policies vulnerable to scientific errors; he insists that robust and flexible policy strategies should take uncertainty and plurality in science into account.
From the concluding Future Directions:
Any procedure for assessing uncertainty should be appropriate for its purpose. So, what can be learned from dissensus on whether communicating findings with a degree of certainty is what is relevant for and is clear to users? To the extent that WGs have to assess information on wicked problems with persisting deep uncertainty, there are good reasons for different judgments on which information is relevant and how certain findings are. As a way forward, we suggest going beyond Hulme’s position that ‘[action on] climate change can only be understood from a position of dissensus’. We propose to proceed from dissensus on a singular position toward consensus on a plurality of relevant, even controversial, positions or findings, as assessment results for users. This accounts on the one hand for the purpose of consensus, which is to reach inter-subjectivity, and on the other hand for its limits, since any consensus may be mistaken, especially when it comes to problems with deep uncertainty. A consequence for the science-policy interface would be to interact on a different assumption, namely within frameworks such as adaptive governance. Learning from dissensus on procedures for attributing a degree of certainty to findings serves several purposes: not only to make deliberative processes more transparent, and consequently to enable control of bias, but also to develop more differentiated instruments that treat sources of uncertainty as relevant information to users. Last but not least, a broader understanding of relevant information for policy, together with fostering a culture of learning about uncertainty in the policy process, better accommodates plurality in uncertainty assessment to account for diverse problems, approaches, and perspectives.
JC reflections
This paper hits on a lot of the same points that have concerned me regarding the IPCC, treatment of uncertainties, and consensus-seeking approach.
At the heart of the problem is the linear model of expertise, e.g. ‘speaking consensus to power.’  As discussed in my No consensus on consensus paper, this approach simply does not work when you are dealing with a wicked problem and conditions of deep uncertainty.
I am not sure where the IPCC will go from here, but a more mature approach to dealing with uncertainty, disagreement, dissensus and ignorance is badly needed.  This review paper by Adler and Hadorn points the arrow in the right direction.
 