Sea level rise: what’s the worst case?

by Judith Curry
Draft of article to be submitted for journal publication.

Well, I hope you are not overdosing on the issue of sea level rise.  But this paper is somewhat different, a philosophy of science paper.  Sort of how we think about thinking.
I would appreciate any comments, as well as suggestions as to which journals I might submit to.  I have two in mind, but am open to suggestions (and I may need backups).
Thanks in advance for your comments.
Sea level rise: What’s the worst case?
Abstract. The objective of this paper is to provide a broader framing for how we bound possible scenarios for 21st century sea level rise, in particular how we assess and reason about worst-case scenarios. This paper integrates climate science with broader perspectives from the fields of philosophy of science and risk management. Modal logic is used as a basis for describing the construction of the scenario range, including modal inductivism and modal falsificationism. The logic of partial positions and strategies for speculating on black swan events associated with sea level rise are described. The rapidly advancing front of background knowledge is described in terms of how we extend partial positions and approach falsifying extreme scenarios of 21st century atmospheric CO2 concentrations, warming and sea level rise. The application of partial positions and worst-case scenarios in decision making strategies is described for examples having different sensitivities to Type I versus Type II errors.
 

  1. Introduction

Sea level rise is an issue of significant concern, given the large number of people who live in coastal regions. The concern over sea level rise is not so much about the 20 cm or so that global mean sea level has risen since 1900. Rather, the concern is about projections of 21st century sea level rise based on climate model simulations of human-caused global warming.
Scientists and policy makers using projections of sea level rise are susceptible to making both Type I and Type II errors. An overestimation of a given impact is a Type I error (i.e., a false positive), while an underestimation of the impact is a Type II error (false negative). While we do not yet know the outcome of 21st century sea level rise, and hence Type I and II errors are correctly regarded as potential errors, we can assess errors in reasoning that lead to potential Type I or II errors.
The Intergovernmental Panel on Climate Change (IPCC) assessments have focused on assessing a ‘likely’ range (>66% probability) in response to different emissions concentration pathways. Brysse et al. (2013) argue that the IPCC consensus-building process has effectively resulted in a focus on avoiding Type I errors (false positives). A case in point is the assessment of sea level rise in the IPCC AR4 (2007). The AR4 deliberately excluded dynamic ice sheet melt from its projections of future sea level rise because future rates of dynamic ice sheet melt could not be projected with any confidence – a Type II error.
Curry (2011, 2018a) raises a different concern, that the climate change problem has been framed too narrowly, focusing only on human-caused climate change. In the context of this framing, the impacts of long-term natural internal variability, solar variations, volcanic eruptions, geologic processes and land use are relatively neglected as a source of 21st century climate change. This narrow framing potentially introduces a range of both Type I and II errors with regards to projections of 21st century climate change, and leaves us intellectually captive to unchallenged assumptions.
Oppenheimer et al. (2007) contend that the emphasis on consensus in IPCC reports has been on expected outcomes, which then become anchored via numerical estimates in the minds of policy makers. Thus, the tails of the distribution of climate impacts, where experts may disagree on likelihood or where understanding is limited, are often understated in the assessment process. Failure to account for both Type I and Type II errors leaves a discipline or assessment process in danger of misrepresentation and of unnecessary damage to society and human well-being.
In an effort to minimize Type II errors regarding projections of future sea level rise, there has been a recent focus on the possible worst-case scenario. The primary concern is the potential collapse of the West Antarctic Ice Sheet, which could cause 21st century global mean sea level rise to be substantially above the IPCC AR5 (2013) likely range of 0.26 to 0.82 m. Recent estimates of the maximum possible global sea level rise by the end of the 21st century range from 1.5 to 6 meters (as summarized by Le Cozannet et al., 2017; Horton et al., 2014). These extreme values of sea level rise are regarded as extremely unlikely, or so unlikely that we cannot even assign a probability. Nevertheless, these extreme, barely possible values of sea level rise are now becoming anchored as outcomes that are driving local adaptation plans.[1]
Reporting the full range of possible outcomes, even if unlikely, controversial or poorly understood, is essential for scientific assessments for policy making. The challenge is to articulate an appropriately broad range of future scenarios, including worst-case scenarios, while rejecting impossible scenarios.
This paper integrates climate science with broader perspectives from the fields of philosophy of science and risk management. The objective is to provide a broader framing of the 21st century sea level rise problem in context of how we assess and reason about worst-case scenarios. 

  2. Searching for black swans

Projections of future sea level rise are driven by climate-model generated projections of surface temperature in response to scenarios that increase atmospheric greenhouse gases. What type of climate change or sea level rise events, not covered by the current climate assessment reports, could possibly occur?
Potential surprises relative to background knowledge are often referred to as ‘black swans.’ There are two categories of black swan events (e.g. Aven and Renn, 2015):

  • Events or processes that are completely unknown to the scientific community (unknown unknowns).
  • Known events or processes that were ignored for some reason or judged to be of negligible importance by the scientific community (unknown knowns; also referred to as ‘known neglecteds’).

Efforts to avoid surprises begin with a fully imaginative consideration of possible future outcomes. Two general strategies have been employed for articulating black swan events related to climate change:

  • Statistical extrapolation of inductive knowledge beyond the range of limited experience using fat-tailed probability distributions.
  • Physically-based scientific speculation on the possibility of high impact scenarios, even though we can neither model them realistically nor provide an estimate of their probability.

2.1 Dismal theorem and fat tails
In a seminal paper, Weitzman (2009) articulated the dismal theorem, which implies that the evaluation of climate change policy is highly sensitive to catastrophic outcomes, even if they occur with vanishingly small, but fat-tailed,[2] probability. The dismal theorem contrasts sharply with the conventional wisdom of not taking seriously extreme temperature change probabilities, because such probability estimates are not based on hard science and are statistically insignificant.
Weitzman argued that the probability density function (PDF) tails of the equilibrium climate sensitivity, fattened by structural uncertainty using a Bayesian framework, can have a large effect on cost-benefit analysis. Weitzman’s analysis of the equilibrium climate sensitivity (ECS) was based on the IPCC AR4 (2007) assessment that ECS was ‘likely’ (> 66% probability) to be in the range 2 to 4.5 °C with a best estimate of 3 °C, ‘very unlikely’ (< 10% probability) to be less than 1.5 °C, and that values substantially higher than 4.5 °C could not be excluded. Proceeding in the Bayesian paradigm, Weitzman fitted a Pareto distribution to these values, resulting in a fat tail that produced a 0.05% probability of ECS exceeding 11 °C and a 0.01% probability of exceeding 20 °C.
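The quantitative result depends on the specific fit, but the qualitative effect of tail fattening is easy to demonstrate. The sketch below is illustrative only: the lognormal and Pareto parameters are hypothetical, calibrated roughly so that both distributions have a median near 3 °C and about a 17% chance of exceeding 4.5 °C, mimicking an IPCC-style ‘likely’ range.

```python
import math

def lognormal_sf(x, mu, sigma):
    """Survival function P(X > x) for a lognormal (thin-tailed) distribution."""
    z = (math.log(x) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))

def pareto_sf(x, xm, alpha):
    """Survival function P(X > x) for a Pareto (fat-tailed) distribution."""
    return (xm / x) ** alpha if x > xm else 1.0

# Hypothetical calibrations: both give median ~3 °C and P(ECS > 4.5 °C) ~ 0.17
mu, sigma = math.log(3.0), 0.425   # lognormal parameters
xm, alpha = 2.31, 2.66             # Pareto scale and shape

for ecs in (6.0, 11.0, 20.0):
    thin = lognormal_sf(ecs, mu, sigma)
    fat = pareto_sf(ecs, xm, alpha)
    print(f"P(ECS > {ecs:4.1f} C): lognormal {thin:.2e}, Pareto {fat:.2e}")
```

The two distributions roughly agree over the ‘likely’ range by construction, yet diverge by one to several orders of magnitude in the tail, which is why cost-benefit results can hinge on the tail assumption.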
Subsequently, the IPCC AR5 (2013) modified its assessment of ECS, dropping the lower bound of the ‘likely’ range to 1.5 °C (and to 1.0 °C for the ‘very likely’ range, >90%), and more clearly defining the upper range with a 10% probability of exceeding 6 °C. Most significantly, the IPCC AR5 stated that no best estimate for equilibrium climate sensitivity can now be given, because of a lack of agreement on values across assessed lines of evidence and studies. While intuitively it might seem that a lower bound would be good news, Freeman et al. (2015) considered a family of distributions using the AR5 parameters and found that both the lowering of the lower bound and the removal of the best estimate actually fatten the ECS tail.
Annan and Hargreaves (2006) and Lewis and Curry (2015) have criticized high values of ECS derived from estimated PDFs, owing to unjustified assumptions and inappropriate statistical methods. The uncertainty surrounding ECS is not intrinsic to ECS itself, but rather arises from uncertainties in the parameters used to calculate ECS, e.g. external forcing data and magnitude of ocean heat uptake.
Curry (2018a) argues that given the deep uncertainty surrounding the value of climate sensitivity, we simply do not have grounds for formulating a precise probability distribution. With human-caused climate change, we are trying to extrapolate inductive knowledge far outside the range of limited past experience. While artificially imposed bounds on the extent of possibly ruinous disasters can be misleading (Type II error), so can statistical extrapolation under conditions of deep uncertainty (Type I error).
2.2 Physically-based scenario generation
Rather than sampling from a probability distribution, physically-based scenario generation develops different possible future pathways from coherent storylines that are based on particular assumptions.
Formally, each possible future can be regarded as a modal sentence (Betz, 2009), stating what is possibly true of our climate system. Betz articulates two general methodological principles that may guide the construction of the scenario range: modal inductivism and modal falsificationism. Modal inductivism states that a certain statement about the future is possibly true if and only if it is positively inferred from our relevant background knowledge. Modal falsificationism further permits creatively constructed scenarios to be accepted as long as the scenarios cannot be falsified by being incompatible with background knowledge. Modal inductivism is prone to Type II errors, whereas modal falsification is prone to Type I errors.
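The contrast between the two principles can be caricatured in a few lines of code. In this toy sketch (the scenarios and the two boolean judgments are hypothetical), each candidate scenario is marked by whether it can be positively inferred from background knowledge and whether it is inconsistent with that knowledge:

```python
# Toy scenarios: (name, positively_inferred, inconsistent_with_background)
scenarios = [
    ("model-projected rise",     True,  False),
    ("rapid ice-sheet collapse", False, False),  # creative storyline, not model-derived
    ("10 m rise by 2100",        False, True),   # conflicts with physical constraints
]

# Modal inductivism: accept S only if it is positively inferred from background knowledge
inductivist = [name for name, inferred, _ in scenarios if inferred]

# Modal falsificationism: accept S unless it is incompatible with background knowledge
falsificationist = [name for name, _, inconsistent in scenarios if not inconsistent]

print(inductivist)        # the narrower scenario set
print(falsificationist)   # the broader scenario set
```

The inductivist set omits the creative storyline (a potential Type II error), while the falsificationist set admits everything not yet ruled out (a potential Type I error).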
Betz (2009) argues that modal inductivism explains the controversy surrounding the conclusions in the IPCC AR4 regarding sea level rise (e.g. Oppenheimer et al. 2007). The AR4 summary statement anticipated a likely rise in sea level of 18-59 cm by the year 2100. This result was derived from climate model-based estimates and did not include the potential for increasing contributions from rapid dynamical processes in the Greenland and West Antarctic ice sheets. Although the AR4 recognized the possibility of a larger ice sheet contribution, this possibility is not reflected in its main quantitative results. Betz argues that the possible consequences of rapid ice-dynamical changes were not included because there was no model that could infer positively the ice-dynamical changes.
2.2.1 Modal inductivism: scenario generation by climate models
The IPCC Assessment Reports provide projections of future climate using global climate models that are driven by scenarios of future greenhouse gas emissions. Limitations of the IPCC projections of future climate change are described by the IPCC AR5 (2013; Sections 11.3.1, 12.2.3). Internal variability places fundamental limits on the precision with which future climate variables can be projected. There is also substantial uncertainty in the climate sensitivity to specified forcing agents. Further, simplifications and parameterizations induce errors in models, which can have a leading-order impact on projections. Also, models may exclude some processes that could turn out to be important for projections.
Apart from these uncertainties in the climate models, there are three overarching limitations of the climate model projections employed in the IPCC AR5 (Curry, 2018a):

  • The scenarios of future climate are incomplete, focusing only on emissions scenarios (and neglecting future scenarios of solar variability, volcanic eruptions and multi-decadal and longer term internal variability).
  • The ensemble of climate models does not sample the full range of possible values of ECS, covering only the range 2.1 to 4.7 °C and neglecting values between 1 and 2.1 °C, even though values between 1.5 and 2.1 °C fall within the IPCC AR5 likely range.
  • The opportunistic ensemble of climate model simulations used in the IPCC assessment reports does not provide the basis for the determination of statistically meaningful probabilities.

In summary, existing climate models provide a coherent basis for generating scenarios of climate change. However, existing climate model simulations do not produce decision-relevant probabilities and do not allow exploration of all possibilities that are compatible with our knowledge of the basic way the climate system actually behaves. Some of these unexplored possibilities may turn out to be real ones.
2.2.2 Modal falsification: alternative scenario generation
Smith and Stern (2011) argue that there is value in scientific speculation on policy-relevant aspects of plausible, high-impact scenarios, even though we can neither model them realistically nor provide a precise estimate of their probability.
When background knowledge supports doing so, modifying model results to broaden the range of possibilities they represent can generate additional scenarios, including known neglecteds. Simple climate models, process models and data-driven models can also be used as the basis for generating scenarios of future climate. The paleoclimate record provides a rich source of information for developing future scenarios. Network-based dynamical climatology can also be used as the basis for generating scenarios. More creative approaches, such as mental simulation and abductive reasoning, can produce ‘what if’ scenarios (NAS 2018).
In formulating scenarios of future climate change, Curry (2011) raises the issue of framing error, whereby future climate change is considered to be driven solely by scenarios of future greenhouse gas emissions. Known neglecteds include: solar variability and solar indirect effects, volcanic eruptions, natural internal variability of the large-scale ocean circulations, geothermal heat sources and other geologic processes. Expert speculation on the influence of known neglecteds would minimize the potential for missing black swan events associated with known events or processes that were ignored for some reason.
The objective of alternative scenario generation is to allow for and stimulate different views and perspectives, in order to break free from prevailing beliefs. Construction of scenarios that provide plausible but unlikely outcomes can lead to the revelation of unknown unknowns or unknown knowns.

  3. Scenario justification

As a practical matter for considering policy-relevant scenarios of climate change and its impacts, how are we to evaluate whether a scenario is possible or impossible?  In particular, how do we assess the possibility of potential black swan scenarios?
Confirmation (verification) versus falsification is at the heart of a prominent 20th century philosophical debate. Lukyanenko (2015) argues that verification and falsification each contain contradictions and ultimately fail to capture the full complexity of the scientific process.
If the objective is to capture the full range of policy-relevant scenarios and to broaden the perspective on the concept of scientific justification, then both verification and falsification strategies are relevant and complementary. The difference between modal inductivism and modal falsificationism can also be thought of in terms of the allocation of the burden of proof. Consider a contentious scenario, S. According to modal inductivism, the burden of proof falls on the party that says S is possible. By contrast, according to modal falsificationism, the party denying that S is possible carries the burden of proof. Hence verification and falsification play complementary roles in scenario justification.
The problem of generating a plethora of potentially useless future scenarios is avoided by subjecting the scenarios to an assessment as to whether the scenario is deemed possible or impossible, based on our background knowledge. Further, some possible scenarios may be assigned a higher epistemic status if they are well grounded in observations and/or theory.
Under conditions of deep uncertainty, focusing on the upper bound of what is physically possible can reveal useful information. For example, few if any climate scientists would argue that an ECS value of 20 °C is possible. But what about an ECS value of 10 or 6 °C? We should be able to eliminate some extreme values of ECS as impossible, based upon our background understanding of how the climate system processes heat and carbon in response to gradual external forcing from increasing atmospheric carbon dioxide.
3.1 Scenario verification
Betz (2010, 2012) provides a useful framework for evaluating scenarios relative to their degrees of justification and evaluating the outcomes against our background knowledge. A high degree of justification implies high robustness and relative immunity to falsification.
Below is a classification of future climate scenarios based upon ideas developed by Betz:

  • Strongly verified possibility – supported by basic theoretical considerations and empirical evidence
  • Corroborated possibility – it has happened before
  • Verified possibility – consistent with relevant background knowledge
  • Unverified possibility – climate model simulation
  • Borderline impossible – consistency with background knowledge is disputed (‘worst case’ territory)
  • Impossible – inconsistent with relevant background knowledge

Climate model simulations are classified here as unverified possibilities. Oreskes (1994) has argued that verification and validation of numerical models of natural systems is impossible. However, there is debate in the philosophy of science literature on this topic (e.g. Katzav, 2014). The argument is that some climate models may be regarded as producing verified possibilities for some variables (e.g. temperature).
The epistemic status of verified possibilities is greater than that of unverified possibilities; however, the most policy-relevant scenarios may be the unverified possibilities and the borderline impossible ones (potential black swans). Clarifying what is impossible versus what is possible is important to decision makers, and the classification provides important information about uncertainty.
As an example, consider the following classification of values of equilibrium climate sensitivity (overlapping values arise from different scenario generation methods and different judgment rationales):

  • <0 °C: impossible
  • >0 to <1 °C: implies negative feedback (unverified possibility)
  • 1.0–1.2 °C: no-feedback climate sensitivity (strongly verified, based on theoretical analysis and empirical observations)
  • 1.15–2.7 °C: empirically-derived values based on energy balance models with verified statistical and uncertainty analysis methods (corroborated possibilities)
  • 2.1–4.7 °C: derived from climate model simulations (unverified possibilities)
  • >4.5 to 10 °C: borderline impossible (for equilibration time scales of a few centuries)
  • >10 °C: impossible (for equilibration time scales of a few centuries)

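As a sketch of how such a classification might be operationalized, the decision table below encodes the ranges listed above in a hypothetical helper function; overlapping ranges are resolved arbitrarily in favor of the higher-status category:

```python
def classify_ecs(ecs_c):
    """Epistemic status of an ECS value (degrees C), per the classification above.

    Overlapping ranges are resolved in favor of the higher-status category.
    """
    if ecs_c <= 0:
        return "impossible"
    if ecs_c > 10:
        return "impossible (for equilibration time scales of a few centuries)"
    if 1.0 <= ecs_c <= 1.2:
        return "strongly verified possibility"   # no-feedback sensitivity
    if 1.15 <= ecs_c <= 2.7:
        return "corroborated possibility"        # empirically-derived values
    if ecs_c < 1.0 or ecs_c <= 4.7:
        return "unverified possibility"          # negative feedback, or model range
    return "borderline impossible"               # 'worst case' territory

print(classify_ecs(3.0))   # within the climate-model range
print(classify_ecs(8.0))   # beyond the model range, below 10
```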
There is a strongly verified anchor on the lower bound: the no-feedback climate sensitivity, which is nominally ~1 °C. Determination of ECS from observational data is represented by the Lewis and Curry (2018) analysis, the values from which are regarded as corroborated possibilities. The climate model range reported by the IPCC AR5, 2.1 to 4.7 °C, is classified as unverified possibilities. The borderline impossible range is open to dispute. Annan and Hargreaves (2006) argue for an upper bound of 4.5 °C, while the IPCC AR5 placed a 10% probability on exceeding 6 °C. None of the ECS values cited in the AR5 extend much beyond 6 °C (one tail extends to 9 °C), although the AR4 cited several long-tailed distributions extending beyond 10 °C.
It is rational to believe with high confidence a partial position (Betz, 2012) that equilibrium climate sensitivity is at least 1 °C and lies between 1 and 2.7 °C, encompassing the strongly verified and corroborated possibilities. This partial position, with a high degree of justification, is relatively immune to falsification. It is also rational to provisionally extend one’s position to values of equilibrium climate sensitivity up to 4.7 °C (the range simulated by climate models), although these values are vulnerable to the growth or modification of background knowledge and to improvements in climate models, whereby portions of this extended position may prove to be false. This is why a rational proponent should be interested in adopting partial positions with a high degree of justification: such positions are highly immune to falsification and can be flexibly extended in many different ways when constructing a complete position.
If values beyond 10 °C are impossible, then the fat-tail values generated by Weitzman (2009) are impossible. Is it physically justified to eliminate extreme outcomes of ECS? Because of the nature of the definition of equilibrium climate sensitivity, with a very long equilibration timescale and the possibility of very long timescale feedbacks, attempting to identify an impossible threshold may be an ill-posed problem. Because of the central role that ECS plays in Integrated Assessment Models used to determine the social cost of carbon, this issue is not without consequence.
Even if it is impossible to falsify high values of ECS owing to ambiguities in the definition of ‘equilibrium,’ there are physical constraints on how rapidly temperature or sea level can change by 2100 in response to a doubling of CO2.
3.2 Scenario falsification and expert judgment
While scientific theories can never be strictly verified, they must be falsifiable. This leads to a corollary that predicted outcomes based on a theory of change should in principle be falsifiable.
How do we approach falsifying extreme scenarios? Extreme scenarios can be evaluated based on the following criteria:

  1. Evaluation of the possibility of each link in the storyline used to create the scenario.
  2. Evaluation of the possibility of the outcome, in light of physical constraints and the plausibility of the inferred rate of change.

The first criterion is mechanistic: individual processes and the links among them are evaluated. The second criterion is an integral constraint on the scenario outcome, related to the possibility of the outcome itself and the rate of change required to achieve that outcome over a specified period.
Assessing the strength of background knowledge is an essential element in assessing extreme scenarios. Extreme scenarios are by definition at the knowledge frontier. Hence the background knowledge against which extreme scenarios are evaluated is continually changing, which argues for frequent re-evaluation of extreme scenarios.
Scenario falsification requires expert judgment, assessed against background knowledge. This raises several questions:

  • Which experts and how many?
  • By what methods is the expert judgment formulated?
  • What biases enter into the expert judgment?

Expert judgment encompasses a wide variety of techniques, ranging from a single undocumented opinion, to preference surveys, to formal elicitation with external validation (e.g. Oppenheimer et al., 2016).
Expert judgment plays a prominent role in the IPCC process. The multiple lines of evidence on equilibrium climate sensitivity that feed into the IPCC’s expert judgment are quite clear. Sea level rise, however, presents a much more complex situation for expert judgment, owing to its dependence on projections of ice sheet behavior, for which there are relatively few lines of evidence and a great deal of uncertainty. Hence sea level rise projections have been heavily dependent on expert judgment.
Issues surrounding the process of expert judgment are revealed in context of an expert elicitation on sea level rise conducted by Horton et al. (2014), which presented results of a broad survey of 90 experts. Gregory et al. (2014) criticized several aspects of the elicitation. The first criticism addresses the issue of ‘which experts?’ The respondents were a subset (18%) of the 500 experts whom Horton et al. identified; the other 82% could not be contacted, declined to respond, or supplied incomplete or inconsistent responses.
While overall the elicitation provided results similar to those cited by the IPCC AR5, Figure 2 of Horton et al. shows that several of the respondents placed the 83rd percentile for global mean sea level rise by 2100 under RCP8.5 higher than 2.5 m, i.e. more than 1.5 m above the AR5 likely range, with the highest estimate exceeding 6 m. Gregory et al. argue that such high values are physically untenable. They state that there is a large difference in rigor between the IPCC assessment and an expert elicitation. An expert elicitation is opaque; the respondents are not asked to justify their responses, and we cannot know how they arrived at their conclusions. The IPCC assessment process is designed to avoid Type I errors, whereas the expert elicitation elicited several expert opinions that arguably make a Type II error.
Curry (2011a) argues that because of the complexity of the issues in climate science, individual experts use different mental models for evaluating the interconnected evidence. Biases can abound when reasoning and making judgments about such a complex problem. Bias can occur by excessive reliance on a particular piece of evidence, the presence of cognitive biases in heuristics, failure to account for indeterminacy and ignorance, and logical fallacies and errors including circular reasoning.
Research in cognitive psychology shows that powerful and sometimes subtle biases play a significant role in scientific justification. Tversky and Kahneman (1974) identified numerous cognitive biases that pervade both everyday and scientific thinking, and often compete with inductive and deductive forms of logical reasoning.

  4. Sea level rise scenario verification and falsification

A comprehensive summary of recent sea level rise projections is provided by Horton et al. (2018). In assessing these projections for application to decision making, a broader framing of possible climate change scenarios is provided here, one that includes natural climate variability and geologic processes.
4.1 Scenario generation
Physically-based scenarios of future sea level change are derived from the following methods: extrapolation of recent trends, semi-empirical approaches based on past relationships of sea level rise with temperature, and process-based methods using models.
Sea level rise projections are directly tied to projections of surface temperature, which are based upon simulations from global climate models that are forced by different emissions scenarios.
Most assessments have focused on bounding the likely range (>66%). Since the IPCC AR5 was published in 2013, new scenario and probabilistic approaches have been used for 21st century sea level rise projections. However, these new projections are based on the same climate model simulations used in the IPCC AR5.
Of particular note: the NOAA Technical Report entitled Global and Regional Sea Level Rise Scenarios for the United States (NOAA, 2017) provides a range of global mean sea level rise scenarios for the year 2100. The worst-case upper-bound scenario for global sea level rise (the H++ scenario) is 2.5 meters by the year 2100. The lower bound scenario is 0.3 meters by the year 2100.
Here we critically evaluate both of these bounds: the worst-case scenario and the lower bound best-case scenario.
4.2 Worst-case scenario
The worst-case scenario is judged to be the most extreme scenario that cannot be falsified as impossible based upon our background knowledge (Betz, 2010). Strategies for generating worst-case sea level rise scenarios include: process modeling that employs the worst-case estimate for each component, estimates based on the deglaciation of the last ice age and the previous interglacials, and expert judgment.
Most of the recent estimates of the worst-case scenario for global sea level rise in the 21st century range from 1.5 to 3.0 meters, with the recent NOAA Report (NOAA, 2017) using a value of 2.5 meters. In the expert elicitation study of Horton et al. (2014), 5 of the 90 respondents cited a value exceeding 3 m, with the highest value exceeding 6 m. These values of sea level rise imply rates of sea level rise as high as 50-100 mm/year by the end of the 21st century. For reference, the current global rate of sea level rise is about 3 mm/year. Are these scenarios of sea level rise by 2100 plausible? Or even possible?
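A back-of-envelope check of the implied rates, assuming (purely for illustration) a constant acceleration from the current ~3 mm/yr over the remaining ~80 years of the century:

```python
def end_rate_mm_per_yr(total_rise_m, years=80, r0=3.0):
    """Final rate (mm/yr) under constant acceleration, starting from r0 mm/yr.

    With constant acceleration a: total = r0*T + a*T**2/2 and end rate = r0 + a*T,
    so the end rate is 2*total/T - r0.
    """
    total_mm = total_rise_m * 1000.0
    return 2.0 * total_mm / years - r0

# Worst-case totals by 2100 (m), assuming the rise occurs over ~80 years
for rise in (1.5, 2.5, 6.0):
    print(f"{rise} m by 2100 -> ~{end_rate_mm_per_yr(rise):.0f} mm/yr at end of century")
```

Under this crude assumption, 2.5 m by 2100 implies an end-of-century rate near 60 mm/yr, and 6 m implies well over 100 mm/yr, broadly consistent with the 50-100 mm/yr figures quoted above.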
4.2.1 Worst-case storylines
Worst-case scenarios have been built around storylines of irreversible reduction in ice mass of the Greenland and/or West Antarctic ice sheets. Such scenarios for 21st century sea level rise have been developed in different ways: by convening an expert committee to develop extreme scenarios (e.g. Katsman et al., 2011), by conducting a large expert assessment survey (Horton et al., 2014), or by combining process models or expert assessment of ice sheet contribution with climate model projections (e.g. Bamber and Aspinall, 2013).
The primary concern over future sea level rise in the 21st century is related to the potential collapse of the West Antarctic Ice Sheet (WAIS). For the WAIS, marine ice shelves and tongues that buttress inland, grounded ice are believed to be critical for the ice-sheet stability. Marine Ice Sheet Instability (runaway retreat of the ice sheet) could be initiated if the buttressing effect of this ice is lost from erosion by a warming ocean or altered circulation in coastal seas.
The most vulnerable region of the WAIS is the Amundsen Sea sector. Scenarios for increased ice discharge from this region have been articulated by Pfeffer et al. (2008), based on kinematic constraints on the discharge of glaciers. DeConto and Pollard (2016) introduced new instability mechanisms related to marine ice-cliff instabilities and ice-shelf hydrofracturing (rain and meltwater-enhanced crevassing and calving). Their high-end estimate exceeded 1.7 m of sea-level rise from Antarctica alone in 2100 under the RCP8.5 scenario.
The most extreme 21st century sea level rise scenarios from process-based models are reported by Schlegel et al. (2018), who assessed how uncertainties in snow accumulation, ocean-induced melting, ice viscosity, basal friction, bedrock elevation, and the presence of ice shelves affect the future sea level contribution from the Antarctic ice sheet. They found that over 1.2 m of Antarctic ice sheet contribution to global mean sea level is achievable over the next century, but not likely, as this increase is tenable only in response to unrealistically large melt rates and continental ice shelf collapse. As an extreme worst case, a plausible combination of model parameters produced simulations of 4.95 m of sea level rise from the Antarctic ice sheet by 2100.
Prior to these sophisticated ice sheet model simulations, the highest scenarios elicited by Horton et al. (2014) – exceeding 6 m – were justified not by process-based models but by top-down semi-empirical methods that relate sea levels to global mean surface temperatures during current and previous interglacials. Hansen et al. (2016) considered sea level and rates of sea level rise during the late Eemian (the previous interglacial, about 124,000 years ago) as justification for predictions of several meters of sea level rise in the 21st century.
Another possible storyline relates to newly discovered geothermal heat fluxes in the vicinity of the Greenland and Antarctic ice sheets (e.g. DeVries et al. 2017), although these processes have not yet explicitly figured into worst-case sea level rise scenarios.
4.2.2 Worst-case constraints
While associated with physically plausible mechanisms, the actual quantification of the worst-case scenarios for 21st century sea level rise remains highly speculative. As a check on scenarios developed from process models and/or more speculative methods, integral constraints on basic physical processes provide a rationale for potentially falsifying extreme scenarios.
Deglaciation following the last ice age provides an opportunity to examine the stability of marine ice sheets and possible rates of sea level rise. The most rapid deglaciation, Meltwater Pulse 1A (MWP-1A), occurred around 14.5 ka BP. Recent research by DesChamps et al. (2012) constrained the rapid melting to a period of ~340 years. The most probable sea level rise during this period was between 14 and 18 m, implying that the rate of sea-level rise exceeded 40 mm/yr during this pulse. Two conflicting scenarios have been proposed for the source of MWP-1A: a northern scenario, with partial melting of the large North American and Eurasian ice sheets, and a southern scenario that points to an Antarctic source. If the northern scenario is correct, then MWP-1A is not a very useful constraint on possible 21st century sea level rise.
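The implied rate can be checked with simple arithmetic; the sketch below simply restates the figures quoted above, with no additional assumptions:

```python
# Back-of-envelope check of the MWP-1A melt rate quoted above:
# 14-18 m of sea level rise over a period of ~340 years.
duration_yr = 340
rate_low = 14 * 1000 / duration_yr   # mm/yr, from the 14 m lower bound
rate_high = 18 * 1000 / duration_yr  # mm/yr, from the 18 m upper bound

# Both bounds exceed 40 mm/yr, consistent with the rate stated in the text.
print(f"implied MWP-1A rate: {rate_low:.0f} to {rate_high:.0f} mm/yr")
```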
Additional insights of relevance to the current configuration of the ice sheets are provided by the last interglacial (~130 to ~115 ky ago; the Eemian). Kopp et al. (2009) estimated that the late Eemian sea level highstand exceeded present values by at least 6.6 m (95% probability), and was unlikely (33% probability) to have exceeded 9.4 m. Kopp et al. concluded that present ice sheets could sustain a rate of global sea level rise of about 56–92 cm per century for several centuries, with these rates potentially spiking to higher values for shorter periods. Kopp et al. inferred that achieving global sea level in excess of 6.6 m higher than present likely required major melting of both the Greenland and the West Antarctic Ice Sheets.
Rohling et al. (2013) provide a geologic/paleoclimatic perspective on the worst-case scenario for 21st century sea level rise by examining the past 5 interglacial periods. They investigated the natural timescales and rates of change in ice-volume adjustment to a disequilibrium state, relative to a forcing increase. Projected rates of sea level rise above 1.8 m by 2100 are larger than the rates at the onset of the last deglaciation, even though today’s global ice volume is only about a third of that at the onset of the last deglaciation. Starting from present-day conditions, such high rates of sea level rise would require unprecedented ice-loss mechanisms without interglacial precedents, such as catastrophic collapse of the West Antarctic Ice Sheet or activation of major East Antarctic Ice Sheet retreat.
An alternative strategy for falsifying ice loss scenarios relates to identifying physical constraints on specific ice loss mechanisms. Pfeffer et al. (2008) falsified extreme scenarios based on kinematic constraints on glacier contributions to 21st century sea level rise. They found that a total sea-level rise of about 2 meters by 2100 could occur under physically possible glaciological conditions but only if all variables are quickly accelerated to extremely high limits. They concluded that increases in excess of 2 meters are physically untenable.
The most extreme process-based sea level rise scenarios (e.g. DeConto and Pollard 2016; Schlegel et al., 2018) are derived from linking atmospheric warming with hydrofracturing of buttressing ice shelves and structural collapse of marine-terminating ice cliffs in Antarctica. Prediction of 21st century contributions depends critically on uncertain calibration to sea level rise in the Pliocene (about 3 million years ago) and debated assumptions about the Antarctic contribution to sea level rise during the Eemian.
Worst-case scenarios for 2100 and collapse of the West Antarctic Ice Sheet are driven by the RCP8.5 greenhouse gas concentration scenario. An additional constraint on the worst-case sea level rise scenario is an assessment of whether RCP8.5 is a possible scenario. RCP8.5 is an extreme scenario that may be impossible, given unrealistic assumptions and constraints on recoverable fossil fuel supply (e.g. Wang et al., 2016). Ritchie and Dowlatabadi (2017) explain that RCP8.5 embeds a ‘return to coal’ hypothesis, requiring increased per capita coal use that is based on systematic errors in coal production outlooks. Here, RCP8.5 is classified as borderline impossible.
Scenarios of 21st century sea level rise exceeding about 1.8 m require conditions without natural interglacial precedents. These worst-case scenarios require a cascade of events, each of which is extremely unlikely to borderline impossible, based on our current knowledge base. The joint likelihood of these extremely unlikely events arguably crosses the threshold to impossible.
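The cascade argument can be illustrated numerically. In the sketch below the individual event probabilities are purely hypothetical placeholders (the text assigns no numbers), and independence is assumed for simplicity; the point is only that the product of several small probabilities shrinks rapidly toward the impossible end of the scale:

```python
# Illustrative only: these probabilities are assumed values, not estimates
# from the literature, and the events are treated as independent.
cascade = {
    "RCP8.5 concentration pathway": 0.05,
    "ice-cliff collapse at modelled rates": 0.05,
    "continental-scale ice shelf loss": 0.02,
}

# Joint probability is the product of the individual probabilities.
joint = 1.0
for p in cascade.values():
    joint *= p

print(f"joint probability of the cascade: {joint:.0e}")
```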
How to rationally make judgments about the possibility of extreme scenarios remains a topic that has received too little attention.
4.3 Best case scenario
There has been much less focus on the possible best-case scenario, which is defined here as the lowest sea level rise for the 21st century that cannot be falsified as impossible based upon our background knowledge. Consideration of the best case is needed to provide bounds on future sea level rise. Further, verification/falsification analysis of the best case can provide important insights into uncertainty and the possible impacts of known neglecteds.
Parris (2012) recommends a lower bound of 0.2 m for 21st century global mean sea level rise, which is basically the observed rate of sea level rise during the 20th century. NOAA (2017) recommends that this value be revised upward to 0.3 m, because the global mean sea level rise rate as measured by satellite altimeters has averaged 3 mm/year for almost a quarter-century.
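Both recommended lower bounds follow from simple linear extrapolation of an observed rate over the century; a minimal sketch, using the rates implied by the text (~2 mm/yr for the 20th century, 3 mm/yr for the altimeter era):

```python
def century_rise_m(rate_mm_per_yr: float) -> float:
    """Sea level rise over 100 years, in metres, at a constant rate."""
    return rate_mm_per_yr * 100 / 1000

# ~2 mm/yr (20th-century average) -> 0.2 m, the Parris (2012) lower bound
print(century_rise_m(2.0))
# 3 mm/yr (satellite altimeter era) -> 0.3 m, the NOAA (2017) revision
print(century_rise_m(3.0))
```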
It is difficult to defend an argument that it is impossible for the 21st century sea level rise to occur at the same average rate as observed in the 20th century, especially since many if not most individual tide gauge records show no recent acceleration in sea level rise (e.g. Watson 2016).
Is it possible for global sea level to decrease over the 21st century? Kemp et al. (2018) provided an estimate of mean global sea level for the past 3000 years. There are several periods with substantial rates of sea level decline, notably 1000 to 1150 AD and 700 to 400 BC. Century-scale sea level decreases of the magnitude determined by Kemp et al. (about half the magnitude of the 20th century rate)[3] are not sufficient to completely counter the likely sea level rise projected by the IPCC AR5. Given the thermal inertia present in the oceans and ice sheets, it is arguably impossible for global mean sea level to decrease on the time scale of the 21st century.
However, it is possible for 21st century sea level rise to be less than in the 20th century. Possible scenarios of solar variations, volcanic eruptions and internal variability associated with large-scale ocean circulations could combine to reduce the 21st century rate of sea level rise relative to the 20th century. The relative importance to sea level change of human-caused warming versus natural climate variability depends on whether equilibrium climate sensitivity is on the low end (<2 °C) or the high end (>4 °C) of current estimates.
The recent acceleration in global mean sea level rise since 1993 is attributed to increased melting of the Greenland ice sheet (e.g. Chen et al. 2017). This acceleration in Greenland melt has been largely attributed to natural variability associated with large-scale ocean and atmospheric circulation patterns – the Atlantic Multidecadal Oscillation (AMO) and the North Atlantic Oscillation (NAO) (e.g. Hahn et al. 2018). A future transition to the cool phase of the AMO and/or positive phase of the NAO would slow down (or possibly even reverse) the mass loss from Greenland, and hence slow down the rate of global sea level rise. Such a scenario for the Greenland mass balance is regarded as a corroborated possibility, since such a scenario occurred recently, during the 1970s and 1980s (e.g. Fettweis et al., 2013). Such a scenario in the 21st century would significantly reduce sea level rise, potentially for several decades, with its relative importance for Greenland depending on whether equilibrium climate sensitivity to CO2 is on the low end or the high end of current estimates.
An additional best-case scenario relates to the recent finding by Barletta et al. (2018) that the ground under the rapidly melting Amundsen Sea Embayment of West Antarctica is rising at a rate of more than 4 cm per year.  This rise is acting to stabilize the West Antarctic Ice Sheet. Ice loss spurs uplift in the sea floor (isostatic rebound), which is occurring rapidly owing to low viscosity under the Amundsen Sea Embayment. Such processes have a strong direct impact on West Antarctic Ice Sheet evolution at the centennial time scale. Gomez et al. (2015) articulate a negative feedback process whereby the combination of bedrock uplift and sea surface drop associated with ice sheet retreat significantly reduces ice sheet mass loss.
4.4 Possibility distribution
Given the deep uncertainty associated with projections of 21st century sea level rise (e.g. Horton et al., 2018), a way to stratify the current knowledge base about likelihood and possibility of a range of 21st century sea level outcomes is a possibility distribution (e.g. Mauris 2011). As an example, LeCozannet et al. (2017) have constructed a possibility distribution and diagram for projections of 21st century sea level rise scenario outcomes.
Here, a possibility diagram is constructed (Figure 1) under different assumptions than used by LeCozannet et al. The variable U denotes the outcome for 21st century sea level change. U is cumulative, so that an 80 cm outcome must necessarily first pass through lower values of sea level rise. Values less than U therefore represent partial positions for U. The function π(U) represents the state of knowledge of U, distinguishing what is necessary and possible from what is impossible.
π(U) = 1: nothing prevents U from occurring; U is a completely possible value and may be regarded as necessary
π(U) = 0: U is rejected as impossible based on current background knowledge
Intermediate values of π(U) reflect outcomes whereby there would be no particular surprise if U does occur, or no particular surprise if U does not occur. Following the classification introduced in section 3.1, values of U are assigned the following values of π(U) (Figure 1):

  • π(U) ≥0.9: sea level rise up to 0.3 m; corroborated possibilities
  • 0.9 > π(U) > 0.5: sea level rise exceeding 0.3 m and up to 0.63 m; verified possibilities contingent on ΔT, based on the IPCC AR5 likely range (but excluding RCP8.5).
  • 0.5 ≥ π(U) > 0.1: sea level rise exceeding 0.63 m and up to 1.6 m; unverified possibilities
  • 0.1 ≥ π(U) > 0: sea level rise between 1.6 and 2.5 m; borderline impossible
  • π(U) = 0: sea level rise exceeding 2.5 m; impossible based upon background knowledge
  • π(U) = 0: negative values of sea level change; impossible based on background knowledge

Figure 1: Possibility diagram of projections of cumulative 21st century sea level rise.
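The classification above can be encoded as a simple piecewise function. In this sketch the band boundaries follow the bullets, but the interior π values (0.95, 0.7, 0.3, 0.05) are single illustrative choices from within each band, not values asserted in the text:

```python
def possibility(u_m: float) -> float:
    """Illustrative possibility pi(U) for cumulative 21st century
    sea level rise of u_m metres, following the bands in Figure 1."""
    if u_m < 0:
        return 0.0    # decline: impossible on background knowledge
    if u_m <= 0.3:
        return 0.95   # corroborated possibility (pi >= 0.9)
    if u_m <= 0.63:
        return 0.7    # verified possibility, contingent on warming
    if u_m <= 1.6:
        return 0.3    # unverified possibility
    if u_m <= 2.5:
        return 0.05   # borderline impossible
    return 0.0        # impossible on background knowledge

print(possibility(0.2), possibility(1.0), possibility(3.0))
```

Because U is cumulative, the function is non-increasing in U: any lower outcome is a partial position for a higher one.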
These π assignments are based on justifications provided in previous subsections (see also Curry, 2018b); however, this particular classification represents the judgment of one individual. One can envision an ensemble of curves, using different assumptions and judgments. The point is not so much the exact numerical judgments provided here, but rather to demonstrate a way of stratifying the current knowledge base that is consistent with deep uncertainty.
The possibility distribution in Figure 1 does not directly map to a PDF — the level of uncertainty is such that there is no particular basis for selecting a median or mean value for some hypothetical PDF of future sea level rise. LeCozannet et al. (2017) argue that no single PDF can represent the whole range of uncertainty sources related to future sea-level rise.
While there is a great deal of uncertainty surrounding the possible impact of marine ice cliff instability in the 21st century, rejecting such a scenario is a potential Type II error. Our background knowledge base will change in the future, and it is certainly possible that such a scenario will come to be regarded as more likely. Based upon our current background knowledge, it is arguably more rational to reject the RCP8.5 concentration scenario than to reject the ice cliff instability scenario.

  5. Decision making under deep uncertainty about sea level rise

The concepts of the possibility distribution, worst case scenarios, scenario verification and partial positions are relevant to decision making under deep uncertainty, where precautionary and robust approaches are appropriate. A precautionary appraisal is initiated when there is uncertainty about potentially serious harm. A robust policy is defined as yielding outcomes that are deemed to be satisfactory across a wide range of plausible future outcomes (e.g. Walker et al. 2016). Robust policy making interfaces well with possibilistic approaches that generate a range of possible futures. Worst-case scenarios are an essential feature of precaution.
These concepts are applied in general terms to two decision making challenges related to sea level rise that have different sensitivities to Type I and II errors:

  • Infrastructure siting in coastal areas: Type II errors are of the greatest concern
  • Tort litigation: Type I errors are of the greatest concern

5.1 Infrastructure siting in coastal areas
Consider a hypothetical decision related to siting of major infrastructure near the coast, such as an international airport or a nuclear power plant. For infrastructure siting decisions having a multi-decade lifecycle, a Type II error (underestimation of the impact) would have the most adverse consequences. In this case, it is arguably more important to assess the worst-case scenario than to assess what is likely to happen.
NOAA (2017) provides the following advice about scenarios in the context of robust decision making. First, define a scientifically plausible worst-case scenario as a guide for overall system risk and long-term adaptation strategies. Then define a central estimate or mid-range scenario as a baseline for shorter-term planning. This strategy assumes that adaptive management, e.g. including additional flood defenses such as sea walls, is feasible.
Considering the worst-case scenario is consistent with the precautionary principle. The precautionary principle is best applied to situations where the potential harm can be controlled by the decision maker. In this case, the actual siting of the infrastructure is completely controllable.
So for purposes of decision making regarding infrastructure siting, which worst-case scenarios should be considered? Guided by the possibility diagram in Figure 1, a prudent strategy is to select a provisional worst-case scenario of 1 to 1.6 m (a partial position), with a contingent strategy for adding additional flood defenses if needed.
Actual siting decisions involve a large range of factors not considered here (e.g., Hall et al., 2016), but this example of decision making arguably benefits from explicit consideration of the worst-case scenario.
5.2 Climate change litigation
Kilinsky (2008) argues for the theory that plaintiffs can prevail on claims arising from the threat of potential injury attributable to a failure to adapt to or prevent climate change. Allen (2003) discusses the challenges from a climate science perspective related to demonstrating liability for climate change.
In evaluating litigation claims that rely on projections of future climate change and sea level rise, it is instructive to consider the role that the strength of the knowledge base of future climate change and sea level rise might have in these claims. The standard legal definitions for evidentiary standards and burden of proof can be mapped onto the possibility classifications:

  • Credible evidence: evidence that is not necessarily true but that is worthy of belief and worthy of consideration –> unverified possibilities
  • Preponderance of the evidence, or balance of probabilities: greater than fifty percent chance that the proposition is true; more likely than not to be true –> verified possibilities
  • Clear and convincing evidence: highly and substantially more probable to be true than not –> corroborated possibilities
  • Beyond reasonable doubt: there is no plausible reason to believe otherwise –> necessary.

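The mapping can be written as a small lookup table; the key phrases follow the bullets above, while the table and helper names are ours, introduced only for illustration:

```python
# Mapping of legal evidentiary standards to the possibility classifications
# used in this paper (names introduced here for illustration only).
EVIDENTIARY_STANDARD_TO_POSSIBILITY = {
    "credible evidence": "unverified possibility",
    "preponderance of the evidence": "verified possibility",
    "clear and convincing evidence": "corroborated possibility",
    "beyond reasonable doubt": "necessary",
}

def possibility_class(standard: str) -> str:
    """Return the possibility classification for a legal standard."""
    return EVIDENTIARY_STANDARD_TO_POSSIBILITY[standard.lower()]

print(possibility_class("Preponderance of the evidence"))
```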
Based upon this classification, unverified possibilities and worst-case scenarios of sea level rise would not support such a tort case. The range of verified possibilities arguably defines the maximum projected sea level rise that would meet the standard of preponderance of evidence. The challenge in developing evidence for such a case is to demonstrate that the projections based on climate model simulations of global temperature change meet the standards of verified possibilities, with a high degree of justification and relatively immune to falsification, even if the verified scenario represents only a partial position.

  6. Conclusions

The purpose of generating scenarios of future outcomes is that we should not be too surprised when the future actually arrives. Projections of 21st century sea level rise are associated with deep uncertainty and a rapidly advancing knowledge frontier. The dynamic nature of the knowledge frontier on worst-case sea level rise scenarios is highlighted by Kopp et al. (2017), who compared recent projections with past expert assessments. The objective of this paper has been to articulate a strategy for portraying scientific understanding of the full range of possible scenarios of 21st century sea level rise, with a focus on worst-case scenarios and the avoidance of Type II errors.
An argument for alternative scenario generation has been presented, to stimulate different views and perspectives. In particular, considering climate change to be solely driven by scenarios of future greenhouse gas emissions is arguably a framing error that neglects possible scenarios of future solar variability, volcanic eruptions, natural internal variability of the large-scale ocean circulations, and geothermal and other geologic processes.
A framework for verifying and falsifying future scenarios is presented, in the context of modal logic. A classification of future scenarios is presented, based on levels of robustness and relative immunity to falsification. The logic of partial positions allows for clarifying what we actually know with confidence, versus what is more speculative and uncertain.
A possibility diagram of scenarios of 21st century cumulative sea level rise that ranks the possibilities from necessary to impossible provides a better representation of the deeply uncertain knowledge base than a probability distribution, since no single PDF can represent the whole range of uncertainty sources related to future sea-level rise. Apart from the limits of necessary and impossible, the intermediate possibilities do not map to likelihood since they also include an assessment of the quality of the knowledge base.
Hence, the possibility diagram avoids classifying scenarios as extremely unlikely if they are driven by processes for which we have a low level of understanding.
The possibility diagram for sea level rise projections considers sea level rise outcomes as resulting from a cumulative process, whereby a higher sea level outcome must first pass through lower levels of sea level rise. Therefore, lower values of sea level rise represent a partial position for the higher scenario. Partial positions can discriminate between lower values for which we have greater confidence, and higher values that are more speculative.
The concepts of the possibility distribution, worst case scenarios, scenario verification and partial positions are applied here to two decision making challenges related to sea level rise that have different sensitivities to Type I and II errors. The possibility distribution interfaces well with robust decision making strategies, and the worst-case scenario with partial positions is an important factor in precautionary considerations.
The approach presented here is very different from the practice of the IPCC assessments and their focus on determining a likely range, and provides numerous new challenges to the scientific community. There are some efforts (e.g. Horton et al., 2018) to develop decision-relevant probabilities of future sea level rise as part of a science-based uncertainty quantification. The state of our current understanding of sea level rise is far from being able to support such probabilities. The possibility distribution provides a framework for better classifying our knowledge about sea level rise scenarios.
[1] https://www.scientificamerican.com/article/prepare-for-10-feet-of-sea-level-rise-california-commission-tells-coastal-cities/
[2] A fat-tailed distribution is a probability distribution that exhibits a large skewness or kurtosis, relative to a normal or exponential distribution.
References
