In case after case, the theory that best fits the data is the one that also leads inexorably to the conclusion that human influence is one of the most important forces currently changing the climate, writes Robert J. Lempert.
A new set of scenarios, referred to as Shared Socio-economic Pathways (SSPs), examines challenges to mitigation and challenges to adaptation. A "backwards" scenario-development approach could help ensure that the SSP storylines focus on the driving forces most relevant to distinguishing among the SSPs.
Climate information alone is not sufficient for anticipating and reducing climate impacts. Enhanced vulnerability science is needed, including local-to-global monitoring, to support effective anticipatory efforts to increase societal resilience to potentially disruptive events.
An examination of Info-Gap and RDM finds that the two approaches reach similar but not identical policy recommendations and that their differing attributes raise important questions about their appropriate roles in decision support applications.
Climate change adds further uncertainty to many policy issues. Some tools for projecting climate change (including downscaling techniques) are better than others, just as some are better suited to assessing and quantifying the corresponding uncertainty.
Any successful response to climate change—both the challenges of limiting the magnitude of future climate change and adapting to its impacts—will clearly involve policies that evolve over time in response to new information and that are robust over a wide range of difficult-to-predict future conditions. Robust control theory offers a means of evaluating such robust and adaptive policies for reducing greenhouse gas emissions.
If it were really possible to explain millions of years of Earth data with a theory that doesn't also imply a recent human influence on the climate, some ambitious, self-interested team of scientists somewhere in the world would seek scientific renown by doing so, writes Robert Lempert.
Water managers in Southern California, who grapple with how to address climate change in their near-term and long-term plans, are beginning to seek methods for incorporating such changes in their planning processes.
This study uses a simple computer simulation model to compare several alternative frameworks for decision making under uncertainty—optimal expected utility, the precautionary principle, and three different approaches to robust decision making—for addressing the challenge of adding pollution to a lake without triggering unwanted and potentially irreversible eutrophication.
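The abstract does not give the model's equations, but studies of this lake-pollution problem commonly use the shallow-lake recurrence of Carpenter, Ludwig, and Brock, in which anthropogenic loading competes with natural removal and a nonlinear recycling feedback that can tip the lake into a eutrophic state. A minimal sketch, assuming that recurrence and illustrative parameter values (not taken from the paper):

```python
import numpy as np

def lake_step(x, loading, b=0.42, q=2):
    """One step of the shallow-lake dynamics: phosphorus concentration x
    changes via human loading, nonlinear recycling from sediments, and
    natural removal at rate b. Parameters are illustrative assumptions."""
    return x + loading + x**q / (1.0 + x**q) - b * x

def simulate(loading, steps=200, x0=0.0):
    """Iterate the map to its long-run concentration for a fixed loading."""
    x = x0
    for _ in range(steps):
        x = lake_step(x, loading)
    return x

low = simulate(0.02)   # modest loading: settles at a low, oligotrophic level
high = simulate(0.10)  # heavier loading: recycling feedback tips the lake
```

With these parameters the low-loading run converges near 0.06 while the high-loading run crosses the tipping point and settles above 1.0, illustrating the irreversibility that makes the decision problem hard.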
Integrated assessment modeling of global climate change has focused primarily on gradually occurring changes in the climate system. However, atmospheric and earth scientists have become increasingly concerned that the climate system may be subject to abrupt, discontinuous changes on short time scales, and that anthropogenic greenhouse-gas emissions could trigger such shifts.
This study performs a standard econometric analysis on the simulation model outputs from six scenarios from the Special Report on Emissions Scenarios (SRES) to assess the extent to which the projected CO2 and NOx emissions reflect Environmental Kuznets Curve (EKC) behavior.
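The standard EKC specification regresses emissions on a quadratic in income, with an inverted-U (negative quadratic coefficient) indicating Kuznets-curve behavior. A minimal sketch on synthetic data, purely illustrative of the specification rather than the paper's SRES analysis:

```python
import numpy as np

# Synthetic per-capita income and emissions tracing an exact inverted-U:
# emissions = a*income^2 + b*income + c with a < 0 (illustrative values).
income = np.linspace(1.0, 6.0, 50)
emissions = -0.5 * income**2 + 3.0 * income + 1.0

# Fit the quadratic EKC specification; polyfit returns [a, b, c].
a, b, c = np.polyfit(income, emissions, 2)

# For an inverted-U (a < 0), emissions peak at the turning-point income.
turning_point = -b / (2.0 * a)
```

A negative fitted `a` together with a turning point inside the observed income range is the usual evidence offered for EKC behavior; here the turning point recovers the synthetic peak at income 3.0.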
In this paper, we use the optimal estimation technique with a formal characterization of the errors to retrieve NO2 concentration profiles from slant column observations made at Eureka during March and April 1999.
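The optimal estimation technique (in the sense of Rodgers) combines slant-column measurements with an a priori profile, weighting each by its inverse covariance. A minimal sketch with an assumed, illustrative forward model (the weighting functions and covariances here are not from the paper):

```python
import numpy as np

# Assumed forward model: each slant-column measurement is a weighted sum
# over three vertical levels (rows = measurements, columns = levels).
K = np.array([[1.0, 0.5, 0.2],
              [0.8, 1.0, 0.4],
              [0.3, 0.9, 1.0],
              [0.2, 0.4, 0.8]])

x_true = np.array([2.0, 1.0, 0.5])   # "true" profile (arbitrary units)
y = K @ x_true                        # noise-free slant columns

x_a = np.zeros(3)                     # a priori profile
S_a_inv = np.eye(3)                   # inverse a priori covariance
S_e_inv = 1e8 * np.eye(4)             # inverse measurement covariance (low noise)

# Optimal estimation retrieval: x_hat = x_a + G (y - K x_a),
# with gain G = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1.
G = np.linalg.solve(K.T @ S_e_inv @ K + S_a_inv, K.T @ S_e_inv)
x_hat = x_a + G @ (y - K @ x_a)

# Averaging kernel: sensitivity of the retrieval to the true state.
A = G @ K
```

With such low measurement noise the averaging kernel approaches the identity and the retrieval recovers the true profile; as noise grows, the retrieval is pulled toward the a priori, which is the formal error characterization the abstract refers to.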
Probability-based estimates can have serious limitations when applied to a problem such as climate change. The Intergovernmental Panel on Climate Change should also consider approaches to decision-making under conditions of uncertainty that do not depend on expert consensus on probabilities.
The retrieval of accurate vertical column amounts of stratospheric constituents from zenith-sky spectroscopy is dependent on accurately modeling the transfer of radiation through the atmosphere and calculating suitable air mass factors (AMFs).
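The AMF links the slant column density (SCD) measured along the line of sight to the vertical column density (VCD): VCD = SCD / AMF. Realistic zenith-sky AMFs require full radiative transfer modeling, especially at twilight; the sketch below shows only the conversion structure using the crude geometric (secant) approximation, which is an assumption, not the paper's method:

```python
import math

def geometric_amf(sza_deg):
    """Geometric air mass factor ~ sec(SZA). A crude approximation that
    ignores Earth sphericity, refraction, and multiple scattering, so it
    breaks down at the large solar zenith angles of zenith-sky twilight
    measurements."""
    return 1.0 / math.cos(math.radians(sza_deg))

def vertical_column(slant_column, sza_deg):
    """Convert a slant column density to a vertical column density."""
    return slant_column / geometric_amf(sza_deg)
```

For example, at a 60-degree solar zenith angle the geometric AMF is 2, so a slant column of 4e16 molecules/cm^2 maps to a vertical column of 2e16 molecules/cm^2.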