The 2021 Nobel Prize in Physics has been awarded to Syukuro Manabe, Klaus Hasselmann and Giorgio Parisi for their advances in complex physical systems. In recognition of this award, Nature Portfolio presents a collection of research, review and opinion articles that celebrates the direct contributions by the awardees and the discoveries they have inspired.
Syukuro Manabe and Klaus Hasselmann are awarded the Nobel Prize “for the physical modelling of Earth’s climate, quantifying variability and reliably predicting global warming”. Their contributions laid the foundation for advances in climate model development and in detection and attribution methodologies.
Giorgio Parisi received the award “for the discovery of the interplay of disorder and fluctuations in physical systems from atomic to planetary scales”, with applications including glasses, random lasers and optimization problems.
Syukuro Manabe, Klaus Hasselmann and Giorgio Parisi split the award for their work on complex systems — including modelling Earth’s climate and global warming.
Successful projection of the distribution of surface temperature change increases our confidence in climate models. Here we evaluate projections of global warming from almost 30 years ago using the observations made during the past half century.
A classic paper in 1967 reported key advances in climate modelling that enabled a convincing quantification of the global-warming effects of carbon dioxide — laying foundations for the models that underpin climate research today.
Climate science celebrates three 40th anniversaries in 2019: the release of the Charney report, the publication of a key paper on anthropogenic signal detection, and the start of satellite temperature measurements. This confluence of scientific understanding and data led to the identification of human fingerprints in atmospheric temperature.
For a problem as complex as turbulence, combining universal concepts from statistical physics with ideas from fluid mechanics has proven indispensable. Three decades since this link was formed, it is still providing food for new thought.
A new generation of sophisticated Earth models is gearing up for its first major test. But added complexity may lead to greater uncertainty about the future climate, finds Olive Heffernan.
A scientific consensus that humans are influencing the climate will be behind any agreements on greenhouse-gas reductions next month. But how can climate research have an optimal influence on climate policy in the future?
Multi-actor integrated assessment models based on well-being concepts beyond GDP could support policymakers by highlighting the interrelation of climate change mitigation and other important societal problems.
Initialized climate predictions offer distinct benefits for multiple stakeholders. This Review discusses initialized prediction on subseasonal to seasonal (S2S), seasonal to interannual (S2I) and seasonal to decadal (S2D) timescales, highlighting potential for skilful predictions in the years to come.
Many different methods have been developed to forecast climate phenomena such as the El Niño–Southern Oscillation (ENSO), which makes a fair comparison of their capabilities crucial. In this Perspective, the authors discuss how choices in the evaluation method can lead to an overestimate of the perceived skill of ENSO forecasts.
Seasonal forecasting skill in machine learning methods that are trained on large climate model ensembles can compete with, or out-compete, existing dynamical models, while retaining physical interpretability.
Although the differential equations that describe the physical climate system are deterministic, there are reasons, both theoretical and practical, why computational representations of these equations should be stochastic. This Perspective surveys the benefits of stochastic modelling of weather and climate.
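As a rough illustration (not drawn from the Perspective itself), the contrast between a deterministic model and its stochastically forced counterpart can be sketched with an Euler–Maruyama integration of a simple relaxation equation; the noise term stands in for unresolved sub-grid variability, and all parameter values here are purely illustrative.

```python
import random
import statistics

def simulate(lam=1.0, sigma=0.3, dt=0.01, steps=1000, seed=0):
    """Euler-Maruyama integration of dT = -lam*T dt + sigma dW.
    The stochastic term is a stand-in for unresolved sub-grid
    variability; parameters are illustrative, not physical."""
    rng = random.Random(seed)
    t_det = t_sto = 1.0                 # identical initial conditions
    for _ in range(steps):
        dw = rng.gauss(0.0, dt ** 0.5)              # Wiener increment
        t_det = t_det - lam * t_det * dt            # deterministic core
        t_sto = t_sto - lam * t_sto * dt + sigma * dw  # stochastic version
    return t_det, t_sto

# The deterministic run decays to zero; an ensemble of stochastic runs
# retains a spread that the single deterministic trajectory cannot represent.
det, _ = simulate()
ensemble = [simulate(seed=s)[1] for s in range(100)]
spread = statistics.stdev(ensemble)
```

An ensemble of such stochastic integrations is one way to expose the variability that a single deterministic trajectory hides.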
Earth system models project likely future climates; however, evaluating their output is challenging. This Perspective discusses new evaluation approaches, considering both simulations and observations, to ensure credible information for decision-making.
Complex networks are used to analyse global-scale teleconnections between extreme-rainfall events, revealing a peak in the distance distribution of statistically significant connections at around 10,000 kilometres.
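A minimal sketch of the network idea, on synthetic data rather than the paper's rainfall records: link pairs of sites whose extreme-event series co-occur more often than a crude threshold, then collect the great-circle distances of the resulting links (the published analysis uses event synchronization and proper significance testing; everything below is an illustrative stand-in).

```python
import math
import random

def haversine_km(p, q):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

rng = random.Random(42)
n_sites, n_days = 40, 500
coords = [(rng.uniform(-60, 60), rng.uniform(-180, 180)) for _ in range(n_sites)]
# Synthetic binary "extreme event" series; a real analysis would use
# event synchronization on observed rainfall extremes.
events = [[rng.random() < 0.05 for _ in range(n_days)] for _ in range(n_sites)]

link_distances = []
for i in range(n_sites):
    for j in range(i + 1, n_sites):
        co = sum(a and b for a, b in zip(events[i], events[j]))
        if co >= 4:  # crude co-occurrence threshold, for illustration only
            link_distances.append(haversine_km(coords[i], coords[j]))
```

With real data, the histogram of `link_distances` is the quantity in which the paper reports a peak near 10,000 km.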
In 1990 the First Assessment Report of the Intergovernmental Panel on Climate Change was produced. It contained a prediction of the global-mean-temperature trend for 1990–2030 which, halfway through that period, appears accurate. This is remarkable in hindsight, considering a number of important external forcings were not included. This study concludes the greenhouse-gas-induced warming is largely overwhelming the other forcings.
Updated models are being used for the new assessment report from the Intergovernmental Panel on Climate Change. This study compares projections from the latest models with those from earlier versions. The spread of results has not changed significantly, and some of the spread will always remain due to the internal variability of the climate system. As models improve, they are able to represent more processes in greater detail, allowing for greater confidence in their projections, in spite of model spread.
Quantifying the temperature impacts of anthropogenic emissions helps monitor proximity to the Paris Agreement goals. Human activities warmed global mean temperature during the past decade by 0.9 to 1.3 °C above 1850–1900 values, with 1.2 to 1.9 °C from greenhouse gases and −0.7 to −0.1 °C from aerosols.
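A quick consistency check on the quoted ranges (naive interval addition, not the paper's statistical uncertainty propagation): the greenhouse-gas and aerosol contributions combine to a range that brackets the reported net warming.

```python
# Ranges as quoted above, in degrees Celsius relative to 1850-1900.
ghg = (1.2, 1.9)        # warming from greenhouse gases
aerosol = (-0.7, -0.1)  # cooling from aerosols
total = (0.9, 1.3)      # net human-induced warming

# Naive end-point addition gives (0.5, 1.8); the reported net range sits
# inside it, as expected when component uncertainties are not independent.
combined = (ghg[0] + aerosol[0], ghg[1] + aerosol[1])
```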
Changes in monthly temperature extremes are governed by mean climate warming, whereas changes in monthly precipitation extremes respond more to changes in variability, suggest analyses of large-ensemble climate simulations.
Anthropogenic greenhouse gas emissions are responsible for the observed decrease in subseasonal temperature variability in the northern extratropics, according to an attribution analysis using a large ensemble of climate model simulations.
Short-duration rainfall extremes are determined by complex processes that are affected by the warming climate. This Review assesses the evidence for the intensification of short-duration rainfall extremes, the associated drivers and the implications for flood risks.
Climate models project an intensification of extreme precipitation under climate change, but this effect is difficult to detect in the observational record. Here, the authors show that a physically interpretable anthropogenic impact on extreme precipitation is detectable in global observational data sets.
Detection and attribution typically aims to find long-term climate signals in internal, often short-term variability. Here, common methods are extended to high-frequency temperature and humidity data, detecting a global-scale climate-change signal in any single year since 1999 and in any single day since 2012.
Climate models predict that by 2020, 20–55% of the three key ocean basins will express an anthropogenic fingerprint of change. The well-ventilated Southern Ocean water masses are particularly sensitive, emerging as early as the 1980s–1990s, consistent with observations of change over the past 30 years.
Heat waves have become increasingly frequent in the United States, but their occurrence is largely linked to natural variability. Model simulations reveal anthropogenically forced signals will first emerge in the western United States and Great Lakes regions by ~2030.
The contribution of human-induced climate change to global heavy precipitation and hot extreme events is quantified. The results show that of the moderate extremes, 18% of precipitation and 75% of high-temperature events are attributable to warming.
This work investigates when the anthropogenic signal in regional sea-level rise will emerge from natural variability. Considering thermal expansion and changes in density and circulation, 50% of the global ocean will show an anthropogenic signal by the early-to-mid 2040s, whereas when all variables are considered, the anthropogenic signal will emerge in over 50% of the global ocean by 2020. This is substantially earlier than for surface air temperature and has little dependence on emissions scenarios.
The impact of external influences on European temperatures before 1900 has been thought to be negligible. An analysis of reconstructions of seasonal European land temperatures and simulations from three global climate models instead suggests that external forcing is responsible for a best guess of 75% of the observed winter warming since the late seventeenth century.
A significant effect of anthropogenic activities has already been detected in observed trends in temperature and mean precipitation. But so far, no study has formally identified such a human fingerprint on extreme precipitation — an increase in which is one of the central theoretical expectations for a warming climate. This study compares observations and simulations and detects a statistically significant effect of increased greenhouse gases on observed increases in extreme precipitation events over much of the Northern Hemisphere land area.
The relative importance of regional and global changes in atmospheric greenhouse gas and aerosol concentrations for regional changes in climate is not well known. A climate model analysis of tropical, mid-latitude and polar regions shows that the extratropics and, in particular, the Arctic region are sensitive to local changes in radiative forcing.
Potential energy landscape models are often used to describe transitions in the glassy state. Here, the authors report that the landscape is much rougher than usually assumed, and demonstrate that it undergoes a transition to fractal basins before the jamming point is reached.
The physics that underlies the glass transition is both subtle and non-trivial. A machine learning approach based on graph networks is now shown to accurately predict the dynamics of glasses over a wide range of temperatures, pressures and densities.
Artificial neural networks now allow the dynamics of supercooled liquids to be predicted from their structure alone in an unprecedented way, thus providing a powerful new tool to study the physics of the glass transition.
The response of amorphous solids to external stress is not very well understood. A study now shows that certain glasses, upon decreasing temperature, undergo a phase transition characterized by diverging nonlinear elastic moduli.
While temperature chaos is an equilibrium notion that denotes the extreme fragility of the glassy phase with respect to temperature changes, it remains unclear whether it is present in non-equilibrium dynamics. Here the authors use the Janus II supercomputer to prove the existence of dynamic temperature chaos, a nonequilibrium phenomenon that closely mimics equilibrium temperature chaos.
Replica symmetry breaking, in which identical systems subject to identical conditions evolve to different end states, has been predicted to occur in many contexts but has yet to be observed experimentally. Ghofraniha et al. report evidence for its occurrence in the pulse-to-pulse variations of a random laser.
Replica symmetry breaking describes identical copies of a randomly interacting system exhibiting different dynamics. Here, Pierangeli et al. observe this critical phenomenon in the optical wave propagation inside a disordered nonlinear waveguide.
The K-satisfiability problem is a combinatorial discrete optimization problem, which for K=3 is NP-complete, and whose random formulation is of interest for understanding computational complexity. Here, the authors introduce the backtracking survey propagation algorithm and study it for K=3 and K=4.
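To make the random ensemble concrete, here is a small sketch (not the paper's backtracking survey propagation, which targets instances vastly larger than this): generate a random K-SAT formula at a chosen clause density and check satisfiability by exhaustive search.

```python
import itertools
import random

def random_ksat(n_vars, n_clauses, k=3, seed=1):
    """Random K-SAT instance: each clause picks k distinct variables,
    each negated independently with probability 1/2."""
    rng = random.Random(seed)
    formula = []
    for _ in range(n_clauses):
        chosen = rng.sample(range(n_vars), k)
        formula.append([(v, rng.random() < 0.5) for v in chosen])
    return formula

def satisfiable(n_vars, formula):
    """Exhaustive check, feasible only for tiny n; survey-propagation
    methods are designed for instances far beyond brute force."""
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any(bits[v] != neg for v, neg in clause) for clause in formula):
            return True
    return False

# Clause density alpha = m/n well below the K=3 satisfiability threshold
# (around 4.267) makes random instances almost surely satisfiable.
f = random_ksat(n_vars=12, n_clauses=24)  # alpha = 2.0
```

Varying the density toward the threshold is where random instances become hard, which is the regime the backtracking survey propagation algorithm addresses.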
Quantum annealing is expected to solve certain optimization problems more efficiently, but there are still open questions regarding the functioning of devices such as D-Wave One. A numerical and experimental investigation of its performance shows evidence for quantum annealing with 108 qubits.
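For contrast, the classical baseline that quantum annealers are benchmarked against can be sketched in a few lines: simulated annealing on a toy one-dimensional random-bond Ising chain (this is purely illustrative and in no way a model of the D-Wave hardware or of the 108-qubit experiment).

```python
import math
import random

def anneal_chain(n=20, sweeps=2000, seed=3):
    """Classical simulated annealing on a 1-D random-bond Ising chain.
    Illustrative toy problem; the chain's ground-state energy is -(n - 1)
    because a 1-D chain is never frustrated."""
    rng = random.Random(seed)
    J = [rng.choice([-1, 1]) for _ in range(n - 1)]  # random couplings
    s = [rng.choice([-1, 1]) for _ in range(n)]      # random initial spins

    def energy():
        return -sum(J[i] * s[i] * s[i + 1] for i in range(n - 1))

    for step in range(sweeps):
        beta = 0.1 + 5.0 * step / sweeps             # slow "cooling" schedule
        i = rng.randrange(n)
        h = ((J[i - 1] * s[i - 1] if i > 0 else 0)
             + (J[i] * s[i + 1] if i < n - 1 else 0))
        dE = 2 * s[i] * h                            # cost of flipping spin i
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            s[i] = -s[i]                             # Metropolis acceptance
    return energy()

e = anneal_chain()  # should end at or near the ground-state energy -(n - 1)
```

Quantum annealing replaces the thermal acceptance step with quantum fluctuations that are gradually switched off; comparing the two on hard instances is exactly the kind of benchmarking the investigation performs.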
Automated learning from data by means of deep neural networks is finding use in an ever-increasing number of applications, yet key theoretical questions about how it works remain unanswered. A physics-based approach may help to bridge this gap.
Disorder and geometric frustration usually lead to magnetic spins that point in random directions, as in a spin glass. So how can spin-glass behaviour emerge in a well-ordered system without static frustration? The presence of ‘dynamic frustration’ may explain the situation.
Coexistence of a spin-glass phase with antiferromagnetism in an intercalated crystal produces a large exchange bias effect. This is due to the interplay of disorder and frustration.
NMR and ultrasound measurements show that the spin-glass phase exists in a cuprate all the way up to the doping that marks the end of the pseudogap phase. This highlights the possible connection between the pseudogap and Mott physics.