Researchers, including teams at Google, are actively studying how noise affects quantum systems and whether such systems can still offer computational advantages over classical ones.

#Tech #Google #NISQ #RCS #XEB #QuantumComputing

https://thetransmitted.com/tech/google-ai-vyvchaye-sposoby-oczinky-produktyvnosti-kvantovyh-kompyuteriv-u-prysutnosti-shumu/

Google AI explores ways to assess the performance of quantum computers in the presence of noise | TheTransmitted

TheTransmitted

Title: Thermal tides in neutrally stratified atmospheres: Revisiting the Earth's Precambrian rotational equilibrium.

Rotational dynamics of the Earth, over geological timescales, have profoundly
affected local and global climatic evolution, probably contributing to the
evolution of life. To better retrieve the Earth's rotational history, and [...]

Authors: Mohammad Farhat, Pierre Auclair-Desrotour, Gwenaël Boué, Russell Deitrick, Jacques Laskar

Link: http://arxiv.org/abs/2309.11946

Thermal tides in neutrally stratified atmospheres: Revisiting the Earth's Precambrian rotational equilibrium

Rotational dynamics of the Earth, over geological timescales, have profoundly affected local and global climatic evolution, probably contributing to the evolution of life. To better retrieve the Earth's rotational history, and motivated by the published hypothesis of a stabilized length of day during the Precambrian, we examine the effect of thermal tides on the evolution of planetary rotational motion. The hypothesized scenario is contingent upon encountering a resonance in atmospheric Lamb waves, whereby an amplified thermotidal torque cancels the opposing torque of the oceans and solid interior, driving the Earth into a rotational equilibrium. With this scenario in mind, we construct an ab initio model of thermal tides on rocky planets describing a neutrally stratified atmosphere. The model takes into account dissipative processes with Newtonian cooling and diffusive processes in the planetary boundary layer. We retrieve from this model a closed-form solution for the frequency-dependent tidal torque which captures the main spectral features previously computed using 3D general circulation models. In particular, under longwave heating, diffusive processes near the surface and the delayed thermal response of the ground prove to be responsible for attenuating, and possibly annihilating, the accelerating effect of the thermotidal torque at the resonance. When applied to the Earth, our model prediction suggests the occurrence of the Lamb resonance in the Phanerozoic, but with an amplitude that is insufficient for the rotational equilibrium. Interestingly, though our study was motivated by the Earth's history, the generic tidal solution can be straightforwardly and efficiently applied in exoplanetary settings.
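The qualitative mechanism described here — dissipation attenuating, and possibly annihilating, the resonant peak of the tidal torque — can be illustrated with a generic damped-resonance response. This is purely an illustrative stand-in, not the paper's closed-form thermotidal torque:

```python
import numpy as np

# Generic damped-resonance response (a qualitative stand-in, NOT the
# paper's closed-form solution): amplitude vs forcing frequency,
#   A(w) = 1 / sqrt((w0^2 - w^2)^2 + (gamma * w)^2)
def peak_amplitude(gamma, w0=1.0):
    w = np.linspace(0.5, 1.5, 10_001)
    amp = 1.0 / np.sqrt((w0**2 - w**2) ** 2 + (gamma * w) ** 2)
    return float(amp.max())

# Stronger dissipation (larger gamma) flattens the resonant peak, which
# mirrors the qualitative way near-surface diffusion and the ground's
# delayed thermal response attenuate the Lamb resonance in the paper.
weak_damping = peak_amplitude(0.05)    # lightly damped: tall, narrow peak
strong_damping = peak_amplitude(0.5)   # strongly damped: low, broad peak
print(weak_damping > strong_damping)   # prints True
```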

arXiv.org

Title: Theoretical tools for understanding the climate crisis from Hasselmann's program and beyond.

Klaus Hasselmann's revolutionary intuition in climate science was to take
advantage of the stochasticity associated with fast weather processes to probe
the slow dynamics of the climate system. This has led to fundamentally new ways
to study the response of climate models to perturbations, and to perfo [...]

Authors: Valerio Lucarini, Mickaël Chekroun

Link: http://arxiv.org/abs/2303.12009

Theoretical tools for understanding the climate crisis from Hasselmann's program and beyond

Klaus Hasselmann's revolutionary intuition in climate science was to take advantage of the stochasticity associated with fast weather processes to probe the slow dynamics of the climate system. This has led to fundamentally new ways to study the response of climate models to perturbations, and to perform detection and attribution for climate change signals. Hasselmann's program has been extremely influential in climate science and beyond. We first summarise the main aspects of such a program using modern concepts and tools of statistical physics and applied mathematics. We then provide an overview of some promising scientific perspectives that might better clarify the science behind the climate crisis and that stem from Hasselmann's ideas. We show how to perform rigorous model reduction by constructing parametrizations in systems that do not necessarily feature a time-scale separation between unresolved and resolved processes. We propose a general framework for explaining the relationship between climate variability and climate change, and for performing climate change projections. This leads us seamlessly to explain some key general aspects of climatic tipping points. Finally, we show that response theory provides a solid framework supporting optimal fingerprinting methods for detection and attribution.
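The core of Hasselmann's 1976 intuition can be sketched with a one-dimensional toy model (illustrative parameter values, not any model from the paper): fast weather enters as white noise, and a slowly damped climate variable integrates it into red-noise variability — an Ornstein-Uhlenbeck process.

```python
import numpy as np

# Toy version of Hasselmann's idea: fast "weather" acts as white noise
# forcing a slowly damped "climate" variable x, which integrates it
# into red-noise variability (an Ornstein-Uhlenbeck process).
rng = np.random.default_rng(0)
lam, sigma = 0.1, 1.0        # damping rate and noise amplitude (illustrative)
dt, n = 0.1, 200_000         # time step and number of steps

noise = rng.standard_normal(n - 1)
x = np.zeros(n)
for i in range(n - 1):
    # Euler-Maruyama step: dx = -lam * x * dt + sigma * sqrt(dt) * dW
    x[i + 1] = x[i] - lam * x[i] * dt + sigma * np.sqrt(dt) * noise[i]

# The stationary variance of the process is sigma^2 / (2 * lam) = 5.0;
# the empirical variance of the trajectory should come out close to it.
print(np.var(x[n // 10 :]))
```

The slow variable's spectrum is "red" (energy piled up at low frequencies) even though the forcing is white — the statistical signature Hasselmann used to probe slow climate dynamics with fast weather noise.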

arXiv.org

Title: Water Condensation Zones around Main Sequence Stars.

Understanding the set of conditions that allow rocky planets to have liquid
water on their surface -- in the form of lakes, seas or oceans -- is a major
scien [...]

Authors: Martin Turbet, Thomas J. Fauchez, Jeremy Leconte, Emeline Bolmont, Guillaume Chaverot, Francois Forget, Ehouarn Millour, Franck Selsis, Benjamin Charnay, Elsa Ducrot, Michaël Gillon, Alice Maurel, Geronimo L. Villanueva

Link: http://arxiv.org/abs/2308.15110

Water Condensation Zones around Main Sequence Stars

Understanding the set of conditions that allow rocky planets to have liquid water on their surface -- in the form of lakes, seas or oceans -- is a major scientific step to determine the fraction of planets potentially suitable for the emergence and development of life as we know it on Earth. This effort is also necessary to define and refine the so-called "Habitable Zone" (HZ) in order to guide the search for exoplanets likely to harbor remotely detectable life forms. Until now, most numerical climate studies on this topic have focused on the conditions necessary to maintain oceans, but not to form them in the first place. Here we use the three-dimensional Generic Planetary Climate Model (PCM), historically known as the LMD Generic Global Climate Model (GCM), to simulate water-dominated planetary atmospheres around different types of Main-Sequence stars. The simulations are designed to reproduce the conditions of early ocean formation on rocky planets due to the condensation of the primordial water reservoir at the end of the magma ocean phase. We show that the incoming stellar radiation (ISR) required to form oceans by condensation is always drastically lower than that required to vaporize oceans. We introduce a Water Condensation Limit, which lies at significantly lower ISR than the inner edge of the HZ calculated with three-dimensional numerical climate simulations. This difference is due to a behavior change of water clouds, from low-altitude dayside convective clouds to high-altitude nightside stratospheric clouds. Finally, we calculated transit spectra, emission spectra and thermal phase curves of TRAPPIST-1b, c and d with H2O-rich atmospheres, and compared them to CO2 atmospheres and bare rock simulations. We show using these observables that JWST has the capability to probe steam atmospheres on low-mass planets, and could possibly test the existence of nightside water clouds.

arXiv.org

Title: The High-Frequency and Rare Events Barriers to Neural Closures of Atmospheric Dynamics.

Neural parameterizations and closures of climate and turbulent models have
raised a lot of interest in recent years. In this short paper, we point out two
fundamental problems in this endeavour, one tied to sampling issues due to rare
events, and the other one tied to t [...]

Authors: Mickaël D. Chekroun, Honghu Liu, Kaushik Srinivasan, James C. McWilliams

Link: http://arxiv.org/abs/2305.04331

The High-Frequency and Rare Events Barriers to Neural Closures of Atmospheric Dynamics

Recent years have seen a surge of interest in leveraging neural networks to parameterize small-scale or fast processes in climate and turbulence models. In this short paper, we point out two fundamental issues in this endeavor. The first concerns the difficulties neural networks may experience in capturing rare events due to limitations in how data is sampled. The second arises from the inherent multiscale nature of these systems: they combine high-frequency components (like inertia-gravity waves) with slower, evolving processes (geostrophic motion). This multiscale nature creates a significant hurdle for neural network closures. To illustrate these challenges, we focus on Lorenz's 1980 atmospheric model, a simplified version of the Primitive Equations that drive climate models. This model serves as a compelling example because it captures the essence of these difficulties.
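The rare-event barrier can be seen with a toy sampling experiment (an illustration of the sampling issue, not the authors' setup): events in the far tail of a distribution are so scarce in any finite training set that a data-driven closure effectively never sees them.

```python
import numpy as np

# Toy illustration of the rare-event sampling barrier: how many
# "4-sigma" events does a finite training set actually contain?
rng = np.random.default_rng(1)
train = rng.standard_normal(10_000)   # stand-in for a training data set

threshold = 4.0
n_seen = int(np.sum(np.abs(train) > threshold))

# The expected fraction beyond 4 sigma is about 6e-5, so a 10k-sample
# set typically contains zero or one such event -- a closure fitted to
# this data carries essentially no information about the tail.
print(n_seen)
```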

arXiv.org

Title: Hasselmann's Program and Beyond: New Theoretical Tools for Understanding the Climate Crisis.

Klaus Hasselmann's revolutionary intuition was to take advantage of the
stochasticity associated with fast weather processes to probe the slow dynamics
of the climate system. This has led to fundamentally new ways to study the
response of climate models to perturbations, and to perform detection and
at [...]

Authors: Valerio Lucarini, Mickaël Chekroun

Link: http://arxiv.org/abs/2303.12009

Theoretical tools for understanding the climate crisis from Hasselmann's program and beyond

Klaus Hasselmann's revolutionary intuition in climate science was to take advantage of the stochasticity associated with fast weather processes to probe the slow dynamics of the climate system. This has led to fundamentally new ways to study the response of climate models to perturbations, and to perform detection and attribution for climate change signals. Hasselmann's program has been extremely influential in climate science and beyond. We first summarise the main aspects of such a program using modern concepts and tools of statistical physics and applied mathematics. We then provide an overview of some promising scientific perspectives that might better clarify the science behind the climate crisis and that stem from Hasselmann's ideas. We show how to perform rigorous model reduction by constructing parametrizations in systems that do not necessarily feature a time-scale separation between unresolved and resolved processes. We propose a general framework for explaining the relationship between climate variability and climate change, and for performing climate change projections. This leads us seamlessly to explain some key general aspects of climatic tipping points. Finally, we show that response theory provides a solid framework supporting optimal fingerprinting methods for detection and attribution.

arXiv.org

Title: An evaluation of deep learning models for predicting water depth evolution in urban floods.

In this technical report we compare different deep learning models for
prediction of water depth rasters at high spatial resolution. Efficient,
accurate, and fast methods for water depth prediction are nowaday [...]

Authors: Stefania Russo, Nathanaël Perraudin, Steven Stalder, Fernando Perez-Cruz, Joao Paulo Leitao, Guillaume Obozinski, Jan Dirk Wegner

Link: http://arxiv.org/abs/2302.10062

An evaluation of deep learning models for predicting water depth evolution in urban floods

In this technical report we compare different deep learning models for prediction of water depth rasters at high spatial resolution. Efficient, accurate, and fast methods for water depth prediction are nowadays important as urban floods are increasing due to higher rainfall intensity caused by climate change, expansion of cities and changes in land use. While hydrodynamic models can provide reliable forecasts by simulating water depth at every location of a catchment, they also have a high computational burden which jeopardizes their application to real-time prediction in large urban areas at high spatial resolution. Here, we propose to address this issue by using data-driven techniques. Specifically, we evaluate deep learning models which are trained to reproduce the data simulated by the CADDIES cellular-automata flood model, providing flood forecasts at different future time horizons. The advantage of using such models is that they can learn the underlying physical phenomena a priori, avoiding manual parameter setting and reducing the computational burden. We perform experiments on a dataset consisting of two catchment areas within Switzerland with 18 simpler, short rainfall patterns and 4 long, more complex ones. Our results show that the deep learning models in general yield lower errors than the other methods, especially for water depths $>0.5m$. However, when testing on more complex rainfall events or unseen catchment areas, the deep models do not show benefits over the simpler ones.
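The surrogate-modelling idea — train a fast data-driven model on input/output pairs produced by a slow physical simulator — can be sketched as follows. The "simulator" below is a hypothetical one-dimensional stand-in, not the CADDIES model, and the polynomial plays the role of the report's deep networks:

```python
import numpy as np

# Toy surrogate modelling: learn a cheap approximation of a slow
# "simulator". The simulator here is a hypothetical stand-in for a
# hydrodynamic flood model.
def simulator(rain):
    # hypothetical nonlinear rainfall -> water depth response
    return np.maximum(0.0, rain - 1.0) ** 1.5

rng = np.random.default_rng(2)
rain = rng.uniform(0.0, 5.0, 500)       # "training" rainfall intensities
depth = simulator(rain)                 # expensively simulated targets

# Cheap surrogate: a cubic polynomial fitted to the simulated pairs.
coeffs = np.polyfit(rain, depth, deg=3)
surrogate = np.poly1d(coeffs)

# The surrogate is accurate inside the training regime, but -- echoing
# the report's finding -- nothing guarantees it generalises to unseen,
# more extreme rainfall events outside that regime.
err_in = float(np.max(np.abs(surrogate(rain) - depth)))
print(err_in)
```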

arXiv.org

Title: Using uncertainty-aware machine learning models to study aerosol-cloud interactions.

Aerosol-cloud interactions (ACI) include various effects that result from
aerosols entering a cloud, and affecting cloud properties. In general, an
increase in aerosol concentration results in smaller droplet sizes which leads
to larger, brighter, longer-lasting clouds that reflect more sun [...]

Authors: Maëlys Solal, Andrew Jesson, Yarin Gal, Alyson Douglas

Link: http://arxiv.org/abs/2301.11921

Using uncertainty-aware machine learning models to study aerosol-cloud interactions

Aerosol-cloud interactions (ACI) include various effects that result from aerosols entering a cloud and affecting cloud properties. In general, an increase in aerosol concentration results in smaller droplet sizes, which leads to larger, brighter, longer-lasting clouds that reflect more sunlight and cool the Earth. The strength of the effect is however heterogeneous, meaning it depends on the surrounding environment, making ACI one of the most uncertain effects in our current climate models. In our work, we use causal machine learning to estimate ACI from satellite observations by reframing the problem as a treatment (aerosol) and outcome (change in droplet radius). We predict the causal effect of aerosol on clouds with uncertainty bounds depending on the unknown factors that may be influencing the impact of aerosol. Of the three climate models evaluated, we find that only one plausibly recreates the trend, lending more credence to its estimated cooling due to ACI.
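The treatment/outcome reframing can be sketched with a toy difference-in-means effect estimate plus bootstrap uncertainty bounds. The data and effect size below are synthetic; the paper's actual method is an uncertainty-aware causal machine-learning model applied to satellite observations:

```python
import numpy as np

# Toy causal framing: treatment = high-aerosol scene,
# outcome = cloud droplet radius (synthetic data, illustrative only).
rng = np.random.default_rng(3)
n = 5_000
treated = rng.integers(0, 2, n).astype(bool)          # high-aerosol scenes
# Synthetic outcome: aerosol shrinks droplets by ~1 micron on average.
radius = 12.0 - 1.0 * treated + rng.normal(0.0, 2.0, n)

# Naive average treatment effect (difference in means)...
ate = radius[treated].mean() - radius[~treated].mean()

# ...reported with bootstrap uncertainty bounds, echoing the paper's
# emphasis on effects that come with explicit uncertainty.
boot = []
for _ in range(1000):
    idx = rng.integers(0, n, n)
    t, r = treated[idx], radius[idx]
    boot.append(r[t].mean() - r[~t].mean())
lo, hi = np.percentile(boot, [2.5, 97.5])
print(round(float(ate), 2), round(float(lo), 2), round(float(hi), 2))
```

A difference in means is only causal here because the synthetic treatment is randomly assigned; with observational satellite data, confounding is exactly what motivates the causal-ML machinery and its uncertainty bounds.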

arXiv.org