TY - JOUR
T1 - Machine Learning for Stochastic Parameterization
T2 - Generative Adversarial Networks in the Lorenz '96 Model
AU - Gagne, David John
AU - Christensen, Hannah M.
AU - Subramanian, Aneesh C.
AU - Monahan, Adam H.
N1 - Publisher Copyright:
©2020. The Authors.
PY - 2020/3/1
Y1 - 2020/3/1
N2 - Stochastic parameterizations account for uncertainty in the representation of unresolved subgrid processes by sampling from the distribution of possible subgrid forcings. Some existing stochastic parameterizations utilize data-driven approaches to characterize uncertainty, but these approaches require significant structural assumptions that can limit their scalability. Machine learning models, including neural networks, are able to represent a wide range of distributions and build optimized mappings between a large number of inputs and subgrid forcings. Recent research on machine learning parameterizations has focused only on deterministic parameterizations. In this study, we develop a stochastic parameterization using the generative adversarial network (GAN) machine learning framework. The GAN stochastic parameterization is trained and evaluated on output from the Lorenz '96 model, which is a common baseline model for evaluating both parameterization and data assimilation techniques. We evaluate different ways of characterizing the input noise for the model and perform model runs with the GAN parameterization at weather and climate time scales. Some of the GAN configurations perform better than a baseline bespoke parameterization at both time scales, and the networks closely reproduce the spatiotemporal correlations and regimes of the Lorenz '96 system. We also find that, in general, those models which produce skillful forecasts are also associated with the best climate simulations.
KW - climate
KW - generative adversarial networks
KW - lorenz
KW - machine learning
KW - stochastic parameterization
KW - weather
UR - https://www.scopus.com/pages/publications/85083261086
U2 - 10.1029/2019MS001896
DO - 10.1029/2019MS001896
M3 - Article
AN - SCOPUS:85083261086
SN - 1942-2466
VL - 12
JO - Journal of Advances in Modeling Earth Systems
JF - Journal of Advances in Modeling Earth Systems
IS - 3
M1 - e2019MS001896
ER -