Efficient Uncertainty Quantification and Bayesian Analysis for Computational Mechanics at Scale
Recent advances in machine learning have accelerated the adoption of data-driven approaches in computational science and engineering. Since datasets from experiments or high-resolution simulations of PDEs can be high-dimensional, computational frameworks are needed that can process and analyse these data efficiently. We address these challenges with Korali [1], a high-performance framework for uncertainty quantification, optimization, and deep reinforcement learning. Korali’s engine supports large-scale HPC systems and provides a multi-language interface compatible with distributed computational models. For Bayesian inverse problems, Markov chain Monte Carlo (MCMC) methods such as Transitional MCMC [2] are used to sample from hierarchical Bayesian posteriors in a massively parallel fashion. To optimize hyperparameters or other quantities of interest, gradient-free optimization algorithms such as CMA-ES [3] and its extension to mixed-integer problems, as well as gradient-based approaches, are available. When evaluations of the model or its gradients are expensive, a neural-network-based surrogate model can be trained and used instead. We have defined an interface to couple Korali with multi-physics solvers such as MSolve [4] and deployed the resulting combined software framework on the Swiss supercomputer Piz Daint using a container-based approach. The framework executes in a massively parallel manner and scales well across multiple nodes. This work is part of DCoMEX [5], a project of the European High Performance Computing Joint Undertaking. We plan to couple Korali with further multi-physics solvers to demonstrate its versatility and to test the framework on other supercomputers.
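As an illustration of the Bayesian workflow described above, the sketch below shows how a posterior could be sampled with Transitional MCMC through Korali’s dictionary-style Python interface. The configuration keys follow Korali’s public tutorials as we recall them and may differ in detail from the setup used in this work; the linear toy model, the synthetic data, and all numerical values are assumptions introduced purely for illustration, standing in for an expensive multi-physics solver such as MSolve.

```python
#!/usr/bin/env python3
# Hedged sketch: TMCMC sampling of a Bayesian posterior via Korali's Python
# interface. Key names follow Korali's tutorial-style configuration and may
# differ from the deployed DCoMEX setup; the linear model is a toy stand-in
# for an expensive multi-physics solver.
import korali

# Synthetic reference data (assumption: in the real workflow these would be
# experimental measurements or high-fidelity simulation outputs).
xdata = [1.0, 2.0, 3.0, 4.0, 5.0]
ydata = [2.1, 3.9, 6.2, 8.1, 9.8]

def model(s):
    a, sigma = s["Parameters"][0], s["Parameters"][1]
    # The Normal likelihood expects a model prediction and a standard
    # deviation for every reference data point.
    s["Reference Evaluations"] = [a * x for x in xdata]
    s["Standard Deviation"] = [sigma] * len(xdata)

e = korali.Experiment()
e["Problem"]["Type"] = "Bayesian/Reference"
e["Problem"]["Likelihood Model"] = "Normal"
e["Problem"]["Reference Data"] = ydata
e["Problem"]["Computational Model"] = model

# Uniform priors for the slope and the noise level.
e["Distributions"][0]["Name"] = "Prior a"
e["Distributions"][0]["Type"] = "Univariate/Uniform"
e["Distributions"][0]["Minimum"] = 0.0
e["Distributions"][0]["Maximum"] = 5.0
e["Distributions"][1]["Name"] = "Prior sigma"
e["Distributions"][1]["Type"] = "Univariate/Uniform"
e["Distributions"][1]["Minimum"] = 0.0
e["Distributions"][1]["Maximum"] = 2.0

e["Variables"][0]["Name"] = "a"
e["Variables"][0]["Prior Distribution"] = "Prior a"
e["Variables"][1]["Name"] = "sigma"
e["Variables"][1]["Prior Distribution"] = "Prior sigma"

# Transitional MCMC evolves a population of samples that can be evaluated
# in parallel, e.g. distributed over the ranks of an HPC allocation.
e["Solver"]["Type"] = "Sampler/TMCMC"
e["Solver"]["Population Size"] = 2000

k = korali.Engine()
k.run(e)
```

In the coupled setting outlined in the abstract, the toy model above would be replaced by a call into a distributed solver such as MSolve, with Korali’s engine dispatching the population of model evaluations across the allocated compute nodes.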