This talk was part of the Data Sciences for Mesoscale and Macroscale Materials Models seminar series.

Multifidelity stacking networks for physics-informed training

Amanda Howard, Pacific Northwest National Laboratory (PNNL)

Monday, May 13, 2024



Abstract:

Physics-informed neural networks and operator networks have shown promise for effectively solving equations modeling physical systems. However, these networks can be difficult or impossible to train accurately for some systems of equations. One way to improve training is to use a small amount of data; however, such data is expensive to produce. We will introduce our novel multifidelity framework for stacking physics-informed neural networks and operator networks, which facilitates training by progressively reducing prediction errors even when no data is available. In stacking networks, we successively build a chain of networks, where the output at one step acts as a low-fidelity input for training the next step, gradually increasing the expressivity of the learned model.
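The stacking idea described above can be illustrated with a minimal sketch. This is not the speaker's implementation: as a stand-in for each trained network stage, it uses a simple least-squares fit over a small hand-picked nonlinear feature map, and a known target function replaces the physics-informed loss. The key structural point survives: each stage receives the previous stage's prediction as a low-fidelity input (and includes it directly among its features), so the training error cannot increase from one stage to the next.

```python
import numpy as np

# Toy target standing in for a PDE solution we want to learn.
x = np.linspace(-1.0, 1.0, 200)
target = np.sin(np.pi * x)

def features(x, y_prev):
    # Hand-picked nonlinear feature map (illustrative assumption).
    # y_prev itself is a feature, so each stage can at least
    # reproduce the previous stage's prediction exactly.
    return np.stack(
        [np.ones_like(x), x, x**2, y_prev, np.tanh(2.0 * y_prev)],
        axis=1,
    )

def fit_stage(x, y_prev, target):
    # One "network" stage: a global least-squares fit over the features.
    A = features(x, y_prev)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return A @ coef

y_prev = np.zeros_like(x)  # no prior model before the first stage
errors = []
for stage in range(4):
    y_prev = fit_stage(x, y_prev, target)  # output feeds the next stage
    errors.append(float(np.sqrt(np.mean((y_prev - target) ** 2))))

print(errors)  # RMSE per stage; non-increasing along the chain
```

Because each stage's feature set contains the previous prediction, the chain refines rather than discards earlier work; in the actual framework, each stage is a physics-informed network trained with the previous stage's output as its low-fidelity input.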