The open-access article was written by Andrea Bonfanti, Roberto Santana, Marco Ellero, and Babak Gholami. The abstract reads as follows:
Generalization is a key property of machine learning models: the ability to perform accurately on unseen data. In scientific machine learning (SciML), however, generalization entails not only predictive accuracy but also the capacity of the model to encapsulate the underlying physical principles. In this paper, we delve into the concept of generalization for physics-informed neural networks (PINNs) by investigating the consistency of a PINN's predictions outside of its training domain. Through the lens of a novel metric and statistical analysis, we study the scenarios in which a PINN can provide consistent predictions outside the region considered for training, and we assess whether the algorithmic setup of the model influences its potential for generalization. Our results highlight why overparametrization is not a crucial component in SciML, and why overfitting on the training data can even be encouraged. Although counterintuitive, the outcome of our analysis serves as a guideline for training PINNs for engineering applications.
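To make the question concrete, the following is a minimal sketch of the kind of experiment the abstract describes: train a PINN on one interval, then compare its accuracy inside and outside that interval. The ODE (u'(x) = cos(x), u(0) = 0, with exact solution sin(x)), the network size, and the simple mean-absolute-error comparison are illustrative assumptions for this sketch, not the paper's actual benchmark problems or its novel consistency metric.

```python
# Illustrative sketch (assumes PyTorch): train a small PINN on [0, pi],
# then evaluate it on (pi, 2*pi] to probe behavior outside the training domain.
import torch

torch.manual_seed(0)

# Small fully connected network approximating u(x)
model = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Collocation points inside the training domain [0, pi]
x_train = torch.linspace(0.0, torch.pi, 128).unsqueeze(1)

for step in range(5000):
    optimizer.zero_grad()
    x = x_train.clone().requires_grad_(True)
    u = model(x)
    # du/dx via autograd: the core of the physics-informed residual
    du = torch.autograd.grad(u, x, grad_outputs=torch.ones_like(u),
                             create_graph=True)[0]
    residual = du - torch.cos(x)           # enforce u'(x) = cos(x)
    bc = model(torch.zeros(1, 1))          # enforce u(0) = 0
    loss = residual.pow(2).mean() + bc.pow(2).mean()
    loss.backward()
    optimizer.step()

# Compare accuracy inside vs. outside the training domain
with torch.no_grad():
    x_in = torch.linspace(0.0, torch.pi, 200).unsqueeze(1)
    x_out = torch.linspace(torch.pi, 2 * torch.pi, 200).unsqueeze(1)
    err_in = (model(x_in) - torch.sin(x_in)).abs().mean()
    err_out = (model(x_out) - torch.sin(x_out)).abs().mean()
print(f"mean abs error inside [0, pi]:    {err_in:.4f}")
print(f"mean abs error outside (pi, 2pi]: {err_out:.4f}")
```

Typically the error outside the training interval grows well beyond the in-domain error; the paper's contribution is a principled metric and statistical analysis of when and why this gap stays small, which this toy comparison does not capture.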