IgaNets: Physics-Informed Machine Learning Embedded Into Isogeometric Analysis

  • Möller, Matthias (Delft University of Technology)


Many engineering problems of practical interest are modelled by systems of partial differential equations equipped with initial and boundary conditions and complemented by problem-specific constitutive laws. For decades, numerical methods such as the finite element method have been the methods of choice for computing approximate solutions to problems that cannot be solved analytically. Starting with the seminal paper on physics-informed neural networks (PINNs), a new paradigm has entered the stage: learning the behavior of the problem instead of discretizing it and solving the resulting (linearized) systems of equations by brute force. Their ease of implementation and their fast response time once training is completed make learning-based methods particularly attractive for engineering applications, as they offer the opportunity to explore many different designs without costly simulations.

In this talk we propose a novel approach that embeds the PINN paradigm into the framework of Isogeometric Analysis (IGA) to combine the best of both worlds. IGA is an extension of the finite element method that integrates simulation-based analysis into the computer-aided geometric design pipeline. In short, the same mathematical formalism, namely (adaptive) B-splines or NURBS, that is used to model the geometry is adopted to represent the approximate solution, which is computed following the same strategy as in classical finite elements. In contrast to classical PINNs, which predict point-wise solution values of (initial-)boundary-value problems directly, our IgaNets learn solutions in terms of their expansion coefficients relative to a given B-spline or NURBS basis. The same approach is furthermore used to encode the geometry and other problem parameters, such as boundary conditions and the parameters of the constitutive laws, which are fed into the feed-forward neural network as inputs.
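The coefficient-based output described above can be sketched in a few lines; the following is a minimal NumPy illustration, not the authors' C++/Torch implementation: the feed-forward network weights are random stand-ins for a trained model, and the two "problem parameters" are hypothetical placeholders for encoded boundary-condition and constitutive data. The point is the data flow: the network emits B-spline expansion coefficients, and the solution is recovered by evaluating the spline basis.

```python
import numpy as np

def bspline_basis(t, knots, degree):
    """All B-spline basis functions of the given degree at parameter t,
    via the Cox-de Boor recursion (clamped knot vector assumed)."""
    # degree-0 basis: indicator functions of the half-open knot spans
    N = np.array([1.0 if knots[i] <= t < knots[i + 1] else 0.0
                  for i in range(len(knots) - 1)])
    if t >= knots[-1]:                      # include the right endpoint
        N[len(knots) - degree - 2] = 1.0
    for p in range(1, degree + 1):
        N_next = np.zeros(len(N) - 1)
        for i in range(len(N_next)):
            d1 = knots[i + p] - knots[i]
            d2 = knots[i + p + 1] - knots[i + 1]
            a = (t - knots[i]) / d1 * N[i] if d1 > 0 else 0.0
            b = (knots[i + p + 1] - t) / d2 * N[i + 1] if d2 > 0 else 0.0
            N_next[i] = a + b
        N = N_next
    return N

rng = np.random.default_rng(0)

# hypothetical problem parameters (e.g. a boundary value and a material constant)
params = np.array([1.0, 0.5])

# untrained toy feed-forward network: problem parameters -> expansion coefficients
n_coef, degree, hidden = 8, 2, 16
W1, b1 = rng.standard_normal((hidden, params.size)), np.zeros(hidden)
W2, b2 = rng.standard_normal((n_coef, hidden)), np.zeros(n_coef)
coefs = W2 @ np.tanh(W1 @ params + b1) + b2

# the prediction is u(t) = sum_i c_i B_i(t), not a pointwise solution value
knots = np.concatenate([np.zeros(degree),
                        np.linspace(0.0, 1.0, n_coef - degree + 1),
                        np.ones(degree)])
ts = np.linspace(0.0, 1.0, 50)
u = np.array([bspline_basis(t, knots, degree) @ coefs for t in ts])
```

A trained IgaNet would replace the random weights above; the spline evaluation step is what distinguishes this output representation from a pointwise PINN.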
Once trained, our IgaNets make it possible to explore various designs from a family of similar problem configurations efficiently, without the need to perform a computationally expensive simulation for each new configuration. Next to discussing the method conceptually and presenting numerical results, we will shed some light on the technical details of our C++ reference implementation in Torch. In particular, we will discuss a matrix-based implementation of B-splines that is particularly well suited for efficient backpropagation.
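To give a flavor of why a matrix-based B-spline formulation suits backpropagation, here is a small NumPy sketch under simplifying assumptions: a single quadratic knot span (so the basis functions reduce to Bernstein polynomials) and a toy least-squares loss, neither of which is taken from the talk. Once the basis is assembled as a matrix over fixed collocation points, spline evaluation is one matrix product and the gradient with respect to the coefficients is one transposed product, exactly the structure reverse-mode autodiff exploits.

```python
import numpy as np

# quadratic Bernstein basis on [0,1] (the B-spline basis of a single knot
# span), precomputed as a matrix over fixed collocation points
ts = np.linspace(0.0, 1.0, 5)
B = np.stack([(1 - ts) ** 2, 2 * ts * (1 - ts), ts ** 2], axis=1)  # (5, 3)

c = np.array([0.0, 1.0, 0.0])   # spline coefficients (the network's outputs)
u = B @ c                        # spline values: a single matrix product

# gradient of 0.5 * ||B c - target||^2 w.r.t. c is a single B^T product
target = np.sin(np.pi * ts)
grad = B.T @ (u - target)

# finite-difference check of one gradient component
eps = 1e-6
c_pert = c.copy()
c_pert[1] += eps
loss = lambda cc: 0.5 * np.sum((B @ cc - target) ** 2)
fd = (loss(c_pert) - loss(c)) / eps
assert abs(fd - grad[1]) < 1e-4
```

Because the basis matrix depends only on the knots and collocation points, it can be assembled once and reused across training iterations; the backward pass then never re-enters the knot-span recursion.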