"Equivariant Machine Learning, Structured Like Classical Physics"
Soledad Villar, PhD
Applied Mathematics and Statistics
Johns Hopkins University
Abstract: There has been enormous progress in the last few years in designing neural networks that respect the fundamental symmetries and coordinate freedoms of physical law. Some of these frameworks make use of irreducible representations, some make use of high-order tensor objects, and some apply symmetry-enforcing constraints. Different physical laws obey different combinations of fundamental symmetries, but a large fraction (possibly all) of classical physics is equivariant to translation, rotation, reflection (parity), boost (relativity), and permutations. Here we show that it is simple to parameterize universally approximating polynomial functions that are equivariant under these symmetries, or under the Euclidean, Lorentz, and Poincaré groups, at any dimensionality d. The key observation is that nonlinear O(d)-equivariant (and related-group-equivariant) functions can be universally expressed in terms of a lightweight collection of scalars — scalar products and scalar contractions of the scalar, vector, and tensor inputs. We complement our theory with numerical examples that show that the scalar-based method is simple, efficient, and scalable.
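To illustrate the key observation in the abstract, here is a minimal sketch (not the speaker's code) of an O(d)-equivariant vector-valued function built purely from scalar products: the output is a linear combination of the input vectors with coefficients that are arbitrary nonlinear functions of the invariant inner products. The coefficient functions `g` and `h` below are hand-picked placeholders; in a learned model they would be parameterized (e.g., by an MLP).

```python
import numpy as np

def equivariant_f(v1, v2):
    """A sketch of an O(d)-equivariant function of two vectors:
    f(v1, v2) = g(scalars) * v1 + h(scalars) * v2,
    where the scalars are the pairwise inner products (O(d)-invariant)."""
    s11, s12, s22 = v1 @ v1, v1 @ v2, v2 @ v2
    # Arbitrary nonlinear scalar coefficients (placeholders for learned functions).
    g = np.tanh(s11 - s22)
    h = np.exp(-s12**2)
    return g * v1 + h * v2

# Numerical check of equivariance: f(Q v1, Q v2) == Q f(v1, v2)
# for a random orthogonal matrix Q.
rng = np.random.default_rng(0)
d = 5
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))  # random orthogonal matrix
v1, v2 = rng.standard_normal(d), rng.standard_normal(d)
lhs = equivariant_f(Q @ v1, Q @ v2)
rhs = Q @ equivariant_f(v1, v2)
print(np.allclose(lhs, rhs))  # True: inner products are unchanged by Q
```

Because the inner products are invariant under any orthogonal transformation, the coefficients g and h are unchanged, and the output rotates exactly as the inputs do — which is the equivariance property.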
Biography: Soledad Villar is an Assistant Professor in the Applied Mathematics and Statistics department at Johns Hopkins University. She co-organizes the MINDS/CIS seminar as well as the AMS seminar. If you want to suggest (and host) a speaker in 2022, you can contact her at email@example.com. We are hoping to have in-person talks soon.