We introduce a kernel-based framework for learning operators between Banach spaces. We show that, even with simple kernels, our approach is competitive in terms of cost-accuracy trade-off, and it matches or beats the performance of neural network (NN) methods on a majority of PDE-based benchmarks. Additionally, our framework offers several advantages inherited from kernel methods: simplicity, interpretability, convergence guarantees, a priori error estimates, and Bayesian uncertainty quantification (UQ). It is, therefore, a natural benchmark for operator learning problems.
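To make the idea concrete, the sketch below shows a toy version of kernel operator learning: kernel ridge regression mapping discretized input functions to discretized output functions. This is an illustrative simplification, not the framework from the abstract; the grid size, kernel, operator (an antiderivative), and regularization parameter are all assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 64                          # number of grid points on [0, 1]
x = np.linspace(0, 1, m)

def sample_u(n):
    # random input functions: sums of five sine modes with Gaussian coefficients
    k = np.arange(1, 6)
    c = rng.normal(size=(n, 5))
    return c @ np.sin(np.pi * np.outer(k, x))

def G(u):
    # toy target operator: cumulative integral (antiderivative) of u on the grid
    return np.cumsum(u, axis=-1) / m

def rbf(A, B, ell=2.0):
    # RBF kernel between discretized functions, using the grid-averaged L2 distance
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1) / m
    return np.exp(-d2 / (2.0 * ell**2))

# kernel ridge regression: V_pred = K(U_test, U_train) (K + lam I)^{-1} V_train
U_train, U_test = sample_u(200), sample_u(20)
V_train, V_test = G(U_train), G(U_test)
lam = 1e-8                      # small regularization for numerical stability
K = rbf(U_train, U_train)
alpha = np.linalg.solve(K + lam * np.eye(len(K)), V_train)
V_pred = rbf(U_test, U_train) @ alpha

rel_err = np.linalg.norm(V_pred - V_test) / np.linalg.norm(V_test)
```

Because the learned map is a linear combination of kernel evaluations against the training inputs, the fit reduces to one linear solve, which is the simplicity the abstract refers to; the same Gaussian-process view of this regression is what yields the Bayesian UQ.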
Matthieu Darcy is a second-year PhD student in the Computing and Mathematical Sciences Department at Caltech, working under the supervision of Houman Owhadi. His research focuses on scientific machine learning, particularly on the application of kernel methods to differential equations and dynamical systems. Before joining Caltech, he obtained an MSc from Imperial College and a Master's from ENS Paris-Saclay.
To become a member of the Rough Path Interest Group, register here for free.