Abstract
In this talk, we present simple ideas for combining nonparametric approaches based on positive definite kernels with deep learning models. There are many good reasons for bridging these two worlds. On the one hand, we want to provide regularization mechanisms and a geometric interpretation for deep learning models, as well as a functional space that allows us to study their theoretical properties (e.g., invariance and stability). On the other hand, we want to bring more adaptivity and scalability to traditional kernel methods, which crucially lack both. We will start this presentation by introducing models to represent graph data, then move to biological sequences and images, showing that our hybrid models can achieve state-of-the-art results on many predictive tasks, especially when large amounts of annotated data are not available. This presentation is based on joint work with Alberto Bietti, Dexiong Chen, and Laurent Jacob.