Chris Eliasmith


Abstract

We have recently proposed a new kind of neural network, the Legendre Memory Unit (LMU), which is provably optimal for compressing streaming time-series data. In this talk, I will describe this network and a variety of state-of-the-art results that have been set using the LMU, including recent results on speech and language applications that demonstrate significant improvements over transformers. I will also discuss variants of the original LMU that scale effectively on current GPUs and hold promise for extremely efficient time-series processing on edge devices.
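
For readers curious about the mechanics behind the abstract: the LMU's memory cell is built around a linear time-invariant system whose state matrices are derived from Legendre polynomials, so that the state holds the coefficients of a sliding window of the input (Voelker, Kajić & Eliasmith, NeurIPS 2019). The snippet below is a minimal NumPy sketch of that memory update only, not the authors' implementation; the order d, window length theta, and simple Euler discretization are illustrative choices.

```python
import numpy as np

def lmu_matrices(d):
    """Continuous-time LMU state matrices (A, B) of order d,
    as defined in Voelker, Kajic & Eliasmith (NeurIPS 2019)."""
    A = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
    q = np.arange(d)
    B = ((2 * q + 1) * (-1.0) ** q).reshape(d, 1)
    return A, B

def lmu_memory(u, d=8, theta=1.0, dt=1e-3):
    """Compress a streaming signal u into d Legendre coefficients
    representing a sliding window of length theta (Euler step dt)."""
    A, B = lmu_matrices(d)
    Ad = np.eye(d) + (dt / theta) * A   # discretized state transition
    Bd = (dt / theta) * B               # discretized input matrix
    m = np.zeros((d, 1))
    states = []
    for u_t in u:                       # one update per incoming sample
        m = Ad @ m + Bd * u_t
        states.append(m.ravel().copy())
    return np.array(states)

# Example: compress 1 second of a noisy sine into 8 coefficients per step.
t = np.arange(0, 1, 1e-3)
u = np.sin(2 * np.pi * 3 * t) + 0.1 * np.random.randn(t.size)
coeffs = lmu_memory(u, d=8, theta=0.5, dt=1e-3)
print(coeffs.shape)  # (1000, 8)
```

The key point of the design is that the d-dimensional state is a fixed-size summary of the most recent theta seconds of input, which is what makes the compression "streaming": memory cost does not grow with sequence length.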

Our speaker

Professor Chris Eliasmith holds the Canada Research Chair in Theoretical Neuroscience. He is the co-inventor of the Neural Engineering Framework (NEF), the Nengo neural development environment, and the Semantic Pointer Architecture (SPA), all of which are dedicated to leveraging our understanding of the brain to advance AI efficiency and scale. His team has developed Spaun, the world's largest functional brain simulation, and he won the prestigious NSERC Polanyi Award for this research. Chris has published two books and over 120 journal articles and patents. He is jointly appointed in the Philosophy and Systems Design Engineering departments, as well as being cross-appointed to Computer Science. He is the founding director of the Centre for Theoretical Neuroscience (CTN) at the University of Waterloo. Chris has a Bacon–Erdős number of 8.