Physics-inspired learning on graphs

Posted by: Michael Bronstein    Date: 2024-09-24

Seminar Information

Title: Physics-inspired learning on graphs

Speaker: Michael Bronstein, Chair Professor, University of Oxford

Time: Wednesday, July 3, 2024, 15:00-16:30

Venue: Room 401, Old Administration Building, Minhang Campus, Shanghai Jiao Tong University

Host: Junchi Yan, Professor, School of Artificial Intelligence, Shanghai Jiao Tong University


Biography

Michael Bronstein, University of Oxford

Michael Bronstein is the DeepMind Professor of AI at the University of Oxford. He previously served as Head of Graph Learning Research at Twitter, professor at Imperial College London, and held visiting appointments at Stanford, MIT, and Harvard. He is the recipient of the Royal Society Wolfson Research Merit Award, Royal Academy of Engineering Silver Medal, Turing World-Leading AI Research Fellowship, five ERC grants, two Google Faculty Research Awards, and two Amazon AWS ML Research Awards. He is a Member of the Academia Europaea, Fellow of IEEE, IAPR, BCS, and ELLIS, ACM Distinguished Speaker, and World Economic Forum Young Scientist. In addition to his academic career, Michael is a serial entrepreneur and founder of multiple startup companies, including Novafora, Invision (acquired by Intel in 2012), Videocites, and Fabula AI (acquired by Twitter in 2019). He is the Chief Scientist at VantAI and scientific advisor at Recursion Pharmaceuticals.


Abstract

The message-passing paradigm has been the “battle horse” of deep learning on graphs for several years, making graph neural networks (GNNs) a big success in a wide range of applications, from particle physics to protein design. From a theoretical viewpoint, it established the link to the Weisfeiler-Lehman hierarchy, making it possible to analyse the expressive power of GNNs. We argue that the very “node-and-edge”-centric mindset of current graph deep learning schemes may hinder future progress in the field. As an alternative, we propose physics-inspired “continuous” learning models that open up a new trove of tools from the fields of differential geometry, algebraic topology, and differential equations, so far largely unexplored in graph ML.
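
To give a flavour of the “continuous” viewpoint mentioned in the abstract, the sketch below evolves node features by an explicit Euler discretisation of the heat equation on a graph, dX/dt = -LX, with L the symmetric normalised Laplacian. This is only an illustrative toy (the function name graph_heat_diffusion, the step size, and the toy path graph are choices made here), not the speaker's actual models, which build on more general diffusion-type and topological constructions.

```python
import numpy as np

def graph_heat_diffusion(adjacency, features, step_size=0.1, num_steps=10):
    """Illustrative sketch: explicit Euler integration of the graph heat
    equation dX/dt = -L X, where L is the symmetric normalised Laplacian.
    Each step diffuses (smooths) node features along the edges."""
    degrees = adjacency.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(degrees, 1e-12)))
    laplacian = np.eye(adjacency.shape[0]) - d_inv_sqrt @ adjacency @ d_inv_sqrt

    x = features.copy()
    for _ in range(num_steps):
        x = x - step_size * laplacian @ x  # one Euler step of dX/dt = -L X
    return x

# Toy example: a 4-node path graph with a single scalar feature per node.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.array([[1.0], [0.0], [0.0], [0.0]])
print(graph_heat_diffusion(A, X))
```

Running the toy example shows the initial mass at the first node spreading towards its neighbours, which is the basic intuition behind treating GNN layers as discretised steps of a diffusion process.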