Abstract
This project is situated in graph learning, an increasingly popular area of machine learning,
and focusses on the development of a theoretical framework for designing and analyzing expressive,
yet efficient, graph neural networks.
In spite of advances in hardware, efficiency remains a key consideration when designing graph neural
networks. As a consequence, most graph neural networks use update functions that require only an amount
of computation linear in the size of the input graph. Such networks, however, can only learn relatively simple functions.
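To make the linear-cost regime concrete (this sketch is ours, not part of the project), one layer of standard message passing touches every edge once and every node once, so it runs in O(|V| + |E|) time. All function and variable names below are chosen purely for illustration.

```python
# Minimal sketch of one message-passing layer with linear cost,
# O(|V| + |E|): each edge contributes one message. Real GNNs use
# learned weights and tensor libraries; names here are illustrative.

def message_passing_layer(features, edges):
    """features: dict node -> float; edges: list of (u, v) pairs."""
    # Aggregate: each node sums the features of its in-neighbours.
    aggregated = {v: 0.0 for v in features}
    for u, v in edges:               # one pass over the edges: O(|E|)
        aggregated[v] += features[u]
    # Update: combine each node's own feature with the aggregate.
    return {v: features[v] + aggregated[v] for v in features}  # O(|V|)

# Example: a path graph 0 - 1 - 2, with edges in both directions.
feats = {0: 1.0, 1: 2.0, 2: 3.0}
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
print(message_passing_layer(feats, edges))  # {0: 3.0, 1: 6.0, 2: 5.0}
```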
More expressive graph neural networks, capable of learning more complex functions, have been proposed, but their applicability is limited: they require quadratic (or more) computation, which is infeasible on large graph datasets.
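For contrast, here is a minimal sketch of the quadratic-or-worse regime, in the spirit of higher-order, 2-WL-style networks (again our illustration, not a method named in the abstract): features now live on all ordered pairs of vertices, so storage alone is Θ(n²) and the layer below costs Θ(n³).

```python
# Sketch of a pairwise (2-tuple) update as in higher-order GNNs:
# features live on vertex *pairs*, so storage alone is Theta(n^2).
# All names are illustrative.

def pairwise_layer(pair_features, nodes):
    """pair_features: dict (u, v) -> float over all ordered pairs."""
    updated = {}
    for u in nodes:
        for v in nodes:              # Theta(n^2) pairs to update
            # Combine (u, v) with its neighbouring pairs (u, w), (w, v);
            # the inner sum makes the layer Theta(n^3) overall, as in 2-WL.
            neigh = sum(pair_features[(u, w)] + pair_features[(w, v)]
                        for w in nodes)
            updated[(u, v)] = pair_features[(u, v)] + neigh
    return updated

nodes = range(3)
pairs = {(u, v): float(u == v) for u in nodes for v in nodes}
print(pairwise_layer(pairs, nodes)[(0, 1)])  # 2.0
```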
In this project, we aim to understand what graph neural networks can achieve *in between* these linear and
quadratic costs. We propose to formalize and analyze sub-quadratic graph neural networks: networks that
remain feasible (their cost is less than quadratic) yet are more powerful than what linear networks can
achieve. Furthermore, a number of very recent graph neural networks already fall into this sub-quadratic category.
Apart from developing a mathematical framework for sub-quadratic graph neural networks, we also study
their capabilities, from both a theoretical and a practical point of view.
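The abstract deliberately leaves open what such networks look like. Purely as a hypothetical illustration of a cost strictly between linear and quadratic, the sketch below updates only about n·√n sampled vertex pairs instead of all n²; the sampling scheme and every name in it are our assumptions, not the project's method.

```python
# Hypothetical illustration only: restrict a pairwise update to a
# random sample of about n * sqrt(n) pairs, giving O(n^1.5) cost,
# strictly between linear and quadratic. This sampling scheme is an
# assumption for exposition, not a method proposed by the project.
import math
import random

def subquadratic_pairs(nodes, seed=0):
    rng = random.Random(seed)
    n = len(nodes)
    k = max(1, math.isqrt(n))            # ~sqrt(n) partners per node
    sampled = []
    for u in nodes:
        for v in rng.sample(nodes, k):   # O(n * sqrt(n)) pairs in total
            sampled.append((u, v))
    return sampled

nodes = list(range(100))
print(len(subquadratic_pairs(nodes)))    # 1000 pairs, not 100^2 = 10000
```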