arXiv Preprint
Deploying graph neural networks (GNNs) on whole-graph classification or
regression tasks is known to be challenging: it often requires computing node
features that are mindful of both local interactions within each node's neighbourhood and
the global context of the graph structure. GNN architectures that navigate this
space need to avoid pathological behaviours, such as bottlenecks and
oversquashing, while ideally having linear time and space complexity
requirements. In this work, we propose an elegant approach based on propagating
information over expander graphs. We provide an efficient method for
constructing expander graphs of a given size, and use this insight to propose
the EGP model. We show that EGP is able to address all of the above concerns
while requiring minimal effort to set up, and we provide evidence of its
empirical utility on relevant datasets and against baselines from the Open
Graph Benchmark.
Importantly, using expander graphs as a template for message passing
necessarily gives rise to negative curvature. While this appears to be
counterintuitive in light of recent related work on oversquashing, we
theoretically demonstrate that negatively curved edges are likely to be
required to obtain scalable message passing without bottlenecks. To the best of
our knowledge, this connection has not previously been studied in the context of
graph representation learning, and we believe our analysis paves the way for a
novel class of scalable methods to counter oversquashing in GNNs.
Andreea Deac, Marc Lackenby, Petar Veličković
2022-10-06
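
The abstract does not specify how the expander graphs are constructed, so the
sketch below is purely illustrative. One classical expander family is the
Cayley graphs of the special linear groups SL(2, Z_n) over a fixed generating
set; whether this matches the construction actually used for EGP is an
assumption here, as is the `egp_propagate` interleaving scheme and the
hypothetical `layer` callable it relies on.

```python
import numpy as np


def sl2_cayley_expander(n: int):
    """Cayley graph of SL(2, Z_n) over a standard 4-element generating set.

    For fixed generators, these Cayley graphs form a classical 4-regular
    expander family. This construction is an illustrative assumption; the
    abstract does not name the one used by EGP.
    """
    gens = [
        np.array([[1, 1], [0, 1]]),      # elementary matrix
        np.array([[1, n - 1], [0, 1]]),  # its inverse mod n
        np.array([[1, 0], [1, 1]]),      # transposed elementary matrix
        np.array([[1, 0], [n - 1, 1]]),  # its inverse mod n
    ]
    identity = ((1, 0), (0, 1))
    index = {identity: 0}   # group element -> node id
    frontier = [identity]
    edges = []
    # Breadth-first enumeration of the finite group from the identity:
    # each element receives a node id exactly once, and each node
    # contributes one directed edge per generator, so the resulting
    # graph is 4-regular.
    while frontier:
        nxt = []
        for elem in frontier:
            m = np.array(elem)
            for g in gens:
                h = tuple(map(tuple, (m @ g) % n))
                if h not in index:
                    index[h] = len(index)
                    nxt.append(h)
                edges.append((index[elem], index[h]))
        frontier = nxt
    return len(index), edges


def egp_propagate(x, graph_edges, expander_edges, layers):
    """Sketch of expander-interleaved propagation (an assumption about how
    the template might be used, not the paper's exact scheme): alternate
    message passing over the input graph and over the expander template,
    assuming expander node ids 0..N-1 are aligned with (a padding of) the
    input graph's nodes."""
    for i, layer in enumerate(layers):  # `layer`: any message-passing layer
        edges = graph_edges if i % 2 == 0 else expander_edges
        x = layer(x, edges)
    return x


# Example: SL(2, Z_5) has 120 elements, so this yields a 120-node,
# 4-regular expander template with 4 * 120 directed edges. To cover an
# input graph, one would pick the smallest n whose Cayley graph has at
# least as many nodes.
num_nodes, expander_edges = sl2_cayley_expander(5)
assert num_nodes == 120 and len(expander_edges) == 4 * num_nodes
```

Because an expander of this kind has constant degree and logarithmic diameter,
interleaving its edges with the input graph's edges would give every pair of
nodes a short communication route using only linearly many edges, which is
consistent with the abstract's stated goal of scalable, bottleneck-free message
passing.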