Graph Neural Networks and Transformer workshop
Graph Neural Networks and Transformers are neural network architectures that are quickly gaining popularity, because many problems can naturally be modeled as graphs and sets. In this workshop we take a deep dive into these architectures and how you can use them to solve complex problems where the inputs can vary in size.
This course assumes familiarity with deep learning: attendees should have implemented neural networks in PyTorch before. It also assumes a basic understanding of linear algebra. Experience working with graphs will make the material easier to follow, but is not a prerequisite.
The material is divided into four main blocks:
Who is the course for?
This course is aimed at students, researchers, engineers and programmers who are already familiar with implementing neural networks in PyTorch and want to understand the details of graph neural networks and transformers. The focus is on explaining the architectures and the choices involved in implementing a GNN; it does not cover in detail how to model diverse problems using these frameworks. The lessons assume that the participant is familiar with:
Programming in PyTorch - having implemented and trained a simple neural network
Linear algebra - Basic understanding of linear transformations and vector arithmetic
Machine learning - Basic understanding of training procedures and data issues
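As a quick self-check of the PyTorch prerequisite, you should be comfortable reading and writing a training loop like the sketch below. The model, data and hyperparameters here are purely illustrative and not part of the workshop material:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic regression data: y = 3x + 1 plus a little noise
x = torch.randn(64, 1)
y = 3 * x + 1 + 0.1 * torch.randn(64, 1)

# A small MLP and the usual optimizer/loss setup
model = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

# Standard training loop: zero gradients, forward, backward, step
for step in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(loss.item())  # the loss should be small after training
```

If every line here is familiar, you have the background the lessons build on.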
About the course
The lesson material is licensed under CC-BY-SA-4.0 and can be reused in any form (with appropriate credit) in other courses and workshops. Instructors who wish to teach this lesson can refer to the Instructor’s guide for practical advice.
The workshop material is based on Jupyter notebooks designed for Google Colaboratory. You can find links to the notebooks under each lesson block. Before starting to work on the notebooks, you should create your own copy via File->Save a copy in Drive. This way the changes you make to a notebook will persist; otherwise they will be lost when you close the Colab window.
There is a growing number of resources for graph neural networks; here are some good places to start:
Stanford CS224W: Machine Learning with Graphs. This material covers all manner of statistical learning on graphs, as well as many fundamental topics from graph theory. The lectures and notebook exercises from previous offerings of the course are available online.
Geometric Deep Learning: The Erlangen Programme of ML: This work by Bronstein et al. takes a unifying perspective on Graph Neural Networks and shows how they encapsulate very general ideas in deep learning. There are multiple treatments, all of which are excellent:
An invited talk at ICLR 2021 lays out the ideas.
The pre-print book Geometric Deep Learning: Grids, Groups, Graphs, Geodesics and Gauges expands on this.
A series of video lectures covers the material in depth.
PyTorch Geometric: This framework is quickly becoming the de facto package for working with Graph Neural Networks. It doesn't support the Transformer approach we show in this workshop, but for regular Graph Neural Networks it's a high-level framework with efficient kernels for sparse computation.
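To give a rough feel for the kind of sparse computation such kernels perform, a single GCN-style neighbourhood aggregation can be written in plain PyTorch. This is a simplified sketch on a toy graph, not PyTorch Geometric's actual implementation:

```python
import torch

# Toy undirected graph on 4 nodes with edges 0-1, 1-2, 2-3,
# stored as a sparse adjacency matrix with self-loops added.
edges = torch.tensor([[0, 1, 1, 2, 2, 3, 0, 1, 2, 3],
                      [1, 0, 2, 1, 3, 2, 0, 1, 2, 3]])
values = torch.ones(edges.shape[1])
adj = torch.sparse_coo_tensor(edges, values, (4, 4))

# Symmetric normalisation D^{-1/2} A D^{-1/2} (as in the GCN of Kipf & Welling):
# each edge weight becomes 1 / sqrt(deg[src] * deg[dst])
deg = torch.sparse.sum(adj, dim=1).to_dense()
norm = deg.rsqrt()
normed_values = values * norm[edges[0]] * norm[edges[1]]
adj_norm = torch.sparse_coo_tensor(edges, normed_values, (4, 4))

# One GCN layer: sparse aggregation of neighbour features,
# followed by a learned linear map and a nonlinearity.
x = torch.randn(4, 8)              # node features, one row per node
linear = torch.nn.Linear(8, 16)
out = torch.relu(linear(torch.sparse.mm(adj_norm, x)))
print(out.shape)                   # one 16-dim embedding per node
```

Libraries like PyTorch Geometric replace the `torch.sparse.mm` step with specialised scatter/gather kernels, which scale to graphs with millions of edges.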