Logistics
Instructor: Arlei Silva
Lectures: Mondays, 2:00-2:50, room: MEB 128 (Mechanical Engineering Building)
Summary
Machine learning on graphs (MLG) is an exciting and growing research topic for two main reasons: (1) many relevant real-world problems (in recommendation, infrastructure, healthcare, etc.) have structure that can be captured as nodes, edges, and their attributes; and (2) machine learning has been the hottest topic in computer science in the past decade, and it has impacted the entire field. In this seminar, we will discuss some of the most recent papers on deep learning on graphs (Graph Neural Networks), which has arguably become the go-to approach for MLG. Many of these developments can be posed as 'How can I do X on a graph?', where 'X' might be convolution, pooling, reinforcement learning, etc. However, deep learning also provides a common framework with the potential to solve most (if not all) problems in network science, graph mining, and even graph algorithms by fitting well-designed functions to data (sometimes lots of it). In the last six years, at least three books, 20 surveys, and hundreds of journal and conference papers supporting this claim have been published. This seminar is an opportunity to read and discuss the state of the art in machine learning on graphs.
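To make the idea of 'doing X on a graph' concrete, here is a minimal sketch of one graph-convolution (GCN-style) layer, where each node updates its features by averaging over its neighborhood and applying a learned linear map. The function name, toy graph, and weights are illustrative assumptions, not taken from any specific paper on the reading list.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN-style layer: ReLU(A_norm @ H @ W).

    A: (n, n) adjacency matrix, H: (n, d) node features, W: (d, d') weights.
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1)                   # node degrees
    D_inv_sqrt = np.diag(deg ** -0.5)         # D^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0)      # ReLU nonlinearity

# Toy graph: a path 0-1-2 with 2-dimensional node features.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
W = np.eye(2)  # identity weights, so the layer only smooths features
H_new = gcn_layer(A, H, W)
print(H_new.shape)  # (3, 2)
```

Stacking several such layers, and varying how neighborhoods are aggregated, is the design space behind many of the architectures discussed in this seminar.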
Credit Hours: 1-3
This is a one-credit seminar. It can be converted to 3 credits with the addition of a class project or a (high-quality) survey (both subject to the instructor's approval).
Mission
Read and discuss recent papers on machine learning on graphs.
Target Audience
 Graduate students working on machine learning, network science, graphs, etc. who want to broaden their knowledge of machine learning with graphs;
 Undergraduate students who want to be exposed to machine learning beyond the classical problems (clustering, classification, regression, etc.);
 Anyone else who is interested.
Recommended Prerequisites
A course on machine learning (some examples are COMP 502, COMP 540, COMP 542, COMP 576, COMP 602, COMP 6403, COMP 680).
Course Materials
The seminar will be focused on research papers that are publicly available.
Format
The seminar will be composed of two introductory lectures followed by paper presentations with discussions. Each presentation will be 30 minutes long and will be followed by 20 minutes of discussion. Each student should submit reviews for five papers of their choice during the semester. The reviews should follow the NeurIPS'22 guidelines and should be submitted by the day before the presentation.
Tips for a good presentation:
 Provide the relevant background;
 Make sure the problem, motivation, and assumptions are clear;
 Skip the details and focus on the major contribution;
 Don't limit yourself to the paper; make connections with other work;
 Criticize the paper and point out how it could be improved.
You can contribute to the discussion in the following ways:
 Complementing the presentation with relevant information;
 Criticizing the paper in terms of contributions, soundness, presentation, ethics, etc.;
 Asking questions you had while reading the paper;
 Proposing novel research based on the paper;
 Suggesting other papers for further reading;
 Commenting on how the paper fits the authors' broader research agenda.
Grading
Grading will be Pass/Fail. To pass, students are expected to:
 Read each paper presented;
 Write a review for 5 out of 10 papers;
 Participate in the paper discussions;
 Present at least one paper.
Logistics
Paper reviews and slides should be submitted on Canvas. Questions should be submitted on Piazza. For personal matters, please email the instructor.
Rice Honor Code
Students are expected to adhere to the Rice Honor Code. You are encouraged to collaborate and to find resources online. However, all material to be graded is expected to be original. The occasional use of someone else's work, such as a figure, must be properly attributed.
Students with Disabilities
If you have a documented disability that may affect academic performance, you should: 1) make sure this documentation is on file with Disability Support Services (Allen Center, Room 111 / adarice@rice.edu / x5841) to determine the accommodations you need; and 2) meet with me to discuss your accommodation needs.
Related Courses at Rice
 COMP 559: Machine Learning with Graphs (taught by Prof. Arlei Silva in the Spring);
 ELEC 573: Network Science and Analytics (taught by Prof. Segarra in the Fall);
 ECE 677-001: Distributed Optimization and ML (taught by Prof. Uribe in the Fall);
 CAAM 570: Graph Theory (taught by Prof. Hicks in the Spring).
Schedule
 08/22: Logistics, overview, intro
 08/29: Graph Neural Networks
 09/05: Labor Day (no class)
 09/12: Paper presentation 1
 09/19: Paper presentation 2
 09/26: Paper presentation 3
 10/03: Paper presentation 4
 10/10: Midterm recess
 10/17: Paper presentation 5
 10/24: Paper presentation 6
 10/31: Paper presentation 7
 11/07: Paper presentation 8
 11/14: Paper presentation 9
 11/21: Paper presentation 10
 11/28: Discussion
Suggested topics and papers

Architecture: e.g. pooling, aggregation, depth
 Yang et al. Graph neural networks inspired by classical iterative algorithms. ICML, 2021
 Alon and Yahav. On the bottleneck of graph neural networks and its practical implications. ICLR, 2020

Theory: e.g. expressive power, invariances
 Ganea et al. Independent SE(3)-equivariant models for end-to-end rigid protein docking. ICLR, 2021
 Xu et al. Optimization of graph neural networks: Implicit acceleration by skip connections and more depth. ICML, 2021

Supervised learning: e.g. node classification, graph classification, link prediction
 Qu et al. Neural structured prediction for inductive node classification. ICLR, 2021
 Baek et al. Accurate learning of graph representations with graph multiset pooling. ICLR, 2020

Scalability and systems: e.g. distributed algorithms, simplified architectures
 Sriram et al. Towards training billion parameter graph neural networks for atomic simulations. ICLR, 2021
 Kaler et al. Accelerating training and inference of graph neural networks with fast sampling and pipelining. MLSys, 2022

Fairness and privacy: e.g. information leakage, private computation
 Agarwal et al. Towards a unified framework for fair and stable graph representation learning. UAI, 2021
 Liao et al. Information obfuscation of graph neural networks. ICML, 2021

Reinforcement learning: e.g. policy learning
 Paliwal et al. Reinforced genetic algorithm learning for optimizing computation graphs. ICLR, 2019
 Trivedi et al. GraphOpt: Learning optimization models of graph formation. ICML, 2020.

Alternatives: e.g. label propagation, Markov Random Fields, transformers
 Ying et al. Do transformers really perform badly for graph representation? NeurIPS, 2021.
 Huang et al. Combining label propagation and simple models outperforms graph neural networks. ICLR, 2020

Interpretability: e.g. GNN explainers
 Miao et al. Interpretable and generalizable graph learning via stochastic attention mechanism. ICML, 2022.

Generalizations: e.g. hypergraphs, heterogeneous graphs, dynamic graphs
 Chen et al. TAMP-S2GCNets: Coupling time-aware multipersistence knowledge representation with spatio-supra graph convolutional networks for time-series forecasting. ICLR, 2021.
 Bodnar et al. Weisfeiler and Lehman go topological: Message passing simplicial networks. ICML, 2021.

Unsupervised learning: e.g. autoencoders, selfsupervised learning
 Hassani and Khasahmadi. Contrastive multiview representation learning on graphs. ICML, 2020.
 Zhang and Li. Nested graph neural networks. NeurIPS, 2021.

Adversarial learning: e.g. attacks, robustness
 Wu et al. Graph information bottleneck. NeurIPS, 2020.
 Xi et al. Graph backdoor. USENIX Security, 2021.

Healthcare and bioinformatics: e.g. drug discovery, diagnosis, epidemics forecasting
 Kim et al. Learning dynamic graph representation of brain connectome with spatiotemporal attention. NeurIPS, 2021

NLP and vision: e.g. translation, word embeddings, question answering, text classification
 Shen et al. Unsupervised dependency graph network. ACL, 2022.
 Meng et al. GNN-LM: Language modeling based on global contexts via GNN. ICLR, 2021.

Knowledge graphs: e.g. completion, alignment, reasoning
 Yu et al. KG-FiD: Infusing knowledge graph in Fusion-in-Decoder for open-domain question answering. ACL, 2022.
 Zhang et al. GreaseLM: Graph reasoning enhanced language models. ICLR, 2021.

Traffic and mobility: e.g. traffic forecasting, trajectory prediction
 Wang et al. Metro passenger flow prediction via dynamic hypergraph convolution networks. IEEE Transactions on Intelligent Transportation Systems, 2021.
 Bai et al. Adaptive graph convolutional recurrent network for traffic forecasting. NeurIPS, 2020.

Physical sciences: e.g. physical reasoning, simulation, PDE solving
 Lienen and Günnemann. Learning the dynamics of physical systems from sparse observations with finite element networks. ICLR, 2021.

Program analysis: e.g. type inference, code summarization, bug detection
 Dinella et al. Hoppity: Learning graph transformations to detect and fix bugs in programs. ICLR, 2020.
 Pashakhanloo et al. CodeTrek: Flexible modeling of code using an extensible relational representation. ICLR, 2021.

Hyperparameter tuning and AutoML: e.g. architecture search, graph learning
 Chen et al. A unified lottery ticket hypothesis for graph neural networks. ICML, 2021.
 Zhang et al. Deep and flexible graph neural architecture search. ICML, 2022.

Robotics: e.g. motion planning
 Zhou et al. Multirobot collaborative perception with graph neural networks. IEEE Robotics and Automation Letters, 2022.
 Yu and Gao. Reducing collision checking for sampling-based motion planning using graph neural networks. NeurIPS, 2021.

Algorithms: e.g. learning graph algorithms
 Bai et al. GLSearch: Maximum common subgraph detection via learning to search. ICML, 2021.
 Meirom et al. Optimizing tensor network contraction using reinforcement learning. ICML, 2022.
Additional references
 William L. Hamilton. Graph representation learning. Synthesis Lectures on Artificial Intelligence and Machine Learning. Morgan & Claypool Publishers, 2020
 Yao Ma, Jiliang Tang. Deep Learning on Graphs. Cambridge University Press, 2020
 Zonghan Wu, Shirui Pan, Fengwen Chen, Guodong Long, Chengqi Zhang, Philip S. Yu. A Comprehensive Survey on Graph Neural Networks. arXiv, 2019
 Ziwei Zhang, Peng Cui, Wenwu Zhu. Deep Learning on Graphs: A Survey. arXiv, 2018
 Ines Chami, Sami Abu-El-Haija, Bryan Perozzi, Christopher Ré, Kevin Murphy. Machine learning on graphs: A model and comprehensive taxonomy. arXiv, 2020
 Faezeh Faez, Yassaman Ommi, Mahdieh Soleymani Baghshah, Hamid R. Rabiee. Graph deep learning: State of the art and challenges. IEEE Access 2021
 N. A. Asif et al. Graph Neural Network: A Comprehensive Review on Non-Euclidean Space. IEEE Access, 2021
Other resources