This presentation explains the basic ideas of graph neural networks (GNNs) and their common applications. The primary target audiences are students, engineers, and researchers who are new to GNNs but interested in using them for their projects. This is a modified version of the course material for a special lecture on Data Science at Nara Institute of Science and Technology (NAIST), given by Preferred Networks researcher Katsuhiko Ishiguro, PhD.
1. An Introduction to Graph Neural Networks: basics and applications
Katsuhiko ISHIGURO, Ph.D. (Preferred Networks, Inc.)
Oct. 23, 2020
Modified from the course material of: Nara Institute of Science and Technology, Data Science Special Lecture
2. Take home message
• Graph Neural Networks (GNNs): Neural Networks (NNs) that compute node representations of graph-structured data
• Practical applications in industry, hard competitions in academia
• Model: the fundamental building block is an approximated Graph Convolution
• Applications: applicable to several tasks in different domains
3. Table of Contents (1/2)
• Graph Neural Networks (GNNs): what are they?
  – Graph-structured data
  – Overview of GNNs: function, applications
• The basics and the de-facto standard: GCN
  – Graph Convolutions and the GCN model
  – Exemplar task: semi-supervised node classification with GCN
4. Table of Contents (2/2)
• Application in Chemo-Informatics: Protein Interface Prediction
  – A straightforward application of GNNs with graphs as I/O
• Application in Computer Vision: Scene Graph Generation
  – A task-tailored GNN model
• Theoretical issues of GNNs
  – Deep GNNs do not work well
  – The representation power of GNNs is upper-bounded
• Conclusion and Materials
5. Acknowledgements
Many thanks for helpful comments and provided materials!!
Daisuke Okanohara, Shin-ichi Maeda, Kentaro Minami, Yuta Tsuboi, Sousuke Kobayashi, Kenta Oono, Hiroshi Maruyama (Preferred Networks)
6. Graph: for relational data
• Graph
  – Set of nodes (vertices)
  – Set of edges
  – directional / non-directional
• Especially interested in attributed graphs
  – Nodes and/or edges have features (labels, numbers, vectors)
7. Matrix representation of an attributed graph
• Suitable for formulations/implementations of GNNs (see the sketch below)
• Feature matrix X: N nodes, D-dim. features (N x D)
• Adjacency matrix A: edge existence between N x N node pairs, e.g.

    0 1 1 0 0
    1 0 0 1 1
    1 0 0 0 1
    0 1 0 0 0
    0 1 1 0 0

• Symmetric A <-- non-directional edges
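As a concrete illustration, a minimal numpy sketch of the matrices above (the 3-dim node features are hypothetical placeholders):

```python
import numpy as np

# Adjacency matrix A of the 5-node example above (symmetric: non-directional edges)
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 0, 1, 1],
    [1, 0, 0, 0, 1],
    [0, 1, 0, 0, 0],
    [0, 1, 1, 0, 0],
], dtype=float)

N = A.shape[0]
D = 3  # hypothetical feature dimension
rng = np.random.default_rng(0)
X = rng.normal(size=(N, D))  # feature matrix: one D-dim attribute vector per node

assert np.allclose(A, A.T)  # symmetric <=> non-directional edges
```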
9. Interface for intelligent processes on graphs
• H is an informative and easy-to-handle "data" representation for machine-learning (ML) models/tools
[Figure: an L-layered GNN maps a graph to H, informative features in an ML-friendly structure (a set of vectors), which feed classification, regression, interpretation, and generative models]
10. Application domains of GNNs: concrete applications
• Chemical molecular graphs [Jin18JTVAE, Laugier18CGCNN]
• Social networks [http://konect.uni-koblenz.de/networks/ucidata-zachary]
• Scene graphs [Qi19AttentiveRN]
• Point clouds [Landrieu_Boussaha19PCS]
• Physical simulators [Sanchez-Gonzalez20GNS]
• Knowledge graphs [DeCao19QA2]
11. GNNs in machine learning research
[Bar chart: numbers of accepted papers with "Graph" in the title at ICML, ICLR, and NeurIPS, 2014-2020; counted by the lecturer]
• ~2016: "graphical models", "graph kernels", and pure graph theory works
• 2017~: GNN works add up the counts
13. Convolution
• Modify a signal f by a filter g, accumulating over the "shift" t along the coordinate (axis) x
• 1D signal example [abcpedia.acoustics.jp]: f = sound signal, g = impulse response => hall-resonated sound (reverb), i.e. f * g
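In symbols, the standard 1D (continuous) convolution that this slide describes:

```latex
(f * g)(x) = \int_{-\infty}^{\infty} f(t)\, g(x - t)\, dt
```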
14. E.g. ConvNets in computer vision [Krizhevsky12ConvNet]
• 2012 ILSVRC competition [http://image-net.org/challenges/LSVRC/2012/ilsvrc2012.pdf]
• ConvNet (AlexNet) [Krizhevsky12ConvNet] won with a ~10% margin, while the former (non-DNN) top runners were separated by > 1% differences
15. E.g. ConvNets in computer vision [Krizhevsky12ConvNet]
• L iterated conv. operations on image pixels [Kipf18talk]
• The image signal (feature) at pixel (x, y) of layer l is a filtered sum of shifted layer l-1 signals around (x, y): roughly h^l(x, y) = σ( Σ_{(dx, dy)} g(dx, dy) h^{l-1}(x + dx, y + dy) )
16. How can we define Conv. on graphs?
• No "axes" in graphs, unlike sound signals (time) and image signals (x, y)
• What is the "coordinate"? What is the "shift"?
=> Spectral approach on graphs
17. Fourier Transform and Convolution
• Convolution in the signal (spatial) domain = multiplication in the spectral (frequency) domain
• 1D case: project the signal onto the k-th frequency's Fourier basis function (see below)
• For graphs: what is the signal? How is the FT defined?
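For reference, the 1D statement written out (standard Fourier analysis; \hat{f} denotes the Fourier transform of f):

```latex
\hat{f}(k) = \int_{-\infty}^{\infty} f(x)\, e^{-ikx}\, dx,
\qquad
\widehat{f * g}(k) = \hat{f}(k)\, \hat{g}(k)
```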
20. Graph Fourier Transform = multiply eigenvectors of the (normalized) graph Laplacian
• Eigendecompose the normalized graph Laplacian L = U Λ U^T:
  – the i-th eigenvalue λ_i = the frequency of the i-th basis
  – the i-th eigenvector u_i = the i-th graph Fourier basis
• Graph Fourier Transform (GFT): x̂ = U^T x, the graph signal in spectral (frequency) space
  – inner products between the graph signal (x) and the Fourier bases (U)
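A minimal numpy sketch of the GFT on the 5-node example graph from slide 7 (assuming the symmetric normalized Laplacian; variable names are mine):

```python
import numpy as np

A = np.array([[0,1,1,0,0],[1,0,0,1,1],[1,0,0,0,1],
              [0,1,0,0,0],[0,1,1,0,0]], dtype=float)
deg = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
L = np.eye(5) - D_inv_sqrt @ A @ D_inv_sqrt  # normalized graph Laplacian

# Eigendecomposition: eigenvalues = "frequencies", eigenvectors = Fourier bases
lam, U = np.linalg.eigh(L)

x = np.array([1.0, -1.0, 0.5, 0.0, 2.0])  # a 1D graph signal (one scalar per node)
x_hat = U.T @ x          # Graph Fourier Transform
x_back = U @ x_hat       # inverse GFT recovers the signal
assert np.allclose(x, x_back)
```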
22. Graph Convolution via GFT
• Convolution is a multiplication in Fourier space, and likewise under the GFT:
  1. take the GFT of the graph signal: x̂ = U^T x
  2. convolution = simple multiplication in the spectral domain: ĝ ⊙ x̂
  3. apply the inverse GFT to get the conv-filtered graph signal: g * x = U (ĝ ⊙ x̂)
23. Naïve Graph Convolution
• Treat the filter's spectral coefficients as trainable parameters:
  g_w * x = U diag(w) U^T x
  – U is fixed: eigenvectors of the (normalized) graph Laplacian = Fourier bases over frequencies
  – w is trainable: amplifies each spectral coefficient
• => Simple linear algebra gives a generic Graph Conv.
• => But eigendecomposition of the (normalized) graph Laplacian costs O(N^3) for each graph
24. Chebyshev Polynomial Approximation: ChebyNet [Defferrard16ChebNet]
• Model the N trainable parameters with an order-K (K << N) Chebyshev polynomial of the rescaled Laplacian:
  g_w(Λ) ≈ Σ_{k=0}^{K} θ_k T_k(Λ̃), with T_k the k-th Chebyshev polynomial
25. ChebyNet (continued)
• g_w * x ≈ Σ_{k=0}^{K} θ_k T_k(L̃) x: no U's = no eigendecomposition
• The graph Laplacian is sparse in nature
• => polynomials of L are lightweight, << O(N^3) (see the sketch below)
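A sketch of the Chebyshev-filtered signal via the recurrence T_0(x) = 1, T_1(x) = x, T_k(x) = 2x T_{k-1}(x) - T_{k-2}(x), applied matrix-free to L̃ = 2L/λ_max - I (the θ values are placeholders; names are mine):

```python
import numpy as np

def cheby_conv(L, x, theta, lam_max=2.0):
    """Approximate g_w * x = sum_k theta[k] * T_k(L_tilde) @ x, no eigendecomposition."""
    N = L.shape[0]
    L_tilde = (2.0 / lam_max) * L - np.eye(N)  # rescale eigenvalues into [-1, 1]
    Tx_prev, Tx = x, L_tilde @ x               # T_0(L~)x = x, T_1(L~)x = L~ x
    out = theta[0] * Tx_prev
    if len(theta) > 1:
        out = out + theta[1] * Tx
    for k in range(2, len(theta)):             # recurrence: T_k = 2 L~ T_{k-1} - T_{k-2}
        Tx_next = 2.0 * (L_tilde @ Tx) - Tx_prev
        out = out + theta[k] * Tx_next
        Tx_prev, Tx = Tx, Tx_next
    return out
```

Only (sparse) matrix-vector products are needed, which is where the << O(N^3) cost comes from.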
26. GCN [Kipf_Welling17GCN]: a much simpler convolution
• Set K = 1, and replace λ_max with 2 (its theoretical maximum value)
• Assume a single weight w = w_0 = -w_1
• => A very simple GraphConv: g_w * x ≈ w (I + D^{-1/2} A D^{-1/2}) x
  – no eigendecomposition
  – no matrix polynomials
  – a single tunable parameter (w) per filter
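A minimal numpy sketch of one GCN layer built on this convolution, with the renormalization trick from [Kipf_Welling17GCN] (Ã = A + I; function and variable names are mine):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(D~^{-1/2} A~ D~^{-1/2} H W), with A~ = A + I (self-links)."""
    N = A.shape[0]
    A_tilde = A + np.eye(N)                     # add self-links
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # normalized adjacency
    return np.maximum(A_hat @ H @ W, 0.0)       # weighted neighbor sum + ReLU
```

Note that W is shared across nodes and never touches the adjacency itself, which is why a trained GCN transfers to graphs with different topologies (slide 29).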
27. Spatial interpretation of GCN's convolution
• A weighted sum over neighboring nodes (the "shift") + a self-link
• The number of neighboring nodes varies, unlike in image ConvNets
29. Once trained, run anywhere (on different topologies)
• The sole trainable parameter W does not depend on the adjacency
[Figure (illustrations from https://www.irasutoya.com): a trained GCN is applicable to different graphs]
30. The simplest GCN... and it works quite well
• Performs better than the original ChebNet [Kipf_Welling17GCN]
31. Exemplar task: semi-supervised node classification with GCN
• Number of nodes: N (many); a set of labeled nodes (supervision) and a set of unlabeled nodes
• Goal: predict the labels y_j of the unlabeled nodes using a trained model (GCN and classifier)
• Train: tune the parameters of the model to recover the known labels y_i
32. Forward path
• Apply the L-layered GCN to update the latent node representations H; the initial node vectors are the node attributes
• The classifier computes the label probability distribution based on the final H, over the class set C (two classes, shown as colors in the original figure)
• θ: trainable parameters of the classifier
33. Objective Function and Training
• "Labeled" objective function: cross entropy <=> accuracy of label prediction on the labeled nodes
• Training: find the best parameters W (GCN) and θ (classifier) by minimizing the negative of the objective function L
34. Prediction using the trained model
• To predict y_j (the label of node j) in the unlabeled set, perform the forward path from the observed x_j (see the sketch below)
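A compact sketch of this train/predict loop using PyTorch Geometric (recommended later in this deck) on the Cora citation graph; the architecture and hyperparameters are illustrative, not the lecture's exact setup:

```python
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import GCNConv

data = Planetoid(root='/tmp/Cora', name='Cora')[0]  # graph with train/test masks

class GCN(torch.nn.Module):
    def __init__(self, d_in, d_hid, n_cls):
        super().__init__()
        self.conv1 = GCNConv(d_in, d_hid)
        self.conv2 = GCNConv(d_hid, n_cls)
    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))   # layer 1: update latent H
        return self.conv2(h, edge_index)        # layer 2: class scores

model = GCN(data.num_node_features, 16, int(data.y.max()) + 1)
opt = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)

for epoch in range(200):  # training: cross entropy on the labeled nodes only
    opt.zero_grad()
    out = model(data.x, data.edge_index)
    loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    opt.step()

pred = model(data.x, data.edge_index).argmax(dim=1)  # predict the unlabeled nodes
```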
36. Interacting proteins
• Where is the binding site between two proteins?
  – Binding site: where a ligand and a receptor interact
[Figure: drugs A and A' (ligands) bind to receptors in a human organ, cell, ...; binding sites are the interface for the connection]
37. Predict binding-site nodes by GNN [Fout17Interface]
• Predict the binding-site regions of ligand-receptor protein pairs
  – Possible bindings => possible drug discovery and improvement
• Protein: a graph of inter-connected amino acids
• Use a GNN for the binding prediction
38. Attributed protein graphs [Sanyal20PGCN]
• Node (residue): a sub-structure consisting of amino acids
• One ligand protein graph and one receptor protein graph
• Binding sites: the specific nodes that interact between ligand and receptor
39. Training/test of interaction prediction [Fout17Interface]
• Training: optimize parameters to minimize the loss of predicting the interaction label between a (ligand node, receptor node) pair, over all training samples
• Test: predict interactions of node pairs in unseen ligand-receptor pairs
  – "Train once, run on any graphs (with different topology)" (see the sketch below)
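A sketch of one possible pairwise readout (my own minimal variant, not the exact architecture of [Fout17Interface]): run a GNN over each protein graph, then score each (ligand node, receptor node) pair from the concatenated latent vectors.

```python
import torch
import torch.nn as nn

class PairScorer(nn.Module):
    """Score interaction for a (ligand-node, receptor-node) pair of latent vectors."""
    def __init__(self, d):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * d, d), nn.ReLU(), nn.Linear(d, 1))
    def forward(self, h_lig, h_rec):   # (n_pairs, d) each
        return self.mlp(torch.cat([h_lig, h_rec], dim=-1)).squeeze(-1)  # logits

# Usage: h_lig, h_rec come from GNNs applied to the ligand / receptor graphs;
# train with nn.BCEWithLogitsLoss() against binary interaction labels.
```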
41. GCN formulation choices [Fout17Interface]
• Considering node features only
• Node features plus edge features
• With an ordering over the nearest nodes: neighbor nodes j are ordered by their distances from node i
45. Scene graph: a summary graph of image contents
• A useful representation for image understanding [Qi19AttentiveRN]
• We want to generate a scene graph from an image input
• In a scene graph of an image: node = an object region in the image; edge = a relation between object regions
46. Graph R-CNN for scene-graph generation [Yang18GRCNN]
• Consists of three inner components:
  – (a)->(b): an off-the-shelf object detector [Ren15RCNN] (omitted)
  – (b)->(c): RePN to prune unnecessary branches (omitted)
  – (c)->(d): an attentional GCN to predict labels of nodes and edges
47. Graph expansion: expand the graph with relation nodes [Yang18GRCNN]
• Original input graph over detected objects (sparsified by RePN): node = object, edge = relation
• Expanded graph: object nodes and relation nodes, connected by obj-obj and obj-rel edges
• One GNN can infer all objects and relations
48. Attentional GCN [Yang18GRCNN]
• A GCN update with attention [Bahdanau15Attention]-based variable connections
  – in a plain GCN, the adjacency is fixed; here, attention connections replace it
• Object nodes' and relation nodes' latent vectors are updated jointly; the parameters can switch based on node types (see the schematic below)
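Schematically, attention replaces the fixed 0/1 adjacency entries with learned connection strengths (a generic attention-weighted update for illustration, not necessarily the paper's exact parameterization):

```latex
h_i' = \sigma\Big( \sum_{j} \alpha_{ij}\, W_{\mathrm{type}(j)}\, h_j \Big),
\qquad
\alpha_{ij} = \operatorname{softmax}_j \big( \mathrm{att}(h_i, h_j) \big)
```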
49. Inferring labels of objects/relations [Yang18GRCNN]
[Figure: the expanded graph vs. the ground-truth scene graph; object nodes' and relation nodes' latent vectors are read out into labels]
53. Theoretical topics of GNNs
• "Deep" GNNs do not work well
  – "oversmoothing"
  – current solution: normalizer + residual connection
• The theoretical limit of the representation power of GNNs
  – graph isomorphism test
  – invariance/equivariance
54. "Deep" GNNs do not work well
[Figure: accuracy vs. number of layers, from [Kipf_Welling17GCN]; performance degrades as depth grows]
• Quite different from the successful deep Conv models in computer vision
55. Oversmoothing problem [Li18Oversmoothing]
• Latent node vectors get closer to each other as the GCN layers go deeper
• It becomes difficult to distinguish nodes in deeper GCNs
[Figure: latent node vectors of the "Karate club" social network at increasing depths [Li18Oversmoothing]]
56. Oversmoothing is theoretically proven
• A deeper GCN converges to a solution where connected nodes have similar latent vectors [Li18Oversmoothing, NT_Maehara19revisit]
• Such convergence in GCNs proceeds very quickly (exponentially in the depth), regardless of the initial node vectors [Oono_Suzuki20Exponential]
  – a similar conclusion also holds for generic GNNs
57. A good workaround [Zhou20Deep, Chen20SimpleDeep, Li20DeeperGCN]
• Combine a proper normalizer and a residual term
  – normalizing the latent node vectors keeps them distant from each other [Zhao_Akoglu20PairNorm]
  – residual terms keep the norms of the loss gradients at a moderate scale [Kipf_Welling17GCN]
• Residual term: add the current layer "as it is" (see the sketch below)
• Not a workaround for "deeper layers, stronger GNNs" (unlike image recognition)
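In the notation of the GCN layer from slide 26, the residual update takes a form like the following (one common variant, written here for illustration):

```latex
H^{(l+1)} = H^{(l)} + \sigma\big( \hat{A}\, H^{(l)}\, W^{(l)} \big)
```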
58. The theoretical limit of the representation power of GNNs
• Surprisingly, the limitation of GNNs is already known for the "graph isomorphism" test problem
• The theory does NOT directly bound GNNs' power on other tasks (e.g. node classification), but it is loosely related [Chen19RGNN]
59. Graph isomorphism problem (for the theory of GNNs)
• Classify whether two given graphs have an edge-preserving bijective map between their node sets
  – i.e., the same topology in terms of the edges?
[Figure: are these the same graph? https://ja.wikipedia.org/wiki/グラフ同型]
60. Weisfeiler-Lehman (WL) test algorithm [Weisfeiler_Lehman68WL]
• A popular heuristic for testing isomorphism
• Idea: concatenate the neighbor nodes' labels to check the edge topology (see the sketch below)
  – used in graph kernels [Shervashidze11WLKernel, Togninalli18WWL] and GNNs [Jin17WLorg, Morris19kGNN]
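A minimal Python sketch of 1-WL color refinement (hash each node's own label together with the multiset of its neighbors' labels; names are mine):

```python
def wl_refine(adj, labels, iters=3):
    """1-WL color refinement. adj: dict node -> list of neighbors; labels: dict node -> label."""
    for _ in range(iters):
        labels = {
            v: hash((labels[v], tuple(sorted(labels[u] for u in adj[v]))))
            for v in adj
        }
    return labels

# Two graphs whose sorted label multisets differ after refinement are certainly
# non-isomorphic; equal multisets are necessary but not sufficient for isomorphism.
```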
62. Upper limit of the GNN's representation power
• In terms of the graph isomorphism problem, a generic GNN is at most as powerful as the WL test algorithm [Xu19GIN, Morris19WL]
  – there can be cases where the WL test decides isomorphism but a GNN cannot
63. Graph Isomorphism Network (GIN) [Xu19GIN]
• Proposes a specific GNN architecture that attains the same graph isomorphism detection power as the WL test
• A layer update of typical GNNs applies one non-linear activation function; each layer update of the proposed GIN is an MLP, and must be that powerful (see the update rule below)
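The GIN layer update from [Xu19GIN]:

```latex
h_v^{(k)} = \mathrm{MLP}^{(k)}\Big( \big(1 + \epsilon^{(k)}\big)\, h_v^{(k-1)} + \sum_{u \in \mathcal{N}(v)} h_u^{(k-1)} \Big)
```

Here ε is a learnable (or fixed) scalar; summing neighbor features keeps the aggregation injective on multisets, and the MLP can represent injective maps that a single non-linear activation cannot, which is what matches the WL test's power.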
64. Higher-order WL/GNN [Maron19NeurIPS]
• k-dimensional WL (k-WL) test
  – labels the k-tuples of nodes
  – more powerful than the original WL
  – k-GNN [Morris19kGNN] is as good as the k-WL test
65. More powerful GNNs [Sato20Survey]
• k-order graph networks
  – consist of all linear functions that are invariant or equivariant to node permutations [Maron19ICLR]
  – as powerful as k-GNN, and memory-efficient [Maron19NeurIPS]
• CPNGNN introduces a local node ordering, and is strictly more powerful than GIN [Sato19CPNGNN]
67. Take home message (revisited)
• Graph Neural Networks (GNNs): Neural Networks (NNs) that compute node representations of graph-structured data
• Practical applications in industry, hard competitions in academia
• Model: the fundamental building block is an approximated Graph Convolution
• Applications: applicable to several tasks in different domains
69. Surveys and documents for studying GNNs
• Survey papers on GNNs
  – [Wu19survey] [Zhou18survey]
  – for experts: [Battaglia18survey]
• Tutorials, slides, blogs
  – English: [Kipf18talk], [Ma20tutorial]
  – Japanese: [Honda19GNN]
70. The most famous dataset resources
• Prof. Leskovec (Stanford)
  – Stanford Large Network Dataset Collection: http://snap.stanford.edu/data/index.html
• UC Santa Cruz LINQS group
  – https://linqs.soe.ucsc.edu/data
  – train/valid/test splits: https://github.com/kimiyoung/planetoid
71. Finally: GNNs for your use
• Say goodbye to the "D = {vector x_i, label y_i}" framework with GNNs
  – applicable to generic structured data = graphs
  – the standard GCN is strong enough
• Many diverse application fields
  – chemistry, pharmacy, materials, computer vision, social networks, ...
72. Datasets for specific domains
• MoleculeNet [Wu18MoleculeNet]
  – molecular graph datasets from several chemical fields, with TensorFlow implementations attached
• Scene graphs
  – Visual Genome [Krishna17VG]: 108K images
• Point clouds
  – S3DIS [Armeni16S3DIS]: office interiors
73. Programming Libraries
• Recommended: PyTorch Geometric https://github.com/rusty1s/pytorch_geometric
• Deep Graph Library https://www.dgl.ai/
• Chainer Chemistry https://github.com/pfnet-research/chainer-chemistry
  – tailored for molecular graphs, but also applicable to other domains
74. References A-G
[Anderson19Cormorant] Anderson+, "Cormorant: Covariant Molecular Neural Networks", NeurIPS, 2019.
[Armeni16S3DIS] Armeni+, "3D Semantic Parsing of Large-scale Indoor Spaces", CVPR, 2016.
[Bahdanau15Attention] Bahdanau+, "Neural Machine Translation by Jointly Learning to Align and Translate", ICLR, 2015.
[Battaglia18survey] Battaglia+, "Relational Inductive Biases, Deep Learning, and Graph Networks", arXiv:1806.01261v3 [cs.LG], 2018.
[Bruna14Spectral] Bruna+, "Spectral Networks and Locally Connected Networks on Graphs", ICLR, 2014.
[Chen19RGNN] Chen+, "On the Equivalence between Graph Isomorphism Testing and Function Approximation with GNNs", NeurIPS, 2019.
[Chen20SimpleDeep] Chen+, "Simple and Deep Graph Convolutional Networks", ICML, 2020.
[DeCao19QA2] De Cao+, "Question Answering by Reasoning Across Documents with Graph Convolutional Networks", ICML Workshop, 2019.
[Defferrard16ChebNet] Defferrard+, "Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering", NIPS, 2016.
[Fout17Interface] Fout+, "Protein Interface Prediction using Graph Convolutional Networks", NIPS, 2017.
[Gilmer17MPNN] Gilmer+, "Neural Message Passing for Quantum Chemistry", ICML, 2017.
75. References H-K
[Hamilton17GraphSAGE] Hamilton+, "Inductive Representation Learning on Large Graphs", NIPS, 2017.
[Honda19GNN] Honda, "GNN Summary (1-3)" (in Japanese), 2019. https://qiita.com/shionhonda/items/d27b8f13f7e9232a4ae5 https://qiita.com/shionhonda/items/0d747b00fe6ddaff26e2 https://qiita.com/shionhonda/items/e11a9cf4699878723844
[Hu20RandLaNet] Hu+, "RandLA-Net: Efficient Semantic Segmentation of Large-Scale Point Clouds", CVPR, 2020.
[Jin17WLorg] Jin+, "Predicting Organic Reaction Outcomes with Weisfeiler-Lehman Network", NIPS, 2017.
[Jin18JTVAE] Jin+, "Junction Tree Variational Autoencoder for Molecular Graph Generation", ICML, 2018.
[Kipf18talk] Kipf, "Structured Deep Models: Deep Learning on Graphs and Beyond", 2018. http://tkipf.github.io/misc/SlidesCambridge.pdf
[Kipf_Welling17GCN] Kipf and Welling, "Semi-supervised Classification with Graph Convolutional Networks", ICLR, 2017.
[Klicpera20DimeNet] Klicpera+, "Directional Message Passing for Molecular Graphs", ICLR, 2020.
[Krizhevsky12ConvNet] Krizhevsky+, "ImageNet Classification with Deep Convolutional Neural Networks", NIPS, 2012.
[Krishna17VG] Krishna+, "Visual Genome: Connecting Language and Vision Using Crowdsourced Dense Image Annotations", IJCV, 2017.
76. References L
[Landrieu_Boussaha19PCS] Landrieu and Boussaha, "Point Cloud Oversegmentation with Graph-Structured Deep Metric Learning", CVPR, 2019.
[Laugier18CGCNN] Laugier+, "Predicting Thermoelectric Properties from Crystal Graphs and Material Descriptors - First Application for Functional Materials", NeurIPS Workshop, 2018.
[Li16GGNN] Li+, "Gated Graph Sequence Neural Networks", ICLR, 2016.
[Li18Oversmoothing] Li+, "Deeper Insights into Graph Convolutional Networks for Semi-Supervised Learning", arXiv:1801.07606 [cs.LG], 2018.
[Li19DeepGCNs] Li+, "DeepGCNs: Can GCNs Go as Deep as CNNs?", ICCV, 2019.
[Li20DeeperGCN] Li+, "DeeperGCN: All You Need to Train Deeper GCNs", arXiv:2006.07739, 2020.
[Lu16VRD] Lu+, "Visual Relationship Detection with Language Priors", ECCV, 2016.
[Liu18CGVAE] Liu+, "Constrained Graph Variational Autoencoders for Molecule Design", NeurIPS, 2018.
77. References M-P
[Ma20tutorial] Ma+, "Graph Neural Networks: Models and Applications", AAAI tutorial, 2020. http://cse.msu.edu/~mayao4/tutorials/aaai2020/
[Madhawa19GNVP] Madhawa+, "GraphNVP: An Invertible Flow Model for Generating Molecular Graphs", arXiv:1905.11600, 2019.
[Maron19NeurIPS] Maron+, "Provably Powerful Graph Networks", NeurIPS, 2019.
[Maron19ICLR] Maron+, "Invariant and Equivariant Graph Networks", ICLR, 2019.
[Morris19kGNN] Morris+, "Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks", AAAI, 2019.
[Nguyen_Maehara20Homomorph] Nguyen and Maehara, "Graph Homomorphism Convolution", ICML, 2020.
[NT_Maehara19revisit] NT and Maehara, "Revisiting Graph Neural Networks: All We Have is Low-pass Filters", arXiv:1905.09550, 2019.
[Oono_Suzuki20Exponential] Oono and Suzuki, "Graph Neural Networks Exponentially Lose Expressive Power for Node Classification", ICLR, 2020.
[Pei20GeomGCN] Pei+, "Geom-GCN: Geometric Graph Convolutional Networks", ICLR, 2020.
78. References Q-V
[Qi19AttentiveRN] Qi+, "Attentive Relational Networks for Mapping Images to Scene Graphs", CVPR, 2019.
[Ren15RCNN] Ren+, "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks", NIPS, 2015.
[Sanchez-Gonzalez20GNS] Sanchez-Gonzalez+, "Learning to Simulate Complex Physics with Graph Networks", arXiv:2002.09405, 2020.
[Sanyal20PGCN] Sanyal+, "ProteinGCN: Protein Model Quality Assessment using Graph Convolutional Networks", bioRxiv, 2020.
[Sato19CPNGNN] Sato+, "Approximation Ratios of Graph Neural Networks for Combinatorial Problems", NeurIPS, 2019.
[Sato20Survey] Sato, "A Survey on the Expressive Power of Graph Neural Networks", arXiv:2003.04078, 2020.
[Scarselli08GNN] Scarselli+, "The Graph Neural Network Model", IEEE Trans. Neural Networks, 20(1), pp. 61-80, 2008.
[Schlichtkrull18RGCN] Schlichtkrull+, "Modeling Relational Data with Graph Convolutional Networks", ESWC, 2018.
[Shervashidze11WLKernel] Shervashidze+, "Weisfeiler-Lehman Graph Kernels", JMLR, 12, pp. 2539-2561, 2011.
[Shuman13Graph] Shuman+, "The Emerging Field of Signal Processing on Graphs: Extending High-dimensional Data Analysis to Networks and Other Irregular Domains", IEEE Signal Processing Magazine, 30(3), pp. 83-98, 2013.
[Togninalli18WWL] Togninalli+, "Wasserstein Weisfeiler-Lehman Graph Kernels", NeurIPS, 2018.
[Veličković18GAT] Veličković+, "Graph Attention Networks", ICLR, 2018.
79. References W-Z
[Wang18NLNN] Wang+, "Non-local Neural Networks", CVPR, 2018.
[Weisfeiler_Lehman68WL] Weisfeiler and Lehman, "A Reduction of a Graph to a Canonical Form and an Algebra Arising during this Reduction", Nauchno-Technicheskaya Informatsia, Ser. 2(9), pp. 12-16, 1968.
[Wu18MoleculeNet] Wu+, "MoleculeNet: A Benchmark for Molecular Machine Learning", Chemical Science, 9(513), 2018.
[Wu19SGC] Wu+, "Simplifying Graph Convolutional Networks", ICML, 2019.
[Wu19survey] Wu+, "A Comprehensive Survey on Graph Neural Networks", arXiv:1901.00596v1 [cs.LG], 2019.
[Xu19GIN] Xu+, "How Powerful are Graph Neural Networks?", ICLR, 2019.
[Yang18GRCNN] Yang+, "Graph R-CNN for Scene Graph Generation", ECCV, 2018.
[You18GCPN] You+, "Graph Convolutional Policy Network for Goal-Directed Molecular Graph Generation", NeurIPS, 2018.
[Zhao_Akoglu20PairNorm] Zhao and Akoglu, "PairNorm: Tackling Oversmoothing in GNNs", ICLR, 2020.
[Zhou20Deep] Zhou+, "Effective Training Strategies for Deep Graph Neural Networks", arXiv:2006.07107, 2020.
[Zhou18survey] Zhou+, "Graph Neural Networks: A Review of Methods and Applications", arXiv:1812.08434v2 [cs.LG], 2018.
80. GNN research history
• Birth of GNN: Original GNN (2008) [Scarselli08GNN]
• Spectral line: Spectral GNN (ICLR14) [Bruna14Spectral] -> ChebNet (NIPS16) [Defferrard16ChebNet]
• The Door Opener: GCN (ICLR17) [Kipf_Welling17GCN]
• Model research explosion: GGNN (ICLR16) [Li16GGNN], GraphSAGE (NIPS17) [Hamilton17GraphSAGE], MPNN (ICML17) [Gilmer17MPNN], GAT (ICLR18) [Veličković18GAT], NLNN (CVPR18) [Wang18NLNN]
• Deeper theory / applications: GIN (ICLR19) [Xu19GIN], k-GNN (AAAI19) [Morris19kGNN], SGC (ICML19) [Wu19SGC], CPNGNN (NeurIPS19) [Sato19CPNGNN]
• GNN extremes: DimeNet (ICLR20) [Klicpera20DimeNet], Graph Homomorphism Convolution (ICML20) [Nguyen_Maehara20Homomorph]
81. Def: Graph signals
• Suppose the 1D node-attribute case: a graph signal is an N-array (one scalar per node)
[Figure: a low-frequency graph signal vs. a high-frequency graph signal, from [Ma20tutorial]]