Graph Convolutional Networks (GCNs)
Applies deep learning to data represented as graphs ("network graphs").
Far less computationally expensive than quantum-chemical computation for tasks such as molecular property prediction.
GGNN was the first application of GRUs to graph neural networks.
It introduces a recurrence relation between successive layers: each node's state is updated by a GRU that takes the aggregated messages from its neighbors as input.
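The GGNN-style recurrence can be sketched as follows. This is a minimal NumPy illustration, not the paper's full formulation: the graph, dimensions, and weight matrices are hypothetical, and a single shared GRU cell updates all node states from their aggregated neighbor messages.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ggnn_step(h, A, W_msg, gru):
    """One propagation step: aggregate neighbor messages via the
    adjacency matrix, then update each node state with a GRU."""
    m = A @ h @ W_msg                            # messages, shape (n_nodes, d)
    W_z, U_z, W_r, U_r, W_h, U_h = gru
    z = sigmoid(m @ W_z + h @ U_z)               # update gate
    r = sigmoid(m @ W_r + h @ U_r)               # reset gate
    h_tilde = np.tanh(m @ W_h + (r * h) @ U_h)   # candidate state
    return (1 - z) * h + z * h_tilde             # gated node update

n, d = 5, 8                                      # hypothetical sizes
h = rng.normal(size=(n, d))                      # initial node states
A = (rng.random((n, n)) < 0.4).astype(float)     # toy adjacency matrix
W_msg = rng.normal(size=(d, d)) / np.sqrt(d)
gru = tuple(rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(6))

h = ggnn_step(h, A, W_msg, gru)                  # one layer of recurrence
print(h.shape)                                   # (5, 8)
```

Stacking calls to `ggnn_step` with the same GRU weights is what makes the layers recurrent rather than independent.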
GWM is equipped with both a multi-relational attention mechanism and GRUs, granting the module greater flexibility in transmitting messages to the graph nodes.
 Yujia Li, Daniel Tarlow, Marc Brockschmidt, and Richard Zemel. Gated Graph Sequence Neural Networks. In Proceedings of the 4th International Conference on Learning Representations (ICLR), 2016.
 Kyunghyun Cho, Bart van Merriënboer, Caglar Gulcehre, Dzmitry Bahdanau, Fethi Bougares, Holger Schwenk, and Yoshua Bengio. Learning Phrase Representations Using RNN Encoder-Decoder for Statistical Machine Translation. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 1724–1734, 2014.
 Katsuhiko Ishiguro, Shin-ichi Maeda, and Masanori Koyama. Graph Warp Module: An Auxiliary Module for Boosting the Power of Graph Neural Networks. 2019.