: This paper explains how to apply the skip-gram model to learning node representations of graph-structured data. The resulting encoder-decoder model can be trained to generate representations for arbitrary random-walk sequences; a minimal sketch of the idea follows.
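: Below is a hedged, DeepWalk-style sketch of the technique, not the paper's released code: truncated random walks are generated over a graph, then passed to gensim's skip-gram Word2Vec as if each walk were a sentence. The example graph, walk lengths, and hyperparameters are illustrative assumptions.
<syntaxhighlight lang="python">
import random
import networkx as nx
from gensim.models import Word2Vec

def random_walks(graph, num_walks=10, walk_length=20):
    """Generate truncated random walks; each walk acts as one 'sentence'."""
    walks = []
    nodes = list(graph.nodes())
    for _ in range(num_walks):
        random.shuffle(nodes)
        for start in nodes:
            walk = [start]
            while len(walk) < walk_length:
                neighbors = list(graph.neighbors(walk[-1]))
                if not neighbors:
                    break
                walk.append(random.choice(neighbors))
            walks.append([str(n) for n in walk])  # Word2Vec expects string tokens
    return walks

graph = nx.karate_club_graph()  # toy graph, purely for illustration
walks = random_walks(graph)
# sg=1 selects the skip-gram objective; each walk is treated as a sentence,
# so co-occurring nodes in a walk get nearby embeddings.
model = Word2Vec(walks, vector_size=64, window=5, sg=1, min_count=0)
node_0_embedding = model.wv["0"]
</syntaxhighlight>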
* [https://www.kdd.org/kdd2018/files/deep-learning-day/DLDay18_paper_27.pdf Learning Graph Representations with Recurrent Neural Network Autoencoders (Taheri, Gimpel, Berger-Wolf)]
: Like other sequence-based neural encoders, the proposed architecture first converts graphs into sequential data using BFS, shortest-path, and random-walk orderings, then trains LSTM autoencoders to embed these sequences into a vector space; a sketch appears below.
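: The following is a minimal PyTorch sketch of the general idea, assuming an LSTM autoencoder over node-id sequences with the encoder's final hidden state as the sequence embedding; the dimensions, teacher-forced decoder, and training loop are illustrative assumptions, not the paper's exact architecture.
<syntaxhighlight lang="python">
import torch
import torch.nn as nn

class SeqAutoencoder(nn.Module):
    def __init__(self, num_nodes, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(num_nodes, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_nodes)

    def forward(self, seq):                      # seq: (batch, length) node ids
        x = self.embed(seq)
        _, (h, c) = self.encoder(x)              # h: (1, batch, hidden_dim)
        # Teacher forcing: the decoder re-reads the input sequence, initialized
        # with the encoder state, and predicts each node id back.
        y, _ = self.decoder(x, (h, c))
        return self.out(y), h.squeeze(0)         # logits, sequence embedding

model = SeqAutoencoder(num_nodes=100)
walks = torch.randint(0, 100, (8, 20))           # 8 fake walks of length 20
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for _ in range(5):                               # a few reconstruction steps
    logits, embedding = model(walks)
    loss = loss_fn(logits.reshape(-1, 100), walks.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
</syntaxhighlight>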