From the course: Learning Graph Neural Networks

Message passing transformation and aggregation

- [Instructor] In this video, we'll talk a little more about how message passing works in graph neural networks, and we'll dive deeper into the transformation and aggregation operations in a GNN. We've discussed the fact that transformation and aggregation operations allow a target node to collect local neighborhood information and update its embeddings based on information from its neighboring nodes. We saw that the structure of the computation graph is different for each node in our graph. With A as the target node, its computation graph looks like the one shown on the right. We also discussed that every node in the graph updates its node embeddings based on information from its neighbors. Every node in the graph gets information from its neighbors via a process called message passing. A single node might be connected to multiple neighboring nodes. Messages from the individual neighboring nodes are aggregated and transformed, and then used to update the current node's…
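To make the two operations concrete, here is a minimal sketch of a single message-passing layer in PyTorch. It assumes mean aggregation over a node's neighbors followed by a linear transformation with a ReLU; the class and variable names (SimpleMessagePassingLayer, adjacency, node_features) are illustrative and not taken from the course.

```python
import torch
import torch.nn as nn

class SimpleMessagePassingLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        # Transformation applied to the aggregated neighbor messages
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, node_features, adjacency):
        # node_features: (N, in_dim), adjacency: (N, N) 0/1 matrix
        # Aggregation: mean of each node's neighbors' features
        degree = adjacency.sum(dim=1, keepdim=True).clamp(min=1)
        aggregated = (adjacency @ node_features) / degree
        # Transformation: linear projection plus nonlinearity gives the updated embeddings
        return torch.relu(self.linear(aggregated))

# Toy graph: node A (index 0) connected to three neighbors B, C, D
adj = torch.tensor([[0, 1, 1, 1],
                    [1, 0, 0, 0],
                    [1, 0, 0, 0],
                    [1, 0, 0, 0]], dtype=torch.float32)
x = torch.randn(4, 8)                     # initial 8-dimensional node embeddings
layer = SimpleMessagePassingLayer(8, 16)
updated = layer(x, adj)                   # (4, 16) updated node embeddings
```

In this sketch, row 0 of `updated` is node A's new embedding, computed from the averaged messages of B, C, and D; stacking several such layers lets each node draw on progressively larger neighborhoods.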
