
Exercise: Training a GCN


- [Instructor] The training of the graph convolutional network is identical to the training of the dense neural network, so I won't go through this step by step. I'll just highlight some of the important points. We train for the same number of epochs, 200. That's my for loop. We make a forward pass through the model to get the Citeseer predictions, and then we compute the NLL loss on line 8. This NLL loss is computed only on the training data; note the use of the train mask. The NLL loss is the negative log-likelihood loss, a loss function commonly used in classification tasks. It measures the performance of a classification model whose output is a probability distribution over class labels. Because our output passes through a log softmax layer, we use this NLL loss. On lines 13 through 20, we compute the accuracy of the model's predictions on the training data for this epoch. Then on line 23, we switch the model to evaluation mode, and on lines 25 through 31,…
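
The transcript refers to code shown on screen that isn't reproduced here. As a rough sketch of the kind of training loop being described, here is a minimal PyTorch Geometric example, assuming a two-layer GCN with a log-softmax output trained on the Citeseer dataset for 200 epochs; the model definition, variable names, and hyperparameters are illustrative and not the course's exact code.

```python
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import GCNConv

# Citeseer citation dataset: a single graph with train/val/test node masks.
dataset = Planetoid(root='data/CiteSeer', name='CiteSeer')
data = dataset[0]

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(dataset.num_node_features, 16)
        self.conv2 = GCNConv(16, dataset.num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        x = F.dropout(x, training=self.training)
        x = self.conv2(x, edge_index)
        # Log-softmax output pairs with the NLL loss used in the loop below.
        return F.log_softmax(x, dim=1)

model = GCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)

for epoch in range(200):
    model.train()
    optimizer.zero_grad()
    out = model(data.x, data.edge_index)              # forward pass over the whole graph
    loss = F.nll_loss(out[data.train_mask],           # NLL loss on training nodes only,
                      data.y[data.train_mask])        # selected via the train mask
    loss.backward()
    optimizer.step()

    # Accuracy of the model's predictions on the training data for this epoch.
    pred = out.argmax(dim=1)
    train_acc = (pred[data.train_mask] == data.y[data.train_mask]).float().mean().item()

    # Switch to evaluation mode and measure accuracy on the held-out test nodes.
    model.eval()
    with torch.no_grad():
        test_pred = model(data.x, data.edge_index).argmax(dim=1)
        test_acc = (test_pred[data.test_mask] == data.y[data.test_mask]).float().mean().item()

    if (epoch + 1) % 50 == 0:
        print(f'Epoch {epoch + 1:03d}  loss {loss.item():.4f}  '
              f'train acc {train_acc:.4f}  test acc {test_acc:.4f}')
```

The key point from the transcript is visible in the loss line: the negative log-likelihood loss is evaluated only on the nodes selected by the train mask, even though the forward pass runs over the entire graph.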
