Faculty of Mathematical Sciences, Ferdowsi University of Mashhad
Graph Neural Networks Workshop (Part I)
Part I: ANNs & CNNs in PyTorch
In the first part of this workshop, we’ll lay the foundation for understanding graph neural networks by revisiting the core concepts of artificial neural networks (ANNs) and convolutional neural networks (CNNs). We’ll start by exploring the inner workings of ANNs and CNNs, focusing on the key components that enable them to extract features from grid-structured data such as images. To provide a hands-on learning experience, we’ll introduce the PyTorch deep learning framework, which will serve as the backbone for our subsequent explorations. You’ll familiarize yourself with PyTorch’s essential building blocks, including tensors, autograd, and neural network modules. Building on this, we’ll implement multilayer perceptrons (MLPs) in PyTorch, giving you a solid grasp of the fundamental architecture of deep learning models.
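To make these building blocks concrete, here is a minimal sketch of tensors, autograd, and an nn.Module-based MLP in PyTorch. The dimensions, layer sizes, and the MLP class name are illustrative assumptions rather than the workshop’s own code:

```python
import torch
import torch.nn as nn

# Tensors: PyTorch's core data structure; here a batch of 4 inputs
# flattened to 784 features (e.g., 28x28 grayscale images).
x = torch.randn(4, 784)

# Autograd: tensors with requires_grad=True track operations so gradients
# can be computed automatically with .backward().
w = torch.randn(784, 10, requires_grad=True)
loss = (x @ w).pow(2).mean()
loss.backward()          # populates w.grad with d(loss)/dw
print(w.grad.shape)      # torch.Size([784, 10])

# nn.Module: a simple multilayer perceptron (MLP) with one hidden layer.
class MLP(nn.Module):
    def __init__(self, in_dim=784, hidden_dim=128, out_dim=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x):
        return self.net(x)

model = MLP()
logits = model(x)        # forward pass: shape (4, 10)
```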
As we delve deeper into CNNs, we’ll draw parallels between the weights learned in convolutional layers and the filter weights used in traditional image processing techniques such as edge detection. Just as the filter weights in edge detection algorithms are designed to identify changes in pixel intensity, the weights learned by convolutional layers can be thought of as adaptive filters that capture important visual features from the input data. This connection offers a more intuitive picture of how CNNs extract meaningful information from images and other grid-structured data. This foundational knowledge will serve as a stepping stone toward your exploration of graph neural networks in the second part of the workshop.
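To illustrate this parallel, here is a short sketch contrasting a hand-designed edge-detection filter with a learnable convolutional layer. The Sobel kernel, image size, and channel counts are illustrative assumptions, not taken from the workshop materials:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# A hand-designed Sobel kernel for detecting horizontal intensity changes,
# as used in classical edge detection.
sobel_x = torch.tensor([[-1., 0., 1.],
                        [-2., 0., 2.],
                        [-1., 0., 1.]]).view(1, 1, 3, 3)

image = torch.randn(1, 1, 28, 28)   # a single-channel image as a stand-in

# Fixed filter: the weights are chosen by hand, not learned.
edges = F.conv2d(image, sobel_x, padding=1)

# A convolutional layer performs the same sliding-window operation, but its
# kernel weights are learned from data instead of being fixed in advance.
conv = nn.Conv2d(in_channels=1, out_channels=8, kernel_size=3, padding=1)
features = conv(image)              # shape (1, 8, 28, 28)
```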