• Introduction to Multi-Layer Graph Neural Networks (GNNs): The lecture discusses the transition from a single-layer GNN to a multi-layer GNN, emphasizing that stacking layers composes message transformation and aggregation operations, supplemented by training techniques such as batch normalization and L2 normalization (see the stacking sketch after this list).
  • Concept of Over-Smoothing in GNNs: A key challenge in multi-layer GNNs is over-smoothing: as the number of layers increases, node embeddings across the network converge to similar values. Large receptive fields cause different nodes to aggregate largely overlapping information, washing out the distinctive features of their embeddings (a small numerical demo follows the list).
  • Receptive Fields in GNNs: The receptive field of a node in a k-layer GNN is its k-hop neighborhood. As the number of layers increases, the receptive field expands rapidly, often covering a significant portion of the network, which drives the over-smoothing problem (see the BFS sketch below).
  • Strategies to Combat Over-Smoothing: The lecture suggests several approaches: limiting the number of GNN layers, increasing the expressive power within each GNN layer (e.g., making the transformation and aggregation steps deep neural networks themselves), and adding pre-processing and post-processing layers such as multilayer perceptrons (an example wrapper appears after this list).
  • Utilizing Skip Connections: To improve GNN performance and mitigate over-smoothing, the lecture introduces skip connections, which combine embeddings from different layers. They preserve information from earlier layers and let the final node embedding be a mixture of outputs from multiple layers, enhancing model expressiveness and performance (a skip-connection sketch closes this section).
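
To make the stacking idea concrete, here is a minimal PyTorch sketch of a multi-layer GNN built from repeated message transformation and mean aggregation, with batch normalization and L2 normalization applied per layer. The class names (`SimpleGNNLayer`, `StackedGNN`), the dense adjacency representation, and the choice of mean aggregation are illustrative assumptions, not details fixed by the lecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGNNLayer(nn.Module):
    """One GNN layer: linear message transformation + mean aggregation."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)   # message transformation
        self.bn = nn.BatchNorm1d(out_dim)       # batch normalization for stable training

    def forward(self, x, adj):
        # Mean aggregation over neighbors; adj is a dense [N, N] adjacency matrix.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        agg = (adj @ x) / deg
        h = F.relu(self.bn(self.lin(agg)))
        # L2 normalization puts all node embeddings on the unit sphere.
        return F.normalize(h, p=2, dim=1)

class StackedGNN(nn.Module):
    """Multi-layer GNN obtained by stacking identical layers."""
    def __init__(self, in_dim, hid_dim, num_layers):
        super().__init__()
        dims = [in_dim] + [hid_dim] * num_layers
        self.layers = nn.ModuleList(
            SimpleGNNLayer(d_in, d_out) for d_in, d_out in zip(dims, dims[1:])
        )

    def forward(self, x, adj):
        for layer in self.layers:
            x = layer(x, adj)
        return x

# Toy usage: 8 nodes, 5 input features, fully connected graph.
x = torch.randn(8, 5)
adj = torch.ones(8, 8)
model = StackedGNN(in_dim=5, hid_dim=16, num_layers=3)
print(model(x, adj).shape)  # torch.Size([8, 16])
```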
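The over-smoothing effect can be observed directly. The toy demo below repeatedly applies plain mean aggregation (no learned weights) on a random graph and tracks the average pairwise cosine similarity of the node embeddings, which climbs toward 1 as depth grows. The graph size, edge probability, and feature dimension are arbitrary choices made for illustration.

```python
import torch
import torch.nn.functional as F

def avg_pairwise_cosine(x):
    """Average cosine similarity between all distinct pairs of node embeddings."""
    x = F.normalize(x, dim=1)
    sim = x @ x.t()
    n = x.size(0)
    return (sim.sum() - n) / (n * (n - 1))  # subtract the diagonal of self-similarities

torch.manual_seed(0)
n = 50
adj = (torch.rand(n, n) < 0.1).float()
adj = ((adj + adj.t()) > 0).float()   # symmetric random graph
adj.fill_diagonal_(1)                 # add self-loops
deg = adj.sum(dim=1, keepdim=True)

x = torch.randn(n, 16)                # random initial node features
for k in range(10):
    print(f"layer {k}: avg cosine similarity = {avg_pairwise_cosine(x).item():.3f}")
    x = (adj @ x) / deg               # one round of mean aggregation
```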
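A small dependency-free sketch of the receptive-field definition: breadth-first search collects the k-hop neighborhood of a node, and printing it for growing k shows how quickly the receptive field expands. The adjacency-list format and the path-graph example are assumptions made for illustration.

```python
from collections import deque

def k_hop_receptive_field(adj_list, source, k):
    """Return the set of nodes within k hops of `source` (BFS)."""
    seen = {source}
    frontier = deque([(source, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if dist == k:
            continue  # do not expand beyond k hops
        for nbr in adj_list[node]:
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, dist + 1))
    return seen

# Tiny example: a path graph 0-1-2-3-4; the receptive field of node 0 grows with k.
adj_list = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
for k in range(1, 5):
    print(k, sorted(k_hop_receptive_field(adj_list, 0, k)))
```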
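One suggested remedy, wrapping a shallow GNN in pre- and post-processing MLPs, might look like the sketch below (it reuses the `SimpleGNNLayer` class from the first example; the layer counts and dimensions are placeholders). Only the middle block performs message passing, so the model gains depth without enlarging the receptive field.

```python
import torch.nn as nn

class GNNWithMLPs(nn.Module):
    """Shallow GNN wrapped in pre- and post-processing MLPs,
    keeping the number of message-passing layers small."""
    def __init__(self, in_dim, hid_dim, out_dim, num_gnn_layers=2):
        super().__init__()
        self.pre = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                 nn.Linear(hid_dim, hid_dim), nn.ReLU())
        self.gnn_layers = nn.ModuleList(
            SimpleGNNLayer(hid_dim, hid_dim) for _ in range(num_gnn_layers)
        )
        self.post = nn.Sequential(nn.Linear(hid_dim, hid_dim), nn.ReLU(),
                                  nn.Linear(hid_dim, out_dim))

    def forward(self, x, adj):
        x = self.pre(x)            # feature encoding, no message passing
        for layer in self.gnn_layers:
            x = layer(x, adj)      # only a few hops of aggregation
        return self.post(x)        # task head, no message passing
```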
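Finally, a sketch of skip connections: each layer adds its input back to its output, and the final embedding concatenates the outputs of all layers so that earlier-layer information survives. Concatenation is just one way to mix layer outputs and is an illustrative choice here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SkipGNNLayer(nn.Module):
    """GNN layer with a residual (skip) connection."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x, adj):
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        h = F.relu(self.lin((adj @ x) / deg))
        return h + x  # skip connection preserves the layer's input

class SkipGNN(nn.Module):
    def __init__(self, dim, num_layers):
        super().__init__()
        self.layers = nn.ModuleList(SkipGNNLayer(dim) for _ in range(num_layers))

    def forward(self, x, adj):
        outs = [x]
        for layer in self.layers:
            x = layer(x, adj)
            outs.append(x)
        # Final node embedding mixes all layer outputs (here: concatenation).
        return torch.cat(outs, dim=1)
```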