Towards data augmentation in graph neural network: An overview and evaluation

Authors: Adjeisah, M., Zhu, X., Xu, H. and Ayall, T.A.

Journal: Computer Science Review

Volume: 47

ISSN: 1574-0137

DOI: 10.1016/j.cosrev.2022.100527

Abstract:

Many studies on Graph Data Augmentation (GDA) approaches have emerged. These techniques have rapidly improved the performance of various graph neural network (GNN) models, raising the state-of-the-art accuracy by absolute margins of 4.20%, 5.50%, and 4.40% on Cora, Citeseer, and PubMed, respectively. This success is attributed to two integral properties of relational approaches: topology-level and feature-level augmentation. This work provides an overview of GDA algorithms, categorized according to these two integral properties. Next, we employ the three most widely used GNN backbones (GCN, GAT, and GraphSAGE) as plug-and-play models for the experiments. We then evaluate each algorithm's effectiveness, demonstrating significant differences among GDA techniques in accuracy and time complexity on additional datasets beyond those used in the original works. While discussing practical and theoretical motivations, considerations, and strategies for GDA, this work also investigates the challenges and future directions, pinpointing several open issues that may require further study, based on a broad interpretation of the literature and our empirical results.
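
The two GDA families named in the abstract can be illustrated with a minimal sketch: topology-level augmentation realized as random edge dropping, and feature-level augmentation realized as random feature masking, both fed to a plug-and-play GCN backbone. This is not the paper's implementation; it assumes PyTorch and PyTorch Geometric are available, and the helper names (drop_edges, mask_features, GCN) are hypothetical, chosen only for illustration.

```python
# Illustrative sketch of topology-level and feature-level graph data augmentation
# with a GCN backbone. Helper names are hypothetical, not from the surveyed paper.
import torch
from torch_geometric.nn import GCNConv


def drop_edges(edge_index: torch.Tensor, p: float = 0.2) -> torch.Tensor:
    """Topology-level augmentation: keep each edge with probability 1 - p."""
    keep = torch.rand(edge_index.size(1)) >= p
    return edge_index[:, keep]


def mask_features(x: torch.Tensor, p: float = 0.2) -> torch.Tensor:
    """Feature-level augmentation: zero out each feature dimension with probability p."""
    mask = (torch.rand(x.size(1)) >= p).float()
    return x * mask  # broadcast the dimension mask over all nodes


class GCN(torch.nn.Module):
    """Two-layer GCN used as a plug-and-play node classifier."""

    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, x, edge_index):
        h = torch.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)


# Toy graph: 4 nodes with 3 features each, 2 classes, a few directed edges.
x = torch.randn(4, 3)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])

model = GCN(in_dim=3, hidden_dim=8, num_classes=2)
x_aug = mask_features(x, p=0.2)                 # feature-level view
edge_index_aug = drop_edges(edge_index, p=0.2)  # topology-level view
logits = model(x_aug, edge_index_aug)
print(logits.shape)  # torch.Size([4, 2])
```

In practice, an augmentation of either family is applied during training, and the same GCN, GAT, or GraphSAGE backbone consumes the augmented graph unchanged, which is what makes these methods plug-and-play.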

Source: Scopus

Towards data augmentation in graph neural network: An overview and evaluation

Authors: Adjeisah, M., Zhu, X., Xu, H. and Ayall, T.A.

Journal: Computer Science Review

Volume: 47

eISSN: 1876-7745

ISSN: 1574-0137

DOI: 10.1016/j.cosrev.2022.100527

Source: Web of Science (Lite)

Towards data augmentation in graph neural network: An overview and evaluation

Authors: Adjeisah, M., Zhu, X., Xu, H. and Ayall, T.A.

Journal: Computer Science Review

DOI: 10.1016/j.cosrev.2022.100527

Source: Manual