Fundamental Concepts in Graph Attention Networks
Author(s): R. Soujanya (Gokaraju Rangaraju Institute of Engineering and Technology, Hyderabad, India), Ravi Mohan Sharma (Makhanlal Chaturvedi National University of Journalism and Communication, Bhopal, India), Manish Maheshwari (Makhanlal Chaturvedi National University of Journalism and Communication, Bhopal, India), and Divya Prakash Shrivastava (Higher Colleges of Technology, Dubai, UAE)
Copyright: 2023
Pages: 12
Source title:
Concepts and Techniques of Graph Neural Networks
Source Author(s)/Editor(s): Vinod Kumar (Koneru Lakshmaiah Education Foundation (Deemed), India) and Dharmendra Singh Rajput (VIT University, India)
DOI: 10.4018/978-1-6684-6903-3.ch006
Abstract
Graph attention networks (GATs) are a class of neural network architectures that operate on graph-structured input. They employ masked self-attentional layers to address the shortcomings of earlier approaches based on graph convolutions. The main advantage of a GAT is its ability to model dependencies between nodes in a graph while assigning different weights to different edges. A GAT captures both local information (the features surrounding each node) and global information (properties of the graph as a whole). This is achieved through attention mechanisms, which allow the network to selectively focus on certain nodes and edges while ignoring others. GATs also offer scalability, interpretability, and flexibility. This chapter discusses the fundamental concepts of graph attention networks.
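The masked self-attention the abstract describes can be sketched in a few lines of NumPy. This is a single-head, dense illustration of the standard GAT layer (the function name, array shapes, and LeakyReLU slope are illustrative assumptions, not code from the chapter itself): each node attends only to its neighbors, with attention coefficients computed from learned linear transformations and normalized by a softmax.

```python
import numpy as np

def gat_layer(H, A, W, a, slope=0.2):
    """One single-head GAT layer (dense sketch).

    H: (N, F) node features; A: (N, N) adjacency with self-loops;
    W: (F, Fp) weight matrix; a: (2*Fp,) attention vector.
    """
    Z = H @ W                                    # transformed features, (N, Fp)
    Fp = Z.shape[1]
    # raw attention logits e_ij = LeakyReLU(a^T [z_i || z_j])
    src = Z @ a[:Fp]                             # contribution of source node i
    dst = Z @ a[Fp:]                             # contribution of target node j
    e = src[:, None] + dst[None, :]              # (N, N) pairwise logits
    e = np.where(e > 0, e, slope * e)            # LeakyReLU
    # mask: only existing edges may attend (the "masked" in masked self-attention)
    e = np.where(A > 0, e, -1e9)
    e = e - e.max(axis=1, keepdims=True)         # stabilize softmax
    att = np.exp(e)
    att = att / att.sum(axis=1, keepdims=True)   # rows sum to 1 over neighbors
    return att @ Z                               # attention-weighted aggregation
```

Because the softmax is masked by the adjacency matrix, each output row is a convex combination of the (transformed) features of that node's neighbors only, which is how a GAT assigns different learned weights to different edges.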