What is: Graph Attention Network v2?
Source | How Attentive are Graph Attention Networks? |
Year | 2021 |
Data Source | CC BY-SA - https://paperswithcode.com |
The GATv2 operator from the “How Attentive are Graph Attention Networks?” paper, which fixes the static attention problem of the standard GAT layer: because the two linear operations in standard GAT (the weight matrix and the attention vector) are applied consecutively, they collapse into a single linear map, so the ranking of attended nodes is the same for every query node. In contrast, GATv2 applies the attention vector after the nonlinearity, yielding dynamic attention in which every node can attend to any other node with a query-dependent ranking.
GATv2 scoring function:

$$e(\mathbf{h}_i, \mathbf{h}_j) = \mathbf{a}^{\top} \mathrm{LeakyReLU}\left(\mathbf{W} \left[\mathbf{h}_i \, \| \, \mathbf{h}_j\right]\right)$$

compared with the standard GAT scoring function:

$$e(\mathbf{h}_i, \mathbf{h}_j) = \mathrm{LeakyReLU}\left(\mathbf{a}^{\top} \left[\mathbf{W}\mathbf{h}_i \, \| \, \mathbf{W}\mathbf{h}_j\right]\right)$$
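As a concrete illustration, here is a minimal NumPy sketch of the GATv2 scoring function, with the attention vector `a` applied after the LeakyReLU. The dimensions, random weights, and full (all-pairs) attention are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

# Toy dimensions (hypothetical, for illustration only)
d_in, d_out, n = 4, 8, 5
W = rng.normal(size=(d_out, 2 * d_in))  # shared linear layer on [h_i || h_j]
a = rng.normal(size=d_out)              # attention vector
H = rng.normal(size=(n, d_in))          # node feature matrix

def gatv2_score(h_i, h_j):
    # GATv2: the attention vector is applied AFTER the nonlinearity,
    # so the score depends jointly on the query and the key node.
    return a @ leaky_relu(W @ np.concatenate([h_i, h_j]))

# Attention weights of node 0 over all nodes (full attention for simplicity;
# in practice scores are computed only over graph neighbors)
scores = np.array([gatv2_score(H[0], H[j]) for j in range(n)])
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()  # softmax-normalized attention weights
print(alpha)
```

Because `a` sits outside the nonlinearity, it cannot be folded into `W`, which is precisely what makes the attention dynamic rather than static.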