Is it possible to retrieve the attention weights of a specific node? #608
A node may be connected to neighbors with different degrees. Is it possible to retrieve the attention that other nodes pay to a specific node?
```python
from scipy.sparse import lil_matrix
from sklearn.preprocessing import normalize

def preprocess_attention(edge_atten, g, to_normalize=True):
    """Organize attention on edges into one csr sparse adjacency matrix per head.

    Parameters
    ----------
    edge_atten : torch.Tensor of shape (# edges, # heads, 1)
        Un-normalized attention on edges.
    g : dgl.DGLGraph
    to_normalize : bool
        Whether to normalize attention values over incoming edges for each node.
    """
    n_nodes = g.number_of_nodes()
    num_heads = edge_atten.shape[1]
    all_head_A = [lil_matrix((n_nodes, n_nodes)) for _ in range(num_heads)]
    for i in range(n_nodes):
        predecessors = list(g.predecessors(i))
        edges_id = g.edge_ids(predecessors, i)
        for j in range(num_heads):
            # Row i holds the attention node i places on each of its predecessors.
            all_head_A[j][i, predecessors] = edge_atten[edges_id, j, 0].data.cpu().numpy()
    if to_normalize:
        # L1-normalize each row so incoming attention per node sums to 1.
        all_head_A = [normalize(A, norm='l1').tocsr() for A in all_head_A]
    else:
        all_head_A = [A.tocsr() for A in all_head_A]
    return all_head_A

# Take the attention from one layer as an example
# num_edges x num_heads x 1
A = self.g.edata['a_drop']
# list of length num_heads, each entry is csr of shape (num_nodes, num_nodes)
A = preprocess_attention(A, self.g)
```
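Once the per-head CSR matrices are built, answering the original question is just an index lookup: row `i` holds the (normalized) attention node `i` places on its predecessors, and column `v` holds the attention every other node pays to node `v`. Here is a minimal, dgl-free sketch of that lookup; the 3-node graph and raw attention values are made up purely for illustration:

```python
import numpy as np
from scipy.sparse import lil_matrix
from sklearn.preprocessing import normalize

# Toy graph with 3 nodes and edges 0->1, 0->2, 1->2,
# with made-up un-normalized attention values.
n_nodes = 3
A = lil_matrix((n_nodes, n_nodes))
A[1, 0] = 5.0   # attention node 1 pays to predecessor 0
A[2, 0] = 2.0   # attention node 2 pays to predecessor 0
A[2, 1] = 6.0   # attention node 2 pays to predecessor 1

# L1-normalize each row so incoming attention per node sums to 1,
# mirroring the to_normalize branch above.
A = normalize(A, norm='l1').tocsr()

# Attention node 2 pays over its incoming edges (row lookup):
row = A[2].toarray().ravel()
print(row)  # [0.25 0.75 0.  ]

# Attention that other nodes pay to node 0 (column lookup):
col = A[:, 0].toarray().ravel()
print(col)  # [0.   1.   0.25]
```

Note that in GAT the softmax is over each node's incoming edges, so rows sum to 1 after normalization while columns generally do not.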