Fix the default attention mode "node" of the nn.HypergraphConv layer (#8818)

While testing the nn.HypergraphConv layer with mock inputs, I ran into an issue with the default attention mode, "node", which breaks the forward pass.

Looking at the source, the problem is the num_nodes argument of the softmax call at line 177. When computing attention scores for the nodes within a hyperedge, the scores are grouped by hyperedge index (hyperedge_index[1]), so the group count passed as num_nodes should be the number of hyperedges, not the number of nodes.

I therefore changed num_nodes to the pre-computed num_edges, which fixes the issue.
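To illustrate why the group count matters, here is a minimal NumPy sketch of a grouped (segment) softmax, in the spirit of torch_geometric.utils.softmax but not its actual implementation. The function name segment_softmax and the toy incidence data are illustrative assumptions; the point is that the accumulation buffers are sized by the number of groups, so passing the node count where the hyperedge count is needed mis-sizes them (and errors out whenever a hyperedge index is >= the node count).

```python
import numpy as np

def segment_softmax(scores, index, num_segments):
    # Hypothetical sketch of a grouped softmax: entries sharing the same
    # segment id are normalized together, with a per-segment max subtracted
    # for numerical stability. Buffers are sized by num_segments, so it must
    # be >= 1 + max(index).
    maxes = np.full(num_segments, -np.inf)
    np.maximum.at(maxes, index, scores)
    exp = np.exp(scores - maxes[index])
    sums = np.zeros(num_segments)
    np.add.at(sums, index, exp)
    return exp / sums[index]

# Toy incidence: four (node, hyperedge) entries over 2 hyperedges.
# edge_ids plays the role of hyperedge_index[1]; since it holds hyperedge
# ids, the segment count must be the number of hyperedges, not of nodes.
edge_ids = np.array([0, 0, 1, 1])
alpha = np.array([1.0, 2.0, 0.5, 0.5])
out = segment_softmax(alpha, edge_ids, num_segments=2)
# Each hyperedge's attention scores now sum to 1.
```

If a graph had more hyperedges than nodes, sizing the buffers by the node count would make the scatter writes index past the end of maxes/sums, which matches the forward-pass failure described above.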
A-LOST-WAPITI authored Jan 25, 2024
1 parent 2ab0351 commit 7875987
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions torch_geometric/nn/conv/hypergraph_conv.py
```diff
@@ -174,9 +174,9 @@ def forward(self, x: Tensor, hyperedge_index: Tensor,
             alpha = (torch.cat([x_i, x_j], dim=-1) * self.att).sum(dim=-1)
             alpha = F.leaky_relu(alpha, self.negative_slope)
             if self.attention_mode == 'node':
-                alpha = softmax(alpha, hyperedge_index[1], num_nodes=x.size(0))
+                alpha = softmax(alpha, hyperedge_index[1], num_nodes=num_edges)
             else:
-                alpha = softmax(alpha, hyperedge_index[0], num_nodes=x.size(0))
+                alpha = softmax(alpha, hyperedge_index[0], num_nodes=num_nodes)
             alpha = F.dropout(alpha, p=self.dropout, training=self.training)

         D = scatter(hyperedge_weight[hyperedge_index[1]], hyperedge_index[0],
```
