do not drop batch size dimension for single inputs (#2878)
jppgks committed Dec 23, 2022
1 parent 7057c80 commit 461fa74
Showing 1 changed file with 1 addition and 1 deletion.
ludwig/explain/captum.py (1 addition, 1 deletion)
@@ -284,7 +284,7 @@ def get_total_attribution(
         a_reduced = a.detach().cpu()
         if a.ndim > 1:
             # Convert to token-level attributions by summing over the embedding dimension.
-            a_reduced = a.sum(dim=-1).squeeze(0)
+            a_reduced = a.sum(dim=-1)
         if a_reduced.ndim == 2:
             # Normalize token-level attributions of shape [batch_size, sequence_length] by dividing by the
             # norm of the sequence.
