
k-best probable paths are the same when the mask parameter is specified #1

Open
wangjunji opened this issue Feb 18, 2022 · 0 comments

wangjunji commented Feb 18, 2022

I am dealing with variable sequence lengths, so I need to mask the padding tokens. But the _viterbi_decode_nbest function produces the same paths when the mask contains zeros. Here is a snippet to reproduce:

import torch
from torchcrf import CRF

num_tags = 5  # number of tags is 5
model = CRF(num_tags)
seq_length = 3  # maximum sequence length in a batch
batch_size = 2  # number of samples in the batch
emissions = torch.randn(seq_length, batch_size, num_tags)

# Computing log likelihood
tags = torch.tensor([[2, 3], [1, 0], [3, 4]], dtype=torch.long)  # (seq_length, batch_size)
model(emissions, tags)

mask = torch.tensor([[1, 1], [1, 1], [0, 1]], dtype=torch.uint8)  # (seq_length, batch_size); torchcrf expects a ByteTensor mask
# Decoding
print(model.decode(emissions, mask=mask))  # decoding the best path
print(model.decode(emissions, mask=mask, nbest=3))  # decoding the top 3 paths

Result:

tensor([[[1, 0, 0],
         [3, 4, 0]]])
tensor([[[1, 0, 0],
         [3, 4, 0]],

        [[1, 0, 0],
         [3, 4, 2]],

        [[1, 0, 0],
         [3, 4, 1]]])
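Note that for the first sample (true length 2, since its mask ends in 0), the three "best" paths differ only in the padded last position. For illustration, padded positions can be stripped with mask-derived lengths before comparing paths; trim_paths below is a hypothetical helper I wrote for this report, not part of torchcrf:

```python
import torch

def trim_paths(paths, mask):
    """Truncate decoded paths to each sample's true length.

    paths: (nbest, batch_size, seq_length) tensor of tag indices,
           as returned above by decode(..., nbest=3).
    mask:  (seq_length, batch_size) tensor, 1 = real token, 0 = padding.
    Returns a nested list: result[k][b] is the k-th path for sample b,
    truncated to that sample's length.
    """
    lengths = mask.long().sum(dim=0)  # (batch_size,) true sequence lengths
    return [
        [path[b, : lengths[b]].tolist() for b in range(mask.size(1))]
        for path in paths
    ]

mask = torch.tensor([[1, 1], [1, 1], [0, 1]], dtype=torch.uint8)
paths = torch.tensor([[[1, 0, 0], [3, 4, 0]],
                      [[1, 0, 0], [3, 4, 2]],
                      [[1, 0, 0], [3, 4, 1]]])
print(trim_paths(paths, mask))
```

After trimming, all three decoded paths for the first sample collapse to [1, 0], which shows the n-best decoding returned duplicates for the masked sample rather than genuinely distinct paths.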
