
Bug when computing positional IDs from embeddings #36537

@SabrinaRichter

Description


System Info

In `create_position_ids_from_inputs_embeds`, the position IDs are computed as:

```python
position_ids = torch.arange(
    self.padding_idx + 1, sequence_length + self.padding_idx + 1, dtype=torch.long, device=inputs_embeds.device
)
```

I think it should be:

```python
position_ids = torch.arange(
    1, sequence_length + 1, dtype=torch.long, device=inputs_embeds.device
)
```

I don't see how the index of my padding token in the alphabet has anything to do with the positions in the sequence.
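
For illustration, a minimal standalone sketch of what the two variants produce (assuming `padding_idx = 1`, the `<pad>` index in the ESM alphabet; the concrete values are illustrative):

```python
import torch

# Illustrative values: padding_idx = 1 corresponds to <pad> in the ESM alphabet.
padding_idx = 1
sequence_length = 5

# Current computation: every position is shifted by padding_idx.
current = torch.arange(padding_idx + 1, sequence_length + padding_idx + 1, dtype=torch.long)
print(current)  # tensor([2, 3, 4, 5, 6])

# Proposed computation: positions start at 1 regardless of padding_idx.
proposed = torch.arange(1, sequence_length + 1, dtype=torch.long)
print(proposed)  # tensor([1, 2, 3, 4, 5])
```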

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

Expected behavior
