Add a learned positional embedding layer #23

@mattdangerw

Description

We should expose a Keras layer for a learned positional embedding through keras_nlp.

This layer will take a maximum sequence length and an embedding dimension as input, and learn a positional embedding that can be combined with a token embedding.
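
A minimal sketch of what such a layer might look like, assuming a TensorFlow/Keras backend; the class name `PositionEmbedding` and the argument names `max_length` and `embedding_dim` are illustrative placeholders, not a confirmed keras_nlp API.

```python
import tensorflow as tf
from tensorflow import keras


class PositionEmbedding(keras.layers.Layer):
    """Learns a positional embedding up to a fixed maximum sequence length."""

    def __init__(self, max_length, embedding_dim, **kwargs):
        super().__init__(**kwargs)
        self.max_length = max_length
        self.embedding_dim = embedding_dim

    def build(self, input_shape):
        # One trainable embedding vector per position, matching the
        # feature dimension of the incoming token embeddings.
        self.position_embeddings = self.add_weight(
            name="position_embeddings",
            shape=(self.max_length, self.embedding_dim),
            initializer="glorot_uniform",
            trainable=True,
        )

    def call(self, inputs):
        # inputs: token embeddings of shape (batch, seq_len, embedding_dim).
        # Slice to the actual sequence length and broadcast-add over the batch.
        seq_length = tf.shape(inputs)[-2]
        return inputs + self.position_embeddings[:seq_length, :]
```

In use, the output of a token embedding (e.g. `keras.layers.Embedding`) would be passed through this layer, so the positional and token embeddings are summed element-wise before feeding downstream transformer blocks.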
