Mirror BERT testing speedups for other models #866

@mattdangerw

Description

We can mirror the changes in #859 for all other models in the repo.

Let's do this one model at a time, to keep things granular.

You can run a single model's tests with pytest keras_nlp/models/XX; if all goes well, you should see a noticeable improvement in elapsed time.

Checklist for a given model:

  • Add a serialization test for all classes.
  • Mark the saved model testing as "large" for all classes.
  • Make the backbone sizes as small as possible while still testing all logic (e.g. num_layers=2).
  • Only run a single jit_compile=False test per model class. E.g. test_classifier_fit_no_xla.
  • Other minor readability and efficiency improvements (check the original PR carefully).
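The serialization item from the checklist can be sketched with a plain-Python stand-in for a model class. This is only an illustration of the pattern, not the actual keras_nlp API: TinyBackbone, its constructor arguments, and its config keys are hypothetical, though get_config/from_config follow the Keras convention, and the constructor defaults show the "small backbone" idea (num_layers=2).

```python
# Hedged sketch of a serialization round-trip test. TinyBackbone is a
# hypothetical stand-in for a real backbone class; only the get_config /
# from_config pattern mirrors the Keras convention used in the repo.

class TinyBackbone:
    """Stand-in backbone, deliberately tiny (e.g. num_layers=2)."""

    def __init__(self, vocabulary_size=10, num_layers=2, hidden_dim=4):
        self.vocabulary_size = vocabulary_size
        self.num_layers = num_layers
        self.hidden_dim = hidden_dim

    def get_config(self):
        # Everything needed to rebuild the object goes into the config dict.
        return {
            "vocabulary_size": self.vocabulary_size,
            "num_layers": self.num_layers,
            "hidden_dim": self.hidden_dim,
        }

    @classmethod
    def from_config(cls, config):
        # Rebuild an equivalent object purely from its config.
        return cls(**config)


def test_serialization_round_trip():
    original = TinyBackbone()
    restored = TinyBackbone.from_config(original.get_config())
    # A round trip through the config should preserve every attribute.
    assert restored.get_config() == original.get_config()


test_serialization_round_trip()
```

In the real repo, the slow saved-model test would additionally carry a "large" pytest marker so it only runs when large tests are explicitly requested, keeping the default test run fast.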

Metadata

Labels: stat:contributions welcome