
Conversation

mattdangerw
Member

Overall, our model testing is getting unwieldy and slow. This speeds up the default testing for BERT where we can.

  • Models are created as small as possible.
  • Most testing is done with XLA, with only a single test for each model with jit_compile=False.
  • Saved model testing is marked large, with a much faster serialization test added that will run by default. (A sketch of this pattern follows the list.)
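
For reference, here is a minimal sketch of the pattern, not the actual keras-nlp test code. The model and test names are illustrative, and it assumes pytest with a custom `large` marker registered in the project's pytest config:

```python
# Minimal sketch, not the actual keras-nlp tests. Assumes pytest with a
# custom `large` marker registered, and TF-era Keras (SavedModel format).
import os

import pytest
import tensorflow as tf
from tensorflow import keras


class TinyModelTest(tf.test.TestCase):
    def setUp(self):
        # Keep the model as small as possible so default runs stay fast.
        self.model = keras.Sequential(
            [
                keras.layers.Embedding(input_dim=10, output_dim=2),
                keras.layers.Dense(2),
            ]
        )
        self.inputs = tf.ones((2, 4), dtype="int32")

    def test_predict_xla(self):
        # Most coverage runs XLA-compiled.
        self.model.compile(jit_compile=True)
        self.model.predict(self.inputs)

    def test_predict_no_xla(self):
        # A single test per model exercises the jit_compile=False path.
        self.model.compile(jit_compile=False)
        self.model.predict(self.inputs)

    def test_serialization(self):
        # Fast config round-trip that runs by default.
        config = self.model.get_config()
        restored = keras.Sequential.from_config(config)
        self.assertEqual(len(restored.layers), len(self.model.layers))

    @pytest.mark.large
    def test_saved_model(self):
        # The full saved model round-trip is slow, so it is marked large
        # and only runs when large tests are explicitly requested.
        path = os.path.join(self.get_temp_dir(), "model")
        self.model.save(path)
        restored = keras.models.load_model(path)
        self.assertAllClose(
            self.model.predict(self.inputs),
            restored.predict(self.inputs),
        )
```

With this split, a plain `pytest` run executes only the fast tests, and the expensive saved model round-trip runs only when the large marker is selected (for example via a `--run_large` style flag, depending on how the marker is wired up).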

@mattdangerw
Member Author

Once we are good with an approach, I will open a contributor issue to replicate.

mattdangerw merged commit 855b82f into keras-team:master on Mar 17, 2023
kanpuriyanawab pushed a commit to kanpuriyanawab/keras-nlp that referenced this pull request Mar 26, 2023
* Speed up default BERT testing roughly 3x

* Address comments