Regarding examples around BERT, do we use the HuggingFace transformers and tokenizers libraries? If so, fine-tuning and downstream tasks can be shown in small examples like this.
And then we can show that doing the same thing on TPU is as easy as adding a distribution strategy and creating the model under `strategy.scope()`. Does this feel like a good example?
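A sketch of the TPU half, assuming TF 2.x's `tf.distribute.TPUStrategy` (the `make_strategy` helper and the placeholder model are illustrative; on Colab/Cloud TPU the resolver typically needs no arguments):

```python
# Hedged sketch: falls back to the default (CPU/GPU) strategy when no
# TPU is available, so the same code runs everywhere.
import tensorflow as tf

def make_strategy():
    try:
        resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
        tf.config.experimental_connect_to_cluster(resolver)
        tf.tpu.experimental.initialize_tpu_system(resolver)
        return tf.distribute.TPUStrategy(resolver)
    except (ValueError, tf.errors.NotFoundError):
        # No TPU detected: use the default single-device strategy.
        return tf.distribute.get_strategy()

strategy = make_strategy()
with strategy.scope():
    # Model creation goes here, e.g. a HuggingFace
    # TFBertForSequenceClassification; everything else is unchanged.
    model = tf.keras.Sequential([tf.keras.layers.Dense(2)])
```

The only TPU-specific lines are the strategy setup and the `strategy.scope()` context; the fine-tuning code itself would not change.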