Cleanup TPU bits from run_glue.py
TPU runner is currently implemented in:
https://github.com/pytorch-tpu/transformers/blob/tpu/examples/run_glue_tpu.py.

We plan to upstream this directly into `huggingface/transformers`
(either the `master` branch or a `tpu` branch) once it has been more
thoroughly tested.
jysohn23 committed Nov 20, 2019
1 parent b421758 commit 6ef1edd
Showing 1 changed file: examples/run_glue.py (1 addition, 1 deletion)
@@ -474,7 +474,7 @@ def main():
 
 
     # Saving best-practices: if you use defaults names for the model, you can reload it using from_pretrained()
-    if args.do_train and (args.local_rank == -1 or torch.distributed.get_rank() == 0) and not args.tpu:
+    if args.do_train and (args.local_rank == -1 or torch.distributed.get_rank() == 0):
        # Create output directory if needed
        if not os.path.exists(args.output_dir) and args.local_rank in [-1, 0]:
            os.makedirs(args.output_dir)
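For context: the `args.tpu` guard removed above was a leftover from hosting TPU logic inside run_glue.py itself. In the separate TPU runner linked in the commit message, checkpoint saving is typically gated through torch_xla rather than a CLI flag. The sketch below illustrates that pattern; it is not code from this commit, `save_on_tpu` is a hypothetical helper, and it assumes the runner uses `torch_xla.core.xla_model`.

import os

import torch_xla.core.xla_model as xm


def save_on_tpu(model, output_dir):
    # Only the master ordinal (core 0) should create directories,
    # so the TPU cores do not race on the filesystem.
    if xm.is_master_ordinal():
        os.makedirs(output_dir, exist_ok=True)
    # xm.save moves tensors to CPU before serializing and, by default,
    # writes only from the master ordinal, so every core may call it.
    xm.save(model.state_dict(), os.path.join(output_dir, "pytorch_model.bin"))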
