docs(wandb): explain how to use W&B integration
borisdayma committed Jul 8, 2020
1 parent 40d98eb commit 810bf24
Showing 1 changed file with 29 additions and 0 deletions: examples/README.md
@@ -78,3 +78,32 @@ python examples/xla_spawn.py --num_cores 8 \
```

Feedback and more use cases and benchmarks involving TPUs are welcome; please share with the community.

## Logging experiments with Weights & Biases

You can easily log and monitor your runs with [Weights & Biases](https://docs.wandb.com/library/integrations/huggingface).

Install wandb with:

```bash
pip install wandb
```

Then log in from the command line:

```bash
wandb login
```

If you are in Jupyter or Colab, you should log in with:

```python
import wandb
wandb.login()
```

Whenever you use the `Trainer` or `TFTrainer` classes, your losses, evaluation metrics, model topology, and gradients (for `Trainer` only) will be logged automatically.
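For example, once wandb is installed and you are logged in, a standard `Trainer` training loop needs no extra logging code. The following is a minimal sketch: the model checkpoint, output directory, and `train_dataset`/`eval_dataset` are placeholders assumed to be prepared elsewhere, not part of this repository's examples.

```python
from transformers import AutoModelForSequenceClassification, Trainer, TrainingArguments

# Placeholder model; train_dataset and eval_dataset are assumed to exist.
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

training_args = TrainingArguments(
    output_dir="./results",  # placeholder output directory
    num_train_epochs=3,
    logging_steps=50,        # controls how often losses are reported
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
)

# With wandb installed and a login active, losses and evaluation
# metrics stream to your W&B run automatically during training.
trainer.train()
```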

For advanced configuration and examples, refer to the [W&B documentation](https://docs.wandb.com/library/integrations/huggingface).

When using 🤗 Transformers with PyTorch Lightning, runs can be tracked through `WandbLogger`. Refer to related [documentation & examples](https://docs.wandb.com/library/frameworks/pytorch/lightning).
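As a minimal sketch of attaching the logger (the project name and `model` are placeholders; `model` is assumed to be a `LightningModule` defined elsewhere):

```python
from pytorch_lightning import Trainer
from pytorch_lightning.loggers import WandbLogger

# Attach W&B logging to a Lightning Trainer; "my-project" is a placeholder.
wandb_logger = WandbLogger(project="my-project")
trainer = Trainer(logger=wandb_logger, max_epochs=3)

# `model` is assumed to be a LightningModule defined elsewhere.
trainer.fit(model)
```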
