2 changes: 0 additions & 2 deletions .azure-pipelines/gpu-tests.yml
@@ -108,8 +108,6 @@ jobs:
bash pl_examples/run_examples.sh --trainer.gpus=1
bash pl_examples/run_examples.sh --trainer.gpus=2 --trainer.strategy=ddp
bash pl_examples/run_examples.sh --trainer.gpus=2 --trainer.strategy=ddp --trainer.precision=16
bash pl_examples/run_examples.sh --trainer.gpus=2 --trainer.strategy=dp
bash pl_examples/run_examples.sh --trainer.gpus=2 --trainer.strategy=dp --trainer.precision=16
env:
PL_USE_MOCKED_MNIST: "1"
displayName: 'Testing: examples'
3 changes: 3 additions & 0 deletions .gitignore
@@ -24,6 +24,9 @@ __pycache__/
*.py[cod]
*$py.class
timit_data/
grid_generated*
grid_ori*



# C extensions
16 changes: 9 additions & 7 deletions docs/source/starter/lightning_lite.rst
@@ -3,15 +3,14 @@ LightningLite - Stepping Stone to Lightning
###########################################


:class:`~pytorch_lightning.lite.LightningLite` enables pure PyTorch users to scale their existing code
on any kind of device while retaining full control over their own loops and optimization logic.

.. image:: https://pl-public-data.s3.amazonaws.com/docs/static/images/lite/lightning_lite.gif
:alt: Animation showing how to convert a standard training loop to a Lightning loop
:width: 600px
:alt: Animation showing how to convert your PyTorch code to LightningLite.
:width: 500
:align: center

|

:class:`~pytorch_lightning.lite.LightningLite` enables pure PyTorch users to scale their existing code
on any kind of device while retaining full control over their own loops and optimization logic.
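
For illustration, here is a minimal sketch of that workflow, assuming the standard :class:`~pytorch_lightning.lite.LightningLite` API (the snippet below is not one of the documented examples):

.. code-block:: python

    import torch
    from torch.utils.data import DataLoader, TensorDataset
    from pytorch_lightning.lite import LightningLite


    class Lite(LightningLite):
        def run(self, epochs=2):
            model = torch.nn.Linear(32, 2)
            optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
            # setup() moves the model to the chosen device and wraps the optimizer for the strategy
            model, optimizer = self.setup(model, optimizer)

            dataset = TensorDataset(torch.randn(64, 32), torch.randn(64, 2))
            # setup_dataloaders() injects distributed samplers and device transfers as needed
            dataloader = self.setup_dataloaders(DataLoader(dataset, batch_size=16))

            model.train()
            for _ in range(epochs):
                for x, y in dataloader:
                    optimizer.zero_grad()
                    loss = torch.nn.functional.mse_loss(model(x), y)
                    self.backward(loss)  # replaces loss.backward()
                    optimizer.step()


    # the same code scales by changing only these arguments, e.g. gpus=2, strategy="ddp"
    Lite(accelerator="cpu").run()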

:class:`~pytorch_lightning.lite.LightningLite` is the right tool for you if you match one of the two following descriptions:

@@ -246,6 +245,9 @@ from its hundreds of features.

You can see our :class:`~pytorch_lightning.lite.LightningLite` as a
future :class:`~pytorch_lightning.core.lightning.LightningModule` and slowly refactor your code into its API.
Below, the :meth:`~pytorch_lightning.core.lightning.LightningModule.training_step`, :meth:`~pytorch_lightning.core.lightning.LightningModule.forward`,
:meth:`~pytorch_lightning.core.lightning.LightningModule.configure_optimizers`, and :meth:`~pytorch_lightning.core.lightning.LightningModule.train_dataloader`
methods are implemented.


.. code-block:: python
@@ -300,7 +302,7 @@ future :class:`~pytorch_lightning.core.lightning.LightningModule` and slowly ref


Finally, change the :meth:`~pytorch_lightning.lite.LightningLite.run` into a
:meth:`~pytorch_lightning.core.lightning.LightningModule.__init__` and drop the inner code for setting up the components.
:meth:`~pytorch_lightning.core.lightning.LightningModule.__init__` and drop the ``fit`` method.

.. code-block:: python

10 changes: 9 additions & 1 deletion pl_examples/README.md
@@ -25,7 +25,7 @@ In this folder, we have 2 simple examples:

- [Image Classifier](./basic_examples/backbone_image_classifier.py) (trains arbitrary datasets with arbitrary backbones).
- [Image Classifier + DALI](./basic_examples/mnist_examples/image_classifier_4_dali.py) (defines the model inside the `LightningModule`).
- [Autoencoder](./basic_examples/autoencoder.py) (shows how the `LightningModule` can be used as a system)
- [Autoencoder](./basic_examples/autoencoder.py)

______________________________________________________________________

@@ -37,6 +37,14 @@ for advanced use cases.

______________________________________________________________________

## Integration Examples

In this folder, we have 1 simple example:

- [Image Classifier + DALI](./integration_examples/dali_image_classifier.py) (defines the model inside the `LightningModule`).

______________________________________________________________________

## Loop examples

Contains implementations leveraging [loop customization](https://pytorch-lightning.readthedocs.io/en/latest/extensions/loops.html) to enhance the Trainer with new optimization routines.
61 changes: 53 additions & 8 deletions pl_examples/basic_examples/README.md
@@ -14,7 +14,7 @@ Trains a simple CNN over MNIST using vanilla PyTorch.

```bash
# CPU
python image_classifier_1_pytorch.py
python mnist_examples/image_classifier_1_pytorch.py
```

______________________________________________________________________
@@ -25,7 +25,7 @@ This script shows you how to scale the previous script to enable GPU and multi-G

```bash
# CPU / multiple GPUs if available
python image_classifier_2_lite.py
python mnist_examples/image_classifier_2_lite.py
```

______________________________________________________________________
@@ -36,7 +36,7 @@ This script shows you how to prepare your conversion from [LightningLite](https:

```bash
# CPU / multiple GPUs if available
python image_classifier_3_lite_to_lightning_module.py
python mnist_examples/image_classifier_3_lite_to_lightning_module.py
```

______________________________________________________________________
@@ -47,10 +47,10 @@ This script shows you the result of the conversion to the `LightningModule` and

```bash
# CPU
python image_classifier_4_lightning_module.py
python mnist_examples/image_classifier_4_lightning_module.py

# GPUs (any number)
python image_classifier_4_lightning_module.py --trainer.gpus 2
python mnist_examples/image_classifier_4_lightning_module.py --trainer.gpus 2
```

______________________________________________________________________
@@ -61,11 +61,56 @@ This script shows you how to extract the data related components into a `Lightni

```bash
# CPU
python image_classifier_5_lightning_datamodule.py
python mnist_examples/image_classifier_5_lightning_datamodule.py

# GPUs (any number)
python image_classifier_5_lightning_datamodule.py --trainer.gpus 2
python mnist_examples/image_classifier_5_lightning_datamodule.py --trainer.gpus 2

# Distributed Data Parallel (DDP)
python image_classifier_5_lightning_datamodule.py --trainer.gpus 2 --trainer.strategy 'ddp'
python mnist_examples/image_classifier_5_lightning_datamodule.py --trainer.gpus 2 --trainer.strategy 'ddp'
```
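
For illustration, a minimal sketch of the `LightningDataModule` pattern (assumed names, not the repository's datamodule):

```python
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset, random_split


class RandomDataModule(pl.LightningDataModule):
    """Groups all data-related hooks in one reusable place."""

    def __init__(self, batch_size: int = 32):
        super().__init__()
        self.batch_size = batch_size

    def setup(self, stage=None):
        # download/split/transform logic lives here instead of in the LightningModule
        dataset = TensorDataset(torch.randn(1000, 32), torch.randint(0, 2, (1000,)))
        self.train_set, self.val_set = random_split(dataset, [800, 200])

    def train_dataloader(self):
        return DataLoader(self.train_set, batch_size=self.batch_size, shuffle=True)

    def val_dataloader(self):
        return DataLoader(self.val_set, batch_size=self.batch_size)


# Trainer().fit(model, datamodule=RandomDataModule()) then replaces passing raw dataloaders
```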

______________________________________________________________________

#### AutoEncoder

This script shows you how to implement a CNN auto-encoder.

```bash
# CPU
python autoencoder.py

# GPUs (any number)
python autoencoder.py --trainer.gpus 2

# Distributed Data Parallel (DDP)
python autoencoder.py --trainer.gpus 2 --trainer.strategy 'ddp'
```
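
For illustration, a minimal sketch of such an auto-encoder `LightningModule` (assumed architecture, not the actual script):

```python
import torch
from torch import nn
from torch.nn import functional as F
import pytorch_lightning as pl


class LitAutoEncoder(pl.LightningModule):
    """Reconstructs its input; labels in the batch are ignored."""

    def __init__(self):
        super().__init__()
        # small convolutional encoder/decoder for 1x28x28 inputs (e.g. MNIST)
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(), nn.Conv2d(8, 16, 3, stride=2, padding=1)
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(16, 8, 2, stride=2), nn.ReLU(), nn.ConvTranspose2d(8, 1, 2, stride=2)
        )

    def training_step(self, batch, batch_idx):
        x, _ = batch
        x_hat = self.decoder(self.encoder(x))
        loss = F.mse_loss(x_hat, x)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```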

______________________________________________________________________

#### Backbone Image Classifier

This script shows you how to implement a `LightningModule` as a system.
A system is a `LightningModule` that takes a single `torch.nn.Module` (the backbone), which makes exporting to production simpler.

```bash
# CPU
python backbone_image_classifier.py

# GPUs (any number)
python backbone_image_classifier.py --trainer.gpus 2

# Distributed Data Parallel (DDP)
python backbone_image_classifier.py --trainer.gpus 2 --trainer.strategy 'ddp'
```
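
For illustration, a minimal sketch of the backbone pattern (assumed names, not the actual script):

```python
import torch
from torch.nn import functional as F
import pytorch_lightning as pl


class LitClassifier(pl.LightningModule):
    """A 'system': the LightningModule only orchestrates a swappable backbone."""

    def __init__(self, backbone: torch.nn.Module, lr: float = 1e-3):
        super().__init__()
        self.backbone = backbone
        self.lr = lr

    def forward(self, x):
        # exporting just `self.backbone` keeps production deployment simple
        return self.backbone(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.backbone.parameters(), lr=self.lr)


# any torch.nn.Module can be plugged in as the backbone
model = LitClassifier(backbone=torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10)))
```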

______________________________________________________________________

#### PyTorch Profiler

This script shows you how to activate the [PyTorch Profiler](https://github.com/pytorch/kineto) with Lightning.

```bash
python profiler_example.py
```
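
For illustration, a minimal sketch of how the profiler is enabled (assuming the standard `profiler` argument of the `Trainer`; this is not the script itself):

```python
import pytorch_lightning as pl

# passing profiler="pytorch" asks the Trainer to wrap the training/validation steps
# with the PyTorch Profiler and report a summary at the end of the run
trainer = pl.Trainer(max_epochs=1, profiler="pytorch")
# trainer.fit(model) would then produce the profiling report
```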
@@ -18,6 +18,7 @@
import torch
import torchvision.transforms as T
from torch.nn import functional as F
from torchmetrics import Accuracy

from pl_examples import cli_lightning_logo
from pl_examples.basic_examples.mnist_datamodule import MNIST
@@ -31,6 +32,7 @@ def __init__(self, model=None, lr=1.0, gamma=0.7, batch_size=32):
super().__init__()
self.save_hyperparameters()
self.model = model or Net()
self.test_acc = Accuracy()

def forward(self, x):
return self.model(x)
@@ -45,6 +47,7 @@ def test_step(self, batch, batch_idx):
x, y = batch
logits = self.forward(x)
loss = F.nll_loss(logits, y.long())
self.log("test_acc", self.test_acc(logits, y))
return loss

def configure_optimizers(self):
@@ -18,6 +18,7 @@
import torch
import torchvision.transforms as T
from torch.nn import functional as F
from torchmetrics import Accuracy

from pl_examples import cli_lightning_logo
from pl_examples.basic_examples.mnist_datamodule import MNIST
@@ -31,6 +32,7 @@ def __init__(self, model, lr=1.0, gamma=0.7, batch_size=32):
super().__init__()
self.save_hyperparameters()
self.model = model or Net()
self.test_acc = Accuracy()

def forward(self, x):
return self.model(x)
@@ -45,6 +47,7 @@ def test_step(self, batch, batch_idx):
x, y = batch
logits = self.forward(x)
loss = F.nll_loss(logits, y.long())
self.log("test_acc", self.test_acc(logits, y))
return loss

def configure_optimizers(self):