12 changes: 6 additions & 6 deletions .run_local_tests.sh
@@ -2,11 +2,11 @@

 # use this to run tests
 rm -rf _ckpt_*
-rm -rf tests/save_dir*
-rm -rf tests/mlruns_*
-rm -rf tests/cometruns*
-rm -rf tests/wandb*
-rm -rf tests/tests/*
-rm -rf lightning_logs
+rm -rf ./tests/save_dir*
+rm -rf ./tests/mlruns_*
+rm -rf ./tests/cometruns*
+rm -rf ./tests/wandb*
+rm -rf ./tests/tests/*
+rm -rf ./lightning_logs
 coverage run --source pytorch_lightning -m py.test pytorch_lightning tests pl_examples -v --doctest-modules
 coverage report -m
4 changes: 2 additions & 2 deletions docs/requirements.txt
@@ -4,7 +4,7 @@ m2r  # fails with multi-line text
 nbsphinx
 pandoc
 docutils
-git+https://github.com/PytorchLightning/lightning_sphinx_theme.git
 sphinxcontrib-fulltoc
 sphinxcontrib-mockautodoc
-pip_shims
+git+https://github.com/PytorchLightning/lightning_sphinx_theme.git
+# pip_shims
12 changes: 8 additions & 4 deletions docs/source/conf.py
@@ -348,7 +348,11 @@ def find_source():

 autodoc_member_order = 'groupwise'
 autoclass_content = 'both'
-autodoc_default_flags = [
-    'members', 'undoc-members', 'show-inheritance', 'private-members',
-    # 'special-members', 'inherited-members'
-]
+autodoc_default_options = {
+    'members': True,
+    'special-members': '__call__',
+    'undoc-members': True,
+    # 'exclude-members': '__weakref__',
+    'show-inheritance': True,
+    'private-members': True,
+}
8 changes: 0 additions & 8 deletions docs/source/documentation.rst

This file was deleted.

2 changes: 1 addition & 1 deletion docs/source/index.rst
@@ -4,7 +4,7 @@
    contain the root `toctree` directive.

 PyTorch-Lightning Documentation
-=============================
+===============================

 .. toctree::
    :maxdepth: 1
3 changes: 2 additions & 1 deletion docs/source/lightning-module.rst
@@ -2,7 +2,8 @@
     :class: hidden-section

 LightningModule
-===========
+===============
+
 .. automodule:: pytorch_lightning.core
     :exclude-members:
         _abc_impl,
7 changes: 0 additions & 7 deletions docs/source/modules.rst

This file was deleted.

18 changes: 9 additions & 9 deletions docs/source/tutorials.rst
@@ -1,20 +1,20 @@
 Refactoring PyTorch into Lightning
-==================================
-`Tutorial <https://towardsdatascience.com/how-to-refactor-your-pytorch-code-to-get-these-42-benefits-of-pytorch-lighting-6fdd0dc97538>`_
+----------------------------------
+`How to refactor your PyTorch code to get these 42 benefits of PyTorch-Lighting <https://towardsdatascience.com/how-to-refactor-your-pytorch-code-to-get-these-42-benefits-of-pytorch-lighting-6fdd0dc97538>`_

 Start a research project
-=========================
+------------------------
 `Research seed <https://github.com/PytorchLightning/pytorch-lightning-conference-seed>`_

 Basic Lightning use
-====================
-`Tutorial <https://towardsdatascience.com/supercharge-your-ai-research-with-pytorch-lightning-337948a99eec>`_
+-------------------
+`Supercharge your AI research with PyTorch-Lightning <https://towardsdatascience.com/supercharge-your-ai-research-with-pytorch-lightning-337948a99eec>`_

 9 key Lightning tricks
-========================
-`Tutorial <https://towardsdatascience.com/9-tips-for-training-lightning-fast-neural-networks-in-pytorch-8e63a502f565>`_
+-----------------------
+`Tutorial on 9 key speed features in PyTorch-Lightning <https://towardsdatascience.com/9-tips-for-training-lightning-fast-neural-networks-in-pytorch-8e63a502f565>`_

 Multi-node training on SLURM
-=============================
-`Tutorial <https://towardsdatascience.com/trivial-multi-node-training-with-pytorch-lightning-ff75dfb809bd>`_
+----------------------------
+`Trivial multi node training with PyTorch-Lightning <https://towardsdatascience.com/trivial-multi-node-training-with-pytorch-lightning-ff75dfb809bd>`_
1 change: 1 addition & 0 deletions pl_examples/__init__.py
@@ -43,6 +43,7 @@

 The main function is your entry into the program. This is where you init your model, checkpoint directory,
 and launch the training. The main function should have 3 arguments:
+
 - hparams: a configuration of hyperparameters.
 - slurm_manager: Slurm cluster manager object (can be None)
 - dict: for you to return any values you want (useful in meta-learning, otherwise set to)
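For orientation, a minimal sketch of the entry-point shape this docstring describes; `MyModel` is a hypothetical LightningModule and the names and defaults are illustrative, not the package's API:

    from argparse import Namespace

    import pytorch_lightning as pl


    def main(hparams, slurm_manager=None):
        # hparams: Namespace of hyperparameters; slurm_manager: cluster manager or None
        model = MyModel(hparams)  # MyModel: hypothetical LightningModule taking `hparams`
        trainer = pl.Trainer()
        trainer.fit(model)
        return {}  # values to hand back, e.g. for meta-learning


    # usage: main(Namespace(learning_rate=0.1), slurm_manager=None)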
4 changes: 2 additions & 2 deletions pl_examples/basic_examples/cpu_template.py
@@ -7,8 +7,8 @@
 import numpy as np
 import torch

+import pytorch_lightning as pl
 from pl_examples.basic_examples.lightning_module_template import LightningTemplateModel
-from pytorch_lightning import Trainer

 SEED = 2334
 torch.manual_seed(SEED)
@@ -28,7 +28,7 @@ def main(hparams):
     # ------------------------
     # 2 INIT TRAINER
     # ------------------------
-    trainer = Trainer()
+    trainer = pl.Trainer()
4 changes: 2 additions & 2 deletions pl_examples/basic_examples/gpu_template.py
@@ -7,8 +7,8 @@
 import numpy as np
 import torch

+import pytorch_lightning as pl
 from pl_examples.basic_examples.lightning_module_template import LightningTemplateModel
-from pytorch_lightning import Trainer

 SEED = 2334
 torch.manual_seed(SEED)
@@ -28,7 +28,7 @@ def main(hparams):
     # ------------------------
     # 2 INIT TRAINER
     # ------------------------
-    trainer = Trainer(
+    trainer = pl.Trainer(
         gpus=hparams.gpus,
         distributed_backend=hparams.distributed_backend,
         use_amp=hparams.use_16bit
3 changes: 1 addition & 2 deletions pl_examples/basic_examples/lightning_module_template.py
@@ -16,10 +16,9 @@
 from torchvision.datasets import MNIST

 import pytorch_lightning as pl
-from pytorch_lightning.core.lightning import LightningModule


-class LightningTemplateModel(LightningModule):
+class LightningTemplateModel(pl.LightningModule):
     """
     Sample model to show how to define a template
     """
4 changes: 2 additions & 2 deletions pl_examples/multi_node_examples/multi_node_ddp2_demo.py
@@ -7,8 +7,8 @@
 import numpy as np
 import torch

+import pytorch_lightning as pl
 from pl_examples.basic_examples.lightning_module_template import LightningTemplateModel
-from pytorch_lightning import Trainer

 SEED = 2334
 torch.manual_seed(SEED)
@@ -29,7 +29,7 @@ def main(hparams):
     # ------------------------
     # 2 INIT TRAINER
     # ------------------------
-    trainer = Trainer(
+    trainer = pl.Trainer(
         gpus=2,
         num_nodes=2,
         distributed_backend='ddp2'
4 changes: 2 additions & 2 deletions pl_examples/multi_node_examples/multi_node_ddp_demo.py
@@ -7,8 +7,8 @@
 import numpy as np
 import torch

+import pytorch_lightning as pl
 from pl_examples.basic_examples.lightning_module_template import LightningTemplateModel
-from pytorch_lightning import Trainer

 SEED = 2334
 torch.manual_seed(SEED)
@@ -29,7 +29,7 @@ def main(hparams):
     # ------------------------
     # 2 INIT TRAINER
     # ------------------------
-    trainer = Trainer(
+    trainer = pl.Trainer(
         gpus=2,
         num_nodes=2,
         distributed_backend='ddp'
9 changes: 4 additions & 5 deletions pytorch_lightning/__init__.py
@@ -1,6 +1,6 @@
 """Root package info."""

-__version__ = '0.6.0.dev'
+__version__ = '0.6.1.dev'
 __author__ = 'William Falcon et al.'
 __author_email__ = 'waf2107@columbia.edu'
 __license__ = 'Apache-2.0'
@@ -10,7 +10,6 @@
 __docs__ = "PyTorch Lightning is the lightweight PyTorch wrapper for ML researchers." \
            " Scale your models. Write less boilerplate."

-
 try:
     # This variable is injected in the __builtins__ by the build
     # process. It used to enable importing subpackages of skimage when
@@ -28,12 +27,12 @@
     import logging as log
     log.basicConfig(level=log.INFO)

-    from .trainer.trainer import Trainer
-    from .core.lightning import LightningModule
-    from .core.decorators import data_loader
+    from .core import data_loader, LightningModule
+    from .trainer import Trainer

     __all__ = [
         'Trainer',
         'LightningModule',
         'data_loader',
     ]
+    # __call__ = __all__
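With the re-exports above in place, downstream code can pull the public names straight from the package root; a minimal sketch:

    import pytorch_lightning as pl
    from pytorch_lightning import Trainer, LightningModule, data_loader

    assert pl.Trainer is Trainer  # both import paths name the same class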
5 changes: 4 additions & 1 deletion pytorch_lightning/core/__init__.py
@@ -96,6 +96,9 @@ def test_dataloader(self):
 for a live demo.

 """
+
+from .decorators import data_loader
 from .lightning import LightningModule

-__all__ = ['LightningModule']
+__all__ = ['LightningModule', 'data_loader']
+# __call__ = __all__
5 changes: 2 additions & 3 deletions pytorch_lightning/core/decorators.py
@@ -3,12 +3,11 @@


 def data_loader(fn):
-    """
-    Decorator to make any fx with this use the lazy property
+    """Decorator to make any fx with this use the lazy property.
+
     :param fn:
     :return:
     """
-
     wraps(fn)
     attr_name = '_lazy_' + fn.__name__
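The decorator body is truncated above; as a hedged sketch (not the package's exact code), the `_lazy_` attribute name suggests the standard cache-on-first-call pattern:

    from functools import wraps


    def data_loader(fn):
        # cache fn's result on the instance under a private attribute
        attr_name = '_lazy_' + fn.__name__

        @wraps(fn)
        def _cached(self):
            if not hasattr(self, attr_name):
                setattr(self, attr_name, fn(self))  # build the loader once
            return getattr(self, attr_name)

        return _cached

Typical use is on a LightningModule dataloader hook, e.g. placing `@data_loader` above a `val_dataloader` method so the loader is constructed only on first access.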
40 changes: 21 additions & 19 deletions pytorch_lightning/core/lightning.py
@@ -18,6 +18,7 @@


 class LightningModule(ABC, GradInformation, ModelIO, ModelHooks):
+
     def __init__(self, *args, **kwargs):
         super(LightningModule, self).__init__(*args, **kwargs)

@@ -115,6 +116,7 @@ def training_step(self, *args, **kwargs):
         :param int batch_idx: Integer displaying which batch this is
         :return: dict with loss key and optional log, progress keys
         if implementing training_step, return whatever you need in that step:
+
         - loss -> tensor scalar [REQUIRED]
         - progress_bar -> Dict for progress bar display. Must have only tensors
         - log -> Dict of metrics to add to logger. Must have only tensors (no images, etc)
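As a concrete illustration of that return contract, a sketch of a `training_step` inside a hypothetical classification LightningModule:

    import torch.nn.functional as F

    def training_step(self, batch, batch_idx):
        x, y = batch
        y_hat = self.forward(x)
        loss = F.cross_entropy(y_hat, y)
        acc = (y_hat.argmax(dim=1) == y).float().mean()
        return {
            'loss': loss,                        # REQUIRED: scalar tensor
            'progress_bar': {'train_acc': acc},  # tensors only, shown in the bar
            'log': {'train_loss': loss},         # tensors only, sent to the logger
        }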
@@ -1061,30 +1063,30 @@ def load_from_checkpoint(cls, checkpoint_path, map_location=None):
         it stores the hyperparameters in the checkpoint if you initialized your LightningModule
         with an argument called `hparams` which is a Namespace or dictionary of hyperparameters

-    Example
-    -------
-    .. code-block:: python
+        Example
+        -------
+        .. code-block:: python

-        # --------------
-        # Case 1
-        # when using Namespace (output of using Argparse to parse command line arguments)
-        from argparse import Namespace
-        hparams = Namespace(**{'learning_rate': 0.1})
+            # --------------
+            # Case 1
+            # when using Namespace (output of using Argparse to parse command line arguments)
+            from argparse import Namespace
+            hparams = Namespace(**{'learning_rate': 0.1})

-        model = MyModel(hparams)
+            model = MyModel(hparams)

-        class MyModel(pl.LightningModule):
-            def __init__(self, hparams):
-                self.learning_rate = hparams.learning_rate
+            class MyModel(pl.LightningModule):
+                def __init__(self, hparams):
+                    self.learning_rate = hparams.learning_rate

-        # --------------
-        # Case 2
-        # when using a dict
-        model = MyModel({'learning_rate': 0.1})
+            # --------------
+            # Case 2
+            # when using a dict
+            model = MyModel({'learning_rate': 0.1})

-        class MyModel(pl.LightningModule):
-            def __init__(self, hparams):
-                self.learning_rate = hparams['learning_rate']
+            class MyModel(pl.LightningModule):
+                def __init__(self, hparams):
+                    self.learning_rate = hparams['learning_rate']

         Args:
             checkpoint_path (str): Path to checkpoint.
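For completeness, usage of the classmethod documented above (checkpoint path and model class hypothetical); hyperparameters saved under `hparams` at training time are restored automatically:

    model = MyModel.load_from_checkpoint(checkpoint_path='/path/to/_ckpt_epoch_5.ckpt')
    model.eval()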
2 changes: 1 addition & 1 deletion pytorch_lightning/testing/model_base.py
@@ -16,7 +16,7 @@
     # TODO: this should be discussed and moved out of this package
     raise ImportError('Missing test-tube package.')

-from pytorch_lightning import data_loader
+from pytorch_lightning.core.decorators import data_loader
 from pytorch_lightning.core.lightning import LightningModule
2 changes: 1 addition & 1 deletion pytorch_lightning/testing/model_mixins.py
@@ -2,7 +2,7 @@

 import torch

-from pytorch_lightning import data_loader
+from pytorch_lightning.core.decorators import data_loader


 class LightningValidationStepMixin:
1 change: 1 addition & 0 deletions pytorch_lightning/trainer/__init__.py
@@ -26,4 +26,5 @@
 """

 from .trainer import Trainer
+
 __all__ = ['Trainer']
1 change: 1 addition & 0 deletions pytorch_lightning/trainer/trainer.py
@@ -49,6 +49,7 @@ class Trainer(TrainerIOMixin,
               TrainerTrainLoopMixin,
               TrainerCallbackConfigMixin,
               ):
+
     def __init__(
         self,
         logger=True,
6 changes: 6 additions & 0 deletions setup.py
@@ -58,6 +58,12 @@ def load_requirements(path_dir=PATH_ROOT, comment_char='#'):
     setup_requires=[],
     install_requires=load_requirements(PATH_ROOT),

+    project_urls={
+        "Bug Tracker": "https://github.com/PyTorchLightning/pytorch-lightning/issues",
+        "Documentation": "https://pytorch-lightning.rtfd.io/en/latest/",
+        "Source Code": "https://github.com/PyTorchLightning/pytorch-lightning",
+    },
+
     classifiers=[
         'Environment :: Console',
         'Natural Language :: English',