Mocking Loggers Part 5/5 (final) #3926

Merged 42 commits on Oct 7, 2020

Commits (42):
8e245f0  base (awaelchli, Oct 6, 2020)
c5c0ec1  add xfail (awaelchli, Oct 6, 2020)
edf6a1d  new test (awaelchli, Oct 6, 2020)
fd6476b  import (awaelchli, Oct 6, 2020)
98fa225  missing import (awaelchli, Oct 6, 2020)
e16b5c4  Merge branch 'master' into tests/mock-mlflow-3 (awaelchli, Oct 6, 2020)
485db30  Merge branch 'master' into tests/mock-mlflow-3 (williamFalcon, Oct 6, 2020)
5d9330c  Merge remote-tracking branch 'PyTorchLightning/tests/mock-mlflow-3' i… (awaelchli, Oct 6, 2020)
f14834a  xfail if not installed (awaelchli, Oct 5, 2020)
a075e44  mock comet (awaelchli, Oct 6, 2020)
464a79f  line (awaelchli, Oct 6, 2020)
f3290de  line (awaelchli, Oct 6, 2020)
b9c45f6  Merge branch 'master' into tests/mock-comet-5 (awaelchli, Oct 6, 2020)
59d8608  convert doctest (awaelchli, Oct 6, 2020)
e23bd0e  doctest (awaelchli, Oct 6, 2020)
9732720  docs (awaelchli, Oct 6, 2020)
785d7a3  Merge branch 'master' into tests/mock-comet-5 (awaelchli, Oct 6, 2020)
9439260  prune Results usage in notebooks (#3911) (Borda, Oct 6, 2020)
be556e5  revamp entire metrics (#3868) (ananyahjha93, Oct 6, 2020)
419ea6e  Callback docs with autosummary (#3908) (Oct 6, 2020)
77fd8dd  skip some docker builds (temporally pass) (#3913) (Borda, Oct 6, 2020)
e01e4df  use badges only with push (#3914) (Borda, Oct 6, 2020)
fd10173  testtube (awaelchli, Oct 6, 2020)
5e6684e  mock test tube (awaelchli, Oct 6, 2020)
dfe1bd6  Merge branch 'master' into tests/mock-mlflow-3 (awaelchli, Oct 7, 2020)
3599479  Merge remote-tracking branch 'PyTorchLightning/tests/mock-mlflow-3' i… (awaelchli, Oct 7, 2020)
a74c77f  mock mlflow (awaelchli, Oct 7, 2020)
6ef9a10  Merge branch 'tests/mock-testtube-1' into tests/mock-mlflow-final (awaelchli, Oct 7, 2020)
3669548  remove mlflow (awaelchli, Oct 7, 2020)
e657117  clean up (awaelchli, Oct 7, 2020)
f1e4069  test (awaelchli, Oct 7, 2020)
5f5373a  test (awaelchli, Oct 7, 2020)
31778b7  test (awaelchli, Oct 7, 2020)
782c91b  test (awaelchli, Oct 7, 2020)
ae12cff  test (awaelchli, Oct 7, 2020)
d945f59  test (awaelchli, Oct 7, 2020)
96d4087  Merge branch 'master' into tests/mock-mlflow-final (awaelchli, Oct 7, 2020)
dfeefb3  code blocks (awaelchli, Oct 7, 2020)
bf30255  remove import (awaelchli, Oct 7, 2020)
8c24d7b  codeblock (awaelchli, Oct 7, 2020)
7ee1a14  logger (awaelchli, Oct 7, 2020)
f9bf587  wandb causes stall (awaelchli, Oct 7, 2020)
docs/source/loggers.rst (3 additions, 3 deletions)

@@ -74,7 +74,7 @@ First, install the package:

Then configure the logger and pass it to the :class:`~pytorch_lightning.trainer.trainer.Trainer`:

.. testcode::
.. code-block:: python

from pytorch_lightning.loggers import MLFlowLogger
mlf_logger = MLFlowLogger(
@@ -169,7 +169,7 @@ First, install the package:

Then configure the logger and pass it to the :class:`~pytorch_lightning.trainer.trainer.Trainer`:

.. testcode::
.. code-block:: python

from pytorch_lightning.loggers import TestTubeLogger
logger = TestTubeLogger('tb_logs', name='my_model')
@@ -232,7 +232,7 @@ Multiple Loggers
Lightning supports the use of multiple loggers, just pass a list to the
:class:`~pytorch_lightning.trainer.trainer.Trainer`.

.. testcode::
.. code-block:: python

from pytorch_lightning.loggers import TensorBoardLogger, TestTubeLogger
logger1 = TensorBoardLogger('tb_logs', name='my_model')
docs/source/logging.rst (1 addition, 1 deletion)

@@ -306,7 +306,7 @@ Snapshot code
Loggers also allow you to snapshot a copy of the code used in this experiment.
For example, TestTubeLogger does this with a flag:

.. testcode::
.. code-block:: python

from pytorch_lightning.loggers import TestTubeLogger
logger = TestTubeLogger('.', create_git_tag=True)
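The swap from `.. testcode::` to `.. code-block:: python` in these docs takes the snippets out of the Sphinx doctest run, so building and testing the documentation no longer requires `mlflow` or `test_tube` to be importable. A minimal sketch of how a build or CI step could check for such optional backends before executing examples (the helper name is illustrative, not part of this PR):

```python
import importlib.util


def optional_backend_available(module_name: str) -> bool:
    """Return True if an optional logging backend (e.g. mlflow) is importable,
    without actually importing it."""
    return importlib.util.find_spec(module_name) is not None


# A docs or test step could branch on this instead of importing eagerly:
if optional_backend_available("mlflow"):
    print("mlflow examples can be executed")
else:
    print("mlflow examples are shown as static code blocks only")
```

`find_spec` only consults the import machinery, so it avoids any import-time side effects of the backend package.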
pytorch_lightning/loggers/mlflow.py (22 additions, 19 deletions)

@@ -43,25 +43,28 @@ class MLFlowLogger(LightningLoggerBase):

pip install mlflow

Example:
>>> from pytorch_lightning import Trainer
>>> from pytorch_lightning.loggers import MLFlowLogger
>>> mlf_logger = MLFlowLogger(
... experiment_name="default",
... tracking_uri="file:./ml-runs"
... )
>>> trainer = Trainer(logger=mlf_logger)

Use the logger anywhere in you :class:`~pytorch_lightning.core.lightning.LightningModule` as follows:

>>> from pytorch_lightning import LightningModule
>>> class LitModel(LightningModule):
... def training_step(self, batch, batch_idx):
... # example
... self.logger.experiment.whatever_ml_flow_supports(...)
...
... def any_lightning_module_function_or_hook(self):
... self.logger.experiment.whatever_ml_flow_supports(...)
.. code-block:: python

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import MLFlowLogger
mlf_logger = MLFlowLogger(
experiment_name="default",
tracking_uri="file:./ml-runs"
)
trainer = Trainer(logger=mlf_logger)

Use the logger anywhere in your :class:`~pytorch_lightning.core.lightning.LightningModule` as follows:

.. code-block:: python

from pytorch_lightning import LightningModule
class LitModel(LightningModule):
def training_step(self, batch, batch_idx):
# example
self.logger.experiment.whatever_ml_flow_supports(...)

def any_lightning_module_function_or_hook(self):
self.logger.experiment.whatever_ml_flow_supports(...)

Args:
experiment_name: The name of the experiment
pytorch_lightning/loggers/test_tube.py (17 additions, 16 deletions)

@@ -21,10 +21,8 @@

try:
from test_tube import Experiment
_TEST_TUBE_AVAILABLE = True
except ImportError: # pragma: no-cover
Experiment = None
_TEST_TUBE_AVAILABLE = False

from pytorch_lightning.core.lightning import LightningModule
from pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_experiment
@@ -41,22 +39,25 @@ class TestTubeLogger(LightningLoggerBase):

pip install test_tube

Example:
>>> from pytorch_lightning import Trainer
>>> from pytorch_lightning.loggers import TestTubeLogger
>>> logger = TestTubeLogger("tt_logs", name="my_exp_name")
>>> trainer = Trainer(logger=logger)
.. code-block:: python

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import TestTubeLogger
logger = TestTubeLogger("tt_logs", name="my_exp_name")
trainer = Trainer(logger=logger)

Use the logger anywhere in your :class:`~pytorch_lightning.core.lightning.LightningModule` as follows:

>>> from pytorch_lightning import LightningModule
>>> class LitModel(LightningModule):
... def training_step(self, batch, batch_idx):
... # example
... self.logger.experiment.whatever_method_summary_writer_supports(...)
...
... def any_lightning_module_function_or_hook(self):
... self.logger.experiment.add_histogram(...)
.. code-block:: python

from pytorch_lightning import LightningModule
class LitModel(LightningModule):
def training_step(self, batch, batch_idx):
# example
self.logger.experiment.whatever_method_summary_writer_supports(...)

def any_lightning_module_function_or_hook(self):
self.logger.experiment.add_histogram(...)

Args:
save_dir: Save directory
@@ -83,7 +84,7 @@ def __init__(
create_git_tag: bool = False,
log_graph: bool = False
):
if not _TEST_TUBE_AVAILABLE:
if Experiment is None:
raise ImportError('You want to use `test_tube` logger which is not installed yet,'
' install it with `pip install test-tube`.')
super().__init__()
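The change above drops the redundant `_TEST_TUBE_AVAILABLE` flag and checks `Experiment is None` directly, since the try/except import guard already encodes availability. A self-contained sketch of that guard pattern (the package and class names here are hypothetical):

```python
# Optional-import guard: the module stays importable even when the
# third-party dependency is absent; only *using* it raises.
try:
    from some_optional_pkg import Client  # hypothetical optional dependency
except ImportError:  # pragma: no-cover
    Client = None


class OptionalBackendLogger:
    """Raises ImportError only when instantiated, not at import time."""

    def __init__(self):
        if Client is None:
            raise ImportError(
                'You want to use `some_optional_pkg` which is not installed yet,'
                ' install it with `pip install some_optional_pkg`.'
            )
        self.client = Client()
```

Deferring the error to `__init__` is what makes the mocking in this PR possible: tests can import the logger module and patch `Experiment` without the real package installed.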
requirements/extra.txt (0 additions, 4 deletions)
@@ -1,9 +1,5 @@
# extended list of package dependencies to reach full functionality

# TODO: this shall be removed as we mock them in tests
mlflow>=1.0.0
test_tube>=0.7.5

matplotlib>=3.1.1
# no need to install with [pytorch] as pytorch is already installed and torchvision is required only for Horovod examples
horovod>=0.19.2, != 0.20.0 # v0.20.0 has problem with building the wheel/installation
tests/base/models.py (0 additions, 6 deletions)

@@ -8,12 +8,6 @@

from tests.base.datasets import TrialMNIST, AverageDataset, MNIST

try:
from test_tube import HyperOptArgumentParser
except ImportError as exp:
# TODO: this should be discussed and moved out of this package
raise ImportError('Missing test-tube package.') from exp

from pytorch_lightning.core.lightning import LightningModule


tests/loggers/test_all.py (40 additions, 17 deletions)
@@ -1,4 +1,3 @@
import atexit
import inspect
import os
import pickle
@@ -20,6 +19,7 @@
from pytorch_lightning.loggers.base import DummyExperiment
from tests.base import EvalModelTemplate
from tests.loggers.test_comet import _patch_comet_atexit
from tests.loggers.test_mlflow import mock_mlflow_run_creation


def _get_logger_args(logger_class, save_dir):
@@ -34,27 +34,31 @@ def _get_logger_args(logger_class, save_dir):


def test_loggers_fit_test_all(tmpdir, monkeypatch):
_patch_comet_atexit(monkeypatch)
""" Verify the basic functionality of all loggers. """

_test_loggers_fit_test(tmpdir, TensorBoardLogger)

with mock.patch('pytorch_lightning.loggers.comet.comet_ml'), \
mock.patch('pytorch_lightning.loggers.comet.CometOfflineExperiment'):
_patch_comet_atexit(monkeypatch)
_test_loggers_fit_test(tmpdir, CometLogger)

_test_loggers_fit_test(tmpdir, MLFlowLogger)
with mock.patch('pytorch_lightning.loggers.mlflow.mlflow'), \
mock.patch('pytorch_lightning.loggers.mlflow.MlflowClient'):
_test_loggers_fit_test(tmpdir, MLFlowLogger)

with mock.patch('pytorch_lightning.loggers.neptune.neptune'):
_test_loggers_fit_test(tmpdir, NeptuneLogger)

_test_loggers_fit_test(tmpdir, TensorBoardLogger)
_test_loggers_fit_test(tmpdir, TestTubeLogger)
with mock.patch('pytorch_lightning.loggers.test_tube.Experiment'):
_test_loggers_fit_test(tmpdir, TestTubeLogger)

with mock.patch('pytorch_lightning.loggers.wandb.wandb'):
_test_loggers_fit_test(tmpdir, WandbLogger)


def _test_loggers_fit_test(tmpdir, logger_class):
"""Verify the basic functionality of all loggers."""
os.environ['PL_DEV_DEBUG'] = '0'

model = EvalModelTemplate()

class StoreHistoryLogger(logger_class):
@@ -78,6 +82,13 @@ def log_metrics(self, metrics, step):
logger.experiment.id = 'foo'
logger.experiment.project_name = 'bar'

if logger_class == TestTubeLogger:
logger.experiment.version = 'foo'
logger.experiment.name = 'bar'

if logger_class == MLFlowLogger:
logger = mock_mlflow_run_creation(logger, experiment_id="foo", run_id="bar")

trainer = Trainer(
max_epochs=1,
logger=logger,
@@ -109,21 +120,27 @@ def log_metrics(self, metrics, step):


def test_loggers_save_dir_and_weights_save_path_all(tmpdir, monkeypatch):
_patch_comet_atexit(monkeypatch)
""" Test the combinations of save_dir, weights_save_path and default_root_dir. """

_test_loggers_save_dir_and_weights_save_path(tmpdir, TensorBoardLogger)

with mock.patch('pytorch_lightning.loggers.comet.comet_ml'), \
mock.patch('pytorch_lightning.loggers.comet.CometOfflineExperiment'):
_patch_comet_atexit(monkeypatch)
_test_loggers_save_dir_and_weights_save_path(tmpdir, CometLogger)

_test_loggers_save_dir_and_weights_save_path(tmpdir, TensorBoardLogger)
_test_loggers_save_dir_and_weights_save_path(tmpdir, MLFlowLogger)
_test_loggers_save_dir_and_weights_save_path(tmpdir, TestTubeLogger)
with mock.patch('pytorch_lightning.loggers.mlflow.mlflow'), \
mock.patch('pytorch_lightning.loggers.mlflow.MlflowClient'):
_test_loggers_save_dir_and_weights_save_path(tmpdir, MLFlowLogger)

with mock.patch('pytorch_lightning.loggers.test_tube.Experiment'):
_test_loggers_save_dir_and_weights_save_path(tmpdir, TestTubeLogger)

with mock.patch('pytorch_lightning.loggers.wandb.wandb'):
_test_loggers_save_dir_and_weights_save_path(tmpdir, WandbLogger)


def _test_loggers_save_dir_and_weights_save_path(tmpdir, logger_class):
""" Test the combinations of save_dir, weights_save_path and default_root_dir. """

class TestLogger(logger_class):
# for this test it does not matter what these attributes are
@@ -255,18 +272,24 @@ def on_train_batch_start(self, trainer, pl_module, batch, batch_idx, dataloader_
assert pl_module.logger.experiment.something(foo="bar") is None


@pytest.mark.skipif(platform.system() == "Windows", reason="Distributed training is not supported on Windows")
@pytest.mark.parametrize("logger_class", [
TensorBoardLogger,
CometLogger,
MLFlowLogger,
# NeptuneLogger, # TODO: fix: https://github.com/PyTorchLightning/pytorch-lightning/pull/3256
NeptuneLogger,
TensorBoardLogger,
TestTubeLogger,
])
@mock.patch('pytorch_lightning.loggers.neptune.neptune')
def test_logger_created_on_rank_zero_only(neptune, tmpdir, monkeypatch, logger_class):
@pytest.mark.skipif(platform.system() == "Windows", reason="Distributed training is not supported on Windows")
def test_logger_created_on_rank_zero_only(tmpdir, monkeypatch, logger_class):
""" Test that loggers get replaced by dummy loggers on global rank > 0"""
_patch_comet_atexit(monkeypatch)
try:
_test_logger_created_on_rank_zero_only(tmpdir, logger_class)
except (ImportError, ModuleNotFoundError):
pytest.xfail(f"multi-process test requires {logger_class.__name__} dependencies to be installed.")


def _test_logger_created_on_rank_zero_only(tmpdir, logger_class):
logger_args = _get_logger_args(logger_class, tmpdir)
logger = logger_class(**logger_args)
model = EvalModelTemplate()
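Two patterns carry this test file: each logger test now runs under `mock.patch` of the third-party module at the location where the logger imports it, and the multi-process test converts a missing optional dependency into an expected failure. A stdlib-only sketch of the latter idea (the PR does this with `pytest.xfail`; the helper name is illustrative):

```python
import importlib


def run_or_xfail(test_fn, dependency: str) -> str:
    """Run test_fn, but treat a missing optional dependency as an
    expected failure rather than a hard error."""
    try:
        importlib.import_module(dependency)
    except ImportError:
        return "xfail"  # pytest.xfail(...) in the real test suite
    test_fn()
    return "passed"


print(run_or_xfail(lambda: None, "json"))             # stdlib module is importable -> passed
print(run_or_xfail(lambda: None, "no_such_pkg_xyz"))  # missing dependency -> xfail
```

This keeps the distributed test meaningful where the dependency exists (it needs a real subprocess-importable package, so plain mocking is not enough) while staying green on minimal CI images.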
tests/loggers/test_mlflow.py (10 additions, 1 deletion)

@@ -5,13 +5,22 @@
from unittest.mock import MagicMock
import pytest

from mlflow.tracking import MlflowClient

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import MLFlowLogger
from tests.base import EvalModelTemplate


def mock_mlflow_run_creation(logger, experiment_name=None, experiment_id=None, run_id=None):
""" Helper function to simulate mlflow client creating a new (or existing) experiment. """
run = MagicMock()
run.info.run_id = run_id
logger._mlflow_client.get_experiment_by_name = MagicMock(return_value=experiment_name)
logger._mlflow_client.create_experiment = MagicMock(return_value=experiment_id)
logger._mlflow_client.create_run = MagicMock(return_value=run)
return logger


@mock.patch('pytorch_lightning.loggers.mlflow.mlflow')
@mock.patch('pytorch_lightning.loggers.mlflow.MlflowClient')
def test_mlflow_logger_exists(client, mlflow, tmpdir):
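`mock_mlflow_run_creation` relies on `MagicMock` auto-creating attribute chains, so `run.info.run_id` can be assigned without constructing any real mlflow objects. The same pattern in isolation (the experiment and run ids are illustrative values, mirroring the helper above):

```python
from unittest.mock import MagicMock

# Build a fake mlflow-style client whose run-creation chain returns fixed ids.
run = MagicMock()
run.info.run_id = "bar"  # MagicMock creates the `.info` attribute on the fly

client = MagicMock()
client.get_experiment_by_name = MagicMock(return_value=None)  # experiment does not exist yet
client.create_experiment = MagicMock(return_value="foo")
client.create_run = MagicMock(return_value=run)

# Code under test now sees a "new" experiment "foo" containing run "bar":
print(client.create_experiment("default"))   # foo
print(client.create_run("foo").info.run_id)  # bar
```

Because `get_experiment_by_name` returns `None`, a logger exercising the usual "look up, else create" flow is steered down the creation path deterministically.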