Remove deprecated TestTubeLogger #12859

Merged · 5 commits · Apr 24, 2022
3 changes: 0 additions & 3 deletions .gitignore
@@ -7,9 +7,6 @@ pip-wheel-metadata/
 lightning_logs/
 .vscode/
 
-# Test-tube
-test_tube_*/
-
 # Documentations
 docs/source/api
 docs/source/*.md
3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -70,6 +70,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 ### Removed
 
+- Removed the deprecated `TestTubeLogger` ([#12859](https://github.com/PyTorchLightning/pytorch-lightning/pull/12859))
+
+
 - Removed the deprecated `pytorch_lightning.core.memory.LayerSummary` and `pytorch_lightning.core.memory.ModelSummary` ([#12593](https://github.com/PyTorchLightning/pytorch-lightning/pull/12593))
 
 
1 change: 0 additions & 1 deletion docs/source/api_references.rst
@@ -93,7 +93,6 @@ loggers
     mlflow
     neptune
     tensorboard
-    test_tube
     wandb
 
 loops
2 changes: 1 addition & 1 deletion docs/source/common/evaluation_intermediate.rst
@@ -82,7 +82,7 @@ To run the test set on a pre-trained model, use this method.
 
     model = MyLightningModule.load_from_checkpoint(
         checkpoint_path="/path/to/pytorch_checkpoint.ckpt",
-        hparams_file="/path/to/test_tube/experiment/version/hparams.yaml",
+        hparams_file="/path/to/experiment/version/hparams.yaml",
         map_location=None,
     )
 
1 change: 0 additions & 1 deletion pyproject.toml
@@ -61,7 +61,6 @@ module = [
     "pytorch_lightning.loggers.mlflow",
     "pytorch_lightning.loggers.neptune",
     "pytorch_lightning.loggers.tensorboard",
-    "pytorch_lightning.loggers.test_tube",
     "pytorch_lightning.loggers.wandb",
     "pytorch_lightning.loops.epoch.training_epoch_loop",
     "pytorch_lightning.strategies.ddp",
4 changes: 0 additions & 4 deletions pytorch_lightning/loggers/__init__.py
@@ -25,7 +25,6 @@
 from pytorch_lightning.loggers.comet import _COMET_AVAILABLE, CometLogger  # noqa: F401
 from pytorch_lightning.loggers.mlflow import _MLFLOW_AVAILABLE, MLFlowLogger  # noqa: F401
 from pytorch_lightning.loggers.neptune import _NEPTUNE_AVAILABLE, NeptuneLogger  # noqa: F401
-from pytorch_lightning.loggers.test_tube import _TESTTUBE_AVAILABLE, TestTubeLogger  # noqa: F401
 from pytorch_lightning.loggers.wandb import WandbLogger  # noqa: F401
 from pytorch_lightning.utilities.imports import _WANDB_AVAILABLE
 
@@ -40,8 +39,5 @@
 if _NEPTUNE_AVAILABLE:
     __all__.append("NeptuneLogger")
 
-if _TESTTUBE_AVAILABLE:
-    __all__.append("TestTubeLogger")
-
 if _WANDB_AVAILABLE:
     __all__.append("WandbLogger")
251 changes: 0 additions & 251 deletions pytorch_lightning/loggers/test_tube.py

This file was deleted.

1 change: 0 additions & 1 deletion requirements/loggers.txt
@@ -2,5 +2,4 @@
 neptune-client>=0.10.0
 comet-ml>=3.1.12
 mlflow>=1.0.0
-test_tube>=0.7.5
 wandb>=0.8.21
8 changes: 1 addition & 7 deletions tests/deprecated_api/test_remove_1-7.py
@@ -22,7 +22,7 @@
 
 from pytorch_lightning import Callback, Trainer
 from pytorch_lightning.callbacks.lr_monitor import LearningRateMonitor
-from pytorch_lightning.loggers import LoggerCollection, TestTubeLogger
+from pytorch_lightning.loggers import LoggerCollection
 from pytorch_lightning.overrides.distributed import IndexBatchSamplerWrapper
 from pytorch_lightning.plugins.environments import (
     KubeflowEnvironment,
@@ -84,12 +84,6 @@ def _run(model, task="fit"):
     _run(model, "predict")
 
 
-@mock.patch("pytorch_lightning.loggers.test_tube.Experiment")
-def test_v1_7_0_test_tube_logger(_, tmpdir):
-    with pytest.deprecated_call(match="The TestTubeLogger is deprecated since v1.5 and will be removed in v1.7"):
-        _ = TestTubeLogger(tmpdir)
-
-
 def test_v1_7_0_on_interrupt(tmpdir):
     class HandleInterruptCallback(Callback):
         def on_keyboard_interrupt(self, trainer, pl_module):
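The deleted test asserted that instantiating the logger raised the documented deprecation warning; with the class now removed outright, the test goes with it. The check that `pytest.deprecated_call(match=...)` performed can be sketched with the stdlib alone, using a hypothetical stub in place of the real logger:

```python
import warnings


class DeprecatedLoggerStub:
    """Hypothetical stand-in for a logger in its deprecation window."""

    def __init__(self, save_dir: str) -> None:
        # Warn on construction, as the deprecated logger did since v1.5.
        warnings.warn(
            "The TestTubeLogger is deprecated since v1.5 and will be removed in v1.7",
            DeprecationWarning,
            stacklevel=2,
        )
        self.save_dir = save_dir


# Equivalent of pytest.deprecated_call(match=...): record all warnings
# and verify exactly one matching DeprecationWarning was emitted.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    logger = DeprecatedLoggerStub("/tmp/logs")

deprecations = [w for w in caught if issubclass(w.category, DeprecationWarning)]
assert len(deprecations) == 1
assert "removed in v1.7" in str(deprecations[0].message)
```

Once the class itself is deleted, as in this PR, both the warning and its guard test disappear in the same release that the message promised.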