Move contrib.handlers #3204

Merged: 5 commits, Mar 22, 2024
40 changes: 4 additions & 36 deletions docs/source/contrib/handlers.rst
@@ -25,40 +25,8 @@ Time profilers [deprecated]
Use :class:`~ignite.handlers.time_profilers.BasicTimeProfiler` instead, will be removed in version 0.6.0.
Use :class:`~ignite.handlers.time_profilers.HandlersTimeProfiler` instead, will be removed in version 0.6.0.

Loggers
-------
Loggers [deprecated]
--------------------

.. currentmodule:: ignite.contrib.handlers

.. autosummary::
:nosignatures:
:toctree: ../generated
:recursive:

base_logger
clearml_logger
mlflow_logger
neptune_logger
polyaxon_logger
tensorboard_logger
tqdm_logger

visdom_logger
wandb_logger

.. seealso::

Below is a comprehensive list of examples for the various loggers.

* See `tensorboardX mnist example <https://github.com/pytorch/ignite/blob/master/examples/mnist/mnist_with_tensorboard_logger.py>`_
and `CycleGAN and EfficientNet notebooks <https://github.com/pytorch/ignite/tree/master/examples/notebooks>`_ for detailed usage.

* See `visdom mnist example <https://github.com/pytorch/ignite/blob/master/examples/mnist/mnist_with_visdom_logger.py>`_ for detailed usage.

* See `neptune mnist example <https://github.com/pytorch/ignite/blob/master/examples/mnist/mnist_with_neptune_logger.py>`_ for detailed usage.

* See `tqdm mnist example <https://github.com/pytorch/ignite/blob/master/examples/mnist/mnist_with_tqdm_logger.py>`_ for detailed usage.

* See `wandb mnist example <https://github.com/pytorch/ignite/blob/master/examples/mnist/mnist_with_wandb_logger.py>`_ for detailed usage.

* See `clearml mnist example <https://github.com/pytorch/ignite/blob/master/examples/mnist/mnist_with_clearml_logger.py>`_ for detailed usage.
.. deprecated:: 0.4.14
Loggers moved to :ref:`Loggers`.
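The deprecation note above works because the loggers moved rather than disappeared: the deprecated `ignite.contrib.handlers` package now simply re-exports them from `ignite.handlers`. A stdlib-only sketch of that re-export shim, using hypothetical package names (`pkg.handlers`, `pkg.contrib.handlers`) rather than ignite's real ones:

```python
# Stdlib-only illustration (hypothetical package names, not ignite's) of
# the re-export shim behind a "module moved" deprecation: the old module
# just imports the implementation from its new home.
import sys
import types

# The implementation now lives in the "new" module.
new_home = types.ModuleType("pkg.handlers")

class ProgressBar:
    """Stand-in for a relocated handler class."""

new_home.ProgressBar = ProgressBar
sys.modules["pkg.handlers"] = new_home

# Register parent packages so dotted imports resolve cleanly.
for name in ("pkg", "pkg.contrib"):
    sys.modules.setdefault(name, types.ModuleType(name))

# The deprecated module's body is nothing but re-exports.
old_home = types.ModuleType("pkg.contrib.handlers")
old_home.ProgressBar = new_home.ProgressBar
sys.modules["pkg.contrib.handlers"] = old_home

# Both import paths now hand out the identical class object.
from pkg.contrib.handlers import ProgressBar as OldPB
from pkg.handlers import ProgressBar as NewPB
assert OldPB is NewPB
```

Because the old module body is just re-exports from the new one, both paths hand out identical objects, so `isinstance` checks and handler registrations keep working during the deprecation window.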
46 changes: 43 additions & 3 deletions docs/source/handlers.rst
@@ -1,8 +1,8 @@
ignite.handlers
===============

Complete list of handlers
-------------------------
Complete list of generic handlers
----------------------------------

.. currentmodule:: ignite.handlers

@@ -33,6 +33,46 @@ Complete list of handlers
param_scheduler.ParamScheduler
state_param_scheduler.StateParamScheduler


Loggers
--------

.. currentmodule:: ignite.handlers

.. autosummary::
:nosignatures:
:toctree: generated
:recursive:

base_logger
clearml_logger
mlflow_logger
neptune_logger
polyaxon_logger
tensorboard_logger
tqdm_logger

visdom_logger
wandb_logger

.. seealso::

Below is a comprehensive list of examples for the various loggers.

* See `tensorboardX mnist example <https://github.com/pytorch/ignite/blob/master/examples/mnist/mnist_with_tensorboard_logger.py>`_
and `CycleGAN and EfficientNet notebooks <https://github.com/pytorch/ignite/tree/master/examples/notebooks>`_ for detailed usage.

* See `visdom mnist example <https://github.com/pytorch/ignite/blob/master/examples/mnist/mnist_with_visdom_logger.py>`_ for detailed usage.

* See `neptune mnist example <https://github.com/pytorch/ignite/blob/master/examples/mnist/mnist_with_neptune_logger.py>`_ for detailed usage.

* See `tqdm mnist example <https://github.com/pytorch/ignite/blob/master/examples/mnist/mnist_with_tqdm_logger.py>`_ for detailed usage.

* See `wandb mnist example <https://github.com/pytorch/ignite/blob/master/examples/mnist/mnist_with_wandb_logger.py>`_ for detailed usage.

* See `clearml mnist example <https://github.com/pytorch/ignite/blob/master/examples/mnist/mnist_with_clearml_logger.py>`_ for detailed usage.


.. _param-scheduler-label:

Parameter scheduler
@@ -396,7 +436,7 @@ Example with :class:`ignite.handlers.param_scheduler.ReduceLROnPlateauScheduler`
init_lr = 0.1

lr_values = np.array(ReduceLROnPlateauScheduler.simulate_values(
num_events, metric_values, init_lr,
num_events, metric_values, init_lr,
factor=0.5, patience=1, mode='max', threshold=0.01, threshold_mode='abs'
)
)
27 changes: 15 additions & 12 deletions ignite/contrib/engines/common.py
@@ -15,21 +15,24 @@
from torch.optim.lr_scheduler import _LRScheduler as PyTorchLRScheduler

import ignite.distributed as idist
from ignite.contrib.handlers import (
from ignite.contrib.metrics import GpuInfo
from ignite.engine import Engine, Events
from ignite.handlers import (
Checkpoint,
ClearMLLogger,
DiskSaver,
EarlyStopping,
global_step_from_engine,
MLflowLogger,
NeptuneLogger,
PolyaxonLogger,
ProgressBar,
TensorboardLogger,
TerminateOnNan,
VisdomLogger,
WandBLogger,
)
from ignite.contrib.handlers.base_logger import BaseLogger
from ignite.contrib.metrics import GpuInfo
from ignite.engine import Engine, Events
from ignite.handlers import Checkpoint, DiskSaver, EarlyStopping, TerminateOnNan
from ignite.handlers.base_logger import BaseLogger
from ignite.handlers.checkpoint import BaseSaveHandler
from ignite.handlers.param_scheduler import ParamScheduler
from ignite.metrics import RunningAverage
@@ -361,7 +364,7 @@ def setup_tb_logging(
kwargs: optional keyword args to be passed to construct the logger.

Returns:
:class:`~ignite.contrib.handlers.tensorboard_logger.TensorboardLogger`
:class:`~ignite.handlers.tensorboard_logger.TensorboardLogger`
"""
logger = TensorboardLogger(log_dir=output_path, **kwargs)
_setup_logging(logger, trainer, optimizers, evaluators, log_every_iters)
@@ -392,7 +395,7 @@ def setup_visdom_logging(
kwargs: optional keyword args to be passed to construct the logger.

Returns:
:class:`~ignite.contrib.handlers.visdom_logger.VisdomLogger`
:class:`~ignite.handlers.visdom_logger.VisdomLogger`
"""
logger = VisdomLogger(**kwargs)
_setup_logging(logger, trainer, optimizers, evaluators, log_every_iters)
@@ -423,7 +426,7 @@ def setup_mlflow_logging(
kwargs: optional keyword args to be passed to construct the logger.

Returns:
:class:`~ignite.contrib.handlers.mlflow_logger.MLflowLogger`
:class:`~ignite.handlers.mlflow_logger.MLflowLogger`
"""
logger = MLflowLogger(**kwargs)
_setup_logging(logger, trainer, optimizers, evaluators, log_every_iters)
@@ -454,7 +457,7 @@ def setup_neptune_logging(
kwargs: optional keyword args to be passed to construct the logger.

Returns:
:class:`~ignite.contrib.handlers.neptune_logger.NeptuneLogger`
:class:`~ignite.handlers.neptune_logger.NeptuneLogger`
"""
logger = NeptuneLogger(**kwargs)
_setup_logging(logger, trainer, optimizers, evaluators, log_every_iters)
@@ -485,7 +488,7 @@ def setup_wandb_logging(
kwargs: optional keyword args to be passed to construct the logger.

Returns:
:class:`~ignite.contrib.handlers.wandb_logger.WandBLogger`
:class:`~ignite.handlers.wandb_logger.WandBLogger`
"""
logger = WandBLogger(**kwargs)
_setup_logging(logger, trainer, optimizers, evaluators, log_every_iters)
@@ -516,7 +519,7 @@ def setup_plx_logging(
kwargs: optional keyword args to be passed to construct the logger.

Returns:
:class:`~ignite.contrib.handlers.polyaxon_logger.PolyaxonLogger`
:class:`~ignite.handlers.polyaxon_logger.PolyaxonLogger`
"""
logger = PolyaxonLogger(**kwargs)
_setup_logging(logger, trainer, optimizers, evaluators, log_every_iters)
@@ -547,7 +550,7 @@ def setup_clearml_logging(
kwargs: optional keyword args to be passed to construct the logger.

Returns:
:class:`~ignite.contrib.handlers.clearml_logger.ClearMLLogger`
:class:`~ignite.handlers.clearml_logger.ClearMLLogger`
"""
logger = ClearMLLogger(**kwargs)
_setup_logging(logger, trainer, optimizers, evaluators, log_every_iters)
31 changes: 21 additions & 10 deletions ignite/contrib/handlers/__init__.py
@@ -1,14 +1,19 @@
from ignite.contrib.handlers.clearml_logger import ClearMLLogger
from ignite.contrib.handlers.mlflow_logger import MLflowLogger
from ignite.contrib.handlers.neptune_logger import NeptuneLogger
from ignite.contrib.handlers.polyaxon_logger import PolyaxonLogger
from ignite.contrib.handlers.tensorboard_logger import TensorboardLogger
from ignite.contrib.handlers.tqdm_logger import ProgressBar

from ignite.contrib.handlers.visdom_logger import VisdomLogger
from ignite.contrib.handlers.wandb_logger import WandBLogger
from ignite.handlers import EpochOutputStore, global_step_from_engine  # ref
from ignite.handlers import (  # ref
clearml_logger,
EpochOutputStore,
global_step_from_engine,
mlflow_logger,
neptune_logger,
polyaxon_logger,
tensorboard_logger,
tqdm_logger,
visdom_logger,
wandb_logger,
)
from ignite.handlers.clearml_logger import ClearMLLogger
from ignite.handlers.lr_finder import FastaiLRFinder
from ignite.handlers.mlflow_logger import MLflowLogger
from ignite.handlers.neptune_logger import NeptuneLogger
from ignite.handlers.param_scheduler import (
ConcatScheduler,
CosineAnnealingScheduler,
@@ -18,4 +23,10 @@
ParamGroupScheduler,
PiecewiseLinear,
)
from ignite.handlers.polyaxon_logger import PolyaxonLogger
from ignite.handlers.tensorboard_logger import TensorboardLogger
from ignite.handlers.time_profilers import BasicTimeProfiler, HandlersTimeProfiler
from ignite.handlers.tqdm_logger import ProgressBar

from ignite.handlers.visdom_logger import VisdomLogger
from ignite.handlers.wandb_logger import WandBLogger