
v22: simplification 2.0 #1395

Merged 42 commits from dev22 into master on Feb 4, 2022

Conversation


@Scitator Scitator commented Jan 26, 2022

Pull Request FAQ

Description

  • core architecture moved to an Animus-like design.
  • engines moved to Accelerate.
  • config/hydra APIs deprecated in favor of hydra-slayer custom config runners.
  • dl-based scripts removed from the API; they are no longer needed.
  • self-supervised runner moved to examples; a custom implementation is still preferable.
  • contrib and utils truncated.
  • requirements simplified.
  • codestyle moved to -l 89 (a better view on a 16'' screen ;) ).
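The -l 89 codestyle noted above can be pinned in tool configuration; a hedged sketch, assuming black and flake8 enforce the limit (the PR itself does not state which formatter is used):

```toml
# pyproject.toml: hypothetical sketch of pinning the 89-character line length
[tool.black]
line-length = 89

# flake8 reads setup.cfg or .flake8 instead; the equivalent there would be:
# [flake8]
# max-line-length = 89
```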

Related Issue

Type of Change

  • Examples / docs / tutorials / contributors update
  • Bug fix (non-breaking change which fixes an issue)
  • Improvement (non-breaking change which improves an existing feature)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)

PR review

Anyone in the community is free to review the PR once the tests have passed.
If your PR was not discussed in GitHub issues beforehand, there is a high chance it will not be merged.

Checklist

  • Have you updated tests for the new functionality?
  • Have you added your new classes/functions to the docs?
  • Have you updated the CHANGELOG?
  • Have you run colab minimal CI/CD with latest and minimal requirements?
  • Have you checked XLA integration with single and multiple processes?

@pep8speaks

pep8speaks commented Jan 26, 2022

Hello @Scitator! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:

Line 225:87: E231 missing whitespace after ','

Line 12:1: E303 too many blank lines (3)

Comment last updated at 2022-01-26 05:47:43 UTC

"loader_key": loader_key,
"scope": scope,
},
metadata={"stage_key": stage_key, "loader_key": loader_key, "scope": scope,},


[pep8] reported by reviewdog 🐶
E231 missing whitespace after ','
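The flagged E231 comes from the comma sitting directly against the closing brace; a minimal sketch of the fix, with placeholder values standing in for the keys shown in the diff:

```python
# Minimal sketch of the E231 fix: whitespace after every comma, and no
# comma pressed directly against a closing brace.
# The key values below are placeholders, not taken from the PR.
stage_key, loader_key, scope = "stage", "train", "experiment"

# flagged: metadata={"stage_key": stage_key, "loader_key": loader_key, "scope": scope,}
metadata = {"stage_key": stage_key, "loader_key": loader_key, "scope": scope}
```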


Args:
device: use device, default is `"cpu"`.
class CPUEngine(IEngine):


[pep8] reported by reviewdog 🐶
D101 Missing docstring in public class


.. code-block:: python
class GPUEngine(IEngine):


[pep8] reported by reviewdog 🐶
D101 Missing docstring in public class
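The D101/D102 findings here and below are all resolved the same way: give each public class and method a docstring. A minimal sketch, with a stub base class standing in for catalyst's real `IEngine`:

```python
# Sketch of the D101/D102 fix: add docstrings to public classes and methods.
# IEngine here is a stub, not the real catalyst interface.
class IEngine:
    """Stub base class standing in for catalyst's IEngine."""


class CPUEngine(IEngine):
    """CPU-based engine (docstring added to satisfy D101)."""

    def prepare_model(self, model):
        """Prepare the model for the target device (satisfies D102)."""
        return model
```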

engine=dl.DeviceEngine("cuda:1"),
...
)
class DataParallelEngine(GPUEngine):


[pep8] reported by reviewdog 🐶
D101 Missing docstring in public class

...
)
class DataParallelEngine(GPUEngine):
def prepare_model(self, model):


[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method


from catalyst.typing import Criterion, Device, Model, Optimizer, Scheduler
class IEngine(Accelerator):


[pep8] reported by reviewdog 🐶
D101 Missing docstring in public class

from catalyst.typing import Criterion, Device, Model, Optimizer, Scheduler
class IEngine(Accelerator):
@staticmethod
def spawn(fn: Callable):


[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method

Default is `1`.
"""
@staticmethod
def setup(local_rank: int, world_size: int):


[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method

def cleanup_process(self):
"""Clean DDP variables and processes."""
@staticmethod
def cleanup():


[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method

Args:
tensor_or_module: tensor to mode
"""
def mean_reduce_ddp_metrics(self, metrics: Dict):


[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method

EPOCH_METRICS = Dict[str, LOADER_METRICS]
# {0: {"train": {}, "valid": {}}, 1: {...}}
EXPERIMENT_METRICS = Dict[int, EPOCH_METRICS]


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

criterion: Criterion = None,
optimizer: Optimizer = None,
scheduler: Scheduler = None,
criterion: TorchCriterion = None,


[pep8] reported by reviewdog 🐶
DAR102 Excess parameter(s) in Docstring: + trial

@@ -195,7 +195,7 @@ def get_engine(self) -> IEngine:
engine = REGISTRY.get_from_params(**engine_params)
else:
engine = get_available_engine(
fp16=self._fp16, ddp=self._ddp, amp=self._amp, apex=self._apex
fp16=self._fp16, ddp=self._ddp, fp16=self._amp, apex=self._apex


[pep8] reported by reviewdog 🐶
E999 SyntaxError: keyword argument repeated
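The E999 here is a real syntax error: `fp16=` is passed twice, and the second occurrence was presumably meant to be `amp=`. A minimal sketch of the fix, with a stub `get_available_engine` mirroring the call site (not catalyst's actual function):

```python
# Sketch of the E999 fix: the second fp16= keyword should presumably be amp=.
# get_available_engine is a stub mirroring the call site, not catalyst's API.
def get_available_engine(fp16=False, ddp=False, amp=False, apex=False):
    return {"fp16": fp16, "ddp": ddp, "amp": amp, "apex": apex}


# broken: get_available_engine(fp16=..., ddp=..., fp16=..., apex=...)  # E999
engine = get_available_engine(fp16=True, ddp=False, amp=True, apex=False)
```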


def __init__(
self,
model: RunnerModel = None,


[pep8] reported by reviewdog 🐶
F821 undefined name 'RunnerModel'

def __init__(
self,
model: RunnerModel = None,
engine: IEngine = None,


[pep8] reported by reviewdog 🐶
F821 undefined name 'IEngine'

Runner.__init__(self, model=model, engine=engine)
self.loss_mode_prefix = loss_mode_prefix

@torch.no_grad()


[pep8] reported by reviewdog 🐶
F821 undefined name 'torch'

output = self.forward(batch, **kwargs)
return output

def get_callbacks(self) -> "OrderedDict[str, Callback]":


[pep8] reported by reviewdog 🐶
F821 undefined name 'OrderedDict'
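This F821, like the `torch`, `RunnerModel`, and `IEngine` findings nearby, is fixed by importing the undefined name at the top of the module that uses it. A minimal sketch:

```python
# Sketch of the F821 fix: import the undefined name at module top.
from collections import OrderedDict  # fixes F821 undefined name 'OrderedDict'


def get_callbacks() -> "OrderedDict[str, object]":
    # Stub standing in for the runner method; real callbacks omitted.
    return OrderedDict()
```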

self._storage: List[Checkpoint] = []
os.makedirs(self.logdir, exist_ok=True)

def save(self, runner: "IRunner", obj: Any, logprefix: str) -> str:


[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method

"""Event handler."""
self._storage: List[Checkpoint] = []

def on_epoch_end_best(self, runner: "IRunner") -> None:


[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method

# worker sync
runner.engine.barrier()
return
def on_epoch_end_last(self, runner: "IRunner") -> None:


[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method

@@ -2,7 +2,7 @@
from functools import partial
import logging

from catalyst.core.callback import CallbackNode, CallbackOrder, IOptimizerCallback
from catalyst.core.callback import CallbackOrder, IOptimizerCallback


[pep8] reported by reviewdog 🐶
F401 'catalyst.core.callback.CallbackOrder' imported but unused

import logging
import warnings

from accelerate.state import DistributedType


[pep8] reported by reviewdog 🐶
F401 'accelerate.state.DistributedType' imported but unused

@@ -852,8 +537,8 @@ def predict_batch(self, batch: Mapping[str, Any], **kwargs) -> Mapping[str, Any]
Run model inference on specified data batch.

.. warning::
You should not override this method. If you need specific model
call, override forward() method
You should not override this method.


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

@@ -14,13 +14,15 @@ class SklearnModelCallback(Callback):
to give predictions on the valid loader.

Args:
feature_key: keys of tensors that should be used as features in the classifier calculations
target_key: keys of tensors that should be used as targets in the classifier calculations
feature_key: keys of tensors that should be used as features


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

target_key: keys of tensors that should be used as targets in the classifier calculations
feature_key: keys of tensors that should be used as features
for the classifier fit
target_key: keys of tensors that should be used as targets


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

@@ -16,7 +16,8 @@ class OnnxCallback(Callback):
input_key: input key from ``runner.batch`` to use for onnx export
logdir: path to folder for saving
filename: filename
method_name (str, optional): Forward pass method to be converted. Defaults to "forward".
method_name (str, optional): Forward pass method to be converted.


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

@@ -81,7 +81,8 @@ def __init__(
input_key: input key from ``runner.batch`` to use for model tracing
logdir: path to folder for saving
filename: filename
method_name: Model's method name that will be used as entrypoint during tracing
method_name: Model's method name


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

also proxy. We are not interested in them, are we? So, if ``on_train_only``
alpha: beta distribution a=b parameters. Must be >=0.
The more alpha closer to zero the less effect of the mixup.
mode: mode determines the method of use. Must be in ["replace", "add"].


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

alpha: beta distribution a=b parameters. Must be >=0.
The more alpha closer to zero the less effect of the mixup.
mode: mode determines the method of use. Must be in ["replace", "add"].
If "replace" then replaces the batch with a mixed one,


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

mode: mode determines the method of use. Must be in ["replace", "add"].
If "replace" then replaces the batch with a mixed one,
while the batch size is not changed.
If "add", concatenates mixed examples to the current ones,


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

while the batch size is not changed.
If "add", concatenates mixed examples to the current ones,
the batch size increases by 2 times.
on_train_only: apply to train only.


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

If "add", concatenates mixed examples to the current ones,
the batch size increases by 2 times.
on_train_only: apply to train only.
As the mixup use the proxy inputs, the targets are also proxy.


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

class Runner(IRunner):
"""Single-stage deep learning Runner with user-friendly API.

Runner supports the logic for deep learning pipeline configuration with pure python code.
Runner supports the logic for deep learning pipeline configuration


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

@@ -66,7 +54,8 @@ class Runner(IRunner):

It does not automatically add Criterion, Optimizer or Scheduler callbacks.

That means, that you have do optimization step by yourself during ``handle_batch`` method
That means, that you have do optimization step by yourself during


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

neptune_run_kwargs: Optional, additional keyword arguments to be passed directly to the
`neptune.init() <https://docs.neptune.ai/api-reference/neptune#init>`_ function.
(default: SETTINGS.log_epoch_metrics or True).
neptune_run_kwargs: Optional, additional keyword arguments


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

`neptune.init() <https://docs.neptune.ai/api-reference/neptune#init>`_ function.
(default: SETTINGS.log_epoch_metrics or True).
neptune_run_kwargs: Optional, additional keyword arguments
to be passed directly to the


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

(default: SETTINGS.log_epoch_metrics or True).
neptune_run_kwargs: Optional, additional keyword arguments
to be passed directly to the
`neptune.init() <https://docs.neptune.ai/api-reference/neptune#init>`_


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

@@ -98,21 +98,20 @@ def _log_metrics(self, metrics: Dict[str, float], step: int, loader_key: str):
log_line_csv = log_line_csv[:-1] + "\n" # replace last "," with new line
self.loggers[loader_key].write(log_line_csv)

def log_hparams(self, hparams: Dict) -> None:
"""Logs hyperparameters to the logger."""
if scope == "experiment":


[pep8] reported by reviewdog 🐶
F821 undefined name 'scope'
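The F821 here suggests the new `log_hparams` signature dropped the argument its body still branches on. A hedged sketch restoring `scope` as a parameter; whether catalyst kept `scope` as an argument or removed the branch instead is not visible in this diff:

```python
# Hypothetical sketch: restore `scope` as a parameter so the branch no longer
# references an undefined name. The real intended signature is not shown here.
from typing import Dict


def log_hparams(hparams: Dict, scope: str = None) -> None:
    """Logs hyperparameters to the logger."""
    if scope == "experiment":
        print(hparams)
```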

@@ -25,7 +25,8 @@ class WandbLogger(ILogger):
(default: SETTINGS.log_batch_metrics or False).
log_epoch_metrics: boolean flag to log epoch metrics
(default: SETTINGS.log_epoch_metrics or True).
kwargs: Optional, additional keyword arguments to be passed directly to the wandb.init
kwargs: Optional,


[pep8] reported by reviewdog 🐶
W291 trailing whitespace

artifact = wandb.Artifact(
name=self.run.id + "_aritfacts",
type="artifact",
metadata={"loader_key": loader_key, "scope": scope,},


[pep8] reported by reviewdog 🐶
E231 missing whitespace after ','

@@ -22,21 +22,20 @@ def __init__(self, log_hparams: bool = False):
super().__init__(log_batch_metrics=False, log_epoch_metrics=True)
self._log_hparams = log_hparams

def log_hparams(self, hparams: Dict) -> None:
"""Logs hyperparameters to the console."""
if scope == "experiment" and self._log_hparams:


[pep8] reported by reviewdog 🐶
F821 undefined name 'scope'

@@ -195,7 +195,7 @@ def get_engine(self) -> IEngine:
engine = REGISTRY.get_from_params(**engine_params)
else:
engine = get_available_engine(
fp16=self._fp16, ddp=self._ddp, amp=self._amp, apex=self._apex
fp16=self._fp16, ddp=self._ddp, fp16=self._amp, apex=self._apex


[pep8] reported by reviewdog 🐶
E999 SyntaxError: keyword argument repeated

@@ -0,0 +1,146 @@
from typing import Callable, Dict, List, TYPE_CHECKING, Union


[pep8] reported by reviewdog 🐶
F401 'typing.TYPE_CHECKING' imported but unused

from torch import nn, Tensor
import torch.backends

from catalyst.settings import SETTINGS


[pep8] reported by reviewdog 🐶
F401 'catalyst.settings.SETTINGS' imported but unused

@review-notebook-app

Check out this pull request on ReviewNB: see visual diffs & provide feedback on Jupyter Notebooks. (Powered by ReviewNB)

# # Torch
# @mark.skipif(not requirements_satisfied, reason="catalyst[ml] and catalyst[cv] required")

[pep8] reported by reviewdog 🐶
E501 line too long (91 > 89 characters)



# # Torch
# @mark.skipif(not requirements_satisfied, reason="catalyst[ml] and catalyst[cv] required")

[pep8] reported by reviewdog 🐶
W505 doc line too long (91 > 89 characters)

runner = SupervisedRunner()

runner.evaluate_loader(loader=loader, callbacks=callbacks, model=model)


def test_epoch_increasing():

[pep8] reported by reviewdog 🐶
D103 Missing docstring in public function


# data
num_samples, num_features = int(1e4), int(1e1)
X = torch.rand(num_samples, num_features)

[pep8] reported by reviewdog 🐶
N806 variable 'X' in function should be lowercase

# "train": DataLoader(MNIST(os.getcwd(), train=False), batch_size=32,),
# "valid": DataLoader(MNIST(os.getcwd(), train=False), batch_size=32,),
# }
# model = nn.Sequential(Flatten(), nn.Linear(784, 512), nn.ReLU(), nn.Linear(512, 10))

[pep8] reported by reviewdog 🐶
E501 line too long (90 > 89 characters)

# "train": DataLoader(MNIST(os.getcwd(), train=False), batch_size=32,),
# "valid": DataLoader(MNIST(os.getcwd(), train=False), batch_size=32,),
# }
# model = nn.Sequential(Flatten(), nn.Linear(784, 512), nn.ReLU(), nn.Linear(512, 10))

[pep8] reported by reviewdog 🐶
W505 doc line too long (90 > 89 characters)

# logits = self.model(batch["features"].view(batch["features"].size(0), -1))

# loss = F.cross_entropy(logits, batch["targets"])
# accuracy01, accuracy03 = metrics.accuracy(logits, batch["targets"], topk=(1, 3))

[pep8] reported by reviewdog 🐶
E501 line too long (90 > 89 characters)

# logits = self.model(batch["features"].view(batch["features"].size(0), -1))

# loss = F.cross_entropy(logits, batch["targets"])
# accuracy01, accuracy03 = metrics.accuracy(logits, batch["targets"], topk=(1, 3))

[pep8] reported by reviewdog 🐶
W505 doc line too long (90 > 89 characters)

)
from catalyst.utils.distributed import get_nn_from_ddp_module
from catalyst.utils.misc import maybe_recursive_call, merge_dicts
from catalyst.utils.misc import maybe_recursive_call

[pep8] reported by reviewdog 🐶
F401 'catalyst.utils.misc.maybe_recursive_call' imported but unused

@@ -1,244 +1,22 @@
from typing import Dict, Iterable, Union
from collections import OrderedDict
from typing import Any, Dict, List, Union

[pep8] reported by reviewdog 🐶
F401 'typing.Dict' imported but unused

@@ -1,244 +1,22 @@
from typing import Dict, Iterable, Union
from collections import OrderedDict
from typing import Any, Dict, List, Union

[pep8] reported by reviewdog 🐶
F401 'typing.Union' imported but unused

from catalyst import utils
from catalyst.core.callback import Callback, CallbackOrder
from catalyst.core.runner import IRunner
from catalyst.utils import (

[pep8] reported by reviewdog 🐶
F401 'catalyst.utils.save_checkpoint' imported but unused

from catalyst import utils
from catalyst.core.callback import Callback, CallbackOrder
from catalyst.core.runner import IRunner
from catalyst.utils import (

[pep8] reported by reviewdog 🐶
F401 'catalyst.utils.unpack_checkpoint' imported but unused

logdir=self.logdir,
runner=runner,
mapping=self.load_on_stage_start,
obj = dict(

[pep8] reported by reviewdog 🐶
C408 Unnecessary dict call - rewrite as a literal.
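C408 asks for a dict literal instead of the `dict(...)` call; a minimal sketch of the fix, with placeholder values for the names shown in the surrounding diff:

```python
# Sketch of the C408 fix: prefer a dict literal over a dict(...) call.
# Values are placeholders based on the surrounding diff, not the real objects.
logdir, runner, mapping = "logs", object(), {}

# flagged: obj = dict(logdir=logdir, runner=runner, mapping=mapping)  # C408
obj = {"logdir": logdir, "runner": runner, "mapping": mapping}
```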

@Scitator Scitator merged commit fccfb60 into master Feb 4, 2022
@mergify mergify bot deleted the dev22 branch February 4, 2022 07:27
3 participants