v22: simplification 2.0 #1395
Conversation
Hello @Scitator! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:
Comment last updated at 2022-01-26 05:47:43 UTC
catalyst/loggers/wandb.py
Outdated
"loader_key": loader_key, | ||
"scope": scope, | ||
}, | ||
metadata={"stage_key": stage_key, "loader_key": loader_key, "scope": scope,}, |
[pep8] reported by reviewdog 🐶
E231 missing whitespace after ','
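For context, E231 fires on the `scope,}` sequence: the comma before the closing brace has no trailing whitespace. A minimal sketch of the usual fix (drop the dangling comma, or let the formatter expand the call); names are taken from the hunk above, with dummy values added so the snippet stands alone:

```python
stage_key, loader_key, scope = "stage", "train", "epoch"  # dummy values for illustration

# Flagged form: the "," right before "}" has no following whitespace (E231)
# metadata = {"stage_key": stage_key, "loader_key": loader_key, "scope": scope,}

# Fixed form: no dangling comma in the single-line dict literal
metadata = {"stage_key": stage_key, "loader_key": loader_key, "scope": scope}
```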
Args:
device: use device, default is `"cpu"`.
class CPUEngine(IEngine):
[pep8] reported by reviewdog 🐶
D101 Missing docstring in public class
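D101 only asks for a class-level docstring. A minimal sketch of what would silence it; the wording is illustrative, not the author's:

```python
class CPUEngine(IEngine):  # IEngine as defined in catalyst/core/engine.py below
    """Engine that runs training on CPU (illustrative wording only)."""
```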
.. code-block:: python
class GPUEngine(IEngine):
[pep8] reported by reviewdog 🐶
D101 Missing docstring in public class
engine=dl.DeviceEngine("cuda:1"),
...
)
class DataParallelEngine(GPUEngine):
[pep8] reported by reviewdog 🐶
D101 Missing docstring in public class
...
)
class DataParallelEngine(GPUEngine):
def prepare_model(self, model):
[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method
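Same family of warning as D101, but at method level (D102). A hedged sketch; the docstring text is a guess at intent, not the author's wording:

```python
class DataParallelEngine(GPUEngine):
    """Engine for multi-GPU data-parallel training (illustrative wording)."""

    def prepare_model(self, model):
        """Prepare the model for data-parallel execution (illustrative wording)."""
        ...
```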
from catalyst.typing import Criterion, Device, Model, Optimizer, Scheduler
class IEngine(Accelerator):
[pep8] reported by reviewdog 🐶
D101 Missing docstring in public class
catalyst/core/engine.py
Outdated
from catalyst.typing import Criterion, Device, Model, Optimizer, Scheduler
class IEngine(Accelerator):
@staticmethod
def spawn(fn: Callable):
[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method
Default is `1`.
"""
@staticmethod
def setup(local_rank: int, world_size: int):
[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method
def cleanup_process(self):
"""Clean DDP variables and processes."""
@staticmethod
def cleanup():
[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method
Args:
tensor_or_module: tensor to mode
"""
def mean_reduce_ddp_metrics(self, metrics: Dict):
[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method
catalyst/core/runner.py
Outdated
EPOCH_METRICS = Dict[str, LOADER_METRICS]
# {0: {"train": {}, "valid": {}}, 1: {...}}
EXPERIMENT_METRICS = Dict[int, EPOCH_METRICS]
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
criterion: Criterion = None,
optimizer: Optimizer = None,
scheduler: Scheduler = None,
criterion: TorchCriterion = None,
[pep8] reported by reviewdog 🐶
DAR102 Excess parameter(s) in Docstring: + trial
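DAR102 means the docstring still documents a `trial` argument that the updated signature no longer accepts. A hedged sketch of the shape of the fix, using a hypothetical constructor only for illustration:

```python
def __init__(self, criterion=None, optimizer=None, scheduler=None):
    """Hypothetical constructor, shown only to illustrate the DAR102 fix.

    Args:
        criterion: criterion to use
        optimizer: optimizer to use
        scheduler: scheduler to use
    """
    # Either delete the stale ``trial`` entry from the Args section (as above),
    # or re-add ``trial`` to the signature if it is still meant to be supported.
```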
catalyst/runners/config.py
Outdated
@@ -195,7 +195,7 @@ def get_engine(self) -> IEngine:
engine = REGISTRY.get_from_params(**engine_params)
else:
engine = get_available_engine(
fp16=self._fp16, ddp=self._ddp, amp=self._amp, apex=self._apex
fp16=self._fp16, ddp=self._ddp, fp16=self._amp, apex=self._apex
[pep8] reported by reviewdog 🐶
E999 SyntaxError: keyword argument repeated
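The repeated `fp16=` keyword is a hard syntax error. Judging from the removed line just above, the second occurrence was presumably meant to stay `amp=`; a sketch of the corrected call:

```python
engine = get_available_engine(
    fp16=self._fp16, ddp=self._ddp, amp=self._amp, apex=self._apex
)
```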
def __init__(
self,
model: RunnerModel = None,
[pep8] reported by reviewdog 🐶
F821 undefined name 'RunnerModel'
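F821 here usually just means the annotation's imports are missing in the touched module. A minimal sketch, assuming the names are importable from `catalyst.typing` and `catalyst.core.engine` as elsewhere in this diff (both paths are assumptions), with a hypothetical class name:

```python
from catalyst.core.engine import IEngine  # assumption: IEngine lives here
from catalyst.typing import RunnerModel   # assumption: RunnerModel lives here


class SomeRunner:  # hypothetical name, for illustration only
    def __init__(self, model: RunnerModel = None, engine: IEngine = None):
        self.model = model
        self.engine = engine
```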
def __init__(
self,
model: RunnerModel = None,
engine: IEngine = None,
[pep8] reported by reviewdog 🐶
F821 undefined name 'IEngine'
Runner.__init__(self, model=model, engine=engine)
self.loss_mode_prefix = loss_mode_prefix

@torch.no_grad()
[pep8] reported by reviewdog 🐶
F821 undefined name 'torch'
output = self.forward(batch, **kwargs)
return output

def get_callbacks(self) -> "OrderedDict[str, Callback]":
[pep8] reported by reviewdog 🐶
F821 undefined name 'OrderedDict'
catalyst/callbacks/checkpoint.py
Outdated
self._storage: List[Checkpoint] = []
os.makedirs(self.logdir, exist_ok=True)

def save(self, runner: "IRunner", obj: Any, logprefix: str) -> str:
[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method
"""Event handler.""" | ||
self._storage: List[Checkpoint] = [] | ||
|
||
def on_epoch_end_best(self, runner: "IRunner") -> None: |
[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method
# worker sync
runner.engine.barrier()
return
def on_epoch_end_last(self, runner: "IRunner") -> None:
[pep8] reported by reviewdog 🐶
D102 Missing docstring in public method
catalyst/callbacks/optimizer.py
Outdated
@@ -2,7 +2,7 @@
from functools import partial
import logging

from catalyst.core.callback import CallbackNode, CallbackOrder, IOptimizerCallback
from catalyst.core.callback import CallbackOrder, IOptimizerCallback
[pep8] reported by reviewdog 🐶
F401 'catalyst.core.callback.CallbackOrder' imported but unused
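The usual F401 fix is simply to stop importing the unused name, assuming `CallbackOrder` really is no longer referenced anywhere in the module after the refactor:

```python
# Before
# from catalyst.core.callback import CallbackOrder, IOptimizerCallback

# After: import only what the module still uses
from catalyst.core.callback import IOptimizerCallback
```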
catalyst/core/runner.py
Outdated
import logging
import warnings

from accelerate.state import DistributedType
[pep8] reported by reviewdog 🐶
F401 'accelerate.state.DistributedType' imported but unused
catalyst/runners/runner.py
Outdated
@@ -852,8 +537,8 @@ def predict_batch(self, batch: Mapping[str, Any], **kwargs) -> Mapping[str, Any]
Run model inference on specified data batch.

.. warning::
You should not override this method. If you need specific model
call, override forward() method
You should not override this method.
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/callbacks/sklearn_model.py
Outdated
@@ -14,13 +14,15 @@ class SklearnModelCallback(Callback):
to give predictions on the valid loader.

Args:
feature_key: keys of tensors that should be used as features in the classifier calculations
target_key: keys of tensors that should be used as targets in the classifier calculations
feature_key: keys of tensors that should be used as features
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/callbacks/sklearn_model.py
Outdated
target_key: keys of tensors that should be used as targets in the classifier calculations
feature_key: keys of tensors that should be used as features
for the classifier fit
target_key: keys of tensors that should be used as targets
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/callbacks/onnx.py
Outdated
@@ -16,7 +16,8 @@ class OnnxCallback(Callback):
input_key: input key from ``runner.batch`` to use for onnx export
logdir: path to folder for saving
filename: filename
method_name (str, optional): Forward pass method to be converted. Defaults to "forward".
method_name (str, optional): Forward pass method to be converted.
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/callbacks/tracing.py
Outdated
@@ -81,7 +81,8 @@ def __init__(
input_key: input key from ``runner.batch`` to use for model tracing
logdir: path to folder for saving
filename: filename
method_name: Model's method name that will be used as entrypoint during tracing
method_name: Model's method name
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/callbacks/mixup.py
Outdated
also proxy. We are not interested in them, are we? So, if ``on_train_only``
alpha: beta distribution a=b parameters. Must be >=0.
The more alpha closer to zero the less effect of the mixup.
mode: mode determines the method of use. Must be in ["replace", "add"].
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/callbacks/mixup.py
Outdated
alpha: beta distribution a=b parameters. Must be >=0.
The more alpha closer to zero the less effect of the mixup.
mode: mode determines the method of use. Must be in ["replace", "add"].
If "replace" then replaces the batch with a mixed one,
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/callbacks/mixup.py
Outdated
mode: mode determines the method of use. Must be in ["replace", "add"].
If "replace" then replaces the batch with a mixed one,
while the batch size is not changed.
If "add", concatenates mixed examples to the current ones,
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/callbacks/mixup.py
Outdated
while the batch size is not changed.
If "add", concatenates mixed examples to the current ones,
the batch size increases by 2 times.
on_train_only: apply to train only.
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/callbacks/mixup.py
Outdated
If "add", concatenates mixed examples to the current ones, | ||
the batch size increases by 2 times. | ||
on_train_only: apply to train only. | ||
As the mixup use the proxy inputs, the targets are also proxy. |
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/runners/runner.py
Outdated
class Runner(IRunner):
"""Single-stage deep learning Runner with user-friendly API.

Runner supports the logic for deep learning pipeline configuration with pure python code.
Runner supports the logic for deep learning pipeline configuration
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/runners/runner.py
Outdated
@@ -66,7 +54,8 @@ class Runner(IRunner):

It does not automatically add Criterion, Optimizer or Scheduler callbacks.

That means, that you have do optimization step by yourself during ``handle_batch`` method
That means, that you have do optimization step by yourself during
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/loggers/neptune.py
Outdated
neptune_run_kwargs: Optional, additional keyword arguments to be passed directly to the
`neptune.init() <https://docs.neptune.ai/api-reference/neptune#init>`_ function.
(default: SETTINGS.log_epoch_metrics or True).
neptune_run_kwargs: Optional, additional keyword arguments
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/loggers/neptune.py
Outdated
`neptune.init() <https://docs.neptune.ai/api-reference/neptune#init>`_ function.
(default: SETTINGS.log_epoch_metrics or True).
neptune_run_kwargs: Optional, additional keyword arguments
to be passed directly to the
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/loggers/neptune.py
Outdated
(default: SETTINGS.log_epoch_metrics or True).
neptune_run_kwargs: Optional, additional keyword arguments
to be passed directly to the
`neptune.init() <https://docs.neptune.ai/api-reference/neptune#init>`_
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/loggers/csv.py
Outdated
@@ -98,21 +98,20 @@ def _log_metrics(self, metrics: Dict[str, float], step: int, loader_key: str):
log_line_csv = log_line_csv[:-1] + "\n" # replace last "," with new line
self.loggers[loader_key].write(log_line_csv)

def log_hparams(self, hparams: Dict) -> None:
"""Logs hyperparameters to the logger."""
if scope == "experiment":
[pep8] reported by reviewdog 🐶
F821 undefined name 'scope'
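The refactored `log_hparams` dropped `scope` from its signature while the body still branches on it. A hedged sketch of one way to reconcile the two (method body only); whether `scope` should return as a parameter or the check should be removed is an assumption about the intended API:

```python
def log_hparams(self, hparams: Dict, scope: str = None) -> None:
    """Logs hyperparameters to the logger."""
    if scope == "experiment":  # `scope` is defined again, so F821 goes away
        ...  # write hparams once, at experiment scope
```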
catalyst/loggers/wandb.py
Outdated
@@ -25,7 +25,8 @@ class WandbLogger(ILogger):
(default: SETTINGS.log_batch_metrics or False).
log_epoch_metrics: boolean flag to log epoch metrics
(default: SETTINGS.log_epoch_metrics or True).
kwargs: Optional, additional keyword arguments to be passed directly to the wandb.init
kwargs: Optional,
[pep8] reported by reviewdog 🐶
W291 trailing whitespace
catalyst/loggers/wandb.py
Outdated
artifact = wandb.Artifact(
name=self.run.id + "_aritfacts",
type="artifact",
metadata={"loader_key": loader_key, "scope": scope,},
[pep8] reported by reviewdog 🐶
E231 missing whitespace after ','
catalyst/loggers/console.py
Outdated
@@ -22,21 +22,20 @@ def __init__(self, log_hparams: bool = False):
super().__init__(log_batch_metrics=False, log_epoch_metrics=True)
self._log_hparams = log_hparams

def log_hparams(self, hparams: Dict) -> None:
"""Logs hyperparameters to the console."""
if scope == "experiment" and self._log_hparams:
[pep8] reported by reviewdog 🐶
F821 undefined name 'scope'
catalyst/contrib/runners/config.py
Outdated
@@ -195,7 +195,7 @@ def get_engine(self) -> IEngine:
engine = REGISTRY.get_from_params(**engine_params)
else:
engine = get_available_engine(
fp16=self._fp16, ddp=self._ddp, amp=self._amp, apex=self._apex
fp16=self._fp16, ddp=self._ddp, fp16=self._amp, apex=self._apex
[pep8] reported by reviewdog 🐶
E999 SyntaxError: keyword argument repeated
catalyst/contrib/utils/torch2.py
Outdated
@@ -0,0 +1,146 @@
from typing import Callable, Dict, List, TYPE_CHECKING, Union
[pep8] reported by reviewdog 🐶
F401 'typing.TYPE_CHECKING' imported but unused
catalyst/contrib/utils/torch2.py
Outdated
from torch import nn, Tensor
import torch.backends

from catalyst.settings import SETTINGS
[pep8] reported by reviewdog 🐶
F401 'catalyst.settings.SETTINGS' imported but unused
Check out this pull request on ReviewNB: see visual diffs & provide feedback on Jupyter Notebooks. Powered by ReviewNB.
# # Torch
# @mark.skipif(not requirements_satisfied, reason="catalyst[ml] and catalyst[cv] required")
[pep8] reported by reviewdog 🐶
E501 line too long (91 > 89 characters)
# # Torch
# @mark.skipif(not requirements_satisfied, reason="catalyst[ml] and catalyst[cv] required")
[pep8] reported by reviewdog 🐶
W505 doc line too long (91 > 89 characters)
runner = SupervisedRunner()

runner.evaluate_loader(loader=loader, callbacks=callbacks, model=model)


def test_epoch_increasing():
[pep8] reported by reviewdog 🐶
D103 Missing docstring in public function
# data
num_samples, num_features = int(1e4), int(1e1)
X = torch.rand(num_samples, num_features)
[pep8] reported by reviewdog 🐶
N806 variable 'X' in function should be lowercase
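N806 only objects to the capitalised local name inside a function; two common resolutions, sketched below (which one to prefer is a judgment call, since `X` is conventional in ML code):

```python
import torch

num_samples, num_features = int(1e4), int(1e1)

# Option 1: rename the local variable to lowercase
x = torch.rand(num_samples, num_features)

# Option 2: keep the conventional ML name and silence the check locally
X = torch.rand(num_samples, num_features)  # noqa: N806
```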
# "train": DataLoader(MNIST(os.getcwd(), train=False), batch_size=32,), | ||
# "valid": DataLoader(MNIST(os.getcwd(), train=False), batch_size=32,), | ||
# } | ||
# model = nn.Sequential(Flatten(), nn.Linear(784, 512), nn.ReLU(), nn.Linear(512, 10)) |
[pep8] reported by reviewdog 🐶
E501 line too long (90 > 89 characters)
# "train": DataLoader(MNIST(os.getcwd(), train=False), batch_size=32,), | ||
# "valid": DataLoader(MNIST(os.getcwd(), train=False), batch_size=32,), | ||
# } | ||
# model = nn.Sequential(Flatten(), nn.Linear(784, 512), nn.ReLU(), nn.Linear(512, 10)) |
[pep8] reported by reviewdog 🐶
W505 doc line too long (90 > 89 characters)
# logits = self.model(batch["features"].view(batch["features"].size(0), -1))

# loss = F.cross_entropy(logits, batch["targets"])
# accuracy01, accuracy03 = metrics.accuracy(logits, batch["targets"], topk=(1, 3))
[pep8] reported by reviewdog 🐶
E501 line too long (90 > 89 characters)
# logits = self.model(batch["features"].view(batch["features"].size(0), -1))

# loss = F.cross_entropy(logits, batch["targets"])
# accuracy01, accuracy03 = metrics.accuracy(logits, batch["targets"], topk=(1, 3))
[pep8] reported by reviewdog 🐶
W505 doc line too long (90 > 89 characters)
catalyst/utils/torch.py
Outdated
)
from catalyst.utils.distributed import get_nn_from_ddp_module
from catalyst.utils.misc import maybe_recursive_call, merge_dicts
from catalyst.utils.misc import maybe_recursive_call
[pep8] reported by reviewdog 🐶
F401 'catalyst.utils.misc.maybe_recursive_call' imported but unused
catalyst/callbacks/checkpoint.py
Outdated
@@ -1,244 +1,22 @@
from typing import Dict, Iterable, Union
from collections import OrderedDict
from typing import Any, Dict, List, Union
[pep8] reported by reviewdog 🐶
F401 'typing.Dict' imported but unused
catalyst/callbacks/checkpoint.py
Outdated
@@ -1,244 +1,22 @@
from typing import Dict, Iterable, Union
from collections import OrderedDict
from typing import Any, Dict, List, Union
[pep8] reported by reviewdog 🐶
F401 'typing.Union' imported but unused
from catalyst import utils
from catalyst.core.callback import Callback, CallbackOrder
from catalyst.core.runner import IRunner
from catalyst.utils import (
[pep8] reported by reviewdog 🐶
F401 'catalyst.utils.save_checkpoint' imported but unused
from catalyst import utils
from catalyst.core.callback import Callback, CallbackOrder
from catalyst.core.runner import IRunner
from catalyst.utils import (
[pep8] reported by reviewdog 🐶
F401 'catalyst.utils.unpack_checkpoint' imported but unused
catalyst/callbacks/checkpoint.py
Outdated
logdir=self.logdir,
runner=runner,
mapping=self.load_on_stage_start,
obj = dict(
[pep8] reported by reviewdog 🐶
C408 Unnecessary dict call - rewrite as a literal.
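C408 just wants a dict literal instead of the `dict()` call. Since the rest of the call is not visible in this hunk, the keys below are hypothetical and only show the shape of the rewrite:

```python
# Before (C408): obj = dict(model=model, checkpoint_data=checkpoint_data)
# After: literal syntax, same contents
obj = {"model": model, "checkpoint_data": checkpoint_data}
```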
Pull Request FAQ
Description
-l 89 (better view on 16'' screen ;) )
Related Issue
Type of Change
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in Github issues there's a high chance it will not be merged.
Checklist
latest and minimal requirements?