Trainer profilers are typehinted with the deprecated BaseProfiler instead of Profiler #13046
🐛 Bug
Trainer profilers are typehinted with the deprecated BaseProfiler instead of Profiler. This means that you cannot use class_path initialization of profilers with LightningCLI: the concrete profilers (e.g. PyTorchProfiler) subclass the new Profiler rather than the deprecated alias, so they are rejected against the old annotation. Error message:
To Reproduce
Use this BoringModel code modified for LightningCLI (pl_bug.py), along with this trainer config (bug_trainer.yaml). Run the script from the command line with
$ python pl_bug.py fit --trainer bug_trainer.yaml
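The pl_bug.py and bug_trainer.yaml attachments are not reproduced above. A minimal sketch of what they might look like is given below; the module contents and the config keys are illustrative assumptions, not the reporter's actual files:

```python
# pl_bug.py -- illustrative sketch only, not the reporter's attachment.
# Assumes pytorch_lightning 1.6.x, where LightningCLI lives in pytorch_lightning.utilities.cli.
import torch
from torch.utils.data import DataLoader
from pytorch_lightning import LightningModule
from pytorch_lightning.utilities.cli import LightningCLI


class BoringModel(LightningModule):
    """Minimal stand-in for the BoringModel used in Lightning bug reports."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        return {"loss": self(batch).sum()}

    def train_dataloader(self):
        # random data is enough to exercise the profiler configuration path
        return DataLoader(torch.randn(64, 32), batch_size=2)

    def configure_optimizers(self):
        return torch.optim.SGD(self.layer.parameters(), lr=0.1)


if __name__ == "__main__":
    # bug_trainer.yaml (illustrative) would select the profiler by class_path, e.g.:
    #   profiler:
    #     class_path: pytorch_lightning.profiler.PyTorchProfiler
    LightningCLI(BoringModel)
```

With the current BaseProfiler annotation, parsing such a config fails; with the proposed Profiler annotation it should be accepted.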
Expected behavior
The script runs, and the PyTorch profiler is instantiated and used.
Fix
Change line 176 in trainer/trainer.py from
profiler: Optional[Union[BaseProfiler, str]] = None,
to
profiler: Optional[Union[Profiler, str]] = None,
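For context, a minimal sketch of the corrected parameter together with the import it would need; this assumes the pytorch_lightning.profiler import path used in 1.6.x and elides everything else in the Trainer signature:

```python
# Sketch only: a placeholder class, not the real pytorch_lightning.Trainer.
from typing import Optional, Union

from pytorch_lightning.profiler import Profiler  # BaseProfiler is the deprecated alias of this class


class Trainer:
    def __init__(
        self,
        profiler: Optional[Union[Profiler, str]] = None,  # previously Optional[Union[BaseProfiler, str]]
    ) -> None:
        self.profiler = profiler
```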
Related issue:
**profiler_kwargs in profiler/pytorch.py is typehinted as Any, which gives the error:
A fix would be to typehint profiler_kwargs as Dict with an empty dict as the default argument, i.e. change
**profiler_kwargs: Any,
to
profiler_kwargs: Dict = {},
This appears related to Refactor use of **kwargs in PL classes for better LightningCLI support · Issue #11653 · PyTorchLightning/pytorch-lightning (github.com).
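To make that before/after concrete, a minimal sketch, assuming the kwargs are simply stored on the instance (the real PyTorchProfiler passes them through to the underlying torch profiler; the class names below are placeholders):

```python
# Placeholder classes illustrating the signature change only, not the real implementation.
from typing import Any, Dict


class PyTorchProfilerCurrent:
    def __init__(self, **profiler_kwargs: Any) -> None:
        # **kwargs has no named, typed parameter that jsonargparse can expose in a YAML config
        self._profiler_kwargs = profiler_kwargs


class PyTorchProfilerProposed:
    def __init__(self, profiler_kwargs: Dict = {}) -> None:
        # an explicit dict parameter can be set from a LightningCLI config; copying it
        # avoids sharing the mutable default (Optional[Dict] = None would avoid it entirely)
        self._profiler_kwargs = dict(profiler_kwargs)
```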
Environment
- CUDA:
  - GPU:
    - NVIDIA GeForce GTX 1080 Ti
  - available: True
  - version: 10.2
- Packages:
  - numpy: 1.21.6
  - pyTorch_debug: False
  - pyTorch_version: 1.11.0+cu102
  - pytorch-lightning: 1.6.3
  - tqdm: 4.64.0
- System:
  - OS: Linux
  - architecture:
    - 64bit
    - ELF
  - processor: x86_64
  - python: 3.9.12
  - version: #45~20.04.1-Ubuntu SMP Mon Apr 4 09:38:31 UTC 2022
Additional context
cc @otaj @Borda @carmocca @kaushikb11 @ninginthecloud @rohitgr7 @nbcsm @guotuofeng