
data monitor callbacks #285

Merged: 8 commits, Nov 6, 2020
Conversation

@awaelchli (Contributor) commented on Oct 18, 2020

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

What does this PR do?

I'd like to contribute two callbacks, TrainingDataMonitor and ModuleDataMonitor, as proposed in #194:
https://github.com/awaelchli/pytorch-lightning-snippets#callbacks

TrainingDataMonitor

    from pl_bolts.callbacks import TrainingDataMonitor
    from pytorch_lightning import Trainer

    # log the histograms of input data sent to LightningModule.training_step
    monitor = TrainingDataMonitor(log_every_n_steps=25)

    model = YourLightningModule()
    trainer = Trainer(callbacks=[monitor])
    trainer.fit(model)

(screenshot: histograms of the training batch logged by TrainingDataMonitor)
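For readers who want to try the snippet end to end, here is a minimal sketch with a hypothetical stand-in for `YourLightningModule` (the toy module and its random dataset are illustrative and not part of this PR):

    import torch
    from torch.utils.data import DataLoader, TensorDataset
    from pytorch_lightning import LightningModule, Trainer
    from pl_bolts.callbacks import TrainingDataMonitor

    # hypothetical toy module; any LightningModule with a training_step would do
    class ToyRegressionModule(LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(16, 1)

        def forward(self, x):
            return self.layer(x)

        def training_step(self, batch, batch_idx):
            x, y = batch
            return torch.nn.functional.mse_loss(self(x), y)

        def configure_optimizers(self):
            return torch.optim.SGD(self.parameters(), lr=0.1)

        def train_dataloader(self):
            # random data; 1024 samples / batch_size 32 gives 32 steps,
            # so histograms get logged at least once with log_every_n_steps=25
            dataset = TensorDataset(torch.randn(1024, 16), torch.randn(1024, 1))
            return DataLoader(dataset, batch_size=32)

    monitor = TrainingDataMonitor(log_every_n_steps=25)
    trainer = Trainer(callbacks=[monitor], max_epochs=1)
    trainer.fit(ToyRegressionModule())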

ModuleDataMonitor

    from pl_bolts.callbacks import ModuleDataMonitor
    from pytorch_lightning import Trainer

    # log the input and output histograms of LightningModule's `forward`
    monitor = ModuleDataMonitor()

    # all submodules in LightningModule
    monitor = ModuleDataMonitor(submodules=True)

    # specific submodules
    monitor = ModuleDataMonitor(submodules=["generator", "generator.conv1"])

    model = YourLightningModule()
    trainer = Trainer(callbacks=[monitor])
    trainer.fit(model)

(screenshot: input/output histograms logged by ModuleDataMonitor)
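To make the `submodules` argument concrete, here is a hypothetical module layout in which the names "generator" and "generator.conv1" from the snippet above would resolve as attribute paths (the GAN-style structure below is illustrative only):

    import torch
    from pytorch_lightning import LightningModule

    class Generator(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.conv1 = torch.nn.Conv2d(3, 16, kernel_size=3, padding=1)
            self.conv2 = torch.nn.Conv2d(16, 3, kernel_size=3, padding=1)

        def forward(self, x):
            return self.conv2(torch.relu(self.conv1(x)))

    class ToyGAN(LightningModule):
        def __init__(self):
            super().__init__()
            # "generator" refers to self.generator,
            # "generator.conv1" to self.generator.conv1
            self.generator = Generator()

        def forward(self, x):
            # ModuleDataMonitor() with no arguments logs the input and output of this forward;
            # passing submodules=["generator", "generator.conv1"] additionally monitors those layers
            return self.generator(x)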

@awaelchli added the enhancement (New feature or request) label on Oct 18, 2020
mergify bot requested a review from @Borda on October 18, 2020 18:14
codecov bot commented on Oct 18, 2020
Codecov Report

Merging #285 into master will decrease coverage by 0.73%.
The diff coverage is 90.99%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #285      +/-   ##
==========================================
- Coverage   83.86%   83.12%   -0.74%     
==========================================
  Files          91       92       +1     
  Lines        4858     5066     +208     
==========================================
+ Hits         4074     4211     +137     
- Misses        784      855      +71     
Flag         Coverage Δ
#cpu         24.65% <43.24%> (+0.73%) ⬆️
#pytest      24.65% <43.24%> (+0.73%) ⬆️
#unittests   83.50% <90.99%> (+0.18%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files                                       Coverage Δ
pl_bolts/callbacks/data_monitor.py                   90.90% <90.90%> (ø)
pl_bolts/callbacks/__init__.py                       100.00% <100.00%> (ø)
pl_bolts/callbacks/self_supervised.py                77.77% <0.00%> (-22.23%) ⬇️
pl_bolts/models/self_supervised/amdim/datasets.py    57.69% <0.00%> (-2.69%) ⬇️
pl_bolts/models/self_supervised/cpc/cpc_module.py    23.65% <0.00%> (+3.18%) ⬆️

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f48357b...40870f6.

@awaelchli marked this pull request as draft on October 18, 2020 18:45
@awaelchli marked this pull request as ready for review on October 18, 2020 20:14
pl_bolts/callbacks/data_monitor.py (review thread, resolved)


    @mock.patch("pl_bolts.callbacks.data_monitor.TrainingDataMonitor.log_histogram")
    def test_training_data_monitor(log_histogram, tmpdir):

Member commented:

in 3 tests just with different parameters?
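One hedged sketch of how the three tests could collapse into a single parametrized test, as the reviewer suggests; the parameter values and the final assertion are illustrative guesses, not the PR's actual tests, and `ToyRegressionModule` is the stand-in module sketched after the TrainingDataMonitor example above:

    import pytest
    from unittest import mock
    from pytorch_lightning import Trainer
    from pl_bolts.callbacks import TrainingDataMonitor

    # hypothetical consolidation of the three similar tests
    @pytest.mark.parametrize("log_every_n_steps", [1, 2, 5])
    @mock.patch("pl_bolts.callbacks.data_monitor.TrainingDataMonitor.log_histogram")
    def test_training_data_monitor(log_histogram, log_every_n_steps, tmpdir):
        monitor = TrainingDataMonitor(log_every_n_steps=log_every_n_steps)
        model = ToyRegressionModule()  # stand-in module from the earlier sketch
        trainer = Trainer(default_root_dir=tmpdir, callbacks=[monitor], max_epochs=1)
        trainer.fit(model)
        # the patched log_histogram should have fired for at least one logged step
        assert log_histogram.call_count > 0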

Labels: datamodule (Anything related to datamodules), enhancement (New feature or request)

2 participants