
Metrics docs #2184 (Merged)

81 commits merged into master from doc8 on Jun 16, 2020
Conversation

@williamFalcon (Contributor) commented Jun 14, 2020

Fixes #2142

@mergify mergify bot requested a review from a team June 14, 2020 22:17
@Borda Borda added the docs Documentation related label Jun 14, 2020
@Borda Borda added this to the 0.8.0 milestone Jun 14, 2020

Example:

.. code-block:: python
@Borda (Member) commented Jun 14, 2020:

Let's make this a test:

Suggested change:
- .. code-block:: python
+ .. testcode::

@mergify mergify bot requested a review from a team June 14, 2020 22:24

1. A Metric class you can use to implement metrics with built-in distributed (ddp) support which are device agnostic.
2. A collection of popular metrics already implemented for you.
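For context, a minimal sketch of how the class-based metrics described above are used, mirroring the Precision doctest quoted later in this thread (the exact import path is an assumption for the 0.8.0 package):

    import torch
    # assumed import path for the built-in classification metrics
    from pytorch_lightning.metrics.classification import Precision

    pred = torch.tensor([0, 1, 2, 2])
    target = torch.tensor([0, 1, 2, 2])

    metric = Precision()          # built-in metric with ddp support, device agnostic
    score = metric(pred, target)  # returns a torch.Tensor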

Member commented:

Should we mention that the package also comes with an interface to sklearn metrics (with a warning that this is slow due to casting the tensors back and forth)?

Member commented:

We should. We should also mention that sklearn has to be installed separately.

Member commented:

We should make it clear how to import the different backends. Maybe something like:

# Use native backend
import pytorch_lightning.metrics.native as plm

# Use sklearn backend
import pytorch_lightning.metrics.sklearn as plm

# Use default (native if available, else sklearn)
import pytorch_lightning.metrics as plm

@williamFalcon (PR author) commented:

@SkafteNicki can you add the sklearn details in here?

@mergify mergify bot requested a review from a team June 15, 2020 07:57
@justusschock (Member) commented:

Maybe we should also show an example of how to parametrize ddp reduction during __init__?

And an example where you use a functional metric (and maybe also with ddp reduction/autocasting via the tensor_metric function wrapper/decorator)?
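For illustration, a rough sketch of the first request, forwarding the ddp reduction settings to the base class the same way the Precision __init__ in this PR's diff does (the TensorMetric import path and forward signature here are assumptions):

    import torch
    # assumed base-class import path; reduce_group/reduce_op mirror the Precision __init__ in this diff
    from pytorch_lightning.metrics.metric import TensorMetric

    class MyAccuracy(TensorMetric):
        def __init__(self, reduce_group=None, reduce_op=None):
            # pass the ddp process group and reduction op through to the base class
            super().__init__(name='my_accuracy',
                             reduce_group=reduce_group,
                             reduce_op=reduce_op)

        def forward(self, pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
            # fraction of exact matches; synced across processes by the base class under ddp
            return (pred == target).float().mean()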

@mergify mergify bot requested a review from a team June 15, 2020 12:45
@mergify mergify bot requested a review from a team June 15, 2020 13:36
@Borda (Member) left a comment:

I do not think it is a good idea to mix .. testcode:: into python files:
then you force every contributor to run the docs test instead of a simple pytest;
also, it is not consistent with what we already have;
moreover, there is no simple way to run the sphinx test on a particular file, you always have to run them all.
So let's stay with doctest in python files and sphinx tests in RST files.
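For reference, a toy docstring showing the doctest style referred to here, adapted from the Precision doctest quoted later in this thread (the function itself is hypothetical):

    def toy_precision_example():
        """Illustrates the doctest style used in python files.

        Example:
            >>> import torch
            >>> pred = torch.tensor([0, 1, 2, 2])
            >>> target = torch.tensor([0, 1, 2, 2])
            >>> bool((pred == target).all())
            True
        """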

@mergify mergify bot requested a review from a team June 15, 2020 14:02
@@ -1,30 +1,15 @@
"""
Member commented:

I would keep a short version here...

@mergify mergify bot requested a review from a team June 16, 2020 08:48
"""
super().__init__(name='precision',
reduce_group=reduce_group,
reduce_op=reduce_op)

# Fixme: bug with result value
Member commented:

@SkafteNicki mind checking this one?

@mergify mergify bot requested a review from a team June 16, 2020 08:55
williamFalcon and others added 8 commits June 16, 2020 06:18
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
@williamFalcon williamFalcon merged commit 55fbcc0 into master Jun 16, 2020
@Borda Borda deleted the doc8 branch June 16, 2020 11:54
Comment on lines +27 to +29
Out::

tensor(0.7500)
Member commented:

This does not make sense.

@mergify mergify bot requested a review from a team June 16, 2020 11:56
>>> target = torch.tensor([0, 1, 2, 2])
>>> metric = Precision()
>>> metric(pred, target)
tensor(1.)
Member commented:

@williamFalcon is this correct?

@mergify mergify bot requested a review from a team June 16, 2020 11:57
@Borda Borda added this to in Progress in Metrics package via automation Jun 16, 2020
@Borda Borda moved this from in Progress to in Review in Metrics package Jun 16, 2020
@Borda Borda moved this from in Review to Done in Metrics package Jun 16, 2020
Labels
docs Documentation related
priority: 0 High priority task
Projects
No open projects
Development

Successfully merging this pull request may close these issues.

How to use metrics classes of 0.8.0
5 participants