Features/mean var merge cleanup #445

Merged
coquelin77 merged 14 commits into master from features/mean-var-merge-cleanup on Jan 16, 2020

Conversation

coquelin77
Member

Description

This is a cleanup of the mean and var functions. The moment-merging function is rewritten so that it can be used by both mean and var (it could also be extended to higher-order moments).
The var function now also supports multi-dimensional axis arguments; the algorithm is the same as the one used for mean.
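For reference, a minimal sketch of the kind of pairwise moment merge described above, written against plain PyTorch. The helper name merge_moments and its signature are illustrative only, not Heat's actual internal API; the sketch assumes biased (population) variances per partition.

```python
import torch

def merge_moments(mu_a, var_a, n_a, mu_b, var_b, n_b):
    """Merge mean/variance of two disjoint partitions into one pair.

    var_a and var_b are the biased (population) variances of each partition.
    Returns the mean, biased variance, and element count of the union.
    """
    n = n_a + n_b
    delta = mu_b - mu_a
    # merging the sums of squared deviations (M2) avoids a second pass over the data
    m2 = var_a * n_a + var_b * n_b + delta ** 2 * n_a * n_b / n
    return mu_a + delta * n_b / n, m2 / n, n

# quick check against a single-pass computation
x = torch.randn(1000)
a, b = x[:400], x[400:]
mu, var, n = merge_moments(a.mean(), a.var(unbiased=False), a.numel(),
                           b.mean(), b.var(unbiased=False), b.numel())
assert torch.allclose(var, x.var(unbiased=False))
```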

Fixes: #156

Changes proposed:

  • combination of moment merging functions into one function
  • add multi-dimensional axis support to var

Type of change

Select relevant options.

  • New feature (non-breaking change which adds functionality)
  • Documentation update

Are all split configurations tested and accounted for?

  • yes
  • no

Does this change require a documentation update outside of the changes proposed?

  • yes
  • no

Does this change modify the behaviour of other functions?

  • yes
  • no

Are there code practices which require justification?

  • yes
  • no

@codecov

codecov bot commented Dec 30, 2019

Codecov Report

❗ No coverage uploaded for pull request base (master@1520c1a).
The diff coverage is n/a.


@@            Coverage Diff            @@
##             master     #445   +/-   ##
=========================================
  Coverage          ?   96.68%           
=========================================
  Files             ?       57           
  Lines             ?    12134           
  Branches          ?        0           
=========================================
  Hits              ?    11732           
  Misses            ?      402           
  Partials          ?        0
Impacted Files                      Coverage Δ
heat/core/tests/test_logical.py     97.57% <ø> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 1520c1a...91ef6ed.

@mtar mtar left a comment (Collaborator)

Some tiny remarks on tinier things I noted.

heat/core/statistics.py (resolved conversation)
var_tot[0, 0] = merged[0]
var_tot[0, 1] = merged[1]
var_tot[0, 2] = merged[2]

return var_tot[0][0]

else: # axis is given
@mtar (Collaborator)

Small inconsistency: when axis is set, integer tensors are no longer supported. NumPy supports ints and floats, while PyTorch generally only supports floats. This behaviour is also present in mean.

@coquelin77 (Member, Author)

Because torch does not support ints, we don't either. If it were needed, the easiest way would be to cast the input to a float, but I don't think we need to do that; I don't think we need to follow numpy so religiously.
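For illustration, a minimal sketch of the cast-to-float option mentioned above, assuming plain PyTorch; _var_any_dtype is a hypothetical helper, not part of Heat.

```python
import torch

def _var_any_dtype(t: torch.Tensor) -> torch.Tensor:
    # torch reductions such as var only accept floating-point tensors,
    # so integer inputs would have to be promoted first (hypothetical helper)
    if not t.is_floating_point():
        t = t.float()
    return t.var(unbiased=False)

print(_var_any_dtype(torch.arange(10)))  # works even though arange yields int64
```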

heat/core/statistics.py (resolved conversation)
@mtar
Collaborator

mtar commented Jan 16, 2020

Would you be so kind as to remove the failing test? As of today, torch.logical_xor allows non-bool tensors in the new PyTorch version.

@coquelin77
Member Author

> Would you be so kind as to remove the failing test? As of today, torch.logical_xor allows non-bool tensors in the new PyTorch version.

Yes, this is an issue that needs to be addressed. Are you working on it? Bjoern wants to put a release out today. I am in a meeting at the moment, but once it is done I will get to work on a fix for the latest version of PyTorch.

@coquelin77
Member Author

> Would you be so kind as to remove the failing test? As of today, torch.logical_xor allows non-bool tensors in the new PyTorch version.

Should the fix be to use the bitwise xor?
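For context, one way a bitwise-xor based workaround could look (an illustrative sketch, not necessarily the fix that was eventually merged); casting to bool first makes bitwise_xor act as a logical XOR on any dtype.

```python
import torch

def xor(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    # bool() reduces any dtype to truth values, so bitwise_xor behaves like a logical xor
    return torch.bitwise_xor(a.bool(), b.bool())

print(xor(torch.tensor([0, 1, 2]), torch.tensor([1, 1, 0])))
# tensor([ True, False,  True])
```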

@mtar
Collaborator

mtar commented Jan 16, 2020

I will make a pull request

@mtar
Collaborator

mtar commented Jan 16, 2020

One last thing: I was playing with var and realized that Heat uses Bessel's correction as the default, similar to PyTorch. NumPy, however, uses the biased estimator as the default instead. Shall we keep it that way?

@coquelin77
Member Author

> One last thing: I was playing with var and realized that Heat uses Bessel's correction as the default, similar to PyTorch. NumPy, however, uses the biased estimator as the default instead. Shall we keep it that way?

I think that Bessel's correction should be the default, though this is mostly personal preference. For general statistical methods (as well as more advanced ones) it will provide more statistically reliable results.
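For context, a small snippet showing the default difference under discussion (values assume NumPy's ddof=0 and torch's unbiased=True defaults):

```python
import numpy as np
import torch

x = np.array([1.0, 2.0, 3.0, 4.0])
t = torch.tensor(x)

print(np.var(x))          # 1.25   -> biased: divides by n (ddof=0 is numpy's default)
print(t.var().item())     # 1.6667 -> Bessel-corrected: divides by n - 1 (torch's default)
print(np.var(x, ddof=1))  # 1.6667 -> matches torch once one delta degree of freedom is used
```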

@coquelin77 coquelin77 merged commit 51684d5 into master Jan 16, 2020
@coquelin77 coquelin77 deleted the features/mean-var-merge-cleanup branch January 16, 2020 13:25
Development

Successfully merging this pull request may close these issues.

Multi dimensional var
2 participants