
Parallel optimization fix #21

Merged Sep 29, 2017 (4 commits into sods:master)

Conversation

alexfeld

I discovered that the __getstate__ method in Optimization was reverted to its original, buggy form. I fixed it and added a regression test so that this breakage will be caught in the future.
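For context, a common `__getstate__` bug is mutating `self.__dict__` in place instead of returning a cleaned copy, which breaks the live object on pickling. The sketch below is a hypothetical stand-in, not paramz's actual code: the class body, attribute names (`x_opt`, `messages`), and the nature of the original bug are all assumptions, but the pickle round-trip regression test mirrors the kind of test this PR adds.

```python
import pickle

class Optimization:
    """Hypothetical stand-in for paramz's Optimization class."""

    def __init__(self):
        self.x_opt = [1.0, 2.0]
        self.f_opt = 0.5
        # An attribute that cannot be pickled (e.g. a live callback).
        self.messages = lambda s: print(s)

    def __getstate__(self):
        # Work on a copy of __dict__ and drop unpicklable entries,
        # rather than mutating self.__dict__ in place.
        state = self.__dict__.copy()
        state.pop('messages', None)
        return state

    def __setstate__(self, state):
        self.__dict__.update(state)
        self.messages = None

# Regression test: a pickle round-trip must succeed and must not
# alter the original instance.
opt = Optimization()
restored = pickle.loads(pickle.dumps(opt))
assert restored.x_opt == opt.x_opt
assert callable(opt.messages)  # original instance untouched
```

Returning a copied, pruned dict keeps pickling side-effect-free; if `__getstate__` instead popped entries from `self.__dict__` directly, the object would silently lose attributes every time it was serialized.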

@codecov-io commented Aug 31, 2017

Codecov Report

Merging #21 into master will increase coverage by 0.01%.
The diff coverage is 80%.


@@            Coverage Diff             @@
##           master      #21      +/-   ##
==========================================
+ Coverage   85.64%   85.65%   +0.01%     
==========================================
  Files          27       27              
  Lines        2417     2419       +2     
  Branches      388      388              
==========================================
+ Hits         2070     2072       +2     
  Misses        333      333              
  Partials       14       14
Impacted Files                        | Coverage Δ
paramz/optimization/optimization.py   | 99.45% <0%> (ø) ⬆️
paramz/model.py                       | 98% <100%> (+0.02%) ⬆️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 8d83095...e96cd3b. Read the comment docs.

@mzwiessele mzwiessele merged commit b6dc658 into sods:master Sep 29, 2017
3 participants