
Params eval #550

Merged: 3 commits merged into lmfit:master on May 2, 2019

Conversation

zobristnicholas (Contributor)

Description

This PR is in reference to the discussion here: https://groups.google.com/forum/#!topic/lmfit-py/pJWdxzYRGqI

Expressions derived from parameter values are often added to a Parameters() object using params.add("name", expr="expression"). These expressions can contain common numpy functions, user-defined functions, and other parameter values, and they are re-evaluated at every iteration of a fit.
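
For illustration, a minimal sketch of defining such a constraint expression (the parameter names 'decay' and 'half_life' here are hypothetical):

```python
from lmfit import Parameters

params = Parameters()
params.add('decay', value=0.5, min=0)
# 'half_life' is a constraint expression: it is re-evaluated from 'decay'
# at every fit iteration ('log' comes from the numpy functions that
# asteval exposes in the expression namespace)
params.add('half_life', expr='log(2) / decay')
```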

If this updating behavior is not desired, an expression can instead be evaluated after the fit directly from the parameter values, e.g. via p = params.valuesdict(). However, there is currently no public method to retrieve user-defined symbols from the object.
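
A sketch of that post-fit workaround, reusing the hypothetical parameters from the snippet above:

```python
import numpy as np

# evaluate the expression by hand from the current parameter values
p = params.valuesdict()             # plain dict: parameter name -> value
half_life = np.log(2) / p['decay']
```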

This PR addresses this point by allowing the user to evaluate arbitrary expressions in the Parameters object's namespace via the params.eval() method. A test for this method is also included.
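
A minimal usage sketch of the new method, again with the hypothetical parameters from above; the expression string is evaluated in the same asteval namespace that holds the parameter values and user-defined symbols:

```python
# new in this PR: evaluate an arbitrary expression in the
# Parameters object's namespace
half_life = params.eval('log(2) / decay')
```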

Type of Changes
  • Bug fix
  • New feature
  • Refactoring / maintenance
  • Documentation / examples
Tested on

Python: 3.7.3 (default, Mar 27 2019, 16:54:48)
[Clang 4.0.1 (tags/RELEASE_401/final)]

lmfit: 0.9.13+7.gbcec209, scipy: 1.2.1, numpy: 1.16.3, asteval: 0.9.13, uncertainties: 3.0.3, six: 1.12.0

Verification
  • included docstrings that follow PEP 257?
  • referenced existing Issue and/or provided relevant link to mailing list?
  • verified that existing tests pass locally?
  • squashed/minimized your commits and written descriptive commit messages?
  • added or updated existing tests to cover the changes?
    Not sure the best way to do these two
  • updated the documentation?
  • added an example?

codecov bot commented Apr 30, 2019

Codecov Report

Merging #550 into master will increase coverage by 2.93%.
The diff coverage is 100%.


@@            Coverage Diff             @@
##           master     #550      +/-   ##
==========================================
+ Coverage   83.28%   86.21%   +2.93%     
==========================================
  Files          11       11              
  Lines        3170     3875     +705     
==========================================
+ Hits         2640     3341     +701     
- Misses        530      534       +4
Impacted Files        Coverage Δ
lmfit/parameter.py    81.93% <100%> (+0.35%) ⬆️
lmfit/minimizer.py    90.5%  <0%>   (+0.45%) ⬆️
lmfit/printfuncs.py   90.24% <0%>   (+3.35%) ⬆️
lmfit/models.py       90.1%  <0%>   (+8.13%) ⬆️
lmfit/lineshapes.py   83.73% <0%>   (+11.97%) ⬆️

Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 7d9bce8...2a728bf.

newville (Member)

@zobristnicholas Thanks! That looks good to me. Does anyone have an objection to merging this?

reneeotten (Contributor) left a review comment

looks good to me, just two tiny things in the docstring that could be improved.

Two inline review comments on lmfit/parameter.py (both resolved).
reneeotten (Contributor) commented Apr 30, 2019

The Travis failure is unrelated and due to a flaky test; I have already changed it in the other PR. @newville, can I merge this one?

newville (Member) commented May 2, 2019

@reneeotten yes, please merge!

reneeotten merged commit aa642a8 into lmfit:master on May 2, 2019