feat: Initialize FitRecipe with a results file or object #166
sbillinge merged 18 commits into diffpy:v3.3.0
Conversation
@sbillinge ready for review
```python
if hasattr(results, "print_results"):
    params_dict = utils.get_dict_from_results_object(results)
elif isinstance(results, (str, Path)):
    params_dict = utils.get_dict_from_results_file(results)
```
handles either a results object or a path to a results file
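The dispatch above can be sketched standalone. This is a minimal sketch, assuming the duck-typing check shown in the diff; `get_params_dict` and the returned dicts are hypothetical stand-ins for the real `utils` helpers:

```python
from pathlib import Path

# Hypothetical mirror of the dispatch in the diff: duck-type on a
# FitResults-like object (anything with a print_results attribute),
# otherwise treat the input as a path to a saved results file.
def get_params_dict(results):
    if hasattr(results, "print_results"):
        # quacks like a FitResults object
        return {"source": "object"}
    elif isinstance(results, (str, Path)):
        # a str or pathlib.Path pointing at a results file
        return {"source": "file"}
    raise TypeError("results must be a FitResults object or a str/Path")
```

One consequence of duck typing on `print_results` is that any object exposing that attribute is accepted, so the check works without importing the FitResults class itself.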
```python
set_parameters_dict = {
    param.name: param.getValue()
    for param in self._parameters.values()
}
self._pretty_print_results_dict(set_parameters_dict)
```
Prints the parameters found in the results and the parameters set
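For context, the dict-building step can be sketched in isolation; `FakeParam` here is a stand-in (an assumption, not the real class) for SrFit parameters, which expose `.name` and `.getValue()`:

```python
# Sketch: collect the recipe's current variable values into a
# name -> value dict so they can be pretty-printed alongside the
# values found in the results. FakeParam is a hypothetical stand-in.
class FakeParam:
    def __init__(self, name, value):
        self.name = name
        self._value = value

    def getValue(self):
        return self._value

_parameters = {
    p.name: p
    for p in [FakeParam("amplitude", 4.0), FakeParam("wave_number", 3.0)]
}
set_parameters_dict = {
    param.name: param.getValue() for param in _parameters.values()
}
for name, value in sorted(set_parameters_dict.items()):
    print(f"{name:>12s} : {value}")
```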
tests/conftest.py (Outdated)

```python
recipe.add_variable(contribution.wave_number, 3)
recipe.add_variable(contribution.phase_shift, 2)
return recipe
```
had to wrap this in a second function because I realized that calling the same fixture twice, after the first one was refined, left the second call's initial values equal to the refined values of the first recipe
I think you can simply change the scope of the fixture to remove this behavior
@sbillinge Unfortunately that doesn't work here. It was already set to `scope="function"`, which is the narrowest scope, so to speak.
ok, but I am confused. In this case it would reset every time through a `pytest.mark.parametrize`. Are we not initializing something correctly? I am only banging on about this because it may suggest something is wrong with our tests, which would not be good.
@sbillinge In the current version, the test fixture is not being called twice when you do

```python
recipe1 = build_recipe_one_contribution
recipe2 = build_recipe_one_contribution
```

What is happening here is that the same fixture value is assigned to two variable names, so `recipe1 == recipe2` would return True even if one was refined and the other wasn't.
When we wrap it in another function, like in the incoming version, each recipe object is created independently.
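The wrapping trick is the standard "factory as fixture" pattern. Here is a minimal runnable sketch of the idea; in the real conftest.py the factory would be returned from a `@pytest.fixture`, and a plain dict stands in for a `FitRecipe` so the example is self-contained:

```python
# Factory pattern: each call builds a brand-new recipe-like object, so
# mutations of one instance cannot leak into another. The dict is a
# hypothetical stand-in for a FitRecipe.
def make_recipe():
    return {"amplitude": 4, "wave_number": 3, "phase_shift": 2}

recipe1 = make_recipe()
recipe2 = make_recipe()
assert recipe1 is not recipe2      # two distinct objects
recipe1["amplitude"] = 0.99        # simulate refining recipe1
assert recipe2["amplitude"] == 4   # recipe2 keeps its initial value
```

With a plain (non-factory) fixture, both names would be bound to the same object and the second assertion would fail.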
This may be indicating a weakness with our code, either the test or the code itself. Is it fixed if you instantiate recipe1 and recipe2 both at the top of the testing function?
@sbillinge I tested this without the conftest fixture and built the recipes manually, like so:

```python
import numpy as np
from numpy import linspace, pi, sin

# imports added for completeness; optimize_recipe is the local test
# helper that runs the fit on the recipe.
from diffpy.srfit.fitbase import FitContribution, FitRecipe, FitResults, Profile


def test_initialize_recipe_from_results_object():
    # Case: User initializes a FitRecipe from a FitResults object
    # expected: recipe is initialized with variables from previous fit
    profile1 = Profile()
    x = linspace(0, pi, 10)
    y = sin(x)
    profile1.set_observed_profile(x, y)
    contribution1 = FitContribution("c1")
    contribution1.set_profile(profile1)
    contribution1.set_equation("amplitude*sin(wave_number*x + phase_shift)")
    recipe1 = FitRecipe()
    recipe1.add_contribution(contribution1)
    recipe1.add_variable(contribution1.amplitude, 4)
    recipe1.add_variable(contribution1.wave_number, 3)
    recipe1.add_variable(contribution1.phase_shift, 2)
    optimize_recipe(recipe1)
    results1 = FitResults(recipe1)
    expected_values = np.round(results1.varvals, 5)
    expected_names = results1.varnames

    profile2 = Profile()
    x = linspace(0, pi, 10)
    y = sin(x)
    profile2.set_observed_profile(x, y)
    contribution2 = FitContribution("c2")
    contribution2.set_profile(profile2)
    contribution2.set_equation("amplitude*sin(wave_number*x + phase_shift)")
    recipe2 = FitRecipe()
    recipe2.add_contribution(contribution2)
    recipe2.add_variable(contribution2.amplitude, 4)
    recipe2.add_variable(contribution2.wave_number, 3)
    recipe2.add_variable(contribution2.phase_shift, 2)
    recipe2.create_new_variable(
        "extra_var", 5
    )  # should be included in the initialized recipe

    actual_values_before_init = [val for val in recipe2.get_values()]
    actual_names_before_init = recipe2.get_names()
    expected_names_before_init = [
        "amplitude",
        "extra_var",
        "phase_shift",
        "wave_number",
    ]
    expected_values_before_init = [
        4,
        3,
        2,
        5,
    ]  # the three variables + the extra_var
    assert actual_values_before_init == expected_values_before_init
    assert sorted(actual_names_before_init) == sorted(
        expected_names_before_init
    )

    recipe2.initialize_recipe_with_results(results1)
    optimize_recipe(recipe2)
    results2 = FitResults(recipe2)
    actual_values = np.round(results2.varvals, 5)
    actual_names = results2.varnames
    expected_names = expected_names + [
        "extra_var"
    ]  # add the new variable name to expected names
    expected_values = list(expected_values) + [
        5
    ]  # add the value of the new variable to expected values
    assert sorted(expected_names) == sorted(actual_names)
    assert sorted(expected_values) == sorted(list(actual_values))
```

Doing this passes the test, meaning it's a fixture-related thing and not a code-related thing. What we could do is have it like this for this specific test (and I think one other) and revert the conftest fixture back to the original.
> Is it fixed if you instantiate recipe1 and recipe2 both at the top of the testing function?

@sbillinge and no, this doesn't fix it
sbillinge
left a comment
This looks very good. Please see a couple of comments
```
The recipe from which the results were generated.

cov : numpy.ndarray or None
    Covariance matrix of the refined variables. None if unavailable.
```
We seem to have lost a bunch of "The"s
@sbillinge ready for review
Codecov Report ✅ All modified and coverable lines are covered by tests.

```
@@            Coverage Diff             @@
##           v3.3.0     #166      +/-  ##
=========================================
+ Coverage   72.36%   72.81%    +0.45%
=========================================
  Files          25       25
  Lines        3832     3896       +64
=========================================
+ Hits         2773     2837       +64
  Misses       1059     1059
```
@sbillinge ready for review
sbillinge
left a comment
One comment. I am still a bit concerned with our testing workaround, which I worry is indicating a weakness in either our code or our test. I commented on that.
Also, I am not sure whether my other comment (to always use an odd number for npoints in linspace) actually went through, so let me say it here.
@sbillinge Ready for review
It didn't, but now I see it and got it fixed 👍
I am having a hard time understanding this. Let's maybe jump on a call when we can and discuss it?
@sbillinge sure thing. we can talk about it during our one-on-one. In general though, the pytest fixture hands out an already created recipe object. For example, in the current version, asserting that the two recipes are distinct objects fails: you can see they exist in the same place in memory. Wrapping them in the helper function gives each call its own instance.
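The identity problem being described can be reproduced in a few lines. This is an illustrative sketch (a dict stands in for the fixture's recipe object), not the actual test code:

```python
# Binding one object to two names does not create two objects, so
# "refining" one also changes the other: this is what a function-scoped
# fixture used twice in one test gives you.
shared = {"amplitude": 4, "wave_number": 3, "phase_shift": 2}
recipe1 = shared
recipe2 = shared
assert recipe1 is recipe2            # same object in memory
recipe1["amplitude"] = 0.5           # "refine" recipe1
assert recipe2["amplitude"] == 0.5   # recipe2 sees the change too
```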
yes, this is what I am worried about. It means our tests are brittle and not well designed. We need to think more carefully about this. If you can't trust your test, you are lost.
…dd a new helper function that builds recipe for init testing
@sbillinge I do feel confident the fixture is operating as expected, but we are changing this one fixture for just two functions. Instead, I added a new helper function that builds two recipe objects.
Rather than copy-paste the function, I wonder if it makes sense to have the function in conftest.py and call it twice in the fixture, then return two different instances of the recipe from the fixture, something like that?
@sbillinge Yeah, good idea. Less confusing for future devs, too. Added that now
@sbillinge ready for review 👍
sbillinge
left a comment
nearly there. Small tweak.
tests/conftest.py (Outdated)

```python
    return recipe


@pytest.fixture(scope="function")
```
This is good, but we don't need to repeat it. Just have a `build_recipes` fixture for testing that returns two recipes; then when it is used,

```python
recipe, _ = build_recipes()
```

or

```python
recipe1, recipe2 = build_recipes()
```

if you want both.
@sbillinge ready for review. I made the preexisting fixture return two fit recipes instead of one.
sbillinge
left a comment
small request, but could we call it `build_recipes_one_contribution` to better reflect what it actually does? thanks so much.
@sbillinge sure thing! done
phew, that was a bit of a struggle but I think worth it in the end.
@sbillinge agreed lol. thanks for the help
No description provided.