
feat: Initialize FitRecipe with a results file or object#166

Merged
sbillinge merged 18 commits into diffpy:v3.3.0 from cadenmyers13:init-w-results
Mar 3, 2026

Conversation

@cadenmyers13
Contributor

No description provided.

@cadenmyers13
Contributor Author

@sbillinge ready for review

if hasattr(results, "print_results"):
    params_dict = utils.get_dict_from_results_object(results)
elif isinstance(results, (str, Path)):
    params_dict = utils.get_dict_from_results_file(results)
Contributor Author

Handles either a results object or a path to a results file.
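
For context, a minimal usage sketch of the two entry points this dispatch supports. The method name comes from the tests below; the file path and setup are hypothetical:

    from diffpy.srfit.fitbase import FitRecipe, FitResults

    recipe = FitRecipe()
    # ... add contributions and variables as usual ...

    # Option 1: initialize from an in-memory FitResults object
    # built from a previously refined recipe.
    # results = FitResults(refined_recipe)
    # recipe.initialize_recipe_with_results(results)

    # Option 2: initialize from a saved results file (hypothetical path).
    # recipe.initialize_recipe_with_results("previous_fit.res")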

set_parameters_dict = {
    param.name: param.getValue()
    for param in self._parameters.values()
}
self._pretty_print_results_dict(set_parameters_dict)
Contributor Author

Prints the parameters found in the results and the parameters that were set.
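
A minimal sketch of what such a helper might do (an illustration only; the name and the merged implementation may differ):

    def pretty_print_results_dict(params_dict):
        # Illustrative sketch: print parameter names and values in aligned columns.
        width = max(len(name) for name in params_dict)
        for name, value in sorted(params_dict.items()):
            print(f"{name:<{width}}  {value:.5g}")

    pretty_print_results_dict({"amplitude": 1.00012, "wave_number": 0.99987})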

recipe.add_variable(contribution.wave_number, 3)
recipe.add_variable(contribution.phase_shift, 2)
return recipe

Contributor Author

I had to wrap this in a second function because calling the same fixture twice, after the first recipe was refined, left the initial values in the second call equal to the previously refined recipe's values.

Contributor

I think you can simply change the scope of the fixture to remove this behavior

Contributor Author

@sbillinge Unfortunately that doesn't work here. It was already set to scope="function", which is the narrowest scope, so to speak.

Contributor

ok, but I am confused. In that case it would reset every time through a pytest.mark.parametrize. Are we not initializing something correctly? I am only banging on about this because it may suggest something is wrong with our tests, which would not be good.

Contributor Author
@cadenmyers13 Feb 27, 2026

@sbillinge In the current version, the test fixture is not being called twice when you do,

recipe1 = build_recipe_one_contribution
recipe2 = build_recipe_one_contribution

What is happening here is that it assigns the same fixture value to two variable names, so recipe1 == recipe2 would return True even if one was refined and the other wasn't.

When we wrap it in another function, like in the incoming version, each recipe object is created independently.
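
A minimal sketch of the difference, with a stand-in class so it runs on its own (names are illustrative, not the project's conftest):

    import pytest


    class Recipe:
        """Stand-in for FitRecipe, just to keep the sketch self-contained."""


    def build_recipe():
        return Recipe()


    @pytest.fixture(scope="function")
    def recipe_fixture():
        # pytest computes this value once per test, so every reference
        # to the fixture inside one test is the same object.
        return build_recipe()


    def test_same_object(recipe_fixture):
        recipe1 = recipe_fixture
        recipe2 = recipe_fixture
        assert recipe1 is recipe2  # same object: refining one "refines" both


    @pytest.fixture(scope="function")
    def build_recipe_factory():
        # Factory fixture: each call inside the test builds a fresh recipe.
        return build_recipe


    def test_distinct_objects(build_recipe_factory):
        recipe1 = build_recipe_factory()
        recipe2 = build_recipe_factory()
        assert recipe1 is not recipe2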

Contributor

This may be indicating a weakness with our code, either the test or the code itself. Is it fixed if you instantiate recipe1 and recipe2 both at the top of the testing function?

Contributor Author
@cadenmyers13 Feb 27, 2026

@sbillinge I tested this without the conftest fixture and built the recipes manually like so,

import numpy as np
from numpy import linspace, pi, sin

from diffpy.srfit.fitbase import FitContribution, FitRecipe, FitResults, Profile

# optimize_recipe is a helper defined elsewhere in the test suite.


def test_initialize_recipe_from_results_object():
    # Case: User initializes a FitRecipe from a FitResults object
    # expected: recipe is initialized with variables from previous fit
    profile1 = Profile()
    x = linspace(0, pi, 10)
    y = sin(x)
    profile1.set_observed_profile(x, y)
    contribution1 = FitContribution("c1")
    contribution1.set_profile(profile1)
    contribution1.set_equation("amplitude*sin(wave_number*x + phase_shift)")
    recipe1 = FitRecipe()
    recipe1.add_contribution(contribution1)
    recipe1.add_variable(contribution1.amplitude, 4)
    recipe1.add_variable(contribution1.wave_number, 3)
    recipe1.add_variable(contribution1.phase_shift, 2)
    optimize_recipe(recipe1)
    results1 = FitResults(recipe1)
    expected_values = np.round(results1.varvals, 5)
    expected_names = results1.varnames

    profile2 = Profile()
    x = linspace(0, pi, 10)
    y = sin(x)
    profile2.set_observed_profile(x, y)
    contribution2 = FitContribution("c2")
    contribution2.set_profile(profile2)
    contribution2.set_equation("amplitude*sin(wave_number*x + phase_shift)")
    recipe2 = FitRecipe()
    recipe2.add_contribution(contribution2)
    recipe2.add_variable(contribution2.amplitude, 4)
    recipe2.add_variable(contribution2.wave_number, 3)
    recipe2.add_variable(contribution2.phase_shift, 2)
    recipe2.create_new_variable(
        "extra_var", 5
    )  # should be included in the initialized recipe
    actual_values_before_init = [val for val in recipe2.get_values()]
    actual_names_before_init = recipe2.get_names()
    expected_names_before_init = [
        "amplitude",
        "extra_var",
        "phase_shift",
        "wave_number",
    ]
    expected_values_before_init = [
        4,
        3,
        2,
        5,
    ]  # the three variables + the extra_var

    assert actual_values_before_init == expected_values_before_init
    assert sorted(actual_names_before_init) == sorted(
        expected_names_before_init
    )

    recipe2.initialize_recipe_with_results(results1)
    optimize_recipe(recipe2)
    results2 = FitResults(recipe2)
    actual_values = np.round(results2.varvals, 5)
    actual_names = results2.varnames

    expected_names = expected_names + [
        "extra_var"
    ]  # add the new variable name to expected names
    expected_values = list(expected_values) + [
        5
    ]  # add the value of the new variable to expected values
    assert sorted(expected_names) == sorted(actual_names)
    assert sorted(expected_values) == sorted(list(actual_values))

Doing this passes the test, meaning it's a fixture-related thing and not a code-related thing. What we could do is have it like this for this specific test (and I think one other) and revert the conftest fixture back to the original.

Contributor Author

Is it fixed if you instantiate recipe1 and recipe2 both at the top of the testing function?

@sbillinge and no, this doesn't fix it

Contributor
@sbillinge left a comment

This looks very good. Please see a couple of comments.

The recipe from which the results were generated.

cov : numpy.ndarray or None
    Covariance matrix of the refined variables. None if unavailable.
Contributor

We seem to have lost a bunch of "The"s.


@cadenmyers13
Contributor Author

@sbillinge ready for review

@codecov

codecov bot commented Feb 27, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 72.81%. Comparing base (cee67de) to head (08478d4).
⚠️ Report is 20 commits behind head on v3.3.0.

Additional details and impacted files
@@            Coverage Diff             @@
##           v3.3.0     #166      +/-   ##
==========================================
+ Coverage   72.36%   72.81%   +0.45%     
==========================================
  Files          25       25              
  Lines        3832     3896      +64     
==========================================
+ Hits         2773     2837      +64     
  Misses       1059     1059              
Files with missing lines   Coverage Δ
tests/conftest.py          92.40% <100.00%> (+0.51%) ⬆️
tests/test_fitrecipe.py    99.85% <100.00%> (+0.01%) ⬆️
tests/test_fitresults.py   99.35% <100.00%> (ø)

@cadenmyers13
Contributor Author

@sbillinge ready for review

Contributor
@sbillinge left a comment

One comment. I am still a bit concerned that our testing workaround is indicating a weakness in either our code or our test. I commented on that.

Also, I am not sure whether my other comment to always use an odd number for npoints in linspace actually went through, so let me say it here...


@cadenmyers13
Contributor Author

@sbillinge Ready for review

One comment. I am still a bit concerned with our testing workaround that I am worried is indicating a weakness in either our code or our test. I commented on that.
Also, I am not sure whether my other comment to always use an odd number for npoints in linspace actually went through, so let me say it here.....

It didn't, but now I see it and got it fixed 👍
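
As a guess at the motivation (the thread does not spell it out): an odd npoints on [0, pi] places a sample exactly at the midpoint pi/2, where sin(x) peaks, while an even count straddles it:

    import numpy as np

    x_odd = np.linspace(0, np.pi, 11)
    print(np.isclose(x_odd, np.pi / 2).any())   # True: the peak of sin is sampled

    x_even = np.linspace(0, np.pi, 10)
    print(np.isclose(x_even, np.pi / 2).any())  # False: the peak is straddled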

@sbillinge
Contributor

I am having a hard time understanding this. Let's maybe jump on a call when we can and discuss it?

@cadenmyers13
Contributor Author

@sbillinge sure thing. We can talk about it during our one-on-one. In general though, the pytest fixture hands out an already created FitRecipe object rather than generating a new one, so every reference points to the same place in memory. Calling the fixture again does not generate a new FitRecipe object.

For example, in the current version doing,

assert recipe1 != recipe2  # check that the two recipes are distinct objects

fails like so,

FAILED tests/test_fitrecipe.py::test_initialize_recipe_from_results_object - assert <diffpy.srfit.fitbase.fitrecipe.FitRecipe object at 0x1807d8aa0> != <diffpy.srfit.fitbase.fitrecipe.FitRecipe object at 0x1807d8aa0>

Here, you can see they exist in the same place in memory. Wrapping them in the _build function and then calling the fixture actually generates two distinct recipe objects. In doing this, the above assert passes.

@sbillinge
Contributor

yes, this is what I am worried about. It means our tests are brittle and not well designed. We need to think more carefully about this. If you can't trust your test, you are lost.

@cadenmyers13
Contributor Author

@sbillinge I do feel confident the fixture is operating as expected, but we would be changing this one fixture for just two functions. Instead, I added a new helper function, build_recipe_for_init_testing(), that builds two recipe objects. It is only used for the two tests.

@sbillinge
Contributor

Rather than copy-paste the function, I wonder if it makes sense to have the function in conftest.py and call it twice in the fixture, then return two different instances of the recipe from the fixture, something like that?

@cadenmyers13
Contributor Author

@sbillinge Yeah, good idea. Less confusing for future devs, too. Added that now.

@cadenmyers13
Contributor Author

@sbillinge ready for review 👍

Contributor
@sbillinge left a comment

Nearly there. Small tweak.

return recipe


@pytest.fixture(scope="function")
Contributor

This is good, but we don't need to repeat it. Just have build_recipes_for_testing return two recipes; then when it is used,

recipe, _ = build_recipes()

or

recipe1, recipe2 = build_recipes()

if you want both.
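
A minimal sketch of this pattern, assuming the recipe setup from the test above and the fixture name settled on later in the thread (illustrative, not the exact merged code):

    import pytest
    from numpy import linspace, pi, sin

    from diffpy.srfit.fitbase import FitContribution, FitRecipe, Profile


    def _build_recipe(name):
        # Build one single-contribution recipe, mirroring the test setup above.
        profile = Profile()
        x = linspace(0, pi, 11)  # odd npoints, per the review comment
        profile.set_observed_profile(x, sin(x))
        contribution = FitContribution(name)
        contribution.set_profile(profile)
        contribution.set_equation("amplitude*sin(wave_number*x + phase_shift)")
        recipe = FitRecipe()
        recipe.add_contribution(contribution)
        recipe.add_variable(contribution.amplitude, 4)
        recipe.add_variable(contribution.wave_number, 3)
        recipe.add_variable(contribution.phase_shift, 2)
        return recipe


    @pytest.fixture(scope="function")
    def build_recipes_one_contribution():
        # Two independently built recipes; tests unpack one or both.
        return _build_recipe("c1"), _build_recipe("c2")

In a test the fixture value unpacks directly, e.g. recipe1, recipe2 = build_recipes_one_contribution.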

@cadenmyers13
Contributor Author

@sbillinge ready for review. I made the preexisting fixture return two fit recipes instead of one.

Contributor
@sbillinge left a comment

Small request, but could we call it build_recipes_one_contribution to better reflect what it actually does? Thanks so much.

@cadenmyers13
Contributor Author

@sbillinge sure thing! done

@sbillinge sbillinge merged commit 89a5558 into diffpy:v3.3.0 Mar 3, 2026
4 checks passed
@sbillinge
Contributor

phew, that was a bit of a struggle but I think worth it in the end.

@cadenmyers13
Contributor Author

@sbillinge agreed lol. thanks for the help
