Conversation

@tsmbland tsmbland commented Aug 21, 2024

Description

I've been digging into the code to get more clarity on how costs (LCOE, ALCOE, EAC, etc.) are calculated and used. The problem is that the code for calculating these costs is defined in multiple places, which makes it hard to see what's going on, could lead to inconsistencies, and is generally a bad idea.

To tidy things up, I've created a new costs module that houses these calculations, and modified other parts of the codebase (specifically the outputs and objectives modules) to call these functions rather than repeating the calculations. Hopefully this will also make #288 easier to tackle.

It looks like a massive PR, but it's almost entirely just moving and deleting code, and creating some new tests.

Some more details about the specific changes:

Costs module:

  • I've created functions for NPV, NPC, EAC and LCOE, which are derived from code that was previously in the objectives module
  • I've moved annual_levelized_cost_of_energy and supply_cost here from quantities.py
  • I think there's more work that could be done here, e.g. creating functions for fuel costs, capital costs, etc., and calling these from LCOE, EAC, etc. as appropriate. But that can be done later.
  • Tests for this module are in test_costs.py. At the moment, these just run each function with some example data to check that it runs and that the dimensions of the result are as expected. I notice there are some differences between functions (e.g. the ALCOE function drops the asset dimension), but this is something that can be looked into later and potentially homogenized.
  • It might be a good idea to add more thorough tests that check the values returned by these functions in a few scenarios. I'm not sure of the best way to do this, to be honest, because the example data is randomly generated, so the values will change on every run.
  • I'm still a bit unsure about the difference between the capital recovery factor and the discount factor, so I've included functions for both, although they seem very similar (see the sketch just below this list).
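
For reference, here's a minimal sketch of the two standard formulas in plain Python (the function names here are illustrative, not the ones used in the new module): a discount factor converts a single cash flow received some years in the future to its present value, whereas the capital recovery factor spreads a present value into equal annual payments over an asset's lifetime.

```python
def discount_factor(rate: float, year: int) -> float:
    """Present value of 1 unit of currency received `year` years from now."""
    return 1.0 / (1.0 + rate) ** year


def capital_recovery_factor(rate: float, lifetime: int) -> float:
    """Constant annual payment that repays a present value of 1 over `lifetime` years."""
    return rate * (1.0 + rate) ** lifetime / ((1.0 + rate) ** lifetime - 1.0)


# At a 5% rate: 1 unit received in year 20 is worth ~0.377 today, whereas a
# present cost of 1 corresponds to annual payments of ~0.080 for 20 years.
print(discount_factor(0.05, 20))          # ~0.377
print(capital_recovery_factor(0.05, 20))  # ~0.080
```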

Outputs module:

  • Calls the new LCOE and EAC functions from the costs module, rather than repeating all the calculations
  • I've moved capacity_to_service_demand (which was defined multiple times) to quantities.py

Objectives module:

  • Modified the objectives to call the costs functions where appropriate
  • I've also modified what's passed to the objectives. Previously, objectives would take five objects: agent, demand, search_space, technologies and market. In all cases, search_space would be used to filter the technologies and reduce the dimensions of the demand object. Agent was also used in the filtering (to filter objects based on the agent's region and forecast year), and to get the forecast year while computing the objectives. Market was only used to get commodity prices.
  • All this filtering was taking place in the objectives themselves, which resulted in a lot of repeated code. I've simplified things so the filtering is done outside the objectives, and the objectives are only passed the data that they need (technologies, demand and prices). The objectives can get the forecast year from the demand object (a bit of a hack, but it works OK for now). There's a rough sketch of this change just after this list.
  • I've rewritten the tests for this module, which were a bit of a mess before. The aim now is just to run each objective with some example data to check that it runs without error and that the dimensions of the result are as expected (this is also really useful for documenting the dimensions of each output, as there's some variability between objectives). The previous tests also checked the values, but they essentially just copied and pasted code from the objectives themselves to compute the expected values, which seems a bit pointless. I also don't think this is the appropriate place to check values: most of the objectives now just call functions from the costs module, so value checks belong in test_costs.py.
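
A rough sketch of the change to the objective signatures (the names below are placeholders to show where the filtering now happens, not the actual code):

```python
# Schematic only: argument and function names are placeholders.

# Before: every objective received five objects and repeated the same filtering.
def objective_before(agent, demand, search_space, technologies, market):
    # Filter technologies with search_space and by agent.region / agent.forecast_year,
    # reduce the dimensions of demand, pull commodity prices out of market,
    # then compute the objective.
    ...


# After: filtering happens once, outside the objective, which only receives the
# data it needs; the forecast year is read off the demand object.
def objective_after(technologies, demand, prices):
    # forecast_year = demand.year  (the "hack" mentioned above)
    ...
```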

Type of change

Please add a line in the relevant section of CHANGELOG.md to document the change (include PR #) - note reverse order of PR #s.

  • New feature (non-breaking change which adds functionality)
  • Optimization (non-breaking, back-end change that speeds up the code)
  • Bug fix (non-breaking change which fixes an issue)
  • Breaking change (whatever its nature)

Key checklist

  • All tests pass: $ python -m pytest
  • The documentation builds and looks OK: $ python -m sphinx -b html docs docs/build

Further checks

  • Code is commented, particularly in hard-to-understand areas
  • Tests added that prove fix is effective or that feature works

@tsmbland tsmbland changed the title from "Simplify LCOE code" to "Create costs module and simplify code for objectives and outputs" Aug 22, 2024
@tsmbland tsmbland marked this pull request as ready for review August 23, 2024 08:33
@tsmbland tsmbland self-assigned this Aug 23, 2024
@tsmbland tsmbland requested a review from dalonsoa August 23, 2024 09:29
@tsmbland
Collaborator Author

Actually the results are now different when I use the ALCOE, capital_costs and efficiency objectives, so I need to investigate why that is...

@tsmbland tsmbland commented Aug 23, 2024

> Actually the results are now different when I use the ALCOE, capital_costs and efficiency objectives, so I need to investigate why that is...

I think there was an error before with the ALCOE objective, as it was returning data for multiple years, but I think the solver should only have objective data for the forecast year (at least that's what happens with all the other objectives). So the results are different now with this objective, but I think it was wrong before.

I'm very confused about capital_costs and efficiency. If the model runs all the way to the end, the results are no different from before (at least with the example I've been using). But if the model terminates early because the growth parameters aren't high enough, it appears to terminate in a different place (I have absolutely no idea why), so the interim results you get might be different. Weird, but I think it's all OK.


@dalonsoa dalonsoa left a comment


I think this is a much-needed improvement, reducing the redundancy of functions that were supposed to be doing the same thing. Homogenising the interface of the functions and reducing code duplication is also great. I think that any discrepancy with the previous results is likely to be the result of fixing a bug we didn't know we had. None of the tutorials or examples need updating, so at least for the functionality they use, like LCOE, these changes have no negative impact.

Testing the results of mathematical formulas in a meaningful way is always tricky, as the only way to actually test whether the numerical values are correct is to calculate them, which in turn uses the formula you're trying to test. Testing the properties of the outputs rather than the numerical values seems like a good compromise.

If you want to take one more step, you could use the hypothesis package to evaluate the functions under a range of conditions, and then check properties of the results rather than specific numerical values (e.g. all values should be positive, or fall within a certain range, or add up to 1, etc.). This is useful for finding edge cases that are difficult to find manually.
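
For example, a property-based test along these lines (a sketch only: it checks the textbook capital recovery factor formula inline rather than calling the new costs-module function, whose exact signature isn't assumed here):

```python
# Sketch of a hypothesis property test. The capital recovery factor is computed
# inline from the textbook formula purely for illustration; a real test would
# call the corresponding function from the new costs module instead.
from hypothesis import given, strategies as st


@given(
    rate=st.floats(min_value=0.01, max_value=0.5),
    lifetime=st.integers(min_value=1, max_value=50),
)
def test_capital_recovery_factor_properties(rate, lifetime):
    crf = rate * (1.0 + rate) ** lifetime / ((1.0 + rate) ** lifetime - 1.0)
    # The annualised payment always exceeds the bare interest rate...
    assert crf > rate
    # ...and the sum of the annual payments covers at least the principal.
    assert crf * lifetime >= 1.0
```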

@tsmbland
Collaborator Author

Thanks. I'm actually quite keen to give hypothesis a try, and this seems like a good place for it, so I'll open another issue for that.

@tsmbland tsmbland merged commit c4fc04d into develop Aug 27, 2024
@tsmbland tsmbland deleted the lcoe branch August 27, 2024 07:55