
Being Sensitive #125

Merged
merged 25 commits into dev on Jul 5, 2021

Conversation

lazyoracle (Member) commented on Jul 5, 2021

What

Bring back support for Sensitivity Analysis using the current codebase

Why

Closes #121

How

Refactored Sensitivity to inherit from and reuse code in Model Learning. This meant removing all legacy and stale code and keeping only code in line with the Single Responsibility Principle (SRP). Certain (broken) features that may have been implemented previously have been removed for the sake of clean, maintainable code. At present this scans the goal_run in Model Learning using the sweep algorithm, performing 1D scans sequentially for each swept dimension.
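The 1D sequential scan described above can be sketched roughly as follows. This is a hypothetical illustration only: the names `sweep_1d`, `goal_run`, `x_best` and `sweep_ranges` are assumptions for the sketch, not the actual C3 API.

```python
def sweep_1d(goal_run, x_best, sweep_ranges, points=25):
    """Scan the goal function one parameter at a time around a base point.

    Hypothetical sketch of a 1D sequential sweep: for each dimension,
    vary only that parameter over its range while the others stay fixed.
    """
    results = {}
    for dim, (lo, hi) in enumerate(sweep_ranges):
        step = (hi - lo) / (points - 1)
        xs, goals = [], []
        for i in range(points):
            val = lo + i * step
            x = list(x_best)
            x[dim] = val  # vary only this one dimension
            xs.append(val)
            goals.append(goal_run(x))
        results[dim] = (xs, goals)
    return results
```

Plotting `goals` against `xs` for each dimension yields one sensitivity curve per parameter.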

Remarks

This code can at present produce sensitivity plots of the kind shown below. As and when additional functionality is needed, it should be added while ensuring maximum code reuse with Model Learning. Ideally, an abstract class would be provided that is implemented by both Model Learning and Sensitivity, but such a refactor is beyond the scope of this PR.
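The shared abstract class mentioned above could look something like the following sketch. Everything here is an assumption for illustration (the names `Estimator`, `pmap`, `goal_run` and the `algorithm(x_init, fun, options)` calling convention), not the actual C3 interface.

```python
from abc import ABC, abstractmethod

class Estimator(ABC):
    """Hypothetical shared base class for ModelLearning and Sensitivity.

    Both subclasses would drive an optimization/sweep algorithm with the
    same goal function machinery; only goal_run differs between them.
    """

    def __init__(self, pmap):
        self.pmap = pmap  # parameter map of the model parameters to vary

    @abstractmethod
    def goal_run(self, current_params):
        """Evaluate the figure of merit for one set of parameters."""

    def run(self, algorithm, options):
        # Shared driver: hand the goal function to the chosen algorithm.
        return algorithm(
            x_init=self.pmap.get_parameters(),
            fun=self.goal_run,
            options=options,
        )
```

A concrete subclass would only need to implement `goal_run`, keeping the algorithm-driving logic in one place.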

[Image: sensi_plot, an example sensitivity plot produced by this code]

Checklist

Please include and complete the following checklist. Your Pull Request is (in most cases) not ready for review until the items below have been completed. You can create a draft PR while you are still completing the checklist. Check the Contribution Guidelines for more details. You can mark an item as complete with the `- [x]` prefix.

  • Tests - Added unit tests for new code, regression tests for bugs and updated the integration tests if required
  • Formatting & Linting - black and flake8 have been used to ensure styling guidelines are met
  • Type Annotations - All new code has been type annotated in the function signatures using type hints
  • Docstrings - Docstrings have been provided for functions in the numpydoc style
  • Documentation - The tutorial style documentation has been updated to explain changes & new features
  • Notebooks - Example notebooks have been updated to incorporate changes and new features

@lazyoracle lazyoracle added bug Something isn't working enhancement New feature or request labels Jul 5, 2021
@lazyoracle lazyoracle added this to the 1.3 milestone Jul 5, 2021
@lazyoracle lazyoracle requested a review from nwittler July 5, 2021 13:35
@lazyoracle lazyoracle self-assigned this Jul 5, 2021
codecov bot commented on Jul 5, 2021

Codecov Report

Merging #125 (431aca8) into dev (eda0bfa) will increase coverage by 1.32%.
The diff coverage is 93.33%.

Impacted file tree graph

@@            Coverage Diff             @@
##              dev     #125      +/-   ##
==========================================
+ Coverage   66.18%   67.51%   +1.32%     
==========================================
  Files          36       36              
  Lines        5267     5162     -105     
==========================================
- Hits         3486     3485       -1     
+ Misses       1781     1677     -104     
| Impacted Files | Coverage Δ |
| --- | --- |
| c3/utils/utils.py | 67.24% <78.57%> (+1.22%) ⬆️ |
| c3/libraries/algorithms.py | 52.10% <100.00%> (+0.38%) ⬆️ |
| c3/optimizers/modellearning.py | 90.04% <100.00%> (+0.52%) ⬆️ |
| c3/optimizers/sensitivity.py | 94.28% <100.00%> (+63.31%) ⬆️ |

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update eda0bfa...431aca8. Read the comment docs.

lazyoracle (Member, Author) commented:

@nwittler Is there any genuine reason to have an x_init argument for sweep type algorithms? If not I recommend we remove this piece of wizardry and also not include init_point options in Sensitivity

    if "init_point" in options:
        init_point = bool(options["init_point"])
        if init_point:
            fun([x_init[0].numpy()])

nwittler previously approved these changes on Jul 5, 2021

nwittler (Collaborator) commented on Jul 5, 2021

> @nwittler Is there any genuine reason to have an x_init argument for sweep type algorithms? If not I recommend we remove this piece of wizardry and also not include init_point options in Sensitivity
>
>     if "init_point" in options:
>         init_point = bool(options["init_point"])
>         if init_point:
>             fun([x_init[0].numpy()])

We want to make sure that the exact initial point is evaluated. I guess this could be handled at the sweep 'algorithm', if needed.
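Handling this at the sweep "algorithm" level could look roughly like the sketch below, where the sweep always evaluates the exact initial point first instead of relying on an `init_point` option in the caller. This is a hypothetical sketch only; the `points` option name and the `sweep(x_init, fun, options)` signature are assumptions, not the C3 options schema.

```python
def sweep(x_init, fun, options=None):
    """Minimal sweep that always evaluates the exact initial point first.

    Moving the init_point behaviour into the sweep itself guarantees the
    unmodified starting point is evaluated before any sweep points.
    """
    options = options or {}
    # Evaluate the exact initial point before anything else.
    evaluations = [(list(x_init), fun(list(x_init)))]
    for point in options.get("points", []):
        evaluations.append((list(point), fun(list(point))))
    return evaluations
```

With this shape, callers such as Sensitivity would no longer need to pass an `init_point` flag at all.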

@lazyoracle lazyoracle merged commit 94acafb into q-optimize:dev Jul 5, 2021
Labels: bug (Something isn't working), enhancement (New feature or request), size/XL
Projects: None yet
Successfully merging this pull request may close these issues: "Sensitivity code is stale and broken"
2 participants