chore(deps): update dependency netcdf4 to >=1.6.1, <1.7.5 #583
Merged
Important: Review skipped (bot user detected).
Force-pushed from 9a18568 to 0f1b0de
Force-pushed from 565654c to 8e45018
Force-pushed from 78589e1 to 2200ac3
Force-pushed from da35b5e to b4b2124
Force-pushed from b4b2124 to d85d825
FBumann added a commit that referenced this pull request on Feb 5, 2026:
* Update the CHANGELOG.md
* Update to tsam v3.1.0 and add warnings for preserve_n_clusters=False
* [ci] prepare release v6.0.0
* fix typo in deps
* fix typo in README.md
* Revert citation temporarily
* [ci] prepare release v6.0.0
* Improve json io
* fix: Notebooks using tsam
* Allow manual docs dispatch
* Created: tests/test_clustering/test_multiperiod_extremes.py
  Test Coverage (56 tests):
  Multi-Period with Different Time Series
  - TestMultiPeriodDifferentTimeSeries - Tests for systems where each period has distinct demand profiles:
    - Different cluster assignments per period
    - Optimization with period-specific profiles
    - Correct expansion mapping per period
    - Statistics correctness per period
  Extreme Cluster Configurations
  - TestExtremeConfigNewCluster - Tests method='new_cluster':
    - Captures peak demand days
    - Can increase cluster count
    - Works with min_value parameter
  - TestExtremeConfigReplace - Tests method='replace':
    - Maintains requested cluster count
    - Works with multi-period systems
  - TestExtremeConfigAppend - Tests method='append':
    - Combined with segmentation
    - Objective preserved after expansion
  Combined Multi-Period and Extremes
  - TestExtremeConfigMultiPeriod - Extremes with multi-period/scenario:
    - Requires preserve_n_clusters=True for multi-period
    - Works with periods and scenarios together
  - TestMultiPeriodWithExtremes - Combined scenarios:
    - Different profiles with extreme capture
    - Extremes combined with segmentation
    - Independent cluster assignments per period
  Multi-Scenario Clustering
  - TestMultiScenarioWithClustering - Scenarios with clustering
  - TestFullDimensionalClustering - Full (periods + scenarios) combinations
  IO Round-Trip
  - TestMultiPeriodClusteringIO - Save/load preservation tests
  Edge Cases
  - TestEdgeCases - Single cluster, many clusters, occurrence sums, mapping validation
* fix: clustering and tsam 3.1.0 issue
* [ci] prepare release v6.0.1
* fix: clustering and tsam 3.1.0 issue
* [ci] prepare release v6.0.1
* ci: remove test
* [ci] prepare release v6.0.1
* chore(deps): update dependency werkzeug to v3.1.5 (#564)
* chore(deps): update dependency ruff to v0.14.14 (#563)
* chore(deps): update dependency netcdf4 to >=1.6.1, <1.7.5 (#583)
* chore(deps): update dependency pre-commit to v4.5.1 (#532)
* fix: Comparison coords (#599)
* Fix coords concat in comparison.py
* Fix coords concat in comparison.py
* Fix coords concat in comparison.py
* Add 6.0.1 changelog entry
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* Fix coord preservation in Comparison.solution and .inputs
- Apply _extract_nonindex_coords pattern to solution and inputs properties
- Add warning when coordinate mappings conflict during merge
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* Update CHANGELOG.md
* Update CHANGELOG.md
* The fix is straightforward: on line 83, mapping.get(dv) returns None for unmapped values. Change it to mapping.get(dv, dv) so unmapped dimension values fall back to themselves.
Update(flixopt/comparison.py): added 1 line, removed 1 line
  80      for name, (dim, mapping) in merged.items():
  81          if dim not in ds.dims:
  82              continue
  83 -        new_coords[name] = (dim, [mapping.get(dv) for dv in ds.coords[dim].values])
  83 +        new_coords[name] = (dim, [mapping.get(dv, dv) for dv in ds.coords[dim].values])
  84
  85      return ds.assign_coords(new_coords)
  86
Done. The change on line 83 ensures that when mapping doesn't contain a key for a dimension value (which happens with outer-join additions), the original value dv is preserved instead of inserting None.
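The dict.get fallback can be seen in isolation with a tiny standalone snippet (the mapping and dimension values below are made up for illustration):

```python
# dict.get(key) returns None when the key is missing;
# dict.get(key, key) falls back to the key itself instead.
mapping = {'cluster_0': 'winter', 'cluster_1': 'summer'}
dim_values = ['cluster_0', 'cluster_1', 'cluster_2']  # 'cluster_2' added by an outer join

before = [mapping.get(dv) for dv in dim_values]      # ['winter', 'summer', None]
after = [mapping.get(dv, dv) for dv in dim_values]   # ['winter', 'summer', 'cluster_2']
assert after == ['winter', 'summer', 'cluster_2']
```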
* Update Changelog
---------
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
* [ci] prepare release v6.0.2
* typo
* Revert "typo"
This reverts commit 4a57282.
* Add plan file
* Add comprehensive test_math coverage for multi-period, scenarios, clustering, and validation
- Add 26 new tests across 8 files (×3 optimize modes = ~75 test runs; see the pytest sketch after this list)
- Multi-period: period weights, flow_hours limits, effect limits, linked invest, custom period weights
- Scenarios: scenario weights, independent sizes, independent flow rates
- Clustering: basic objective, storage cyclic/intercluster modes, status cyclic mode
- Storage: relative min/max charge state, relative min/max final charge state, balanced invest
- Components: transmission startup cost, Power2Heat, HeatPumpWithSource, SourceAndSink
- Flow status: max_uptime standalone test
- Validation: SourceAndSink requires size with prevent_simultaneous
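The "×3 optimize modes" multiplication presumably comes from parametrizing every test over the optimize/IO modes; a generic pytest sketch of that pattern (the fixture name and mode names are placeholders, not the actual test infrastructure):

```python
import pytest

# Each test that requests this fixture runs once per mode,
# so N tests turn into 3*N collected test runs.
@pytest.fixture(params=['solve', 'save_then_solve', 'solve_then_reload'])
def optimize_mode(request):
    return request.param

def test_example(optimize_mode):
    # A real test would build, (de)serialize, and optimize a flow system
    # according to optimize_mode; here we only show the parametrization.
    assert optimize_mode in ('solve', 'save_then_solve', 'solve_then_reload')
```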
* Done. Here's a summary of what was changed:
Fix (flixopt/components.py:1146-1169): In _relative_charge_state_bounds, the scalar else branches now expand the base parameter to regular timesteps only (timesteps_extra[:-1]), then concat with the final-timestep DataArray containing the correct override value. Previously they just broadcast the scalar across all timesteps, silently ignoring relative_minimum_final_charge_state / relative_maximum_final_charge_state.
Tests (tests/test_math/test_storage.py): Added two new tests, test_storage_relative_minimum_final_charge_state_scalar and test_storage_relative_maximum_final_charge_state_scalar, identical scenarios to the existing array-based tests but using scalar defaults (the previously buggy path).
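The shape of that fix can be illustrated with a small xarray sketch (not the actual flixopt code; the 'time' coordinate name and the parameter values are assumptions): broadcast the scalar over the regular timesteps only, then concatenate a one-element DataArray carrying the final-timestep override.

```python
import numpy as np
import pandas as pd
import xarray as xr

timesteps_extra = pd.date_range('2024-01-01', periods=5, freq='h')  # includes the extra final timestep
relative_minimum = 0.0         # scalar base parameter
relative_minimum_final = 0.5   # override for the final charge state

# Broadcast the scalar over the regular timesteps only (all but the last) ...
base = xr.DataArray(np.full(len(timesteps_extra) - 1, relative_minimum),
                    coords={'time': timesteps_extra[:-1]}, dims='time')
# ... then append a single-timestep DataArray holding the final override.
final = xr.DataArray([relative_minimum_final],
                     coords={'time': timesteps_extra[-1:]}, dims='time')
lower_bound = xr.concat([base, final], dim='time')

print(lower_bound.values)  # [0.  0.  0.  0.  0.5]
```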
* Added TestClusteringExact class with 3 tests asserting exact per-timestep values in clustered systems:
1. test_flow_rates_match_demand_per_cluster: Verifies Grid flow_rate matches demand [10,20,30,40] identically in each cluster, objective = 200.
2. test_per_timestep_effects_with_varying_price: Verifies per-timestep costs [10,20,30,40] reflect price×flow with varying prices [1,2,3,4] and constant demand=10, objective = 200.
3. test_storage_cyclic_charge_discharge_pattern: Verifies storage with cyclic clustering: charges at cheap timesteps (price=1), discharges at expensive ones (price=100), with exact charge_state trajectory across both clusters, objective = 100.
Deviation from plan: Used equal cluster weights [1.0, 1.0] instead of [1.0, 2.0]/[1.0, 3.0] for tests 1 and 2. This was necessary because cluster_weight is not preserved during NetCDF roundtrip (pre-existing IO bug), which would cause the save->reload->solve mode to fail. Equal weights produce correct results in all 3 IO modes while still testing the essential per-timestep value correctness.
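The expected objectives in tests 1 and 2 follow directly from the stated data; a quick back-of-the-envelope check (assuming the objective is the cluster-weighted sum of per-timestep costs, with a unit price in test 1):

```python
cluster_weights = [1.0, 1.0]  # equal weights, as used in the tests

# Test 1: demand [10, 20, 30, 40] met at unit cost in each of the two clusters
demand = [10, 20, 30, 40]
objective_1 = sum(w * sum(demand) for w in cluster_weights)
assert objective_1 == 200

# Test 2: constant demand of 10 with per-timestep prices [1, 2, 3, 4]
prices = [1, 2, 3, 4]
objective_2 = sum(w * sum(p * 10 for p in prices) for w in cluster_weights)
assert objective_2 == 200
```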
* More storage tests
* Add multi-period tests
* Add clustering tests and fix issues with user set cluster weights
* Update CHANGELOG.md
* Mark old tests as stale
* Update CHANGELOG.md
* Mark tests as stale and move to new dir
* Move more tests to stale
* Change fixtures to speed up tests
* Moved files into stale
* Renamed folder
* Reorganize test dir
* Reorganize test dir
* Rename marker
* 2. 08d-clustering-multiperiod.ipynb (cell 29): Removed stray <cell_type>markdown</cell_type> from Summary cell
3. 08f-clustering-segmentation.ipynb (cell 33): Removed stray <cell_type>markdown</cell_type> from API Reference cell
4. flixopt/comparison.py: _extract_nonindex_coords now detects when the same coord name appears on different dims — warns and skips instead of silently overwriting (see the sketch after this list)
5. test_multiperiod_extremes.py: Added .item() to mapping.min()/.max() and period_mapping.min()/.max() to extract scalars before comparison
6. test_flow_status.py: Tightened test_max_uptime_standalone assertion from > 50.0 to assert_allclose(..., 60.0, rtol=1e-5) matching the documented arithmetic
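A rough sketch of that kind of conflict check (the function name, signature, and warning text here are illustrative, not the actual _extract_nonindex_coords implementation):

```python
import warnings
import xarray as xr

def collect_nonindex_coords(datasets: list[xr.Dataset]) -> dict[str, tuple[str, dict]]:
    """Collect 1-D non-index coords as {name: (dim, {dim_value: coord_value})},
    warning and skipping a coord that appears on a different dim than before."""
    merged: dict[str, tuple[str, dict]] = {}
    for ds in datasets:
        for name, coord in ds.coords.items():
            if name in ds.indexes or coord.ndim != 1:
                continue  # only 1-D non-index coords attached to a single dim
            dim = coord.dims[0]
            if dim not in ds.coords:
                continue  # need the dim's own labels to build the mapping
            if name in merged and merged[name][0] != dim:
                warnings.warn(f'Coordinate {name!r} appears on dims {merged[name][0]!r} and {dim!r}; skipping it.')
                continue
            _, mapping = merged.get(name, (dim, {}))
            mapping.update(zip(ds.coords[dim].values, coord.values))
            merged[name] = (dim, mapping)
    return merged
```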
---------
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
This PR contains the following updates:
netcdf4: >= 1.6.1, < 1.7.4 → >=1.6.1, <1.7.5

Release Notes
Unidata/netcdf4-python (netcdf4)
v1.7.4
(issue #1464).
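As a quick sanity check (not part of this PR), the new range can be evaluated with the packaging library:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

spec = SpecifierSet('>=1.6.1,<1.7.5')

# 1.7.4 is now allowed; the previous upper bound (< 1.7.4) excluded it.
assert Version('1.7.4') in spec
assert Version('1.6.0') not in spec
assert Version('1.7.5') not in spec
```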
Configuration
📅 Schedule: Branch creation - Between 12:00 AM and 03:59 AM, only on Monday ( * 0-3 * * 1 ) (UTC), Automerge - At any time (no schedule defined).
🚦 Automerge: Enabled.
♻ Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
This PR was generated by Mend Renovate. View the repository job log.