
Jd/parameter update fixes #1013

Merged 125 commits on Nov 21, 2023

Changes from 44 commits

Commits
be930e8
add fix for resolution mismatches
jd-lara Sep 20, 2023
6762934
add fix for resolution mismatches
jd-lara Sep 20, 2023
b1339e3
Merge branch 'jd/parameter_update_fixes' of https://github.com/NREL-S…
jd-lara Sep 20, 2023
6e661bb
fix t_step calc
jd-lara Sep 20, 2023
731f0b1
allow 3D datasets
jd-lara Sep 27, 2023
64509f4
fix get_columns
jd-lara Sep 27, 2023
36c1d4f
add method for 1 d data set
jd-lara Sep 27, 2023
e3ecbfd
fix return
jd-lara Sep 27, 2023
b23c108
fix results processing
jd-lara Sep 28, 2023
318d9fc
fix results testing
jd-lara Sep 28, 2023
cde7af7
update hdf debug for 3 dims
rodrigomha Sep 28, 2023
33b270c
Merge branch 'jd/parameter_update_fixes' of https://github.com/NREL-S…
rodrigomha Sep 28, 2023
c061ec6
support 3D HDFD dataset
jd-lara Sep 28, 2023
7ad9adf
fix dims count
jd-lara Sep 28, 2023
26ce135
fix to dims
jd-lara Sep 28, 2023
bf7c6a3
use max
jd-lara Sep 28, 2023
d9b65a3
force permute order
rodrigomha Sep 28, 2023
6936835
fix results
jd-lara Sep 28, 2023
32c0759
remove subcomp set param val
rodrigomha Sep 28, 2023
7263ecc
formatter
jd-lara Sep 28, 2023
9335564
Merge branch 'jd/parameter_update_fixes' of https://github.com/NREL-S…
jd-lara Sep 28, 2023
c5ca8c2
add missing method
jd-lara Sep 29, 2023
404d165
update tests
jd-lara Sep 29, 2023
8197b8c
formatter
jd-lara Sep 29, 2023
22d3e72
add min_res
jd-lara Sep 29, 2023
ecf9a0a
add code to store 3D in HDF
jd-lara Sep 29, 2023
ba4e377
fixes to run
jd-lara Sep 29, 2023
5e83cfb
fix columns call
jd-lara Sep 29, 2023
8c4e109
formatter
jd-lara Sep 29, 2023
194d72a
fix passing of key
jd-lara Sep 29, 2023
9a06de3
add logging
jd-lara Sep 30, 2023
723ee76
move logger
jd-lara Sep 30, 2023
761bad1
remove @error
jd-lara Oct 2, 2023
c7c9e01
add logging
jd-lara Oct 2, 2023
fadf8f7
add method for results deserialize
jd-lara Oct 2, 2023
acd5f6e
remove FF code
jd-lara Oct 9, 2023
b878bdf
remove constraints code
jd-lara Oct 9, 2023
edd5080
remove additional params
rodrigomha Oct 10, 2023
cb179a2
remove upd parameter limit FF
rodrigomha Oct 10, 2023
7590ef3
Update adding_new_problem_model.md
pitmonticone Oct 13, 2023
4171ab1
Update quick_start_guide.md
pitmonticone Oct 13, 2023
9e46190
Update debugging_infeasible_models.md
pitmonticone Oct 13, 2023
b91cbcd
Update README.md
pitmonticone Oct 13, 2023
5ccf75a
Update definitions.jl
pitmonticone Oct 13, 2023
c50eca2
Update formulations.jl
pitmonticone Oct 13, 2023
47c9d40
Update powermodels_interface.jl
pitmonticone Oct 13, 2023
1b65aa5
Update decision_problems.jl
pitmonticone Oct 13, 2023
ab7fbfc
Update operation_problem_templates.jl
pitmonticone Oct 13, 2023
e4522fd
Update agc.jl
pitmonticone Oct 13, 2023
9cf92cd
Update reserve_group.jl
pitmonticone Oct 13, 2023
fff7151
Merge pull request #1015 from pitmonticone/master
jd-lara Oct 18, 2023
1cad58c
enable 3D dataframes
jd-lara Oct 19, 2023
7da3309
add missing methods
jd-lara Oct 19, 2023
6f304e0
add comment to the code about the use of dims
jd-lara Oct 23, 2023
eb8de6f
remove unnecessary assertions
jd-lara Oct 23, 2023
00b2134
add missing variable in function
jd-lara Oct 23, 2023
a21d9bc
fix tests
jd-lara Oct 23, 2023
c8b1388
add comment
jd-lara Oct 24, 2023
3ebd3d3
add vector defs
jd-lara Oct 25, 2023
66c04c0
Merge pull request #1014 from NREL-Sienna/jd/move_energy_limit_ff_out
jd-lara Oct 25, 2023
c19a60d
Update Project.toml
jd-lara Oct 25, 2023
8cb2bae
fix read results interface
jd-lara Oct 25, 2023
1d149fc
remove stale code
jd-lara Oct 31, 2023
4010e16
Merge pull request #1016 from NREL-Sienna/jd/fix_tests
jd-lara Oct 31, 2023
026e5bc
Update Project.toml
jd-lara Oct 31, 2023
de326b5
add meta field to ff parameter getter
jd-lara Oct 31, 2023
960fe8e
Merge pull request #1017 from NREL-Sienna/jd/fix_fixedvalue_ff_bug
jd-lara Oct 31, 2023
c1b9007
Update Project.toml
jd-lara Oct 31, 2023
aefa549
Update Project.toml
jd-lara Nov 1, 2023
1ef8e6f
add missing compats
jd-lara Nov 1, 2023
6f1eb69
fix sha version
jd-lara Nov 1, 2023
29ff39f
fix compats
jd-lara Nov 1, 2023
6dfcc11
afc/changes to consider ac branche in FixValueFeedForward
alefcastelli Nov 7, 2023
06525f5
Update src/devices_models/devices/AC_branches.jl
jd-lara Nov 16, 2023
712532b
Merge pull request #1019 from NREL-Sienna/afc/FixValueFeedForward-for…
jd-lara Nov 16, 2023
fce8884
remove not needed comments
jd-lara Nov 21, 2023
03e2764
remove bad check
jd-lara Nov 21, 2023
6039d27
formatter
jd-lara Nov 21, 2023
d3c29ee
re-enable threads
jd-lara Nov 21, 2023
3552fb0
add fix for resolution mismatches
jd-lara Sep 20, 2023
3b31244
fix t_step calc
jd-lara Sep 20, 2023
be854a1
allow 3D datasets
jd-lara Sep 27, 2023
5f9589c
fix get_columns
jd-lara Sep 27, 2023
0febf07
add method for 1 d data set
jd-lara Sep 27, 2023
39e012e
fix return
jd-lara Sep 27, 2023
ddd6f27
fix results processing
jd-lara Sep 28, 2023
97c60e2
update hdf debug for 3 dims
rodrigomha Sep 28, 2023
ed29e5c
fix results testing
jd-lara Sep 28, 2023
bdff35f
support 3D HDFD dataset
jd-lara Sep 28, 2023
4cfab76
fix dims count
jd-lara Sep 28, 2023
420c6d3
fix to dims
jd-lara Sep 28, 2023
34254fc
use max
jd-lara Sep 28, 2023
ea9d639
force permute order
rodrigomha Sep 28, 2023
caa018f
fix results
jd-lara Sep 28, 2023
b9f3f10
formatter
jd-lara Sep 28, 2023
24ec546
remove subcomp set param val
rodrigomha Sep 28, 2023
f6e67a7
add missing method
jd-lara Sep 29, 2023
1fa2f9f
update tests
jd-lara Sep 29, 2023
f2e57ce
formatter
jd-lara Sep 29, 2023
1691a94
add min_res
jd-lara Sep 29, 2023
11d3777
add code to store 3D in HDF
jd-lara Sep 29, 2023
ea64d4f
fixes to run
jd-lara Sep 29, 2023
2eb8c38
fix columns call
jd-lara Sep 29, 2023
d3fe841
formatter
jd-lara Sep 29, 2023
456e717
fix passing of key
jd-lara Sep 29, 2023
01d42b7
add logging
jd-lara Sep 30, 2023
49b7366
move logger
jd-lara Sep 30, 2023
739350a
remove @error
jd-lara Oct 2, 2023
18a6362
add logging
jd-lara Oct 2, 2023
ef5a6f9
add method for results deserialize
jd-lara Oct 2, 2023
a6926ad
enable 3D dataframes
jd-lara Oct 19, 2023
616cc35
add missing methods
jd-lara Oct 19, 2023
28cce32
add comment to the code about the use of dims
jd-lara Oct 23, 2023
cbbfe61
remove unnecessary assertions
jd-lara Oct 23, 2023
3302cc1
add missing variable in function
jd-lara Oct 23, 2023
439f7b2
fix tests
jd-lara Oct 23, 2023
b41fda7
add comment
jd-lara Oct 24, 2023
84bb449
add vector defs
jd-lara Oct 25, 2023
dc6c53a
fix read results interface
jd-lara Oct 25, 2023
ede6147
remove not needed comments
jd-lara Nov 21, 2023
73178c3
remove bad check
jd-lara Nov 21, 2023
31e1afc
formatter
jd-lara Nov 21, 2023
901ec14
re-enable threads
jd-lara Nov 21, 2023
0da8a4e
Merge branch 'jd/parameter_update_fixes' of https://github.com/NREL-S…
jd-lara Nov 21, 2023
ffd6238
fix pretty table issue
jd-lara Nov 21, 2023
191 changes: 163 additions & 28 deletions src/core/dataset.jl
Original file line number Diff line number Diff line change
@@ -20,9 +20,9 @@ end

# Values field is accessed with dot syntax to avoid type instability

mutable struct InMemoryDataset <: AbstractDataset
"Data with dimensions (column names, row indexes)"
values::DenseAxisArray{Float64, 2}
mutable struct InMemoryDataset{N} <: AbstractDataset
"Data with dimensions (N column names, row indexes)"
values::DenseAxisArray{Float64, N}
# We use Array here to allow for overwrites when updating the state
timestamps::Vector{Dates.DateTime}
# Resolution is needed because AbstractDataset might have just one row
@@ -33,12 +33,12 @@ mutable struct InMemoryDataset <: AbstractDataset
end

function InMemoryDataset(
values::DenseAxisArray{Float64, 2},
values::DenseAxisArray{Float64, N},
timestamps::Vector{Dates.DateTime},
resolution::Dates.Millisecond,
end_of_step_index::Int,
)
return InMemoryDataset(
) where {N}
return InMemoryDataset{N}(
values,
timestamps,
resolution,
@@ -48,8 +48,8 @@ function InMemoryDataset(
)
end

function InMemoryDataset(values::DenseAxisArray{Float64, 2})
return InMemoryDataset(
function InMemoryDataset(values::DenseAxisArray{Float64, N}) where {N}
return InMemoryDataset{N}(
values,
Vector{Dates.DateTime}(),
Dates.Second(0.0),
@@ -59,34 +59,95 @@ function InMemoryDataset(values::DenseAxisArray{Float64, 2})
)
end

get_num_rows(s::InMemoryDataset) = size(s.values)[2]
# Helper method for one dimensional cases
function InMemoryDataset(
fill_val::Float64,
initial_time::Dates.DateTime,
resolution::Dates.Millisecond,
end_of_step_index::Int,
row_count::Int,
column_names::Vector{String})
return InMemoryDataset(
fill_val,
initial_time,
resolution,
end_of_step_index,
row_count,
(column_names,),
)
end

function InMemoryDataset(
fill_val::Float64,
initial_time::Dates.DateTime,
resolution::Dates.Millisecond,
end_of_step_index::Int,
row_count::Int,
column_names::NTuple{N, <:Any}) where {N}
return InMemoryDataset(
fill!(
DenseAxisArray{Float64}(undef, column_names..., 1:row_count),
fill_val,
),
collect(
range(
initial_time;
step = resolution,
length = row_count,
),
),
resolution,
end_of_step_index,
)
end

get_num_rows(s::InMemoryDataset{N}) where {N} = size(s.values)[N]
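The `{N}` parameterization above always keeps time as the trailing axis, which is why `get_num_rows` reads `size(s.values)[N]`. A minimal standalone sketch of that layout, with a plain `Array` standing in for the `DenseAxisArray` and hypothetical column names:

```julia
using Dates

# One axis per column dimension plus a trailing time axis, as in the
# new helper constructor (names and sizes are made up for illustration).
column_names = ["gen1", "gen2"]
row_count = 24
values = fill(NaN, length(column_names), row_count)
timestamps = collect(
    range(DateTime("2024-01-01"); step = Hour(1), length = row_count),
)

# Time is always the last dimension, matching
# get_num_rows(s::InMemoryDataset{N}) = size(s.values)[N]
num_rows = size(values)[ndims(values)]
```

The same indexing works unchanged for a 3D dataset, since only the trailing axis is treated as time.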

function make_system_state(
values::DenseAxisArray{Float64, 2},
timestamp::Dates.DateTime,
resolution::Dates.Millisecond,
)
return InMemoryDataset(values, [timestamp], resolution, 0, 1, UNSET_INI_TIME)
columns::NTuple{N, <:Any},
) where {N}
return InMemoryDataset(NaN, timestamp, resolution, 0, 1, columns)
end

function get_dataset_value(s::InMemoryDataset, date::Dates.DateTime)
function get_dataset_value(
s::T,
date::Dates.DateTime,
) where {T <: Union{InMemoryDataset{1}, InMemoryDataset{2}}}
s_index = find_timestamp_index(s.timestamps, date)
if isnothing(s_index)
error("Request time stamp $date not in the state")
end
return s.values[:, s_index]
end

get_column_names(s::InMemoryDataset) = axes(s.values)[1]
get_column_names(::OptimizationContainerKey, s::InMemoryDataset) = get_column_names(s)
function get_dataset_value(s::InMemoryDataset{3}, date::Dates.DateTime)
s_index = find_timestamp_index(s.timestamps, date)
if isnothing(s_index)
error("Request time stamp $date not in the state")
end
return s.values[:, :, s_index]
end

function get_column_names(k::OptimizationContainerKey, s::InMemoryDataset)
return get_column_names(k, s.values)
end

function get_last_recorded_value(s::InMemoryDataset)
function get_last_recorded_value(s::InMemoryDataset{2})
if get_last_recorded_row(s) == 0
error("The Dataset hasn't been written yet")
end
return s.values[:, get_last_recorded_row(s)]
end

function get_last_recorded_value(s::InMemoryDataset{3})
if get_last_recorded_row(s) == 0
[Review comment (Contributor): This check could be common.]
error("The Dataset hasn't been written yet")
end
return s.values[:, :, get_last_recorded_row(s)]
end

function get_end_of_step_timestamp(s::InMemoryDataset)
return s.timestamps[s.end_of_step_index]
end
Expand All @@ -110,50 +171,124 @@ function get_value_timestamp(s::InMemoryDataset, date::Dates.DateTime)
return s.timestamps[s_index]
end

function set_value!(s::InMemoryDataset, vals::DenseAxisArray{Float64, 2}, index::Int)
# These set_value! methods expect a single time_step value because they are used to update
# the state, so the incoming vals will have one dimension less than the DataSet. The exception
# is for vals of dimension 1, which are still stored in DataSets of dimension 2.
function set_value!(s::InMemoryDataset{2}, vals::DenseAxisArray{Float64, 2}, index::Int)
s.values[:, index] = vals[:, index]
return
end

function set_value!(s::InMemoryDataset, vals::DenseAxisArray{Float64, 1}, index::Int)
function set_value!(s::InMemoryDataset{2}, vals::DenseAxisArray{Float64, 1}, index::Int)
s.values[:, index] = vals
return
end

function set_value!(s::InMemoryDataset{3}, vals::DenseAxisArray{Float64, 2}, index::Int)
s.values[:, :, index] = vals
return
end
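The dimension convention described in the comment above can be sketched with plain arrays (hypothetical sizes; Base `Array` slices stand in for the `DenseAxisArray` values the real methods use):

```julia
# The state holds a trailing time axis, so a single-step update writes
# a slice with one dimension fewer than the dataset itself.
state = zeros(2, 2, 3)       # (names, subcomponents, time), like InMemoryDataset{3}
vals = [1.0 2.0; 3.0 4.0]    # one time step: 2D, not 3D
state[:, :, 2] = vals        # mirrors s.values[:, :, index] = vals
```

Other time steps stay untouched, which is what lets the state be overwritten in place one index at a time.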

# HDF5Dataset does not account for overwrites in the data. Values are written sequentially.
mutable struct HDF5Dataset <: AbstractDataset
mutable struct HDF5Dataset{N} <: AbstractDataset
values::HDF5.Dataset
column_dataset::HDF5.Dataset
write_index::Int
last_recorded_row::Int
resolution::Dates.Millisecond
initial_timestamp::Dates.DateTime
update_timestamp::Dates.DateTime
column_names::Vector{String}
column_names::NTuple{N, Vector{String}}

function HDF5Dataset(values, column_dataset, write_index, last_recorded_row, resolution,
function HDF5Dataset{N}(values,
column_dataset,
write_index,
last_recorded_row,
resolution,
initial_timestamp,
update_timestamp, column_names,
)
new(values, column_dataset, write_index, last_recorded_row, resolution,
update_timestamp,
column_names::NTuple{N, Vector{String}},
) where {N}
new{N}(values, column_dataset, write_index, last_recorded_row, resolution,
initial_timestamp,
update_timestamp, column_names)
end
end

HDF5Dataset(values, column_dataset, resolution, initial_time) =
HDF5Dataset(
function HDF5Dataset{1}(
values::HDF5.Dataset,
column_dataset::HDF5.Dataset,
::Tuple,
resolution::Dates.Millisecond,
initial_time::Dates.DateTime,
)
HDF5Dataset{1}(
values,
column_dataset,
1,
0,
resolution,
initial_time,
UNSET_INI_TIME,
column_dataset[:],
(column_dataset[:],),
)
end

get_column_names(::OptimizationContainerKey, s::HDF5Dataset) = s.column_names
function HDF5Dataset{2}(
values::HDF5.Dataset,
column_dataset::HDF5.Dataset,
dims::NTuple{4, Int},
resolution::Dates.Period,
initial_time::Dates.DateTime,
)
# The indexing is done in this way because we save all the names in an
# adjacent column entry in the HDF5 Dataset. The indexes for each column
# are known because we know how many elements are in each dimension.
# The names for the first column are stored in 1:first_column_number_of_elements.
col1 = column_dataset[1:dims[2]]
# The names for the second column are stored in (first_column_number_of_elements + 1):end of the names column.
col2 = column_dataset[(dims[2] + 1):end]
HDF5Dataset{2}(
values,
column_dataset,
1,
0,
resolution,
initial_time,
UNSET_INI_TIME,
(col1, col2),
)
end

function HDF5Dataset{2}(
values::HDF5.Dataset,
column_dataset::HDF5.Dataset,
dims::NTuple{5, Int},
resolution::Dates.Period,
initial_time::Dates.DateTime,
)
# The indexing is done in this way because we save all the names in an
# adjacent column entry in the HDF5 Dataset. The indexes for each column
# are known because we know how many elements are in each dimension.
# The names for the first column are stored in 1:first_column_number_of_elements.
col1 = column_dataset[1:dims[2]]
# The names for the second column are stored in (first_column_number_of_elements + 1):end of the names column.
col2 = column_dataset[(dims[2] + 1):end]
HDF5Dataset{2}(
values,
column_dataset,
1,
0,
resolution,
initial_time,
UNSET_INI_TIME,
(col1, col2),
)
end

function get_column_names(::OptimizationContainerKey, s::HDF5Dataset)
return s.column_names
end
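The flat-name slicing used by the `HDF5Dataset{2}` constructors can be sketched in isolation (a plain `Vector` stands in for `column_dataset`; the names and dims tuple are made up):

```julia
# All axis names live in one flat dataset inside the HDF5 file; the
# dims tuple tells how many of them belong to the first axis.
flat_names = ["gen1", "gen2", "gen3", "p1", "p2"]   # hypothetical names
dims = (24, 3, 2, 10)          # dims[2] == 3 names on the first axis
col1 = flat_names[1:dims[2]]
col2 = flat_names[(dims[2] + 1):end]
column_names = (col1, col2)    # NTuple{2, Vector{String}}, as the struct stores
```

The split relies entirely on `dims[2]` being correct; nothing in the flat vector itself marks the boundary between axes.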

"""
Return the timestamp from most recent data row updated in the dataset. This value may not be the same as the result from `get_update_timestamp`
1 change: 1 addition & 0 deletions src/core/definitions.jl
@@ -24,6 +24,7 @@ const JuMPVariableMatrix = DenseAxisArray{
JuMP.Containers._AxisLookup{Tuple{Int64, Int64}},
},
}
const JuMPFloatMatrix = DenseAxisArray{Float64, 2}
const JuMPFloatArray = DenseAxisArray{Float64}
const JuMPVariableArray = DenseAxisArray{JuMP.VariableRef}

2 changes: 1 addition & 1 deletion src/core/optimizer_stats.jl
@@ -100,5 +100,5 @@ function to_dict(stats::OptimizerStats)
end

function get_column_names(::Type{OptimizerStats})
return collect(string.(fieldnames(OptimizerStats)))
return (collect(string.(fieldnames(OptimizerStats))),)
end
45 changes: 37 additions & 8 deletions src/core/results_by_time.jl
@@ -1,37 +1,53 @@
mutable struct ResultsByTime{T}
mutable struct ResultsByTime{T, N}
key::OptimizationContainerKey
data::SortedDict{Dates.DateTime, T}
resolution::Dates.Period
column_names::Vector{String}
column_names::NTuple{N, Vector{String}}
end

function ResultsByTime(key, data, resolution, column_names)
function ResultsByTime(
key::OptimizationContainerKey,
data::SortedDict{Dates.DateTime, T},
resolution::Dates.Period,
column_names,
) where {T}
_check_column_consistency(data, column_names)
ResultsByTime(key, data, resolution, column_names)
end

function _check_column_consistency(
data::SortedDict{Dates.DateTime, DenseAxisArray{Float64, 2}},
cols::Vector{String},
cols::Tuple{Vector{String}},
)
for val in values(data)
if axes(val)[1] != cols
if axes(val)[1] != cols[1]
error("Mismatch in DenseAxisArray column names: $(axes(val)[1]) $cols")
end
end
end

function _check_column_consistency(
data::SortedDict{Dates.DateTime, Matrix{Float64}},
cols::Vector{String},
cols::Tuple{Vector{String}},
)
for val in values(data)
if size(val)[2] != length(cols)
error("Mismatch in length of Matrix columns: $(size(val)[2]) $(length(cols))")
if size(val)[2] != length(cols[1])
error(
"Mismatch in length of Matrix columns: $(size(val)[2]) $(length(cols[1]))",
)
end
end
end

function _check_column_consistency(
data::SortedDict{Dates.DateTime, DenseAxisArray{Float64, 2}},
cols::NTuple{N, Vector{String}},
) where {N}
# TODO:
end
[Review comment (Contributor): Do we need to check something?]

# TODO: Implement consistency check for other sizes

# This struct behaves like a dict, delegating to its 'data' field.
Base.length(res::ResultsByTime) = length(res.data)
Base.iterate(res::ResultsByTime) = iterate(res.data)
@@ -73,6 +89,19 @@ function make_dataframe(
return df
end

function make_dataframe(
results::ResultsByTime{DenseAxisArray{Float64, 3}},
timestamp::Dates.DateTime,
)
df = DataFrames.DataFrame()
array = results.data[timestamp]
for idx in Iterators.product(array.axes[1:2]...)
df[!, "$(idx)"] = array[idx..., :].data
end
# _add_timestamps!(df, results, timestamp, array)
return df
[Review comment (Contributor): TODO?]

end
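The `Iterators.product` flattening in the 3D `make_dataframe` method can be sketched without DataFrames (a `Dict` of vectors stands in for the `DataFrame`; axis names and sizes are hypothetical):

```julia
# The first two axes collapse into one column per (axis1, axis2) pair,
# keyed like the PR's `"$(idx)"` naming; the third axis stays as rows.
data = reshape(collect(1.0:12.0), 2, 2, 3)   # (axis1, axis2, time)
ax1, ax2 = ["a", "b"], ["x", "y"]
columns = Dict{String, Vector{Float64}}()
for (i, j) in Iterators.product(eachindex(ax1), eachindex(ax2))
    idx = (ax1[i], ax2[j])
    columns["$(idx)"] = data[i, j, :]        # mirrors array[idx..., :].data
end
```

Each column holds the full time series for one (axis1, axis2) pair, so a 2×2×3 array yields four columns of three rows.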

function make_dataframe(results::ResultsByTime{Matrix{Float64}}, timestamp::Dates.DateTime)
array = results.data[timestamp]
df = DataFrames.DataFrame(array, results.column_names)
4 changes: 2 additions & 2 deletions src/operation/decision_model_store.jl
@@ -49,7 +49,7 @@ function initialize_storage!(
for timestamp in
range(initial_time; step = model_interval, length = num_of_executions)
data[timestamp] = fill!(
DenseAxisArray{Float64}(undef, column_names, 1:time_steps_count),
DenseAxisArray{Float64}(undef, column_names..., 1:time_steps_count),
NaN,
)
end
@@ -133,5 +133,5 @@ end

function get_column_names(store::DecisionModelStore, key::OptimizationContainerKey)
container = getfield(store, get_store_container_type(key))
return axes(first(values(container[key])))[1]
return get_column_names(key, first(values(container[key])))
end