
Add validation run for HBV model to test suite #10

Closed · kratzert opened this issue Feb 7, 2018 · 14 comments

@kratzert (Owner) commented Feb 7, 2018

I need someone with MATLAB installed to run the original code for me with fixed parameters, to generate a simulation time series to compare against. The inputs can be taken from the data that ships with the MATLAB code.

@danklotz (Contributor) commented Feb 8, 2018

I don't have it anymore :( - Maybe we need to re-install it...

@kratzert (Owner, Author) commented Feb 8, 2018

Maybe it's still installed on one of the computers in the master students' office ><

@amacd31 commented Feb 8, 2018

I have a copy of MATLAB on a rather old laptop and seem to have successfully run the HBV MATLAB code downloaded from here: http://amir.eng.uci.edu/software.php (specifically http://amir.eng.uci.edu/downloads/HBV.zip).

Qsim.zip contains a CSV file that is the result of adding the following to the end of HBV.m:
csvwrite('Qsim.csv', Qsim);

If this looks like the kind of results you are after, let me know if you'd like me to run it again with different parameters or input data.

@kratzert (Owner, Author) commented Feb 8, 2018

Hi @amacd31,

this would be exactly what I'm looking for. I would additionally need the values of the model parameters that were used to generate the qsim time series. If I'm not wrong, they are stored in the result structure under result.mp. It would be enough if you could paste them here (as plain text or a screenshot).

Many thanks in advance

@amacd31 commented Feb 8, 2018

That makes sense; I'm not particularly familiar with MATLAB, nor have I looked into HBV much before.

The contents of results.mp are:
5.5529 193.88 1.7367 0.019986 0.054889 3.5749 0.09739 0.024421 0.02473 94.554

Which, from looking at the code, seems to be set by: [cd;cfc;cbeta;cc;ck0;cl;ck1;ck2;ckp;cpwp]

Those names don't seem to match the parameters in the hbvedu.py code here, so I'm not sure if this is the information you were after. Let me know if you'd like me to dig out any other information.

@kratzert (Owner, Author) commented Feb 8, 2018

@amacd31 thanks, but just to be sure: are these the values from the same script run as the qsim you uploaded? Each time you start the run, the script performs a Monte Carlo search and will therefore find different values. The parameters are the right ones; just remove the 'c' in front of every element and you have the correct model parameters. I don't know what the 'c' stands for, but in the script they perform the Monte Carlo search in the first part, then copy the parameters of the best run into these variables starting with 'c' and do another model run to generate the final qsim. If I'm not completely mistaken; it's been a long time since I worked myself through this (ugly..) code to understand their model.
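
Just to make the mapping concrete, something like this is what I mean (purely illustrative: the order follows the [cd;cfc;cbeta;cc;ck0;cl;ck1;ck2;ckp;cpwp] vector you posted, and the names may still differ from the ones used in hbvedu.py):

```python
# Illustrative only: pair the posted results.mp values with the MATLAB
# variable names, with the leading 'c' stripped off.
mp = [5.5529, 193.88, 1.7367, 0.019986, 0.054889,
      3.5749, 0.09739, 0.024421, 0.02473, 94.554]
names = ["d", "fc", "beta", "c", "k0", "l", "k1", "k2", "kp", "pwp"]
params = dict(zip(names, mp))
print(params)  # e.g. {'d': 5.5529, 'fc': 193.88, ...}
```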

I won't be able to finish the test function today, but hopefully tomorrow. It would be nice if you could check whether the parameters are from the same run, and if not, maybe you could run the script again, write down the parameters, and upload the corresponding qsim again?

Many thanks in advance!

@amacd31 commented Feb 9, 2018

It did occur to me on the way into work today that if the script was calibrating the parameters as part of the run, there might have been a random number component involved. The qsim values I uploaded previously were indeed from a different run than the parameters I later posted.

I'll run it again and supply both the parameters and the qsim data from the same run. I don't have access here to the laptop I was running this on and might not have time to re-run it this evening, but I should have time tomorrow, so it might have to wait until then.

@kratzert (Owner, Author) commented Feb 9, 2018

Okay, perfect, and there's no rush, so take your time. Many thanks again for providing the necessary data!

@amacd31 commented Feb 10, 2018

Not a problem, I was able to have a look at this again this morning.

HBV_results.zip contains the simulation results (Qsim.csv) and the parameter set (parameters.csv) used (and generated) by the same run.

These results were produced with rng(2); specified on line 40 of the HBV.m script, so they can be reproduced if required.

@kratzert (Owner, Author) commented Feb 10, 2018

Thanks @amacd31,

just one last thing ;) I just noticed that in the Qsim file, MATLAB seems to cut off the decimals after the fourth position (sometimes earlier... maybe because there are only zeros). Is it possible for you to save the values in full precision?
It seems like there is a small difference, and I'm unsure if it is caused by the way MATLAB cuts off the decimals when storing the csv file. Even if I round my values to the fourth decimal digit, a few data points differ by 0.0001, which makes the test for similar vectors fail. The total difference between the vectors is negligible, so the model should be perfectly fine, but I would love to use the np.allclose() test function as for the other models.

So if you could take one last look at MATLAB and try to save the Qsim values as precisely as possible, it would be a huge help!
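
Just to illustrate what I mean, here is a tiny standalone example (with made-up numbers, not values from the Qsim file) of how truncation after the fourth decimal can already break np.allclose with its default tolerances:

```python
import numpy as np

# Made-up values standing in for two qsim series that actually agree.
full = np.array([0.12345678, 1.23456789, 12.3456789])

# Mimic a CSV that only keeps four decimal places.
truncated = np.round(full, 4)

# With the default tolerances (rtol=1e-5, atol=1e-8), the rounded
# values already fall outside what np.allclose accepts.
print(np.allclose(full, truncated))  # -> False
```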

@kratzert (Owner, Author) commented

Hm, I just wondered: maybe we should fix the parameters, because the same rounding problem could affect the parameter values you read from the workspace.

Could you fix the c* values in lines 214 to 223 to the following values and then copy the qsim?

cd=4.25
cfc=177.1
cbeta=2.35
cc=0.02
ck0=0.05
cl=4.87
ck1=0.03
ck2=0.02
ckp=0.05
cpwp=105.89

These are nearly the values you posted in your file, just rounded to the second decimal digit.

@amacd31 commented Feb 10, 2018

Well, today I learnt that MATLAB's csvwrite limits the precision of its output by default. This time around I instead used the dlmwrite function, specifying a precision of 12 decimal places.

The new higher_precision_results.zip contains higher-precision Qsim results using the above parameters (see the .truncated.csv files), as well as higher-precision results calibrated at the same time (using random seed 2, see .more_precision.csv).

@kratzert (Owner, Author) commented

Thanks @amacd31

as I thought, it was a rounding problem, and the similarity test now passes. Thank you very much for helping me out with this.

I added you to the list of contributors. If you have any recommendations for further implementations or models you would like to see within this library, let me know!

@amacd31 commented Feb 12, 2018

Not a problem, happy to have helped.
