Comments (14)
Not a problem, happy to have helped.
from rrmpg.
I don't have it anymore :( - Maybe we need to re-install it...
Maybe it's still installed on one of the computers in the master students' office ><
I have a copy of Matlab on a rather old laptop and seem to have been able to run the HBV Matlab code downloaded from here: http://amir.eng.uci.edu/software.php (Specifically http://amir.eng.uci.edu/downloads/HBV.zip).
Qsim.zip contains a CSV file that is the result of adding the following to the end of HBV.m:
csvwrite('Qsim.csv', Qsim);
If this looks like the kinds of results you are after let me know if you'd like me to run again with different parameters or input data.
Hi @amacd31,
this would be exactly what I'm looking for. Additionally, I would need the values of the model parameters that were used to generate the qsim time series. If I'm not wrong, they are stored in the result structure under result.mp. It would be enough if you could paste them here (as plain text or as a screenshot).
Many thanks in advance
That makes sense, I'm not particularly familiar with Matlab nor have I looked into HBV much before.
The contents of results.mp are:
5.5529 193.88 1.7367 0.019986 0.054889 3.5749 0.09739 0.024421 0.02473 94.554
Which from looking at the code seems to be set by: [cd;cfc;cbeta;cc;ck0;cl;ck1;ck2;ckp;cpwp]
Those names don't seem to match the parameters in the hbvedu.py code here, so I'm not sure if this is the information you were after. Let me know if you'd like me to dig out any other information.
@amacd31 thanks, but just to be sure: are these the values from the same script run as the qsim you uploaded? Each time you start the run, the script performs a Monte Carlo search and will therefore find different values. The parameters are the right ones; just remove the 'c' in front of every element and you have the correct model parameter names. I don't know what the c stands for, but in the script they perform the Monte Carlo search in the first part, then copy the parameters of the best run into these c-prefixed variables and do another model run to generate the final qsim. If I'm not completely mistaken - it's already been a long time since I worked myself through this (ugly..) code to understand their model.
I won't be able to finish the test function today, but hopefully tomorrow. It would be nice if you could check whether the parameters are from the same run; if not, could you run the script again, write down the parameters, and upload the corresponding qsim?
Many thanks in advance!
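To make the renaming concrete, here is a minimal Python sketch. The values are the ones posted earlier in this thread; treating the leading 'c' as a prefix to strip is an assumption based on the description above.

```python
# Values posted above, keyed by the c*-prefixed MATLAB variable names.
matlab_values = {
    "cd": 5.5529, "cfc": 193.88, "cbeta": 1.7367, "cc": 0.019986,
    "ck0": 0.054889, "cl": 3.5749, "ck1": 0.09739, "ck2": 0.024421,
    "ckp": 0.02473, "cpwp": 94.554,
}

# Drop only the first character, never more (lstrip("c") would also
# eat the second 'c' of "cc" and produce an empty key).
params = {name[1:]: value for name, value in matlab_values.items()}
```

Whether these stripped names match the parameter names in hbvedu.py would still need to be checked against that code.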
It did occur to me on the way into work today that if it was calibrating the parameters as part of the run there might have been a random number component. The qsim values I uploaded previously were from a different run to the parameters I later posted.
I'll run again and supply both the parameters and qsim data from the same run. I don't have access to the laptop I was running this on here and might not have time to re-run this evening, but I should have time tomorrow so it might have to wait until then.
Okay, perfect - no stress, take your time. And many thanks again for providing the necessary data!
Not a problem, I was able to have a look at this again this morning.
HBV_results.zip contains simulation results (Qsim.csv) and the parameter set (parameters.csv) used (and generated) from the same run.
These results were produced with rng(2); specified on line 40 of the HBV.m script, so they can be reproduced if required.
Thanks @amacd31,
just one last thing ;) I just noticed that in the Qsim file, MATLAB seems to cut off decimal numbers after the fourth position (sometimes earlier... maybe because there are only zeros). Would it be possible to save the numbers in full precision?
There seems to be a small difference, and I'm unsure whether it comes from the way MATLAB cuts off the decimals when storing the CSV file. Even if I round my values to the fourth decimal digit, a few data points differ by 0.0001, which makes the test for similar vectors fail. The total difference between the vectors is negligible, so the model should be perfectly fine, but I would love to use the np.allclose() test function as for the other models.
So if you could take one last look at MATLAB and try to save the Qsim output as precisely as possible, it would be a huge help!
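A small sketch of the failure mode described here. The sample values are invented; only the default tolerances of np.allclose() are real.

```python
import numpy as np

# An invented "full precision" simulation and its 4-decimal CSV version.
qsim_full = np.array([0.123456789, 2.000049999, 0.000012345])
qsim_rounded = np.round(qsim_full, 4)  # what the truncated CSV stores

# Default tolerances (rtol=1e-5, atol=1e-8) are much tighter than the
# up-to-5e-5 error introduced by rounding to four decimals, so the
# comparison fails even though the vectors are practically identical.
close_default = np.allclose(qsim_full, qsim_rounded)

# Loosening atol to the size of the rounding error makes it pass.
close_loose = np.allclose(qsim_full, qsim_rounded, atol=1e-4)
```

Loosening the tolerance would be a workaround; comparing against a full-precision export, as requested above, keeps the test consistent with the other models.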
Hm, I just wondered: maybe we should fix the parameters, because the same rounding problem could affect the parameter values you read from the workspace.
Could you set the c* values in lines 214 to 223 to the following and then copy the resulting qsim?
cd=4.25
cfc=177.1
cbeta=2.35
cc=0.02
ck0=0.05
cl=4.87
ck1=0.03
ck2=0.02
ckp=0.05
cpwp=105.89
These are nearly the values you posted in your file, just rounded to the second decimal digit.
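For the test on the Python side, the same fixed values could be kept as a dict. The names here simply drop the 'c' prefix, which is an assumption; hbvedu.py may use different parameter names.

```python
# Fixed calibration values from the list above, 'c' prefix dropped.
fixed_params = {
    "d": 4.25, "fc": 177.1, "beta": 2.35, "c": 0.02, "k0": 0.05,
    "l": 4.87, "k1": 0.03, "k2": 0.02, "kp": 0.05, "pwp": 105.89,
}
```

Hard-coding the values on both sides avoids any round-trip through the MATLAB workspace display, which is the rounding risk mentioned above.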
Well, today I learnt that Matlab's csvwrite by default reduces the precision of its output. This time around I instead used the dlmwrite function, specifying a precision of 12 decimal places.
The new higher_precision_results.zip contains higher-precision Qsim results using the above parameters (see the .truncated.csv files), as well as higher-precision results calibrated in the same run using random seed 2 (see the .more_precision.csv files).
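A rough Python analogue of the precision difference: np.savetxt's fmt plays the role of dlmwrite's precision option. The format strings and sample values are assumptions for illustration, not what the MATLAB functions emit exactly.

```python
import numpy as np

qsim = np.array([1.23456789012345, 0.000123456789])

# Coarse output, roughly like csvwrite's default short format.
np.savetxt("qsim_low.csv", qsim, fmt="%.5g")
# High-precision output, roughly like dlmwrite(..., 'precision', 12).
np.savetxt("qsim_high.csv", qsim, fmt="%.12g")

low = np.loadtxt("qsim_low.csv")
high = np.loadtxt("qsim_high.csv")
```

With the coarse format the round-trip loses enough precision to fail np.allclose() at default tolerances, while the 12-digit format survives the round-trip.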
Thanks @amacd31
as I thought, it was a rounding problem, and the similarity test now passes. Thank you very much for helping me out with this.
I added you to the list of contributors. If you have any recommendations for further implementations or models you would like to see within this library, let me know!