
Finish solarcalcs, RSMDef (init) translation and corresponding tests. New tests comparing UWG_Python values against UWG_Matlab. #23

Merged · 33 commits · Dec 28, 2017

Conversation

saeranv
Member

@saeranv saeranv commented Dec 28, 2017

Apologies @chriswmackey, this is another huge PR. I'll aim to send more manageable PRs from here on out! This PR includes the translation and testing of solarcalcs and the RSMDef initialization methods, tests comparing UWG_Python against UWG_Matlab, and the re-exported readDOE csv reference files.

With regards to the second point, I have written a (partially automated) testing method that runs, saves, and compares values from UWG_Matlab and UWG_Python. You can see this in the tests for the solarcalcs, readDOE, and simparam modules (i.e. saving and comparing). This has made the translation process a lot quicker, since I don't need to manually write tests all the time to find errors. It also means we have massive .txt files with the original UWG_Matlab values that I'm running the tests against, which is one reason this PR is huge.
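The save-and-compare idea above can be sketched roughly like this. This is a hypothetical illustration, not the actual test code in the repo: the reference values are assumed to be dumped from UWG_Matlab as one number per line, and the tolerance and file contents here are made up.

```python
# Hypothetical sketch of the save-and-compare testing approach: UWG_Matlab
# values are saved to a text file (one number per line), and the Python
# test reads them back and compares element-wise against UWG_Python output.
import io

def compare_values(matlab_lines, python_values, rel_tol=1e-10):
    """Return the indices where the two value lists disagree beyond rel_tol."""
    mismatches = []
    for i, (line, py_val) in enumerate(zip(matlab_lines, python_values)):
        mat_val = float(line)
        denom = max(abs(mat_val), abs(py_val), 1e-30)  # guard against /0
        if abs(mat_val - py_val) / denom > rel_tol:
            mismatches.append(i)
    return mismatches

# Simulated contents of a saved Matlab reference file:
matlab_file = io.StringIO("5.670373000000000e-08\n1.234567890123456e+02\n")
python_results = [5.670373e-08, 123.4567890123456]
print(compare_values(matlab_file.readlines(), python_results))  # → []
```

A list of mismatching indices (rather than a bare pass/fail) makes it easy to trace a translation error back to the exact line of the reference dump.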

Finally, the bulk of this PR (which deceptively looks huge) is because I re-exported the DOE building reference csv files from UWG_Matlab's Excel sheets. While running the UWG_Matlab vs UWG_Python tests, I discovered that when I originally exported the DOE building reference csv files, the number values were rounded to about 5 significant digits, whereas the UWG_Matlab version rounds to 16 significant digits. I had to manually set the rounding precision in Excel and re-export. This seems to have fixed the majority of the problem. However, I find that occasionally I am losing precision when I convert a csv string to a float, so I still have to figure out why that is occurring before readDOE.py matches readDOE.m exactly.
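The export-precision issue above is easy to reproduce. The value below is illustrative (not taken from the actual csv files): a number written out with only ~5 significant digits cannot round-trip back to the original double, while one written with 17 digits can.

```python
# Illustrates the csv precision issue: exporting with too few significant
# digits loses information; 17 digits round-trips a double exactly.
original = 5.670373e-08

low_precision = "%.4e" % original    # 5 sig figs, like the first export
high_precision = "%.16e" % original  # 17 sig figs, like the re-export

print(float(low_precision) == original)   # → False (precision lost)
print(float(high_precision) == original)  # → True  (round-trips exactly)
```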

@saeranv saeranv merged commit 9e6bee8 into ladybug-tools:master Dec 28, 2017
@chriswmackey
Member

@saeranv ,

This is all great to see and, for the record, large PRs are fine. I know to check with you these days before adding anything myself to this repo.
It's great to know that you have the testing implemented; that should make things a lot smoother going forward.

With regards to the rounding accuracy, correcting Excel's 5-significant-digit export seems important, but I wouldn't worry about the changes that happen when you convert a string to a float. Double-precision floats only carry about 15 to 17 significant decimal digits, and the exact digits you get can shift depending on whether the value is exactly representable in binary and on any math operations performed with it. This PBS video is useful for understanding how floating-point numbers work:
https://www.youtube.com/watch?v=pQs_wx8eoQ8
The point is that I wouldn't worry about these slight changes from strings to floats. There are more than enough significant figures in a float to give us decently accurate results.
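A quick demonstration of the limits described above, using only the standard library: doubles guarantee 15 decimal digits, and ordinary arithmetic can perturb the trailing ones.

```python
# Demonstrates double-precision limits: ~15 guaranteed decimal digits,
# with tiny rounding artifacts introduced by ordinary arithmetic.
import sys

print(sys.float_info.dig)               # → 15 (decimal digits always preserved)
print(0.1 + 0.2 == 0.3)                 # → False (classic rounding artifact)
print(abs((0.1 + 0.2) - 0.3) < 1e-15)   # → True  (the error is tiny)
```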

@saeranv
Member Author

saeranv commented Dec 30, 2017

Agreed @chriswmackey, I think I'm going to let this problem go for now, as it seems to be within an acceptable range of inaccuracy. I was seeing errors around the 11th digit of precision, which should be okay to tolerate. For some context, I have been using the Stefan-Boltzmann constant, 5.67e-08 (which requires about 10 digits of precision to write out), as a rule of thumb for the kind of rounding accuracy we should hold to in the code; otherwise it's meaningless to even have these kinds of high-precision constants in the code at all.
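The tolerance rule of thumb above could be encoded directly in the comparison tests. This is a hypothetical sketch (the function name and test values are made up): `math.isclose` with a relative tolerance around 1e-11 accepts the observed 11th-digit errors while still being far tighter than the precision of the constants themselves.

```python
# Sketch of the rule of thumb: accept Python-vs-Matlab differences only
# beyond roughly the 11th significant digit.
import math

def values_match(py_val, matlab_val, rel_tol=1e-11):
    """True if two values agree to about 11 significant digits."""
    return math.isclose(py_val, matlab_val, rel_tol=rel_tol)

# Agreement to ~15 digits passes; a difference in the 9th digit fails.
print(values_match(123.456789012345, 123.456789012346))  # → True
print(values_match(123.456789012345, 123.456799012345))  # → False
```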
