
Empty arrays and out of bounds errors on big datasets #274

Open
azjthomasfamily opened this issue Feb 1, 2014 · 1 comment

Comments

@azjthomasfamily
Contributor

First of all, thanks for providing a great product. We are currently evaluating the server for our own metrics and ran into a snag: any dataset with more than 32,767 matched rows results in an out-of-bounds error. We went ahead and changed row_index from a short to an int, and also changed some of the other variables that produced loss-of-precision errors to int. This fixed the issue for us, and we can now run results for all 14 years of data. I'd submit a patch, but I'm not well versed in Java and don't want to miss any glaring details. Thanks again!
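For anyone hitting the same wall: the 32,767 cutoff is exactly `Short.MAX_VALUE`, so a `short` index wraps negative on the next increment and then blows up as an array index. The sketch below is illustrative only (the names `shortIndex`/`intIndex` are mine, not OpenTSDB's actual fields), but it shows why widening the counter to `int` fixes it:

```java
// Demonstrates the overflow behind the out-of-bounds error: a short
// row counter wraps negative once it passes 32,767 matched rows.
// Hypothetical variable names; not OpenTSDB's actual code.
public class RowIndexOverflow {
    public static void main(String[] args) {
        short shortIndex = Short.MAX_VALUE; // 32,767 — the last valid short value
        shortIndex++;                       // wraps around to -32,768
        System.out.println(shortIndex);     // a negative value: invalid as an array index

        int intIndex = Short.MAX_VALUE;     // same starting point, widened to int
        intIndex++;                         // 32,768 — int holds up to 2,147,483,647
        System.out.println(intIndex);       // increments normally, no wrap
    }
}
```

So the fix in the report above (changing `row_index` and the other narrowed variables from `short` to `int`) raises the ceiling from ~32 K rows to ~2.1 billion.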

@manolama
Member

Go ahead and issue a pull request so you can get credit for the fix. I can review and help clean it up. And please sign http://opentsdb.net/contributing.html if you haven't already.
