BNSE initialization is fragile #19
Thanks for the bug report. Could you check what the scale of your x-axis is? In particular, what is the range of the x values, and what is the (smallest) difference between them? We should be hardening the BNSE code to make sure it works for all types of scales. In the meantime, you could try rescaling your x-axis so that it is roughly in the 0–1000 range; we're working on a solution for x-axis scaling.
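For example, a plain min-max rescale is enough (just a sketch with made-up values, not a library-specific call):

```python
import numpy as np

# Hypothetical x values on an arbitrary scale (e.g. Unix timestamps).
x = np.linspace(1.55e9, 1.56e9, 500)

# Min-max rescale into roughly [0, 1000] before building the channels,
# so BNSE sees well-conditioned input spacings.
x_scaled = (x - x.min()) / (x.max() - x.min()) * 1000.0
```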
My inputs, both in my sample code and in the private dataset that I mentioned, are already well within the 0–1000 range.
I also encountered this problem. Is there a better solution? In my case, it often occurs when each channel has more than two input dimensions and the number of samples per channel is small (about 200).
Thanks for the feedback. We're in the process of moving to PyTorch and improving the library overall. Part of that will be rewriting the BNSE initialization and testing its stability. In particular, we'll explicitly test more than two input dimensions and prevent it from returning NaN or Inf values or raising out-of-bounds errors. We're well on our way to the next version, but it will still take a few weeks to iron out all the details.
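As a rough sketch of the kind of guard we have in mind (illustration only, with made-up names, not the library's actual code):

```python
import numpy as np

def check_spectral_estimate(freqs, magnitudes):
    """Reject hypothetical BNSE outputs that are empty or contain NaN/Inf,
    instead of letting them propagate into the kernel initialization."""
    for name, arr in (("frequencies", freqs), ("magnitudes", magnitudes)):
        arr = np.asarray(arr)
        if arr.size == 0:
            raise ValueError(f"BNSE returned no {name}")
        if not np.all(np.isfinite(arr)):
            raise ValueError(f"BNSE {name} contain NaN or Inf values")
    return freqs, magnitudes
```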
BNSE has been rewritten in PyTorch in 0c4f80b, which might fix this issue.
I'm going to close this issue, since the first error is not thrown when running the code you provided, and the second error involving …
Here's the test code that I used to find one problem:
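(The original snippet was not preserved in this copy of the thread; the sketch below is only a reconstruction of the kind of setup described, and the library name, model class, and `init_parameters` call are assumptions that may differ by version.)

```python
import numpy as np
import mogptk  # assumption: the multi-output GP toolkit this issue is filed against

num_inp_comps = 3   # number of input dimensions per channel (parameter name referenced below)
n_samples = 200     # roughly the sample count mentioned later in the thread

# Two channels with multi-dimensional inputs and noisy sinusoidal outputs.
dataset = mogptk.DataSet()
for ch in range(2):
    x = np.random.rand(n_samples, num_inp_comps) * 1000.0
    y = np.sin(x.sum(axis=1) / 100.0) + 0.1 * np.random.randn(n_samples)
    dataset.append(mogptk.Data(x, y, name=f"channel_{ch}"))

# The failure is reported to occur during BNSE-based initialization;
# the model class and method name here are assumptions, not the exact original code.
model = mogptk.MOSM(dataset, Q=2)
model.init_parameters(method='BNSE')
```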
The results of running this are as follows:
Unfortunately, reducing `num_inp_comps` to 1 doesn't fix this problem. I've also had different errors with another dataset, which look like this:
Unfortunately, I'm not at liberty to release that other dataset, and I haven't been able to find a test dataset that reproduces the above error.