Warning for DTI normalization #451
The static `min_signal` option, set to 1 in the DTI fitting, causes problems for normalized data: every signal value gets clipped up to 1.
It would be a good idea to add a warning here that checks whether the data are normalized. Otherwise people might be confused as to why all their eigenvalues are constant, when it is actually a setting in the fitting process that causes this.
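A minimal sketch of what such a check could look like, assuming only NumPy. The helper name `check_min_signal` is hypothetical, not part of dipy; it just illustrates the idea that if every voxel is at or below `min_signal`, clipping flattens the whole volume to a constant and the fit degenerates.

```python
import warnings
import numpy as np

def check_min_signal(data, min_signal=1.0):
    # Hypothetical helper: warn when the data look normalized, i.e. every
    # signal value is <= min_signal, so clipping at min_signal would
    # replace the entire volume with a constant.
    if np.all(data <= min_signal):
        warnings.warn(
            "All signal values are <= min_signal (%g); the data may be "
            "normalized. Fitting the clipped signal will produce "
            "constant eigenvalues." % min_signal
        )

# Normalized signals in (0, 1]: clipping at min_signal=1 makes them constant.
sig = np.array([0.2, 0.5, 0.9])
check_min_signal(sig)            # emits the warning
print(np.clip(sig, 1.0, None))   # [1. 1. 1.] -- everything collapses to 1
```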
Thanks for the suggestion. I've run into this problem myself in some simulations I was doing. I took a look around the code and I don't think it would be too hard to implement. You'll need to set a default min_signal value upfront for the
Another option is to change min_signal to something much smaller than 1, say 0.0001. That might still work OK, no?
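A quick NumPy sketch of why a much smaller floor helps, using `np.clip` as a stand-in for the clipping step in the fit (the exact mechanism inside dipy may differ):

```python
import numpy as np

# Normalized diffusion signal in (0, 1].
sig = np.array([0.2, 0.5, 0.9])

# A floor of 1 flattens everything to a constant; a floor of 1e-4 leaves
# normalized values untouched while still guarding against zeros/negatives.
print(np.clip(sig, 1.0, None))   # [1. 1. 1.]
print(np.clip(sig, 1e-4, None))  # [0.2 0.5 0.9]
```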