Improve delimiter on read_bvals_bvecs() #1417

skoudoro opened this Issue Feb 6, 2018 · 1 comment



skoudoro commented Feb 6, 2018


As we can see below, bvals files that are not space-delimited are not handled. It seems some bvals files are space-delimited, some are comma-delimited (older ones?), and some are tab-delimited.

It should be easy to detect the delimiter (look here) and update read_bvals_bvecs() accordingly.
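A minimal sketch of what such delimiter detection could look like (the function name `read_bvals_autodelim` is hypothetical, not dipy's actual API; real bvals files may need more robust sniffing, e.g. via `csv.Sniffer`):

```python
import io

import numpy as np


def read_bvals_autodelim(fname):
    # Hypothetical helper: pick the delimiter before handing the
    # text to numpy, instead of assuming whitespace separation.
    with open(fname) as f:
        text = f.read()
    # Commas take priority; tabs and spaces both count as whitespace,
    # which np.loadtxt handles with delimiter=None.
    delimiter = ',' if ',' in text else None
    return np.loadtxt(io.StringIO(text), delimiter=delimiter)
```

With this approach, the comma-delimited file from the traceback below would parse cleanly, while existing space- and tab-delimited files keep working unchanged.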

INFO:Computing DTI metrics for ../5a79de2c8e2185002a44906a/59b9ac03620ac10cb3e05d2c/dwi.nii.gz
INFO:Tensor estimation...
Traceback (most recent call last):
  File "/usr/local/bin/dipy_reconst_dti", line 9, in <module>
  File "/usr/local/lib/python2.7/dist-packages/dipy/workflows/", line 80, in run_flow
  File "/usr/local/lib/python2.7/dist-packages/dipy/workflows/", line 106, in run
  File "/usr/local/lib/python2.7/dist-packages/dipy/workflows/", line 175, in get_fitted_tensor
    bvals, bvecs = read_bvals_bvecs(bval, bvec)
  File "/usr/local/lib/python2.7/dist-packages/dipy/io/", line 41, in read_bvals_bvecs
  File "/usr/local/lib/python2.7/dist-packages/numpy/lib/", line 1024, in loadtxt
    items = [conv(val) for (conv, val) in zip(converters, vals)]
  File "/usr/local/lib/python2.7/dist-packages/numpy/lib/", line 725, in floatconv
    return float(x)
ValueError: invalid literal for float(): 0,1000,1000,1000,1000,1000,0,1000,1000,1000,1000,1000,0,1000,1000,1000,1000,1000,0,1000,1000,1000,1000,1000,0,1000,1000,1000,1000,1000,0,1000,1000,1000,1000,1000,0,1000,1000,1000,1000,1000,0,1000,1000

@skoudoro skoudoro changed the title from Improve `read_bvals_bvecs()` to Improve delimiter on read_bvals_bvecs() Feb 6, 2018




skoudoro commented Feb 17, 2018

This issue will be resolved soon @abhayjindal51; there is already an open PR (#1422).
