Commit
Merge pull request #111 from brianqiu/dev2
new data pull in & validate
briankleinqiu committed Dec 12, 2015
2 parents 349e1b8 + d2dcb63 commit a2565ae
Showing 5 changed files with 31 additions and 37 deletions.
36 changes: 9 additions & 27 deletions data/Makefile
@@ -1,35 +1,17 @@
data:
wget http://openfmri.s3.amazonaws.com/tarballs/ds005_raw.tgz
<<<<<<< HEAD

wget http://nipy.bic.berkeley.edu/rcsds/ds005_mnifunc.tar
wget http://nipy.bic.berkeley.edu/rcsds/mni_icbm152_nlin_asym_09c_2mm.tar.gz

validate:
python data.py

unzip:
tar -xvzf ds005_raw.tgz
for i in {1..9}
do
for j in {1..3}
do
gunzip ds005/sub00${i}/BOLD/task001_run00${j}/bold.nii.gz
done
done

for i in {10..16}
do
for j in {1..3}
do
gunzip ds005/sub0${i}/BOLD/task001_run00${j}/bold.nii.gz
done
done

=======
tar -xvzf ds005_raw.tgz
#wget http://nipy.bic.berkeley.edu/rcsds/ds005_mnifunc.tar
#tar -xvf ds005_mnifunc.tar
tar -xvf ds005_mnifunc.tar
tar -xvzf mni_icbm152_nlin_asym_09c_2mm.tar.gz
rm ds005_raw.tgz
#rm ds005_mnifunc.tar

validate:
python data.py
>>>>>>> a96098ccbb47c304f972e54f9161165806dc04f1
rm ds005_mnifunc.tar
rm mni_icbm152_nlin_asym_09c_2mm.tar.gz
mv mni_icbm152_nlin_asym_09c_2mm templates
mv templates/mni_icbm152_t1_tal_nlin_asym_09c_2mm.nii templates/mni_standard.nii
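
The gunzip loops in the Makefile decompress each subject's BOLD runs, with separate loops for one-digit (sub00${i}) and two-digit (sub0${i}) subject numbers. A hedged Python equivalent of that step, assuming the ds005 layout shown above; the helper name gunzip_bold_runs is hypothetical:

import gzip
import os
import shutil

def gunzip_bold_runs(root="ds005"):
    # Decompress root/subXXX/BOLD/task001_runYYY/bold.nii.gz for
    # subjects 1-16 and runs 1-3, using one zero-padded pattern
    # instead of separate loops for subjects 1-9 and 10-16.
    for sub in range(1, 17):
        for run in range(1, 4):
            gz_path = os.path.join(
                root, "sub%03d" % sub, "BOLD", "task001_run%03d" % run, "bold.nii.gz")
            if not os.path.exists(gz_path):
                continue
            with gzip.open(gz_path, "rb") as f_in, open(gz_path[:-3], "wb") as f_out:
                shutil.copyfileobj(f_in, f_out)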
18 changes: 18 additions & 0 deletions data/README.md
@@ -0,0 +1,18 @@
The ds005 dataset, the filtered ds005 dataset, and the MNI templates are stored here. The Makefile is written such that:

- 'make data' will pull in the appropriate data
- 'make unzip' will unzip, remove, and rename certain files
- 'make validate' will run data.py to check the hash of each downloaded file against the included master hash list, ensuring all downloaded data is correct (a minimal sketch of this check follows this section)

THE COMMANDS SHOULD BE RUN IN THIS ORDER to validate successfully. The ds005 folder contains subfolders for each subject, the most relevant of which are:

- BOLD: raw data of the fMRI scans for each of the subject's three runs, as well as displacement/variance data
- behav: a file for each run containing the onsets, potential gains, potential losses, and response of each trial
- model: filtered, processed data of the fMRI scans for each of the subject's three runs, plus the onset files
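
As the README notes, 'make validate' runs data.py to compare each downloaded file's hash against the master hash list (total_hash.txt after this commit). The diff below shows only fragments of data.py, so the following is a standalone sketch of that check, assuming the hash list is a JSON object mapping file paths to MD5 hex digests; the digest algorithm and the exact behavior of the real check_hashes in data.py are assumptions here.

import hashlib
import json

def check_hashes(expected):
    # 'expected' maps file paths to known-good MD5 hex digests,
    # e.g. the dict loaded from total_hash.txt with json.load (assumed format).
    all_good = True
    for path, known in expected.items():
        md5 = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                md5.update(chunk)
        if md5.hexdigest() != known:
            print("hash mismatch: " + path)
            all_good = False
    return all_good

For example, check_hashes(json.load(open("total_hash.txt"))) would be run after 'make data' and 'make unzip'.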
11 changes: 3 additions & 8 deletions data/data.py
@@ -1,5 +1,4 @@
from __future__ import print_function, division
import os
import hashlib
import json

@@ -48,8 +47,7 @@ def check_hashes(d):
ex: make_hash_list("ds005", "temp") makes hashlist for all of ds005
including subdirectories
"""

"""
"""
file_paths = []
for path, subdirs, files in os.walk(directory):
for name in files:
@@ -58,12 +56,9 @@ def check_hashes(d):
with open(title, 'w') as outfile:
json.dump(dictionary, outfile)
return dictionary
"""
"""

if __name__ == "__main__":
with open('hashList.txt', 'r') as hl:
with open('total_hash.txt', 'r') as hl:
d = json.load(hl)
check_hashes(d)
#with open('new_hashList.txt', 'r') as hl2:
# data = json.load(hl2)
#check_hashes(data)
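
The truncated hunks above show only the tail of make_hash_list (the os.walk loop and the json.dump of the resulting dictionary) together with its docstring example make_hash_list("ds005", "temp"). A possible reconstruction of that helper, consistent with those fragments; the MD5 digest and whole-file read are assumptions, not the confirmed implementation in data.py:

import hashlib
import json
import os

def make_hash_list(directory, title):
    # Walk 'directory', hash every file, and write the path -> digest
    # mapping to the file 'title' as JSON, returning the dict as well.
    dictionary = {}
    for path, subdirs, files in os.walk(directory):
        for name in files:
            file_path = os.path.join(path, name)
            with open(file_path, "rb") as f:
                dictionary[file_path] = hashlib.md5(f.read()).hexdigest()
    with open(title, "w") as outfile:
        json.dump(dictionary, outfile)
    return dictionary

Per the docstring, make_hash_list("ds005", "temp") would rebuild the hash list for all of ds005, including subdirectories.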
1 change: 0 additions & 1 deletion data/hashList.txt

This file was deleted.

2 changes: 1 addition & 1 deletion data/new_hashList.txt → data/total_hash.txt

Large diffs are not rendered by default.
