SpaceNet (this PR succeeds PR #219) #657

Merged: 836 commits, Jul 28, 2015

Commits
8ca044a
RENAME: tvl1 -> tv-l1
dohmatob Oct 14, 2014
65f5caf
ENH: renamed *spacenet.py -> *space_net.py
dohmatob Oct 14, 2014
4c2840c
ENH: _crop_mask(...): new function to crop mask
dohmatob Oct 14, 2014
1533924
ENH: tiny fixes on _crop_mask + test cases for this new function
dohmatob Oct 14, 2014
c006de5
TYPO: self.i_alpha_ = self.i_alpha_ -> self.i_alpha_ = self.i_alpha_[0]
dohmatob Oct 14, 2014
3063b37
- ENH: explicit verbose param in path_scores(...) function
dohmatob Oct 15, 2014
f66bc33
ENH: _fast_smooth_array -> scipy.ndimage.gaussian_filter
dohmatob Oct 15, 2014
c9cf969
unused import
dohmatob Oct 15, 2014
dea6589
unused import
dohmatob Oct 15, 2014
da010e2
ENH: in haxby demo: splitting data into train (10 sessions) and test…
dohmatob Oct 15, 2014
a259fb0
ENH: changed screening_percentile from 10 to 20 (percentage full-brain)
dohmatob Oct 15, 2014
6dcb129
BF: horrible classification scores due to unstandardized data; standa…
dohmatob Oct 18, 2014
eaf476d
CLEANUP: confusing docstring about tol in mfista
dohmatob Oct 18, 2014
5bdfee4
CLEANUP: some errors in docstrings
dohmatob Oct 18, 2014
e6d10ab
BF: couple of tiny bugs
dohmatob Oct 18, 2014
4207eb7
CLEANUP: minor troubles
dohmatob Oct 18, 2014
fdc2117
more authors
dohmatob Oct 18, 2014
fa8975e
printint more results in haxby demo
dohmatob Oct 18, 2014
275587c
cosmetic change
dohmatob Oct 18, 2014
0327958
ENH: not screening if mask has as few as 1000 features in its support
dohmatob Oct 18, 2014
afbfaa0
ENH: better var names + tiight_mask -> mask
dohmatob Oct 18, 2014
9b7d4af
CLEANUP: junk _ovr_y function in SpaceNet
dohmatob Oct 18, 2014
1b306ac
CLEANUP: fex typos
dohmatob Oct 19, 2014
f29ef58
CLEANUP: usless int casting
dohmatob Oct 19, 2014
dd4fde0
CLEANUP: tiny fixups about objective, cost, energy stuff
dohmatob Oct 19, 2014
6c54bed
CLEANUP: corrected some docstrings about numpy ndarray shapes
dohmatob Oct 19, 2014
5a5107b
* ENH: (gap ** 2).sum() -> np.dot(gap, gap)
dohmatob Oct 19, 2014
ad5938a
ENH: input_img_norm = (input_img.ravel() * input_img.ravel()).sum() -…
dohmatob Oct 19, 2014
659c5d9
ENH: np.sqrt(float) -> math.sqrt(float)
dohmatob Oct 19, 2014
4a0dec5
cosmetic fixups about docstrings etc.
dohmatob Oct 19, 2014
a61efdf
CLEANUP: more corrections on docstrings about numpy ndarray shapes
dohmatob Oct 19, 2014
1d4ba7b
ENH: printing elapsed time in minutes too
dohmatob Oct 19, 2014
0bda401
BF: broken commit
dohmatob Oct 19, 2014
deebcaf
addressing @agramfort's comments
dohmatob Oct 20, 2014
ce08d7a
cosmetic changes to address comments
dohmatob Oct 20, 2014
5f5c968
inserted _ in name of test case
dohmatob Oct 20, 2014
04fd579
explict testcase name
dohmatob Oct 20, 2014
50d8458
ENH: not using relative imports in tests
dohmatob Oct 20, 2014
3ca7ba2
invalide -> invalid
dohmatob Oct 20, 2014
eb6690e
more docstrings fixed
dohmatob Oct 20, 2014
2363178
ENH: headline in mfista docstring
dohmatob Oct 20, 2014
a5ebb09
REFACTOR: factored out code for univariate screening into separate fu…
dohmatob Oct 20, 2014
bb7e03e
intercepted_prox_l1 -> prox_l1_with_intercept
dohmatob Oct 20, 2014
fbcf0fa
still fixing minor issues
dohmatob Oct 20, 2014
c8eb247
avoiding ravelling twice by using a view instead
dohmatob Oct 20, 2014
0e9f2dd
repairing broken testcase
dohmatob Oct 20, 2014
05ce211
removed superfluous ravelling in mfista
dohmatob Oct 20, 2014
5981826
DOC: improving SpaceNet doc
dohmatob Oct 20, 2014
bb7e8cd
relaxed very string test
dohmatob Oct 20, 2014
9bc1837
typo
dohmatob Oct 20, 2014
9c4bccf
ENH: binary_dilation(binary_erosion) -> binar_fill_holes
dohmatob Oct 20, 2014
fecfbfd
ENH: improving docstring of SpaceNet
dohmatob Oct 21, 2014
6fcc0b9
ENH: still enhancing docstrings
dohmatob Oct 21, 2014
0905b98
ENH: volume-corrected screening percentile
dohmatob Oct 21, 2014
8b5630b
cosmetics
dohmatob Oct 21, 2014
545ca52
NEW TEST: testing code computing mask volume
dohmatob Oct 21, 2014
19474a7
doc
dohmatob Oct 21, 2014
d638e8e
typos + more references
dohmatob Oct 21, 2014
fc1d50b
typo
dohmatob Oct 21, 2014
1385ab9
* NEW FEATURE: SpaceNetClassifier (subclass of SpaceNet, specialized …
dohmatob Oct 21, 2014
41f6915
implemented SpaceNetRegressor subclass
dohmatob Oct 21, 2014
bb00562
typos + more coverage
dohmatob Oct 21, 2014
831040d
cosmetics
dohmatob Oct 21, 2014
5b99a52
reporting mask volume in cm^3 + plus other cosmetics
dohmatob Oct 21, 2014
0182bbe
REFACTOR: refactored classification-specific APIs from SpaceNet base …
dohmatob Oct 22, 2014
a384116
ENH: disabling screening if only 100 features or fewer
dohmatob Oct 22, 2014
51984e1
small corrections
dohmatob Oct 22, 2014
17a8466
corrected some errors in param docstrings
dohmatob Oct 22, 2014
90313b7
corrected more errors in param docstrings
dohmatob Oct 22, 2014
5903f8b
NEWFEATURE: SpaceClassification works with mse loss too
dohmatob Oct 22, 2014
1b8587a
fixed typo pointed-out by Loic
dohmatob Oct 22, 2014
6282dc0
clean err_msg logic
dohmatob Oct 22, 2014
32b4275
ENH: better computation of intercept form mse loss
dohmatob Oct 22, 2014
4b30122
restored default loss in demo
dohmatob Oct 22, 2014
def76ac
cleanup demo
dohmatob Oct 22, 2014
6c3cc84
random_state arg in _check_lipschitz_continuous
dohmatob Oct 22, 2014
747540f
removed abnoxious 'novel' word from mfista docstring
dohmatob Oct 22, 2014
9e51f12
ENH: not centralizing y for classif problems (recently introduced bug)
dohmatob Oct 22, 2014
ea4045c
BF: warnings.Warn -> warnings.Warn
dohmatob Oct 22, 2014
6d68dbd
WIP: Turning off smooth done before screening
dohmatob Oct 24, 2014
d307121
NEW DEMO: PMG
dohmatob Oct 24, 2014
ad7f7bf
more
dohmatob Oct 24, 2014
f3acb9a
more
dohmatob Oct 24, 2014
2680fd0
l1_ratio=.3 for pmg
dohmatob Oct 25, 2014
4ca7aa1
full brain sl (ok)
dohmatob Oct 25, 2014
90c34ff
morphing strategy = dilation o erosion
dohmatob Oct 25, 2014
f7c092d
l1_ratio=.3 in poldrack tvl1 demo
dohmatob Oct 25, 2014
dcdd18e
CLEANUP + ENH: new option smooth=0 in _univariate_feature_screening
dohmatob Oct 25, 2014
138411f
misc cleanups and cosmetics
dohmatob Oct 25, 2014
44326c2
cleanup demo
dohmatob Oct 25, 2014
d89536b
WIP: demoing sl and tvl1 on haxby
dohmatob Oct 25, 2014
645517f
BF: restored standardize=self.standardize in SpaceNet's NiftiMasking
dohmatob Oct 25, 2014
b7c72fc
cleanup
dohmatob Oct 25, 2014
47b22c2
defaut cv=8 in SpaceNet
dohmatob Oct 25, 2014
37e90f7
scores_ -> cv_scores_ in SpaceNet
dohmatob Oct 25, 2014
e395757
removed unused slicer var in plot_haxb_space_net.py
dohmatob Oct 28, 2014
4f7f446
- BF: proper standardization of input data (in fit and predict)
dohmatob Oct 28, 2014
0cc8ba7
ENH: properly setting intercept after fitting
dohmatob Oct 28, 2014
a3d4242
tiny bug fixes
dohmatob Oct 28, 2014
519406a
addressing @bthirion's recent comments
dohmatob Nov 4, 2014
2616629
still addressing @bthirion's comments
dohmatob Nov 4, 2014
f283f1d
still addressing recent comments
dohmatob Nov 4, 2014
1f47067
ENH: alpha_min=None by default, so that eps is used (instead) to cons…
dohmatob Nov 12, 2014
44795e5
Regexp control on raised errors
Titan-C Oct 24, 2014
22b00ef
REFACTOR: adressing @bthirion's oral remarks about the lambda (no mor…
dohmatob Nov 13, 2014
25cea37
typo
dohmatob Nov 14, 2014
986aacd
ENH: cleanup for poldrack demo script
dohmatob Nov 14, 2014
9753492
DOC: rmed ref to TV-L1 in l1_ratio doc of gradient_id(...) function
dohmatob Nov 14, 2014
9df0afd
cooler
dohmatob Nov 10, 2014
cd26b23
n_jobs=1 restored in demo
dohmatob Nov 14, 2014
29077e1
ENH: fixed MemoryError in _univariate_feature_screening(...) function
dohmatob Nov 14, 2014
a636fd5
BF: restored face vs house in haxby spacenet demo
dohmatob Nov 14, 2014
275f81c
ENH: demoing all penalties in poldrack demo
dohmatob Nov 15, 2014
589ee74
REFACTOR: in space_net.py cv scores are 'bigger is better'
dohmatob Nov 20, 2014
a005c33
BF: trailing X_ in space_net.py
dohmatob Nov 20, 2014
acdc374
closes issue #286 (about nans when l1_ratio = 0)
dohmatob Nov 20, 2014
2d74e04
ENH: tests for issue #286
dohmatob Nov 20, 2014
5a86e89
ENH: returning lists of pairs of lists of indices used in cv
dohmatob Nov 21, 2014
ca36a35
typo
dohmatob Nov 21, 2014
55fd082
BF: wrong docstring about ventral mask in plot_haxby_space_net.py demo
dohmatob Nov 23, 2014
9cbdb10
Clean unused imports
GaelVaroquaux Dec 2, 2014
8cc69f8
MISC: Cosmetics in the example
GaelVaroquaux Dec 2, 2014
b259caa
API: SpaceNet -> BaseSpaceNet
GaelVaroquaux Dec 2, 2014
d182c29
WIP: try to fix the alpha grid
GaelVaroquaux Dec 2, 2014
de2ea32
WIP: add a second score to disambiguate
GaelVaroquaux Dec 2, 2014
27bc450
ENH: better verbosity
GaelVaroquaux Dec 2, 2014
ec60eee
DOC: complete the docstring
GaelVaroquaux Dec 2, 2014
04075a4
MISC: avoid deprecation with new numpy
GaelVaroquaux Dec 2, 2014
adb15b7
MISC: minor improvement in example
GaelVaroquaux Dec 2, 2014
a9e3899
fixed tests broken by the other other pr_219
dohmatob Dec 2, 2014
fea6b04
BF: trailing assert ...
dohmatob Dec 3, 2014
172a30e
-BF: confusing rescale_alpha param removed from code base
dohmatob Dec 3, 2014
82772ea
BF: still fixing alpha grid computation
dohmatob Dec 4, 2014
db317a9
cleanups
dohmatob Dec 6, 2014
49032f7
NF: poldrack loader (local)
dohmatob Dec 6, 2014
606b340
SL & TVL1 ok on poldrack :)
dohmatob Dec 6, 2014
f8a7309
NF: cv on l1_ratio
dohmatob Dec 6, 2014
b555aa1
cleanup
dohmatob Dec 7, 2014
eb48480
cv on l1_ratio: ok
dohmatob Dec 7, 2014
c943318
NF: fetcher for PMG (code stubb, only works locally for now)
dohmatob Dec 8, 2014
a72cc97
remove obselet load_poldrack.py module
dohmatob Dec 8, 2014
afd79da
fixing some underground bugs
dohmatob Dec 9, 2014
abd843b
NF: fetch_mixed_gambles
dohmatob Dec 9, 2014
770a427
wip
dohmatob Dec 9, 2014
dc108dc
wip
dohmatob Dec 9, 2014
fd2c674
Favor small l1_ratios
GaelVaroquaux Dec 11, 2014
421adad
Fix path on l1_ratio
GaelVaroquaux Dec 11, 2014
90de959
BF: fixed couple of testcases broken by pr_219 :)
dohmatob Dec 12, 2014
d048f21
restored default l1_ratios to .5 (from .75)
dohmatob Dec 12, 2014
05f62f7
ENH: addressing @gael's comments about spurous start(...) method in E…
dohmatob Dec 12, 2014
6a36e3e
"""
dohmatob Dec 12, 2014
7b0a810
BF: broken demos
dohmatob Dec 12, 2014
d6cc057
cosmetics
dohmatob Dec 12, 2014
1d8e5f5
EXPERIMENTS: debugging why good alphas don't mean good maps
dohmatob Dec 17, 2014
a54ff62
typo
dohmatob Dec 17, 2014
819c85c
wip
dohmatob Dec 18, 2014
af044d4
wip
dohmatob Dec 18, 2014
a253f2b
ENH+BF+CONFUSION: (on utterly bad prediction scores on oasis dataset)…
dohmatob Dec 18, 2014
a94c6aa
ENH: using sk.cv.train_test_split in oasis
dohmatob Dec 18, 2014
1c6b70f
BF: train_size=.8 in oasis space net demo
dohmatob Dec 18, 2014
092772a
- ENH: putting Mean Abs Error in titles of Oasis prediction curves
dohmatob Feb 4, 2015
f8d50f4
REFACTOR: moved space net demos to examples/decoding sub-dir
dohmatob Feb 5, 2015
edf05ab
- ENH: added space net entry in doc (decoding.rst)
dohmatob Feb 5, 2015
98c5981
- ENH: removed zombie doc for inexistend normalize param
dohmatob Feb 5, 2015
8abf120
smooth -> smoothing_fwhm
dohmatob Feb 5, 2015
8e45d53
mask -> mask_img in _get_mask_volume
dohmatob Feb 5, 2015
a829be4
tiny fixes
dohmatob Feb 5, 2015
0b73aec
BF+DOC: fixed bug in docstring about type / shape of X
dohmatob Feb 5, 2015
5f55713
ENH: adding oasis vbm in doc
dohmatob Feb 5, 2015
92d44cc
BF: fixing py3 issues + 0400 ==> 400 (error = invalid token)
dohmatob Jul 13, 2015
33cb60a
BF: still fixing py3 stuff
dohmatob Jul 13, 2015
a2c6484
BF: still fixing py3 stuff
dohmatob Jul 13, 2015
2339440
xrange ==> range
dohmatob Jul 13, 2015
acf88bb
random ==> check_random_state
dohmatob Jul 13, 2015
0107e51
rng.sample ==> rng.choice
dohmatob Jul 13, 2015
5736628
BF: choice ==> choice(replace=False)
dohmatob Jul 13, 2015
715e21b
more
dohmatob Jul 13, 2015
fe3b8b2
BF: basestring ==> _basestring (bkport for py3)
dohmatob Jul 14, 2015
61986ab
DOC: SpaceNet entry in doc/decoding/*
dohmatob Jul 14, 2015
f37b578
DOC: SpaceNet entries in API doc
dohmatob Jul 14, 2015
e0ed463
fix docstring
dohmatob Jul 15, 2015
3db00fb
documenting init param of mfista
dohmatob Jul 15, 2015
39df752
still adressing @bthirion's comments
dohmatob Jul 15, 2015
fefa61d
more
dohmatob Jul 15, 2015
e35f7e1
BF: doc string for poldrack example
dohmatob Jul 15, 2015
0447753
making TV-l1 show up doc before sl
dohmatob Jul 15, 2015
fcf6bed
API: leading underscore for hidden / protected spacenet functions
dohmatob Jul 15, 2015
07e4d48
documenting some nontrivial things
dohmatob Jul 15, 2015
f89fb15
FIX: fixing docstrings (still address @bthirion's remarks)
dohmatob Jul 15, 2015
b2678cc
zombie test case rmed
dohmatob Jul 15, 2015
30c06ef
more
dohmatob Jul 15, 2015
ba96096
fixups
dohmatob Jul 15, 2015
97c5c56
added space net in thumbs
dohmatob Jul 16, 2015
e06d2ee
fixups (carousel)
dohmatob Jul 16, 2015
ac6d4c6
only a few spacenet thumbs
dohmatob Jul 16, 2015
b96fb08
REFACTOR: rmed center_data backport
dohmatob Jul 16, 2015
28b7d96
API: more leading underscores (for protected stuff)
dohmatob Jul 16, 2015
686bf86
more leading _underscors + rmved unused copy_data param
dohmatob Jul 16, 2015
68ab565
BF: broken test case
dohmatob Jul 16, 2015
0416112
useless XXX
dohmatob Jul 16, 2015
313e7c3
CLEANUP: rmved obsolete sklearn backports
dohmatob Jul 16, 2015
f1dcb45
fixups: leading underscores + other docstring wahala
dohmatob Jul 16, 2015
aed4893
more leading underscores for hidden functions
dohmatob Jul 16, 2015
2499f81
DOC: spacenet now in decoding_tutorial and estimator_choice.rst, etc.
dohmatob Jul 16, 2015
dfccea4
glitches
dohmatob Jul 16, 2015
3b76060
CLEANUP: not mentioning pure TV and pure LASSO in user doc
dohmatob Jul 16, 2015
57f18e5
CLEANUP: merged all smooth lasso data gen functions into one
dohmatob Jul 16, 2015
a47d485
typo
dohmatob Jul 16, 2015
8a443cb
more typos
dohmatob Jul 16, 2015
1f9a30f
DOC: display_mode=yz in haxby space net example
dohmatob Jul 16, 2015
5e7300b
reduced cut_coords in oasis space net example
dohmatob Jul 16, 2015
01c4b91
no colorbar in space net examples
dohmatob Jul 17, 2015
d4dcba1
smooth lasoo ==> graph net everywhere
dohmatob Jul 17, 2015
0e9c5fc
fixups rst
dohmatob Jul 17, 2015
71d8399
cleanup var names, etc.
dohmatob Jul 17, 2015
8391e2b
fixups: var names in examples, rm dummy vars, egc.
dohmatob Jul 17, 2015
b761f88
restored examples/plot_haxby_simple.py: has nothing to do with PR
dohmatob Jul 17, 2015
4cbe10c
more smooth lasso ==> graph net
dohmatob Jul 17, 2015
c8798dd
DOC: cmt about intercept in logistic regression
dohmatob Jul 17, 2015
1abf7dc
- moved make_Xy logic from fetch_mixed_gambles to the space net
dohmatob Jul 17, 2015
c3a6afc
renamed example script (space net stuff)
dohmatob Jul 17, 2015
293aa72
default verbose increased to 1
dohmatob Jul 17, 2015
5bc8125
rnmed poldrack ==> mixed_gambles in doc/
dohmatob Jul 17, 2015
26c40f7
typo
dohmatob Jul 17, 2015
4bf8d75
BF: broken backport
dohmatob Jul 17, 2015
4300da3
mixed gambles fetcher integrated into new datasets organization
dohmatob Jul 17, 2015
f9bc35d
REFACTOR: complex data manips moved into fetcher (mixed gambles stuff)
dohmatob Jul 17, 2015
b8edb47
FIX: proper use of memory_level in space net + higher value in oasis …
dohmatob Jul 17, 2015
93d67a1
comment on n_jobs in oasis demo
dohmatob Jul 17, 2015
d0452ba
flipped the interpretation of return_raw_data arg to fetch_mixed_gambles
dohmatob Jul 17, 2015
1ef334c
rmed license
dohmatob Jul 18, 2015
a61211e
using sklearn.utils.check_random_cstate
dohmatob Jul 18, 2015
6c8adcc
rmved dead test
dohmatob Jul 18, 2015
0ba41b7
rmved zombie files
dohmatob Jul 18, 2015
e870fb6
refactored fetch_mixed_gambles (to make it more testable)
dohmatob Jul 18, 2015
7577546
more doc for mixed gambles fetcher
dohmatob Jul 18, 2015
2352791
typo: optinal ==> optional
dohmatob Jul 18, 2015
3cbaee3
typo
dohmatob Jul 18, 2015
6e52349
removed dead tests (fista stuff)
dohmatob Jul 18, 2015
6f4142c
rmed zombie test_datasets.py file
dohmatob Jul 18, 2015
3821dab
rmed commented code + rmed mean_img call in example script
dohmatob Jul 18, 2015
6ec3d1a
temporarilly renamed mixed gambles example to be run early
dohmatob Jul 21, 2015
14d08fe
renamed oasis space net example to be run at top of examples chain
dohmatob Jul 21, 2015
99e1bc1
n_subjects=200 in oasis space net demo (to limit memory demand)
dohmatob Jul 21, 2015
4f5b683
oasis example should run fine now
dohmatob Jul 22, 2015
11c7492
ENH: upsampling oasis data to 4x4x4 (otherwise, we're fried)
dohmatob Jul 23, 2015
37fd0a5
CLEANUP: rmed verbose=2 in oasis example
dohmatob Jul 23, 2015
4158665
not resampling oasis in space net demo + doing only graph-net penalty
dohmatob Jul 23, 2015
74923aa
DOC: mentioning oasis space net example in space_net.rst
dohmatob Jul 24, 2015
3efa3c7
return_raw_data=False by default + fix saying why Regressor and not C…
dohmatob Jul 28, 2015
2 changes: 2 additions & 0 deletions .gitignore
@@ -6,6 +6,8 @@
*.swo
.DS_Store
build
#*
*#

nilearn.egg-info/
dist/
5 changes: 5 additions & 0 deletions doc/decoding/decoding_simulated.rst
@@ -105,4 +105,9 @@ models, a `coef_` attribute that stores the coefficients **w** estimated
The full file to run the simulation can be found in
:ref:`example_decoding_plot_simulated_data.py`

.. seealso::

* :ref:`space_net`
* :ref:`searchlight`


5 changes: 4 additions & 1 deletion doc/decoding/decoding_tutorial.rst
@@ -426,8 +426,10 @@ To visualize the results, we need to:

.. seealso::

* :ref:`searchlight`
* :ref:`decoding_simulated`
* :ref:`space_net`
* :ref:`searchlight`


Going further with scikit-learn
===================================
@@ -503,3 +505,4 @@ But, be aware that this can take A WHILE...
has very detailed explanations on a large variety of estimators and
machine learning techniques. To become better at decoding, you need
to study it.

25 changes: 16 additions & 9 deletions doc/decoding/estimator_choice.rst
@@ -1,9 +1,9 @@

.. _estimator_choice:

============================================
=====================================
Choosing the right predictive model
============================================
=====================================

This page gives a few simple considerations on the choice of an estimator.
It is slightly oriented towards a *decoding* application, that is the
@@ -18,11 +18,11 @@ the :ref:`dedicated section of the nilearn documentation


Predictions: regression, classification and multi-class
========================================================
=======================================================


Regression
-----------
----------

A regression problem is a learning task in which the variable to predict
--that we often call ``y``-- is a continuous value, such as an age.
@@ -33,8 +33,12 @@ Encoding models [1]_ typically call for regressions.
Naselaris et al., Encoding and decoding in fMRI, NeuroImage, 2011.
http://www.ncbi.nlm.nih.gov/pubmed/20691790

.. seealso::

* :ref:`space_net`

Classification: two classes or multi-class
-------------------------------------------
------------------------------------------

A classification task consists in predicting a *class* label for each
observation. In other words, the variable to predict is categorical.
@@ -68,8 +72,8 @@ whereas the former is linear with the number of classes.

.. seealso::

`Multi-class prediction in scikit-learn's documentation
<http://scikit-learn.org/stable/modules/multiclass.html>`_
* `Multi-class prediction in scikit-learn's documentation <http://scikit-learn.org/stable/modules/multiclass.html>`_
* :ref:`space_net`


**Confusion matrix** `The confusion matrix
@@ -93,7 +97,7 @@ understand the classifier's errors in a multiclass problem.
:scale: 40

Setting estimator parameters
=============================
============================

Most estimators have parameters that can be set to optimize their
performance. Importantly, this must be done via **nested**
@@ -124,7 +128,7 @@ CPUs.
* The example :ref:`example_decoding_plot_haxby_grid_search.py`

Different linear models
========================
=======================

There is a wide variety of classifiers available in scikit-learn (see the
`scikit-learn documentation on supervised learning
@@ -222,4 +226,7 @@ little guarantee on the brain maps.
:align: left
:scale: 70

.. seealso::

* :ref:`space_net`

1 change: 1 addition & 0 deletions doc/decoding/index.rst
@@ -24,5 +24,6 @@ predicting an output value.
decoding_tutorial.rst
estimator_choice.rst
decoding_simulated.rst
space_net.rst
searchlight.rst

104 changes: 104 additions & 0 deletions doc/decoding/space_net.rst
@@ -0,0 +1,104 @@
.. for doctests to run, we need to define variables that are defined in
   the literal includes
>>> import numpy as np
>>> from sklearn import datasets
>>> iris = datasets.load_iris()
>>> fmri_masked = iris.data
>>> target = iris.target
>>> session = np.ones_like(target)
>>> n_samples = len(target)

.. _space_net:

=====================================
Multivariate decoding with SpaceNet
=====================================

The SpaceNet decoder
--------------------
SpaceNet implements a suite of multivariate priors for improved
brain decoding. It uses priors like TV (Total Variation) `[Michel et
al. 2011] <https://hal.inria.fr/inria-00563468/document>`_, TV-L1
`[Baldassarre et al. 2012]
<http://www0.cs.ucl.ac.uk/staff/M.Pontil/reading/neurosparse_prni.pdf>`_,
`[Gramfort et al. 2013] <https://hal.inria.fr/hal-00839984>`_
(option: penalty="tv-l1"), and Graph-Net `[Hebiri et al. 2011]
<https://hal.archives-ouvertes.fr/hal-00462882/document>`_ (known
as GraphNet in neuroimaging `[Grosenick et al. 2013]
<https://hal.inria.fr/hal-00839984>`_) (option:
penalty="graph-net") to regularize classification and regression
problems in brain imaging. The resulting brain maps are both
sparse (i.e. regression coefficients are zero everywhere except at
predictive voxels) and structured (blobby). The superiority of TV-L1
over methods without structured priors, such as the Lasso, SVM, ANOVA,
or Ridge, in yielding more interpretable maps and improved
prediction scores is now well established `[Baldassarre et al. 2012]
<http://www0.cs.ucl.ac.uk/staff/M.Pontil/reading/neurosparse_prni.pdf>`_,
`[Gramfort et al. 2013] <https://hal.inria.fr/hal-00839984>`_,
`[Grosenick et al. 2013] <https://hal.inria.fr/hal-00839984>`_.


The following table summarizes the parameter(s) used to activate a
given penalty:

- TV-L1: `penalty="tv-l1"`
- Graph-Net: `penalty="graph-net"` (this is the default prior in
  SpaceNet)
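A quick orientation (editor's sketch, based on the examples added in this
PR): the penalty is chosen when the estimator is constructed, both
estimators follow the scikit-learn API (`fit`/`predict`), and the fitted
weight map is exposed as `coef_img_`::

    from nilearn.decoding import SpaceNetClassifier, SpaceNetRegressor

    # Graph-Net prior (the default)
    classifier = SpaceNetClassifier(penalty="graph-net", memory="cache")

    # TV-L1 prior; eps=1e-1 makes the alpha grid favor large alphas,
    # as in the mixed-gambles example shipped with this PR
    regressor = SpaceNetRegressor(penalty="tv-l1", eps=1e-1, memory="cache")

    # After decoder.fit(niimgs, y), the weight map decoder.coef_img_ can be
    # passed to nilearn.plotting.plot_stat_map for visualization.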

Note that the TV-L1 prior leads to a hard optimization problem, and so
it can be slow to run. Under the hood, a few heuristics are used to make
things a bit faster. These include:

- Feature preprocessing, where an F-test is used to eliminate
  non-predictive voxels, thus reducing the size of the brain mask in
  a principled way.
- Continuation is used along the regularization path: the solution of
  the optimization problem for a given value of the regularization
  parameter `alpha` is used as initialization for the next (smaller)
  value of `alpha` on the regularization grid. Both heuristics are
  pictured in the sketch below.
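Editor's sketch (not part of the PR's space_net.rst): the two heuristics
above can be pictured with plain scikit-learn, using `Lasso(warm_start=True)`
as a stand-in for SpaceNet's actual mfista-based single-alpha solver::

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.feature_selection import SelectPercentile, f_regression
    from sklearn.linear_model import Lasso

    X, y = make_regression(n_samples=50, n_features=500, noise=1.,
                           random_state=0)

    # 1) univariate screening: keep only the most predictive features
    #    (an F-test, as in the first heuristic)
    screener = SelectPercentile(f_regression, percentile=20)
    X_screened = screener.fit_transform(X, y)

    # 2) continuation: sweep a decreasing alpha grid, warm-starting every
    #    fit with the solution obtained for the previous (larger) alpha
    lasso = Lasso(warm_start=True)
    for alpha in np.logspace(0, -2, 10):
        lasso.set_params(alpha=alpha)
        lasso.fit(X_screened, y)  # reuses coef_ from the previous alpha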

**Implementation:** See `[Dohmatob et al. 2015 (PRNI)]
<https://hal.inria.fr/hal-01147731>`_ and `[Dohmatob
et al. 2014 (PRNI)] <https://hal.inria.fr/hal-00991743>`_ for
technical details regarding the implementation of SpaceNet.

Mixed gambles
.............

.. figure:: ../auto_examples/decoding/images/plot_mixed_gambles_space_net_001.png
:align: right
:scale: 60

.. figure:: ../auto_examples/decoding/images/plot_mixed_gambles_space_net_002.png
:scale: 60

.. topic:: **Code**

The complete script can be found
:ref:`here <example_decoding_plot_mixed_gambles_space_net.py>`.


Haxby
.....

.. figure:: ../auto_examples/decoding/images/plot_haxby_space_net_001.png
:align: right
:scale: 60

.. figure:: ../auto_examples/decoding/images/plot_haxby_space_net_002.png
:scale: 60

.. topic:: **Code**

The complete script can be found
:ref:`here <example_decoding_plot_haxby_space_net.py>`.

.. seealso::

* :ref:`Age prediction on OASIS dataset with SpaceNet <example_decoding_plot_oasis_vbm_space_net.py>`.

* The `scikit-learn documentation <http://scikit-learn.org>`_
has very detailed explanations on a large variety of estimators and
machine learning techniques. To become better at decoding, you need
to study it.
5 changes: 5 additions & 0 deletions doc/index.rst
@@ -33,6 +33,9 @@
.. |canica| image:: auto_examples/connectivity/images/plot_canica_resting_state_011.png
:target: auto_examples/connectivity/plot_canica_resting_state.html

.. |tvl1_haxby| image:: auto_examples/decoding/images/plot_haxby_space_net_002.png
:target: auto_examples/decoding/plot_haxby_space_net.html

.. |searchlight| image:: auto_examples/decoding/images/plot_haxby_searchlight_001.png
:target: auto_examples/decoding/plot_haxby_searchlight.html

@@ -65,6 +68,8 @@

* |canica|

* |tvl1_haxby|

* |searchlight|

.. raw:: html
2 changes: 2 additions & 0 deletions doc/modules/reference.rst
@@ -64,6 +64,8 @@ uses.
:toctree: generated/
:template: class.rst

SpaceNetClassifier
SpaceNetRegressor
SearchLight

.. _decomposition_ref:
62 changes: 62 additions & 0 deletions examples/decoding/plot_haxby_space_net.py
@@ -0,0 +1,62 @@
"""
Decoding with SpaceNet: face vs house object recognition
=========================================================

Here is a simple example of decoding with a SpaceNet prior (i.e. Graph-Net,
TV-L1, etc.), reproducing the Haxby 2001 study on a face vs house
discrimination task.
"""

### Load Haxby dataset ########################################################
from nilearn.datasets import fetch_haxby
data_files = fetch_haxby()

### Load Target labels ########################################################
import numpy as np
labels = np.recfromcsv(data_files.session_target[0], delimiter=" ")


### Split data into train and test samples ####################################
target = labels['labels']
condition_mask = np.logical_or(target == "face", target == "house")
condition_mask_train = np.logical_and(condition_mask, labels['chunks'] <= 6)
condition_mask_test = np.logical_and(condition_mask, labels['chunks'] > 6)

### make X (design matrix) and y (response variable)
import nibabel
from nilearn.image import index_img
niimgs = nibabel.load(data_files.func[0])
X_train = index_img(niimgs, condition_mask_train)
X_test = index_img(niimgs, condition_mask_test)
y_train = target[condition_mask_train]
y_test = target[condition_mask_test]


### Loop over Graph-Net and TV-L1 penalties ####################################
from nilearn.decoding import SpaceNetClassifier
import matplotlib.pyplot as plt
from nilearn.image import mean_img
from nilearn.plotting import plot_stat_map
background_img = mean_img(data_files.func[0])
for penalty in ['graph-net', 'tv-l1']:
    ### Fit model on train data and predict on test data ######################
    decoder = SpaceNetClassifier(memory="cache", penalty=penalty)
    decoder.fit(X_train, y_train)
    y_pred = decoder.predict(X_test)
    accuracy = (y_pred == y_test).mean() * 100.

    ### Visualization #########################################################
    print("Results")
    print("=" * 80)
    coef_img = decoder.coef_img_
    plot_stat_map(coef_img, background_img,
                  title="%s: accuracy %g%%" % (penalty, accuracy),
                  cut_coords=(-34, -16), display_mode="yz")
    coef_img.to_filename('haxby_%s_weights.nii' % penalty)
    print("- %s %s" % (penalty, '-' * 60))
    print("Number of train samples : %i" % condition_mask_train.sum())
    print("Number of test samples : %i" % condition_mask_test.sum())
    print("Classification accuracy : %g%%" % accuracy)
    print("_" * 80)

plt.show()
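Editor's note (not part of the PR's example): the estimator_choice page
edited above mentions confusion matrices; the errors of the last classifier
fitted in the loop could be inspected with scikit-learn:

from sklearn.metrics import confusion_matrix

# rows: true classes, columns: predicted classes (for the last penalty fitted)
print(confusion_matrix(y_test, y_pred))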
47 changes: 47 additions & 0 deletions examples/decoding/plot_mixed_gambles_space_net.py
@@ -0,0 +1,47 @@
"""
SpaceNet on Jimura et al "mixed gambles" dataset.
==================================================

The segmenting power of SpaceNet is quite visible here.
Review comment (GaelVaroquaux, Member):
Please explain here that you are using the regressor object given that the
task is to predict a continuous variable, the gain of the gamble.
Also, maybe we should leave out some data and show a prediction power (if
it somewhat works).

Reply (dohmatob, Contributor, Author):
Yeah, definitely.
On leaving out some data: hmm, this looks like a big change. I propose this
be done separately, otherwise we won't simply ever be able to merge this PR.
I propose we don't produce a single line of additional code (except fixing
bugs, typos, etc.) related to SpaceNet until this PR is merged :)

Reply (dohmatob, Contributor, Author):
Otherwise, "it's like simulated annealing, without decreasing the
temperature" (source: probably @GaelVaroquaux)

Reply (Contributor):
Sounds like @eickenberg to my ear ;)

"""
# author: DOHMATOB Elvis Dopgima,
# GRAMFORT Alexandre


### Load data ################################################################
import numpy as np
import nibabel
from scipy import ndimage
from nilearn.datasets import fetch_mixed_gambles
data = fetch_mixed_gambles(n_subjects=16)
zmaps, gain, mask_img = data.zmaps, data.gain, data.mask_img


### Fit TV-L1 #################################################################
# Here we're using the regressor object given that the task is to predict a
# continuous variable, the gain of the gamble.
from nilearn.decoding import SpaceNetRegressor
decoder = SpaceNetRegressor(mask=mask_img, penalty="tv-l1",
eps=1e-1, # prefer large alphas
memory="cache")
decoder.fit(zmaps, object_category) # fit

### Visualize TV-L1 weights
import matplotlib.pyplot as plt
from nilearn.plotting import plot_stat_map
plot_stat_map(decoder.coef_img_, title="tv-l1", display_mode="yz",
cut_coords=[20, -2])


### Fit Graph-Net ##########################################################
decoder = SpaceNetRegressor(mask=mask_img, penalty="graph-net",
eps=1e-1, # prefer large alphas
memory="cache")
decoder.fit(zmaps, object_category) # fit

### Visualize Graph-Net weights
plot_stat_map(decoder.coef_img_, title="graph-net",
display_mode="yz", cut_coords=[20, -2])


plt.show()
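Editor's sketch following the reviewer's suggestion in the thread above (not
part of this PR); it assumes `zmaps` behaves as a 4D niimg and `gain` is a
NumPy array, as used in the example:

from sklearn.metrics import mean_absolute_error
from nilearn.image import index_img

# hold out the last 20% of the z-maps as a test set
n_maps = len(gain)
train = np.arange(int(.8 * n_maps))
test = np.arange(int(.8 * n_maps), n_maps)

decoder = SpaceNetRegressor(mask=mask_img, penalty="graph-net",
                            eps=1e-1, memory="cache")
decoder.fit(index_img(zmaps, train), gain[train])
y_pred = decoder.predict(index_img(zmaps, test))
print("Mean absolute error on held-out gambles: %.3f"
      % mean_absolute_error(gain[test], np.ravel(y_pred)))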