Merge branch 'debian' into debian_proper

* debian: (177 commits)
  Update Debian packaging.
  Update news item.
  Prepare release.
  Update changelog.
  NF: confusion.plot(): explicit control over alpha level of numbers
  confusion.plot: text labels are colored with colors of limits in cmap
  NF: disabled test for confusion.plot (based on cell data confusion)
  RF: confusion.plot -- more stable colorbar size, etc
  BF+NF: RF: confusion.plot works fine with provided labels order, NF: sets argument
  NF: confusion.plot() inspired by Ingo
  NF: some pieces migrated from frontier's paper code
  BF: needed to assign axis limits prior to each axis creation
  NF: Let plotBars() pass kwargs to pylab.bars().
  adjusted untraining in sg.SVM
  NF+BF: MappedClassifierSensitivityAnalyzer so now FeatSelClassifiers return appropriate sensitivity
  RF: more informative msg whenever detrend per chunk fails
  BF: maskmapper.reverse didn't care about the size of the 1D data
  RF+NF: stats carry mean values, TPR=0 iff P=0, use alignment within table
  NF: adding lightweight markup for table2string
  NF: now we can assign new labels_map to Dataset at runtime ;-)
  ...
commit 942059ce23afadb1b0a876fec04da7f5181ed28d (2 parents: b8395e3 + 7fd021f)
authored by @hanke
Showing with 6,664 additions and 619 deletions.
  1. +35 −1 Changelog
  2. +10 −4 Makefile
  3. +6 −3 TODO
  4. +1,452 −0 data/attributes_literal.txt
  5. BIN  data/tueb_meg.dat.gz
  6. +152 −0 data/tueb_meg_coord.xyz
  7. +7 −0 debian/changelog
  8. +1 −1  debian/control
  9. +2 −1  debian/rules
  10. +5 −0 doc/_templates/indexsidebar.html
  11. +27 −0 doc/authors.txt
  12. +9 −4 doc/conf.py
  13. +23 −0 doc/datasets.txt
  14. +53 −0 doc/devguide.txt
  15. +7 −4 doc/examples/clfs_examples.py
  16. +70 −0 doc/examples/erp_plot.py
  17. 0  doc/examples/kerneldemo.py
  18. 0  doc/examples/projections.py
  19. +16 −10 doc/examples/sensanas.py
  20. 0  doc/examples/smellit.py
  21. +4 −34 doc/examples/svdclf.py
  22. +47 −0 doc/examples/topo_plot.py
  23. +51 −0 doc/faq.txt
  24. +193 −3 doc/featsel.txt
  25. +91 −22 doc/index.txt
  26. +83 −10 doc/installation.txt
  27. +8 −6 doc/intro.txt
  28. +12 −0 doc/publications.txt
  29. +9 −3 mvpa/__init__.py
  30. +34 −12 mvpa/algorithms/cvtranserror.py
  31. +8 −7 mvpa/base/__init__.py
  32. +76 −0 mvpa/base/dochelpers.py
  33. +3 −2 mvpa/base/externals.py
  34. +2 −2 mvpa/clfs/_svmbase.py
  35. +106 −11 mvpa/clfs/base.py
  36. +196 −7 mvpa/clfs/distance.py
  37. +197 −81 mvpa/clfs/gpr.py
  38. +37 −8 mvpa/clfs/kernel.py
  39. +3 −0  mvpa/clfs/libsmlr/__init__.py
  40. +47 −5 mvpa/clfs/libsvm/svm.py
  41. +46 −20 mvpa/clfs/model_selector.py
  42. +55 −16 mvpa/clfs/sg/svm.py
  43. +20 −5 mvpa/clfs/smlr.py
  44. +45 −4 mvpa/clfs/stats.py
  45. +276 −45 mvpa/clfs/transerror.py
  46. +7 −5 mvpa/clfs/warehouse.py
  47. +563 −31 mvpa/datasets/base.py
  48. +227 −0 mvpa/datasets/channel.py
  49. +13 −38 mvpa/datasets/eep.py
  50. +9 −0 mvpa/datasets/mapped.py
  51. +42 −6 mvpa/datasets/miscfx.py
  52. +10 −3 mvpa/datasets/miscfx_sp.py
  53. +8 −3 mvpa/datasets/nifti.py
  54. +3 −0  mvpa/datasets/splitter.py
  55. +3 −0  mvpa/featsel/base.py
  56. +9 −1 mvpa/mappers/mask.py
  57. +6 −1 mvpa/measures/anova.py
  58. +63 −17 mvpa/measures/base.py
  59. +14 −9 mvpa/measures/corrcoef.py
  60. +465 −0 mvpa/measures/irelief.py
  61. +5 −0 mvpa/misc/cmdline.py
  62. +2 −2 mvpa/misc/data_generators.py
  63. +73 −16 mvpa/misc/io/base.py
  64. +2 −2 mvpa/misc/io/meg.py
  65. +1 −1  mvpa/misc/param.py
  66. +126 −0 mvpa/misc/plot/base.py
  67. +271 −87 mvpa/misc/plot/erp.py
  68. +212 −0 mvpa/misc/plot/topo.py
  69. +10 −0 mvpa/misc/state.py
  70. +23 −0 mvpa/misc/support.py
  71. +5 −1 mvpa/misc/transformers.py
  72. +10 −3 mvpa/suite.py
  73. +17 −8 setup.py
  74. +5 −0 tests/main.py
  75. +14 −2 tests/test_clf.py
  76. +2 −2 tests/test_config.py
  77. +55 −13 tests/test_datameasure.py
  78. +269 −27 tests/test_dataset.py
  79. +29 −0 tests/test_datasetfx.py
  80. +8 −1 tests/test_datasetfx_sp.py
  81. +27 −1 tests/test_eepdataset.py
  82. +1 −1  tests/test_iohelpers.py
  83. +60 −2 tests/test_kernel.py
  84. +12 −0 tests/test_maskeddataset.py
  85. +39 −0 tests/test_meg.py
  86. +6 −2 tests/test_niftidataset.py
  87. +15 −0 tests/test_smlr.py
  88. +18 −1 tests/test_stats.py
  89. +2 −2 tests/test_svdmapper.py
  90. +70 −0 tests/test_svm.py
  91. +272 −0 tests/test_transerror.py
  92. +6 −0 tests/tests_warehouse.py
  93. +31 −0 tools/mpkg_wrapper.py
36 Changelog
@@ -23,12 +23,46 @@ Unreleased changes
Changes described here are not yet released, but available from VCS
repository.
- * None.
+ * None yet.
Releases
========
+* 0.3.1 (Sun, 14 Sep 2008)
+
+ * New manual section about feature selection with a focus on RFE.
+ Contributed by James M. Hughes.
+ * New dataset type `ChannelDataset` for data structured in channels. Might
+ be useful for data modalities like EEG and MEG. This dataset includes
+ support for common preprocessing steps like resampling and baseline
+   signal subtraction.
+ * Plotting of topographies on heads. Thanks to Ingo Fründ for contributing
+ this code. Additionally, a new example shows how to do such plots.
+ * New general purpose function for generating barplots and candlestick plots
+ with error bars (`plotBars()`).
+ * Dataset supports mapping of string labels onto numerical labels, removing
+ the need to perform this mapping manually in user code. 'clfs_examples.py'
+ is adjusted accordingly to demonstrate the new feature.
+ * New Classifier.summary() method to dump classifier settings.
+ * Improved and more flexible plotERPs().
+ * New I-RELIEF sensitivity analyzer.
+ * Added visualization of confusion matrices via `ConfusionMatrix.plot()`
+ inspired by Ingo Fründ.
+ * The PyMVPA version is now globally available in `mvpa.pymvpa_version`.
+ * BugFix: TuebingenMEG reader failed in some cases.
+ * Several improvements (docs and implementation) for building PyMVPA on
+ MacOS X.
+ * New convenience accessor methods (`select()`, `where()` and
+ `__getitem__()`) for the `Dataset` base class.
+ * New `seed()` function to configure the random number generators from user
+ code.
+ * Added reader for a MEG sensor locations format
+ (`TuebingenMEGSensorLocations`).
+ * Initial model selection support for GPR (using openopt).
+ * And tons of minor bugfixes, additional tests and improved documentation.
+
+
* 0.3.0 (Mon, 18 Aug 2008)
* Import of binary EEP files (used by EEProbe) and EEPDataset class.
14 Makefile
@@ -194,12 +194,13 @@ testapiref: apidoc
test: unittests testmanual testsuite testapiref testexamples
$(COVERAGE_REPORT): build
+ @echo "Generating coverage data and report. Takes awhile. No progress output."
@cd tests && { \
- export PYTHONPATH=..; \
- python-coverage -x main.py; \
- python-coverage -r -i -o /usr >| ../$(COVERAGE_REPORT); \
+ export PYTHONPATH=.. MVPA_DEBUG=.* MVPA_DEBUG_METRICS=ALL; \
+ python-coverage -x main.py >/dev/null 2>&1; \
+ python-coverage -r -i -o /usr,/var >| ../$(COVERAGE_REPORT); \
grep -v '100%$$' ../$(COVERAGE_REPORT); \
- python-coverage -a -i -o /usr; }
+ python-coverage -a -i -o /usr,/var ; }
#
@@ -245,6 +246,11 @@ bdist_rpm: 3rd
--packager "PyMVPA Authors <pkg-exppsy-pymvpa@lists.alioth.debian.org>" \
--vendor "PyMVPA Authors <pkg-exppsy-pymvpa@lists.alioth.debian.org>"
+# build MacOS installer -- depends on patched bdist_mpkg for Leopard
+bdist_mpkg: 3rd
+ python tools/mpkg_wrapper.py setup.py build_ext
+ python tools/mpkg_wrapper.py setup.py install
+
#
# Data
9 TODO
@@ -15,8 +15,11 @@
TODO
****
+ * OptimizedClassifier: to automatically select the model in an easy and
+   non-biased (i.e. non-cheating) way, so it could become a part of any
+ more advanced pipeline as a regular classifier
+ * Add ability to add/modify custom attributes to a dataset instance
* IPython mode
- * Add ability to add/modify custom attributes to a dataset.
* Possibly make NiftiDataset default to float32 when it sees that the data are
- ints.
- * Along with ICA mapper, we should add a PLS mapper.
+ ints
+ * Along with ICA mapper, we should add a PLS mapper
1,452 data/attributes_literal.txt
@@ -0,0 +1,1452 @@
+rest 0
+rest 0
+rest 0
+rest 0
+rest 0
+rest 0
+scissors 0
+scissors 0
+scissors 0
+scissors 0
+scissors 0
+scissors 0
+scissors 0
+scissors 0
+scissors 0
+rest 0
+rest 0
+rest 0
+rest 0
+rest 0
+rest 0
+face 0
+face 0
+face 0
+face 0
+face 0
+face 0
+face 0
+face 0
+face 0
+rest 0
+rest 0
+rest 0
+rest 0
+rest 0
+cat 0
+cat 0
+cat 0
+cat 0
+cat 0
+cat 0
+cat 0
+cat 0
+cat 0
+rest 0
+rest 0
+rest 0
+rest 0
+rest 0
+shoe 0
+shoe 0
+shoe 0
+shoe 0
+shoe 0
+shoe 0
+shoe 0
+shoe 0
+shoe 0
+rest 0
+rest 0
+rest 0
+rest 0
+rest 0
+house 0
+house 0
+house 0
+house 0
+house 0
+house 0
+house 0
+house 0
+house 0
+rest 0
+rest 0
+rest 0
+rest 0
+rest 0
+rest 0
+scrambledpix 0
+scrambledpix 0
+scrambledpix 0
+scrambledpix 0
+scrambledpix 0
+scrambledpix 0
+scrambledpix 0
+scrambledpix 0
+scrambledpix 0
+rest 0
+rest 0
+rest 0
+rest 0
+rest 0
+bottle 0
+bottle 0
+bottle 0
+bottle 0
+bottle 0
+bottle 0
+bottle 0
+bottle 0
+bottle 0
+rest 0
+rest 0
+rest 0
+rest 0
+rest 0
+chair 0
+chair 0
+chair 0
+chair 0
+chair 0
+chair 0
+chair 0
+chair 0
+chair 0
+rest 0
+rest 0
+rest 0
+rest 0
+rest 0
+rest 0
+rest 1
+rest 1
+rest 1
+rest 1
+rest 1
+rest 1
+face 1
+face 1
+face 1
+face 1
+face 1
+face 1
+face 1
+face 1
+face 1
+rest 1
+rest 1
+rest 1
+rest 1
+rest 1
+rest 1
+cat 1
+cat 1
+cat 1
+cat 1
+cat 1
+cat 1
+cat 1
+cat 1
+cat 1
+rest 1
+rest 1
+rest 1
+rest 1
+rest 1
+shoe 1
+shoe 1
+shoe 1
+shoe 1
+shoe 1
+shoe 1
+shoe 1
+shoe 1
+shoe 1
+rest 1
+rest 1
+rest 1
+rest 1
+rest 1
+chair 1
+chair 1
+chair 1
+chair 1
+chair 1
+chair 1
+chair 1
+chair 1
+chair 1
+rest 1
+rest 1
+rest 1
+rest 1
+rest 1
+scissors 1
+scissors 1
+scissors 1
+scissors 1
+scissors 1
+scissors 1
+scissors 1
+scissors 1
+scissors 1
+rest 1
+rest 1
+rest 1
+rest 1
+rest 1
+rest 1
+bottle 1
+bottle 1
+bottle 1
+bottle 1
+bottle 1
+bottle 1
+bottle 1
+bottle 1
+bottle 1
+rest 1
+rest 1
+rest 1
+rest 1
+rest 1
+house 1
+house 1
+house 1
+house 1
+house 1
+house 1
+house 1
+house 1
+house 1
+rest 1
+rest 1
+rest 1
+rest 1
+rest 1
+scrambledpix 1
+scrambledpix 1
+scrambledpix 1
+scrambledpix 1
+scrambledpix 1
+scrambledpix 1
+scrambledpix 1
+scrambledpix 1
+scrambledpix 1
+rest 1
+rest 1
+rest 1
+rest 1
+rest 1
+rest 1
+rest 2
+rest 2
+rest 2
+rest 2
+rest 2
+rest 2
+cat 2
+cat 2
+cat 2
+cat 2
+cat 2
+cat 2
+cat 2
+cat 2
+cat 2
+rest 2
+rest 2
+rest 2
+rest 2
+rest 2
+rest 2
+scrambledpix 2
+scrambledpix 2
+scrambledpix 2
+scrambledpix 2
+scrambledpix 2
+scrambledpix 2
+scrambledpix 2
+scrambledpix 2
+scrambledpix 2
+rest 2
+rest 2
+rest 2
+rest 2
+rest 2
+scissors 2
+scissors 2
+scissors 2
+scissors 2
+scissors 2
+scissors 2
+scissors 2
+scissors 2
+scissors 2
+rest 2
+rest 2
+rest 2
+rest 2
+rest 2
+chair 2
+chair 2
+chair 2
+chair 2
+chair 2
+chair 2
+chair 2
+chair 2
+chair 2
+rest 2
+rest 2
+rest 2
+rest 2
+rest 2
+bottle 2
+bottle 2
+bottle 2
+bottle 2
+bottle 2
+bottle 2
+bottle 2
+bottle 2
+bottle 2
+rest 2
+rest 2
+rest 2
+rest 2
+rest 2
+rest 2
+shoe 2
+shoe 2
+shoe 2
+shoe 2
+shoe 2
+shoe 2
+shoe 2
+shoe 2
+shoe 2
+rest 2
+rest 2
+rest 2
+rest 2
+rest 2
+face 2
+face 2
+face 2
+face 2
+face 2
+face 2
+face 2
+face 2
+face 2
+rest 2
+rest 2
+rest 2
+rest 2
+rest 2
+house 2
+house 2
+house 2
+house 2
+house 2
+house 2
+house 2
+house 2
+house 2
+rest 2
+rest 2
+rest 2
+rest 2
+rest 2
+rest 2
+rest 3
+rest 3
+rest 3
+rest 3
+rest 3
+rest 3
+shoe 3
+shoe 3
+shoe 3
+shoe 3
+shoe 3
+shoe 3
+shoe 3
+shoe 3
+shoe 3
+rest 3
+rest 3
+rest 3
+rest 3
+rest 3
+rest 3
+house 3
+house 3
+house 3
+house 3
+house 3
+house 3
+house 3
+house 3
+house 3
+rest 3
+rest 3
+rest 3
+rest 3
+rest 3
+chair 3
+chair 3
+chair 3
+chair 3
+chair 3
+chair 3
+chair 3
+chair 3
+chair 3
+rest 3
+rest 3
+rest 3
+rest 3
+rest 3
+cat 3
+cat 3
+cat 3
+cat 3
+cat 3
+cat 3
+cat 3
+cat 3
+cat 3
+rest 3
+rest 3
+rest 3
+rest 3
+rest 3
+face 3
+face 3
+face 3
+face 3
+face 3
+face 3
+face 3
+face 3
+face 3
+rest 3
+rest 3
+rest 3
+rest 3
+rest 3
+rest 3
+scrambledpix 3
+scrambledpix 3
+scrambledpix 3
+scrambledpix 3
+scrambledpix 3
+scrambledpix 3
+scrambledpix 3
+scrambledpix 3
+scrambledpix 3
+rest 3
+rest 3
+rest 3
+rest 3
+rest 3
+bottle 3
+bottle 3
+bottle 3
+bottle 3
+bottle 3
+bottle 3
+bottle 3
+bottle 3
+bottle 3
+rest 3
+rest 3
+rest 3
+rest 3
+rest 3
+scissors 3
+scissors 3
+scissors 3
+scissors 3
+scissors 3
+scissors 3
+scissors 3
+scissors 3
+scissors 3
+rest 3
+rest 3
+rest 3
+rest 3
+rest 3
+rest 3
+rest 4
+rest 4
+rest 4
+rest 4
+rest 4
+rest 4
+house 4
+house 4
+house 4
+house 4
+house 4
+house 4
+house 4
+house 4
+house 4
+rest 4
+rest 4
+rest 4
+rest 4
+rest 4
+rest 4
+scissors 4
+scissors 4
+scissors 4
+scissors 4
+scissors 4
+scissors 4
+scissors 4
+scissors 4
+scissors 4
+rest 4
+rest 4
+rest 4
+rest 4
+rest 4
+bottle 4
+bottle 4
+bottle 4
+bottle 4
+bottle 4
+bottle 4
+bottle 4
+bottle 4
+bottle 4
+rest 4
+rest 4
+rest 4
+rest 4
+rest 4
+face 4
+face 4
+face 4
+face 4
+face 4
+face 4
+face 4
+face 4
+face 4
+rest 4
+rest 4
+rest 4
+rest 4
+rest 4
+chair 4
+chair 4
+chair 4
+chair 4
+chair 4
+chair 4
+chair 4
+chair 4
+chair 4
+rest 4
+rest 4
+rest 4
+rest 4
+rest 4
+rest 4
+shoe 4
+shoe 4
+shoe 4
+shoe 4
+shoe 4
+shoe 4
+shoe 4
+shoe 4
+shoe 4
+rest 4
+rest 4
+rest 4
+rest 4
+rest 4
+cat 4
+cat 4
+cat 4
+cat 4
+cat 4
+cat 4
+cat 4
+cat 4
+cat 4
+rest 4
+rest 4
+rest 4
+rest 4
+rest 4
+scrambledpix 4
+scrambledpix 4
+scrambledpix 4
+scrambledpix 4
+scrambledpix 4
+scrambledpix 4
+scrambledpix 4
+scrambledpix 4
+scrambledpix 4
+rest 4
+rest 4
+rest 4
+rest 4
+rest 4
+rest 4
+rest 5
+rest 5
+rest 5
+rest 5
+rest 5
+rest 5
+house 5
+house 5
+house 5
+house 5
+house 5
+house 5
+house 5
+house 5
+house 5
+rest 5
+rest 5
+rest 5
+rest 5
+rest 5
+rest 5
+scrambledpix 5
+scrambledpix 5
+scrambledpix 5
+scrambledpix 5
+scrambledpix 5
+scrambledpix 5
+scrambledpix 5
+scrambledpix 5
+scrambledpix 5
+rest 5
+rest 5
+rest 5
+rest 5
+rest 5
+face 5
+face 5
+face 5
+face 5
+face 5
+face 5
+face 5
+face 5
+face 5
+rest 5
+rest 5
+rest 5
+rest 5
+rest 5
+shoe 5
+shoe 5
+shoe 5
+shoe 5
+shoe 5
+shoe 5
+shoe 5
+shoe 5
+shoe 5
+rest 5
+rest 5
+rest 5
+rest 5
+rest 5
+chair 5
+chair 5
+chair 5
+chair 5
+chair 5
+chair 5
+chair 5
+chair 5
+chair 5
+rest 5
+rest 5
+rest 5
+rest 5
+rest 5
+rest 5
+cat 5
+cat 5
+cat 5
+cat 5
+cat 5
+cat 5
+cat 5
+cat 5
+cat 5
+rest 5
+rest 5
+rest 5
+rest 5
+rest 5
+bottle 5
+bottle 5
+bottle 5
+bottle 5
+bottle 5
+bottle 5
+bottle 5
+bottle 5
+bottle 5
+rest 5
+rest 5
+rest 5
+rest 5
+rest 5
+scissors 5
+scissors 5
+scissors 5
+scissors 5
+scissors 5
+scissors 5
+scissors 5
+scissors 5
+scissors 5
+rest 5
+rest 5
+rest 5
+rest 5
+rest 5
+rest 5
+rest 6
+rest 6
+rest 6
+rest 6
+rest 6
+rest 6
+face 6
+face 6
+face 6
+face 6
+face 6
+face 6
+face 6
+face 6
+face 6
+rest 6
+rest 6
+rest 6
+rest 6
+rest 6
+rest 6
+chair 6
+chair 6
+chair 6
+chair 6
+chair 6
+chair 6
+chair 6
+chair 6
+chair 6
+rest 6
+rest 6
+rest 6
+rest 6
+rest 6
+scissors 6
+scissors 6
+scissors 6
+scissors 6
+scissors 6
+scissors 6
+scissors 6
+scissors 6
+scissors 6
+rest 6
+rest 6
+rest 6
+rest 6
+rest 6
+shoe 6
+shoe 6
+shoe 6
+shoe 6
+shoe 6
+shoe 6
+shoe 6
+shoe 6
+shoe 6
+rest 6
+rest 6
+rest 6
+rest 6
+rest 6
+scrambledpix 6
+scrambledpix 6
+scrambledpix 6
+scrambledpix 6
+scrambledpix 6
+scrambledpix 6
+scrambledpix 6
+scrambledpix 6
+scrambledpix 6
+rest 6
+rest 6
+rest 6
+rest 6
+rest 6
+rest 6
+house 6
+house 6
+house 6
+house 6
+house 6
+house 6
+house 6
+house 6
+house 6
+rest 6
+rest 6
+rest 6
+rest 6
+rest 6
+cat 6
+cat 6
+cat 6
+cat 6
+cat 6
+cat 6
+cat 6
+cat 6
+cat 6
+rest 6
+rest 6
+rest 6
+rest 6
+rest 6
+bottle 6
+bottle 6
+bottle 6
+bottle 6
+bottle 6
+bottle 6
+bottle 6
+bottle 6
+bottle 6
+rest 6
+rest 6
+rest 6
+rest 6
+rest 6
+rest 6
+rest 7
+rest 7
+rest 7
+rest 7
+rest 7
+rest 7
+face 7
+face 7
+face 7
+face 7
+face 7
+face 7
+face 7
+face 7
+face 7
+rest 7
+rest 7
+rest 7
+rest 7
+rest 7
+rest 7
+scrambledpix 7
+scrambledpix 7
+scrambledpix 7
+scrambledpix 7
+scrambledpix 7
+scrambledpix 7
+scrambledpix 7
+scrambledpix 7
+scrambledpix 7
+rest 7
+rest 7
+rest 7
+rest 7
+rest 7
+scissors 7
+scissors 7
+scissors 7
+scissors 7
+scissors 7
+scissors 7
+scissors 7
+scissors 7
+scissors 7
+rest 7
+rest 7
+rest 7
+rest 7
+rest 7
+shoe 7
+shoe 7
+shoe 7
+shoe 7
+shoe 7
+shoe 7
+shoe 7
+shoe 7
+shoe 7
+rest 7
+rest 7
+rest 7
+rest 7
+rest 7
+bottle 7
+bottle 7
+bottle 7
+bottle 7
+bottle 7
+bottle 7
+bottle 7
+bottle 7
+bottle 7
+rest 7
+rest 7
+rest 7
+rest 7
+rest 7
+rest 7
+cat 7
+cat 7
+cat 7
+cat 7
+cat 7
+cat 7
+cat 7
+cat 7
+cat 7
+rest 7
+rest 7
+rest 7
+rest 7
+rest 7
+chair 7
+chair 7
+chair 7
+chair 7
+chair 7
+chair 7
+chair 7
+chair 7
+chair 7
+rest 7
+rest 7
+rest 7
+rest 7
+rest 7
+house 7
+house 7
+house 7
+house 7
+house 7
+house 7
+house 7
+house 7
+house 7
+rest 7
+rest 7
+rest 7
+rest 7
+rest 7
+rest 7
+rest 8
+rest 8
+rest 8
+rest 8
+rest 8
+rest 8
+face 8
+face 8
+face 8
+face 8
+face 8
+face 8
+face 8
+face 8
+face 8
+rest 8
+rest 8
+rest 8
+rest 8
+rest 8
+rest 8
+chair 8
+chair 8
+chair 8
+chair 8
+chair 8
+chair 8
+chair 8
+chair 8
+chair 8
+rest 8
+rest 8
+rest 8
+rest 8
+rest 8
+cat 8
+cat 8
+cat 8
+cat 8
+cat 8
+cat 8
+cat 8
+cat 8
+cat 8
+rest 8
+rest 8
+rest 8
+rest 8
+rest 8
+house 8
+house 8
+house 8
+house 8
+house 8
+house 8
+house 8
+house 8
+house 8
+rest 8
+rest 8
+rest 8
+rest 8
+rest 8
+scrambledpix 8
+scrambledpix 8
+scrambledpix 8
+scrambledpix 8
+scrambledpix 8
+scrambledpix 8
+scrambledpix 8
+scrambledpix 8
+scrambledpix 8
+rest 8
+rest 8
+rest 8
+rest 8
+rest 8
+rest 8
+shoe 8
+shoe 8
+shoe 8
+shoe 8
+shoe 8
+shoe 8
+shoe 8
+shoe 8
+shoe 8
+rest 8
+rest 8
+rest 8
+rest 8
+rest 8
+bottle 8
+bottle 8
+bottle 8
+bottle 8
+bottle 8
+bottle 8
+bottle 8
+bottle 8
+bottle 8
+rest 8
+rest 8
+rest 8
+rest 8
+rest 8
+scissors 8
+scissors 8
+scissors 8
+scissors 8
+scissors 8
+scissors 8
+scissors 8
+scissors 8
+scissors 8
+rest 8
+rest 8
+rest 8
+rest 8
+rest 8
+rest 8
+rest 9
+rest 9
+rest 9
+rest 9
+rest 9
+rest 9
+face 9
+face 9
+face 9
+face 9
+face 9
+face 9
+face 9
+face 9
+face 9
+rest 9
+rest 9
+rest 9
+rest 9
+rest 9
+rest 9
+cat 9
+cat 9
+cat 9
+cat 9
+cat 9
+cat 9
+cat 9
+cat 9
+cat 9
+rest 9
+rest 9
+rest 9
+rest 9
+rest 9
+scrambledpix 9
+scrambledpix 9
+scrambledpix 9
+scrambledpix 9
+scrambledpix 9
+scrambledpix 9
+scrambledpix 9
+scrambledpix 9
+scrambledpix 9
+rest 9
+rest 9
+rest 9
+rest 9
+rest 9
+house 9
+house 9
+house 9
+house 9
+house 9
+house 9
+house 9
+house 9
+house 9
+rest 9
+rest 9
+rest 9
+rest 9
+rest 9
+scissors 9
+scissors 9
+scissors 9
+scissors 9
+scissors 9
+scissors 9
+scissors 9
+scissors 9
+scissors 9
+rest 9
+rest 9
+rest 9
+rest 9
+rest 9
+rest 9
+chair 9
+chair 9
+chair 9
+chair 9
+chair 9
+chair 9
+chair 9
+chair 9
+chair 9
+rest 9
+rest 9
+rest 9
+rest 9
+rest 9
+shoe 9
+shoe 9
+shoe 9
+shoe 9
+shoe 9
+shoe 9
+shoe 9
+shoe 9
+shoe 9
+rest 9
+rest 9
+rest 9
+rest 9
+rest 9
+bottle 9
+bottle 9
+bottle 9
+bottle 9
+bottle 9
+bottle 9
+bottle 9
+bottle 9
+bottle 9
+rest 9
+rest 9
+rest 9
+rest 9
+rest 9
+rest 9
+rest 10
+rest 10
+rest 10
+rest 10
+rest 10
+rest 10
+cat 10
+cat 10
+cat 10
+cat 10
+cat 10
+cat 10
+cat 10
+cat 10
+cat 10
+rest 10
+rest 10
+rest 10
+rest 10
+rest 10
+rest 10
+scrambledpix 10
+scrambledpix 10
+scrambledpix 10
+scrambledpix 10
+scrambledpix 10
+scrambledpix 10
+scrambledpix 10
+scrambledpix 10
+scrambledpix 10
+rest 10
+rest 10
+rest 10
+rest 10
+rest 10
+chair 10
+chair 10
+chair 10
+chair 10
+chair 10
+chair 10
+chair 10
+chair 10
+chair 10
+rest 10
+rest 10
+rest 10
+rest 10
+rest 10
+bottle 10
+bottle 10
+bottle 10
+bottle 10
+bottle 10
+bottle 10
+bottle 10
+bottle 10
+bottle 10
+rest 10
+rest 10
+rest 10
+rest 10
+rest 10
+shoe 10
+shoe 10
+shoe 10
+shoe 10
+shoe 10
+shoe 10
+shoe 10
+shoe 10
+shoe 10
+rest 10
+rest 10
+rest 10
+rest 10
+rest 10
+rest 10
+house 10
+house 10
+house 10
+house 10
+house 10
+house 10
+house 10
+house 10
+house 10
+rest 10
+rest 10
+rest 10
+rest 10
+rest 10
+face 10
+face 10
+face 10
+face 10
+face 10
+face 10
+face 10
+face 10
+face 10
+rest 10
+rest 10
+rest 10
+rest 10
+rest 10
+scissors 10
+scissors 10
+scissors 10
+scissors 10
+scissors 10
+scissors 10
+scissors 10
+scissors 10
+scissors 10
+rest 10
+rest 10
+rest 10
+rest 10
+rest 10
+rest 10
+rest 11
+rest 11
+rest 11
+rest 11
+rest 11
+rest 11
+bottle 11
+bottle 11
+bottle 11
+bottle 11
+bottle 11
+bottle 11
+bottle 11
+bottle 11
+bottle 11
+rest 11
+rest 11
+rest 11
+rest 11
+rest 11
+rest 11
+house 11
+house 11
+house 11
+house 11
+house 11
+house 11
+house 11
+house 11
+house 11
+rest 11
+rest 11
+rest 11
+rest 11
+rest 11
+chair 11
+chair 11
+chair 11
+chair 11
+chair 11
+chair 11
+chair 11
+chair 11
+chair 11
+rest 11
+rest 11
+rest 11
+rest 11
+rest 11
+scrambledpix 11
+scrambledpix 11
+scrambledpix 11
+scrambledpix 11
+scrambledpix 11
+scrambledpix 11
+scrambledpix 11
+scrambledpix 11
+scrambledpix 11
+rest 11
+rest 11
+rest 11
+rest 11
+rest 11
+face 11
+face 11
+face 11
+face 11
+face 11
+face 11
+face 11
+face 11
+face 11
+rest 11
+rest 11
+rest 11
+rest 11
+rest 11
+rest 11
+shoe 11
+shoe 11
+shoe 11
+shoe 11
+shoe 11
+shoe 11
+shoe 11
+shoe 11
+shoe 11
+rest 11
+rest 11
+rest 11
+rest 11
+rest 11
+cat 11
+cat 11
+cat 11
+cat 11
+cat 11
+cat 11
+cat 11
+cat 11
+cat 11
+rest 11
+rest 11
+rest 11
+rest 11
+rest 11
+scissors 11
+scissors 11
+scissors 11
+scissors 11
+scissors 11
+scissors 11
+scissors 11
+scissors 11
+scissors 11
+rest 11
+rest 11
+rest 11
+rest 11
+rest 11
+rest 11
BIN  data/tueb_meg.dat.gz
Binary file not shown
152 data/tueb_meg_coord.xyz
@@ -0,0 +1,152 @@
+MLC11 6.43309 0.670899 13.4402 8.20804 0.69133 18.1158
+ MLC12 6.02874 3.34062 13.2801 7.75956 4.48484 17.8306
+ MLC13 5.39823 6.78026 11.8691 7.15929 9.73255 15.5017
+ MLC14 3.21363 8.61391 10.6813 4.06879 12.7677 13.3322
+ MLC15 0.95648 9.74185 8.79292 1.07282 14.5277 10.2404
+ MLC21 3.72035 2.14294 14.2558 5.11451 2.71475 19.0247
+ MLC22 3.85707 4.97217 13.5194 5.24876 6.75771 17.9789
+ MLC23 2.01625 6.9622 12.8764 2.86362 9.90188 16.8328
+ MLC24 0.150434 8.55218 11.4194 0.366156 12.6471 14.2826
+ MLC31 1.36801 3.98749 14.5124 2.27763 5.21968 19.2734
+ MLC32 -0.627059 5.97007 13.8514 -0.537318 8.6004 18.1042
+ MLC33 -3.24143 4.63291 14.2881 -4.4373 6.42974 18.7997
+ MLC41 1.3048 0.94737 14.9523 2.31824 1.11305 19.847
+ MLC42 -1.15389 2.67507 15.0747 -1.22037 3.46685 20.0125
+ MLC43 -4.00963 1.35956 14.7879 -5.5742 1.82642 19.5151
+ MLF11 13.2971 2.31383 3.15653 18.1465 3.5344 3.22999
+ MLF12 12.0735 5.18245 2.75578 16.321 7.81514 2.95661
+ MLF21 13.389 0.547957 5.8523 18.3004 1.05079 6.65094
+ MLF22 12.5779 3.7323 5.79963 17.1043 5.71197 6.57773
+ MLF23 10.9342 6.41337 5.20067 14.3995 10.0088 5.47705
+ MLF31 12.2649 2.2702 8.5339 16.6119 3.43185 10.7172
+ MLF32 11.0349 5.27666 8.18054 14.8047 8.00571 10.0118
+ MLF33 9.0207 7.49213 7.24406 11.5437 11.677 8.30899
+ MLF34 6.70453 8.82803 6.43962 8.85636 13.3163 6.92759
+ MLF41 10.9874 0.472447 10.7135 14.603 0.53096 14.1684
+ MLF42 10.2774 3.7256 10.5787 13.6676 5.3659 13.8694
+ MLF43 8.5787 6.63851 9.85316 10.9998 10.1559 12.4568
+ MLF44 6.1882 8.28856 9.40097 8.27662 12.3434 11.4527
+ MLF45 3.93468 9.67336 7.86596 4.86126 14.456 8.99736
+ MLF51 8.6925 2.04293 12.3176 11.2202 2.62519 16.5936
+ MLF52 7.74505 4.94581 11.9293 9.83169 7.12761 15.9166
+ MLO11 -10.6637 2.25014 6.732 -15.3931 3.16358 8.07787
+ MLO12 -9.2542 5.66291 7.07295 -13.4733 7.97569 8.43785
+ MLO21 -10.5418 4.25006 4.40446 -15.1684 6.0996 4.83642
+ MLO22 -8.71712 7.0638 4.47767 -12.4433 10.3657 4.95283
+ MLO31 -11.3332 2.50767 1.43071 -16.2279 3.46638 1.79954
+ MLO32 -10.0558 5.70851 1.76321 -14.3255 8.2826 2.15824
+ MLO33 -7.98397 8.33461 1.75274 -11.1541 12.178 2.18978
+ MLO41 -10.9907 4.31508 -1.06132 -15.6057 6.20352 -0.676544
+ MLO42 -9.2975 7.15523 -1.04352 -13.0713 10.4091 -0.614928
+ MLO43 -6.8026 9.26602 -1.07131 -9.30956 13.571 -0.629918
+ MLP11 -5.86214 3.17431 13.5085 -8.62626 4.27807 17.5277
+ MLP12 -5.12474 6.2357 12.502 -7.48496 8.94491 15.9808
+ MLP13 -2.55228 7.69497 12.3121 -3.43919 11.2986 15.6648
+ MLP21 -8.26949 1.84841 11.5154 -12.143 2.49476 14.6124
+ MLP22 -7.39821 4.57366 11.5478 -10.9638 6.24941 14.6286
+ MLP31 -9.25909 3.66496 9.27152 -13.4634 5.12296 11.5542
+ MLP32 -7.44854 6.64877 9.62921 -10.9382 9.51258 11.7818
+ MLP33 -4.88893 8.3254 10.1346 -7.0294 12.2863 12.3123
+ MLP34 -2.02824 9.40274 9.60955 -2.64128 14.0937 11.2319
+ MLT11 8.6119 8.10479 4.18194 10.8829 12.5483 4.51446
+ MLT12 4.68265 9.79876 4.84144 6.10382 14.5807 5.19647
+ MLT13 1.74361 10.2468 5.86873 1.79731 15.2272 6.32163
+ MLT14 -1.23692 10.4511 6.66823 -1.56995 15.4262 7.05632
+ MLT15 -4.25098 9.65316 7.19067 -5.87866 14.3207 7.9504
+ MLT16 -7.02166 8.12138 7.14954 -10.0117 12.0122 8.11575
+ MLT21 10.1314 7.4154 1.60863 13.0674 11.4552 1.87678
+ MLT22 6.53594 9.2105 2.4837 8.56757 13.7704 2.78775
+ MLT23 2.71556 10.983 2.99684 3.13757 15.9502 3.39922
+ MLT24 -0.307514 11.253 3.69686 -0.38345 16.2335 4.14546
+ MLT25 -3.37249 11.0274 4.28335 -4.57388 15.8464 4.87185
+ MLT26 -6.15779 9.01815 4.43457 -8.55371 13.384 4.89429
+ MLT31 7.92398 8.757 -0.177182 10.1259 13.24 0.0820664
+ MLT32 4.4619 10.1523 0.553291 5.77901 14.9667 0.868532
+ MLT33 1.06829 11.3852 0.736023 1.15405 16.371 1.12042
+ MLT34 -2.12556 11.5306 1.32146 -2.66397 16.4888 1.69418
+ MLT35 -5.27191 10.4136 1.66851 -7.09323 15.0522 2.09199
+ MLT41 5.79559 9.91298 -2.12984 7.83593 14.4674 -1.80332
+ MLT42 2.6779 11.383 -1.90355 3.31433 16.3307 -1.5454
+ MLT43 -0.584949 11.7246 -1.62847 -0.574255 16.7111 -1.24482
+ MLT44 -3.92685 11.4394 -1.23133 -4.97304 16.3117 -0.808614
+ MRC11 6.21769 -2.19372 13.3144 7.95556 -2.87797 17.9539
+ MRC12 5.44419 -4.7393 12.9214 6.96079 -6.54262 17.3328
+ MRC13 4.29595 -7.93743 11.1834 5.65095 -11.3622 14.5669
+ MRC14 1.8635 -9.31441 9.88533 2.20135 -13.7488 12.1732
+ MRC15 -0.509474 -9.8942 7.92446 -0.986795 -14.771 8.92541
+ MRC21 3.31778 -3.35588 14.0095 4.59469 -4.48621 18.711
+ MRC22 3.04587 -6.04371 13.0407 4.1599 -8.44387 17.2846
+ MRC23 0.940487 -7.68634 12.2308 1.34229 -11.1007 15.8631
+ MRC24 -1.13384 -8.86902 10.679 -1.47794 -13.2229 13.1158
+ MRC31 0.726879 -4.81547 14.1332 1.42428 -6.62335 18.7438
+ MRC32 -1.54256 -6.50936 13.3101 -1.8362 -9.40981 17.3738
+ MRC33 -3.91024 -4.79024 13.8456 -5.38908 -6.75822 18.1991
+ MRC41 1.12218 -1.89037 14.8375 2.03148 -2.6034 19.7034
+ MRC42 -1.60708 -3.2178 14.8056 -1.76942 -4.44994 19.65
+ MRC43 -4.20643 -1.50156 14.662 -5.86477 -2.14258 19.3366
+ MRF11 12.8344 -3.89177 2.89762 17.4602 -5.79025 2.79548
+ MRF12 11.1932 -6.47897 2.22351 15.0168 -9.70273 2.20064
+ MRF21 13.1615 -2.41637 5.72507 17.9518 -3.66274 6.44057
+ MRF22 11.968 -5.50113 5.42215 16.2076 -8.09881 5.96082
+ MRF23 9.86615 -7.73751 4.57047 12.8038 -11.7849 4.52853
+ MRF31 11.7709 -4.16883 8.2679 15.908 -6.13432 10.2763
+ MRF32 10.1103 -6.94094 7.61327 13.4821 -10.2662 9.22145
+ MRF33 7.82553 -8.68062 6.53342 9.71209 -13.2633 7.20594
+ MRF34 5.32681 -9.64965 5.64713 6.88249 -14.4026 5.68813
+ MRF41 10.7996 -2.39653 10.5441 14.28 -3.30234 14.0196
+ MRF42 9.58512 -5.52483 10.1649 12.7289 -7.86256 13.2736
+ MRF43 7.45092 -8.04023 9.21968 9.42407 -12.072 11.4252
+ MRF44 4.8546 -9.30636 8.61867 6.43061 -13.7475 10.2935
+ MRF45 2.4373 -10.1768 7.01155 2.78194 -15.1236 7.66208
+ MRF51 8.27346 -3.73547 12.0659 10.6485 -5.10803 16.2478
+ MRF52 6.88668 -6.48305 11.3884 8.65736 -9.20069 15.1952
+ MRO11 -10.8754 -0.675293 6.58978 -15.6837 -1.10274 7.89778
+ MRO12 -9.97506 -4.24173 6.64643 -14.4451 -6.16784 7.79597
+ MRO21 -11.0467 -2.43461 4.08018 -15.8504 -3.78167 4.43088
+ MRO22 -9.66475 -5.52537 3.94068 -13.7849 -8.3551 4.1137
+ MRO31 -11.5425 -0.372808 1.28347 -16.5168 -0.752355 1.63668
+ MRO32 -10.7389 -3.79488 1.34943 -15.342 -5.74082 1.54341
+ MRO33 -8.99111 -6.54759 1.10855 -12.626 -9.98162 1.19738
+ MRO41 -11.436 -2.08939 -1.36915 -16.2994 -3.21649 -1.06866
+ MRO42 -10.1452 -5.16246 -1.59278 -14.4425 -7.71387 -1.40293
+ MRO43 -7.99941 -7.50194 -1.79859 -11.0985 -11.4269 -1.74295
+ MRP11 -6.29115 -2.90911 13.2375 -9.21765 -3.94225 17.1593
+ MRP12 -6.01772 -5.93669 11.9547 -8.73637 -8.58155 15.2145
+ MRP13 -3.69834 -7.72773 11.6083 -5.04758 -11.4434 14.672
+ MRP21 -8.47255 -1.059 11.3807 -12.4114 -1.45617 14.4369
+ MRP22 -7.98497 -3.86918 11.1891 -11.794 -5.34002 14.0771
+ MRP31 -9.68174 -2.51387 9.00168 -14.0754 -3.57787 11.1409
+ MRP32 -8.35637 -5.72725 9.07802 -12.1879 -8.30593 10.9971
+ MRP33 -6.11621 -7.82427 9.45043 -8.73285 -11.6841 11.2581
+ MRP34 -3.47443 -9.27031 8.76319 -4.57786 -13.9783 10.04
+ MRT11 7.35783 -8.95613 3.42973 8.96071 -13.6931 3.36371
+ MRT12 3.18528 -10.2006 3.97731 4.01894 -15.131 3.88416
+ MRT13 0.163654 -10.3113 4.95292 -0.291589 -15.2916 5.00034
+ MRT14 -2.80721 -10.2037 5.73537 -3.67394 -15.1293 5.75371
+ MRT15 -5.69846 -8.93087 6.40139 -7.77051 -13.4726 6.70471
+ MRT16 -8.14655 -7.07335 6.48481 -11.6537 -10.5794 7.13316
+ MRT21 8.9223 -8.21674 0.92677 11.2664 -12.6331 0.810166
+ MRT22 5.08976 -9.67627 1.68012 6.60893 -14.4385 1.51832
+ MRT23 1.05293 -10.8959 2.0395 0.997696 -15.8966 1.98687
+ MRT24 -1.96853 -10.8517 2.74041 -2.54508 -15.8195 2.70371
+ MRT25 -5.02706 -10.2577 3.38722 -6.59675 -15.0061 3.41225
+ MRT26 -7.41965 -7.86832 3.6667 -10.3938 -11.8867 3.80759
+ MRT31 6.59421 -9.10791 -0.964231 8.17209 -13.8518 -1.10127
+ MRT32 2.92409 -10.1208 -0.340806 3.70329 -15.06 -0.439392
+ MRT33 -0.581357 -10.9589 -0.245669 -1.09681 -15.9333 -0.296408
+ MRT34 -3.79206 -10.6467 0.34972 -4.7993 -15.5451 0.288299
+ MRT35 -6.71049 -8.9923 0.812736 -8.97932 -13.4493 0.829584
+ MRT41 4.39355 -9.74924 -2.96818 5.67052 -14.5828 -3.10546
+ MRT42 1.1087 -10.8346 -2.87428 0.939914 -15.8321 -2.96821
+ MRT43 -2.19632 -10.7572 -2.61693 -2.92289 -15.7052 -2.6713
+ MRT44 -5.4643 -10.0776 -2.17081 -7.26274 -14.7443 -2.19027
+ MZC01 3.80106 -0.609105 14.278 5.1994 -0.942972 19.0682
+ MZC02 -1.44037 -0.281826 15.2444 -1.44856 -0.505828 20.2406
+ MZF01 13.5483 -0.843879 3.22514 18.5374 -1.18422 3.15103
+ MZF02 12.5461 -0.970928 8.48035 16.9593 -1.43099 10.7878
+ MZF03 8.8011 -0.885649 12.2682 11.2482 -1.26282 16.6135
+ MZO01 -11.3049 0.892652 4.24027 -16.2624 1.26524 4.78533
+ MZO02 -11.6667 1.16014 -0.982373 -16.644 1.51473 -0.645077
+ MZP01 -6.47856 0.137594 13.504 -9.59148 0.23432 17.4172
+ MZP02 -9.84413 0.586053 9.37474 -14.2462 0.789525 11.7397
+
7 debian/changelog
@@ -1,3 +1,10 @@
+pymvpa (0.3.1-1) unstable; urgency=low
+
+ * New upstream release.
+ * Added dependency to ctypes for python 2.4.
+
+ -- Michael Hanke <michael.hanke@gmail.com> Sun, 14 Sep 2008 13:53:31 +0200
+
pymvpa (0.3.0-1) unstable; urgency=low
* New upstream release.
2  debian/control
@@ -12,7 +12,7 @@ Vcs-Git: git://git.debian.org/git/pkg-exppsy/pymvpa.git
Package: python-mvpa
Architecture: any
-Depends: ${shlibs:Depends}, ${python:Depends}, python-numpy
+Depends: ${shlibs:Depends}, ${python:Depends}, python-numpy, python-ctypes (>= 1.0.1) | python (>= 2.5)
Provides: ${python:Provides}
XB-Python-Version: ${python:Versions}
Recommends: python-nifti, python-psyco, python-mdp, python-scipy, shogun-python-modular, python-pywt
3  debian/rules
@@ -24,7 +24,8 @@ install/python-mvpa-doc::
# install directly into package directory (despite multiple packages)
DEB_DESTDIR = $(CURDIR)/debian/python-mvpa
# immediately useable documentation
-DEB_COMPRESS_EXCLUDE := .py .pdf .html .css .jpg .txt .js .json
+# and exemplar data (they are small excerpts anyways)
+DEB_COMPRESS_EXCLUDE := .py .pdf .html .css .jpg .txt .js .json .rtc .par .bin
# -doc package contents
DEB_INSTALL_DOCS_python-mvpa-doc := build/html build/latex/*.pdf
DEB_INSTALL_EXAMPLES_python-mvpa-doc := doc/examples/* data
5 doc/_templates/indexsidebar.html
@@ -5,8 +5,11 @@
<li><a href="http://git.debian.org/?p=pkg-exppsy/pymvpa.git;a=summary">Git repository</a></li>
<li><a href="http://lists.alioth.debian.org/pipermail/pkg-exppsy-pymvpa">Mailing list archive</a></li>
<li><a href="http://mloss.org/software/view/76/">MLOSS.org entry</a></li>
+<li><a href="http://software.incf.net/software/49/view/PyMVPA">INCF entry</a></li>
</ul>
+<script type="text/javascript" src="http://www.ohloh.net/projects/16363/widgets/project_partner_badge"></script>
+
<h3>Search mailing list archive</h3>
<script type="text/javascript">
function mlsearch(curobj)
@@ -18,3 +21,5 @@
<input name="userquery" size="18" type="text" /> <input type="submit" value="Go" />
<input name="q" type="hidden" />
</form>
+
+
27 doc/authors.txt
@@ -0,0 +1,27 @@
+.. -*- mode: rst -*-
+.. ex: set sts=4 ts=4 sw=4 et tw=79:
+
+
+The PyMVPA developer team currently consists of:
+
+* `Michael Hanke`_, University of Magdeburg, Germany
+* `Yaroslav O. Halchenko`_, Rutgers University Newark, USA
+* `Per B. Sederberg`_, Princeton University, USA
+* `Emanuele Olivetti`_, University of Trento, Italy
+
+.. _Michael Hanke: http://apsy.gse.uni-magdeburg.de/hanke
+.. _Yaroslav O. Halchenko: http://www.onerussian.com
+.. _Per B. Sederberg: http://www.princeton.edu/~persed/
+.. _Emanuele Olivetti: http://sra.itc.it/people/olivetti/
+
+
+We are very grateful to the following people, who have contributed
+valuable advice, code or documentation to PyMVPA:
+
+* `Greg Detre`_, Princeton University, USA
+* `James M. Hughes`_, Dartmouth College, USA
+* `Ingo Fründ`_, University of Magdeburg, Germany
+
+.. _Greg Detre: http://www.princeton.edu/~gdetre/
+.. _James M. Hughes: http://www.cs.dartmouth.edu/~hughes/index.html
+.. _Ingo Fründ: http://www-e.uni-magdeburg.de/fruend/
13 doc/conf.py
@@ -42,9 +42,9 @@
# other places throughout the built documents.
#
# The short X.Y version.
-version = '0.3.0'
+version = '0.3.1'
# The full version, including alpha/beta/rc tags.
-release = '0.3.0'
+release = '0.3.1'
# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
@@ -132,8 +132,13 @@
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, document class [howto/manual]).
latex_documents = [
- ('manual', 'PyMVPA-Manual.tex', 'PyMVPA Manual', 'Michael Hanke, Yaroslav Halchenko, Per B. Sederberg', 'manual'),
- ('devguide', 'PyMVPA-DevGuide.tex', 'PyMVPA Developer Guidelines', 'Michael Hanke, Yaroslav Halchenko, Per B. Sederberg', 'manual'),
+ ('manual', 'PyMVPA-Manual.tex', 'PyMVPA Manual',
+ 'Michael~Hanke, Yaroslav~O.~Halchenko, Per~B.~Sederberg, '
+   'James M. Hughes',
+ 'manual'),
+ ('devguide', 'PyMVPA-DevGuide.tex', 'PyMVPA Developer Guidelines',
+ 'Michael~Hanke, Yaroslav~O.~Halchenko, Per~B.~Sederberg',
+ 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
23 doc/datasets.txt
@@ -200,6 +200,29 @@ information like class labels and chunks are maintained. By calling
`mapReverse()` on the new dataset one can see that the remaining four features
are precisely mapped back onto their original locations in the data space.
+.. index:: syntactic sugaring
+.. _data_sugaring:
+
+Data Access Sugaring
+====================
+
+Complementary to the self-descriptive attribute names (e.g. `labels`,
+`samples`), datasets provide a few concise shortcuts for quick access to some
+attributes, or for performing some common actions:
+
+================ ============ ================
+Attribute        Abbreviation Definition class
+---------------- ------------ ----------------
+samples          S            Dataset_
+labels           L            Dataset_
+uniquelabels     UL           Dataset_
+chunks           O            Dataset_
+uniquechunks     UC           Dataset_
+origids          I            Dataset_
+samples_original O            MappedDataset_
+================ ============ ================
+
+
.. index:: data formats
.. _data_formats:
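The table above is easiest to digest with a tiny doctest-style sketch. This
assumes the abbreviations act as plain aliases of the corresponding attributes;
the toy dataset is made up for illustration:

    >>> from mvpa.suite import *
    >>> ds = Dataset(samples=N.random.randn(6, 4),
    ...              labels=[1, 1, 2, 2, 3, 3],
    ...              chunks=[0, 1, 0, 1, 0, 1])
    >>> N.all(ds.S == ds.samples) and N.all(ds.L == ds.labels)
    True
    >>> ds.UL                       # same values as ds.uniquelabels
    array([1, 2, 3])
    >>> ds.UC                       # same values as ds.uniquechunks
    array([0, 1])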
53 doc/devguide.txt
@@ -559,6 +559,13 @@ Things to implement for the next release (Release goals)
Long and medium term TODOs (aka stuff that has been here forever)
-----------------------------------------------------------------
+ * Agree upon sensitivities returned by the classifiers. Now SMLR/libsvm.SVM
+ returns (nfeatures x X), (where X is either just 1 for binary problems, or
+ nclasses in full multiclass in SMLR, or nclasses-1 for libsvm(?) or not-full
+ SMLR). In case of sg.SVM and GPR (I believe) it is just (nfeatures,).
+ MaskMapper puked on reverse in the first specification... think about
+ combiner -- should it or should not be there... etc
+
* selected_ids -> implement via MaskMapper?
yoh:
@@ -631,3 +638,49 @@ Long and medium term TODOs (aka stuff that has been here forever)
in --> data -> dataShape
out --> features ->
+
+Building a binary installer on MacOS X 10.5
+===========================================
+
+A simple way to build a binary installer for Mac OS is bdist_mpkg_. This is
+a setuptools extension that uses the proper native parts of MacOS to build the
+installer. However, for PyMVPA there are two problems with bdist_mpkg_:
+1. PyMVPA uses distutils not setuptools and 2. current bdist_mpkg_ 0.4.3 does
+not work for MacOS X 10.5 (Leopard). But both can be solved.
+
+Per 1) A simple wrapper script in `tools/mpkg_wrapper.py` will enable the use of
+setuptools on top of distutils, while keeping the distutils part in a usable
+state.
+
+Per 2) The following patch (against 0.4.3) makes bdist_mpkg_ compatible with
+MacOS 10.5. It basically changes the way bdist_mpkg_ determines the GID of the
+admin group. 10.5 removed the `nidump` command::
+
+
+ diff -rNu bdist_mpkg-0.4.3/bdist_mpkg/tools.py bdist_mpkg-0.4.3.leopard/bdist_mpkg/tools.py
+ --- bdist_mpkg-0.4.3/bdist_mpkg/tools.py 2006-07-09 00:39:00.000000000 -0400
+ +++ bdist_mpkg-0.4.3.leopard/bdist_mpkg/tools.py 2008-08-21 07:43:35.000000000 -0400
+ @@ -79,15 +79,12 @@
+ yield os.path.join(root, fn)
+
+ def get_gid(name, _cache={}):
+ - if not _cache:
+ - for line in os.popen('/usr/bin/nidump group .'):
+ - fields = line.split(':')
+ - if len(fields) >= 3:
+ - _cache[fields[0]] = int(fields[2])
+ - try:
+ - return _cache[name]
+ - except KeyError:
+ - raise ValueError('group %s not found' % (name,))
+ + for line in os.popen("dscl . -read /Groups/" + name + " PrimaryGroupID"):
+ + fields = [f.strip() for f in line.split(':')]
+ + if fields[0] == "PrimaryGroupID":
+ + return fields[1]
+ +
+ + raise ValueError('group %s not found' % (name,))
+
+ def find_root(path, base='/'):
+ """
+
+.. _bdist_mpkg: http://undefined.org/python/#bdist_mpkg
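For the curious: conceptually, such a wrapper only needs to import setuptools
before handing control to the distutils-based setup.py. A minimal sketch of the
idea, not necessarily identical to the `tools/mpkg_wrapper.py` added by this
commit:

    #!/usr/bin/env python
    # Sketch of a setuptools-on-top-of-distutils wrapper; the real
    # tools/mpkg_wrapper.py shipped with this commit may differ.
    import sys
    import setuptools   # importing setuptools monkey-patches distutils

    if __name__ == '__main__':
        # invoked as: python mpkg_wrapper.py setup.py <command> ...
        # drop our own name so setup.py sees the expected argv
        del sys.argv[0]
        execfile(sys.argv[0])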
11 doc/examples/clfs_examples.py
@@ -22,17 +22,19 @@ def main():
# Load Haxby dataset example
haxby1path = 'data'
- attrs = SampleAttributes(os.path.join(haxby1path, 'attributes.txt'))
+ attrs = SampleAttributes(os.path.join(haxby1path, 'attributes_literal.txt'))
haxby8 = NiftiDataset(samples=os.path.join(haxby1path, 'bold.nii.gz'),
labels=attrs.labels,
+ labels_map=True,
chunks=attrs.chunks,
mask=os.path.join(haxby1path, 'mask.nii.gz'),
dtype=N.float32)
# preprocess slightly
+ rest_label = haxby8.labels_map['rest']
detrend(haxby8, perchunk=True, model='linear')
- zscore(haxby8, perchunk=True, baselinelabels=[0], targetdtype='float32')
- haxby8_no0 = haxby8.selectSamples(haxby8.labels != 0)
+ zscore(haxby8, perchunk=True, baselinelabels=[rest_label], targetdtype='float32')
+ haxby8_no0 = haxby8.selectSamples(haxby8.labels != rest_label)
dummy2 = normalFeatureDataset(perlabel=30, nlabels=2,
nfeatures=100,
@@ -59,7 +61,7 @@ def main():
#print cv.confusion
# to report transfer error
- confusion = ConfusionMatrix()
+ confusion = ConfusionMatrix(labels_map=dataset.labels_map)
times = []
nf = []
t0 = time.time()
@@ -79,6 +81,7 @@ def main():
tfull = time.time() - t0
times = N.mean(times, axis=0)
nf = N.mean(nf)
+ # print "\n", confusion
print "%5.1f%% %-4d\t %.2fs %.2fs %.2fs" % \
(confusion.percentCorrect, nf, times[0], times[1], tfull)
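Condensed, the labels_map-related changes in this hunk amount to the following
sketch, which reuses only the calls visible above (paths refer to the example
`data` directory used by the script):

    from mvpa.suite import *

    # literal (string) labels are mapped onto numerical ones automatically
    attrs = SampleAttributes(os.path.join('data', 'attributes_literal.txt'))
    ds = NiftiDataset(samples=os.path.join('data', 'bold.nii.gz'),
                      labels=attrs.labels,
                      labels_map=True,       # request the string -> int mapping
                      chunks=attrs.chunks,
                      mask=os.path.join('data', 'mask.nii.gz'))

    # the mapping is kept on the dataset, e.g. to look up the 'rest' condition
    rest_label = ds.labels_map['rest']
    ds_task = ds.selectSamples(ds.labels != rest_label)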
70 doc/examples/erp_plot.py
@@ -0,0 +1,70 @@
+#!/usr/bin/env python
+#emacs: -*- mode: python-mode; py-indent-offset: 4; indent-tabs-mode: nil -*-
+#ex: set sts=4 ts=4 sw=4 et:
+### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ##
+#
+# See COPYING file distributed along with the PyMVPA package for the
+# copyright and license terms.
+#
+### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ##
+"""Example demonstrating an ERP plot. Actually it is ERF plot since we have MEG data"""
+
+from mvpa.suite import *
+
+# load data
+meg = TuebingenMEG(os.path.join('data', 'tueb_meg.dat.gz'))
+
+
+# Define plots for easy feeding into plotERP
+plots = []
+colors = ['r', 'b', 'g']
+
+# figure out pre-stimulus onset interval
+t0 = -meg.timepoints[0]
+
+plots = [ {'label' : meg.channelids[i],
+ 'color' : colors[i],
+ 'data' : meg.data[:, i, :]}
+ for i in xrange(len(meg.channelids)) ]
+
+# Common arguments for all plots
+cargs = {
+ 'SR' : meg.samplingrate,
+ 'pre_onset' : t0,
+ # Plot only 50ms before and 100ms after the onset since we have
+    # just a few trials
+ 'pre' : 0.05, 'post' : 0.1,
+ # Plot all 'errors' in different degrees of shadings
+ 'errtype' : ['ste', 'ci95', 'std'],
+ # Set to None if legend manages to obscure the plot
+ 'legend' : 'best',
+ 'alinewidth' : 1 # assume that we like thin lines
+ }
+
+# Create a new figure
+fig = P.figure(figsize=(12, 8))
+
+# The following plots are drawn inverted (negative up), both to demonstrate
+# this capability and to follow the old plotting convention for ERP
+# plots. That is controlled with ymult (a negative value gives negative up)
+
+
+# Plot MEG sensors
+
+# frame_on=False guarantees that the outer rectangular axis with labels is
+# absent. plotERP recreates its own axes centered at (0,0)
+ax = fig.add_subplot(2, 1, 1, frame_on=False)
+plotERPs(plots[:2], ylabel='$pT$', ymult=-1e12, ax=ax, **cargs)
+
+# Plot EEG sensor
+ax = fig.add_subplot(2, 1, 2, frame_on=False)
+plotERPs(plots[2:3], ax=ax, ymult=-1e6, **cargs)
+
+# Additional example: plotting a single ERP on an existing plot
+# without drawing axis:
+#
+# plotERP(data=meg.data[:, 0, :], SR=meg.samplingrate, pre=pre, pre_mean=pre, errtype=errtype, ymult=-1.0)
+
+if cfg.getboolean('examples', 'interactive', True):
+ # show all the cool figures
+ P.show()
0  doc/examples/kerneldemo.py 100644 → 100755
File mode changed
0  doc/examples/projections.py 100644 → 100755
File mode changed
26 doc/examples/sensanas.py
@@ -29,20 +29,23 @@
sensanas = {'a) ANOVA': OneWayAnova(transformer=N.abs),
'b) Linear SVM weights': LinearNuSVMC().getSensitivityAnalyzer(
transformer=N.abs),
- 'c) Splitting ANOVA (odd-even)':
+ 'c) I-RELIEF': IterativeRelief(transformer=N.abs),
+ 'd) Splitting ANOVA (odd-even)':
SplitFeaturewiseMeasure(OneWayAnova(transformer=N.abs),
OddEvenSplitter()),
- 'd) Splitting SVM (odd-even)':
+ 'e) Splitting SVM (odd-even)':
SplitFeaturewiseMeasure(
LinearNuSVMC().getSensitivityAnalyzer(transformer=N.abs),
OddEvenSplitter()),
- 'e) Splitting ANOVA (nfold)':
+ 'f) I-RELIEF Online':
+ IterativeReliefOnline(transformer=N.abs),
+ 'g) Splitting ANOVA (nfold)':
SplitFeaturewiseMeasure(OneWayAnova(transformer=N.abs),
NFoldSplitter()),
- 'f) Splitting SVM (nfold)':
+ 'h) Splitting SVM (nfold)':
SplitFeaturewiseMeasure(
LinearNuSVMC().getSensitivityAnalyzer(transformer=N.abs),
- NFoldSplitter())
+ NFoldSplitter()),
}
# do chunkswise linear detrending on dataset
@@ -60,7 +63,7 @@
dtype='bool'))
fig = 0
-P.figure(figsize=(8,8))
+P.figure(figsize=(14,8))
keys = sensanas.keys()
keys.sort()
@@ -70,7 +73,9 @@
print "Running %s ..." % (s)
# compute sensitivies
- smap = sensanas[s](dataset)
+ # I-RELIEF assigns zeros, which corrupts voxel masking for pylab's imshow,
+ # so adding some epsilon :)
+ smap = sensanas[s](dataset)+0.00001
# map sensitivity map into original dataspace
orig_smap = dataset.mapReverse(smap)
@@ -78,7 +83,7 @@
# make a new subplot for each classifier
fig += 1
- P.subplot(3,2,fig)
+ P.subplot(3,3,fig)
P.title(s)
@@ -90,12 +95,13 @@
# uniform scaling per base sensitivity analyzer
if s.count('ANOVA'):
P.clim(0, 0.4)
- else:
+ elif s.count('SVM'):
P.clim(0, 0.055)
+ else:
+ pass
P.colorbar(shrink=0.6)
-
if cfg.getboolean('examples', 'interactive', True):
# show all the cool figures
P.show()
0  doc/examples/smellit.py 100644 → 100755
File mode changed
38 doc/examples/svdclf.py
@@ -15,37 +15,6 @@
if __debug__:
debug.active += ["CROSSC"]
-# plotting helper function
-def makeBarPlot(data, labels=None, title=None, ylim=None, ylabel=None):
- xlocations = N.array(range(len(data))) + 0.5
- width = 0.5
-
- # work with arrays
- data = N.array(data)
-
- # plot bars
- plot = P.bar(xlocations,
- data.mean(axis=1),
- yerr=data.std(axis=1) / N.sqrt(data.shape[1]),
- width=width,
- color='0.6',
- ecolor='black')
- P.axhline(0.5, ls='--', color='0.4')
-
- if ylim:
- P.ylim(*(ylim))
- if title:
- P.title(title)
-
- if labels:
- P.xticks(xlocations+ width/2, labels)
-
- if ylabel:
- P.ylabel(ylabel)
-
- P.xlim(0, xlocations[-1]+width*2)
-
-
#
# load PyMVPA example dataset
#
@@ -109,9 +78,10 @@ def makeBarPlot(data, labels=None, title=None, ylim=None, ylabel=None):
results.append(cv.results)
labels.append(desc)
-makeBarPlot(results, labels=labels,
- title='Linear C-SVM classification (cats vs. scissors)',
- ylabel='Mean classification error (N-1 cross-validation, 12-fold)')
+plotBars(results, labels=labels,
+ title='Linear C-SVM classification (cats vs. scissors)',
+ ylabel='Mean classification error (N-1 cross-validation, 12-fold)',
+ distance=0.5)
if cfg.getboolean('examples', 'interactive', True):
P.show()
47 doc/examples/topo_plot.py
@@ -0,0 +1,47 @@
+#!/usr/bin/env python
+#emacs: -*- mode: python-mode; py-indent-offset: 4; indent-tabs-mode: nil -*-
+#ex: set sts=4 ts=4 sw=4 et:
+### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ##
+#
+# See COPYING file distributed along with the PyMVPA package for the
+# copyright and license terms.
+#
+### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ##
+"""Example demonstrating a topography plot."""
+
+from mvpa.suite import *
+
+
+# EEG example splot
+P.subplot(1, 2, 1)
+
+# load the sensor information from their definition file.
+# This file has sensor names, as well as their 3D coordinates
+sensors=XAVRSensorLocations(os.path.join('data', 'xavr1010.dat'))
+
+# make up some artificial topography:
+# 'enable' two channels, all others set to off ;-)
+topo = N.zeros(len(sensors.names))
+topo[sensors.names.index('O1')] = 1
+topo[sensors.names.index('F4')] = 1
+
+# plot with sensor locations shown
+plotHeadTopography(topo, sensors.locations(), plotsensors=True)
+
+
+# MEG example plot
+P.subplot(1, 2, 2)
+
+# load MEG sensor locations
+sensors=TuebingenMEGSensorLocations(os.path.join('data', 'tueb_meg_coord.xyz'))
+
+# random values this time
+topo = N.random.randn(len(sensors.names))
+
+# plot without additional interpolation
+plotHeadTopography(topo, sensors.locations(), interpolation='nearest')
+
+
+if cfg.getboolean('examples', 'interactive', True):
+ # show all the cool figures
+ P.show()
51 doc/faq.txt
@@ -217,3 +217,54 @@ the classifier is configured to select the single most important feature (given
the SMLR weights). After enabling the `feature_ids` state, the classifier
provides the desired information, that can e.g. be applied to generate a
stripped dataset for an analysis of the similarity structure.
+
+
+.. index:: sensitivity, cross-validation
+
+How do I extract sensitivities from a classifier used within a cross-validation?
+--------------------------------------------------------------------------------
+
+.. The answer depends on size of the classification problem and the used
+ classifier. If you can afford to keep a copy of the trained classifier for
+ each data split, the most elegant solution is probably a SplitClassifier_...
+   ...BUT not yet
+
+CrossValidatedTransferError_ provides an interface to access any
+classifier-related information: `harvest_attribs`. Harvesting the sensitivities
+computed by all classifiers (without recomputing them again) looks like this:
+
+ >>> cv = CrossValidatedTransferError(
+ ... TransferError(SMLR()),
+ ... OddEvenSplitter(),
+ ... harvest_attribs=\
+ ... ['transerror.clf.getSensitivityAnalyzer(force_training=False)()'])
+ >>> merror = cv(dataset)
+ >>> sensitivities = cv.harvested.values()[0]
+ >>> N.array(sensitivities).shape == (2, dataset.nfeatures)
+ True
+
+First, we define an instance of CrossValidatedTransferError_ that uses an SMLR_
+classifier to perform the cross-validation on odd-even splits of a dataset.
+The important piece is the definition of the `harvest_attribs`. It takes a
+list of code snippets that will be executed in the local context of the
+cross-validation function. The TransferError_ instance used to train and test
+the classifier on each split is available via `transerror`. The rest is easy:
+TransferError_ provides access to its classifier and any classifier can in turn
+generate an appropriate Sensitivity_ instance via `getSensitivityAnalyzer()`.
+This generator method takes additional arguments to the constructor of the
+Sensitivity_ class. In this case we want to prevent retraining the classifiers,
+as they will be trained anyway by the TransferError_ instance they belong to.
+
+The return values of all code snippets defined in `harvest_attribs` are
+available in the `harvested` state variable. `harvested` is a dictionary where
+the keys are the code snippets used to compute the value. As the key in this
+case is pretty long, we simply take the first (and only) value from the
+dictionary. The value is actually a list of sensitivity vectors, one per
+split.
+
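If a single sensitivity vector is preferred over the per-split list, a plain
average across splits is one option; a short sketch continuing the snippet
above:

    >>> mean_sens = N.array(sensitivities).mean(axis=0)
    >>> mean_sens.shape == (dataset.nfeatures,)
    True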
+.. _CrossValidatedTransferError : api/mvpa.algorithms.cvtranserror.CrossValidatedTransferError-class.html
+.. _SplitClassifier : api/mvpa.clfs.base.SplitClassifier-class.html
+.. _Sensitivity : api/mvpa.measures.base.Sensitivity-class.html
+.. _TransferError : api/mvpa.clfs.transerror.TransferError-class.html
+.. _SMLR : api/mvpa.clfs.smlr.SMLR-class.html
+
196 doc/featsel.txt
@@ -15,18 +15,208 @@
Feature Selection
*****************
+ *This section has been contributed by James M. Hughes.*
+
+It is often the case in machine learning problems that we wish to reduce a
+feature space of high dimensionality into something more manageable by
+selecting only those features that contribute most to classification
+performance. Feature selection methods attempt to achieve this goal in an
+algorithmic fashion.
+
+.. index:: FeatureSelectionClassifier
+
+PyMVPA's flexible framework allows various feature selection methods to take
+place within a small block of code. FeatureSelectionClassifier_ extends the
+basic classifier framework to allow for the use of arbitrary methods of feature
+selection according to whatever ranking metric, feature selection criteria, and
+stopping criterion the user chooses for a given application. Examples of the
+code/classification algorithms presented here can be found in
+`mvpa/clfs/warehouse.py`_.
+
+More formally, a FeatureSelectionClassifier_ is a meta-classifier. That is, it
+is not a classifier itself -- it can take any *slave* Classifier_, perform some
+feature selection in advance, select those features, and then train the
+provided *slave* Classifier_ on those features. Externally, however, it looks
+like a Classifier_, in that it fulfills the specialization of the Classifier
+base class. The following are the relevant arguments to the constructor of
+such a Classifier_:
+
+`clf`: Classifier_
+  classifier based on which the masked classifier is created
+
+`feature_selection`: FeatureSelection_
+ whatever feature selection is considered best
+
+`testdataset`: Dataset_ (optional)
+ dataset which would be given on call to feature_selection
+
+.. index:: FeatureSelection
+
+Let us turn our attention to the second argument, FeatureSelection_. As noted
+above, this feature selection can be arbitrary and should be chosen
+appropriately for the task at hand. For example, we could perform a one-way
+ANOVA statistic to select features, then keep only the most important 5% of
+them. It is crucial to note that, in PyMVPA, the way in which features are
+selected (in this example by keeping only 5% of them) is wholly independent of
+the way features are ranked (in this example, by using a one-way ANOVA).
+Feature selection using this method could be accomplished using the following
+code (from `mvpa/clfs/warehouse.py`_):
+
+ >>> from mvpa.suite import *
+ >>> FeatureSelection = SensitivityBasedFeatureSelection(
+ ... OneWayAnova(),
+ ... FractionTailSelector(0.05, mode='select', tail='upper'))
+
+A more interesting analysis is one in which we use the weights (hyperplane
+coefficients) to rank features. This allows us to use the same classifier to
+train the selected features as we used to select them:
+
+.. here we'll put the warehouse.py example of linear svm weights from yarik's
+ email
+
+ >>> sample_linear_svm = clfs['linear', 'svm'][0]
+ >>> clf = \
+ ... FeatureSelectionClassifier(
+ ... sample_linear_svm,
+ ... SensitivityBasedFeatureSelection(
+ ... sample_linear_svm.getSensitivityAnalyzer(transformer=Absolute),
+ ... FractionTailSelector(0.05, mode='select', tail='upper')),
+ ... descr="LinSVM on 5%(SVM)")
+
+It bears mentioning at this point that caution must be exercised when selecting
+features. The process of feature selection must be performed on an independent
+training dataset: it is not possible to select features using the entire
+dataset, re-train a classifier on a subset of the original data (but using only
+the selected features) and then test on a held-out testing dataset. This
+results in an obvious positive bias in classification performance. PyMVPA
+allows for easy dataset splitting, however, so creating independent training
+and testing datasets is easily accomplished, for instance using an
+NFoldSplitter_, OddEvenSplitter_, etc.
+
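A minimal sketch of such a split-then-select workflow, assuming (as elsewhere
in the documentation) that calling a splitter on a dataset generates pairs of
training and testing datasets:

    >>> for training_ds, testing_ds in NFoldSplitter()(dataset):
    ...     # rank and select features using training_ds only, then train
    ...     # the classifier on training_ds and evaluate it on testing_ds
    ...     pass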
+.. fill in end of last paragraph with suggestions for how to take in an entire
+ original dataset and split it: should we just do a cross-validated outer
+ loop that uses multiple training/testing splits and does RFE on each of
+ these splits?
+
+
+
.. index:: recursive feature selection, RFE
.. _recursive_feature_elimination:
Recursive Feature Elimination
=============================
-RFE_
-
-(to be written)
+Recursive feature elimination (RFE_) is a technique that falls under the larger
+umbrella of feature selection. Recursive feature elimination specifically
+attempts to reduce the number of selected features used for classification in
+the following way:
+
+* A classifier is trained on a subset of the data and features are ranked
+ according to an arbitrary metric.
+
+* Some number of those features is either selected or discarded according to a
+ pre-selected rule.
+
+* The classifier is retrained and features are once again ranked; this process
+  continues until some criterion determined *a priori* (such as
+ classification error) is reached.
+
+* One or more classifiers trained only on the final set of selected features
+ are used on a generalization dataset and performance is calculated.
+
+PyMVPA's flexible framework allows each of these steps to take place within a
+small block of code. To actually perform recursive feature elimination, we
+consider two separate analysis scenarios that deal with a pre-selected training
+dataset:
+
+* We split the training dataset into an arbitrary number of independent
+ datasets and perform RFE on each of these; the sensitivity analysis of
+ features is performed independently for each split and features are selected
+ based on those independent measures.
+
+* We split the training dataset into an arbitrary number of independent
+ datasets (as before), but we average the feature sensitivities and select
+ which features to prune/select based on that one average measure.
+
+.. index:: SplitClassifier
+
+We will concentrate on the second approach. The following code can be used to
+perform such an analysis:
+
+ >>> rfesvm_split = SplitClassifier(LinearCSVMC())
+ >>> clf = \
+ ... FeatureSelectionClassifier(
+ ... clf = LinearCSVMC(),
+ ... # on features selected via RFE
+ ... feature_selection = RFE(
+ ... # based on sensitivity of a clf which does splitting internally
+ ... sensitivity_analyzer=rfesvm_split.getSensitivityAnalyzer(),
+ ... transfer_error=ConfusionBasedError(
+ ... rfesvm_split,
+ ... confusion_state="confusion"),
+ ... # and whose internal error we use
+ ... feature_selector=FractionTailSelector(
+ ... 0.2, mode='discard', tail='lower'),
+ ... # remove 20% of features at each step
+ ... update_sensitivity=True),
+ ... # update sensitivity at each step
+ ... descr='LinSVM+RFE(splits_avg)' )
+
+The code above introduces the SplitClassifier_, which in this case is yet
+another *meta-classifier* that takes in a Classifier_ (in this case a
+LinearCSVMC_) and an arbitrary Splitter_ object, so that the dataset can be
+split in whatever way the user desires. Prior to training, the
+SplitClassifier_ splits the training dataset, dedicates a separate classifier
+to each split, trains each on the training part of the split, and then computes
+transfer error on the testing part of the split. If a SplitClassifier_ instance
+is later on asked to *predict* some new data, it uses (by default) the
+MaximalVote_ strategy to derive an answer. A summary about the performance of
+a SplitClassifier_ internally on each split of the training dataset is
+available by accessing the `confusion` state variable.
+
+To summarize somewhat, RFE_ is just one method of feature selection, so we use a
+FeatureSelectionClassifier_ to facilitate this. To parameterize the RFE
+process, we refer above to the following:
+
+`sensitivity_analyzer`
+ in this case just the default from a linear C-SVM (the SVM weights), taken as
+ an average over all splits (in accordance with scenario 2 as above)
+
+`transfer_error`
+ confusion-based error that relies on the confusion matrices computed during
+ splitting of the dataset by the SplitClassifier_; this is used to provide a
+ value that can be compared against a stopping criterion to stop eliminating
+ features
+
+`feature_selector`
+ in this example we simply discard the 20% of features deemed least important
+
+`update_sensitivity`
+ true to retrain the classifiers each time we eliminate features; should be
+ false if a non-classifier-based sensitivity measure (such as one-way ANOVA)
+ is used
+
+As has been shown, recursive feature elimination is an easy-to-implement,
+flexible, and powerful tool within the PyMVPA framework. Various ranking
+methods for selecting features have been discussed. Additionally, several
+analysis scenarios have been presented, along with enough requisite knowledge
+that the user can plug in whatever classifiers, error metrics, or sensitivity
+measures are most appropriate for the task at hand.
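Since the `clf` constructed above behaves like any other Classifier_, it can be
plugged directly into the usual cross-validation machinery. A minimal sketch,
assuming `dataset` is any labeled PyMVPA dataset:

    >>> cv = CrossValidatedTransferError(TransferError(clf),
    ...                                  NFoldSplitter())
    >>> error = cv(dataset)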
.. _RFE: api/mvpa.featsel.rfe.RFE-class.html
+.. _Dataset: api/mvpa.datasets.base.Dataset-class.html
+.. _Splitter: api/mvpa.datasets.splitter.Splitter-class.html
+.. _NFoldSplitter: api/mvpa.datasets.splitter.NFoldSplitter-class.html
+.. _OddEvenSplitter: api/mvpa.datasets.splitter.OddEvenSplitter-class.html
+.. _MaximalVote: api/mvpa.clfs.base.MaximalVote-class.html
+.. _Classifier: api/mvpa.clfs.base.Classifier-class.html
+.. _FeatureSelectionClassifier: api/mvpa.clfs.base.FeatureSelectionClassifier-class.html
+.. _SplitClassifier: api/mvpa.clfs.base.SplitClassifier-class.html
+.. _FeatureSelection: api/mvpa.featsel.base.FeatureSelection-class.html
+.. _LinearCSVMC: api/mvpa.clfs.svm.LinearCSVMC-class.html
+.. _mvpa/clfs/warehouse.py: api/mvpa.clfs.warehouse-pysrc.html