
Commit

Merge branch 'master' of https://github.com/nilearn/nilearn into sprint_modify_fetch_dev

* 'master' of https://github.com/nilearn/nilearn: (228 commits)
  [MRG] Nans in view connectome (nilearn#2166)
  [MRG] FIX: orientation problem with plot_glass_brain (nilearn#1888)
  Import HTMLDocument in its original module to preserve backwards compatibility (nilearn#2162)
  Fixing malfunctioning allowed-failure section in Travis (nilearn#2160)
  Update Brainomics fetcher (nilearn#2097)
  Remove inplace modification in signal.clean (nilearn#2125)
  Iter age group prediction example (nilearn#2063)
  Expose bg_img, vmin and vmax in plot_img signature (nilearn#2157)
  Replace conda with pip in TravisCI setup (nilearn#2141)
  Core devs doc and add @emdupre (nilearn#2151)
  Add check for vmin, vmax in plot_surf_roi (nilearn#2052)
  [ENH] Initial visual reports (nilearn#2019)
  Renamed test to deduplicate name (nilearn#2144)
  Fixes nilearn#2029 Handle gzip files without extensions (nilearn#2126)
  Update Schaefer parcellation to v0.14.3 (nilearn#2138)
  MAINT: Future-compatible cmap reversal (nilearn#2131)
  fix openmp crash (nilearn#2140)
  change nose to pytest on appveyor (nilearn#2130)
  adding fix to whatsnew
  fix wrong urls in nki dataset
  ...

# Conflicts:
#	nilearn/datasets/func.py
kchawla-pi committed Oct 12, 2019
2 parents e84e1af + d566678 commit 1bcfdba
Showing 156 changed files with 6,275 additions and 1,517 deletions.
1 change: 1 addition & 0 deletions .circleci/config.yml
@@ -15,6 +15,7 @@ jobs:
NUMPY_VERSION: "*"
SCIPY_VERSION: "*"
SCIKIT_LEARN_VERSION: "*"
JOBLIB_VERSION: "*"
MATPLOTLIB_VERSION: "*"

steps:
5 changes: 5 additions & 0 deletions .coveragerc
@@ -0,0 +1,5 @@
[run]
branch = True
parallel = True
omit =
*/nilearn/externals/*
47 changes: 28 additions & 19 deletions .travis.yml
@@ -2,49 +2,58 @@ sudo: required
dist: xenial

language: python
python: "3.5"

virtualenv:
system_site_packages: true

env:
global:
- TEST_RUN_FOLDER="/tmp" # folder where the tests are run from
- TEST_RUN_FOLDER="/tmp"

matrix:
# Do not wait for the allowed_failures entry to finish before
# setting the status
fast_finish: true
allow_failures:
# allow_failures seems to be keyed on the python version.
- python: 3.5
# allow_failures keyed to python 3.5 & skipping tests.
- python: "3.5"
env: DISTRIB="travisci" PYTHON_VERSION="3.5" FLAKE8_VERSION="*" SKIP_TESTS="true"
include:
# without matplotlib
- env: DISTRIB="conda" PYTHON_VERSION="3.5"
NUMPY_VERSION="1.11" SCIPY_VERSION="0.17" PANDAS_VERSION="*"
SCIKIT_LEARN_VERSION="0.18" COVERAGE="true"
- name: "Python 3.5 minimum package versions without Matplotlib"
python: "3.5"
env: DISTRIB="travisci" PYTHON_VERSION="3.5"
NUMPY_VERSION="1.11" SCIPY_VERSION="0.19" PANDAS_VERSION="*"
SCIKIT_LEARN_VERSION="0.19" COVERAGE="true" JOBLIB_VERSION="0.11"
LXML_VERSION="*"
- env: DISTRIB="conda" PYTHON_VERSION="3.5"
- name: "Python 3.5 latest package versions"
python: "3.5"
env: DISTRIB="travisci" PYTHON_VERSION="3.5"
NUMPY_VERSION="*" SCIPY_VERSION="*" PANDAS_VERSION="*"
SCIKIT_LEARN_VERSION="*" MATPLOTLIB_VERSION="*" COVERAGE="true"
JOBLIB_VERSION="0.11"
LXML_VERSION="*"
- env: DISTRIB="conda" PYTHON_VERSION="3.6"
- name: "Python 3.6 latest package versions"
python: "3.6"
env: DISTRIB="travisci" PYTHON_VERSION="3.6"
NUMPY_VERSION="*" SCIPY_VERSION="*" PANDAS_VERSION="*"
SCIKIT_LEARN_VERSION="*" MATPLOTLIB_VERSION="*" COVERAGE="true"
LXML_VERSION="*"
- env: DISTRIB="conda" PYTHON_VERSION="3.7"
JOBLIB_VERSION="0.12" LXML_VERSION="*"
# joblib.Memory switches from keyword cachedir to location in version 0.12
# Making sure we get the deprecation warning.

- name: "Python 3.7 latest package versions"
python: "3.7"
env: DISTRIB="travisci" PYTHON_VERSION="3.7"
NUMPY_VERSION="*" SCIPY_VERSION="*" PANDAS_VERSION="*"
SCIKIT_LEARN_VERSION="*" MATPLOTLIB_VERSION="*" COVERAGE="true"
LXML_VERSION="*"
JOBLIB_VERSION="*" LXML_VERSION="*"

# FLAKE8 linting on diff wrt common ancestor with upstream/master
# Note: the python value is only there to trigger allow_failures
- python: 3.5
env: DISTRIB="conda" PYTHON_VERSION="3.5" FLAKE8_VERSION="*" SKIP_TESTS="true"
- name: Python 3.5 Flake8 no tests
python: "3.5"
env: DISTRIB="travisci" PYTHON_VERSION="3.5" FLAKE8_VERSION="*" SKIP_TESTS="true"

install: source continuous_integration/install.sh

before_script: make clean
before_script: make clean

script: source continuous_integration/test_script.sh

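The comment in the .travis.yml hunk above refers to joblib.Memory renaming its cache-directory keyword from cachedir to location in joblib 0.12; the JOBLIB_VERSION="0.12" pin is there so the test run hits the resulting deprecation warning. A minimal Python sketch of the API change it describes (the cache path is only illustrative):

from joblib import Memory

# joblib < 0.12 spelled the cache directory argument ``cachedir``:
#     memory = Memory(cachedir="/tmp/joblib_cache")
# From 0.12 onward the keyword is ``location``; passing ``cachedir`` emits a
# DeprecationWarning (which the pinned JOBLIB_VERSION="0.12" build is meant
# to surface), and the old keyword has since been removed entirely.
memory = Memory(location="/tmp/joblib_cache", verbose=0)

@memory.cache
def square(x):
    return x ** 2

print(square(3))  # computed once, then read back from the on-disk cache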
71 changes: 37 additions & 34 deletions AUTHORS.rst
@@ -3,38 +3,36 @@
People
------

This work is made available by a community of people, amongst which
This work is made available by a community of people, which
originated from
the `INRIA Parietal Project Team <https://team.inria.fr/parietal/>`_
and the `scikit-learn <http://scikit-learn.org/>`_ folks, in
particular:

* Alexandre Abraham
* `Alexandre Gramfort <http://alexandre.gramfort.net>`_
* Vincent Michel
* Bertrand Thirion
* `Fabian Pedregosa <http://fa.bianp.net/>`_
* `Gael Varoquaux <http://gael-varoquaux.info/>`_
* Philippe Gervais
* Michael Eickenberg
* Danilo Bzdok
* Loïc Estève
* Kamalakar Reddy Daddy
* Elvis Dohmatob
* Alexandre Abadie
* Andres Hoyos Idrobo
* Salma Bougacha
* Mehdi Rahim
* Sylvain Lanuzel
* `Kshitij Chawla <https://github.com/kchawla-pi>`_

Many have also contributed outside of Parietal, notably:

* `Chris Filo Gorgolewski <http://multiplecomparisons.blogspot.fr/>`_
* `Ben Cipollini <http://cseweb.ucsd.edu/~bcipolli/>`_
* Julia Huntenburg
* Martin Perez-Guevara

Thanks to M. Hanke and Y. Halchenko for data and packaging.
and the `scikit-learn <http://scikit-learn.org/>`_ folks, but grew much further.

An up-to-date list of contributors can be seen on `github
<https://github.com/nilearn/nilearn/graphs/contributors>`_

Additional credit goes to M. Hanke and Y. Halchenko for data and packaging.

.. _core_devs:

Core developers
.................

The nilearn core developers are:

* Alexandre Gramfort https://github.com/agramfort
* Ben Cipollini https://github.com/bcipolli
* Bertrand Thirion https://github.com/bthirion
* Chris Gorgolewski https://github.com/chrisgorgo
* Danilo Bzdok https://github.com/banilo
* Elizabeth DuPre https://github.com/emdupre
* Gael Varoquaux https://github.com/GaelVaroquaux
* Jerome Dockes https://github.com/jeromedockes
* Julia Huntenburg https://github.com/juhuntenburg
* KamalakerDadi https://github.com/KamalakerDadi
* Kshitij Chawla https://github.com/kchawla-pi
* Mehdi Rahim https://github.com/mrahim
* Salma Bougacha https://github.com/salma1601

Funding
........
@@ -45,7 +43,8 @@ Mehdi Rahim, Philippe Gervais were paid by the `NiConnect
project, funded by the French `Investissement d'Avenir
<http://www.gouvernement.fr/investissements-d-avenir-cgi>`_.

NiLearn is also supported by `DigiCosme <https://digicosme.lri.fr>`_ |digicomse logo|
NiLearn is also supported by `DigiCosme <https://digicosme.lri.fr>`_
|digicosme logo| and `DataIA <https://dataia.eu/en>`_ |dataia_logo|.

.. _citing:

@@ -74,6 +73,10 @@ See the scikit-learn documentation on `how to cite
<http://scikit-learn.org/stable/about.html#citing-scikit-learn>`_.


.. |digicomse logo| image:: logos/digi-saclay-logo-small.png
.. |digicosme logo| image:: logos/digi-saclay-logo-small.png
:height: 25
:alt: DigiComse Logo

.. |dataia_logo| image:: logos/dataia.png
:height: 25
:alt: DigiComse Logo
:alt: DataIA Logo
14 changes: 5 additions & 9 deletions Makefile
@@ -4,9 +4,6 @@

PYTHON ?= python
CYTHON ?= cython
NOSETESTS ?= nosetests
NOSETESTS_OPTIONS := $(shell pip list | grep nose-timer > /dev/null && \
echo '--with-timer --timer-top-n 50')
CTAGS ?= ctags

all: clean test doc-noplot
@@ -32,15 +29,15 @@ inplace:
$(PYTHON) setup.py build_ext -i

test-code:
$(NOSETESTS) -s nilearn $(NOSETESTS_OPTIONS)
python -m pytest --pyargs nilearn --cov=nilearn

test-doc:
$(NOSETESTS) -s --with-doctest --doctest-tests --doctest-extension=rst \
--doctest-extension=inc --doctest-fixtures=_fixture `find doc/ -name '*.rst'`
pytest --doctest-glob='*.rst' `find doc/ -name '*.rst'`


test-coverage:
rm -rf coverage .coverage
$(NOSETESTS) -s --with-coverage --cover-html --cover-html-dir=coverage \
--cover-package=nilearn nilearn
pytest --pyargs nilearn --showlocals --cov=nilearn --cov-report=html:coverage

test: test-code test-doc

@@ -66,4 +63,3 @@ doc:
.PHONY : pdf
pdf:
make -C doc pdf

5 changes: 3 additions & 2 deletions README.rst
@@ -41,8 +41,9 @@ The required dependencies to use the software are:
* Python >= 3.5,
* setuptools
* Numpy >= 1.11
* SciPy >= 0.17
* Scikit-learn >= 0.18
* SciPy >= 0.19
* Scikit-learn >= 0.19
* Joblib >= 0.11
* Nibabel >= 2.0.2

If you are using nilearn plotting functionalities or running the
6 changes: 3 additions & 3 deletions appveyor.yml
@@ -24,10 +24,10 @@ install:
# See similar fix which made for travis and circleci
# https://github.com/nilearn/nilearn/pull/1525
# Should be removed after a new matplotlib release 2.1.1
- "conda install pip numpy scipy scikit-learn nose wheel matplotlib -y -q"
- "conda install pip numpy scipy scikit-learn joblib nose pytest wheel matplotlib -y -q"

# Install other nilearn dependencies
- "pip install nibabel coverage nose-timer"
- "pip install nibabel coverage nose-timer pytest-cov"
- "python setup.py bdist_wheel"
- ps: "ls dist"

@@ -41,7 +41,7 @@ test_script:
# Change to a non-source folder to make sure we run the tests on the
# installed library.
- "cd C:\\"
- "python -c \"import nose; nose.main()\" -v -s nilearn --with-timer --timer-top-n 50"
- "pytest --pyargs nilearn -v"

artifacts:
# Archive the generated packages in the ci.appveyor.com build report.
2 changes: 1 addition & 1 deletion azure-pipelines.yml
@@ -30,7 +30,7 @@ jobs:

- script: |
pip install .
nosetests ./nilearn -v
pytest ./nilearn -v
displayName: 'test'
- task: PublishTestResults@2
2 changes: 2 additions & 0 deletions codecov.yml
@@ -0,0 +1,2 @@
ignore:
- "*externals/.*" # ignore folders and all its contents
61 changes: 20 additions & 41 deletions continuous_integration/install.sh
@@ -22,21 +22,20 @@ create_new_venv() {
deactivate
virtualenv --system-site-packages testvenv
source testvenv/bin/activate
pip install nose
pip install nose pytest
}

print_conda_requirements() {
# Echo a conda requirement string for example
echo_requirements_string() {
# Echo a requirement string for example
# "pip nose python='2.7.3 scikit-learn=*". It has a hardcoded
# list of possible packages to install and looks at _VERSION
# environment variables to know whether to install a given package and
# if yes which version to install. For example:
# - for numpy, NUMPY_VERSION is used
# - for scikit-learn, SCIKIT_LEARN_VERSION is used
TO_INSTALL_ALWAYS="pip nose"
TO_INSTALL_ALWAYS="pip nose pytest"
REQUIREMENTS="$TO_INSTALL_ALWAYS"
TO_INSTALL_MAYBE="python numpy scipy matplotlib scikit-learn pandas \
flake8 lxml"
TO_INSTALL_MAYBE="numpy scipy matplotlib scikit-learn pandas flake8 lxml joblib"
for PACKAGE in $TO_INSTALL_MAYBE; do
# Capitalize package name and add _VERSION
PACKAGE_VERSION_VARNAME="${PACKAGE^^}_VERSION"
@@ -45,62 +44,42 @@ flake8 lxml"
# dereference $PACKAGE_VERSION_VARNAME to figure out the
# version to install
PACKAGE_VERSION="${!PACKAGE_VERSION_VARNAME}"
if [ -n "$PACKAGE_VERSION" ]; then
REQUIREMENTS="$REQUIREMENTS $PACKAGE=$PACKAGE_VERSION"
if [[ -n "$PACKAGE_VERSION" ]]; then
if [[ "$PACKAGE_VERSION" == "*" ]]; then
REQUIREMENTS="$REQUIREMENTS $PACKAGE"
else
REQUIREMENTS="$REQUIREMENTS $PACKAGE==$PACKAGE_VERSION"
fi
fi
done
echo $REQUIREMENTS
}

create_new_conda_env() {
# Skip Travis related code on circle ci.
if [ -z $CIRCLECI ]; then
# Deactivate the travis-provided virtual environment and setup a
# conda-based environment instead
deactivate
fi

# Use the miniconda installer for faster download / install of conda
# itself
wget http://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh \
-O ~/miniconda.sh
chmod +x ~/miniconda.sh && ~/miniconda.sh -b
export PATH=$HOME/miniconda3/bin:$PATH
echo $PATH
conda update --quiet --yes conda

# Configure the conda environment and put it in the path using the
# provided versions
REQUIREMENTS=$(print_conda_requirements)
echo "conda requirements string: $REQUIREMENTS"
conda create -n testenv --quiet --yes $REQUIREMENTS
source activate testenv
create_new_travisci_env() {
REQUIREMENTS=$(echo_requirements_string)
pip install ${REQUIREMENTS}
pip install pytest pytest-cov

if [[ "$INSTALL_MKL" == "true" ]]; then
# Make sure that MKL is used
conda install --quiet --yes mkl
elif [[ -z $CIRCLECI ]]; then
# Travis doesn't use MKL but circle ci does for speeding up examples
# generation in the html documentation.
# Make sure that MKL is not used
conda remove --yes --features mkl || echo "MKL not installed"
pip install mkl
fi
}

if [[ "$DISTRIB" == "neurodebian" ]]; then
create_new_venv
pip install nose-timer
bash <(wget -q -O- http://neuro.debian.net/_files/neurodebian-travis.sh)
sudo apt-get install -qq python-scipy python-nose python-nibabel python-sklearn
sudo apt-get install -qq python-scipy python-nose python-nibabel python-sklearn python-joblib

elif [[ "$DISTRIB" == "conda" ]]; then
create_new_conda_env
elif [[ "$DISTRIB" == "travisci" ]]; then
create_new_travisci_env
pip install nose-timer
# Note: nibabel is in setup.py install_requires so nibabel will
# always be installed eventually. Defining NIBABEL_VERSION is only
# useful if you happen to want a specific nibabel version rather
# than the latest available one.
if [ -n "$NIBABEL_VERSION" ]; then
if [[ -n "$NIBABEL_VERSION" ]]; then
pip install nibabel=="$NIBABEL_VERSION"
fi

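The echo_requirements_string helper above maps *_VERSION environment variables onto a pip requirement string: an unset variable skips the package, "*" requests any version, and anything else is pinned with ==. A small Python sketch of the same mapping, for illustration only (the CI itself runs the bash helper; the dash-to-underscore handling for scikit-learn is an assumption, since that part of the script is collapsed in this diff):

import os

ALWAYS_INSTALL = ['pip', 'nose', 'pytest']
MAYBE_INSTALL = ['numpy', 'scipy', 'matplotlib', 'scikit-learn', 'pandas',
                 'flake8', 'lxml', 'joblib']


def requirements_from_env(environ=os.environ):
    requirements = list(ALWAYS_INSTALL)
    for package in MAYBE_INSTALL:
        # NUMPY_VERSION, SCIKIT_LEARN_VERSION, ... control each package.
        version = environ.get(package.replace('-', '_').upper() + '_VERSION')
        if not version:
            continue  # variable unset: the package is not installed at all
        if version == '*':
            requirements.append(package)  # any available version
        else:
            requirements.append('{0}=={1}'.format(package, version))
    return ' '.join(requirements)


print(requirements_from_env({'NUMPY_VERSION': '1.11', 'SCIPY_VERSION': '*'}))
# -> "pip nose pytest numpy==1.11 scipy"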
2 changes: 1 addition & 1 deletion continuous_integration/show-python-packages-versions.py
@@ -1,6 +1,6 @@
import sys

DEPENDENCIES = ['numpy', 'scipy', 'sklearn', 'matplotlib', 'nibabel']
DEPENDENCIES = ['numpy', 'scipy', 'sklearn', 'joblib', 'matplotlib', 'nibabel']


def print_package_version(package_name, indent=' '):
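The hunk above shows only the DEPENDENCIES list and the print_package_version signature; the function body is collapsed. A rough sketch of how such a helper could work, assuming it simply imports each package and reports its __version__ (not necessarily the repository's actual implementation):

import importlib
import sys

DEPENDENCIES = ['numpy', 'scipy', 'sklearn', 'joblib', 'matplotlib', 'nibabel']


def print_package_version(package_name, indent='  '):
    # Import the package if it is available and report its version;
    # flag missing packages instead of letting the script crash.
    try:
        module = importlib.import_module(package_name)
        version = getattr(module, '__version__', 'unknown')
        print('{0}{1}: {2}'.format(indent, package_name, version))
    except ImportError:
        print('{0}{1}: not installed'.format(indent, package_name))


if __name__ == '__main__':
    print('python: {0}'.format(sys.version.replace('\n', ' ')))
    for package_name in DEPENDENCIES:
        print_package_version(package_name)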
1 change: 1 addition & 0 deletions continuous_integration/test_script.sh
@@ -11,6 +11,7 @@ if [[ "$SKIP_TESTS" != "true" ]]; then
# Copy setup.cfg to TEST_RUN_FOLDER where we are going to run the tests from
# Mainly for nose config settings
cp setup.cfg "$TEST_RUN_FOLDER"
cp .coveragerc "$TEST_RUN_FOLDER"
# We want to back out of the current working directory to make
# sure we are using nilearn installed in site-packages rather
# than the one from the current working directory
