Merge branch 'master' of github.com:nilearn/nilearn into immigrate-nistats

* 'master' of github.com:nilearn/nilearn:
  Expose bg_img, vmin and vmax in plot_img signature (nilearn#2157)
  Replace conda with pip in TravisCI setup (nilearn#2141)
  Core devs doc and add @emdupre (nilearn#2151)
  Add check for vmin, vmax in plot_surf_roi (nilearn#2052)
kchawla-pi committed Oct 8, 2019
2 parents 35396d5 + f1d268c commit b1a182a
Showing 9 changed files with 109 additions and 96 deletions.
41 changes: 24 additions & 17 deletions .travis.yml
@@ -2,47 +2,54 @@ sudo: required
dist: xenial

language: python
python: "3.5"

virtualenv:
system_site_packages: true

env:
global:
- TEST_RUN_FOLDER="/tmp" # folder where the tests are run from
- TEST_RUN_FOLDER="/tmp"

matrix:
# Do not wait for the allowed_failures entry to finish before
# setting the status
fast_finish: true
allow_failures:
# allow_failures seems to be keyed on the python version.
- python: 3.5
# allow_failures keyed to skipping tests.
- env: SKIP_TESTS="true"
include:
# without matplotlib
- env: DISTRIB="conda" PYTHON_VERSION="3.5"
- name: "Python 3.5 minimum package versions"
python: "3.5"
env: DISTRIB="travisci" PYTHON_VERSION="3.5"
NUMPY_VERSION="1.11" SCIPY_VERSION="0.19" PANDAS_VERSION="*"
SCIKIT_LEARN_VERSION="0.19" COVERAGE="true" JOBLIB_VERSION="*"
SCIKIT_LEARN_VERSION="0.19" COVERAGE="true" JOBLIB_VERSION="0.11"
LXML_VERSION="*"
- env: DISTRIB="conda" PYTHON_VERSION="3.5"
- name: "Python 3.5 latest package versions"
python: "3.5"
env: DISTRIB="travisci" PYTHON_VERSION="3.5"
NUMPY_VERSION="*" SCIPY_VERSION="*" PANDAS_VERSION="*"
SCIKIT_LEARN_VERSION="*" MATPLOTLIB_VERSION="*" COVERAGE="true"
JOBLIB_VERSION="0.11"
LXML_VERSION="*"
- env: DISTRIB="conda" PYTHON_VERSION="3.6"
- name: "Python 3.6 latest package versions"
python: "3.6"
env: DISTRIB="travisci" PYTHON_VERSION="3.6"
NUMPY_VERSION="*" SCIPY_VERSION="*" PANDAS_VERSION="*"
SCIKIT_LEARN_VERSION="*" MATPLOTLIB_VERSION="*" COVERAGE="true"
JOBLIB_VERSION="*"
LXML_VERSION="*"
- env: DISTRIB="conda" PYTHON_VERSION="3.7"
JOBLIB_VERSION="0.12" LXML_VERSION="*"
# joblib.Memory switches from keyword cachedir to location in version 0.12
# Making sure we get the deprecation warning.

- name: "Python 3.7 latest package versions"
python: "3.7"
env: DISTRIB="travisci" PYTHON_VERSION="3.7"
NUMPY_VERSION="*" SCIPY_VERSION="*" PANDAS_VERSION="*"
SCIKIT_LEARN_VERSION="*" MATPLOTLIB_VERSION="*" COVERAGE="true"
JOBLIB_VERSION="0.12" LXML_VERSION="*"
JOBLIB_VERSION="*" LXML_VERSION="*"

# FLAKE8 linting on diff wrt common ancestor with upstream/master
# Note: the python value is only there to trigger allow_failures
- python: 3.5
env: DISTRIB="conda" PYTHON_VERSION="3.5" FLAKE8_VERSION="*" SKIP_TESTS="true"
- name: Python 3.5 Flake8 no tests
python: 3.5
env: DISTRIB="travisci" PYTHON_VERSION="3.5" FLAKE8_VERSION="*" SKIP_TESTS="true"

install: source continuous_integration/install.sh

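The `JOBLIB_VERSION="0.12"` pin in the matrix above exists, per the inline comment, to surface joblib's rename of `Memory`'s `cachedir` keyword to `location`. A minimal stand-in sketch of that deprecation pattern — the `make_memory` function here is hypothetical, not joblib's actual class:

```python
import warnings

def make_memory(location=None, cachedir=None):
    """Illustrative stand-in for joblib.Memory's keyword rename.

    joblib 0.12 renamed ``cachedir`` to ``location``; the old keyword
    keeps working but emits a DeprecationWarning, which is what pinning
    the version in the CI matrix is meant to catch.
    """
    if cachedir is not None:
        warnings.warn("'cachedir' is deprecated; use 'location' instead",
                      DeprecationWarning)
        location = cachedir
    return {"location": location}

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    mem = make_memory(cachedir="/tmp/cache")

print(mem["location"])  # /tmp/cache
print(any(issubclass(w.category, DeprecationWarning) for w in caught))  # True
```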
71 changes: 37 additions & 34 deletions AUTHORS.rst
@@ -3,38 +3,36 @@
People
------

This work is made available by a community of people, amongst which
This work is made available by a community of people, which
originated from
the `INRIA Parietal Project Team <https://team.inria.fr/parietal/>`_
and the `scikit-learn <http://scikit-learn.org/>`_ folks, in
particular:

* Alexandre Abraham
* `Alexandre Gramfort <http://alexandre.gramfort.net>`_
* Vincent Michel
* Bertrand Thirion
* `Fabian Pedregosa <http://fa.bianp.net/>`_
* `Gael Varoquaux <http://gael-varoquaux.info/>`_
* Philippe Gervais
* Michael Eickenberg
* Danilo Bzdok
* Loïc Estève
* Kamalakar Reddy Daddy
* Elvis Dohmatob
* Alexandre Abadie
* Andres Hoyos Idrobo
* Salma Bougacha
* Mehdi Rahim
* Sylvain Lanuzel
* `Kshitij Chawla <https://github.com/kchawla-pi>`_

Many of them also contributed outside of Parietal, notably:

* `Chris Filo Gorgolewski <http://multiplecomparisons.blogspot.fr/>`_
* `Ben Cipollini <http://cseweb.ucsd.edu/~bcipolli/>`_
* Julia Huntenburg
* Martin Perez-Guevara

Thanks to M. Hanke and Y. Halchenko for data and packaging.
and the `scikit-learn <http://scikit-learn.org/>`_ community, but grew much further.

An up-to-date list of contributors can be seen on `GitHub
<https://github.com/nilearn/nilearn/graphs/contributors>`_.

Additional credit goes to M. Hanke and Y. Halchenko for data and packaging.

.. _core_devs:

Core developers
.................

The nilearn core developers are:

* Alexandre Gramfort https://github.com/agramfort
* Ben Cipollini https://github.com/bcipolli
* Bertrand Thirion https://github.com/bthirion
* Chris Gorgolewski https://github.com/chrisgorgo
* Danilo Bzdok https://github.com/banilo
* Elizabeth DuPre https://github.com/emdupre
* Gael Varoquaux https://github.com/GaelVaroquaux
* Jerome Dockes https://github.com/jeromedockes
* Julia Huntenburg https://github.com/juhuntenburg
* KamalakerDadi https://github.com/KamalakerDadi
* Kshitij Chawla https://github.com/kchawla-pi
* Mehdi Rahim https://github.com/mrahim
* Salma Bougacha https://github.com/salma1601

Funding
........
@@ -45,7 +43,8 @@ Mehdi Rahim, Philippe Gervais were paid by the `NiConnect
project, funded by the French `Investissement d'Avenir
<http://www.gouvernement.fr/investissements-d-avenir-cgi>`_.

NiLearn is also supported by `DigiCosme <https://digicosme.lri.fr>`_ |digicomse logo|
NiLearn is also supported by `DigiCosme <https://digicosme.lri.fr>`_
|digicosme logo| and `DataIA <https://dataia.eu/en>`_ |dataia_logo|.

.. _citing:

@@ -74,6 +73,10 @@ See the scikit-learn documentation on `how to cite
<http://scikit-learn.org/stable/about.html#citing-scikit-learn>`_.


.. |digicomse logo| image:: logos/digi-saclay-logo-small.png
.. |digicosme logo| image:: logos/digi-saclay-logo-small.png
:height: 25
:alt: DigiComse Logo

.. |dataia_logo| image:: logos/dataia.png
:height: 25
:alt: DigiComse Logo
:alt: DataIA Logo
55 changes: 17 additions & 38 deletions continuous_integration/install.sh
@@ -25,8 +25,8 @@ create_new_venv() {
pip install nose pytest
}

print_conda_requirements() {
# Echo a conda requirement string for example
echo_requirements_string() {
# Echo a requirement string for example
# "pip nose python='2.7.3' scikit-learn=*". It has a hardcoded
# list of possible packages to install and looks at _VERSION
# environment variables to know whether to install a given package and
@@ -35,8 +35,7 @@ print_conda_requirements() {
# - for scikit-learn, SCIKIT_LEARN_VERSION is used
TO_INSTALL_ALWAYS="pip nose pytest"
REQUIREMENTS="$TO_INSTALL_ALWAYS"
TO_INSTALL_MAYBE="python numpy scipy matplotlib scikit-learn pandas \
flake8 lxml joblib"
TO_INSTALL_MAYBE="numpy scipy matplotlib scikit-learn pandas flake8 lxml joblib"
for PACKAGE in $TO_INSTALL_MAYBE; do
# Capitalize package name and add _VERSION
PACKAGE_VERSION_VARNAME="${PACKAGE^^}_VERSION"
@@ -45,45 +44,25 @@ flake8 lxml joblib"
# dereference $PACKAGE_VERSION_VARNAME to figure out the
# version to install
PACKAGE_VERSION="${!PACKAGE_VERSION_VARNAME}"
if [ -n "$PACKAGE_VERSION" ]; then
REQUIREMENTS="$REQUIREMENTS $PACKAGE=$PACKAGE_VERSION"
if [[ -n "$PACKAGE_VERSION" ]]; then
if [[ "$PACKAGE_VERSION" == "*" ]]; then
REQUIREMENTS="$REQUIREMENTS $PACKAGE"
else
REQUIREMENTS="$REQUIREMENTS $PACKAGE==$PACKAGE_VERSION"
fi
fi
done
echo $REQUIREMENTS
}

create_new_conda_env() {
# Skip Travis related code on circle ci.
if [ -z $CIRCLECI ]; then
# Deactivate the travis-provided virtual environment and setup a
# conda-based environment instead
deactivate
fi

# Use the miniconda installer for faster download / install of conda
# itself
wget https://repo.continuum.io/miniconda/Miniconda3-4.6.14-Linux-x86_64.sh \
-O ~/miniconda.sh
chmod +x ~/miniconda.sh && ~/miniconda.sh -b
export PATH=$HOME/miniconda3/bin:$PATH
echo $PATH

# Configure the conda environment and put it in the path using the
# provided versions
REQUIREMENTS=$(print_conda_requirements)
echo "conda requirements string: $REQUIREMENTS"
conda create -n testenv --quiet --yes $REQUIREMENTS
source activate testenv
conda install pytest pytest-cov --yes
create_new_travisci_env() {
REQUIREMENTS=$(echo_requirements_string)
pip install ${REQUIREMENTS}
pip install pytest pytest-cov

if [[ "$INSTALL_MKL" == "true" ]]; then
# Make sure that MKL is used
conda install --quiet --yes mkl
elif [[ -z $CIRCLECI ]]; then
# Travis doesn't use MKL but circle ci does for speeding up examples
# generation in the html documentation.
# Make sure that MKL is not used
conda remove --yes --features mkl || echo "MKL not installed"
pip install mkl
fi
}

@@ -93,14 +72,14 @@ if [[ "$DISTRIB" == "neurodebian" ]]; then
bash <(wget -q -O- http://neuro.debian.net/_files/neurodebian-travis.sh)
sudo apt-get install -qq python-scipy python-nose python-nibabel python-sklearn python-joblib

elif [[ "$DISTRIB" == "conda" ]]; then
create_new_conda_env
elif [[ "$DISTRIB" == "travisci" ]]; then
create_new_travisci_env
pip install nose-timer
# Note: nibabel is in setup.py install_requires so nibabel will
# always be installed eventually. Defining NIBABEL_VERSION is only
# useful if you happen to want a specific nibabel version rather
# than the latest available one.
if [ -n "$NIBABEL_VERSION" ]; then
if [[ -n "$NIBABEL_VERSION" ]]; then
pip install nibabel=="$NIBABEL_VERSION"
fi

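The `echo_requirements_string` logic in the diff above maps each `<PACKAGE>_VERSION` environment variable to a pip requirement: unset skips the package, `"*"` requests the latest release, and anything else becomes an exact `==` pin. A sketch of the same mapping in Python — the helper name `requirements_from_env` is hypothetical:

```python
import os

def requirements_from_env(packages, environ=os.environ):
    """Build pip requirement strings from <PACKAGE>_VERSION variables."""
    reqs = []
    for package in packages:
        version = environ.get(package.upper() + "_VERSION", "")
        if not version:
            continue                       # variable unset: skip package
        if version == "*":
            reqs.append(package)           # no pin: latest release
        else:
            reqs.append("%s==%s" % (package, version))
    return reqs

env = {"NUMPY_VERSION": "1.11", "SCIPY_VERSION": "*", "LXML_VERSION": ""}
print(requirements_from_env(["numpy", "scipy", "lxml"], env))
# ['numpy==1.11', 'scipy']
```

This mirrors the `"*"` special case the shell version gained: with conda, `numpy=*` is a valid spec, but pip needs the bare package name instead.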
5 changes: 1 addition & 4 deletions doc/development.rst
@@ -122,12 +122,9 @@ Decisions are made public, through discussion on issues and pull requests
in Github.

The decisions are made by the core-contributors, i.e. people with write
access to the repository, as listed `here
<https://github.com/orgs/nilearn/people>`__
access to the repository, as listed :ref:`here <core_devs>`

If there are open questions, final decisions are made by the Temporary
Benevolent Dictator, currently Gaël Varoquaux.



.. include:: ../CONTRIBUTING.rst
Binary file added doc/logos/dataia.png
6 changes: 6 additions & 0 deletions doc/whats_new.rst
@@ -51,6 +51,11 @@ Changes
in `nilearn.input_data`. You can now set `standardize` to `zscore` or `psc`. `psc` stands
for `Percent Signal Change`, which can be a meaningful metric for BOLD.

- :func:`nilearn.plotting.plot_img` now has explicit keyword arguments `bg_img`,
`vmin` and `vmax` to control the background image and the bounds of the
colormap. These arguments were already accepted in `kwargs` but not documented
before.

Fixes
-----

@@ -67,6 +72,7 @@ Fixes
- :func:`nilearn.plotting.view_surf` now accepts surface data provided as a file
path.
- :func:`nilearn.plotting.plot_matrix` providing labels=None, False, or an empty list now correctly disables labels.
- :func:`nilearn.plotting.plot_surf_roi` now takes vmin, vmax parameters
- :func:`nilearn.datasets.fetch_surf_nki_enhanced` is now downloading the correct
left and right functional surface data for each subject
- :func:`nilearn.datasets.fetch_atlas_schaefer_2018` now downloads from release
14 changes: 12 additions & 2 deletions nilearn/plotting/img_plotting.py
@@ -232,7 +232,8 @@ def _crop_colorbar(cbar, cbar_vmin, cbar_vmax):
def plot_img(img, cut_coords=None, output_file=None, display_mode='ortho',
figure=None, axes=None, title=None, threshold=None,
annotate=True, draw_cross=True, black_bg=False, colorbar=False,
resampling_interpolation='continuous', **kwargs):
resampling_interpolation='continuous',
bg_img=None, vmin=None, vmax=None, **kwargs):
""" Plot cuts of a given image (by default Frontal, Axial, and Lateral)
Parameters
@@ -290,6 +291,14 @@ def plot_img(img, cut_coords=None, output_file=None, display_mode='ortho',
space. Can be "continuous" (default) to use 3rd-order spline
interpolation, or "nearest" to use nearest-neighbor mapping.
"nearest" is faster but can be noisier in some cases.
bg_img : Niimg-like object, optional
See http://nilearn.github.io/manipulating_images/input_output.html
The background image that the ROI/mask will be plotted on top of.
If nothing is specified, no background image is plotted.
vmin : float, optional
lower bound of the colormap. If `None`, the min of the image is used.
vmax : float, optional
upper bound of the colormap. If `None`, the max of the image is used.
kwargs: extra keyword arguments, optional
Extra keyword arguments passed to matplotlib.pyplot.imshow
""" # noqa: E501
@@ -300,7 +309,8 @@ def plot_img(img, cut_coords=None, output_file=None, display_mode='ortho',
threshold=threshold, annotate=annotate,
draw_cross=draw_cross,
resampling_interpolation=resampling_interpolation,
black_bg=black_bg, colorbar=colorbar, **kwargs)
black_bg=black_bg, colorbar=colorbar,
bg_img=bg_img, vmin=vmin, vmax=vmax, **kwargs)

return display

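The `plot_img` change above is mostly about discoverability: arguments swallowed by `**kwargs` are invisible to `inspect.signature`, `help()` and the documentation builder, while explicit keywords show up everywhere. A toy illustration — the `plot_img_old`/`plot_img_new` stubs are hypothetical, not nilearn code:

```python
import inspect

def plot_img_old(img, **kwargs):
    # bg_img/vmin/vmax were accepted here, but nothing advertised them
    return kwargs

def plot_img_new(img, bg_img=None, vmin=None, vmax=None, **kwargs):
    # promoted keywords are now part of the public signature
    return {"bg_img": bg_img, "vmin": vmin, "vmax": vmax, **kwargs}

old_params = set(inspect.signature(plot_img_old).parameters)
new_params = set(inspect.signature(plot_img_new).parameters)
print("vmin" in old_params)  # False
print("vmin" in new_params)  # True
```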
5 changes: 4 additions & 1 deletion nilearn/plotting/surf_plotting.py
@@ -514,7 +514,10 @@ def plot_surf_roi(surf_mesh, roi_map, bg_map=None,
# messages in case of wrong inputs

roi = load_surf_data(roi_map)
vmin, vmax = np.min(roi), 1 + np.max(roi)
if vmin is None:
vmin = np.min(roi)
if vmax is None:
vmax = 1 + np.max(roi)

mesh = load_surf_mesh(surf_mesh)

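The `plot_surf_roi` fix above follows a common pattern: treat `None` as "not provided" and only then fall back to the data range, instead of unconditionally overwriting caller-supplied bounds. A minimal sketch with a hypothetical `color_limits` helper:

```python
def color_limits(data, vmin=None, vmax=None):
    """Default colormap bounds to the data range unless the caller set them."""
    if vmin is None:
        vmin = min(data)
    if vmax is None:
        vmax = 1 + max(data)  # plot_surf_roi adds 1 to include the top label
    return vmin, vmax

roi = [0, 2, 5]
print(color_limits(roi))                      # (0, 6)
print(color_limits(roi, vmin=1.2, vmax=8.9))  # (1.2, 8.9)
```

This is exactly what the new test below checks by reading the colorbar's first and last tick labels.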
8 changes: 8 additions & 0 deletions nilearn/plotting/tests/test_surf_plotting.py
@@ -162,6 +162,14 @@ def test_plot_surf_roi():
# plot roi
plot_surf_roi(mesh, roi_map=roi_map)
plot_surf_roi(mesh, roi_map=roi_map, colorbar=True)
# change vmin, vmax
img = plot_surf_roi(mesh, roi_map=roi_map,
vmin=1.2, vmax=8.9, colorbar=True)
cbar = img.axes[-1]
cbar_vmin = float(cbar.get_yticklabels()[0].get_text())
cbar_vmax = float(cbar.get_yticklabels()[-1].get_text())
assert cbar_vmin == 1.2
assert cbar_vmax == 8.9

# plot parcellation
plot_surf_roi(mesh, roi_map=parcellation)