
nilearn's Introduction

[Badges: PyPI package, supported Python versions, GitHub Actions build status, coverage status, Twitter, Mastodon, Discord]
nilearn

Nilearn enables approachable and versatile analyses of brain volumes. It provides statistical and machine-learning tools, with instructive documentation and a friendly community.

It supports general linear model (GLM) based analysis and leverages the scikit-learn Python toolbox for multivariate statistics with applications such as predictive modelling, classification, decoding, or connectivity analysis.


Install

Latest release

1. Set up a virtual environment

We recommend that you install nilearn in a virtual Python environment, either managed with the standard library venv or with conda (see miniconda, for instance). Either way, create and activate a new Python environment.

With venv:

python3 -m venv /<path_to_new_env>
source /<path_to_new_env>/bin/activate

Windows users should change the last line to \<path_to_new_env>\Scripts\activate.bat in order to activate their virtual environment.

With conda:

conda create -n nilearn python=3.9
conda activate nilearn

2. Install nilearn with pip

Execute the following command in the command prompt / terminal, in the proper Python environment:

python -m pip install -U nilearn

Development version

Please find all development setup instructions in the contribution guide.

Check installation

Try importing nilearn in a Python / IPython session:

import nilearn

If no error is raised, you have installed nilearn correctly.
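
You can also print the installed version to confirm which release you have:

python -c "import nilearn; print(nilearn.__version__)"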

Drop-in Hours

The Nilearn team organizes regular online drop-in hours to answer questions, discuss feature requests, or have any Nilearn-related discussions. Nilearn drop-in hours occur every Wednesday from 4pm to 5pm UTC, and we make sure that at least one member of the core-developer team is available. These events are held on Jitsi Meet and are fully open, anyone is welcome to join! For more information and ways to engage with the Nilearn team see How to get help.

Dependencies

The required dependencies to use the software are listed in the file pyproject.toml.

If you are using nilearn plotting functionalities or running the examples, matplotlib >= 3.3.0 is required.

Some plotting functions in Nilearn support both matplotlib and plotly as plotting engines. In order to use the plotly engine in these functions, you will need to install both plotly and kaleido, which can be installed with pip or conda.
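
For example, in the active environment:

python -m pip install plotly kaleido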

If you want to run the tests, you need pytest >= 6.0.0 and pytest-cov for coverage reporting.

Development

Detailed instructions on how to contribute are available at https://nilearn.github.io/stable/development.html

nilearn's People

Contributors

aabadie, ahoyosid, alexandreabraham, banilo, bthirion, chrisgorgo, dohmatob, eickenberg, emdupre, fliem, gaelvaroquaux, github-actions[bot], jaquesgrobler, jeankossaifi, jeromedockes, juhuntenburg, kamalakerdadi, kchawla-pi, lesteve, martinperez, nicolasgensollen, pbellec, pgervais, remi-gau, salma1601, sylvainlan, titan-c, tsalo, virgilefritsch, ymzayek


nilearn's Issues

Code layout suggestion

Here are a few suggestions to reorganize the code layout a bit. I am opening this issue for discussion.

  • The utils module should be split in:
    • A _utils (or utils) directory containing stuff only for our internal use: utils/conversions for _asarray and utils/cache_mixin for the CacheMixin, and probably the testing
    • A niimg_utils file containing all the concat_niims and check_niimgs related code
    • largest_connected_component: where does this go? I see it used only in masking, so I am wondering whether it should not be there
  • Consolidate the image part and the resampling part. We need an 'image' directory, in which we put image and resampling.
  • Rename io in input_data
  • Similarly, I feel that the NiftiLabelsMasker and NiftiMapsMasker are a bit 'far' apart. It seems to me that they belong to the same kind of code and should sit close to each other.

Get Miyawaki example back

As the Kamitani dataset's data format has changed, the Kamitani example cannot run at the moment. We should bring it back, either by parsing the new data format or by hosting data formatted for nisl somewhere.

Delete NISL website

I have put a <link rel="canonical" href="..."> tag in the header of nisl pages. Once we see in Google requests that results point to the nilearn website, I suggest that we delete the nisl website completely and replace it with a simple redirection.

Dataset downloaders don't fail gracefully

An example (with a screwed up proxy setting):

plotting plot_visualization.py
Downloading data from http://www.pymvpa.org/files/pymvpa_exampledata.tar.bz2 ...
HTTP Error: HTTP Error 407: Proxy Authentication Required http://www.pymvpa.org/files/pymvpa_exampledata.tar.bz2
An error occured, abort fetching
extracting data from None...
archive corrupted, trying to download it again
Downloading data from http://www.pymvpa.org/files/pymvpa_exampledata.tar.bz2 ...
HTTP Error: HTTP Error 407: Proxy Authentication Required http://www.pymvpa.org/files/pymvpa_exampledata.tar.bz2
extracting data from None...
________________________________________________________________________________
plot_visualization.py is not compiling:
Traceback (most recent call last):
  File "/home/varoquau/dev/nisl/tutorial/doc/sphinxext/gen_rst.py", line 308, in generate_file_rst
    execfile(os.path.basename(src_file), my_globals)
  File "plot_visualization.py", line 10, in 
    haxby_files = datasets.fetch_haxby_simple()
  File "/home/varoquau/dev/nisl/tutorial/nisl/datasets.py", line 480, in fetch_haxby_simple
    resume=resume)
  File "/home/varoquau/dev/nisl/tutorial/nisl/datasets.py", line 376, in _fetch_dataset
    _uncompress_file(full_name)
  File "/home/varoquau/dev/nisl/tutorial/nisl/datasets.py", line 201, in _uncompress_file
    data_dir = os.path.dirname(file)
  File "/usr/lib/python2.7/posixpath.py", line 120, in dirname
    i = p.rfind('/') + 1
AttributeError: 'NoneType' object has no attribute 'rfind'
________________________________________________________________________________

The traceback is not the right one

plot_rest_clustering fails

In [4]: run plot_rest_clustering.py
Downloading data from http://www.nitrc.org/frs/download.php/1071/NYU_TRT_session1a.tar.gz ...
Downloaded 697388519 of 697388519 bytes (100.00%,    0.0s remaining)  
...done. (423 seconds, 7 min)
extracting data from /Users/alex/work/src/nisl/tutorial/nisl_data/nyu_rest/NYU_TRT_session1a.tar.gz...
   ...done.
---------------------------------------------------------------------------
IOError                                   Traceback (most recent call last)
/Library/Frameworks/Python.framework/Versions/7.2/lib/python2.7/site-packages/IPython/utils/py3compat.pyc in execfile(fname, *where)
    176             else:
    177                 filename = fname
--> 178             __builtin__.execfile(filename, *where)

/Users/alex/work/src/nisl/tutorial/plot_rest_clustering.py in <module>()
     19 import numpy as np
     20 from nisl import datasets, io
---> 21 dataset = datasets.fetch_nyu_rest(n_subjects=1)
     22 nifti_masker = io.NiftiMasker()
     23 fmri_masked = nifti_masker.fit_transform(dataset.func[0])

/Users/alex/work/src/nisl/tutorial/nisl/datasets.pyc in fetch_nyu_rest(n_subjects, sessions, data_dir, verbose)
    741                 _fetch_dataset('nyu_rest', [url], data_dir=data_dir,
    742                                folder=session_path, verbose=verbose)
--> 743                 files = _get_dataset("nyu_rest", paths, data_dir=data_dir)
    744             for i in range(len(subjects)):
    745                 # We are considering files 3 by 3

/Users/alex/work/src/nisl/tutorial/nisl/datasets.pyc in _get_dataset(dataset_name, file_names, data_dir, folder)
    416         full_name = os.path.join(data_dir, file_name)
    417         if not os.path.exists(full_name):
--> 418             raise IOError("No such file: '%s'" % full_name)
    419         file_paths.append(full_name)
    420     return file_paths

IOError: No such file: '/Users/alex/work/src/nisl/tutorial/nisl_data/nyu_rest/session1/sub05676/anat/mprage_anonymized.nii.gz'

Missing brackets

An error occurs in the downloader when trying to fetch a dataset: the join function used to build the URLs takes its parameters as a single list, whereas several separate arguments are given in the code.
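
For illustration, the general pitfall with hypothetical names (not the exact downloader code):

base_url, dataset_name, file_name = "http://example.org", "haxby", "bold.nii.gz"
url = "/".join([base_url, dataset_name, file_name])    # OK: one list argument
# url = "/".join(base_url, dataset_name, file_name)    # TypeError: join() takes exactly one argument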

Website is not up to date

The website has not been rebuilt since the transition from nisl to nilearn, and many references to nisl are still there. I'll rebuild the website ASAP.

building nilearn failed

Dear all,

My attempts to install nilearn have failed so far; I cannot seem to find what the problem is.

My system has Ubuntu Linux with python 2.7.3
(attached: install_log.txt.gz)

(python3 and python3.2 also present but not default)

and
numpy v1.6.1
scipy v0.9.0
sklearn v0.14.1
matplotlib v1.1.1rc

The steps I have taken are:

  1. git clone https://github.com/nilearn/nilearn.git python/nilearn
  2. git remote add nilearn https://github.com/nilearn/nilearn.git
  3. cd python/nilearn; make install
  4. cd ../../; git pull nilearn master # today
  5. cd python/nilearn; make install

But the messages I get (see attached) suggest that many library imports fail.
Does this mean that there still are unmet dependencies?

Many thanks for your help!

With best wishes
Alle Meije Wink

move create_simulation_data to nisl

It would be very useful for me if the function create_simulation_data could be moved from the example plot_simulated_data.py into nisl/datasets so that it can be imported from external projects.

I am willing to submit a patch if people are OK with this. Comments?

Introduce a memory_level concept

I am not sure that I like the idea of having different 'memory' arguments (as in the maskers). I'd prefer having a 'memory_level' and always using the same memory object.
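
For context, a minimal sketch of the user-facing side with a shared memory object, using the import path of recent nilearn releases (the masker module was named io at the time of this issue):

from joblib import Memory
from nilearn.maskers import NiftiMasker

mem = Memory("nilearn_cache")                      # one shared memory object
masker = NiftiMasker(memory=mem, memory_level=1)   # higher levels cache more calls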

Expose a PDF and a ZIP file download

Copy from scipy-lectures the strategy to expose PDF (with double page layout) and ZIP file download on the front page.

Some work may be required to get proper rendering of the PDF.

Tuple (.npy + mask niimage) as niimage

I dump a lot of niimages in my nilearn pipeline. Unfortunately, this takes a lot of disk space in NIfTI format because the data is not masked, and I don't want to gzip it because that is time consuming. The best way to save it is in .npy form, along with the corresponding mask (ideally, a path to the mask). This way, check_niimg could unmask my images on the fly.

I suggest adding a tuple (npy filepath / numpy array, mask niimage) as a niimage, for compression purposes.
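
A minimal sketch of the round trip this would standardize, using nilearn's masking helpers (apply_mask and unmask exist in nilearn.masking; the tuple support itself does not exist yet, and func_img / mask_img are assumed given):

import numpy as np
from nilearn.masking import apply_mask, unmask

data = apply_mask(func_img, mask_img)   # (n_timepoints, n_in_mask_voxels), no empty voxels
np.save("func_masked.npy", data)
img = unmask(np.load("func_masked.npy"), mask_img)  # rebuild a niimage on the fly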

Confusing documentation layout

As Nisl has no graphical identity for the moment, we kept the scikit-learn layout for the documentation. Changing it is planned but, to avoid confusion, I suggest that we at least change the logo so that users can distinguish Nisl from scikit-learn.

Redirect NISL

We need to add a <link rel="canonical"> tag to the <head> of the NISL website.

Fix cloning of MultiNiftiMasker in MultiPCA

In MultiPCA, if the user provides a masker, we clone it. Unfortunately, in joblib 0.6, a bug prevents cloning a Memory object. This issue can be addressed by testing the joblib version (patch proposed by Philippe):

        if isinstance(self.mask, NiftiMultiMasker) \
           and sklearn.externals.joblib.__version__.startswith("0.6"):
            # Dirty workaround for a joblib bug
            # Memory with cachedir=None cannot be cloned in version 0.6
            # of joblib.
            masker_memory = self.mask.memory
            if masker_memory.cachedir is None:
                self.mask.memory = None
                self.masker_ = clone(self.mask)
                self.mask.memory = masker_memory
                self.masker_.memory = Memory(cachedir=None)
            else:
                self.masker_ = clone(self.mask)
            del masker_memory
        else:
            self.masker_ = clone(self.mask)

but, unfortunately, this breaks the Travis build:

======================================================================
ERROR: nisl.decomposition.tests.test_multi_pca.test_multi_pca
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib/python2.7/dist-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
File "/home/travis/build/nilearn/nilearn/nisl/decomposition/tests/test_multi_pca.py", line 44, in test_multi_pca
    multi_pca.fit(data[:2])
File "/home/travis/build/nilearn/nilearn/nisl/decomposition/multi_pca.py", line 211, in fit
    and sklearn.externals.joblib.__version__.startswith("0.6"):
AttributeError: 'module' object has no attribute '__version__'

I have made a dirty fix for the moment but we should fix this ASAP.

compute_multi_epi_mask does not resample data

If one wants to compute a mask over several subjects with different affines, the NiftiMultiMasker cannot do it.

There are two solutions to this problem:

  • NiftiMultiMasker.fit() could resample the data. This would be computationally expensive, but there is a high probability that these data will be passed to transform afterwards.
  • compute_multi_epi_mask could compute the mean EPIs, then resample them (which is not costly) and then compute the group mask.

I think the second solution is the best, even if it implies adding options to compute_multi_epi_mask.

What do you think?
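
A rough sketch of the second solution, written with current nilearn helpers (mean_img, resample_to_img, compute_epi_mask and intersect_masks exist today; epi_imgs and target_img are assumed given):

from nilearn.image import mean_img, resample_to_img
from nilearn.masking import compute_epi_mask, intersect_masks

# Mean EPI per subject, resampled to a common target (cheap on mean images)
means = [resample_to_img(mean_img(img), target_img) for img in epi_imgs]
masks = [compute_epi_mask(m) for m in means]
group_mask = intersect_masks(masks, threshold=0.5)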

n_jobs in NiftiMultiMasker

The transform of NiftiMultiMasker could really use parallelism, as it is CPU bound. The object should take an 'n_jobs' init parameter.

This should be done only after the merge is over.

The same applies to the multiple-session compute-mask function.
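
A minimal sketch of the intended parallelism with joblib (illustrative; imgs, mask_img and n_jobs are assumed given):

from joblib import Parallel, delayed
from nilearn.masking import apply_mask

# Mask each image in a separate worker instead of a serial loop
masked = Parallel(n_jobs=n_jobs)(
    delayed(apply_mask)(img, mask_img) for img in imgs)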

ENH: Use NeuroDebian datasets

NeuroDebian provides datasets that can be installed as packages (haxby2001-faceobject, for example).

By default, Nisl should be able to browse the folder where these data are stored (/usr/share/data on my computer) and load them if they are present.
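
A hedged sketch of such a lookup (the /usr/share/data path and the package layout vary across systems):

import os

def find_local_dataset(name, search_dirs=("/usr/share/data",)):
    # Return a locally installed dataset directory, or None to fall back
    # to the regular download path.
    for base in search_dirs:
        candidate = os.path.join(base, name)
        if os.path.isdir(candidate):
            return candidate
    return None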

Caching new API

I suggest the following new API for the recursive caching patterns:

  • A wrapper for joblib.Memory (NiLearnMemory):
    • attributes: memory_level (integer), mem (joblib.Memory)
    • methods:
      • make_child, which returns an object of the same class with the same mem and a smaller memory_level (the decrement would be an optional argument of the method)
      • cache, which has the same signature as Memory.cache but with an additional optional level parameter. That parameter is compared to the memory_level of the object to decide whether to cache or not.
  • A constructor check_memory that can take a string or a joblib.Memory or a NiLearnMemory object (with optional memory_level) and always returns a NiLearnMemory object.

The combination of check_memory and 'mem.make_child' should cover all our needs in NiLearn and enable us to remove the 'memory_level' parameters and 'cache' function calls in our code.
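
A skeletal sketch of this proposal (illustrative only, not existing code):

from joblib import Memory

class NiLearnMemory(object):
    # Proposed wrapper around joblib.Memory
    def __init__(self, mem, memory_level=1):
        self.mem = mem if isinstance(mem, Memory) else Memory(mem)
        self.memory_level = memory_level

    def make_child(self, decrement=1):
        # Same mem, smaller memory_level
        return NiLearnMemory(self.mem, self.memory_level - decrement)

    def cache(self, func, level=1):
        # Cache only if this object's memory_level reaches the requested level
        if self.memory_level >= level:
            return self.mem.cache(func)
        return func

def check_memory(memory, memory_level=1):
    # Accept a string, a joblib.Memory or a NiLearnMemory; always return the latter
    if isinstance(memory, NiLearnMemory):
        return memory
    return NiLearnMemory(memory, memory_level)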

Create glossary in documentation

Nilearn contains a lot of concepts that need to be explained (e.g. niimg(s)). Adding a glossary to the documentation would be a good idea.

This is also a means to improve the naming conventions in the code.

Change the buttons on the top

The buttons on the top should be:

  • Data manipulation
  • Supervised learning
  • Unsupervised learning
  • Examples

pointing to the respective parts of the tutorial. @jaquesgrobler, I think that you know this aspect of the doc generation reasonably well. Can you take care of this?

Haxby dataset as fetched doesn't have the same fields as tutorial says, sec 2.1 doesn't work

I just tried to run the first cell of Sec 2.1, but I'm getting this error:

In [8]:

from nisl import datasets
haxby_data = datasets.fetch_haxby()
# The data is then already loaded as numpy arrays:
haxby_data.keys() 
haxby_data.data.shape
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-8-6127e8737e4d> in <module>()
      3 # The data is then already loaded as numpy arrays:
      4 haxby_data.keys()
----> 5 haxby_data.data.shape

AttributeError: 'Bunch' object has no attribute 'data'

That's because in fact, the haxby_data object has different fields from what the docs say:

In [9]:

haxby_data
Out[9]:
{'conditions_target': '/home/fperez/talks/slides/1210_pycon_canada/nisl_data/haxby2001/pymvpa-exampledata/attributes_literal.txt',
 'func': '/home/fperez/talks/slides/1210_pycon_canada/nisl_data/haxby2001/pymvpa-exampledata/bold.nii.gz',
 'mask': '/home/fperez/talks/slides/1210_pycon_canada/nisl_data/haxby2001/pymvpa-exampledata/mask.nii.gz',
 'session_target': '/home/fperez/talks/slides/1210_pycon_canada/nisl_data/haxby2001/pymvpa-exampledata/attributes.txt'}

Unfortunately I need to run out to catch a flight and can't debug this one now, but I figured at least I'd report it... This is running on a fresh install of nisl from just a moment ago.

directories io and decomposition are not installed

because they do not appear in setup.py. A fix could be something like:

diff --git a/setup.py b/setup.py
index 621c95c..17bccf4 100644
--- a/setup.py
+++ b/setup.py
@@ -30,6 +30,8 @@ def configuration(parent_package='', top_path=None):
     config = Configuration(None, parent_package, top_path)

     config.add_subpackage('nisl')
+    config.add_subpackage('nisl/io')
+    config.add_subpackage('nisl/decomposition')

     return config

test_canica_square_img fails randomly

I took a look at the test: the ICA estimate of the ICs is really poor; it fails to recover the components and actually returns linear combinations of them (meaning that the failure is related to the ICA part). The initial test is quite robust to this effect, but not enough, at least on my box.
Using

rng = np.random.RandomState(1)

in the test solves the issue on my box, but I don't find that satisfactory.
Possible solutions are:

  • test that the right subspace is recovered (instead of the right components)
  • Make the fit more robust by breaking the symmetry of the components (this seems to work on my box).
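
A sketch of the first option, comparing orthogonal projectors onto the estimated and true component subspaces (the tolerance is arbitrary):

import numpy as np

def same_subspace(estimated, true, tol=1e-2):
    # Compare the spans of two (n_components, n_features) arrays via their
    # orthogonal projectors; invariant to linear mixing of the components.
    def projector(a):
        q, _ = np.linalg.qr(a.T)   # orthonormal basis of the span
        return q.dot(q.T)
    return np.linalg.norm(projector(estimated) - projector(true)) < tol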

People mistake our site for scikit-learn's

As our template is based on the scikit-learn one (we even kept the logo), a user asked a question about nilearn on the scikit-learn mailing list... I answered him in private and apologized, but this should not happen again.

I have quickly changed the nilearn logo and switched some colors in the CSS to break the likeness with the scikit-learn coloring scheme (as an emergency action). I think that a brand-new website is under construction. What should we do in the meantime?

Numpy 1.4 required

In plot_haxby_decoding.py, numpy.in1d is called. This function was added in numpy 1.4.0, but only numpy >= 1.3.0 is required by scikit-learn. Should we require numpy 1.4.0?

Related numpy man page: http://docs.scipy.org/doc/numpy/reference/generated/numpy.in1d.html

Traceback (most recent call last):
  File "plot_haxby_decoding.py", line 44, in <module>
    condition_mask = np.in1d(conditions, ('face', 'house'))
AttributeError: 'module' object has no attribute 'in1d'
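
If requiring numpy 1.4.0 is undesirable, a small fallback is easy to write (a sketch, not existing code):

import numpy as np

try:
    from numpy import in1d
except ImportError:  # numpy < 1.4
    def in1d(ar1, ar2):
        # Boolean array: True where an element of ar1 is also in ar2
        values = set(ar2)
        return np.array([x in values for x in np.asarray(ar1).ravel()])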

plot_simulated_data.py takes one hour to finish

3822.42 seconds on my machine, mostly because of Searchlight.

In any case, this makes it really painful to build the doc, and to the user it looks as if the example were broken (there is no output whatsoever for one hour).

This is blocking for the nightly documentation builds.

Validate the behavior of MultiPCA if a MultiNiftiMasker is given

Formerly, if a MultiNiftiMasker was given to MultiPCA along with masker arguments (smoothing_fwhm...), the parameters of the MultiNiftiMasker were overridden by the ones given to MultiPCA.

After discussion with Philippe, we find it much more consistent to ignore masker parameters given to MultiPCA in this case (if the user gives us a masker, we assume that he knows what he's doing).

This is a thread to discuss and validate this new behavior.

In any case, the code must be cleaned up as, for the moment, the user is warned that memory and memory_level are ignored, which should not be the case.

Sources don't compile (missing conf.py)

(p26)fabian@:nisl-tutorial(master)$ make html
sphinx-build -b html -d ../webpage/doctrees . ../webpage/html
Error: Source directory doesn't contain conf.py file.
make: *** [html] Error 1

Nifti1Image breaks joblib caching system

When a Nifti1Image is passed to a function, its representation contains a pointer to the data in memory and therefore breaks the joblib cache system.

This is not really a joblib issue, nor a nibabel one, which is why I am opening it here.
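
A minimal sketch that reproduces the symptom (illustrative; whether the second call hits the cache depends on the joblib and nibabel versions):

import numpy as np
import nibabel
from joblib import Memory

mem = Memory("joblib_cache")

@mem.cache
def voxel_sum(img):
    return img.get_fdata().sum()  # get_fdata in recent nibabel

img = nibabel.Nifti1Image(np.zeros((2, 2, 2)), np.eye(4))
voxel_sum(img)
voxel_sum(img)  # may miss the cache: the key derived from img is not stable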

NiftiMasker should expose a 'mask_strategy' option

The current heuristic for computing a mask works well only if the data is raw EPI. If it is an activation map, we want a different heuristic, such as 'threshold', where the mask is given by the values that are above mask_lower_cutoff.

I think that we should introduce yet another option, 'mask_strategy', that could be 'epi' for the current behavior, or 'threshold', and maybe other options later.
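
A sketch of the proposed usage (the 'threshold' value is the proposal here, not an existing option; the import path is the one of recent releases):

from nilearn.maskers import NiftiMasker

masker = NiftiMasker(mask_strategy='epi')        # current heuristic, for raw EPI
masker = NiftiMasker(mask_strategy='threshold')  # proposed: keep values above mask_lower_cutoff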

Haxby cropped : propagate modifications

In plot_haxby_decoding, some assertions about the Haxby dataset shape are given in comments:

# fmri_data.shape is (40, 64, 64, 1452)
# and mask.shape is (40, 64, 64)

These assertions are now untrue because the Haxby dataset has been cropped.

In [16]: fmri_data.shape
Out[16]: (40, 49, 41, 1452)

We should propagate these modifications and turn them into a doctest or a real assertion to avoid further problems.
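
A real assertion along these lines would catch the next change (a sketch; it assumes the mask was cropped the same way as the functional data):

assert fmri_data.shape == (40, 49, 41, 1452), fmri_data.shape
assert mask.shape == (40, 49, 41), mask.shape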
