nipy / nipype
Workflows and interfaces for neuroimaging packages
Home Page: https://nipype.readthedocs.org/en/latest/
License: Other
When I want to access the bias_field and restored_image outputs of a FAST node, I get nothing.
It looks like it searches for restore%i and bias%i files, but there are no numbered outputs; FAST produces _restore and _bias files instead. I did not test this with multi-channel input (e.g. T1 and T2), but this is the case for a single T1 input.
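If the output search really globs for numbered restore%i/bias%i names, a fix would special-case the single-channel layout. A minimal sketch of the naming rule (the helper name and the multi-channel suffix pattern are assumptions, not FAST's documented behaviour):

```python
def fast_output_names(out_basename, nchannels=1, ext='.nii.gz'):
    """Predict the restored/bias filenames FAST writes.

    Hypothetical helper: for a single input channel FAST emits
    <base>_restore and <base>_bias (no numbering); the numbered
    variants below are assumed only for multi-channel input.
    """
    if nchannels == 1:
        return ([out_basename + '_restore' + ext],
                [out_basename + '_bias' + ext])
    restored = ['%s_restore_%d%s' % (out_basename, i + 1, ext)
                for i in range(nchannels)]
    biases = ['%s_bias_%d%s' % (out_basename, i + 1, ext)
              for i in range(nchannels)]
    return restored, biases
```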
After pipeline execution, or even while it is still running, the user would like to be able to take a quick glance at the outputs of the intermediate stages. An online reporting system could provide such information. A simple solution would be to generate an auto-updating HTML page (similar to the FSL approach).
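As a sketch of such a report, a page with a meta refresh tag can be regenerated after each node finishes; the function name and layout here are illustrative only:

```python
def render_status_page(node_states, refresh_secs=5):
    """Render a self-refreshing HTML status table (FSL-report style).

    node_states: dict mapping node name -> status string. A minimal
    sketch; a real report would also link to each node's outputs.
    """
    rows = '\n'.join('<tr><td>%s</td><td>%s</td></tr>' % (name, state)
                     for name, state in sorted(node_states.items()))
    return ('<html><head><meta http-equiv="refresh" content="%d">'
            '</head><body><table>\n%s\n</table></body></html>'
            % (refresh_secs, rows))

# The workflow engine would rewrite the file after every node finishes:
# open('report.html', 'w').write(render_status_page(states))
```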
Doing make -C doc pdf results in
...
pdfTeX warning: pdflatex (file ./inheritance-a6ea610817660597b4a97d1eef82a61a6149018b.pdf): PDF inclusion: Page Group detected which pdfTeX can't handle. Ignoring it.
] [71] [72] [73]
Underfull \hbox (badness 10000) in paragraph at lines 5250--5252
[]\T1/ptm/m/n/10 key, value pairs that will up-date the Es-ti-mate-Model.inputs
at-tributes see
[74] [75]
! LaTeX Error: Too deeply nested.
See the LaTeX manual or LaTeX Companion for explanation.
Type H for immediate help.
...
l.5520 \begin{description}
Attaching that .tex file (if I targeted it correctly).
I have python-sphinx 0.6.3-1, nipype rev 647
Change afni class names to CamelCase to match Python Style Guide
os.path.relpath appeared in Python 2.6 and is used, e.g., in nipype/pipeline/utils.py.
Pipelines therefore fail on my 2.5 install.
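One way to keep 2.5 compatibility is a guarded fallback that only kicks in when os.path.relpath is missing. A minimal POSIX-oriented sketch (not the stdlib's full implementation):

```python
import os

try:
    relpath = os.path.relpath          # Python >= 2.6
except AttributeError:
    def relpath(path, start=os.curdir):
        """Minimal os.path.relpath fallback for Python 2.5."""
        if not path:
            raise ValueError('no path specified')
        start_list = os.path.abspath(start).split(os.sep)
        path_list = os.path.abspath(path).split(os.sep)
        # length of the common prefix, compared component-wise
        i = len(os.path.commonprefix([start_list, path_list]))
        rel_list = [os.pardir] * (len(start_list) - i) + path_list[i:]
        if not rel_list:
            return os.curdir
        return os.path.join(*rel_list)
```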
If possible, can we query the underlying packages (SPM, FSL, Freesurfer...) for default values on our inputs? This may not be possible.
So, trying to run example1.py: I didn't set up my MATLABPATH to point to SPM, so it is not available. Running the example simply stalls after spitting out a small error that is barely detectable among the other output lines:
$> python example1.py
PE: checking connections:
PE: finished checking connections
No clients found. running serially
Executing: Realign.spm H: c49a2f44ee83424c23f90790042e55b0
directory /home/yoh/proj/nipy/nipype.gitsvn/doc/examples/test3/Realign.spm exists
continuing to execute
copying /home/yoh/proj/nipy/nipype.gitsvn/doc/examples/data/s1/f3.nii to /home/yoh/proj/nipy/nipype.gitsvn/doc/examples/test3/Realign.spm/f3.nii
option cwd not supported
doing quick look:
$> strace -p 27588
Process 27588 attached - interrupt to quit
select(6, [3 5], [], [], NULL
C <unfinished ...>
Process 27588 detached
$> ls -l /proc/27588/fd/6
lr-x------ 1 yoh yoh 64 2009-10-07 09:43 /proc/27588/fd/6 -> pipe:[2800813]
So it waits for input which would never come? ;)
Here is a traceback I got after hitting Ctrl-C
Traceback (most recent call last):
File "example1.py", line 37, in <module>
pipeline.run()
File "/home/yoh/proj/nipy/nipype.gitsvn/install/lib/python2.5/site-packages/nipype/pipeline/engine.py", line 221, in run
self.run_in_series()
File "/home/yoh/proj/nipy/nipype.gitsvn/install/lib/python2.5/site-packages/nipype/pipeline/engine.py", line 252, in run_in_series
node.run()
File "/home/yoh/proj/nipy/nipype.gitsvn/install/lib/python2.5/site-packages/nipype/pipeline/node_wrapper.py", line 135, in run
self._run_interface(execute=True)
File "/home/yoh/proj/nipy/nipype.gitsvn/install/lib/python2.5/site-packages/nipype/pipeline/node_wrapper.py", line 214, in _run_interface
self._result = self._interface.run()
File "/home/yoh/proj/nipy/nipype.gitsvn/install/lib/python2.5/site-packages/nipype/interfaces/spm.py", line 125, in run
results = super(SpmMatlabCommandLine,self).run()
File "/home/yoh/proj/nipy/nipype.gitsvn/install/lib/python2.5/site-packages/nipype/interfaces/matlab.py", line 66, in run
results = self._runner()
File "/home/yoh/proj/nipy/nipype.gitsvn/install/lib/python2.5/site-packages/nipype/interfaces/base.py", line 273, in _runner
runtime.stdout, runtime.stderr = proc.communicate()
File "/usr/lib/python2.5/subprocess.py", line 670, in communicate
return self._communicate(input)
File "/usr/lib/python2.5/subprocess.py", line 1213, in _communicate
rlist, wlist, xlist = select.select(read_set, write_set, [])
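The select() above is blocked because the child process (MATLAB sitting at its prompt) is waiting on an open stdin that nipype never writes to. A sketch of a runner that hands the child an immediately-closed stdin so it exits instead of waiting (an assumed fix, not nipype's actual _runner):

```python
import subprocess

def run_command(cmdline):
    """Run a shell command without letting it block on interactive input."""
    proc = subprocess.Popen(cmdline, shell=True,
                            stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    # communicate() with empty input closes the child's stdin (EOF),
    # so a program that would sit at a prompt terminates instead.
    out, err = proc.communicate(b'')
    return proc.returncode, out, err
```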
Freesurfer's bbregister program generates a file called register.dat.mincost (where register.dat is whatever name is used for the transform matrix that gets created) that includes an estimate of registration performance, intended to guide the user to registrations that might need fixing. It would be good to have this included in the BBRegister class outputs, as the parsing of that file can then be easily automated to save time/encourage good quality control.
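Once the interface exposes register.dat.mincost as an output, parsing is trivial: bbregister writes a single line of whitespace-separated numbers whose first value is the minimum cost reached (the meaning of the remaining fields is left unstated here):

```python
def parse_mincost(text):
    """Return the minimum registration cost from register.dat.mincost.

    The file holds one line of whitespace-separated floats; the first
    is the minimum cost. Sketch only - not part of BBRegister yet.
    """
    return float(text.split()[0])
```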
This will return a path/filename without checking that it exists.
Change fname_presuffix in list_outputs to use newpath=cwd.
Prior to pickling outputs, one could change the working directory without having to rerun already-processed nodes. Because the pickled file stores an absolute path, this will now return erroneous output.
One solution would be to move toward relative paths within the workflow.
While the other interfaces use in_file, AFNI still has infile.
Since fixing this would break the API, I guess it would be logical to aim for 0.4.0, so I added a new label (hope that is OK).
$> git grep '\Winfile.*File'
doc/devel/interface_specs.rst: infile = File(exists=True,
nipype/interfaces/afni/preprocess.py: infile = File(desc = 'input file to 3drefit',
nipype/interfaces/afni/preprocess.py: infile = File(desc = 'input file to 3dresample',
nipype/interfaces/afni/preprocess.py: infile = File(desc = 'input file to 3dTstat',
nipype/interfaces/afni/preprocess.py: infile = File(desc = 'input file to 3dAutomask',
nipype/interfaces/afni/preprocess.py: infile = File(desc = 'input file to 3dvolreg',
nipype/interfaces/afni/preprocess.py: infile = File(desc = 'input file to 3dvolreg',
nipype/interfaces/afni/preprocess.py: infile = File(desc = 'input file to 3dZcutup',
nipype/interfaces/afni/preprocess.py: infile = File(desc = 'input file to 3dAllineate',
E.g. if it's a file, is it called xxxfile? Or just xxx?
This is currently being done haphazardly even within individual classes, such as Flirt in fsl.
I think we should do this early and do it once - it will cause serious breakage of existing pipelines.
It looks like this could be a simple renaming bugfix, export_graph -> write_graph. However, export_graph from pipeline.utils has e.g. a show= argument that is not available in write_graph, so it is not a drop-in replacement. If somebody could clarify this issue, I can provide a tentative fix.
Thanks
The inputs and outputs in MCFLIRT don't have desc metadata.
The following AFNI classes have no tests:
http://cran.r-project.org/web/packages/fmri/index.html
http://cran.r-project.org/web/packages/dti/index.html
especially adaptive smoothing
Use the nibabel guidelines to generate version info and mvpa options to generate package info.
Add some regression testing on our tutorials so we're sure they continue to work.
modelfit.inputs.fsfdesign.register = True
level1design no longer has a register attribute.
As written, the docstring example for the SegStats interface class in freesurfer.model won't work. To use the avgwf_txt_file input, you need an in_file (which should be a functional or similar image in the same space as your seg or annot file).
When a non-file output (i.e. an integer, string, array, etc.) is passed from one node to another, it is not saved anywhere. Therefore, upon rerunning a workflow, things break.
The way it is done right now limits extensibility (imagine someone inheriting from our SPMCommand and trying to set m-file style).
If you try to connect the output of a MapNode to a DataSink, it crashes when it tries to do os.path manipulation on the input and gets a list. DataSink itself can't be a MapNode (I don't think so, because it assigns the iterfield before the dynamic input traits are created), so it's impossible to save the output of certain (lots of) nodes.
A->B->C
If B is an identity node that passes the inputs from A to C, A currently gets deleted after B has finished executing (under the configurable option), but C still needs the files from A.
Since the underlying source files in the filesystem can change independently of the workflow, DataGrabber should always run, even if its inputs haven't changed.
My interface has two mutually exclusive but mandatory inputs; how can traits handle this?
Do we have mandatory-xor logic for our traits?
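Whatever metadata spelling the trait specs end up with (nipype specs already carry mandatory=True; an xor list would be the natural companion, but that is an assumption here), the check itself is simple. A plain-Python sketch of what a mandatory-xor validation would enforce:

```python
def check_mandatory_xor(inputs, xor_group):
    """Require exactly one of xor_group to be set (not None) in inputs.

    Sketch of the validation a hypothetical mandatory/xor trait
    metadata pair would perform; returns the name that was set.
    """
    set_names = [name for name in xor_group
                 if inputs.get(name) is not None]
    if not set_names:
        raise ValueError('one of %s is mandatory' % (xor_group,))
    if len(set_names) > 1:
        raise ValueError('inputs %s are mutually exclusive' % (set_names,))
    return set_names[0]
```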
nipype (both in parallel and serial modes) can be a real memory hog for big jobs (thousands of nodes). It tends to accumulate more and more memory during execution. I believe that storing too much debugging data (like keeping all of the node objects) might be the problem. Ideally, after successful execution, everything that will not be used again should be removed from memory.
FNIRT needs to be reorganized. It has a fair bit of unused code. We may want to check whether parse_inputs really needs to be overridden.
There really should be an interface for the mri_surfcluster program, which is used to do multiple comparisons correction on the output of mri_glmfit (if using surfaces; mri_volcluster is the volume equivalent) as well as generating cluster summary tables. Of course, this will require mri_glmfit to have outputs.
... or break them down into basic components
Some nodes create output files that are not used as inputs by any other node in the workflow. To be able to recompute workflows efficiently when parts of the data change, it is necessary to keep the workflow working directory. But the directory content might be much heavier than what is needed to avoid unnecessary recomputation. Please add an option/command that strips from the working directory all pieces that are irrelevant for recomputing parts of (or the full) workflow.
Thanks for your consideration,
Michael
In several interfaces, _list_outputs() (formerly aggregate_outputs()) uses glob and "*". We should infer the number of expected files (betas, contrasts, etc.) without querying the filesystem.
In fsl_tutorial2, the inputs to the coregister node are reversed: the functional is used as the reference file, and the structural is used as the in_file. I'm pretty sure that's wrong.
Register nipype with PyPI. Use the setup.py info from nitime+ipython.
If I call e.g. BET with an improper option, the actual call to BET fails, but the nipype line in Python returns without an error. I find this confusing, since it seems like an indication that everything is OK - which it isn't.
If I call a Python function improperly, it fails. If I try to create a file in Python in a place where it is not allowed/possible, it fails. If I make a tool (via a nipype interface) crash/fail (maybe because it cannot create a file for the same reason as before), it doesn't fail - but I cannot see how this last use case differs from the former ones.
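A sketch of the behaviour being asked for: inspect the child's return code and raise, so a failed BET call surfaces like any other Python error (illustrative only, not the interface machinery's actual code):

```python
import subprocess

def run_checked(cmdline):
    """Run a command and raise on failure, mirroring Python's own
    behaviour for bad calls instead of returning silently."""
    proc = subprocess.Popen(cmdline, shell=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    out, err = proc.communicate()
    if proc.returncode != 0:
        raise RuntimeError('command %r failed (exit %d): %s'
                           % (cmdline, proc.returncode,
                              err.decode(errors='replace').strip()))
    return out
```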
Thanks for your consideration,
Michael
SUSAN (FSL's smoothing interface) seems to always rerun, even when none of its inputs have changed. This cascades and forces rerunning of the model.
Create a level1design node for FreeSurfer that allows running volume/surface-based estimation at the first level using mri_glmfit.
This is pretty small-bore, but I just spent a while getting confused by this. ApplyVolTransform writes a *_warped.nii image, but the underlying program only deals with linear/affine transforms, not warps, so this is pretty misleading.
Connecting twice to the same input is, in 99% of cases, a bug in the pipeline definition.
Currently when two pure python nodes (like nipy GLM fit and contrast estimate) are connected (A->B) node A saves the data to file and node B reads it. The reading phase could be skipped by sending the data directly to node B. This, however, creates some problems:
I'm pretty sure, anyway. The DataSink docstring says to use e.g. contrasts@con to stick your contrast images in a contrasts directory. But when I use that, I get a contrasts@con directory with my contrast images in it. Using contrasts.@con, on the other hand, gives me the behavior I expect. This is also what's done in the tutorials. So either I'm totally confused, or that docstring is wrong.
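The behaviour described suggests the '@' component only names the connection (adding no directory level) when it is a separate dot-delimited part. A sketch of that parsing rule as observed, not DataSink's actual code:

```python
import os

def datasink_dest(container, dest_field):
    """Turn a DataSink connection name into an output directory.

    Dots become path separators; a dot-delimited component starting
    with '@' names the connection but adds no directory level, so
    'contrasts.@con' lands under <container>/contrasts/ while
    'contrasts@con' is one literal folder name. Sketch of the
    observed behaviour only.
    """
    parts = [p for p in dest_field.split('.') if not p.startswith('@')]
    return os.path.join(container, *parts) if parts else container
```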
I'm using interfaces, but not in the context of a workflow, and I'd like to set the fieldcoeff_file trait with the name of the file I would like to write. After fnirt runs, though, it raises a FileNotFound exception looking for the default filename (i.e., if I'm passing in T1.nii.gz, it looks for T1_fieldwarp.nii.gz). The fieldwarp file has in fact been written, but with the name I asked for, not the default name. I think this is a bug, or it could just be a result of the traits transition that I haven't seen before.