0.4 brainstorming
1. Git workflow: getting on the same page (we should have a solid model and possibly extend gitwash to reflect our model of development)
a. who has merge rights to nipy/nipype master?
b. who has merge rights to nipy/nipype maint/0.3
c. gitwash:
– add info on how to create a diff for easy review on github
– who should review
– who to send the pull request to (depends on which branch it will merge into)?
– adding tools/suggestions from recent posts to nipy
– clear guideline about which branch (master or maint/0.3) to fork from (API changes vs. API extensions)
d. use of common tags in commit messages, e.g. BUG, ENH, DOC, FA
2. nipy umbrella integration
a. integrate other nipy projects with nipype
Examples:
– nibabel for automatic file-format conversion between nodes
– nipy for model specification and estimation (I have started doing this (and got sidetracked): http://github.com/chrisfilo/nipype/tree/nipy_glm)
– nitime for ‘resting’ connectivity analysis
– dipy nodes for clustering
b. figure out dependencies of the nipy suite as yarik is aggregating different projects
3. architecture: open problems, eeky solutions
a. adding functionality to traits, adding more logic cases:
– handling more complex command-line parsing
– handling file manipulation
b. logging
c. adding setup.py to testing directories
d. traits features/bugs/incompatibilities (eeky solutions; see the sketch after this list):
1. dynamic traits are not pickled properly (CG: And not deepcopied properly either).
2. metadata are class-specific, not trait-instance-specific
3. graphical traits do not take Undefined as a state
e. should interface and workflow be separate? (CG: You mean Interface and Node? If so 1)
f. automated tests against different versions of software dependencies on different platforms?
(CG: Let’s start with automated tutorial running.)
g. integration with ipython 0.11, connectome viewer, slicer
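A minimal sketch of the pickling/deepcopy problem from item d. This only probes the behaviour rather than asserting it; the import path and the actual result depend on the installed traits/Enthought version.

    # Probe for the dynamic-trait pickling/deepcopy issue (d.1).  Older
    # installs import from enthought.traits.api instead of traits.api.
    import copy
    import pickle
    from traits.api import HasTraits, Str

    class Spec(HasTraits):
        name = Str('static trait')

    spec = Spec()
    spec.add_trait('extra', Str())   # trait added dynamically at runtime
    spec.extra = 'dynamic value'

    for label, clone in [('pickle', pickle.loads(pickle.dumps(spec))),
                         ('deepcopy', copy.deepcopy(spec))]:
        # If dynamic traits are dropped, 'extra' is missing from the clone.
        survived = ('extra' in clone.trait_names()
                    and getattr(clone, 'extra', None) == spec.extra)
        print('%s preserves the dynamic trait: %s' % (label, survived))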
4. interfaces:
part 1. metadata + API stabilization + tests
a. what is the set of metadata a trait can have, and what are their defaults (if not None)?
b. additional logical metadata: e.g., "requires one of" (see the sketch after part 1)
c. is the API stable?
(CG: No – we need to once again think about unifying input and output names.)
d. standardized unit testing for interfaces.
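To make items a and b concrete, here is a rough sketch of a spec using metadata we already rely on (desc, mandatory, exists, argstr, xor); the requires_one_of key is invented purely to illustrate the kind of "requires one of" logic item b asks for.

    # Sketch for part 1 (a, b).  desc/mandatory/exists/argstr/xor follow the
    # existing TraitedSpec conventions; requires_one_of is a hypothetical
    # metadata key illustrating the "requires one of" logic from item b.
    from nipype.interfaces.base import TraitedSpec, File, traits

    class ExampleInputSpec(TraitedSpec):
        in_file = File(exists=True, mandatory=True, argstr='%s',
                       desc='input image')
        # mutually exclusive pair, expressed with the existing xor metadata
        fwhm = traits.Float(xor=['sigma'], argstr='-s %.4f',
                            desc='smoothing kernel FWHM (mm)')
        sigma = traits.Float(xor=['fwhm'], argstr='-s %.4f',
                             desc='smoothing kernel sigma (mm)')
        # hypothetical: at least one of these two inputs must be defined
        design = File(exists=True, requires_one_of=['design', 'contrasts'],
                      desc='design matrix')
        contrasts = File(exists=True, requires_one_of=['design', 'contrasts'],
                         desc='contrast definitions')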
part 2. additional interfaces
a. finish wrapping FreeSurfer, SPM, AFNI and FSL interfaces (complete support by 0.4)
(CG: What are we missing from SPM? And FSL (apart from TBSS)?)
(MLW: There are many little FSL utilities that would be helpful, but it would be a huge project to wrap them all.
In terms of TBSS, I think we should just put together workflows that replicate the TBSS shell script progression,
rather than wrap the shell scripts themselves. This is something I’ve been planning on doing for a bit, we just
haven’t focused on TBSS stuff yet so I haven’t been able to find the time.)
“Complete” FreeSurfer support is also probably impossible (again, there are >300 or so FS binaries), but I can
continue working on the various important surface utilities.
b. there should be heuristic interfaces for some of the more common things fslmaths does (e.g., temporal filtering,
isotropic smoothing). That would let us be a little bit more transparent about the inputs, and it would make existing
workflows easier to follow. – MLW (see the sketch at the end of this section)
c. wrap ANTS, Slicer, PyMVPA, PyXNAT (MLW: Which PyMVPA? That could be tricky.)
(CG: Slicer is basically ready… once Enthought fixes the pickle and deepcopy bugs :/)
d. wrap “canica” for ICA analysis
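One possible shape for item b of part 2, sketched as a plain helper rather than a finished interface. Only the fslmaths -s switch and the standard FWHM-to-sigma conversion are taken as given; the function name and the idea of a wrapping IsotropicSmooth-style interface are proposals.

    # Proposal sketch for a heuristic smoothing front end to fslmaths.
    # Only `fslmaths <in> -s <sigma> <out>` and the FWHM -> sigma conversion
    # (sigma = fwhm / (2 * sqrt(2 * ln 2)) ~= fwhm / 2.3548) are standard;
    # the helper itself is hypothetical.
    import math

    def isotropic_smooth_cmd(in_file, out_file, fwhm):
        """Build an fslmaths call that smooths in_file with an FWHM in mm."""
        sigma = fwhm / (2 * math.sqrt(2 * math.log(2)))
        return 'fslmaths %s -s %.5f %s' % (in_file, sigma, out_file)

    # e.g. isotropic_smooth_cmd('func.nii.gz', 'func_smooth.nii.gz', fwhm=6.0)
    # A real interface would put the same conversion inside an FSLCommand
    # subclass so users specify fwhm instead of raw fslmaths switches.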
5. workflow:
a. relocation handling, relative paths
b. add loops and conditionals on workflows
c. graphical interface/alternative (vistrails)
(CG: as an inspiration have a look at this)
d. metadata generation and extraction for inserting into databases (enhanced provenance tracking)
e. XNAT/HID integration
f. backward compatibility, seamless upgrade of processed workflows with new versions of nipype
g. progress tracking across distributed computation (enhanced diagnostic and reporting)
(CG: even for local execution.)
h. centralized common workflows (e.g., see mindflows)
(CG: community portal)
i. web interface for canned workflows (e.g., hospital environments)
j. support for semi-automatic pipelines (when some steps need to be done manually).
k. it really needs to be a lot easier to datasink things in a clean way. As it is now, something like half of my
script code is devoted to generating datasink directories that people unfamiliar with my workflows can
understand. – MLW (see the sketch below)
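A minimal sketch of the DataSink boilerplate item k is complaining about, assuming the current nipype.interfaces.io.DataSink inputs (base_directory, container, substitutions) and the dotted 'folder.@name' connection convention; the node names and substitution strings are made up.

    # Sketch of typical DataSink setup (item k).  Substitution strings and
    # node names are made up; the DataSink inputs used here and the
    # 'folder.@name' connection syntax are assumed from the current API.
    import nipype.interfaces.io as nio
    import nipype.pipeline.engine as pe

    datasink = pe.Node(nio.DataSink(), name='datasink')
    datasink.inputs.base_directory = '/data/project/results'
    datasink.inputs.container = 'subject01'
    # Much of the current script code is lists like this, rewriting the
    # working-directory names into something other people can read:
    datasink.inputs.substitutions = [('_subject_id_', ''),
                                     ('_fwhm_6.0', 'fwhm6'),
                                     ('_realign0', 'realign')]

    # Outputs are routed with the 'folder.@name' convention, e.g.:
    # workflow.connect(smooth, 'smoothed_files', datasink, 'preproc.@smoothed')
    # workflow.connect(level1, 'con_images', datasink, 'stats.@contrasts')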