As you learned in the Workflow tutorial, a workflow is executed with the run
method. For example:
workflow.run()
Whenever you execute a workflow like this, it runs in serial order. This means that no node is executed in parallel, even if nodes are completely independent of each other. While this might be preferable under certain circumstances, we usually want to execute workflows in parallel. For this, Nipype provides many different plugins.
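To see why this matters, here is a minimal sketch (the node names and the toy double function are made up for illustration) of a workflow with two completely independent nodes. Run with plain run(), they still execute one after the other:
from nipype import Node, Workflow
from nipype.interfaces.utility import Function

def double(x):
    return x * 2

# Two nodes that do not depend on each other in any way
node_a = Node(Function(input_names=['x'], output_names=['out'],
                       function=double), name='node_a')
node_b = Node(Function(input_names=['x'], output_names=['out'],
                       function=double), name='node_b')
node_a.inputs.x = 1
node_b.inputs.x = 2

wf = Workflow(name='independent_nodes', base_dir='/tmp')  # base_dir is arbitrary here
wf.add_nodes([node_a, node_b])

wf.run()  # serial: node_b only starts after node_a has finished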
MultiProc Plugin
The easiest way to execute a workflow locally in parallel is the MultiProc plugin:
workflow.run(plugin='MultiProc', plugin_args={'n_procs': 4})
The additional plugin argument n_procs specifies how many cores should be used for the parallel execution; in this case, 4.
The MultiProc plugin uses the multiprocessing package in the standard library, and is the only parallel plugin that is guaranteed to work right out of the box.
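Because it builds on multiprocessing, a natural choice for n_procs is the machine's core count. A small sketch using the standard library's cpu_count (if you omit n_procs entirely, MultiProc falls back to all available cores):
from multiprocessing import cpu_count

# Use every core the machine reports
workflow.run(plugin='MultiProc', plugin_args={'n_procs': cpu_count()})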
There are many different plugins to run Nipype on a cluster, such as PBS, SGE, LSF, Condor, and IPython. Using them is as easy as using 'MultiProc' (the Graph variants below submit the workflow's full dependency graph to the scheduler in one go):
workflow.run('PBS', plugin_args={'qsub_args': '-q many'})
workflow.run('SGE', plugin_args={'qsub_args': '-q many'})
workflow.run('LSF', plugin_args={'qsub_args': '-q many'})
workflow.run('Condor')
workflow.run('IPython')
workflow.run('PBSGraph', plugin_args={'qsub_args': '-q many'})
workflow.run('SGEGraph', plugin_args={'qsub_args': '-q many'})
workflow.run('CondorDAGMan')
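The cluster plugins also accept plugin_args beyond qsub_args; the plugin documentation linked below lists, for example, a template argument for SGE and PBS that points to a custom submission script. A hedged sketch (the template filename is hypothetical):
# Submit via SGE using a custom batch-script template; 'mytemplate.sh'
# is a placeholder for your own file.
workflow.run(plugin='SGE',
             plugin_args={'template': 'mytemplate.sh',
                          'qsub_args': '-q many'})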
For a complete list and explanation of all supported plugins, see: http://nipype.readthedocs.io/en/latest/users/plugins.html