Data input for BIDS datasets

DataGrabber and SelectFiles are great if you are dealing with generic datasets with arbitrary organization. However, if you have decided to use the Brain Imaging Data Structure (BIDS) to organize your data (or got your hands on a BIDS dataset), you can take advantage of the formal structure BIDS imposes. In this short tutorial you will learn how to do this.

pybids - a Python API for working with BIDS datasets

pybids is a lightweight Python API for querying the BIDS folder structure for specific files and metadata. You can install it from PyPI:

pip install pybids

Please note that it should already be installed in the tutorial Docker image.

The layout object and simple queries

To begin working with pybids we need to initialize a layout object. We will use it for all of our queries.


In [1]:
from bids.grabbids import BIDSLayout
layout = BIDSLayout("/data/ds102/")

In [2]:
!tree /data/ds102/


/data/ds102/
├── CHANGES
├── dataset_description.json
├── participants.tsv
├── README
├── sub-01
│   ├── anat
│   │   └── sub-01_T1w.nii.gz
│   └── func
│       ├── sub-01_task-flanker_run-1_bold.nii.gz
│       ├── sub-01_task-flanker_run-1_events.tsv
│       ├── sub-01_task-flanker_run-2_bold.nii.gz
│       └── sub-01_task-flanker_run-2_events.tsv
├── sub-02
│   ├── anat
│   │   └── sub-02_T1w.nii.gz
│   └── func
│       ├── sub-02_task-flanker_run-1_bold.nii.gz
│       ├── sub-02_task-flanker_run-1_events.tsv
│       ├── sub-02_task-flanker_run-2_bold.nii.gz
│       └── sub-02_task-flanker_run-2_events.tsv
├── sub-03
│   ├── anat
│   │   └── sub-03_T1w.nii.gz
│   └── func
│       ├── sub-03_task-flanker_run-1_bold.nii.gz
│       ├── sub-03_task-flanker_run-1_events.tsv
│       ├── sub-03_task-flanker_run-2_bold.nii.gz
│       └── sub-03_task-flanker_run-2_events.tsv
├── sub-04
│   ├── anat
│   │   └── sub-04_T1w.nii.gz
│   └── func
│       ├── sub-04_task-flanker_run-1_bold.nii.gz
│       ├── sub-04_task-flanker_run-1_events.tsv
│       ├── sub-04_task-flanker_run-2_bold.nii.gz
│       └── sub-04_task-flanker_run-2_events.tsv
├── sub-05
│   ├── anat
│   │   └── sub-05_T1w.nii.gz
│   └── func
│       ├── sub-05_task-flanker_run-1_bold.nii.gz
│       ├── sub-05_task-flanker_run-1_events.tsv
│       ├── sub-05_task-flanker_run-2_bold.nii.gz
│       └── sub-05_task-flanker_run-2_events.tsv
├── T1w.json
└── task-flanker_bold.json

15 directories, 31 files

Let's figure out what the subject labels in this dataset are.


In [3]:
layout.get_subjects()


Out[3]:
['01', '02', '03', '04', '05']

What modalities are included in this dataset?


In [4]:
layout.get_modalities()


Out[4]:
['anat', 'func']

What different data types are included in this dataset?


In [5]:
layout.get_types()


Out[5]:
['bold', 'events', 'T1w']

In [6]:
layout.get_types(modality='func')


Out[6]:
['bold', 'events']

What are the different tasks included in this dataset?


In [7]:
layout.get_tasks()


Out[7]:
['flanker']

We can also ask for all of the data for a particular subject.


In [8]:
layout.get(subject='01')


Out[8]:
[File(filename='/data/ds102/sub-01/anat/sub-01_T1w.nii.gz', type='T1w', modality='anat', subject='01'),
 File(filename='/data/ds102/sub-01/func/sub-01_task-flanker_run-1_bold.nii.gz', type='bold', task='flanker', run='1', modality='func', subject='01'),
 File(filename='/data/ds102/sub-01/func/sub-01_task-flanker_run-1_events.tsv', type='events', task='flanker', run='1', modality='func', subject='01'),
 File(filename='/data/ds102/sub-01/func/sub-01_task-flanker_run-2_bold.nii.gz', type='bold', task='flanker', run='2', modality='func', subject='01'),
 File(filename='/data/ds102/sub-01/func/sub-01_task-flanker_run-2_events.tsv', type='events', task='flanker', run='2', modality='func', subject='01')]

We can also ask for a specific subset of data. Note that we are using the extensions filter to get just the imaging data (BIDS allows both .nii and .nii.gz, so we need to include both).


In [9]:
layout.get(subject='01', type='bold', extensions=['nii', 'nii.gz'])


Out[9]:
[File(filename='/data/ds102/sub-01/func/sub-01_task-flanker_run-1_bold.nii.gz', type='bold', task='flanker', run='1', modality='func', subject='01'),
 File(filename='/data/ds102/sub-01/func/sub-01_task-flanker_run-2_bold.nii.gz', type='bold', task='flanker', run='2', modality='func', subject='01')]

You probably noticed that this method returns not just the file paths, but objects with the relevant query fields. We can easily extract just the file paths.


In [10]:
[f.filename for f in layout.get(subject='01', type='T1w', extensions=['nii', 'nii.gz'])]


Out[10]:
['/data/ds102/sub-01/anat/sub-01_T1w.nii.gz']

Exercise 1:

List all of the BOLD files for the flanker task for subject 03, but only from the second run.
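
One possible solution (a sketch, assuming the run entity can be queried with a string value, as it appears in the query fields above):

[f.filename for f in layout.get(subject='03', task='flanker', run='2', type='bold',
                                extensions=['nii', 'nii.gz'])]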

Including pybids in your nipype workflow

This is great, but what we really want is to incorporate this into our Nipype workflows. How can we do this? We can create our own custom BIDSDataGrabber using a Function interface. First we need a plain Python function that, for a given subject label and dataset location, returns the lists of BOLD and T1w files.


In [11]:
def get_niftis(subject_id, data_dir):
    # Remember that all the necessary imports need to be INSIDE the function for the Function interface to work!
    from bids.grabbids import BIDSLayout
    
    layout = BIDSLayout(data_dir)
    
    bolds = [f.filename for f in layout.get(subject=subject_id, type='bold', extensions=['nii', 'nii.gz'])]
    
    return bolds

In [12]:
get_niftis('01', '/data/ds102')


Out[12]:
['/data/ds102/sub-01/func/sub-01_task-flanker_run-1_bold.nii.gz',
 '/data/ds102/sub-01/func/sub-01_task-flanker_run-2_bold.nii.gz']

OK, we have our function. Now we need to wrap it inside a Node object.


In [13]:
from nipype.pipeline import Node, MapNode, Workflow
from nipype.interfaces.utility import IdentityInterface, Function

In [14]:
BIDSDataGrabber = Node(Function(function=get_niftis,
                                input_names=["subject_id", "data_dir"],
                                output_names=["bolds", "T1ws"]),
                       name="BIDSDataGrabber")
BIDSDataGrabber.inputs.data_dir = "/data/ds102"

In [15]:
BIDSDataGrabber.inputs.subject_id='01'
res = BIDSDataGrabber.run()
res.outputs


170325-00:11:17,133 workflow INFO:
	 Executing node BIDSDataGrabber in dir: /tmp/tmpIvJ0lL/BIDSDataGrabber
Out[15]:
T1ws = /data/ds102/sub-01/func/sub-01_task-flanker_run-2_bold.nii.gz
bolds = /data/ds102/sub-01/func/sub-01_task-flanker_run-1_bold.nii.gz

Works like a charm! (Hopefully. :) Note, though, that the outputs above look a bit odd: because get_niftis returns a single list while the node declares two output names, the two BOLD files were split across bolds and T1ws; Exercise 2 below addresses this. Now let's put it in a workflow. We are not going to analyze any data, but for demonstration purposes we will add a couple of nodes that pretend to analyze their inputs.


In [16]:
def printMe(paths):
    print("\n\nanalyzing " + str(paths) + "\n\n")
    
analyzeBOLD = Node(Function(function=printMe, input_names=["paths"],
                            output_names=[]), name="analyzeBOLD")

In [17]:
wf = Workflow(name="bids_demo")
wf.connect(BIDSDataGrabber, "bolds", analyzeBOLD, "paths")
wf.run()


170325-00:11:17,378 workflow INFO:
	 Workflow bids_demo settings: ['check', 'execution', 'logging']
170325-00:11:17,610 workflow INFO:
	 Running serially.
170325-00:11:17,612 workflow INFO:
	 Executing node BIDSDataGrabber in dir: /tmp/tmpgVOH_Z/bids_demo/BIDSDataGrabber
170325-00:11:17,676 workflow INFO:
	 Executing node analyzeBOLD in dir: /tmp/tmpoesKqC/bids_demo/analyzeBOLD


analyzing /data/ds102/sub-01/func/sub-01_task-flanker_run-1_bold.nii.gz


Out[17]:
<networkx.classes.digraph.DiGraph at 0x7efda2cc9a10>

Exercise 2:

Modify the BIDSDataGrabber and the workflow to include T1ws.
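
One way to approach this (a sketch; the analyzeT1w node name is just an illustration): return both lists from get_niftis so that they map onto the two declared output names, then connect the T1ws output to a second "analysis" node.

def get_niftis(subject_id, data_dir):
    # Remember that all the necessary imports need to be INSIDE the function!
    from bids.grabbids import BIDSLayout

    layout = BIDSLayout(data_dir)

    bolds = [f.filename for f in layout.get(subject=subject_id, type='bold',
                                            extensions=['nii', 'nii.gz'])]
    T1ws = [f.filename for f in layout.get(subject=subject_id, type='T1w',
                                           extensions=['nii', 'nii.gz'])]

    # Returning two values maps them onto the "bolds" and "T1ws" output names
    return bolds, T1ws

BIDSDataGrabber = Node(Function(function=get_niftis,
                                input_names=["subject_id", "data_dir"],
                                output_names=["bolds", "T1ws"]),
                       name="BIDSDataGrabber")
BIDSDataGrabber.inputs.data_dir = "/data/ds102"

analyzeT1w = Node(Function(function=printMe, input_names=["paths"],
                           output_names=[]), name="analyzeT1w")

wf = Workflow(name="bids_demo")
wf.connect(BIDSDataGrabber, "bolds", analyzeBOLD, "paths")
wf.connect(BIDSDataGrabber, "T1ws", analyzeT1w, "paths")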

Iterating over subject labels

In the previous example we demonstrated how to use pybids to "analyze" one subject. How can we scale this to all subjects? Easy: using iterables.


In [18]:
BIDSDataGrabber.iterables = ('subject_id', layout.get_subjects())
wf.run()


170325-00:11:17,770 workflow INFO:
	 Workflow bids_demo settings: ['check', 'execution', 'logging']
170325-00:11:17,973 workflow INFO:
	 Running serially.
170325-00:11:17,977 workflow INFO:
	 Executing node BIDSDataGrabber.aI.a0 in dir: /tmp/tmppeNgld/bids_demo/_subject_id_01/BIDSDataGrabber
170325-00:11:18,55 workflow INFO:
	 Executing node analyzeBOLD.a0 in dir: /tmp/tmpQkxaL3/bids_demo/_subject_id_01/analyzeBOLD


analyzing /data/ds102/sub-01/func/sub-01_task-flanker_run-1_bold.nii.gz


170325-00:11:18,128 workflow INFO:
	 Executing node BIDSDataGrabber.aI.a4 in dir: /tmp/tmpZ3qzlP/bids_demo/_subject_id_05/BIDSDataGrabber
170325-00:11:18,220 workflow INFO:
	 Executing node BIDSDataGrabber.aI.a3 in dir: /tmp/tmpXC27i8/bids_demo/_subject_id_04/BIDSDataGrabber
170325-00:11:18,307 workflow INFO:
	 Executing node analyzeBOLD.a3 in dir: /tmp/tmp1lKN4t/bids_demo/_subject_id_04/analyzeBOLD


analyzing /data/ds102/sub-04/func/sub-04_task-flanker_run-1_bold.nii.gz


170325-00:11:18,355 workflow INFO:
	 Executing node BIDSDataGrabber.aI.a1 in dir: /tmp/tmpMNKmQZ/bids_demo/_subject_id_02/BIDSDataGrabber
170325-00:11:18,448 workflow INFO:
	 Executing node analyzeBOLD.a1 in dir: /tmp/tmpFEgJeg/bids_demo/_subject_id_02/analyzeBOLD


analyzing /data/ds102/sub-02/func/sub-02_task-flanker_run-1_bold.nii.gz


170325-00:11:18,512 workflow INFO:
	 Executing node BIDSDataGrabber.aI.a2 in dir: /tmp/tmpp4stHi/bids_demo/_subject_id_03/BIDSDataGrabber
170325-00:11:18,617 workflow INFO:
	 Executing node analyzeBOLD.a2 in dir: /tmp/tmpVxCtOX/bids_demo/_subject_id_03/analyzeBOLD


analyzing /data/ds102/sub-03/func/sub-03_task-flanker_run-1_bold.nii.gz


170325-00:11:18,673 workflow INFO:
	 Executing node analyzeBOLD.a4 in dir: /tmp/tmpZkO66V/bids_demo/_subject_id_05/analyzeBOLD


analyzing /data/ds102/sub-05/func/sub-05_task-flanker_run-1_bold.nii.gz


Out[18]:
<networkx.classes.digraph.DiGraph at 0x7efda2d5c4d0>

Accessing additional metadata

Querying for files is nice, but sometimes you want to access more metadata, for example the RepetitionTime. pybids can help with that as well.


In [19]:
layout.get_metadata('/data/ds102/sub-01/func/sub-01_task-flanker_run-1_bold.nii.gz')


Out[19]:
{u'CogAtlasID': u'tsk_4a57abb949a4f',
 u'EchoTime': 0.03,
 u'FlipAngle': 80.0,
 u'MRAcquisitionType': u'2D',
 u'MagneticFieldStrength': 3.0,
 u'Manufacturer': u'Siemens',
 u'ManufacturerModelName': u'Allegra',
 u'RepetitionTime': 2.0,
 u'ScanningSequence': u'Echo Planar',
 u'TaskDescription': u'On each trial (inter-trial interval (ITI) varied between 8 s and 14 s; mean ITI=12 s),participants used one of two buttons on a response pad to indicate the direction of a central arrow in an array of 5 arrows. In congruent trials the flanking arrows pointed in the same direction as the central arrow (e.g., < < < < <), while in more demanding incongruent trials the flanking arrows pointed in the opposite direction (e.g., < < > < <). Subjects performed two 5-minute blocks, each containing 12 congruent and 12 incongruent trials, presented in a pseudorandom order.',
 u'TaskName': u'Flanker'}

Can we incorporate this into our pipeline? Yes we can!


In [20]:
def printMetadata(path, data_dir):
    from bids.grabbids import BIDSLayout
    layout = BIDSLayout(data_dir)
    print("\n\nanalyzing " + path + "\nTR: "+ str(layout.get_metadata(path)["RepetitionTime"]) + "\n\n")
    
analyzeBOLD2 = MapNode(Function(function=printMetadata,
                                input_names=["path", "data_dir"],
                                output_names=[]),
                       name="analyzeBOLD2", iterfield="path")
analyzeBOLD2.inputs.data_dir = "/data/ds102/"

In [21]:
wf = Workflow(name="bids_demo")
wf.connect(BIDSDataGrabber, "bolds", analyzeBOLD2, "path")
wf.run()


170325-00:11:18,852 workflow INFO:
	 Workflow bids_demo settings: ['check', 'execution', 'logging']
170325-00:11:19,95 workflow INFO:
	 Running serially.
170325-00:11:19,98 workflow INFO:
	 Executing node BIDSDataGrabber.aI.a4 in dir: /tmp/tmpQWOjgy/bids_demo/_subject_id_05/BIDSDataGrabber
170325-00:11:19,212 workflow INFO:
	 Executing node analyzeBOLD2.a4 in dir: /tmp/tmpcTFJM5/bids_demo/_subject_id_05/analyzeBOLD2
170325-00:11:19,227 workflow INFO:
	 Executing node _analyzeBOLD20 in dir: /tmp/tmpcTFJM5/bids_demo/_subject_id_05/analyzeBOLD2/mapflow/_analyzeBOLD20


analyzing /data/ds102/sub-05/func/sub-05_task-flanker_run-1_bold.nii.gz
TR: 2.0


170325-00:11:19,340 workflow INFO:
	 Executing node BIDSDataGrabber.aI.a0 in dir: /tmp/tmpGppMhP/bids_demo/_subject_id_01/BIDSDataGrabber
170325-00:11:19,433 workflow INFO:
	 Executing node analyzeBOLD2.a0 in dir: /tmp/tmp6MPMHH/bids_demo/_subject_id_01/analyzeBOLD2
170325-00:11:19,451 workflow INFO:
	 Executing node _analyzeBOLD20 in dir: /tmp/tmp6MPMHH/bids_demo/_subject_id_01/analyzeBOLD2/mapflow/_analyzeBOLD20


analyzing /data/ds102/sub-01/func/sub-01_task-flanker_run-1_bold.nii.gz
TR: 2.0


170325-00:11:19,518 workflow INFO:
	 Executing node BIDSDataGrabber.aI.a2 in dir: /tmp/tmpfEjhs6/bids_demo/_subject_id_03/BIDSDataGrabber
170325-00:11:19,603 workflow INFO:
	 Executing node BIDSDataGrabber.aI.a1 in dir: /tmp/tmpCfSsJs/bids_demo/_subject_id_02/BIDSDataGrabber
170325-00:11:19,684 workflow INFO:
	 Executing node analyzeBOLD2.a1 in dir: /tmp/tmpmdEi5m/bids_demo/_subject_id_02/analyzeBOLD2
170325-00:11:19,720 workflow INFO:
	 Executing node _analyzeBOLD20 in dir: /tmp/tmpmdEi5m/bids_demo/_subject_id_02/analyzeBOLD2/mapflow/_analyzeBOLD20


analyzing /data/ds102/sub-02/func/sub-02_task-flanker_run-1_bold.nii.gz
TR: 2.0


170325-00:11:19,798 workflow INFO:
	 Executing node analyzeBOLD2.a2 in dir: /tmp/tmpUL2soh/bids_demo/_subject_id_03/analyzeBOLD2
170325-00:11:19,825 workflow INFO:
	 Executing node _analyzeBOLD20 in dir: /tmp/tmpUL2soh/bids_demo/_subject_id_03/analyzeBOLD2/mapflow/_analyzeBOLD20


analyzing /data/ds102/sub-03/func/sub-03_task-flanker_run-1_bold.nii.gz
TR: 2.0


170325-00:11:19,911 workflow INFO:
	 Executing node BIDSDataGrabber.aI.a3 in dir: /tmp/tmpREkFNf/bids_demo/_subject_id_04/BIDSDataGrabber
170325-00:11:20,3 workflow INFO:
	 Executing node analyzeBOLD2.a3 in dir: /tmp/tmpePChTv/bids_demo/_subject_id_04/analyzeBOLD2
170325-00:11:20,42 workflow INFO:
	 Executing node _analyzeBOLD20 in dir: /tmp/tmpePChTv/bids_demo/_subject_id_04/analyzeBOLD2/mapflow/_analyzeBOLD20


analyzing /data/ds102/sub-04/func/sub-04_task-flanker_run-1_bold.nii.gz
TR: 2.0


Out[21]:
<networkx.classes.digraph.DiGraph at 0x7efda2cc9d50>

Exercise 3:

Modify the printMetadata function to also print EchoTime
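
A possible modification (a sketch; EchoTime appears in the metadata dictionary shown in Out[19]):

def printMetadata(path, data_dir):
    from bids.grabbids import BIDSLayout
    layout = BIDSLayout(data_dir)
    metadata = layout.get_metadata(path)
    # Print both the repetition time and the echo time for this run
    print("\n\nanalyzing " + path +
          "\nTR: " + str(metadata["RepetitionTime"]) +
          "\nTE: " + str(metadata["EchoTime"]) + "\n\n")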