Post-mortem macaque tractography
QuNex supports tractography of post-mortem macaque data. Below are the steps included in this processing pipeline.
Study creation
As with any other processing, we start by creating a study:
qunex create_study \
--studyfolder=/data/macaque_study
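If the command succeeds, the study skeleton is in place. A quick sanity check (the subfolder names in the comment are assumed from the standard QuNex study layout):
ls /data/macaque_study
# expected (standard QuNex layout): analysis  info  processing  sessions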
Importing the data into QuNex
For onboarding post-mortem macaque data, QuNex uses a specialized function called `import_nhp`. The `import_nhp` command can import both folders and compressed archives (`.zip` and `.tar`) containing the data. The visualization below depicts how the input data should be structured.
sessions
|
├─ session1
| └─ dMRI
| ├─ bvals
| ├─ bvecs
| ├─ data.nii.gz
| └─ nodif_brain_mask.nii.gz
|
└─ session2
└─ dMRI
├─ bvals
├─ bvecs
├─ data.nii.gz
└─ nodif_brain_mask.nii.gz
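If your raw data already follows this layout, you can point `import_nhp` at the folder directly or package it first. A minimal sketch that bundles the two example sessions into a `.tar.gz` archive (paths hypothetical):
# hypothetical raw-data location; adjust to your setup
cd /data/macaque_raw
tar -czf macaque_sessions.tar.gz session1 session2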
The following parameters are relevant for `import_nhp`:
--sessionsfolder    The sessions folder where all the sessions are to be
                    mapped to. It should be a folder within the
                    <study folder>. [.]

--inbox             The location of the NHP dataset. It can be a folder
                    that contains the NHP datasets or compressed `.zip`
                    or `.tar.gz` packages that contain a single session
                    or a multi-session dataset. For instance, the user
                    can specify "<path>/<nhp_file>.zip" or "<path>" to
                    a folder that contains multiple packages. The default
                    location where the command will look for an NHP
                    dataset is [<sessionsfolder>/inbox/NHP].

--sessions          An optional parameter that specifies a comma or pipe
                    separated list of sessions from the inbox folder to
                    be processed. Regular expression patterns can be
                    used. If provided, only packages or folders within
                    the inbox that match the list of sessions will be
                    processed. If `inbox` is a file, `sessions` will not
                    be applied.
                    Note: a session will match if the string is found
                    within the package name or the session id, so "NHP"
                    will match any zip file whose name contains the
                    string "NHP" or any session id that contains "NHP"!

--action            How to map the files to the QuNex structure. ["link"]
                    The following actions are supported:
                    - link (files will be mapped by creating hard links
                      if possible, otherwise they will be copied)
                    - copy (files will be copied)
                    - move (files will be moved)

--overwrite         What should be done with data that already exists in
                    the locations to which NHP data would be mapped.
                    ["no"] Options are:
                    - no (do not overwrite the data and skip processing
                      of the session)
                    - yes (remove existing files in the `nii` folder and
                      redo the mapping)

--archive           What to do with the files after they have been
                    mapped. ["move"] Options are:
                    - leave (leave the specified archive where it is)
                    - move (move the specified archive to
                      `<sessionsfolder>/archive/NHP`)
                    - copy (copy the specified archive to
                      `<sessionsfolder>/archive/NHP`)
                    - delete (delete the archive after processing if no
                      errors were identified)
                    Please note that there can be an interaction with
                    the `action` parameter: if files are moved during
                    mapping, they will be missing from the archive when
                    `archive` is set to "move" or "copy".
If the data onboarding process was successful, dMRI images for each session will be stored in:
<sessionsfolder>/<session>/dMRI
Below is an example of data import:
qunex import_nhp \
--sessionsfolder=/data/macaque_study/sessions \
--inbox=/data/macaque_raw \
--archive=leave \
--overwrite=yes
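If the inbox holds more packages than you want to onboard, the `--sessions` filter narrows the import. A sketch that imports only the two sessions used later in this walkthrough (pipe-separated, as described above):
qunex import_nhp \
    --sessionsfolder=/data/macaque_study/sessions \
    --inbox=/data/macaque_raw \
    --sessions="hilary|jane" \
    --archive=leave \
    --overwrite=yes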
DTIFIT
The next processing step in this pipeline is FSL's DTIFIT (detailed documentation):
qunex dwi_dtifit \
--sessionsfolder=/data/macaque_study/sessions \
--sessions="hilary,jane" \
--species="macaque"
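Like the BEDPOSTX step below, `dwi_dtifit` can also be submitted to a cluster through the `scheduler` parameter. A sketch assuming a SLURM cluster, with illustrative resource values:
qunex dwi_dtifit \
    --sessionsfolder=/data/macaque_study/sessions \
    --sessions="hilary,jane" \
    --species="macaque" \
    --scheduler="SLURM,time=02:00:00,cpus-per-task=1,mem-per-cpu=16000,jobname=qx_dtifit"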
BEDPOSTX
After DTIFIT comes FSL's BEDPOSTX (detailed documentation):
qunex dwi_bedpostx_gpu \
--sessionsfolder=/data/macaque_study/sessions \
--sessions="hilary,jane" \
--species="macaque" \
--bash="module load CUDA/9.1.85" \
--scheduler="SLURM,time=12:00:00,cpus-per-task=1,mem-per-cpu=16000,gpus=1,jobname=qx_bedpostx"
Since QuNex uses FSL's GPU BEDPOSTX implementation, the example above schedules the command for execution on a GPU node. It also loads the appropriate CUDA module through the `bash` parameter.
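Once submitted, the job can be monitored with standard SLURM tooling, for example by filtering on the job name set in the scheduler string:
squeue --name=qx_bedpostx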
F99 registration
Next, we use the `dwi_f99` QuNex command (detailed documentation) to register our diffusion data to the F99 atlas:
qunex dwi_f99 \
--sessionsfolder=/data/macaque_study/sessions \
--sessions="jane,hilary"
XTRACT tractography
Finally, we can use `dwi_xtract` (detailed documentation) to get the tracts:
qunex dwi_xtract \
--sessionsfolder=/data/macaque_study/sessions \
--sessions="jane,hilary" \
--species="macaque"
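Putting it all together, the steps above can be chained in a single script. A minimal sketch using the same paths and session ids as the examples (scheduling and module loading omitted for brevity):
#!/bin/bash
# Hypothetical end-to-end driver for the examples above.
# Assumes it runs on a GPU node with CUDA available (see the BEDPOSTX section).
STUDY=/data/macaque_study

qunex create_study --studyfolder="${STUDY}"

qunex import_nhp \
    --sessionsfolder="${STUDY}/sessions" \
    --inbox=/data/macaque_raw \
    --archive=leave \
    --overwrite=yes

qunex dwi_dtifit \
    --sessionsfolder="${STUDY}/sessions" \
    --sessions="hilary,jane" \
    --species="macaque"

qunex dwi_bedpostx_gpu \
    --sessionsfolder="${STUDY}/sessions" \
    --sessions="hilary,jane" \
    --species="macaque"

qunex dwi_f99 \
    --sessionsfolder="${STUDY}/sessions" \
    --sessions="hilary,jane"

qunex dwi_xtract \
    --sessionsfolder="${STUDY}/sessions" \
    --sessions="hilary,jane" \
    --species="macaque"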