# Running DWI analyses

The commands below prepare the data for a number of common DWI analyses, including diffusion tensor imaging (DTI) and probabilistic tractography. Each command explicitly assumes the Human Connectome Project preprocessing folder structure. All of the listed steps are performed using the `qunex` processing wrapper. These commands are:

* [`dwi_bedpostx_gpu`](../../api/gmri/dwi_bedpostx_gpu.rst)
* [`dwi_dtifit`](../../api/gmri/dwi_dtifit.rst)
* [`dwi_pre_tractography`](../../api/gmri/dwi_pre_tractography.rst)
* [`dwi_probtrackx_dense_gpu`](../../api/gmri/dwi_probtrackx_dense_gpu.rst)
* [`dwi_parcellate`](../../api/gmri/dwi_parcellate.rst)
* [`dwi_seed_tractography_dense`](../../api/gmri/dwi_seed_tractography_dense.rst)
* [`dwi_xtract`](../../api/gmri/dwi_xtract.rst)
* [`dwi_f99`](../../api/gmri/dwi_f99.rst)

Some DWI commands also support processing of post-mortem macaque data. See [Post-mortem macaque tractography](../UsageDocs/PostMortemMacaque) for details about how to process such data.

## dwi_bedpostx_gpu

`dwi_bedpostx_gpu` models crossing fibers within each voxel using [FSL's bedpostx](https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FDT/UserGuide#BEDPOSTX "FSL bedpostx User Guide"). It estimates the probability of diffusion from each voxel in every direction versus all other directions, building the distributions that are required for running probabilistic tractography. Human Connectome Project minimal preprocessing diffusion results must be present before running the command. A GPU-enabled node is required.

### dwi_bedpostx_gpu - example

``` bash
qunex dwi_bedpostx_gpu \
    --sessionsfolder="" \
    --sessions="" \
    --fibers="" \
    --burnin="" \
    --model="" \
    --overwrite="no" \
    --scheduler=
```

The breakdown of the parameters is as follows:

* `sessionsfolder` Path to sessions folder within the study folder.
* `sessions` List of sessions to run command on.
* `species` Set to "macaque" when processing macaque data.
* `fibers` Number of fibres per voxel; default is set to "3".
* `weight` ARD weight; more weight means fewer secondary fibres per voxel. Default is set to "1".
* `burnin` Burnin period; default is set to "1000".
* `jumps` Number of jumps; default is set to "1250".
* `sample` Sample every; default is set to "25".
* `model` Deconvolution model; 1: with sticks, 2: with sticks with a range of diffusivities (default), 3: with zeppelins.
* `rician` Replace the default Gaussian noise assumption with Rician noise; default is set to "yes".
* `gradnonlin` Consider gradient nonlinearities (yes/no). By default set automatically: yes if the file `grad_dev.nii.gz` is present, no if it is not.
* `overwrite` Set to "yes" to overwrite previous data; default is set to "no".
* `scheduler` A string for the cluster scheduler (e.g. PBS or SLURM) followed by relevant options. For example, for SLURM the string would look like this: `--scheduler='SLURM,jobname=,time=,cpus-per-task=,mem-per-cpu=,partition='`

## dwi_dtifit

[`dwi_dtifit`](../../api/gmri/dwi_dtifit.rst) carries out diffusion tensor fitting at each voxel using [FSL's dtifit](https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FDT/UserGuide#DTIFIT "FSL dtifit User Guide"). Its outputs include mean diffusivity and fractional anisotropy. Note that `dwi_dtifit` is not required for probabilistic tractography. Human Connectome Project minimal preprocessing diffusion results must be present before running the command.
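Before launching either of the commands above, it can be useful to quickly verify that the minimally preprocessed diffusion results exist for every session. Below is a minimal sketch, assuming the standard HCP layout under the QuNex sessions folder; the exact folder and file names are an assumption based on the `T1w/Diffusion` paths referenced in this page and should be adjusted to match your own preprocessing outputs:

``` bash
#!/bin/bash
# Hypothetical quick check for minimally preprocessed diffusion data.
# Assumed layout: $SESSIONSFOLDER/<session>/hcp/<session>/T1w/Diffusion
SESSIONSFOLDER=/data/study/sessions   # illustrative path

for sessionpath in "$SESSIONSFOLDER"/*/; do
    session=$(basename "$sessionpath")
    diffdir="$SESSIONSFOLDER/$session/hcp/$session/T1w/Diffusion"
    for f in data.nii.gz bvals bvecs nodif_brain_mask.nii.gz; do
        # Report any session that is missing one of the expected inputs
        [ -e "$diffdir/$f" ] || echo "MISSING: $session -> $diffdir/$f"
    done
done
```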
### dwi_dtifit - example

``` bash
qunex dwi_dtifit \
    --sessionsfolder="" \
    --sessions="" \
    --overwrite="no" \
    --scheduler=
```

The breakdown of the parameters is as follows:

* `sessionsfolder` Path to sessions folder within the study folder.
* `sessions` List of sessions to run command on.
* `species` Set to "macaque" when processing macaque data.
* `overwrite` Set to "yes" to overwrite previous data; default is set to "no".
* `mask` Bet binary mask file [T1w/diffusion/nodif_brain_mask].
* `bvecs` b vectors file [T1w/diffusion/bvecs].
* `bvals` b values file [T1w/diffusion/bvals].
* `cni` Input confound regressors [not set by default].
* `sse` Output sum of squared errors [not set by default].
* `wls` Fit the tensor with weighted least squares [not set by default].
* `kurt` Output mean kurtosis map (for multi-shell data) [not set by default].
* `kurtdir` Output parallel/perpendicular kurtosis maps (for multi-shell data) [not set by default].
* `littlebit` Only process small area of brain [not set by default].
* `save_tensor` Save the elements of the tensor [not set by default].
* `zmin` Min z [not set by default].
* `zmax` Max z [not set by default].
* `ymin` Min y [not set by default].
* `ymax` Max y [not set by default].
* `xmin` Min x [not set by default].
* `xmax` Max x [not set by default].
* `gradnonlin` Gradient nonlinearity tensor file [not set by default].
* `scheduler` A string for the cluster scheduler (e.g. PBS or SLURM) followed by relevant options. For example, for SLURM the string would look like this: `--scheduler='SLURM,jobname=,time=,cpus-per-task=,mem-per-cpu=,partition='`.

## dwi_pre_tractography

`dwi_pre_tractography` runs pretractography dense trajectory space generation. The command is very quick to run, so no overwrite options exist (new outputs that overwrite old ones will always be generated).

### dwi_pre_tractography - example

``` bash
qunex dwi_pre_tractography \
    --sessionsfolder="" \
    --sessions="" \
    --scheduler=""
```

The breakdown of the parameters is as follows:

* `sessionsfolder` Path to sessions folder within the study folder.
* `sessions` List of sessions to run command on.
* `scheduler` A string for the cluster scheduler (e.g. PBS or SLURM) followed by relevant options. For example, for SLURM the string would look like this: `--scheduler='SLURM,jobname=,time=,cpus-per-task=,mem-per-cpu=,partition='`

## dwi_probtrackx_dense_gpu

`dwi_probtrackx_dense_gpu` samples the `dwi_bedpostx_gpu` distribution results using [FSL's probtrackx](https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FDT/UserGuide#PROBTRACKX_-_probabilistic_tracking_with_crossing_fibres "FSL probtrackx User Guide"). A whole-brain dense connectome is generated, showing the probability of streamline connections from every voxel to every other voxel. The QuNex suite is cluster-enabled by default and a GPU-enabled node is required. The command can produce two versions of the dense connectivity matrix based on different seeding strategies:

* Matrix 1 computes probabilistic tractography between each grey matter point and every other grey matter point.
* Matrix 3 starts with a white matter voxel and computes the tractography in both directions (along a given orientation) to the two grey matter points at either end.

Whereas Matrix 1 is unidirectional surface-to-surface, Matrix 3 is bidirectional voxel-to-surface, which better reflects long-range projections.
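For instance, a call that computes both matrix versions for two sessions on a SLURM cluster with GPU nodes might look like the sketch below; the paths, session names, and scheduler settings are purely illustrative, and the generic form of the call is given in the example that follows.

``` bash
# Illustrative call: compute Matrix 1 and Matrix 3 with the default sample counts.
qunex dwi_probtrackx_dense_gpu \
    --sessionsfolder=/data/test_study/sessions \
    --sessions="s001,s002" \
    --omatrix1="yes" \
    --omatrix3="yes" \
    --nsamplesmatrix1="10000" \
    --nsamplesmatrix3="3000" \
    --overwrite="no" \
    --scheduler="SLURM,time=24:00:00,cpus-per-task=1,mem-per-cpu=16000,gpus=1,jobname=qx_probtrackx"
```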
### dwi_probtrackx_dense_gpu - example

``` bash
qunex dwi_probtrackx_dense_gpu \
    --sessionsfolder="" \
    --sessions="" \
    --omatrix1="" \
    --omatrix3="" \
    --nsamplesmatrix1="" \
    --nsamplesmatrix3="" \
    --overwrite="no" \
    --scheduler=""
```

The breakdown of the parameters is as follows:

* `sessionsfolder` Path to sessions folder within the study folder.
* `sessions` List of sessions to run command on.
* `omatrix1` Specify if you wish to run the matrix 1 model; "yes" or omit the flag.
* `omatrix3` Specify if you wish to run the matrix 3 model; "yes" or omit the flag.
* `nsamplesmatrix1` Number of samples; default is set to "10000".
* `nsamplesmatrix3` Number of samples; default is set to "3000".
* `distancecorrection` Use distance correction; default is set to "no".
* `storestreamlineslength` Store average length of the streamlines; default is set to "no".
* `overwrite` Set to "yes" to overwrite previous data; default is set to "no".
* `scheduler` A string for the cluster scheduler (e.g. PBS or SLURM) followed by relevant options. For example, for SLURM the string would look like this: `--scheduler='SLURM,jobname=,time=,cpus-per-task=,mem-per-cpu=,partition='`

Note that:

* Waytotal normalization is computed automatically as part of the run, prior to any inter-session or group comparisons, to account for individual differences in geometry and brain size. The command divides the dense connectome by the waytotal value, turning absolute streamline counts into relative proportions of the total streamline count in each session.
* Next, a log transformation is computed on the waytotal-normalized data, which yields stronger connectivity values for long-range projections. The log transformation accounts for the algorithmic distance bias in tract generation (path probabilities drop with distance as uncertainty accumulates). See Donahue et al., The Journal of Neuroscience, June 22, 2016, 36(25):6758–6770. DOI: [https://doi.org/10.1523/JNEUROSCI.0493-16.2016](https://doi.org/10.1523/JNEUROSCI.0493-16.2016).

These outputs will be written to:

* `//hcp//MNINonLinear/Results/Tractography/_waytotnorm.dconn.nii`
* `//hcp//MNINonLinear/Results/Tractography/_waytotnorm_log.dconn.nii`

## dwi_parcellate

`dwi_parcellate` implements parcellation on the dense connectome using a whole-brain parcellation file, such as the [Glasser parcellation](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4990127/ "Article: Glasser et al., 2016") with subcortical labels.

### dwi_parcellate - example

``` bash
qunex dwi_parcellate \
    --sessionsfolder="" \
    --sessions="" \
    --matrixversion="" \
    --waytotal="" \
    --parcellationfile="" \
    --outname="" \
    --overwrite="no" \
    --scheduler=""
```

The breakdown of the parameters is as follows:

* `sessionsfolder` Path to sessions folder within the study folder.
* `sessions` List of sessions to run command on.
* `matrixversion` Matrix solution version to use as input; "1" or "3".
* `waytotal` Version of dense connectome to use as input; "none": without waytotal normalization, "standard": standard waytotal normalized, or "log": log-transformed waytotal normalized.
* `parcellationfile` Specify the absolute path of the file you want to use for parcellation.
* `outname` Specify the suffix output name of the pconn file.
* `overwrite` Set to "yes" to overwrite previous data; default is set to "no".
* `scheduler` Only use the scheduler if you have a large number of sessions; otherwise this command can be run locally.
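As an illustration, parcellating the log-transformed Matrix 1 connectome with a Glasser-style parcellation could look like the call below; the parcellation file path, output suffix, and session names are hypothetical and should be replaced with your own.

``` bash
# Illustrative call: parcellate the log-transformed Matrix 1 dense connectome.
# The parcellation file path and output name are placeholders.
qunex dwi_parcellate \
    --sessionsfolder=/data/test_study/sessions \
    --sessions="s001,s002" \
    --matrixversion="1" \
    --waytotal="log" \
    --parcellationfile=/data/test_study/parcellations/glasser_with_subcortex.dlabel.nii \
    --outname="Glasser" \
    --overwrite="no"
```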
## dwi_seed_tractography_dense

`dwi_seed_tractography_dense` reduces the dense connectome using a given 'seed' structure (e.g. thalamus). The files produced will contain streamline information for only those connections originating from the specified anatomical seed.

### dwi_seed_tractography_dense - example

``` bash
qunex dwi_seed_tractography_dense \
    --sessionsfolder="" \
    --sessions="" \
    --matrixversion="" \
    --waytotal="" \
    --seedfile="" \
    --outname="" \
    --overwrite="no" \
    --scheduler=""
```

The breakdown of the parameters is as follows:

* `sessionsfolder` Path to sessions folder within the study folder.
* `sessions` List of sessions to run command on.
* `matrixversion` Matrix solution version to use as input; "1" or "3".
* `waytotal` Version of dense connectome to use as input; "none": without waytotal normalization, "standard": standard waytotal normalized, or "log": log-transformed waytotal normalized.
* `seedfile` Specify the absolute path of the seed file you want to use as a seed for dconn reduction.
* `outname` Specify the suffix output name of the dscalar file.
* `overwrite` Set to "yes" to overwrite previous data; default is set to "no".
* `scheduler` Only use the scheduler if you have a large number of sessions; otherwise this command can be run locally.

## dwi_xtract

`dwi_xtract` executes FSL's XTRACT (cross-species tractography) command. It can be used to automatically extract a set of carefully dissected tracts in humans and macaques. It can also be used to define custom tractography protocols, in which case the user only needs to define a set of masks in standard space (e.g. MNI152).

The breakdown of command-specific parameters is as follows:

``` sh
--species              Species: human or macaque. [human]
--nogpu                Do not use the GPU version; this flag is not set by default.
--xtract_list          Comma separated list of tract names. []
--xtract_structures    Path to structures file (format: per line OR format: [samples=1], 1 means 1000, '#' to skip lines). []
--xtract_protocols     Protocols folder (all masks in same standard space) [$FSLDIR/data/xtract_data/].
--xtract_stdwarp       Standard2diff and Diff2standard transforms. Default for humans is set to the one used by the session [acpc_dc2standard.nii.gz and standard2acpc_dc.nii.gz]; for macaques, warp fields from the F99 registration command (dwi_f99) are used by default.
--xtract_resolution    Output resolution in mm. Default is the same as in the protocols folder unless --xtract_native is used.
--xtract_ptx_options   Pass extra probtrackx2 options as a text file to override defaults (e.g. --steplength=0.2). [] for humans, [qunex/templates/NHP/ptx_config] for macaques.
--xtract_native        Run tractography in native (diffusion) space. This flag is not set by default.
--xtract_ref           Reference image (" ") for running tractography in reference space, Diff2Reference and Reference2Diff transforms. []
```

### dwi_xtract - examples

``` bash
# xtract on macaques
qunex dwi_xtract \
    --sessionsfolder=/data/macaque_study/sessions \
    --sessions="jane,hilary" \
    --species="macaque" \
    --overwrite=yes \
    --bash="module load CUDA/9.1.85" \
    --scheduler="SLURM,time=12:00:00,cpus-per-task=1,mem-per-cpu=16000,gpus=2,jobname=qx_xtract"

# xtract on humans
qunex dwi_xtract \
    --sessionsfolder=/data/test_study/sessions \
    --sessions="/data/test_study/processing/batch.txt" \
    --overwrite=yes \
    --bash="module load CUDA/9.1.85" \
    --scheduler="SLURM,time=12:00:00,cpus-per-task=1,mem-per-cpu=16000,gpus=1,jobname=qx_xtract"
```
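If only a subset of tracts is needed, the `--xtract_list` parameter accepts a comma-separated list of tract names. A sketch under the human defaults is shown below; the session name and tract names are illustrative, and the available tract names should be checked against the XTRACT protocols folder (`$FSLDIR/data/xtract_data`).

``` bash
# Illustrative call: extract only the left and right arcuate fasciculus
# (tract names are assumptions; consult the XTRACT protocols folder for the full list).
qunex dwi_xtract \
    --sessionsfolder=/data/test_study/sessions \
    --sessions="s001" \
    --xtract_list="af_l,af_r" \
    --overwrite=yes \
    --scheduler="SLURM,time=12:00:00,cpus-per-task=1,mem-per-cpu=16000,gpus=1,jobname=qx_xtract"
```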
## dwi_f99

`dwi_f99` executes FSL's F99 script for registering your own diffusion or structural data to the F99 atlas. This atlas is used when processing macaque data. This command does not have any command-specific parameters.

### dwi_f99 - example

``` bash
qunex dwi_f99 \
    --sessionsfolder=/gpfs/project/fas/n3/Studies/MBLab/HCPDev/jd_tests/macaque_study/sessions \
    --sessions="jane,hilary"
```
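Because the macaque `dwi_xtract` run uses the warp fields produced by `dwi_f99` by default, a typical macaque tractography workflow runs the F99 registration first and XTRACT afterwards. A minimal sketch with illustrative paths and session names:

``` bash
# 1. Register the macaque data to the F99 atlas (illustrative paths and sessions).
qunex dwi_f99 \
    --sessionsfolder=/data/macaque_study/sessions \
    --sessions="jane,hilary"

# 2. Run XTRACT, which picks up the F99 warp fields by default for macaques.
qunex dwi_xtract \
    --sessionsfolder=/data/macaque_study/sessions \
    --sessions="jane,hilary" \
    --species="macaque" \
    --overwrite=yes \
    --scheduler="SLURM,time=12:00:00,cpus-per-task=1,mem-per-cpu=16000,gpus=1,jobname=qx_xtract"
```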