QuNex-XNAT introduction and container setup#

The integration with the XNAT environment enables launching the initial data organization and processing of a given MRI dataset at the single-session (or subject) level, which is accomplished via the HCP (NOTE: Human Connectome Project https://www.humanconnectome.org/software/hcp-mr-pipelines) pipelines.

Docker and AWS#

Docker containers are integrated into the Amazon cloud through Docker Swarm and can be launched on demand.

Docker & QuNex-XNAT integration#

The QuNex Suite, along with all its dependencies, is available as a Docker container image. Instances of this image can be invoked and deployed on a site hosted by the XNAT server. The figure below provides a schematic layout of the XNAT-QuNex architecture and the logic of the data flow.

QuNex XNAT

XNAT data organization#

Data stored within XNAT is organized into Projects. All data stored in XNAT must be associated with a project. This association is the basis of the XNAT security model. Users are given access to data which belongs to a particular project.

In order to execute QuNex via XNAT, the following steps are to be performed:


  • STEP 1: Uploading New Data for a Site Managed by XNAT.

    Data are uploaded into an XNAT-managed project via the integrated DICOM receiver, the XNAT Upload Assistant, or a QuNex upload script.


  • STEP 2: Adding the QuNex Docker Image to a Site Managed by XNAT.

    An XNAT Site Admin adds the QuNex Docker image to the XNAT site. Each XNAT site thus has a Docker image that can be executed for imaging sessions within each project. The command to be executed by these containers is specified in JSON format, henceforth called the QuNex Command.


  • STEP 3: Enable Docker Container Command for a Specific Project.

    Here a project owner would enable the relevant QuNex Suite Container Command for the project and provide the necessary arguments specific to the project's acquisition protocol.


  • STEP 4: Adding Processing Files as Project Level Resources.

    A project owner adds two relevant project-level files (the batch parameter file and the mapping file) to the project as resources.


  • STEP 5: Launching the QuNex Command for an MRSession or a collection of MRSessions.

    Here a user would launch the QuNex Suite Docker Container for a given imaging session with a particular command specified and submit it for processing.

    In the following sections, the above steps are described in more detail.

PRELIMINARY STEP: How to create a new project#

Note that creation of a new project is reserved for Admin-level staff, who set up a new project for the first time with the relevant user permissions. For details on how to create a new project, please refer to the relevant Management for Advanced Users & Administrators section: Creating a New Project.

STEP 1: Uploading new data for a site managed by XNAT#

Data can be uploaded into a project managed by XNAT using the following options:

  • XNAT includes an integrated DICOM C-STORE Service Class Provider (SCP) receiver. The XNAT DICOM C-STORE SCP can receive data from any DICOM C-STORE SCU, including scanners or DICOM viewers such as OsiriX or DicomBrowser.

  • Using the XNAT Upload Assistant to upload data saved on local hard-disks to the XNAT server.

  • Using a QuNex Suite script on the user's local computer or server where the data are located. The script can be run directly from the terminal on the user's local machine if the QuNex Suite is deployed or the specific QuNex script is available:

    > ~/qunex/bash/qx_utilities/xnat_upload_download.sh
    

    For help on running the script, refer to the relevant help documentation by invoking the following from the terminal:

    > XNATUploadDownload.sh --help
    
  • A generic example of how to run XNATUploadDownload.sh from the terminal on the user's local machine that contains the relevant data:

    > XNATUploadDownload.sh \
    --sessionsfolder='/<absolute_path_to_study_on_local_server>/sessions/' \
    --sessions='<session_id>' \
    --projectid='<project_name>' \
    --hostname='<xnat_site_url>' \
    --runtype='upload'
    
  • A specific example of how to run XNATUploadDownload.sh from the terminal for a sample project:

    > XNATUploadDownload.sh \
    --sessionsfolder='/gpfs/project/fas/n3/Studies/BlackThorn/sessions' \
    --sessions='pb0986' \
    --projectid='btrx_demo_data' \
    --hostname='https://imagingdb.blackthornrx.com' \
    --runtype='upload'
    

    After upload, XNAT reads the DICOM tags as it receives data and maps out the Project, Subject, and Imaging Session Label. Data uploaded into XNAT using any of the methods above is saved either in a temporary holding area called the Prearchive or in the Archive (after the data has been checked and committed). Specifically, when XNAT is able to ascertain the Project and the Project is configured to save data directly into the Archive, all received data is saved in the Archive. If any pipelines added to the Project were set to run automatically when a session is archived (see the following step), they will start running once the data for a new session is successfully uploaded and moved to the Archive.
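To confirm that an uploaded session reached the Archive, the XNAT REST API can be queried to list a project's archived imaging sessions. The sketch below only constructs and prints the query URL; the host and project values are hypothetical placeholders:

```shell
# Hypothetical XNAT site URL and project ID -- substitute your own values.
HOST="https://xnat.example.org"
PROJECT="btrx_demo_data"

# XNAT's REST API lists the archived imaging sessions of a project,
# which lets you verify that an uploaded session was archived.
URL="$HOST/data/projects/$PROJECT/experiments?format=json"
echo "$URL"

# To query a live server (requires valid credentials), run:
#   curl -s -u "<username>" "$URL"
```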

STEP 2: Setting up specific QuNex Suite command for a site managed by XNAT#

In the next section we will focus on the implementation of the QuNex-compatible HCP Pipelines as a use case of the QuNex-XNAT integration.

As mentioned above, a site admin has to add a command which is to be executed by the Docker container. The command is specified in JSON format and among other things, contains information about:

  • The QuNex Suite Docker image to instantiate

  • The path to the executable within the docker image

  • The command-line arguments for the executable

  • How to construct the command-line argument and pass its value

  • Which files are to be mounted so that the executable has all its required inputs

  • Which of the generated files are to be uploaded back into XNAT after the executable completes successfully, and where these files are to be stored

The command JSON is consumed by the XNAT Container Service, which communicates between Docker and XNAT and is responsible for instantiating the container instance and putting the generated data back into the MRSession against which the container was executed. See Appendix I for a QuNex command JSON example.
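As an orienting sketch only (the authoritative example is in Appendix I), an abbreviated Container Service command JSON might look like the following; the image name, script path, and input names here are hypothetical, and `#…#` tokens are replaced by the Container Service with resolved input values:

```json
{
  "name": "qunex-turnkey",
  "description": "Hypothetical, abbreviated sketch of a command JSON",
  "type": "docker",
  "image": "qunex/qunexcontainer:latest",
  "command-line": "run_qunex.sh --sessions=#SESSION_LABEL# --turnkeysteps=#TURNKEY_STEPS#",
  "mounts": [
    { "name": "qunex-output", "writable": true, "path": "/output" }
  ],
  "inputs": [
    { "name": "TURNKEY_STEPS", "type": "string", "required": true }
  ],
  "outputs": []
}
```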

Note: These steps need to be done only once upon initial configuration of the XNAT site and must be done by your site administrator.

In the following steps, we will add a Docker container to the XNAT site and add a command JSON for the Docker image.

STEP 2A: Log in to the XNAT instance as admin. Navigate to Administer → Plugin Settings.

XNAT login

STEP 2B: Click on → Images and Commands.

Images and Commands

To add a docker image to the XNAT site, use Add Image. Once the image is added, a command JSON needs to be associated with the image. This is done in the next step.

Add image

STEP 2C: Click on → Add New Command against the Docker image.

New command 1

New command 2

Type in the command JSON (see Appendix I for a QuNex-ready example) and click Save Command.

STEP 3: Setting up QuNex command for a specific project#

STEP 3A: Enable QuNex command for the project

As a project owner, navigate to Project Settings.

Project Settings

STEP 3B: Enable the required command.

Enable command

STEP 3C: Provide the input parameters specific to the project. Click on → Set Defaults.

Set defaults

STEP 3D: → Click Save.

Specific notes on defining QuNex parameter files for the first time#

The parameters required for setting up mapping, batch processing and overwrite options within the XNAT site are:

  • overwriteprojectxnat – Specify <yes> or <no> to delete the entire XNAT project folder prior to a QuNex Suite Docker container run. Default is [no].

Note: Be cautious with this parameter to avoid deleting your entire session. Use only if you truly wish to re-run a clean QuNex Suite container run from the very beginning.

  • overwritestep – Specify <yes> or <no> to delete a prior workflow step. Default is [no]. Use this if you wish to explicitly re-run a given step and overwrite prior data for that particular step (e.g. hcp_pre_freesurfer).

  • turnkeysteps – Specify the turnkey steps you wish to run. To get help on these turnkey steps and how to parameterize them, run qunex <function_name>.

    Supported: create_study map_raw_data import_dicom create_session_info setup_hcp create_batch export_hcp hcp_pre_freesurfer hcp_freesurfer hcp_post_freesurfer run_qc_t1w run_qc_t2w run_qc_myelin hcp_fmri_volume hcp_fmri_surface run_qc_bold hcp_diffusion run_qc_dwi hcp_diffusionLegacy run_qc_dwi_legacy run_qc_dwi_eddy dwi_dtifit run_qc_dwi_dtifit dwi_bedpostx_gpu run_qc_dwi_process run_qc_dwi_bedpostx pretractographyDense dwi_parcellate dwi_seed_tractography_dense run_qc_custom map_hcp_data create_bold_brain_masks compute_bold_stats create_stats_report extract_nuisance_signal preprocess_bold preprocess_conc general_plot_bold_timeseries parcellate_bold compute_bold_fc_seed compute_bold_fc_gbc run_qc_bold_fc

  • batch_parameters_filename – The name of the file which contains the parameters required for HCP and other processing. The default name of this batch parameter file in the XNAT instance is batch_parameters.txt. However, this default can be changed to match the file that is manually generated by the user with proper access permissions and uploaded as a project-level resource, as described in STEP 4 below. The logic here is that pre-processing and analysis commands are typically run on batches of sessions and/or subjects that most often represent all the subjects in a study. Additionally, a number of parameters used in pre-processing and analysis commands are stable and do not change between command invocations. To ease processing of batches of sessions and/or subjects, from small sets to thousands, QuNex utilizes ‘batch’ parameter files. Complete details on the ‘batch’ parameter file specification are described in the batch file format, and the relevant options for HCP processing are described in the QuNex Wiki – Compiling a batch file.

    Note: Example populated ‘batch’ parameter files for either the single-band ‘legacy’ specification or the multi-band HCP-style specification are provided as part of the QuNex Suite.

    Parameter file example for single-band MR data:

    $TOOLS/qunex/python/qx_utilities/templates/batch_singleband_parameters_example.txt
    

    Parameter file example for multi-band MR data:

    $TOOLS/qunex/python/qx_utilities/templates/batch_multiband_parameters_example.txt
    

    All supported batch parameter options can be accessed from the terminal on the user’s local machine if QuNex is deployed or the specific QuNex script is available:

    > qunex -o
    

    All options are also recorded in the following file provided as part of the QuNex Suite:

    $TOOLS/qunex/python/qx_utilities/templates/batch_parameters_example.txt
    

    When setting up the batch parameter file for the first time, the Project Administrator will need to make sure that the following variable points to the correct path specification for the HCP pipelines installation location on the Pipeline AMI (see QuNex AMI Details below):

_hcp_pipeline : <absolute_path_to_hcp_pipelines_folder>

Example based on the QuNex AMI:

_hcp_pipeline : /opt/HCP/HCPpipelines/
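To illustrate the file format, a minimal batch parameter fragment might look like the following; apart from _hcp_pipeline, the specific parameters and values shown are illustrative and should be taken from the template files rather than copied verbatim:

```text
# Illustrative 'batch' parameter fragment -- consult the template files
# listed above for the authoritative set of options and their defaults.
_hcp_pipeline        : /opt/HCP/HCPpipelines/
_hcp_processing_mode : HCPStyleData
_hcp_folderstructure : hcpls
```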

  • scan_mapping_filename – The name of the file which contains the scan mapping. The default value of this file is hcp_mapping.txt. However, this file can be renamed for a specific project to match the file name that is manually generated by the user with proper access permissions and uploaded as a project-level resource, as described in STEP 4 below. Each of the images collected that are to be used and processed via the HCP pipeline needs to be mapped to the appropriate folder structure and named so as to be recognized and processed by the HCP scripts. The purpose of a scan mapping file is to describe how the images collected in your study map to HCP standard names so that they can be correctly mapped and named. Tags to be used for mapping are described in the QuNex Wiki.

An example mapping description is provided in the following file as part of the QuNex Suite:

~/qunex/python/qx_utilities/templates/hcp_mapping_example.txt
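For orientation, mapping files pair a scan description with an HCP-standard name. The scan descriptions below are illustrative of a typical acquisition protocol; consult the example file above and your own protocol for the actual entries:

```text
# Illustrative mapping fragment -- scan descriptions on the left must
# match your acquisition protocol; HCP-standard names go on the right.
T1w 0.8mm N1     => T1w
T2w 0.8mm N1     => T2w
BOLD 3mm 48 2.5s => bold:rest
```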

  • The batch parameter file and the scan mapping file are stored as project-level resources (see the next section on how to add these files as project-level resources). For complete details regarding how to set up the batch parameters file and the scan mapping file, again please refer to the QuNex Wiki.

STEP 4: Adding processing files as project level resources#

The batch parameters file and the mapping file have to be stored as a project-level resource. This resource has to be named QUNEX_PROC.

These files have to be generated manually the first time by a user with appropriate project-level access and/or the Admin for the XNAT site or relevant project. The idea is that they are set up a single time for a project. However, you can upload as many mapping or batch parameter files as you wish, as long as you reference them correctly later when building a job, as described in STEP 5 below.

Comprehensive details on how to generate the batch parameters file and the mapping file are documented in the QuNex Wiki. The rest of the tutorial proceeds under the assumption that these files have been generated and can be uploaded as a project-level resource.
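As an alternative to the Manage Files UI steps that follow, project-level resource files can also be pushed with XNAT's REST API. The sketch below only prints the curl commands (drop the leading `echo` to execute them against a live server); the host and project values are hypothetical:

```shell
# Hypothetical site URL and project ID -- substitute your own values.
HOST="https://xnat.example.org"
PROJECT="btrx_demo_data"
RESOURCE="QUNEX_PROC"   # the resource folder name must be QUNEX_PROC

# Print the curl command that PUTs each processing file into the
# project-level QUNEX_PROC resource; remove 'echo' to execute.
for f in batch_parameters.txt hcp_mapping.txt; do
  echo curl -u '<username>' -X PUT \
    "$HOST/data/projects/$PROJECT/resources/$RESOURCE/files/$f?inbody=true" \
    --data-binary "@$f"
done
```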

STEP 4A: Navigate to the project. Click on Actions → Manage Files.

Manage files

STEP 4B: Create a new resource called QUNEX_PROC. Click on Add Folder.

Add Folder

STEP 4C: Select the level to be resources. Set the Folder name to be QUNEX_PROC. → Click Create.

Create

Note: The folder name HAS to be QUNEX_PROC

STEP 4D: Upload the mapping file and batch parameters file. Click on Upload Files.

Upload Files

STEP 4E: Set Level to resources, Folder to QUNEX_PROC. Click on Choose File.

Choose File

Note: Step 4D and Step 4E have to be performed twice, once for the mapping file and once for the batch parameters file. You can pick whatever file name you prefer for each of the two files. However, the names of the files uploaded at this step must match the QuNex pipeline parameter settings defined for the pipeline (see STEP 3 for a description of the batch_parameters_filename and scan_mapping_filename expectations). This setting can be changed in two places:

  1. By navigating to Project Settings → Configure Commands → Set Defaults and then changing the defaults to match the names you just uploaded;

  2. By changing the names once you build a container launch, described in Step 5.