HTCondor DAGMan application for the construction of a spatio-temporal brain atlas from cross-sectional brain MR images using deformable registration between all pairs of images. The subject-to-atlas deformations can be iteratively refined. Alternatively, the initial pairwise registrations can be skipped, and only an initial global average image computed.

A more recent iterative atlas construction is implemented by the MIRTK `construct-atlas` command.

This workflow has been tested with MIRTK master branch revision c18e1ac, but the current HEAD commit should work as well.

## Initial Setup

Clone this repository into a workflow subdirectory next to the directories containing the individual MR images and corresponding segmentation label maps. For example, run the following commands to set up a directory for the construction of a new brain atlas:

```bash
mkdir BrainAtlas && cd BrainAtlas
ln -s <path_to_images> images
ln -s <path_to_labels> labels
git clone git@github.com:MIRTK/BAtCh.git workflow
cd workflow
```

**Update:** The locations of the input files can be specified in the configuration file; there is no need to copy or link them.

## Configuration Files

The atlas construction workflow is configured mainly by the following files:

- `etc/config/default.sh`: A shell script containing global variables used by `bin/gen-workflow`.
- `etc/config/custom.sh`: An optional shell script to override the global default variables.
- `etc/config/ages.csv`: A comma- or space-separated file with image IDs and corresponding ages.
- `etc/config/subjects.lst`: An optional subject list containing only the IDs of those images from which the spatio-temporal atlas should be created.
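
For illustration, a minimal `ages.csv` and `subjects.lst` might look as follows; the IDs and ages are hypothetical, but the format follows the description above (one image ID and its age per line, and one ID per line in the subject list):

```
$ cat etc/config/ages.csv
subject001,38.5
subject002,41.2
subject003,36.9

$ cat etc/config/subjects.lst
subject001
subject003
```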

Additionally, some of the workflows require a reference image for the initial global normalization of the images; this reference also defines the coordinate system of the generated atlas images. A neonatal reference brain image downloaded from brain-development.org can be found in the `etc/reference` directory.

See the comments in `etc/config/default.sh` for the parameters and file paths that can be set.
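
As a sketch, a `custom.sh` override might look like the snippet below. The variable names `topdir`, `imgdir`, and `lbldir` are hypothetical placeholders used for illustration only; the authoritative names are documented in the comments of `etc/config/default.sh` (`sigma` is used by the kernel generation described in the next section):

```bash
# etc/config/custom.sh -- hypothetical overrides; see etc/config/default.sh
# for the actual variable names and their documentation.
topdir="$HOME/BrainAtlas"    # top-level working directory (assumed name)
imgdir="$topdir/images"      # location of the MR images (assumed name)
lbldir="$topdir/labels"      # location of the label maps (assumed name)
sigma=1                      # standard deviation of the temporal kernels in weeks
```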

## Temporal Regression Kernels

The atlas construction workflow produces a spatial anatomical atlas and tissue/structure probability maps for each time point for which a temporal kernel is found in the kernel directory specified in the configuration file.

The kernels used for the neonatal atlas are based on a Gaussian function whose mean corresponds to the desired atlas time point (gestational age, GA) and whose standard deviation is constant (default: 1 week). A variable kernel width is possible by generating kernels with varying standard deviations for different atlas time points.

An input "kernel" is simply a comma- or tab-separated file, e.g., named `t$age.tsv`, whose first column contains the IDs of the images from which the atlas at the respective time point is created and whose second column contains their respective kernel weights. The provided `bin/gen-kernels` script can be used to create such files using a Gaussian kernel function. Note, however, that the kernels can be generated with any tool, including MATLAB.
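
As a sketch of the kind of computation `bin/gen-kernels` performs, the following snippet derives Gaussian weights from `ages.csv` for a single time point; the truncation threshold of 0.001 and the output location are assumptions for illustration, not the actual implementation:

```bash
# Compute Gaussian kernel weights w = exp(-(age - t)^2 / (2 sigma^2)) for the
# atlas time point t = 40 weeks GA with sigma = 1 week, writing t40.tsv with
# one "ID<TAB>weight" line per image whose weight exceeds a small threshold.
t=40; sigma=1
awk -F'[ ,]' -v t="$t" -v s="$sigma" '
  { w = exp(-(($2 - t)^2) / (2 * s * s)) }
  w > 0.001 { printf "%s\t%.6f\n", $1, w }
' etc/config/ages.csv > "t$t.tsv"
```

Continuing the hypothetical `ages.csv` from above, the resulting `t40.tsv` would contain:

```
subject001	0.324652
subject002	0.486752
subject003	0.008189
```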

For example, the kernels for the dHCP atlas built from 275 images, covering the age range 36 to 44 weeks GA with a temporal resolution of 1 week and a constant kernel width, can be generated by setting sigma=1 (the default set in `etc/config/default.sh`) in the respective configuration file and then running:

```bash
bin/gen-kernels -c etc/config/dhcp-v2.4/dHCP275/constant-sigma.sh -range 36 44
```
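
With a 1-week resolution over this range, one weight file per atlas time point should appear in the configured kernel directory; its path depends on the configuration and is shown here as `etc/kernels` for illustration only:

```
$ ls etc/kernels
t36.tsv  t37.tsv  t38.tsv  t39.tsv  t40.tsv  t41.tsv  t42.tsv  t43.tsv  t44.tsv
```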

## Generate Workflow DAG

Given the `ages.csv` (and `subjects.lst`) as well as the temporal regression kernels generated in the previous step, execute the `bin/gen-workflow` script to generate the HTCondor and DAGMan files, which specify the separate jobs to be executed by HTCondor and describe the directed acyclic graph (DAG) of the workflow (i.e., the job dependencies). The setup script also copies the MIRTK commands it uses into the configured `bindir`, to ensure these are not modified while the workflow is executing. The generated DAG files, parameter files, and job descriptions can be found in the configured `dagdir`.

```bash
bin/gen-workflow -v
```
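
The exact layout of `dagdir` depends on the configuration; a sketch of what to expect, with hypothetical file extensions for the job descriptions and parameter files, is:

```
$dagdir/
├── main.dag   # top-level DAG submitted for execution
├── *.dag      # DAG files describing the workflow stages and job dependencies
├── *.sub      # HTCondor job description files (extension assumed)
└── *.par      # parameter files for the invoked MIRTK commands (extension assumed)
```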

## Workflow Execution

The workflow can be executed by submitting the generated `main.dag` to HTCondor using `condor_submit_dag`, for example:
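
```bash
condor_submit_dag "$dagdir/main.dag"
```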

The long-running DAGMan job needs a valid authentication method to submit new jobs and monitor running jobs. The current Imperial College London Department of Computing (DoC) HTCondor installation uses Kerberos v5 authentication, so the user running the DAGMan job must periodically renew their Kerberos ticket-granting ticket (TGT). This can be done by executing the `bin/run-dagman` script instead of calling `condor_submit_dag` directly:

```bash
bin/run-dagman "$dagdir/main.dag"
```

This script replaces the condor_dagman executable usually submitted to HTCondor with a Bash script named `lib/tools/run-dagman`, which runs condor_dagman as a background job and periodically reinitializes the Kerberos ticket cache using kinit. To do so without the user's account password, it requires a user-generated kerb5.keytab file.
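
A minimal sketch for creating such a keytab with MIT Kerberos ktutil is shown below; the keytab location, principal, and encryption type are placeholders, and your site may mandate a different procedure:

```
$ ktutil
ktutil:  addent -password -p <username>@<REALM> -k 1 -e aes256-cts
Password for <username>@<REALM>:
ktutil:  wkt ~/kerb5.keytab
ktutil:  quit
```

The resulting keytab can be tested with `kinit -k -t ~/kerb5.keytab <username>@<REALM>`.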

Alternatively, a cron job independent of this atlas creation workflow can be set up to periodically obtain a new Kerberos ticket. Instructions are available to BioMedIA members here.
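
As a sketch, such a cron job could renew the TGT from the keytab every few hours; the schedule, keytab path, and principal below are placeholders:

```
# crontab -e: renew the Kerberos TGT every 6 hours using the keytab
0 */6 * * * /usr/bin/kinit -k -t $HOME/kerb5.keytab <username>@<REALM>
```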

A Python script for either serial execution or the submission of batch jobs to SLURM instead of HTCondor is included as well. To run the atlas construction on a SLURM cluster using the partition named long, run the following command instead:

```bash
bin/run-workflow "$dagdir/main.dag" --queue long
```

Serial execution on the local machine is not recommended unless the atlas is constructed from only a few images.

## References

- A. Schuh, M. Murgasova, A. Makropoulos, C. Ledig, S.J. Counsell, J.V. Hajnal, P. Aljabar, D. Rueckert, "Construction of a 4D Brain Atlas and Growth Model using Diffeomorphic Registration", MICCAI STIA Workshop, LNCS Volume 8682, pp. 27-37 (2014)