
📚 Example Usage and Tutorials


Introduction

The deepImageJ case studies repository contains various examples demonstrating how to use deepImageJ for different bioimage analysis tasks. These case studies illustrate the application of deep learning models for image-to-image translation, nuclei segmentation, and integration with the BioImage Model Zoo. Each example includes scripts, macros, and detailed instructions to replicate the workflows. More details can be found in this article.

Case Study 1: Pipeline for Integrated Image-to-Image Translation and Nuclei Segmentation


Step by Step Guide

Fine-Tuning Pix2Pix and StarDist

To fine-tune the models for your specific data, follow these steps:

  1. Download Datasets:

    • Obtain the two datasets required for this case study. If using Google Colab, upload these datasets to Google Drive. If fine-tuning locally, ensure the datasets are accessible on your local drive.
  2. Fine-Tune Pix2Pix:

    • Use the ZeroCostDL4Mic notebook for Pix2Pix to fine-tune the Pix2Pix model for 200 epochs with default parameters.
    • Training parameters:
      • Batch size: 1
      • Loss function: Vanilla Generative Adversarial Network (GAN)
      • Patch size: 512 × 512
      • Initial learning rate: 2e-4
      • Data augmentation: None
    • The Pix2Pix model is exported using PyTorch 2.0.1.
  3. Fine-Tune StarDist:

    • Use the ZeroCostDL4Mic notebook for StarDist to fine-tune the StarDist model for 100 epochs.
    • Training parameters:
      • Dataset: 45 paired image patches (1024 × 1024)
      • Patch size: 1024 × 1024
      • Batch size: 2
      • Initial learning rate: 3e-4
      • Data augmentation: None
    • The StarDist model is exported using TensorFlow 2.14.
  4. Export Models:

    • Ensure both fine-tuned models are exported in the BioImage Model Zoo format. Pay careful attention to the exporting and packaging requirements, including metadata.

Pix2Pix and StarDist with deepImageJ

Once the models are exported, follow these steps to install and use them in deepImageJ:

  1. Install Models in deepImageJ:

    • Open Fiji and navigate to Plugins > deepImageJ > deepImageJ Install Model.
    • Move to the Private Model tab, select From ZIP file, and add the path to each model's ZIP file, one at a time.
  2. Run the Macro:

    • Use the macro provided here.
    • Modify the paths for the input and output folders in the macro to match your directories.
    • The macro workflow includes (a schematic macro sketch follows this list):
      1. Running the fine-tuned Pix2Pix model through deepImageJ to perform image translation from actin to DAPI images.
      2. Running the fine-tuned StarDist model on the synthetic DAPI images.
      3. Performing StarDist post-processing steps.
  3. Output:

    • You will obtain a folder with the masks of your input images. If using the same data, expect five masks corresponding to the five time points.
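As an orientation, the sketch below shows the general shape of such a batch macro in the ImageJ macro language. The folder paths and the two DeepImageJ Run parameter strings are placeholders, not the values used in the linked macro: run each installed model once from the menu with Plugins > Macros > Record open to obtain the exact strings for your deepImageJ version.

```
// Sketch of the Case Study 1 batch loop: actin image -> Pix2Pix translation ->
// StarDist segmentation -> mask. Paths and the "DeepImageJ Run" option strings
// are placeholders; record the real option strings with the Macro Recorder.
inputDir  = "/path/to/actin_images/";
outputDir = "/path/to/masks/";
files = getFileList(inputDir);
for (i = 0; i < files.length; i++) {
    open(inputDir + files[i]);
    // 1) Image-to-image translation with the fine-tuned Pix2Pix model
    run("DeepImageJ Run", "<options recorded for the Pix2Pix model>");
    // 2) Nuclei segmentation of the synthetic DAPI image with the fine-tuned StarDist model
    run("DeepImageJ Run", "<options recorded for the StarDist model>");
    // 3) StarDist post-processing as implemented in the linked macro
    saveAs("Tiff", outputDir + "mask_" + files[i]);   // assumes the final mask is frontmost
    close("*");
}
```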

TrackMate

To conclude this use case, you can use TrackMate for tracking and data visualization:

  1. Load Time Points:

    • Load the final five masks into TrackMate.
  2. Perform Tracking:

    • Follow the default options and documentation in TrackMate to perform the tracking analysis.
    • Refer to the TrackMate documentation for detailed instructions.

By following these steps, you will successfully fine-tune and apply Pix2Pix and StarDist models using deepImageJ, and perform tracking analysis with TrackMate.

Case Study 2: Comprehensive 3D Nuclei Segmentation with deepImageJ


Step by Step Guide

Pre-Processing Dataset

The dataset for this case study is downloaded from the Cell Tracking Challenge. It includes two different embryos. Embryo 01 is used for fine-tuning StarDist, and embryo 02 is reserved for the complete pipeline in Fiji.

  1. Generate Ground Truth for Embryo 01:

    • Annotations for embryo 01 are sparse. To create a training dataset of partially annotated slices, run the Generate_GT.py script, which extracts the annotated 2D slices into a new directory.
  2. Prepare Training and Testing Sets:

    • Use the mount_stardist_dataset.py script to separate the extracted slices of embryo 01 into train/ and test/ folders for fine-tuning in the notebook.
  3. Pre-process Datasets:

    • Noise reduction:
      • Apply a median filter with a radius of 7.0 pixels to all images from embryos 01 and 02 (not masks) in Fiji through Process > Filters > Median.
    • Reduce computational costs:
      • Downsample the images of both embryos to half their size in x and y. For embryo 02, also reduce the number of slices along the z-axis by half (see the macro sketch after this list).
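Both pre-processing operations can also be scripted instead of applied through the menus. Below is a minimal macro sketch for a single time point; the paths are placeholders, and the exact Scale... parameter string is easiest to confirm with Plugins > Macros > Record on your own images.

```
// Pre-processing sketch for one embryo 02 time point (apply to the images, not the masks):
// median filter of radius 7 on every slice, then downsampling to half size.
// For embryo 01, use z=1.0 and drop the depth term (only x and y are halved).
open("/path/to/embryo02/t000.tif");          // placeholder path
run("Median...", "radius=7 stack");          // noise reduction on all slices
w = getWidth(); h = getHeight(); d = nSlices;
run("Scale...", "x=0.5 y=0.5 z=0.5 width=" + floor(w/2) + " height=" + floor(h/2) + " depth=" + floor(d/2) + " interpolation=Bilinear average process create");
saveAs("Tiff", "/path/to/preprocessed/t000.tif");   // placeholder path
close("*");
```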

After these steps, you will have two folders: one with embryo 01 ready for fine-tuning StarDist and another with embryo 02 ready for use and testing in Fiji.

Fine-Tuning StarDist

  1. Prepare the Dataset:

    • Ensure you have a folder containing train/ and test/ subfolders for embryo 01.
  2. Fine-Tune StarDist:

    • Use the same ZeroCostDL4Mic notebook as in Case Study 1 for StarDist to fine-tune the model with the new data.
    • Training parameters:
      • Epochs: 50
      • Image Patches: 40 paired patches of size 512 × 512 cropped from the original images (1871 × 965 pixels)
      • Batch Size: 15
      • Loss Function: MAE
      • Learning Rate: 5e-5
      • Validation Data: 10%
      • Number of Rays: 32
      • Grid Parameter: 2
  3. Export the Model:

    • After successful training, export the model in the BioImage Model Zoo format.

StarDist in deepImageJ

  1. Install the Model:

    • Open Fiji and navigate to Plugins > deepImageJ > deepImageJ Install Model.
    • Select From ZIP file in the Private Model tab and add the path to your model zip file.
  2. Run the Macro:

    • Use the provided macro here to apply StarDist and post-processing steps.
    • Modify the paths for input and output folders as necessary.
    • The macro runs StarDist on each slice of the 3D volume of one time point of embryo 02 and outputs a folder containing one mask per slice (a schematic slice loop is sketched below).
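Use the linked macro as-is; purely for orientation, the sketch below shows one way such a slice loop can look in the ImageJ macro language. The paths and the DeepImageJ Run option string are placeholders (record the exact string with Plugins > Macros > Record), and the sketch assumes the model output is the frontmost image when it is saved.

```
// Sketch: apply the fine-tuned StarDist model slice by slice to one pre-processed
// time point of embryo 02 and save one mask per slice. Placeholder paths/options.
open("/path/to/preprocessed/embryo02_t000.tif");
outputDir = "/path/to/masks_t000/";
stackID = getImageID();
n = nSlices;
for (s = 1; s <= n; s++) {
    selectImage(stackID);
    setSlice(s);
    run("Duplicate...", "title=slice");      // copy only the current slice
    run("DeepImageJ Run", "<options recorded for the StarDist model>");
    // ... StarDist post-processing as in the linked macro ...
    saveAs("Tiff", outputDir + "mask_" + IJ.pad(s, 3) + ".tif");   // assumes the mask is frontmost
    selectImage(stackID);
    close("\\Others");                       // keep only the original stack open
}
```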

Connected Components

  1. Analyze the 3D Volume:

    • Use the Connected Components plugin from MorphoLibJ to analyze the 3D volume for each time point.
    • Apply the connected components analysis over the entire stack to link the per-slice masks into labeled 3D objects covering the whole embryo (a macro sketch follows this list).
  2. Next Steps:

    • Repeat the analysis for each time point of embryo 02 to create a 4D video, allowing you to analyze the movement and behavior of the embryo over time.
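If you prefer to script this step as well, MorphoLibJ's labeling is macro-recordable. A minimal sketch, assuming the per-slice masks of one time point have already been combined into a single stack, with 26-connectivity and a 16-bit label image as example settings (confirm the exact string with Plugins > Macros > Record):

```
// Sketch: 3D connected components labeling (MorphoLibJ, IJPB-plugins update site)
// on the stack of StarDist masks of one time point. Paths are placeholders.
open("/path/to/masks_t000_stack.tif");       // e.g. the per-slice masks imported as a stack
run("Connected Components Labeling", "connectivity=26 type=[16 bits]");
saveAs("Tiff", "/path/to/labels_t000.tif");
close("*");
```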

By following these steps, you will fine-tune and apply the StarDist model using deepImageJ and analyze the results using Connected Components in MorphoLibJ.

Case Study 3: Segmentation of Arabidopsis Apical Stem Cells and Integration with the BioImage Model Zoo in deepImageJ


Step by Step Guide

Downloading the Dataset and Model

This case study utilizes a model already available on the BioImage Model Zoo. Follow these steps:

  1. Locate the Model:

    • Visit the BioImage Model Zoo and search for the model under the emotional-cricket nickname.
  2. Download the Dataset:

    • The dataset can be downloaded from the repository indicated above. Focus on the specific 3D volume needed for this case study.
    • Navigate to PNAS > PNAS > plant 13 > processed_tiffs within the downloaded data and select the file named 84hrs_plant13_trim-acylYFP_improved.tif.

Inference in deepImageJ

  1. Install the Model:

    • Open Fiji and navigate to Plugins > deepImageJ > deepImageJ Install Model.
    • Install the model by selecting the appropriate zip file from the BioImage Model Zoo.
  2. Run the Model:

    • Open the 3D volume (84hrs_plant13_trim-acylYFP_improved.tif) in Fiji.
    • Navigate to Plugins > deepImageJ > deepImageJ Run.
    • Choose the installed model and run the inference to obtain a mask for the segmented root of the plant.

Post-Processing in Fiji

The post-processing pipeline includes two steps:

  1. Gamma Correction:

    • Apply a gamma correction with a value of 0.80 to enhance membrane visibility and reduce blurriness.
  2. Morphological Segmentation:

    • Use the Morphological Segmentation tool from MorphoLibJ for segmentation and visualization.
    • Set the tolerance to 10 to clearly depict the catchment basins and overlay them on the segmented image (see the sketch after this list).
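The gamma step records directly as a macro call, and the interactive Morphological Segmentation tool can at least be opened from a macro; the tolerance of 10 and the overlay display are then set in its panel. A minimal sketch, assuming the image from the previous step is the active window:

```
// Post-processing sketch for Case Study 3.
// 1) Gamma correction (0.80) on the whole stack to enhance membrane visibility.
run("Gamma...", "value=0.80 stack");
// 2) Open MorphoLibJ's interactive Morphological Segmentation tool;
//    set the tolerance to 10 and choose the overlaid-basins view in its panel.
run("Morphological Segmentation");
```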

This precise application of Morphological Segmentation ensures clear and distinct visualization of each cell.

By following these steps, you can download the dataset and model from the BioImage Model Zoo, perform inference with deepImageJ, and apply the post-processing steps in Fiji.