
Jaccard, Dice and Tversky losses are incompatible with soft labels #8094

Open
zifuwanggg opened this issue Sep 18, 2024 · 2 comments · May be fixed by #8138

@zifuwanggg commented Sep 18, 2024

Describe the bug
The Jaccard, Dice and Tversky losses are incompatible with soft labels [1, 2]. For example, with a ground truth value of 0.5 for a single pixel, the Dice loss is minimized when the predicted value is 1, which is clearly erroneous.
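
To make the single-pixel claim concrete, here is a tiny standalone sketch (plain Python, not MONAI code) of the soft Dice loss 1 - 2pt / (p + t) for one pixel with target t = 0.5: the loss keeps decreasing as the prediction moves past the label towards 1.

# Standalone sketch of the single-pixel case described above (not MONAI code).
# Soft Dice loss with the usual L1 denominator: 1 - 2*p*t / (p + t).
def soft_dice_loss(p: float, t: float) -> float:
    return 1.0 - 2.0 * p * t / (p + t)

t = 0.5
for p in (0.5, 0.75, 1.0):
    print(p, round(soft_dice_loss(p, t), 4))

# 0.5  0.5     <- prediction equals the soft label
# 0.75 0.4
# 1.0  0.3333  <- lowest loss, even though the label is 0.5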

To Reproduce

import torch
from monai.losses.dice import DiceLoss
from monai.losses.tversky import TverskyLoss

torch.manual_seed(0)

B, C, H, W = 7, 5, 3, 2
# Soft labels: per-pixel class probabilities rather than one-hot targets.
input = torch.rand(B, C, H, W).softmax(1)
jaccard_loss = DiceLoss(jaccard=True, reduction='mean')
dice_loss = DiceLoss(reduction='mean')
tversky_loss = TverskyLoss(reduction='mean')

# Use the same tensor as both prediction and target: the losses should be 0.
jaccard_loss_value = jaccard_loss(input, input)
dice_loss_value = dice_loss(input, input)
tversky_loss_value = tversky_loss(input, input)

print(jaccard_loss_value, dice_loss_value, tversky_loss_value)

# tensor(0.8817) tensor(0.7888) tensor(0.7888)

Expected behavior
When the input is equal to the target, the loss should be minimized and equal to 0.

Environment

================================
Printing MONAI config...
================================
MONAI version: 0+unknown
Numpy version: 1.26.4
Pytorch version: 2.2.2
MONAI flags: HAS_EXT = False, USE_COMPILED = False, USE_META_DICT = False
MONAI rev id: 25589c377ac63d6be28ab9bb65dd8cb52f2bebdf
MONAI __file__: /Users/<username>/Desktop/loss/MONAI-dev/monai/__init__.py

Optional dependencies:
Pytorch Ignite version: 0.4.11
ITK version: 5.4.0
Nibabel version: 5.2.1
scikit-image version: 0.24.0
scipy version: 1.13.1
Pillow version: 10.4.0
Tensorboard version: 2.17.1
gdown version: 5.2.0
TorchVision version: 0.17.2
tqdm version: 4.66.5
lmdb version: 1.5.1
psutil version: 6.0.0
pandas version: 2.2.2
einops version: 0.8.0
transformers version: 4.40.2
mlflow version: 2.16.2
pynrrd version: 1.0.0
clearml version: 1.16.5rc0

For details about installing the optional dependencies, please visit:
    https://docs.monai.io/en/latest/installation.html#installing-the-recommended-dependencies


================================
Printing system config...
================================
System: Darwin
Mac version: 10.16
Platform: macOS-10.16-x86_64-i386-64bit
Processor: i386
Machine: x86_64
Python version: 3.9.19
Process name: python3.9
Command: ['python', '-c', 'import monai; monai.config.print_debug_info()']
Open files: []
Num physical CPUs: 8
Num logical CPUs: 8
Num usable CPUs: UNKNOWN for given OS
CPU usage (%): [55.2, 54.9, 31.8, 54.8, 12.8, 13.5, 7.7, 7.0]
CPU freq. (MHz): 2400
Load avg. in last 1, 5, 15 mins (%): [59.1, 62.0, 55.6]
Disk usage (%): 95.6
Avg. sensor temp. (Celsius): UNKNOWN for given OS
Total physical memory (GB): 16.0
Available memory (GB): 1.3
Used memory (GB): 1.9

================================
Printing GPU config...
================================
Num GPUs: 0
Has CUDA: False
cuDNN enabled: False
NVIDIA_TF32_OVERRIDE: None
TORCH_ALLOW_TF32_CUBLAS_OVERRIDE: None

References
[1] Dice Semimetric Losses: Optimizing the Dice Score with Soft Labels. Zifu Wang, Teodora Popordanoska, Jeroen Bertels, Robin Lemmens, Matthew B. Blaschko. MICCAI 2023.

[2] Jaccard Metric Losses: Optimizing the Jaccard Index with Soft Labels. Zifu Wang, Xuefei Ning, Matthew B. Blaschko. NeurIPS 2023.

@coolteemf

Using squared_pred=True fixes this issue, though I don't know why it isn't True by default.
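
For reference, a minimal check of this workaround, reusing the reproduction setup above (squared_pred is an existing DiceLoss argument; the ~0 expectation follows from the L2 form of the denominator rather than from a recorded run):

# Minimal check of the squared_pred=True workaround. With an L2 denominator,
# the loss of a tensor against itself is expected to be (close to) 0.
import torch
from monai.losses.dice import DiceLoss

torch.manual_seed(0)
input = torch.rand(7, 5, 3, 2).softmax(1)

dice_l2 = DiceLoss(squared_pred=True, reduction='mean')
print(dice_l2(input, input))  # expected to be ~0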

@zifuwanggg (Author) commented Oct 10, 2024

The L1 and L2 norms yield different versions of these loss functions. The issue exists for the L1 version but not for the L2 version (which is what squared_pred=True selects). However, the L1 version is more widely used and often leads to better results; [2, 3] provide a comparison of the two. A sketch of both versions, and of a soft-label-compatible variant along the lines of [2], is given after the references below.

References
[2] Jaccard Metric Losses: Optimizing the Jaccard Index with Soft Labels. Zifu Wang, Xuefei Ning, Matthew B. Blaschko. NeurIPS 2023.

[3] Optimization for Medical Image Segmentation: Theory and Practice When Evaluating With Dice Score or Jaccard Index. Eelbode et al. TMI 2020.
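
For concreteness, here is a minimal standalone sketch of the two versions and of a soft-label-compatible numerator along the lines of [2]. This is illustrative code only, not MONAI's implementation; the tensor shape and the eps smoothing term are assumptions.

import torch

def dice_loss_l1(p, t, eps=1e-7):
    # Standard soft Dice, L1 denominator: 1 - 2<p,t> / (|p|_1 + |t|_1).
    # Not minimized at p == t when t contains soft values.
    inter = (p * t).sum()
    return 1 - 2 * inter / (p.sum() + t.sum() + eps)

def dice_loss_l2(p, t, eps=1e-7):
    # L2 denominator (what squared_pred=True selects):
    # 1 - 2<p,t> / (|p|_2^2 + |t|_2^2). Zero when p == t, even for soft labels.
    inter = (p * t).sum()
    return 1 - 2 * inter / ((p ** 2).sum() + (t ** 2).sum() + eps)

def dice_loss_soft_compatible(p, t, eps=1e-7):
    # L1 version with the intersection replaced by (|p|_1 + |t|_1 - |p - t|_1) / 2,
    # in the spirit of [2]. Reduces to the standard intersection when t is binary
    # and is zero when p == t.
    inter = (p.sum() + t.sum() - (p - t).abs().sum()) / 2
    return 1 - 2 * inter / (p.sum() + t.sum() + eps)

torch.manual_seed(0)
soft = torch.rand(5, 3, 2).softmax(0)  # soft labels over 5 classes

print(dice_loss_l1(soft, soft))               # > 0, the reported issue
print(dice_loss_l2(soft, soft))               # ~0
print(dice_loss_soft_compatible(soft, soft))  # ~0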
