
Commit

add unittests
Signed-off-by: KumoLiu <[email protected]>
KumoLiu committed Sep 1, 2023
1 parent cac5e52 commit a7a690f
Showing 2 changed files with 11 additions and 2 deletions.
5 changes: 3 additions & 2 deletions monai/losses/dice.py
@@ -666,7 +666,8 @@ def __init__(
             batch: whether to sum the intersection and union areas over the batch dimension before the dividing.
                 Defaults to False, a Dice loss value is computed independently from each item in the batch
                 before any `reduction`.
-            ce_weight: a rescaling weight given to each class for cross entropy loss.
+            ce_weight: a rescaling weight given to each class for cross entropy loss when using `CrossEntropyLoss`,
+                or a rescaling weight given to the loss of each batch element when using `BCEWithLogitsLoss`.
                 See ``torch.nn.CrossEntropyLoss()`` or ``torch.nn.BCEWithLogitsLoss()`` for more information.
             lambda_dice: the trade-off weight value for dice loss. The value should be no less than 0.0.
                 Defaults to 1.0.
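
The distinction matters in practice: with `softmax=True` the CE term goes through `torch.nn.CrossEntropyLoss` and `ce_weight` is interpreted per class, while with `sigmoid=True` it goes through `torch.nn.BCEWithLogitsLoss`. A minimal sketch of the per-class case (not part of this diff; shapes and weight values are illustrative and it assumes a MONAI build containing this change):

import torch
from monai.losses import DiceCELoss

# Multi-class segmentation: softmax=True routes the CE term through CrossEntropyLoss,
# so ce_weight supplies one rescaling weight per class (2 classes here).
pred = torch.randn(4, 2, 8, 8)                      # (batch, classes, H, W) logits
target = torch.randint(0, 2, (4, 1, 8, 8)).float()  # single-channel class indices
loss_fn = DiceCELoss(softmax=True, to_onehot_y=True, ce_weight=torch.tensor([0.3, 0.7]))
print(loss_fn(pred, target))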
@@ -729,7 +730,7 @@ def bce(self, input: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
         if not torch.is_floating_point(target):
             target = target.to(dtype=input.dtype)

-        return self.binary_cross_entropy(input, target)
+        return self.binary_cross_entropy(input, target)  # type: ignore[no-any-return]

     def forward(self, input: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
         """
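The cast in `bce` above exists because `BCEWithLogitsLoss` expects floating-point targets matching the prediction dtype; integer label tensors raise a dtype error. A standalone illustration of the underlying behaviour (plain PyTorch, not part of this diff):

import torch

bce = torch.nn.BCEWithLogitsLoss()
logits = torch.tensor([[0.8, -0.3]])
labels = torch.tensor([[1, 0]])  # integer targets, as they often arrive from a dataset

# Passing `labels` directly would fail on dtype; cast first, which is the same
# conversion DiceCELoss.bce performs before calling binary_cross_entropy.
loss = bce(logits, labels.to(dtype=logits.dtype))
print(loss)
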
8 changes: 8 additions & 0 deletions tests/test_dice_ce_loss.py
@@ -75,6 +75,14 @@
         },
         0.3133,
     ],
+    [  # shape: (2, 1, 3), (2, 1, 3), bceloss
+        {"ce_weight": torch.tensor([1.0, 1.0, 1.0]), "sigmoid": True},
+        {
+            "input": torch.tensor([[[0.8, 0.6, 0.0]], [[0.0, 0.0, 0.9]]]),
+            "target": torch.tensor([[[0.0, 0.0, 1.0]], [[0.0, 1.0, 0.0]]]),
+        },
+        1.5608,
+    ],
 ]


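For reference, the new case can be reproduced outside the test harness roughly as follows (assuming a MONAI build containing this change; the printed value should be close to the 1.5608 expected above):

import torch
from monai.losses import DiceCELoss

# Mirrors the new parameterised case: single-channel input with sigmoid=True,
# so the CE term uses BCEWithLogitsLoss and ce_weight rescales element-wise losses.
loss_fn = DiceCELoss(ce_weight=torch.tensor([1.0, 1.0, 1.0]), sigmoid=True)
result = loss_fn(
    torch.tensor([[[0.8, 0.6, 0.0]], [[0.0, 0.0, 0.9]]]),
    torch.tensor([[[0.0, 0.0, 1.0]], [[0.0, 1.0, 0.0]]]),
)
print(round(result.item(), 4))  # expected to be close to 1.5608 per the test case above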
