
Bug of GeneralizedDiceLoss #6765

Closed
lixinxin9703 opened this issue Jul 25, 2023 · 4 comments · Fixed by #6775
Labels: bug Something isn't working

Comments

@lixinxin9703

Describe the bug
Code:
numer = 2.0 * (intersection * w) + self.smooth_nr
denom = (denominator * w) + self.smooth_dr
f: torch.Tensor = 1.0 - (numer / denom)

If self.batch is True, the shapes of intersection, denominator, and w are all [C]. In the code, both the numerator and the denominator are multiplied element-wise by w, so if the smooth terms are ignored, w cancels out and has no effect.

NiftyNet code:
generalised_dice_numerator = 2 * tf.reduce_sum(tf.multiply(weights, intersect))
generalised_dice_denominator = tf.reduce_sum(tf.multiply(weights, tf.maximum(seg_vol + ref_vol, 1)))

In the NiftyNet code, the weighted terms are summed over classes first and only then divided, so the weights do not cancel.
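For reference, this matches the generalised Dice formulation of Sudre et al. (2017), which keeps the class weights inside the sums (here r is the reference segmentation and p the prediction):

GDL = 1 - 2 * (sum_c w_c * sum_i r_ci * p_ci) / (sum_c w_c * sum_i (r_ci + p_ci))

Because the sum over classes c happens before the division, the weights w_c cannot cancel.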

@wyli
Contributor

wyli commented Jul 25, 2023

Thanks for reporting. I think those are class-wise weightings, so the shape [C] makes sense. Or do I misunderstand the idea here?

@lixinxin9703
Author

lixinxin9703 commented Jul 25, 2023

> Thanks for reporting. I think those are class-wise weightings, so the shape [C] makes sense. Or do I misunderstand the idea here?

Yes, the shape [C] makes sense. But I think w doesn't work, because both the numerator and the denominator are multiplied by w, so it cancels.

For example:
intersection: [1, 2, 3, 4]
denominator: [2, 3, 4, 5]
w: [0.1, 0.2, 0.3, 0.4]
so numer / denom = 2 * ([1, 2, 3, 4] * [0.1, 0.2, 0.3, 0.4]) / ([2, 3, 4, 5] * [0.1, 0.2, 0.3, 0.4])
and the factor [0.1, 0.2, 0.3, 0.4] is eliminated element-wise:
numer / denom = 2 * [1, 2, 3, 4] / [2, 3, 4, 5]
So w doesn't work; I think w is meaningless here.
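A minimal PyTorch check (my own sketch, not MONAI code; the tensor values are the hypothetical ones from the example above) confirms the cancellation once the smooth terms are set to zero:

import torch

# Hypothetical per-class values from the example above, batch=True, shape [C].
intersection = torch.tensor([1.0, 2.0, 3.0, 4.0])
denominator = torch.tensor([2.0, 3.0, 4.0, 5.0])
w = torch.tensor([0.1, 0.2, 0.3, 0.4])

# Element-wise formulation as reported, smooth terms dropped for clarity:
numer = 2.0 * (intersection * w)
denom = denominator * w
print(1.0 - numer / denom)  # tensor([ 0.0000, -0.3333, -0.5000, -0.6000])

# Identical to dropping w entirely, so the weighting has no effect:
print(1.0 - 2.0 * intersection / denominator)  # same values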

@wyli
Contributor

wyli commented Jul 25, 2023

I see, that's a good point. It looks like an issue introduced when fixing bug #5466; please let me know if you are interested in submitting a patch, otherwise I'll have a look soon.

@wyli wyli added the bug Something isn't working label Jul 25, 2023
@lixinxin9703
Author

lixinxin9703 commented Jul 25, 2023

The previous version of the code in #5466 seemed more reasonable:

numer = 2.0 * (intersection * w).sum(final_reduce_dim, keepdim=True) + self.smooth_nr
denom = (denominator * w).sum(final_reduce_dim, keepdim=True) + self.smooth_dr
f: torch.Tensor = 1.0 - (numer / denom)

In my opinion, GDL cannot produce a [C]-shaped output, because GDL aggregates all classes into a single metric.
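For comparison, a quick sketch (same hypothetical values as above, smooth terms again set to zero) of the sum-before-divide formulation shows that w then does change the single aggregated result:

import torch

intersection = torch.tensor([1.0, 2.0, 3.0, 4.0])
denominator = torch.tensor([2.0, 3.0, 4.0, 5.0])

def gdl(w):
    # Sum over the class dimension before dividing; the output is one scalar.
    numer = 2.0 * (intersection * w).sum()
    denom = (denominator * w).sum()
    return 1.0 - numer / denom

print(gdl(torch.tensor([0.1, 0.2, 0.3, 0.4])))  # tensor(-0.5000)
print(gdl(torch.tensor([0.4, 0.3, 0.2, 0.1])))  # tensor(-0.3333), so w matters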

wyli added a commit that referenced this issue Jul 26, 2023
Fixes #6765

### Description
As discussed in #6765, when `batch=True` the loss should still return one aggregated value instead of C channels. #5466 is not actually achievable with this formulation.

### Types of changes
- [x] Non-breaking change (fix or new feature that would not break existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing functionality to change).
- [ ] New tests added to cover the changes.
- [ ] Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
- [ ] Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
- [ ] In-line docstrings updated.
- [ ] Documentation updated, tested `make html` command in the `docs/` folder.

---------

Signed-off-by: Wenqi Li <[email protected]>