Bug in GeneralizedDiceLoss #6765
Comments
Thanks for reporting. I think those are class-wise weights, so the shape [C] makes sense. Or do I misunderstand the idea here?
Yes, the shape [C] makes sense. But I think w doesn't actually do anything, because both the intersection and the denominator are multiplied by w. For example:
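To illustrate the point (a minimal sketch with made-up tensor values, not the actual MONAI code): when `batch=True` the per-class weight multiplies both numerator and denominator of the same per-class ratio, so, ignoring the smooth terms, it cancels out.

```python
import torch

# hypothetical per-class statistics with batch=True, C = 3 classes
intersection = torch.tensor([10.0, 20.0, 30.0])
denominator = torch.tensor([50.0, 80.0, 90.0])
w = torch.tensor([1.0, 5.0, 100.0])  # class-wise weights

# current formulation: w applied to numerator and denominator of each class separately
weighted = 1.0 - (2.0 * intersection * w) / (denominator * w)
unweighted = 1.0 - (2.0 * intersection) / denominator

print(torch.allclose(weighted, unweighted))  # True: w has no effect per class
```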
I see, that's a good point. It looks like an issue introduced when fixing #5466. Please let me know if you are interested in submitting a patch; otherwise I'll have a look soon.
The previous version of the code in #5466 seemed more reasonable.
In my opinion, GDL cannot produce a per-class [C] output, because GDL aggregates all classes into a single metric.
Fixes #6765

### Description

As discussed in #6765, when `batch=True` the loss should still return 1 aggregated value instead of C channels. #5466 is not actually achievable with this formulation.

### Types of changes

- [x] Non-breaking change (fix or new feature that would not break existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing functionality to change).
- [ ] New tests added to cover the changes.
- [ ] Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
- [ ] Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
- [ ] In-line docstrings updated.
- [ ] Documentation updated, tested `make html` command in the `docs/` folder.

---------

Signed-off-by: Wenqi Li <[email protected]>
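For context, a minimal sketch of the sum-then-divide aggregation the PR describes; this is an illustration of the idea under the assumptions above, not the actual patched MONAI code, and the function name and signature are hypothetical.

```python
import torch

def generalized_dice(intersection: torch.Tensor, denominator: torch.Tensor,
                     w: torch.Tensor, smooth_nr: float = 1e-5,
                     smooth_dr: float = 1e-5) -> torch.Tensor:
    # Aggregate over the class dimension before dividing, so the class-wise
    # weights w actually influence the result and the loss is a single
    # aggregated value rather than C channels.
    numer = 2.0 * (intersection * w).sum() + smooth_nr
    denom = (denominator * w).sum() + smooth_dr
    return 1.0 - numer / denom
```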
Describe the bug
Code:
```python
numer = 2.0 * (intersection * w) + self.smooth_nr
denom = (denominator * w) + self.smooth_dr
f: torch.Tensor = 1.0 - (numer / denom)
```
If `self.batch` is True, the shapes of `intersection`, `denominator`, and `w` are all [C]. In this code, both the intersection and the denominator are multiplied by `w`, so if the smooth terms are ignored, `w` cancels out and has no effect.
NiftyNet code:
```python
generalised_dice_numerator = 2 * tf.reduce_sum(tf.multiply(weights, intersect))
generalised_dice_denominator = tf.reduce_sum(tf.multiply(weights, tf.maximum(seg_vol + ref_vol, 1)))
```
In the NiftyNet code, the weighted terms are first summed over classes and only then divided, so the weights actually affect the resulting loss.
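To make the contrast concrete, here is a small illustrative check with made-up numbers (not code from either library): with sum-then-divide aggregation, changing the class weights changes the loss.

```python
import torch

intersection = torch.tensor([10.0, 20.0, 30.0])
denominator = torch.tensor([50.0, 80.0, 90.0])

def gdl(w: torch.Tensor) -> torch.Tensor:
    # sum over classes first, then divide (NiftyNet-style aggregation)
    return 1.0 - 2.0 * (w * intersection).sum() / (w * denominator).sum()

uniform = gdl(torch.tensor([1.0, 1.0, 1.0]))
skewed = gdl(torch.tensor([100.0, 1.0, 1.0]))
print(uniform, skewed)  # different values: the class weights now matter
```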