Commit 063df92
Fix batch size broadcasting bug in GeneralizedWassersteinDiceLoss (#4650)
After `torch.gather`, `alpha_extended` retains shape (B, 1, S) while
`wasserstein_distance_map` has shape (B, S). When batch size > 1 the
element-wise multiply broadcasts to (B, B, S), mixing values across
samples. Fixed by squeezing dim=1 after gather in both
`_compute_generalized_true_positive` and `_compute_denominator`, and
reducing with `dim=1` instead of `dim=[1, 2]`.
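The broadcasting failure described above can be reproduced shape-for-shape with a toy sketch. The tensor names (`alpha`, `flat_target`) and the exact gather call are assumptions standing in for the real MONAI internals; only the shapes match the commit message:

```python
import torch

B, C, S = 2, 3, 4  # toy batch size, class count, flattened spatial size

flat_target = torch.randint(0, C, (B, S))    # ground-truth labels, shape (B, S)
alpha = torch.rand(B, C)                     # per-class weights, shape (B, C) (assumed stand-in)
wasserstein_distance_map = torch.rand(B, S)  # shape (B, S)

# gather per-voxel alpha values; the result keeps a singleton class dim: (B, 1, S)
alpha_extended = torch.gather(
    alpha.unsqueeze(2).expand(B, C, S), dim=1, index=flat_target.unsqueeze(1)
)
assert alpha_extended.shape == (B, 1, S)

# buggy: (B, 1, S) * (B, S) broadcasts to (B, B, S), mixing values across samples
buggy = alpha_extended * wasserstein_distance_map
assert buggy.shape == (B, B, S)

# fixed: squeeze dim=1 first, then reduce over the spatial dim only
fixed = (alpha_extended.squeeze(1) * wasserstein_distance_map).sum(dim=1)
assert fixed.shape == (B,)  # one value per sample, as intended
```

With batch size 1 the broadcast happens to produce the right numbers, which is why the bug only surfaces when `B > 1`.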
Also fixed the `reduction="none"` code path which incorrectly tried to
reshape the per-sample loss tensor (B,) to (B, C, 1, ...) — GWDL
aggregates over classes internally so the class dimension doesn't apply.
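A minimal illustration of why that reshape cannot work (the shapes are toy values, not taken from the real code): a per-sample GWDL tensor has `B` elements, so viewing it as `(B, C, 1)` requires `B * C` elements and fails.

```python
import torch

B, C = 4, 3
wass_dice_loss = torch.rand(B)  # GWDL yields one scalar per sample: shape (B,)

# buggy path: reshaping (B,) to (B, C, 1) is impossible (4 elements vs 12 needed)
try:
    wass_dice_loss.view(B, C, 1)
    reshaped_ok = True
except RuntimeError:
    reshaped_ok = False
assert not reshaped_ok

# fixed path: reduction="none" simply returns the per-sample tensor unchanged
assert wass_dice_loss.shape == (B,)
```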
Added regression tests that verify batch consistency:
- identical samples in a batch produce the same loss as a single sample
- batched per-sample losses match individually computed losses
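The two batch-consistency properties above can be sketched as a generic test pattern. The loss here is a hypothetical per-sample stand-in, not GWDL itself; the assertions mirror the regression tests' intent:

```python
import torch

def per_sample_loss(pred, target):
    # hypothetical stand-in for a loss that returns one value per sample
    return ((pred - target) ** 2).flatten(1).mean(dim=1)

pred = torch.rand(3, 2, 4, 4)
target = torch.rand(3, 2, 4, 4)

# batched per-sample losses match individually computed losses
batched = per_sample_loss(pred, target)
individual = torch.stack(
    [per_sample_loss(pred[i : i + 1], target[i : i + 1])[0] for i in range(3)]
)
assert torch.allclose(batched, individual)

# identical samples in a batch each produce the single-sample loss
rep_p = pred[0:1].repeat(3, 1, 1, 1)
rep_t = target[0:1].repeat(3, 1, 1, 1)
single = per_sample_loss(pred[0:1], target[0:1]).repeat(3)
assert torch.allclose(per_sample_loss(rep_p, rep_t), single)
```

A loss with the cross-sample broadcasting bug fails both assertions for any batch size greater than one, which is what makes these effective regression tests.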
Signed-off-by: hongjie-qiu <77599736+hongjie-qiu@users.noreply.github.com>
1 parent 2f10e18 commit 063df92
2 files changed
Lines changed: 96 additions & 6 deletions