I would like to add a simple Hausdorff loss as proposed in (DOI: 10.1109/TMI.2019.2930068).
It approximates the Hausdorff distance (HD) so that the HD can be minimized directly during training.
```julia
using Statistics: mean

# Hausdorff loss (Karimi & Salcudean, DOI: 10.1109/TMI.2019.2930068): squared
# error weighted by the squared distance transform maps (DTMs) of ŷ and y.
function hd_loss(ŷ, y, ŷ_dtm, y_dtm)
    M = (ŷ .- y) .^ 2 .* (ŷ_dtm .^ 2 .+ y_dtm .^ 2)
    return mean(M)
end
```
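For context, here is a rough sketch of how the distance transform maps (`ŷ_dtm`, `y_dtm`) could be produced. It assumes ImageMorphology.jl's `feature_transform`/`distance_transform` for the Euclidean distance transform; the `compute_dtm` helper and the 0.5 threshold are illustrative choices, not prescribed by the paper.

```julia
# Sketch only: one way to build the DTMs fed into hd_loss.
using ImageMorphology: feature_transform, distance_transform

# Hypothetical helper: Euclidean distance to the nearest foreground voxel.
compute_dtm(mask::AbstractArray{Bool}) = distance_transform(feature_transform(mask))

y = rand(Bool, 96, 96)          # ground-truth mask (dummy data)
ŷ = rand(Float32, 96, 96)       # network output probabilities (dummy data)

y_dtm = compute_dtm(y)
ŷ_dtm = compute_dtm(ŷ .> 0.5)   # threshold the prediction before the transform

hd_loss(ŷ, Float32.(y), ŷ_dtm, y_dtm)
```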
Should I open a PR for this?
It might also be nice to add a Dice loss as well, even though Flux already has one?
```julia
function dice_loss(ŷ, y)
    ϵ = 1e-5
    return 1 - (2 * sum(ŷ .* y) + ϵ) / (sum(ŷ .* ŷ) + sum(y .* y) + ϵ)
end
```
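A quick sanity check on dummy data (illustrative only, not from the paper), just to show the expected range of the loss:

```julia
ŷ = rand(Float32, 16, 16)       # random probabilities
y = Float32.(rand(Bool, 16, 16))  # random binary mask

dice_loss(ŷ, y)   # somewhere in (0, 1); 0 = perfect overlap, 1 = none
dice_loss(y, y)   # ≈ 0
```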
@Dale-Black can you please try to draft a PR following the current interface?
Yes, I would love to do that, but it might be 1.5 weeks until finals are over. Is there a way to have GitHub remind me about this in like 2 weeks?
@Dale-Black I don't know of any feature in GitHub for reminders, but you can always set it in your own personal calendar.
We are cleaning up this repo once more, and it would be nice to have more contributors/maintainers. 👍🏽