
Add focal loss option for classification #9

Open · wants to merge 1 commit into base: master

Conversation

@jmduarte (Contributor) commented Sep 9, 2023

  • Add focal loss option for classification

Note that I explicitly keep the current `F.cross_entropy` implementation when `self.options.classification_focal_gamma == 0`, out of an abundance of caution, even though I checked that the more complex formula gives numerically the same answer.

The difference is that `F.cross_entropy` fuses `F.log_softmax` and `F.nll_loss`, so it should be more numerically stable, and I wanted to keep that (the branching is sketched below). If you think it's unnecessary, we can remove this if-then clause.

@mstamenk
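
For context, here is a minimal sketch of the branching described in the comment above. This is not the PR's actual diff; the function name `classification_loss` and the tensor shapes are assumptions, and only the option name `classification_focal_gamma` comes from the discussion:

```python
import torch
import torch.nn.functional as F

def classification_loss(logits: torch.Tensor, targets: torch.Tensor,
                        gamma: float = 0.0) -> torch.Tensor:
    """Hypothetical sketch: logits of shape (N, C), integer targets of shape (N,)."""
    # Preserve the existing fused, numerically stable path when gamma == 0,
    # matching the `classification_focal_gamma == 0` guard described above.
    if gamma == 0:
        return F.cross_entropy(logits, targets)

    # Focal loss: weight each example's cross-entropy by (1 - p_t)^gamma,
    # down-weighting examples the model already classifies confidently.
    log_probs = F.log_softmax(logits, dim=-1)
    log_p_t = log_probs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
    focal_weight = (1.0 - log_p_t.exp()) ** gamma
    return (-focal_weight * log_p_t).mean()
```

With `gamma = 0` the focal weight is identically 1 and the two branches agree numerically, which is consistent with the check the author describes; the explicit `F.cross_entropy` branch is kept only for its fused, more stable computation.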

@jmduarte (Contributor, Author) commented Oct 5, 2023

@Alexanders101 does this PR look OK to you? Can you merge it, or do you have any requested changes?
