MaxenceGui changed the title from "Repeat the process with Swin Transformer. Detect the right seed, but hallucinate a fourth one." to "Swin Transformer hallucinate non-existant seed" on Apr 10, 2024.
Just a further example:
Screenshot:
(Correct ID for the two seeds is Bromus japonicus)
Original image:
I think the reprogramming a couple of weeks ago to allow the red boxes to overlap has let it return a lot more of these "phantoms". Not sure what the solution is, though :(
Originally posted by @MaxenceGui in #14 (comment)
Description 🚀
When testing the Swin Transformer with the image, four seeds were detected. Even though the predicted species is correct (Bromus hordeaceus), only three seeds should have been detected. This could be caused by how the model was trained, or because it is sensitive to the seed shadows.
Expected Behavior 📉
Detect three seeds and return three predictions.
Actual Behavior 📈
Detect four seeds and return four predictions.
Additional Context 📌
Quick fix: on the user side, we can try to give the seeds more space and see if the results are different.
Another avenue is to change the inference function in the backend to check whether a box overlaps with both the previous box and the next box, instead of only comparing two boxes at a time (a rough sketch of such a check is included below). However, this would only hide that kind of error.
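As a reference for that backend change, here is a minimal sketch of an overlap filter that compares each candidate box against every box already kept, which covers both the previous and the next neighbour rather than a single adjacent pair (essentially greedy non-maximum suppression). The box format `(x1, y1, x2, y2)`, the dictionary keys, the function names, and the IoU threshold are assumptions for illustration, not the actual inference API:

```python
# Hypothetical sketch only: box format, names, and threshold are assumptions.

def iou(box_a, box_b):
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def drop_overlapping_boxes(detections, iou_threshold=0.5):
    """Keep the highest-scoring box of any overlapping group, drop the rest.

    Each detection is a dict like {"box": (x1, y1, x2, y2), "score": float}.
    Checking a candidate against *all* kept boxes covers both the previous
    and the next neighbour, instead of comparing only one adjacent pair.
    """
    kept = []
    for det in sorted(detections, key=lambda d: d["score"], reverse=True):
        if all(iou(det["box"], k["box"]) <= iou_threshold for k in kept):
            kept.append(det)
    return kept


if __name__ == "__main__":
    # Three real seeds plus a low-confidence "phantom" sitting on the second one.
    detections = [
        {"box": (10, 10, 50, 60), "score": 0.95},
        {"box": (70, 10, 110, 60), "score": 0.92},
        {"box": (72, 12, 108, 58), "score": 0.40},   # phantom
        {"box": (130, 10, 170, 60), "score": 0.90},
    ]
    print(len(drop_overlapping_boxes(detections)))   # -> 3
```

Dropping the lower-scoring box of an overlapping pair would keep the real seed and remove the phantom; a threshold around 0.5 is a common starting point, but it would need tuning so that genuinely touching seeds are not suppressed.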
Support image 🖼️
See previous images and images from the previous issues:
Related to