Improve the AI-Polygon experience #1279
Replies: 3 comments
-
If this happens with already computed images, we should fix that first.
It is also strange; successfully using the GPU should give a better experience.
-
I think there is still an issue. It takes a lot of time to compute for both new and already computed images, no matter what. Is there any possible workaround?
-
Me too. I would like to know whether the computation can run on the GPU.
-
labelme with SAM integration is great, but my experience with it right now is a little annoying, because I have to wait several seconds for the image embedding to be computed every time I move to the next/previous image. Despite the caching mechanism, I still have to wait for the computation from time to time, and it makes my computer very slow overall. I managed to switch the onnxruntime provider to GPU, but it shows the following warning and doesn't seem to give much of a speedup.
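For reference, this is roughly how the GPU provider can be requested; it is only a minimal sketch with a placeholder model path, not labelme's actual code, and it assumes onnxruntime-gpu is installed:

```python
# Minimal sketch, assuming onnxruntime-gpu is installed and a SAM encoder
# exported to ONNX is available; "sam_encoder.onnx" is only a placeholder name.
import onnxruntime as ort

providers = ["CUDAExecutionProvider", "CPUExecutionProvider"]
session = ort.InferenceSession("sam_encoder.onnx", providers=providers)

# onnxruntime silently falls back to CPU if the CUDA provider fails to load,
# so check which providers are actually active.
print(session.get_providers())
```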
Given the poor experience described above, I think it would be much smoother if the user could import image embeddings precomputed on a powerful GPU and then do the AI labeling (a rough sketch of such a precompute step is below). The project https://github.com/anuragxel/salt does exactly that.
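For illustration only, a sketch of precomputing embeddings offline, using the segment-anything package rather than labelme's own ONNX pipeline; the checkpoint, model type, and directory names are placeholders:

```python
# Rough sketch: precompute SAM image embeddings on a GPU machine and save
# them as .npy files for later import. Paths and model type are placeholders.
import glob
import os

import cv2
import numpy as np
import torch
from segment_anything import sam_model_registry, SamPredictor

sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
sam.to("cuda" if torch.cuda.is_available() else "cpu")
predictor = SamPredictor(sam)

os.makedirs("embeddings", exist_ok=True)
for path in sorted(glob.glob("images/*.jpg")):
    image = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2RGB)
    predictor.set_image(image)  # runs the heavy image encoder once per image
    embedding = predictor.get_image_embedding().cpu().numpy()
    name = os.path.splitext(os.path.basename(path))[0]
    np.save(os.path.join("embeddings", name + ".npy"), embedding)
```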