Segmentation Data Normalization for TrashCan Dataset #56
Thanks for writing the issue. Looking at the global_json2yolo code, there are flags for:
- converting the COCO segmentation format to the YOLO segmentation format, and
- converting the COCO keypoints format to the YOLO keypoints format.
For your case, you want the flag that converts COCO segmentation to YOLO segmentation. This is the folder structure we get when we run the script. Please let us know your opinion.
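For reference, converters like this typically write labels in the standard YOLO dataset layout, with one `.txt` label file per image. A sketch of that layout (directory names here are assumptions for the TrashCan dataset, not output the script is guaranteed to produce):

```
TrashCan/
├── images/
│   ├── train/        # training images
│   └── val/          # validation images
└── labels/
    ├── train/        # one .txt per image: "class x1 y1 x2 y2 ..." per instance
    └── val/          # all coordinates normalized to [0, 1]
```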
It looks like you've been exploring how to normalize segmentation label data for YOLOv8 with the TrashCan dataset. The script you've referenced, global_json2yolo.py, provides functionality for converting the COCO segmentation format to the YOLO segmentation format, as well as for converting other COCO formats to their YOLO equivalents. The script also contains flags for adjusting the conversion to specific requirements.

While global_json2yolo.py seems to provide the relevant functionality, please verify that its output matches what your TrashCan dataset needs. You can modify the script and its flags to fit the dataset's structure and ensure accurate label normalization. Feel free to ask about any specific details or concerns you have about the normalization process.
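To make the normalization concrete: a YOLO segmentation label is one line per instance, `class x1 y1 x2 y2 ... xn yn`, where each x is divided by the image width and each y by the image height. A minimal sketch of that step (the function name and formatting are illustrative, not part of the script):

```python
def coco_polygon_to_yolo(polygon, img_w, img_h, class_id):
    """Normalize an absolute-pixel COCO polygon [x1, y1, x2, y2, ...]
    into a YOLO segmentation label line: 'class x1 y1 x2 y2 ...'."""
    coords = []
    for i in range(0, len(polygon), 2):
        coords.append(polygon[i] / img_w)      # x normalized by image width
        coords.append(polygon[i + 1] / img_h)  # y normalized by image height
    return " ".join([str(class_id)] + [f"{c:.6f}" for c in coords])

# Example: a triangle in a 400x200 image becomes normalized coordinates.
print(coco_polygon_to_yolo([100, 50, 200, 50, 200, 150], 400, 200, 3))
# → 3 0.250000 0.250000 0.500000 0.250000 0.500000 0.750000
```

The image width and height come from the `images` entries of the COCO JSON, matched to each annotation by `image_id`.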
Greetings,
I have recently been trying to use YOLOv8 with the TrashCan dataset.
However, I still don't know how to normalize the segmentation label data.
I have looked everywhere but can't find any references on it.
Any ideas?
Thank you.