Training on remote machine, inference locally #816
-
Hello, I have exported my human-labelled images by clicking 'Export training job package', uploaded them to my remote machine, and then launched 'sleap-train single_instance.json labels.v000.pkg.slp'. Now I have a trained model. Since I had selected 'Predict on: entire current video' at the export stage, I expected the labels for that video to appear somewhere so that I could refine them. However, I am not sure where these predictions are. Now that I have trained, I want to move my files (models, labels, etc.) back to my laptop to refine labels and then push them back to the remote machine for another round of training. After I move my files back locally, which file do I need to open that contains the refined labels? Do I need to do any merging of datasets before I push to the remote machine for another round of training? Thanks!
Replies: 1 comment 1 reply
-
Hi @agosztolai,
If you ran inference via the GUI, then you do not need to merge predictions into the project - the predictions will have been merged for you. However, if you ran inference via the CLI, then you will need to open your original project and import the predictions into the original via File > Merge Into Project...
To merge predictions (if inference was run via CLI)
If you ran the `sleap-track` command, the predictions will have been saved to the output filename specified by the optional `-o` or `--output` argument. If no output path was specified, the output defaults to `[data_path].predictions.slp`, where `data_path` is the (required) positional argument used in `sleap-track`.
Locating predictions - no need to merge (if inference was run via GUI)