sbatch script for running inference #1967
Replies: 3 comments
-
Thank you for starting a new discussion! We appreciate your input and will review it soon.
Warning: A friendly reminder that this is a public forum. Please be cautious when clicking links, downloading files, or running scripts posted by others.
Stay safe and happy SLEAPing! Best regards,
-
Hi @sarazimmerman0, I can see that there are two errors in your logs. Yes, you are right: the first error is the one discussed in #1918. Regarding the OOM error, I'm still looking into it; it would be great if you could share your training_config.json here. Meanwhile, this is a useful resource on OOM errors. Thanks, Divya
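As a general note on inference-time OOM errors, one common mitigation is lowering the inference batch size. A minimal sketch, assuming a standard top-down model pair and the sleap-track CLI; the video and model paths below are placeholders, not values from the attached script:

```bash
# Run SLEAP inference with a reduced batch size to lower peak memory use.
# Paths are placeholders; substitute your own video and trained model folders.
sleap-track video.mp4 \
    -m models/centroid_model \
    -m models/centered_instance_model \
    --batch_size 4 \
    -o predictions/video.predictions.slp
```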
-
Just copying my response from the email thread. A few suggestions/observations:
-
Hi,
I'm using SLURM for the first time and trying to run inference on a few videos on the SDSC cluster, but I keep receiving an error that the process is out of memory. Do you have any recommendations for adjusting my sbatch parameters? I am able to run inference locally on a Mac (Apple M3 Max chip, 64 GB of memory), and I'm allocating much more memory when submitting the cluster job. The error log and sbatch commands are both attached.
I noticed the error discussed in #1918 in the log as well, but I'm not sure whether it's related.
Thanks for the help!
Sara
edited_log_file (1).txt
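For context, a minimal sbatch script for this kind of SLEAP inference job might look like the sketch below. The partition name, environment name, memory request, and file paths are placeholders (assumptions for illustration), not the values from the attached script:

```bash
#!/bin/bash
#SBATCH --job-name=sleap-inference
#SBATCH --partition=gpu            # placeholder partition name
#SBATCH --gres=gpu:1               # request one GPU
#SBATCH --cpus-per-task=8
#SBATCH --mem=64G                  # total CPU memory for the job
#SBATCH --time=04:00:00
#SBATCH --output=sleap_%j.out

# Placeholder environment setup; use whatever module/conda setup your cluster provides.
conda activate sleap

# Reduce --batch_size if the job runs out of memory during inference.
sleap-track video.mp4 \
    -m models/centroid_model \
    -m models/centered_instance_model \
    --batch_size 4 \
    -o predictions/video.predictions.slp
```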