Duplicate Train Loader Preparation with Accelerate #139
Closed
Qifeng-Wu99
started this conversation in
General
Hello,
I think this line of code is redundant, since the train_loader has already been wrapped by the Accelerator in setup_accelerate(). Removing the line suppresses an error when I try to fine-tune XTTS with my own dataset on multiple GPUs.
The error looks like:
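For illustration, here is a toy sketch of why preparing the same loader twice is a problem under multi-GPU training. The `prepare` function below is a stand-in that mimics how Accelerate shards a DataLoader across processes; the names and sharding scheme are assumptions for illustration, not the real `accelerate` API or this project's code:

```python
# Toy model of the double-prepare bug: "prepare" round-robin shards a
# list of batches across num_processes, mimicking what Accelerate does
# to a DataLoader under multi-GPU. Illustrative only, not the real API.

def prepare(batches, num_processes=2, rank=0):
    """Give this rank every num_processes-th batch (round-robin shard)."""
    return batches[rank::num_processes]

batches = list(range(8))    # 8 training batches

once = prepare(batches)     # wrapped once inside setup_accelerate()
twice = prepare(once)       # wrapped again by the redundant line

print(len(once))    # 4 batches for this rank - correct
print(len(twice))   # 2 batches for this rank - data silently skipped
```

Sharding the already-sharded loader means each rank sees only a fraction of its intended data, and the mismatch between ranks is the kind of thing that surfaces as a runtime error in distributed runs.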