An ImportError when I run "FinGPT_Training_LoRA_with_ChatGLM2_6B_for_Beginners.ipynb" #165
Comments
Hi, based on my experience, you can try reinstalling these two packages when this error shows up, then restart your kernel and rerun your code. Hope this works.
Thank you very much! I have already run the code successfully.
The error indicates that the packages needed for 8-bit training, specifically accelerate and bitsandbytes, are either not installed correctly or not recognized by the environment. Reinstalling both packages and then restarting the runtime usually resolves it; also check the GPU settings. Hope this helps, let me know of any further updates.
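A minimal sketch of that fix as Colab cells, based on the packages named in the error message (the pip flags here are the usual ones, not taken from the notebook):

!pip install -U accelerate        # reinstall / upgrade Accelerate
!pip install -U bitsandbytes      # reinstall / upgrade bitsandbytes
# After the installs finish: Runtime -> Restart runtime, then re-run the notebook cells.
!nvidia-smi                       # confirm a GPU is actually attached to the runtime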
Hi,
When I tried to run "FinGPT_Training_LoRA_with_ChatGLM2_6B_for_Beginners.ipynb" in Google Colab, I came across a problem.
The code is
from transformers import AutoTokenizer, AutoModel

model_name = "THUDM/chatglm2-6b"
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
# q_config is the 8-bit quantization config defined earlier in the notebook
model = AutoModel.from_pretrained(
    model_name,
    quantization_config=q_config,
    trust_remote_code=True,
    device='cuda'
)
and the error is:
ImportError: Using `load_in_8bit=True` requires Accelerate: `pip install accelerate` and the latest version of bitsandbytes: `pip install -i https://test.pypi.org/simple/ bitsandbytes` or `pip install bitsandbytes`

model = prepare_model_for_int8_training(model, use_gradient_checkpointing=True)
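(For context, the `load_in_8bit=True` in the message comes from the quantization config passed to `from_pretrained` above; q_config is presumably built with transformers' BitsAndBytesConfig, roughly like this sketch rather than the notebook's exact cell:)

from transformers import BitsAndBytesConfig

# Hypothetical reconstruction of q_config: 8-bit loading is what triggers the
# accelerate/bitsandbytes requirement in the error above.
q_config = BitsAndBytesConfig(load_in_8bit=True)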
Lastly, I ran the code in Google Colab Pro and I am sure both packages are installed.
Please help me solve the problem. Thank you so much!
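One quick way to check, from inside the running Colab kernel, whether both packages are actually visible to it (a sketch; if they were installed after the kernel started, a runtime restart is still needed):

import importlib.util

# False here usually means the package was installed after the kernel started
# (or into a different environment), so restart the runtime and check again.
for pkg in ("accelerate", "bitsandbytes"):
    print(pkg, importlib.util.find_spec(pkg) is not None)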