pdb: loss reference before assignment #15
What is the version of your transformers? (Not suggested) If you want to run the code with a higher version of transformers, you need to modify the tokenizer settings in this line of flair/embeddings.py.
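The exact replacement line is not preserved in this thread, but the failure mode it works around can be sketched with hypothetical stand-in classes (these are illustrative, not the real transformers API): newer tokenizer signatures reject a pre-split word list unless `is_split_into_words=True` is passed, which surfaces as a TypeError in downstream code such as flair/embeddings.py.

```python
# Hypothetical stand-ins, NOT the real transformers API: they only
# illustrate how a tokenizer signature change can surface as a
# TypeError in code that passes a list of words.

class OldTokenizer:
    def encode(self, words):
        # older versions accepted a list of words directly
        return list(range(len(words)))

class NewTokenizer:
    def encode(self, text, is_split_into_words=False):
        # newer versions require a flag before treating a list as
        # already-split words
        if isinstance(text, list) and not is_split_into_words:
            raise TypeError(
                "text must be a string; pass is_split_into_words=True "
                "to encode a list of words")
        return list(range(len(text)))

words = ["some", "pre-tokenized", "sentence"]
print(OldTokenizer().encode(words))                            # works
print(NewTokenizer().encode(words, is_split_into_words=True))  # works
# NewTokenizer().encode(words)  # raises the TypeError discussed here
```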
Hi, my transformers version is 3.0.0 and I am still facing the above-mentioned issue. Is there any other solution available? Or Nuveyla, have you found a solution to the problem? Thanks
Hi, have you tried to modify the tokenizer settings in this line of flair/embeddings.py?
Yes, I did that, but I still get the same error.
Can you post a screenshot of the error?
I'm not sure of the reason. I installed a new environment based on … Moreover, I find that …
Hi, I am hitting this issue too. Three possible causes may contribute to it:
1. The transformers version needs to be 3.0.0.
2. torch must be the GPU build.
3. An incompatible GPU/CUDA combination. In this case torch.cuda.is_available() can still return True, but you then hit "CUDA error: no kernel image is available for execution on the device". It is caused by a torch version that is too new for the GPU. For example, when I use a Tesla K40 with torch 1.7.0 and CUDA 10.1, it raises this error, but downgrading torch to 1.3.0 solves it.
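The three checks above can be sketched as a small diagnostic script. The pinned versions come from this comment, not from the project's documentation, so treat them as assumptions; the function name `check_env` is likewise made up for illustration.

```python
def check_env():
    """Report mismatches against the versions suggested in this thread."""
    problems = []
    try:
        import transformers
        if transformers.__version__ != "3.0.0":
            # version suggested in this thread, not an official requirement
            problems.append(
                f"transformers=={transformers.__version__}; "
                "this thread suggests 3.0.0")
    except ImportError:
        problems.append("transformers is not installed")
    try:
        import torch
        if not torch.cuda.is_available():
            problems.append("torch cannot see a CUDA device (CPU-only build?)")
        else:
            try:
                # A tiny CUDA op catches 'no kernel image is available'
                # errors that is_available() alone does not reveal.
                (torch.ones(1, device="cuda") + 1).item()
            except RuntimeError as e:
                problems.append(f"CUDA op failed: {e} (torch/GPU mismatch?)")
    except ImportError:
        problems.append("torch is not installed")
    return problems

if __name__ == "__main__":
    for line in check_env() or ["environment looks consistent"]:
        print(line)
```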
Hey,
When running python train.py --config config/wnut17_doc_cl_kl.yaml with the original code (only the paths changed), I run into an error that the loss is referenced before assignment. See the following screenshot:
The underlying TypeError causes this issue. I have tried adding is_split_into_words=True in line 3171 of embeddings.py, which gave a new error:
with the same result again (loss is never assigned). What could be the cause of this?
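The "referenced before assignment" message is a generic Python symptom rather than the root cause: when the forward pass dies (here, with the tokenizer TypeError) inside the branch that should assign loss, the later use of loss raises UnboundLocalError. A minimal sketch of that mechanism, where `train_step` and `broken_forward` are hypothetical names and not the project's actual code:

```python
def train_step(run_forward):
    # 'loss' is assigned only if the forward pass succeeds. If
    # run_forward() raises and the exception is swallowed here, the
    # final 'return loss' hits "local variable 'loss' referenced
    # before assignment" (an UnboundLocalError).
    try:
        loss = run_forward()
    except TypeError as e:
        print(f"forward pass failed: {e}")  # logged, but loss never set
    return loss

def broken_forward():
    # stands in for a tokenizer call whose signature changed
    raise TypeError("tokenizer signature mismatch")

try:
    train_step(broken_forward)
except UnboundLocalError as e:
    print(e)  # the message reported in this issue
```

So the fix is to resolve the tokenizer TypeError itself; silencing it leaves loss unassigned and merely moves the crash.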