
Which version of trlx and transformers are you using? #7

Open
PamKing7 opened this issue Aug 14, 2024 · 8 comments

@PamKing7

No matter whether I load a local model or the gpt2-imdb model from Hugging Face, the following error is reported:

ValueError: GPTModelBranch does not support an attention implementation through torch.nn.functional.scaled_dot_product_attention yet. Please request the support for this architecture: https://github.com/huggingface/transformers/issues/28005. If you believe this error is a bug, please open an issue in Transformers GitHub repository and load your model with the argument attn_implementation="eager" meanwhile. Example: model = AutoModel.from_pretrained("openai/whisper-tiny", attn_implementation="eager")

This seems to be a problem caused by the transformers version, but mine has already been updated to the latest release. Which version of trlx and transformers are you using?

@PamKing7 (Author)

In addition, it appears that the TRLX module used for training does not support the MistralForCausalLM model.

@nuwuxian

I have met the same problem.

@PamKing7 (Author)

> I have met the same problem.

This may be a version problem; please report the version of transformers you are using.
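(A small stdlib-only sketch for reporting versions; the pip distribution names are assumed to match what this repo installs.)

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg: str) -> str:
    """Return the installed version of a pip distribution, or a marker."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return "not installed"

# Print the versions asked about in this thread.
for pkg in ("trlx", "transformers", "sentence-transformers"):
    print(pkg, installed_version(pkg))
```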

@williamd4112 (Collaborator)

  • trlx: I'm using a customized version. See custom_trlx
  • transformers: I'm using the version pinned in custom_trlx, transformers==4.32.0

@PamKing7 (Author)

>   • trlx: i'm using a customized version. See custom_trlx
>   • transformers: I'm using the version in custom_trlx transformers==4.32.0

Mine is transformers==4.41.2.

@zui-jiang

>   • trlx: i'm using a customized version. See custom_trlx
>   • transformers: I'm using the version in custom_trlx transformers==4.32.0

What about the version of sentence_transformers?

@alexsting

I really admire you for completing a great job. However, I also encountered the same problem when I used Colab to run your project. I solved it after modifying the transformers and sentence-transformers versions, but then ran into a new problem: RuntimeError: ffi_prep_cif_var failed. Could you please tell me all of your environment versions? Thank you!

@alexsting

The output of (!pip freeze) would be helpful, thanks!
