Load existing LoRA and continue training it #1887

Open · Nero10578 opened this issue Sep 1, 2024 · 3 comments · Labels: enhancement
@Nero10578 (Contributor) commented Sep 1, 2024

⚠️ Please check that this feature request hasn't been suggested before.

  • I searched previous Ideas in Discussions and didn't find any similar feature requests.
  • I searched previous Issues and didn't find any similar feature requests.

🔖 Feature description

As far as I understand, specifying the LoRA directory only means you're loading the LoRA on top of the base model and then training a new LoRA on top of that?

✔️ Solution

Be able to load a pre-trained LoRA and continue training it. Unless Axolotl already does this and I'm misunderstanding the docs?

❓ Alternatives

No response

📝 Additional Context

No response

Acknowledgements

  • My issue title is concise, descriptive, and in title casing.
  • I have searched the existing issues to make sure this feature has not been requested yet.
  • I have provided enough information for the maintainers to understand and evaluate this request.
@Nero10578 added the enhancement (New feature or request) label Sep 1, 2024
@NanoCode012 (Collaborator) commented Nov 11, 2024

Hey, I think if you do load it, it may work as intended and continue training. I believe I may have done this quite a while back.

Please let us know if my understanding is correct!

@Nero10578 (Contributor, Author)

> Hey, I think if you do load it, it may work as intended and continue training. I believe I may have done this quite a while back.
>
> Please let us know if my understanding is correct!

Do you mean it should work by just specifying the LoRA directory?

@NanoCode012 (Collaborator)

Yes, it should continue training from that adapter. Please do let me know if there's some issue.
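
For anyone who finds this later, here is a minimal sketch of an Axolotl config that should continue training an existing adapter, assuming `lora_model_dir` behaves as described above. The model path, dataset, and hyperparameter values below are placeholders, not values from this thread:

```yaml
# Sketch: resume training a previously trained LoRA adapter with Axolotl.
# `lora_model_dir` points at the existing adapter, so Axolotl loads its
# weights instead of initializing a fresh adapter.
base_model: meta-llama/Llama-2-7b-hf     # must be the same base the adapter was trained on

adapter: lora
lora_model_dir: ./outputs/previous-lora  # placeholder path to the existing adapter

# These should match the settings the adapter was originally trained with.
lora_r: 16
lora_alpha: 32
lora_dropout: 0.05
lora_target_modules:
  - q_proj
  - v_proj

datasets:
  - path: ./data/continue.jsonl          # placeholder dataset
    type: alpaca

output_dir: ./outputs/continued-lora
```

With this in place, a normal training run (e.g. `axolotl train config.yml`, or the `accelerate launch -m axolotl.cli.train config.yml` entrypoint) should pick up the adapter weights and keep optimizing them rather than starting a new LoRA from scratch.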
