Hi, thanks for posting a nice repo!
I see that we can train a DPR model with GC-DPR, but it looks like training starts from scratch by loading a base model (bert-base-uncased or roberta-base).
How can we use this repo to fine-tune an already pretrained DPR model? For example, Facebook already provides DPR encoder checkpoints:
```python
question_model = "facebook/dpr-question_encoder-single-nq-base"
context_model = "facebook/dpr-ctx_encoder-single-nq-base"
```
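
For reference, these checkpoints load fine with the Hugging Face transformers DPR classes. Here is the minimal sanity check I ran (the question/passage strings are just dummy examples), so the missing piece is only how to plug these weights into GC-DPR's training:

```python
import torch
from transformers import (
    DPRContextEncoder,
    DPRContextEncoderTokenizer,
    DPRQuestionEncoder,
    DPRQuestionEncoderTokenizer,
)

question_model = "facebook/dpr-question_encoder-single-nq-base"
context_model = "facebook/dpr-ctx_encoder-single-nq-base"

# Load the pretrained DPR encoders and their tokenizers from the Hugging Face hub.
q_tokenizer = DPRQuestionEncoderTokenizer.from_pretrained(question_model)
q_encoder = DPRQuestionEncoder.from_pretrained(question_model)
ctx_tokenizer = DPRContextEncoderTokenizer.from_pretrained(context_model)
ctx_encoder = DPRContextEncoder.from_pretrained(context_model)

# Encode one question and one passage and score them with a dot product,
# just to confirm the checkpoints load and produce sensible embeddings.
with torch.no_grad():
    q_emb = q_encoder(
        **q_tokenizer("who wrote the iliad?", return_tensors="pt")
    ).pooler_output
    ctx_emb = ctx_encoder(
        **ctx_tokenizer("The Iliad is attributed to Homer.", return_tensors="pt")
    ).pooler_output

print(torch.matmul(q_emb, ctx_emb.T))  # higher score = more relevant passage
```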
To make these models domain-specific, my idea is to fine-tune them on in-domain data.
It would be helpful if you could let me know how to load the question and context models with `train_dense_encoder`.
Any other suggestions would be appreciated.
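
In case it helps explain what I'm after, here is a rough sketch of how I imagine mapping the Hugging Face weights into a GC-DPR/DPR BiEncoder-style state dict that `train_dense_encoder` could be initialized from. The key prefixes (`question_encoder.bert_model.` / `ctx_encoder.bert_model.` on the Hugging Face side, `question_model.` / `ctx_model.` on the DPR side) and the checkpoint layout are my assumptions and may not match the repo's actual naming, so please correct me if there is a supported way to do this:

```python
import torch
from transformers import DPRContextEncoder, DPRQuestionEncoder

# NOTE: only a sketch. The prefix names below are assumptions about the two
# naming schemes (Hugging Face DPR vs. GC-DPR's BiEncoder) and may need to be
# adjusted against the actual state dict keys in this repo.

def remap(hf_state_dict, hf_prefix, dpr_prefix):
    """Rename Hugging Face DPR weight keys into BiEncoder-style keys."""
    out = {}
    for key, value in hf_state_dict.items():
        if key.startswith(hf_prefix):
            out[dpr_prefix + key[len(hf_prefix):]] = value
    return out

q_encoder = DPRQuestionEncoder.from_pretrained(
    "facebook/dpr-question_encoder-single-nq-base"
)
ctx_encoder = DPRContextEncoder.from_pretrained(
    "facebook/dpr-ctx_encoder-single-nq-base"
)

model_dict = {}
model_dict.update(
    remap(q_encoder.state_dict(), "question_encoder.bert_model.", "question_model.")
)
model_dict.update(
    remap(ctx_encoder.state_dict(), "ctx_encoder.bert_model.", "ctx_model.")
)

# Save in a minimal checkpoint-like layout; whatever extra fields the training
# script expects (optimizer/scheduler state, epoch offset, ...) would still
# need to be filled in, which is exactly the part I'm unsure about.
torch.save({"model_dict": model_dict}, "dpr_nq_init.pt")
```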