How to predict after loading a trained model locally? #1437
-
[Novice] I was successfully able to train a model on my data on Google Colab, save the model to my Google Drive, and download that model to my local machine, following the Haystack tutorial here. I saved it using this line:
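Roughly, the save step from the fine-tuning tutorial looks like this (the directory name is just an example):

```python
from haystack.nodes import FARMReader  # import path may differ on older Haystack versions

# Reader fine-tuned earlier in the Colab session (Tutorial 2 style)
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2", use_gpu=True)
# ... reader.train(...) ...

# Save the fine-tuned model so it can be downloaded from Google Drive and reused elsewhere
reader.save(directory="my_model")
```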
Now that I have a saved/trained model, I want to apply it locally to the same docs that were used for training in Google Colab, now sitting on my local machine. I can read in the pretrained model locally like so:
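Something along these lines, with `my_model` pointing at the downloaded directory:

```python
from haystack.nodes import FARMReader  # import path may differ on older Haystack versions

# Point FARMReader at the model directory downloaded from Google Drive
reader = FARMReader(model_name_or_path="my_model", use_gpu=False)
```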
Now that I can read the model, I tried rerunning the prediction code like so:
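Roughly the pipeline code from Tutorial 1 (the query is the tutorial's example; exact argument names vary by Haystack version):

```python
from haystack.pipelines import ExtractiveQAPipeline  # older versions: from haystack.pipeline import ExtractiveQAPipeline

# `retriever` (and the document store behind it) would also need to be set up locally
pipe = ExtractiveQAPipeline(reader, retriever)
prediction = pipe.run(
    query="Who is the father of Arya Stark?",
    params={"Retriever": {"top_k": 10}, "Reader": {"top_k": 5}},
)
```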
However, this won't work, because I don't have the code on my local machine to run the rest of the pipeline that the prediction step depends on. My question is: once I have trained a model on a set of documents in Google Colab, is there an easy way to pass the model to someone else and have them make predictions without running all the code that I used for training, which includes the document preprocessing? If I do need to process the documents again, can I save the text documents that my training code converted to paragraphs and use those when running the model at a later date? It seems counterintuitive to have to reprocess the documents the way it was done during training. Thank you in advance.

Local Machine Specs:
-
Hi @asharm0662, preprocessing of the documents is not needed again if you store the preprocessed documents in one of Haystack's document stores and re-use that document store. However, the document stores don't have something like an …

You're right that you don't need to run the training code again to make predictions after having trained a model. However, …
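A minimal sketch of what re-using a persistent document store could look like (ElasticsearchDocumentStore, the index name, and the import paths are examples and vary between Haystack versions):

```python
from haystack.document_stores import ElasticsearchDocumentStore  # older versions: haystack.document_store
from haystack.utils import convert_files_to_docs  # older versions: convert_files_to_dicts in haystack.preprocessor.utils

# One-time preprocessing: convert the raw files and write them to a persistent store
document_store = ElasticsearchDocumentStore(host="localhost", index="document")
docs = convert_files_to_docs(dir_path="data/my_docs")
document_store.write_documents(docs)

# Later (or from other code pointing at the same Elasticsearch instance),
# just reconnect to the store instead of preprocessing the documents again
document_store = ElasticsearchDocumentStore(host="localhost", index="document")
```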
-
Hi @asharm0662! Please let me try to help you. I think there is a misunderstanding: the tutorial you have linked does not contain any training but only the application of already trained QA models (= making predictions). You do not need to train a model in order to make predictions.

You should be able to load one of the existing Reader models (for example `"deepset/roberta-base-squad2"`) on your local machine just like in the tutorial and ask questions on your documents by following the steps in Tutorial 1 or Tutorial 3. In order to make predictions on your own documents instead of the Game-of-Thrones sample documents, just write your own documents to the document store.

I hope this makes things a bit clearer; if not, I am happy to answer your questions.
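A minimal sketch of that approach, assuming a local Elasticsearch instance (names, paths, and import locations are examples and differ slightly between Haystack versions):

```python
from haystack.document_stores import ElasticsearchDocumentStore
from haystack.nodes import FARMReader, ElasticsearchRetriever  # newer versions rename this to BM25Retriever
from haystack.pipelines import ExtractiveQAPipeline
from haystack.utils import convert_files_to_docs

# Write your own documents (instead of the Game-of-Thrones samples) to the document store
document_store = ElasticsearchDocumentStore(host="localhost", index="document")
document_store.write_documents(convert_files_to_docs(dir_path="data/my_docs"))

# Load an already trained Reader model; no training step is required
retriever = ElasticsearchRetriever(document_store=document_store)
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2", use_gpu=False)

pipe = ExtractiveQAPipeline(reader, retriever)
prediction = pipe.run(
    query="Your question about your documents",
    params={"Retriever": {"top_k": 10}, "Reader": {"top_k": 5}},
)
```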