Hello, I've seen your code on the front page for training a language model:
```python
from fastai.text import *
import multifit

exp = multifit.from_pretrained("name of the model")
fa_config = exp.pretrain_lm.tokenizer.get_fastai_config(add_open_file_processor=True)
data_lm = (TextList.from_folder(imdb_path, **fa_config)
           .filter_by_folder(include=['train', 'test', 'unsup'])
           .split_by_rand_pct(0.1)
           .label_for_lm()
           .databunch(bs=bs))
learn = exp.finetune_lm.get_learner(data_lm)
# learn is a preconfigured fastai learner with a pretrained model loaded
learn.fit_one_cycle(10)
learn.save_encoder("enc")
...
```
I would like to ask how I can then train my own classifier on top of this model. The guidelines described at https://docs.fast.ai/text.html assume the AWD-LSTM architecture, so they will not work with a MultiFiT language model as the encoder; see the sketch below for the step I mean.
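For reference, here is a minimal sketch of the classifier step as the fastai docs describe it. `imdb_path`, `fa_config`, `bs`, and the label folders are placeholders carried over from the snippet above, and `text_classifier_learner(..., AWD_LSTM, ...)` is exactly the AWD-LSTM-specific call that I do not know how to replace for the MultiFiT encoder:

```python
# Standard fastai v1 classifier recipe from docs.fast.ai/text.html.
# It builds a classification databunch that shares the LM vocab and then loads
# the saved encoder -- but text_classifier_learner expects an architecture such
# as AWD_LSTM, so the encoder saved from the MultiFiT learner does not fit here.
data_clas = (TextList.from_folder(imdb_path, vocab=data_lm.vocab, **fa_config)
             .split_by_folder(valid='test')
             .label_from_folder(classes=['neg', 'pos'])
             .databunch(bs=bs))

learn_c = text_classifier_learner(data_clas, AWD_LSTM, drop_mult=0.5)
learn_c.load_encoder("enc")  # mismatches when "enc" was saved from the MultiFiT model
learn_c.fit_one_cycle(1)
```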
Thanks