Description
I am trying to use keras-nlp with a pretrained masked BERT model to predict some masked tokens in a sequence. However, the model produces inconsistent results. Am I doing something wrong?
To Reproduce
https://colab.research.google.com/drive/1kL5oXveRen9Zub_7kJCI28Gg34GfDt_x
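The Colab above is the actual reproducer. For reference, the sketch below shows the general workflow being described; the `bert_base_en_uncased` preset and the example sentence are placeholders rather than necessarily what the notebook uses, and it assumes the TensorFlow backend.

```python
import numpy as np
import keras_nlp

# The preprocessor randomly masks a fraction of the input tokens and returns
# (features, labels, sample_weights) for the masked language modeling task.
preprocessor = keras_nlp.models.BertMaskedLMPreprocessor.from_preset(
    "bert_base_en_uncased"
)
masked_lm = keras_nlp.models.BertMaskedLM.from_preset(
    "bert_base_en_uncased",
    preprocessor=None,  # preprocess manually so the labels can be inspected
)

features, labels, weights = preprocessor(
    ["The quick brown fox jumped over the lazy dog."]
)

# Logits over the vocabulary for every mask position.
logits = masked_lm.predict(features)
predicted_ids = np.argmax(logits, axis=-1)
```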
Expected behavior
The predicted tokens should match the tokens in the labels. The preprocessor masks two tokens, so the first two predicted tokens should match the first two labels.
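Continuing the sketch above, a check along these lines is what I would expect to pass (again assuming the placeholder preset; `id_to_token` is only used to make the comparison readable):

```python
# Only mask positions with a non-zero sample weight were actually masked.
num_masked = int(np.sum(np.asarray(weights[0]) > 0))
for i in range(num_masked):
    predicted = preprocessor.tokenizer.id_to_token(int(predicted_ids[0, i]))
    expected = preprocessor.tokenizer.id_to_token(int(labels[0, i]))
    print(f"mask {i}: predicted={predicted!r} expected={expected!r}")
```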