Thank you for sharing your MLM work!
I tried chatting with the "Lion" bot in Chinese, but got brief answers, almost entirely in English.
I would also appreciate knowing why chineseBERT is used as the Q-Former, as written in your readme.md.
Do you have any plans to make "Lion" answer in Chinese?
Thank you!
Hi sunyoe,
We used both Chinese and English data during pre-training, but during SFT we found that using purely English data performed better on MME. As a result, this demo model is more inclined to respond in English. Our Chinese and English chat model will be released as soon as possible; please look forward to it.