When I use alpaca2flm.py to convert the llama2-13b model, I get this error:
Traceback (most recent call last):
File "/home/xxx/fastllm/tools/scripts/alpaca2flm.py", line 18, in <module>
torch2flm.tofile(exportPath, model, tokenizer, pre_prompt = "<FLM_FIX_TOKEN_1>",
File "/usr/local/lib/python3.10/dist-packages/ftllm-0.0.0.1-py3.10.egg/ftllm/torch2flm.py", line 223, in tofile
tokenizer_data = json.load(f)
File "/usr/lib/python3.10/json/__init__.py", line 293, in load
return loads(fp.read(),
File "/usr/lib/python3.10/codecs.py", line 322, in decode
(result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xce in position 4411: invalid continuation byte
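The traceback shows `json.load(f)` failing while decoding, not while parsing: `json.load()` first reads the file as UTF-8 text, so if `torch2flm.tofile()` is handed a file that is not UTF-8 JSON (for example a binary sentencepiece `tokenizer.model` — an assumption about what is on disk here), a `0xce` byte with no valid continuation byte triggers exactly this exception. A minimal sketch reproducing that failure mode:

```python
import json
import tempfile

# Minimal reproduction (hypothetical file contents): json.load() decodes the
# stream as UTF-8 before parsing, so a 0xce byte followed by a byte that is
# not a UTF-8 continuation byte raises UnicodeDecodeError, as in the traceback.
with tempfile.NamedTemporaryFile(mode="wb", suffix=".model", delete=False) as f:
    f.write(b'{"a": 1}\xceZ')  # 0xce followed by a non-continuation byte
    path = f.name

try:
    with open(path, encoding="utf-8") as f:
        json.load(f)
except UnicodeDecodeError as e:
    print(e.reason)  # "invalid continuation byte"
```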
My Python code is:
if __name__ == "__main__":
    model_name = sys.argv[3] if len(sys.argv) >= 4 else 'meta-llama/Llama-2-13b-chat-hf'
    tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
    # `torch_dtype=torch.float16` is set by default; if it will not cause an OOM error, you can load the model in float32.
    model = LlamaForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)
    conf = model.config.__dict__
    conf["model_type"] = "llama"
    dtype = sys.argv[2] if len(sys.argv) >= 3 else "float16"
    exportPath = sys.argv[1] if len(sys.argv) >= 2 else "alpaca-33b-" + dtype + ".flm"
    # add custom code here
    # torch2flm.tofile(exportPath, model, tokenizer, dtype = dtype)
    torch2flm.tofile(exportPath, model, tokenizer, pre_prompt="<FLM_FIX_TOKEN_1>",
                     user_role="[INST] ", bot_role=" [/INST]",
                     history_sep=" <FLM_FIX_TOKEN_2><FLM_FIX_TOKEN_1>", dtype=dtype)
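To narrow the problem down, it may help to check whether the tokenizer file that `torch2flm.tofile()` reads is actually UTF-8 JSON. The helper below and the file names in the example are hypothetical; treating `tokenizer.model` as binary sentencepiece and `tokenizer.json` as text is an assumption about the model directory layout:

```python
import json

def looks_like_utf8_json(path):
    """Hypothetical helper: True if the file decodes as UTF-8 and parses
    as JSON. A binary sentencepiece tokenizer.model should return False;
    a tokenizer.json (if present) should return True."""
    try:
        with open(path, encoding="utf-8") as f:
            json.load(f)
        return True
    except (UnicodeDecodeError, json.JSONDecodeError, OSError):
        return False

# Example usage: file names below are assumptions about the local model dir.
for name in ("tokenizer.json", "tokenizer.model"):
    print(name, looks_like_utf8_json(f"meta-llama/Llama-2-13b-chat-hf/{name}"))
```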
How can I fix it?