
I cannot load any AI models and keep getting this error no matter what I do. It started after I ran "git pull" on this repository #50

Open
0xYc0d0ne opened this issue Jun 13, 2023 · 1 comment

Comments

@0xYc0d0ne

Exception in thread Thread-14:
Traceback (most recent call last):
File "B:\python\lib\threading.py", line 932, in _bootstrap_inner
self.run()
File "B:\python\lib\threading.py", line 870, in run
self._target(*self._args, **self._kwargs)
File "B:\python\lib\site-packages\socketio\server.py", line 731, in _handle_event_internal
r = server._trigger_event(data[0], namespace, sid, *data[1:])
File "B:\python\lib\site-packages\socketio\server.py", line 756, in _trigger_event
return self.handlers[namespace][event](*args)
File "B:\python\lib\site-packages\flask_socketio\__init__.py", line 282, in _handler
return self.handle_event(handler, message, namespace, sid,
File "B:\python\lib\site-packages\flask_socketio\__init__.py", line 828, in _handle_event
ret = handler(*args)
File "aiserver.py", line 615, in g
return f(*a, **k)
File "aiserver.py", line 3191, in get_message
load_model(use_gpu=msg['use_gpu'], gpu_layers=msg['gpu_layers'], disk_layers=msg['disk_layers'], online_model=msg['online_model'])
File "aiserver.py", line 1980, in load_model
model.load(
File "C:\KoboldAI\modeling\inference_model.py", line 177, in load
self._load(save_model=save_model, initial_load=initial_load)
File "C:\KoboldAI\modeling\inference_models\hf_torch_4bit.py", line 198, in _load
self.model = self._get_model(self.get_local_model_path(), tf_kwargs)
File "C:\KoboldAI\modeling\inference_models\hf_torch_4bit.py", line 378, in _get_model
model = load_quant_offload(llama_load_quant, utils.koboldai_vars.custmodpth, path_4bit, utils.koboldai_vars.gptq_bits, groupsize, self.gpu_layers_list, force_bias=v2_bias)
TypeError: load_quant_offload() got an unexpected keyword argument 'force_bias'
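The last line is the key symptom: the caller in hf_torch_4bit.py passes a force_bias keyword, but the installed copy of load_quant_offload predates that parameter. A minimal sketch of the mismatch, using a stand-in function rather than the real gptq code:

```python
# Stand-in for an OUTDATED load_quant_offload: note there is no
# force_bias parameter in its signature (hypothetical, for illustration).
def load_quant_offload(load_fn, model_path, quant_path, bits, groupsize, gpu_layers):
    return "model loaded"

try:
    # Newer caller code passes the extra keyword the old function never declared.
    load_quant_offload(lambda: None, "model", "model.safetensors",
                       4, 128, [28], force_bias=False)
except TypeError as err:
    # Python rejects the call before the function body ever runs.
    print(err)
```

Python raises this TypeError at call time, before any model-loading logic executes, which is why no model loads regardless of settings.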

@0xYc0d0ne 0xYc0d0ne changed the title i cannot load any ai models and i keep getting this error no matter what i do i cannot load any ai models and i keep getting this error no matter what i do. this happened after i did "git pull" command from this repository Jun 13, 2023
@0cc4m
Owner

0cc4m commented Jun 13, 2023

You need to update the gptq module. Run install_requirements again.
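After rerunning the installer, one way to check that the updated gptq module was actually picked up is to inspect the function's signature for the new parameter. This sketch uses a stand-in function; the real import path depends on how gptq is bundled in your KoboldAI install:

```python
import inspect

# Stand-in for the UPDATED function (hypothetical signature for illustration);
# the real one is provided by the gptq module KoboldAI installs.
def load_quant_offload(load_fn, model_path, quant_path, bits, groupsize,
                       gpu_layers, force_bias=False):
    return "model loaded"

# If the installed version accepts force_bias, the caller in
# hf_torch_4bit.py will no longer raise a TypeError.
params = inspect.signature(load_quant_offload).parameters
if "force_bias" in params:
    print("gptq module is up to date")
else:
    print("gptq module is stale; rerun install_requirements")
```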
