Commit

Allow gptq/gptq_v2 to convert to marlin (ModelCloud#377)
Co-authored-by: LRL-ModelCloud <[email protected]>
LRL-ModelCloud authored Aug 17, 2024
1 parent 67c73d9 commit 242da1e
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion gptqmodel/models/base.py
@@ -1383,7 +1383,7 @@ def skip(*args, **kwargs):
         load_checkpoint_in_model = True
         quantize_config.runtime_format = FORMAT.GPTQ_V2

-        if backend == BACKEND.MARLIN and quantize_config.format == FORMAT.MARLIN:
+        if backend == BACKEND.MARLIN and quantize_config.format in [FORMAT.MARLIN, FORMAT.GPTQ, FORMAT.GPTQ_V2]:
             if is_sharded:
                 raise ValueError(
                     "The loading of sharded checkpoints with Marlin is currently not supported."
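The one-line change widens the format check so that GPTQ and GPTQ_V2 checkpoints, not just native Marlin ones, take the Marlin loading path and can be converted at load time. A minimal sketch of the widened membership test is below; the `FORMAT` and `BACKEND` enums here are illustrative stand-ins mirroring the identifiers in the diff, not the real gptqmodel definitions.

```python
from enum import Enum

# Illustrative stand-ins for the gptqmodel constants seen in the diff.
class FORMAT(Enum):
    MARLIN = "marlin"
    GPTQ = "gptq"
    GPTQ_V2 = "gptq_v2"
    BITBLAS = "bitblas"

class BACKEND(Enum):
    MARLIN = "marlin"

def marlin_path_enabled(backend: BACKEND, fmt: FORMAT) -> bool:
    # Before the change: only fmt == FORMAT.MARLIN took the Marlin path.
    # After the change: GPTQ and GPTQ_V2 checkpoints also qualify, so
    # they can be converted to the Marlin kernel format at load time.
    return backend == BACKEND.MARLIN and fmt in (
        FORMAT.MARLIN,
        FORMAT.GPTQ,
        FORMAT.GPTQ_V2,
    )
```

With this predicate, a GPTQ_V2 checkpoint loaded with the Marlin backend now enters the conversion branch instead of being rejected, while unrelated formats still fall through.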
