
Error when loading the model: ValueError: Unrecognized configuration class <class 'transformers_modules.fuzi-mingcha-v1_0.configuration_chatglm.ChatGLMConfig'> #13

Open
zyh3826 opened this issue Jan 9, 2024 · 8 comments

Comments


zyh3826 commented Jan 9, 2024

Hello, loading the model the way you describe fails for me. @Furyton @zwh-sdu The error is as follows:

>>> model = AutoModelForCausalLM.from_pretrained(p, trust_remote_code=True)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/data/zhaoyuhang/anaconda3/envs/fuzimingcha/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 487, in from_pretrained
    raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers_modules.fuzi-mingcha-v1_0.configuration_chatglm.ChatGLMConfig'> for this kind of AutoModel: AutoModelForCausalLM.
Model type should be one of BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BlenderbotConfig, BlenderbotSmallConfig, BloomConfig, CamembertConfig, CodeGenConfig, CpmAntConfig, CTRLConfig, Data2VecTextConfig, ElectraConfig, ErnieConfig, GitConfig, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, LlamaConfig, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MvpConfig, OpenLlamaConfig, OpenAIGPTConfig, OPTConfig, PegasusConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, ReformerConfig, RemBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, Speech2Text2Config, TransfoXLConfig, TrOCRConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, XmodConfig.

zyh3826 commented Jan 9, 2024

One more thing: the api.py for pylucene imports a series of org.apache.lucene packages. How are those installed? My singularity build will not run after installation, so I have to install the packages myself.


Furyton commented Jan 9, 2024

> (quotes the error traceback above)

Hello, and thanks for your interest.
The model needs to be loaded with AutoModel here, i.e.:

from transformers import AutoModel
model = AutoModel.from_pretrained("/path/to/fuzi-mingcha-v1_0", trust_remote_code=True)
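For context, checkpoints that ship custom code declare which Auto classes they support in the auto_map field of their config.json, and AutoModelForCausalLM is not among the entries this ChatGLM-based checkpoint registers, which is why only AutoModel can dispatch to it. A minimal sketch of that lookup (the sample auto_map below is illustrative, not copied from the actual checkpoint):

```python
def supported_auto_classes(config):
    """List the Auto* entry points a custom-code checkpoint registers in auto_map."""
    return sorted(config.get("auto_map", {}))

# Hypothetical config.json contents, shaped like a ChatGLM-style repo's.
sample_config = {
    "model_type": "chatglm",
    "auto_map": {
        "AutoConfig": "configuration_chatglm.ChatGLMConfig",
        "AutoModel": "modeling_chatglm.ChatGLMForConditionalGeneration",
    },
}

print(supported_auto_classes(sample_config))  # → ['AutoConfig', 'AutoModel']
# "AutoModelForCausalLM" is absent, so that loader raises the ValueError above.
```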


zyh3826 commented Jan 10, 2024

> (quotes the traceback and the AutoModel fix above)

Thanks for the reply. I had not noticed; I am used to using AutoModelForCausalLM.
One more question: api.py under pylucene_task* imports a series of org.apache.lucene packages. How should those be installed?


Furyton commented Jan 10, 2024

Hello. The way we currently install lucene is indeed somewhat cumbersome. We will switch to a new retrieval solution in the near future to simplify installation, and we will let you know once it is updated.

Thanks for your interest and understanding.

@xinghen91

(two screenshots of the browser error page)
I have the whole project set up. After starting it from the image, visiting the address in a browser returns this error. I am not sure whether the address is wrong or something in the deployment is off. Could you help me figure it out? Also, is there a community group where this could be discussed?


zwh-sdu commented Jan 25, 2024

> (quotes the comment above)

Hello. This step deploys only the retrieval module; the address you deployed is the one the retrieval part of cli_demo.py sends requests to,
i.e. the "pylucene address deployed for statute retrieval" and the "pylucene address deployed for similar-case retrieval" in the command below:

python cli_demo.py --url_lucene_task1 "pylucene address deployed for statute retrieval"  --url_lucene_task2 "pylucene address deployed for similar-case retrieval"


zwh-sdu commented Jan 25, 2024

> (quotes the earlier exchange about loading with AutoModel and the pylucene installation question)

Hello, we have updated the deployment code for ES (Elasticsearch) retrieval. You can try deploying with ES in place of pylucene.
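If you go the ES route, retrieval usually boils down to a match query against the indexed text field. A minimal sketch of building such a query body (the field name "content" and the result size are placeholders; the repo's updated ES code defines the real index mapping):

```python
def build_match_query(field, text, size=5):
    """Build a minimal Elasticsearch match-query body for full-text retrieval."""
    return {"size": size, "query": {"match": {field: text}}}

# Placeholder field name; the actual mapping comes from the repo's ES code.
body = build_match_query("content", "借款合同纠纷", size=3)
print(body)  # → {'size': 3, 'query': {'match': {'content': '借款合同纠纷'}}}
```

With the official Python client, a body like this would be sent to the search endpoint of the index the deployment creates.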

@chloroplast567

Which retrieval approach would you recommend? Deploying with PyLucene feels somewhat cumbersome.
