Trying to export a bark model, that is a custom or unsupported architecture, but no custom export configuration was passed as `custom_export_configs` #846

Open · RibalBaghdadi opened this issue Jul 26, 2024 · 0 comments
Reproduction:

```python
from transformers import AutoProcessor
from optimum.intel import OVModelForCausalLM

model_id = "suno/bark-small"
processor = AutoProcessor.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id, export=True)  # raises here
```

This logs `Framework not specified. Using pt to export the model.` and then fails with the traceback below:
```
ValueError                                Traceback (most recent call last)
in <cell line: 1>()
----> 1 model = OVModelForCausalLM.from_pretrained(model_id, export=True)

2 frames
/usr/local/lib/python3.10/dist-packages/optimum/exporters/openvino/main.py in main_export(model_name_or_path, output, task, device, framework, cache_dir, trust_remote_code, pad_token_id, subfolder, revision, force_download, local_files_only, use_auth_token, token, model_kwargs, custom_export_configs, fn_get_submodels, compression_option, compression_ratio, ov_config, stateful, convert_tokenizer, library_name, **kwargs_shapes)
    228         custom_architecture = True
    229         if custom_export_configs is None:
--> 230             raise ValueError(
    231                 f"Trying to export a {model_type} model, that is a custom or unsupported architecture, but no custom export configuration was passed as custom_export_configs. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum-intel/issues if you would like the model type {model_type} to be supported natively in the OpenVINO export."
    232             )
```
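For context, the guard that raises here (visible at lines 228-230 of the traceback) boils down to logic like the following. This is a simplified standalone sketch, not the library's actual code; `check_export_config` and `supported_types` are illustrative names introduced only to show why `bark` falls through without a custom config:

```python
# Simplified sketch of the check seen in the traceback
# (optimum/exporters/openvino/main.py, around lines 228-230).
# This is NOT the library's real implementation.

def check_export_config(model_type, supported_types, custom_export_configs=None):
    """Raise if the model type has no native export support and
    no custom export configuration was supplied."""
    custom_architecture = model_type not in supported_types
    if custom_architecture and custom_export_configs is None:
        raise ValueError(
            f"Trying to export a {model_type} model, that is a custom or "
            "unsupported architecture, but no custom export configuration "
            "was passed as custom_export_configs."
        )
    return custom_architecture

# "bark" is not among the natively supported types, so this raises ValueError:
try:
    check_export_config("bark", {"llama", "gpt2"})
except ValueError as exc:
    print(type(exc).__name__)
```

Passing any non-None `custom_export_configs` takes the second branch out of play, which is why the error message points to the custom-export guide.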
ValueError: Trying to export a bark model, that is a custom or unsupported architecture, but no custom export configuration was passed as `custom_export_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum-intel/issues if you would like the model type bark to be supported natively in the OpenVINO export.