@smile2game Thank you. Qwen is not natively supported in Transformers (though Qwen2 is: huggingface/transformers#28436). I tried running the export for Qwen-7B and got:
Traceback (most recent call last):
  File "/home/felix/miniconda3/envs/fx/bin/optimum-cli", line 8, in <module>
    sys.exit(main())
  File "/home/felix/optimum/optimum/commands/optimum_cli.py", line 163, in main
    service.run()
  File "/home/felix/optimum/optimum/commands/export/onnx.py", line 261, in run
    main_export(
  File "/home/felix/optimum/optimum/exporters/onnx/__main__.py", line 351, in main_export
    onnx_export_from_model(
  File "/home/felix/optimum/optimum/exporters/onnx/convert.py", line 1035, in onnx_export_from_model
    raise ValueError(
ValueError: Trying to export a qwen model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as `custom_onnx_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type qwen to be supported natively in the ONNX export.
which is expected. Have you checked: https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#customize-the-export-of-transformers-models-with-custom-modeling?
Related Issues
- Olmo 7B by AI2 currently does not support BetterTransformer for optimized inference
- KeyError: 'xla_fsdp_v2' in self.is_fsdp_xla_v2_enabled = args.fsdp_config["xla_fsdp_v2"]
- `Invalid fd was supplied: -1` when loading an ONNX model with optimum.onnxruntime
- Native support for Gemma
- Flash Attention v2 for RoBERTa-large
- Issue converting owlv2 model to ONNX format
- Optimum for Jetson Orin Nano
- CPU does not seem to be supported
- Llama-2-7b is failing the ONNX export with bfloat16
- Implement ORTModelForZeroShotObjectDetection
- Changes to _maybe_log_save_evaluate() not reflected in the optimum repo
- 'gemma is not supported yet with the onnx backend' when exporting on-the-fly to ONNX
- Gemma ONNX support
- tflite support for Gemma
- [BUG] Mistral feature extraction export to ONNX is broken
- [BUG] Cannot export Gemma/Mistral to ONNX/TensorRT using INT8
- Cannot download ONNX external data file from Hugging Face Hub
- Cannot download from private repository on Hugging Face using optimum 1.17
- ONNX model issue for zero-shot facebook/bart-large-mnli