Naming mismatch means I can't run this in vLLM (v0.19.0 + transformers 5.5)
#1
by allanchan339 - opened
```
TypeError: Invalid type of HuggingFace config. Expected type: <class 'vllm.transformers_utils.configs.qwen3_5.Qwen3_5Config'>, but found type: <class 'transformers.models.qwen3_5.configuration_qwen3_5.Qwen3_5TextConfig'>
```
Same error 😂
vLLM version:

```
$ vllm --version
0.19.0
```

transformers version:

```
$ uv pip list | grep transformers
Using Python 3.12.3 environment at: vllm/.venv
transformers 5.5.0
```
Serve command:

```
vllm serve ./models/qwen/qwopus3.5-27b-v3-fp8-vllm-ready \
  --dtype bfloat16 \
  --gpu-memory-utilization 0.93 \
  --max-model-len 2048 \
  --enforce-eager \
  --trust-remote-code \
  --served-model-name qwopus3.5-27b-v3
```
Error message:

```
(APIServer pid=3732473)   File "~/Developer/llm/vllm/.venv/lib/python3.12/site-packages/vllm/model_executor/models/qwen3_5.py", line 110, in get_hf_config
(APIServer pid=3732473)     return self.ctx.get_hf_config(Qwen3_5Config)
(APIServer pid=3732473)            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(APIServer pid=3732473)   File "~/Developer/llm/vllm/.venv/lib/python3.12/site-packages/vllm/multimodal/processing/context.py", line 140, in get_hf_config
(APIServer pid=3732473)     raise TypeError(
(APIServer pid=3732473) TypeError: Invalid type of HuggingFace config. Expected type: <class 'vllm.transformers_utils.configs.qwen3_5.Qwen3_5Config'>, but found type: <class 'transformers.models.qwen3_5.configuration_qwen3_5.Qwen3_5TextConfig'>
```
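For context, the traceback comes from a type check in vLLM's `get_hf_config`: the model code expects one config class, but transformers deserializes the checkpoint into a differently named one, so the check fails. Here is a minimal sketch of that logic, using hypothetical stand-in classes (not the real vLLM or transformers ones) to reproduce the same failure mode:

```python
# Stand-in classes illustrating the mismatch; the real classes live in
# vllm.transformers_utils.configs and transformers.models.qwen3_5.
class Qwen3_5Config:
    """What the vLLM model code expects."""

class Qwen3_5TextConfig:
    """What transformers actually resolves for the checkpoint."""

def get_hf_config(config, expected_type):
    # Sketch of the check: reject the config when it is not an
    # instance of the class the model implementation asked for.
    if not isinstance(config, expected_type):
        raise TypeError(
            f"Invalid type of HuggingFace config. "
            f"Expected type: {expected_type}, but found type: {type(config)}"
        )
    return config

# A Qwen3_5TextConfig instance fails the check, mirroring the error above:
try:
    get_hf_config(Qwen3_5TextConfig(), Qwen3_5Config)
except TypeError as e:
    print("TypeError:", e)
```

Because the two classes share no inheritance relationship, the `isinstance` check can never pass until vLLM's expected class name matches (or subclasses) what transformers loads.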
Hi, thanks for everything! Just want to say I hit the same bug. Is there a Docker image I should use?
Same here
same here
Same, seems like this is borked.