
Are Qwen1.5 MOE models supported? #6415

Closed
@l3utterfly

Description


I tried to convert Qwen1.5-MoE: https://huggingface.co/Qwen/Qwen1.5-MoE-A2.7B

It gives me an error message:

 python convert-hf-to-gguf.py /home/layla/src/text-generation-webui/models/Qwen1.5-MoE-A2.7B
Loading model: Qwen1.5-MoE-A2.7B
Traceback (most recent call last):
  File "/home/layla/src/llama.cpp/convert-hf-to-gguf.py", line 2296, in <module>
    main()
  File "/home/layla/src/llama.cpp/convert-hf-to-gguf.py", line 2276, in main
    model_class = Model.from_model_architecture(hparams["architectures"][0])
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/layla/src/llama.cpp/convert-hf-to-gguf.py", line 215, in from_model_architecture
    raise NotImplementedError(f'Architecture {arch!r} not supported!') from None
NotImplementedError: Architecture 'Qwen2MoeForCausalLM' not supported!
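For context on where the error comes from: `convert-hf-to-gguf.py` maps each Hugging Face architecture string to a converter class, and raises `NotImplementedError` for anything unregistered. Below is a minimal sketch of that kind of class-registry pattern, not the actual llama.cpp code; the class and method names mirror the traceback, but the body is illustrative.

```python
# Illustrative sketch of an architecture-registry lookup, similar in spirit
# to Model.from_model_architecture in convert-hf-to-gguf.py. Not the actual
# llama.cpp implementation.

class Model:
    # Maps architecture strings (from config.json) to converter classes.
    _model_classes: dict = {}

    @classmethod
    def register(cls, *names):
        """Decorator that registers a converter class under one or more names."""
        def wrapper(model_cls):
            for name in names:
                cls._model_classes[name] = model_cls
            return model_cls
        return wrapper

    @classmethod
    def from_model_architecture(cls, arch):
        try:
            return cls._model_classes[arch]
        except KeyError:
            # This is the path hit in the traceback above: the architecture
            # string was never registered.
            raise NotImplementedError(f"Architecture {arch!r} not supported!") from None


@Model.register("Qwen2ForCausalLM")
class Qwen2Model(Model):
    pass


# A registered architecture resolves to its converter class...
print(Model.from_model_architecture("Qwen2ForCausalLM").__name__)  # -> Qwen2Model

# ...while an unregistered one (e.g. "Qwen2MoeForCausalLM" before support
# landed) raises NotImplementedError, exactly as in the report.
```

So the error simply means the installed copy of the script predates support for that architecture string.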

I see others have uploaded GGUFs of the Qwen MoE models on HF, which leads me to think I'm doing something wrong.
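For anyone hitting the same error: the converter keys off the `architectures` field in the model folder's `config.json`, so you can check which string your local copy reports before converting. A small sketch, using a temporary stand-in directory in place of the real model folder:

```python
import json
import pathlib
import tempfile

# Stand-in for the downloaded model directory; a real run would point at
# something like .../models/Qwen1.5-MoE-A2.7B instead.
model_dir = pathlib.Path(tempfile.mkdtemp())
(model_dir / "config.json").write_text(
    json.dumps({"architectures": ["Qwen2MoeForCausalLM"]})
)

# This mirrors what the converter does: read config.json and take the
# first entry of "architectures" as the lookup key.
hparams = json.loads((model_dir / "config.json").read_text())
print(hparams["architectures"][0])  # -> Qwen2MoeForCausalLM
```

If that string isn't in the converter's registry, you'll get the `NotImplementedError` above regardless of how the model was downloaded.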
