Not working in LM Studio (Mac)

#1
by riddhidutta - opened

It gives this error:

```
🥲 Failed to load the model

Failed to load model

Error when loading model: ValueError: Unsupported model type: glm4v_vision
```

LM Studio version 0.3.33
MLX Community org

Yes, I get the same message using mlx-vlm 0.3.9 on the command line (as per the example in the model card):

```
File "mlx_vlm/models/glm4v/vision.py", line 270, in __init__
    raise ValueError(f"Unsupported model type: {self.model_type}")
ValueError: Unsupported model type: glm4v_vision
```
MLX Community org

When installing the latest mlx-vlm from GitHub, it gives:

```
The tokenizer you are loading from 'mlx-community_GLM-4.6V-Flash-4bit' has an incorrect regex pattern: https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Instruct-2503/discussions/84#69121093e8b480e709447d5e. This will lead to incorrect tokenization. You should set the fix_mistral_regex=True flag when loading this tokenizer to fix this issue.

ValueError: Failed to process inputs with error: PreTrainedTokenizerFast._batch_encode_plus() got an unexpected keyword argument 'images'
```

MLX Community org

Thank you to @Timmy-web

This solved the problem:

```
pip install git+https://github.com/huggingface/transformers.git
```
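To confirm that the source install of transformers was actually picked up, you can print the installed version. This is a minimal sketch; the version strings in the comments are illustrative, not from this thread:

```python
# Report the installed transformers version without importing the full library.
# A pip install from the GitHub repo typically shows a ".dev0" suffix,
# while a PyPI release shows a plain version like "4.57.1".
from importlib.metadata import version, PackageNotFoundError

try:
    print("transformers:", version("transformers"))
except PackageNotFoundError:
    print("transformers is not installed in this environment")
```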

Hi,

I got the same problem.
I created a conda env and ran `pip install -U mlx-vlm`, then ran the inference command from the model card and got the same "Unsupported model type" error.

May I know how to solve this issue?

Thanks a lot!

MLX Community org

@neil0306

I first ran `pip install -U mlx-vlm`, which gave me version 0.3.9. That is when I also got the `Unsupported model type: glm4v_vision` error.
Then I ran `git clone https://github.com/Blaizzy/mlx-vlm.git` and used that copy of mlx-vlm, which picks up the latest code changes after version 0.3.9. That's when I got the tokenization error.
Then @Timmy-web gave the hint to also install the latest transformers code, which worked.

User Noodlz wrote in another chat:
"So basically the pip version is not updated yet. The cost of the bleeding-edge life, haha.
For anyone else reading this: I uninstalled mlx-vlm with `pip uninstall mlx-vlm`, grabbed the latest with `git clone https://github.com/Blaizzy/mlx-vlm.git`, then, within that mlx-vlm folder, ran `pip install .`, which gave me the latest."
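For anyone landing here later, the whole workaround described in this thread can be sketched as one shell session. The repository URLs come from the posts above; running it inside a fresh virtualenv or conda env is advisable:

```shell
# Remove the outdated PyPI release (0.3.9), which predates glm4v support
pip uninstall -y mlx-vlm

# Install mlx-vlm from the latest source on GitHub
git clone https://github.com/Blaizzy/mlx-vlm.git
cd mlx-vlm
pip install .

# Also install transformers from source to fix the tokenizer errors
pip install git+https://github.com/huggingface/transformers.git
```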

Thank you so much!

MLX Community org

Hey guys,

This will be updated later today :)
