patch_on_device does not work with this GGUF

#1
by Zimdin12 - opened

Hmm, it does not seem to work (at least in the same workflow as the GGUFs by Phil2Sat/Qwen-Image-Edit-Rapid-AIO-GGUF).
I get the error: UnetLoaderGGUFAdvanced: cannot reshape array of size 23845472 into shape (12288,2520)

This happens when the Unet Loader (GGUF/Advanced) node's patch_on_device option is enabled.
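For reference, the reshape error itself just means the element counts don't match; a minimal repro with the sizes taken from the message above (why the converted tensor ends up with the wrong element count is the real question):

```python
import numpy as np

# 12288 * 2520 = 30,965,760 elements are expected for the target shape,
# but the stored array only has 23,845,472, so NumPy refuses to reshape it.
raw = np.zeros(23845472, dtype=np.uint8)  # size from the error message
try:
    raw.reshape(12288, 2520)              # shape from the error message
except ValueError as e:
    print(e)  # cannot reshape array of size 23845472 into shape (12288,2520)
```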

I am a newbie to quantizing; I followed the steps in this repo: https://github.com/city96/ComfyUI-GGUF.
I might have missed something that I didn't understand.
@Phil2Sat, can you help with this?
What does patch_on_device do?
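One way to sanity-check a converted file is to dump each tensor's stored shape and quant type and compare them against the expected model dimensions. A minimal sketch, assuming the gguf Python package from llama.cpp is installed (this is a generic inspection snippet, not part of the ComfyUI-GGUF tooling):

```python
from gguf import GGUFReader

# Print name, stored shape, and quantization type for every tensor in the file.
# A tensor whose shape/size disagrees with the model's expected layout is the
# likely culprit behind the reshape error.
reader = GGUFReader("Qwen-Image-Edit-Q4_K_M.gguf")  # hypothetical filename
for t in reader.tensors:
    print(t.name, list(t.shape), t.tensor_type.name, t.n_elements)
```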

Try this: https://github.com/phil2sat/convert

This is the auto convert .py tool with my custom patches.

Thanks, @Phil2Sat. I will use this.
May I know what patch_on_device does?
I have not used it until now.

I also don't know, but since my version of the convert script works, I guess I did it right.

city96 also doesn't include the 2509 low-quant fix; generated images below Q4_K_M look horrible. I fixed it in the link I posted; that is the current version I use myself.

Here is the thread about it: https://huggingface.co/QuantStack/Qwen-Image-Edit-2509-GGUF/discussions/6 but my fix there is old.
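For anyone curious, fixes like this generally work by pinning a handful of quality-sensitive tensors to a higher precision type during conversion instead of letting them drop to the low quant. A generic illustration of that pattern (the name patterns and types here are made up for the example, not the actual patch):

```python
# Hypothetical illustration of a "low quant fix": keep sensitive tensors at
# higher precision even when the rest of the file is quantized aggressively.
SENSITIVE_PATTERNS = ("img_in.", "txt_in.", "final_layer.")  # example names only

def pick_quant_type(tensor_name: str, target_type: str) -> str:
    """Keep sensitive tensors at F16; quantize everything else to the target."""
    if any(p in tensor_name for p in SENSITIVE_PATTERNS):
        return "F16"
    return target_type

print(pick_quant_type("img_in.weight", "Q3_K"))                           # F16
print(pick_quant_type("transformer_blocks.10.attn.to_q.weight", "Q3_K"))  # Q3_K
```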

@Zimdin12, try using the v11.4 GGUFs. I quantized them using @Phil2Sat's tools.

Will try when I'm home.

It seems to work. Thank you!

Zimdin12 changed discussion status to closed
