Loading model locally

#1
by GardensOfBabylon29 - opened

Is there a guide on how to load it locally using Python? No ComfyUI, only pure code.

Technically, you can run ComfyUI workflows as pure Python scripts as well.

Yes, it is possible.

Clone and build this -> https://github.com/leejet/stable-diffusion.cpp

This is possible on Windows, but it is easiest on Linux.

Then download the diffusion model, the VAE, and the LLM text encoder.

Then run it using something like this (this is a Linux command; a similar one can be run in Windows CMD by using ^ instead of \ at the end of each line):

./bin/sd-cli \
--diffusion-model /home/user/Projects/stable-diffusion/Models/city96/Qwen-Image-gguf/qwen-image-Q4_K_M.gguf \
--vae /home/user/Projects/stable-diffusion/Models/city96/Qwen-Image-gguf/qwen_image_vae.safetensors \
--llm /home/user/Projects/stable-diffusion/Models/city96/Qwen-Image-gguf/Qwen2.5-VL-7B-Instruct-UD-Q4_K_XL.gguf \
-p "" \
--cfg-scale 1.0 \
--sampling-method euler \
--steps 10 \
-v \
--offload-to-cpu \
--vae-tiling \
--vae-on-cpu \
--clip-on-cpu \
--diffusion-fa \
-W 1080 \
-H 1920 \
-o output_%03d.png
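Since the question asked for pure Python: one minimal approach is to drive the compiled binary from a Python script via subprocess. This is a sketch, not a definitive wrapper; the binary path, model directory, and file names are copied from the command above and will need adjusting to your own setup.

```python
import subprocess
from pathlib import Path

# Assumed paths -- adjust to where you built stable-diffusion.cpp
# and downloaded the model files (taken from the command above).
SD_CLI = "./bin/sd-cli"
MODEL_DIR = Path("/home/user/Projects/stable-diffusion/Models/city96/Qwen-Image-gguf")


def build_sd_command(prompt: str, width: int = 1080, height: int = 1920,
                     steps: int = 10) -> list[str]:
    """Assemble the sd-cli invocation as an argument list.

    Passing a list to subprocess avoids shell quoting issues with the prompt.
    """
    return [
        SD_CLI,
        "--diffusion-model", str(MODEL_DIR / "qwen-image-Q4_K_M.gguf"),
        "--vae", str(MODEL_DIR / "qwen_image_vae.safetensors"),
        "--llm", str(MODEL_DIR / "Qwen2.5-VL-7B-Instruct-UD-Q4_K_XL.gguf"),
        "-p", prompt,
        "--cfg-scale", "1.0",
        "--sampling-method", "euler",
        "--steps", str(steps),
        "-v",
        "--offload-to-cpu",
        "--vae-tiling",
        "--vae-on-cpu",
        "--clip-on-cpu",
        "--diffusion-fa",
        "-W", str(width),
        "-H", str(height),
        "-o", "output_%03d.png",
    ]


def generate(prompt: str) -> None:
    # check=True raises CalledProcessError if sd-cli exits non-zero.
    subprocess.run(build_sd_command(prompt), check=True)
```

You would then call `generate("a city skyline at night")` and pick up the resulting output_*.png files. Separating command construction from execution also makes it easy to print or log the exact command being run.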

Go and read up on the different flags and what they do.
