ollama support

#6
by madhutter - opened

Can't get this working on the latest Ollama 0.21.0-rc0.

3090 24GB, 64GB RAM.
Tried the Q3 and Q4 variants; all error out with: Error: 500 Internal Server Error: unable to load model:

any ideas?

Unsloth AI org

Ollama doesn't support GGUFs with separate mmproj files.

For instructions on how to run this, I'd recommend reading: https://github.com/ollama/ollama/issues/15235#issuecomment-4187108500

And change the model name to Qwen3.6, etc.
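Since the underlying problem is that Ollama can't load a GGUF with a separate mmproj file, one alternative (a hedged sketch, not from this thread) is to serve the model directly with llama.cpp's llama-server, which does accept an mmproj via its `--mmproj` flag. The filenames below are placeholders; use the actual GGUF and mmproj files you downloaded from the repo:

```shell
# Sketch: serve the GGUF plus its separate mmproj with llama-server
# instead of Ollama. Filenames are placeholders for the downloaded files.
llama-server \
  -m Qwen3.6-35B-A3B-UD-Q4_K_XL.gguf \
  --mmproj mmproj-F16.gguf \
  -c 8192 \
  --port 8080
```

This exposes an OpenAI-compatible endpoint on port 8080.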

thanks @danielhanchen

If anyone hits the same issue and is deploying Ollama with the Helm chart, you can do this:

models:
    pull:
      - hf.co/unsloth/Qwen3.6-35B-A3B-GGUF:UD-Q4_K_XL
    create:
      - name: Qwen3.6-35B-A3B-Q4_K_XL
        template: |
          # Modelfile generated by "ollama show"
          # To build a new Modelfile based on this, replace FROM with:
          # FROM hf.co/unsloth/Qwen3.6-35B-A3B-GGUF:UD-Q4_K_XL

          FROM /home/ubuntu/.ollama/models/blobs/sha256-707a55a8a4397ecde44de0c499d3e68c1ad1d240d1da65826b4949d1043f4450
          # FROM /home/ubuntu/.ollama/models/blobs/sha256-356dfaa3111376a4f7165e32e8749713378d1700b37cf52e0c50d9f23322334d
          TEMPLATE "{{ if .System }}<|im_start|>system
          {{ .System }}<|im_end|>
          {{ end }}{{ if .Prompt }}<|im_start|>user
          {{ .Prompt }}<|im_end|>
          {{ end }}<|im_start|>assistant
          <think>

          </think>

          {{ .Response }}<|im_end|>
          "
          PARAMETER stop <|im_start|>
          PARAMETER stop <|im_end|>
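If you're not using the chart, the equivalent manual steps (a sketch; the blob hash in the FROM line will differ on your machine, so check it with `ollama show`) would be:

```shell
# Manual equivalent of the chart values above:
# pull the GGUF from Hugging Face, then create a named model
# from a local Modelfile containing the TEMPLATE/PARAMETER lines shown.
ollama pull hf.co/unsloth/Qwen3.6-35B-A3B-GGUF:UD-Q4_K_XL
ollama create Qwen3.6-35B-A3B-Q4_K_XL -f Modelfile
ollama run Qwen3.6-35B-A3B-Q4_K_XL
```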
madhutter changed discussion status to closed
