
gemma-3-12b-it-vl-Polaris-AIExpert-Gemini-Heretic

This model is a nuslerp merge of the following two models, weighted 1.4 and 0.6 respectively:

  • DavidAU/gemma-3-12b-it-vl-Polaris-Heretic-Uncensored-Thinking
  • DavidAU/gemma-3-12b-it-vl-Polaris-Heretic-AIExpert-NM-Gemini250x
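The nuslerp method performs spherical linear interpolation (slerp) between two weight tensors, with the relative merge weights normalized into a single interpolation factor; here 1.4 and 0.6 normalize to t = 0.6 / 2.0 = 0.3 toward the second model. Below is a minimal, illustrative sketch of that per-tensor math, assuming NumPy; it is not mergekit's actual implementation, which operates on flattened tensors with additional options:

```python
import numpy as np

def nuslerp(w1: np.ndarray, w2: np.ndarray,
            weight1: float = 1.4, weight2: float = 0.6,
            eps: float = 1e-8) -> np.ndarray:
    """Illustrative normalized slerp of two weight tensors."""
    # Normalize the two merge weights into one interpolation factor.
    t = weight2 / (weight1 + weight2)  # 1.4 / 0.6 -> t = 0.3
    # Angle between the tensors, via their unit-normalized flat views.
    a = w1.ravel() / (np.linalg.norm(w1) + eps)
    b = w2.ravel() / (np.linalg.norm(w2) + eps)
    dot = np.clip(np.dot(a, b), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: slerp degenerates to plain lerp.
        return (1.0 - t) * w1 + t * w2
    s = np.sin(theta)
    out = (np.sin((1.0 - t) * theta) / s) * w1.ravel() \
        + (np.sin(t * theta) / s) * w2.ravel()
    return out.reshape(w1.shape)
```

For orthogonal unit tensors the result stays on the unit sphere, whereas a plain weighted average would shrink its norm; that norm preservation is the usual motivation for slerp-style merges.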

Brainwaves

          arc    arc/e  boolq  hswag  obkqa  piqa   wino
qx86-hi   0.623  0.795  0.855  0.724  0.498  0.785  0.711
qx64-hi   0.618  0.790  0.845  0.725  0.472  0.791  0.734

Perplexity (lower is better)
mxfp8     11.325 ± 0.108
qx86-hi   11.591 ± 0.113
qx64-hi   11.820 ± 0.115
mxfp4     13.850 ± 0.141

gemma-3-12b-it-vl-Polaris-Heretic-Uncensored-Thinking
          arc    arc/e  boolq  hswag  obkqa  piqa   wino
qx86-hi   0.619  0.791  0.859  0.705  0.482  0.765  0.714

gemma-3-12b-it-vl-Polaris-Heretic-AIExpert-NM-Gemini250x
          arc    arc/e  boolq  hswag  obkqa  piqa   wino
qx86-hi   0.599  0.772  0.857  0.745  0.476  0.799  0.722

Base model
gemma-3-12b-it-heretic
          arc    arc/e  boolq  hswag  obkqa  piqa   wino
qx86-hi   0.534  0.699  0.872  0.603  0.448  0.733  0.658

Build recipe

models:
  - model: gemma-3-12b-it-vl-Polaris-Heretic-Uncensored-Thinking
    parameters:
      weight: 1.4
  - model: gemma-3-12b-it-vl-Polaris-Heretic-AIExpert-NM-Gemini250x
    parameters:
      weight: 0.6
merge_method: nuslerp
dtype: bfloat16
name: gemma-3-12b-it-vl-Polaris-AIExpert-Gemini-Heretic

-G

Model size: 12B params · Tensor type: BF16 (Safetensors)
