unsloth / Hermes-4-70B-GGUF
by Unsloth AI
Tags: Transformers · GGUF · English · Llama-3.1 · unsloth · instruct · finetune · reasoning · hybrid-mode · chatml · function calling · tool use · json mode · structured outputs · atropos · dataforge · long context · roleplaying · chat · imatrix · conversational
Paper: arXiv:2508.18255
License: llama3
Repository size: 846 GB
2 contributors · History: 32 commits
Latest commit: 4409552 (verified) by danielhanchen, "Upload folder using huggingface_hub", 8 months ago
All entries below were committed 8 months ago with the message "Upload folder using huggingface_hub", except README.md ("Create README.md").

Folders: Q6_K/ · Q8_0/ · UD-Q8_K_XL/

File                              Size
.gitattributes                    3.2 kB
Hermes-4-70B-IQ4_NL.gguf          40.1 GB
Hermes-4-70B-IQ4_XS.gguf          37.9 GB
Hermes-4-70B-Q2_K.gguf            26.4 GB
Hermes-4-70B-Q2_K_L.gguf          26.6 GB
Hermes-4-70B-Q3_K_M.gguf          34.3 GB
Hermes-4-70B-Q3_K_S.gguf          30.9 GB
Hermes-4-70B-Q4_0.gguf            40.1 GB
Hermes-4-70B-Q4_1.gguf            44.3 GB
Hermes-4-70B-Q4_K_M.gguf          42.5 GB
Hermes-4-70B-Q5_K_M.gguf          49.9 GB
Hermes-4-70B-UD-IQ1_M.gguf        17.1 GB
Hermes-4-70B-UD-IQ1_S.gguf        15.9 GB
Hermes-4-70B-UD-IQ2_M.gguf        24.3 GB
Hermes-4-70B-UD-IQ2_XXS.gguf      19.4 GB
Hermes-4-70B-UD-IQ3_XXS.gguf      27.7 GB
Hermes-4-70B-UD-Q2_K_XL.gguf      27 GB
Hermes-4-70B-UD-Q3_K_XL.gguf      34.8 GB
Hermes-4-70B-UD-Q4_K_XL.gguf      42.7 GB
Hermes-4-70B-UD-Q5_K_XL.gguf      49.9 GB
README.md                         10.1 kB