kshitijthakkar/loggenix-moe-1b-pretrain
Pipeline: Text Generation
Libraries: Transformers, Safetensors
Datasets: nvidia/Nemotron-CC, nvidia/Nemotron-Math, nvidia/Nemotron-Code
Language: English
Architecture: qwen3_moe
Tags: Mixture of Experts, pretrained, causal-lm, mixture-of-experts, conversational
License: apache-2.0
Files and versions
Repository size: 14 GB
1 contributor · History: 36 commits
Latest commit: kshitijthakkar, "Upload tokenizer_config.json with huggingface_hub" (eec5b66, verified, about 1 month ago)
checkpoints/            · Upload folder using huggingface_hub · about 1 month ago
eval/                   · Upload eval/step_8000/results.json with huggingface_hub · about 1 month ago
.gitattributes          · 1.52 kB · initial commit · about 1 month ago
generation_config.json  · 180 Bytes · Upload generation_config.json with huggingface_hub · about 1 month ago
tokenizer_config.json   · 13.1 kB · Upload tokenizer_config.json with huggingface_hub · about 1 month ago