Goekdeniz-Guelmez's Collections
Josiefied-Qwen3.5-Gabliterated
JOSIE-1.1
JOSIE-MoE
JOSIE
Gabliteration
Josiefied and Abliterated Models
Josiefied and Abliterated Qwen3
J.O.S.I.E.-R1
Josiefied and Gabliterated Qwen3-Instruct
Josiefied and Abliterated Qwen2.5
J.O.S.I.E.-Dev-v6.0
J.O.S.I.E.-Dev-v4o

JOSIE-MoE

updated Mar 7

JOSIE models using a custom dynamic Mixture-of-Experts architecture.

  • DynaMoE: Dynamic Token-Level Expert Activation with Layer-Wise Adaptive Capacity for Mixture-of-Experts Neural Networks

    Paper • 2603.01697 • Published Mar 2 • 2
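The paper title describes dynamic token-level expert activation: instead of routing every token to a fixed top-k of experts, each token activates a variable number of experts. As a rough illustration only (the paper's actual method, names, and thresholds are not given here), the sketch below picks experts per token in descending router-probability order until a cumulative-probability threshold is met; `dynamic_route`, `prob_threshold`, and `max_experts` are all hypothetical names, not from the paper.

```python
import numpy as np

def dynamic_route(router_logits, prob_threshold=0.5, max_experts=4):
    """Illustrative dynamic routing (not the paper's implementation):
    take experts per token, best first, until their cumulative router
    probability reaches `prob_threshold`, capped at `max_experts`.
    `router_logits` has shape (num_tokens, num_experts)."""
    # Softmax over experts for each token.
    probs = np.exp(router_logits - router_logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    order = np.argsort(-probs, axis=-1)  # expert indices, best first
    routes = []
    for t in range(probs.shape[0]):
        chosen, mass = [], 0.0
        for e in order[t]:
            chosen.append(int(e))
            mass += probs[t, e]
            if mass >= prob_threshold or len(chosen) == max_experts:
                break
        routes.append(chosen)
    return routes

# A confident (peaked) token activates one expert; an uncertain
# (near-uniform) token activates more.
logits = np.array([[4.0, 0.0, 0.0, 0.0],
                   [1.0, 1.0, 1.0, 1.0]])
print(dynamic_route(logits))
```

The "layer-wise adaptive capacity" in the title suggests the cap (`max_experts` here) would also vary per layer; the fixed cap above is a simplification.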