
demiant/sae-gemma-2-2b-multistage-unfrozen-tied-20x-l1-jump-positive-ortho-l1-10

SAELens

Instructions to use demiant/sae-gemma-2-2b-multistage-unfrozen-tied-20x-l1-jump-positive-ortho-l1-10 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.

  • Libraries
  • SAELens

    How to use demiant/sae-gemma-2-2b-multistage-unfrozen-tied-20x-l1-jump-positive-ortho-l1-10 with SAELens:

    # pip install sae-lens
    from sae_lens import SAE

    sae, cfg_dict, sparsity = SAE.from_pretrained(
        release="RELEASE_ID",  # e.g. "gpt2-small-res-jb"; see the full list at
        # https://github.com/jbloomAus/SAELens/blob/main/sae_lens/pretrained_saes.yaml
        sae_id="SAE_ID",  # e.g. "blocks.8.hook_resid_pre"; won't always be a hook point
    )
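To show what the loaded SAE computes, here is a minimal NumPy sketch of a tied-weight JumpReLU forward pass, the architecture suggested by the repo name ("tied", "20x", "jump"). All shapes, names, and the threshold value below are illustrative assumptions, not the actual checkpoint's parameters:

```python
# Illustrative sketch of a tied-weight JumpReLU SAE forward pass.
# Shapes and values are toy assumptions, not the real checkpoint's.
import numpy as np

rng = np.random.default_rng(0)

d_in = 8            # residual-stream width (toy; Gemma-2-2b uses 2304)
d_sae = 20 * d_in   # "20x" expansion factor from the repo name

W_enc = rng.normal(scale=0.1, size=(d_in, d_sae))
b_enc = np.zeros(d_sae)
b_dec = np.zeros(d_in)
theta = 0.05        # JumpReLU threshold (learned per-feature in practice)

def encode(x):
    pre = x @ W_enc + b_enc
    return pre * (pre > theta)   # JumpReLU: keep only activations above threshold

def decode(z):
    return z @ W_enc.T + b_dec   # tied decoder: W_dec = W_enc.T

x = rng.normal(size=(4, d_in))   # a batch of 4 activation vectors
z = encode(x)                    # sparse feature activations, shape (4, d_sae)
x_hat = decode(z)                # reconstruction, shape (4, d_in)
```

The tied constraint halves the parameter count relative to a free decoder, and the JumpReLU threshold zeroes out small activations, which is how the sparsity penalty named in the repo takes effect.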
  • Notebooks
  • Google Colab
  • Kaggle
sae-gemma-2-2b-multistage-unfrozen-tied-20x-l1-jump-positive-ortho-l1-10 / 614404096
340 MB
  • 1 contributor
History: 1 commit
demiant
Upload SAE 614404096
da208b3 verified over 1 year ago
  • cfg.json
    2.45 kB
    Upload SAE 614404096 over 1 year ago
  • sae_weights.safetensors
    340 MB
    Upload SAE 614404096 over 1 year ago
  • sparsity.safetensors
    73.8 kB
    Upload SAE 614404096 over 1 year ago
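The cfg.json file in the listing above stores the SAE's configuration. As a hedged sketch of working with such a config using only the standard library, the keys below (`d_in`, `expansion_factor`, `hook_name`) are hypothetical examples, not the checkpoint's actual field names:

```python
# Illustrative only: parse an SAE config in the style of the repo's cfg.json.
# The field names are hypothetical, not the checkpoint's real keys.
import json

cfg_text = """
{
  "d_in": 2304,
  "expansion_factor": 20,
  "hook_name": "blocks.12.hook_resid_post"
}
"""

cfg = json.loads(cfg_text)
d_sae = cfg["d_in"] * cfg["expansion_factor"]  # number of SAE latent features
print(d_sae)
```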