aws-neuron/optimum-neuron-cache
Organization: AWS Inferentia and Trainium
License: apache-2.0
optimum-neuron-cache / inference-cache-config (9.54 kB, at commit 7e577ee)
5 contributors · History: 9 commits
Latest commit: 9164704 (verified), "Add Zephyr to mistral variants" by dacorvo (HF Staff), about 2 years ago
File                   Size       Last commit message                              Last modified
gpt2.json              278 Bytes  Create gpt2.json                                 about 2 years ago
llama-variants.json    2.62 kB    Add most popular llama variants                  about 2 years ago
llama.json             2.3 kB     Added Llama-70b batch_size 4 to inference cache  about 2 years ago
mistral-variants.json  3.57 kB    Add Zephyr to mistral variants                   about 2 years ago
mistral.json           769 Bytes  Remove variants from main mistral config         about 2 years ago
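The JSON files in this folder describe model configurations to pre-compile for the Neuron inference cache. The listing itself does not show their schema; as an illustration only, a hypothetical entry for one model might look like the sketch below (all field names except batch_size, which appears in a commit message above, are assumptions, not confirmed by this repository):

```json
{
  "gpt2": [
    {
      "batch_size": 1,
      "sequence_length": 1024,
      "num_cores": 2
    }
  ]
}
```

Consult the optimum-neuron documentation for the actual format expected by the cache tooling.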