Microscopic-Mistral
Self-trained microscopic Mistral, around 810M parameters.
The tokenizer is the one from https://huggingface.co/mistralai/Mistral-7B-v0.1.
It is being trained on around 400B tokens; this checkpoint is at step 18k.
The evaluation is being conducted now.
This model is available under the Apache 2.0 License.
Join our Discord server here.
Eager to buy me a $2 coffee or iced tea? Sure, here is the link: https://ko-fi.com/drnicefellow. Please add a note on which one you want me to drink.
Install from pip and serve the model
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "DrNicefellow/Microscopic-Mistral-18k-steps"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
    -H "Content-Type: application/json" \
    --data '{
        "model": "DrNicefellow/Microscopic-Mistral-18k-steps",
        "prompt": "Once upon a time,",
        "max_tokens": 512,
        "temperature": 0.5
    }'
```
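The same endpoint can also be called from Python. A minimal sketch using the third-party `requests` library is below; the server URL, model name, and sampling parameters mirror the curl example above, and the `build_completion_request` / `complete` helper names are illustrative, not part of any library:

```python
import requests


def build_completion_request(prompt: str, max_tokens: int = 512,
                             temperature: float = 0.5) -> dict:
    # Mirrors the curl payload for vLLM's OpenAI-compatible
    # /v1/completions endpoint.
    return {
        "model": "DrNicefellow/Microscopic-Mistral-18k-steps",
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }


def complete(prompt: str, base_url: str = "http://localhost:8000") -> str:
    # Assumes a vLLM server started as shown above is listening on base_url.
    resp = requests.post(
        f"{base_url}/v1/completions",
        json=build_completion_request(prompt),
        timeout=120,
    )
    resp.raise_for_status()
    # The OpenAI-style response carries the generated text in choices[0].text.
    return resp.json()["choices"][0]["text"]
```

Call `complete("Once upon a time,")` while the server is running to get the generated continuation as a string.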