# DevOps Incident Responder (GGUF)

A fine-tuned Mistral-NeMo-Minitron-8B-Instruct model for DevOps incident diagnosis and resolution.
## Model Description

The model analyzes error logs, stack traces, and incident descriptions to provide:
- Root cause analysis
- Severity assessment
- Step-by-step fixes with exact commands
- Prevention guidance
**Tech coverage:** Kubernetes, Docker, Terraform, Azure, GCP, Node.js, Redis, MongoDB, Nginx, PostgreSQL, InfluxDB
## Training Details
- Base Model: nvidia/Mistral-NeMo-Minitron-8B-Instruct
- Method: QLoRA (4-bit quantization + LoRA adapters)
- Dataset: 4,755 examples (scraped + synthetic)
- Epochs: 2
- LoRA Rank: 32, Alpha: 64
- Quantization: Q4_K_M (4.8 GB)
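The QLoRA setup above can be sketched with the `peft` and `bitsandbytes` libraries. Only the rank, alpha, and 4-bit quantization come from this card; the dropout value and target modules below are assumptions for illustration:

```python
import torch
from transformers import BitsAndBytesConfig
from peft import LoraConfig

# 4-bit NF4 quantization of the frozen base model (the "Q" in QLoRA)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# LoRA adapters trained on top; rank and alpha match the card above
lora_config = LoraConfig(
    r=32,
    lora_alpha=64,
    lora_dropout=0.05,  # assumption: not stated in the card
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumption
    task_type="CAUSAL_LM",
)
```

Both configs would then be passed to `transformers` / an SFT trainer when loading the base model and attaching the adapters.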
## Usage with Ollama

1. Download the GGUF file.
2. Create a Modelfile.
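A minimal Modelfile might look like the following; the GGUF filename and the parameter values are assumptions, so adjust them to the file you downloaded:

```
# FROM points at the downloaded GGUF (filename assumed)
FROM ./incident-responder-q4_k_m.gguf

# Low temperature for deterministic diagnostic output (assumed value)
PARAMETER temperature 0.2
```

Then register and run it with `ollama create incident-responder -f Modelfile` followed by `ollama run incident-responder`.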
## Model Tree

Model tree for irfanalee/incident-responder-gguf:

- Base model: nvidia/Mistral-NeMo-Minitron-8B-Base
- Fine-tuned instruct model: nvidia/Mistral-NeMo-Minitron-8B-Instruct