MedGemma-4b-ICD
A fine-tuned version of google/medgemma-4b-it for automated ICD-10 medical coding from clinical text. Given a clinical note or diagnosis description, this model generates the corresponding ICD-10 code(s).
Live Demo: spaces/abnuel/med-coding
Model Description
ICD (International Classification of Diseases) coding is a critical but labor-intensive clinical workflow. This model was fine-tuned using supervised fine-tuning (SFT) with TRL on a curated dataset of clinical text paired with ICD-10 codes, enabling automated code suggestion from free-text diagnoses and clinical documentation.
- Base model: google/medgemma-4b-it
- Fine-tuning method: SFT (Supervised Fine-Tuning) via TRL
- Task: ICD-10 code generation from clinical text
- Domain: Clinical NLP / Healthcare AI
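Chat-style SFT with TRL expects training pairs in a messages format; the sketch below shows one plausible layout for the (clinical text, ICD-10 code) pairs described above. The field names and the example diagnosis/code are illustrative assumptions, not the actual training data:

```python
def to_chat_example(diagnosis_text: str, icd10_code: str) -> dict:
    """Wrap a (clinical text, ICD-10 code) pair in the messages
    format consumed by chat-style SFT (e.g. TRL's SFTTrainer)."""
    return {
        "messages": [
            {"role": "user", "content": diagnosis_text},
            {"role": "assistant", "content": icd10_code},
        ]
    }

example = to_chat_example(
    "Type 2 diabetes mellitus with diabetic chronic kidney disease, stage 3.",
    "E11.22",
)
```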
Intended Uses
- Assisting medical coders with ICD-10 code lookup from clinical notes
- Supporting clinical decision support systems
- Research into automated medical coding pipelines
How to Use
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_id = "abnuel/MedGemma-4b-ICD"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # load weights in bfloat16
    device_map="auto",           # shard across available devices
)

prompt = "Patient presents with type 2 diabetes mellitus with diabetic chronic kidney disease, stage 3."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
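Because the model returns free text, downstream code typically needs to pull the code(s) out of the generated string. A minimal extraction helper is sketched below; the regex is a deliberate simplification of the ICD-10 code format (it over-matches some letter+digit tokens) and is an assumption, not part of the model:

```python
import re

# Approximate ICD-10 shape: a letter, two digits, optional dot + 1-4 alphanumerics.
ICD10_PATTERN = re.compile(r"\b[A-Z][0-9]{2}(?:\.[0-9A-Z]{1,4})?\b")

def extract_icd10_codes(generated_text: str) -> list[str]:
    """Return all ICD-10-like tokens found in the model output."""
    return ICD10_PATTERN.findall(generated_text)

codes = extract_icd10_codes("Suggested codes: E11.22 and N18.30.")
# codes == ["E11.22", "N18.30"]
```

In practice the extracted tokens should still be checked against an authoritative ICD-10 table before use.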
Training Details
- Fine-tuning framework: TRL (Transformer Reinforcement Learning)
- Method: Supervised Fine-Tuning (SFT)
- Hardware: GPU (CUDA)
- Base model license: Gemma terms of use
Limitations
- Performance may vary on clinical notes with uncommon or highly specialized terminology.
- Should not be used as a sole source of truth for billing or clinical decision-making without human review.
- Trained on a specific dataset; generalization to all ICD-10 editions and specialties has not been fully evaluated.
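One way to enforce the human-review caveat above is to gate suggestions against an authoritative code list and route anything unrecognized to a coder. A minimal sketch, where the code set is an illustrative stub rather than a real ICD-10 table:

```python
# Stub code list; in practice, load the full ICD-10 table for your edition.
VALID_ICD10_CODES = {"E11.22", "N18.30", "I10"}

def triage_suggestion(code: str, valid_codes: set) -> str:
    """Surface known codes as suggestions; flag everything else for manual review."""
    return "suggest" if code in valid_codes else "needs_review"
```

Even "suggest"-tier codes should remain advisory: the card above requires human review before billing or clinical use.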
Related Models & Resources
- abnuel/MedGemma-4b-ICD-Coder – companion model checkpoint
- abnuel/fine-tuned-openbiollm-medical-coding – Llama3-OpenBioLLM-8B fine-tuned on the same task
Citation
If you use this model in your research, please cite:
```bibtex
@misc{adegunlehin2025medgemma-icd,
  author = {Abayomi Adegunlehin},
  title  = {MedGemma-4b-ICD: Fine-tuned MedGemma for ICD-10 Medical Coding},
  year   = {2025},
  url    = {https://huggingface.co/abnuel/MedGemma-4b-ICD}
}
```