πŸ“± Gemma 3 270M Form Generator - LoRA Adapter

A LoRA adapter for generating form definitions in JSON format. Trained with the Unsloth framework for maximum efficiency.

🎯 Model Info

  • Base Model: google/gemma-3-270m-it
  • Training: Unsloth, pure BF16 (no quantization)
  • LoRA Rank: 128
  • Dataset: bhismaperkasa/form_dinamis
  • Language: Bahasa Indonesia
  • Epochs: 4
  • Size: ~50 MB (adapter only)

πŸš€ Usage

Load Adapter

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel
import torch

# Load base model
base_model = AutoModelForCausalLM.from_pretrained(
    "google/gemma-3-270m-it",
    torch_dtype=torch.bfloat16,
    device_map="auto"
)

# Load adapter
model = PeftModel.from_pretrained(
    base_model,
    "bhismaperkasa/gemma-3-270m-form-generator-adapter_unslothed"
)
model.eval()

tokenizer = AutoTokenizer.from_pretrained("bhismaperkasa/gemma-3-270m-form-generator-adapter_unslothed")

Generate Form

prompt = "<start_of_turn>user\nbuatkan form login<end_of_turn>\n<start_of_turn>model\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    temperature=0.7,
    top_p=0.95,
    do_sample=True
)

# Decode only the newly generated tokens. Note: with skip_special_tokens=True
# the <start_of_turn> markers are stripped, so splitting on them would fail.
result = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[-1]:],
    skip_special_tokens=True,
)
print(result.strip())
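The model emits a form definition as JSON, so the decoded completion can be parsed directly. A minimal sketch of validating one; the field names and schema here are hypothetical, since the real schema follows the bhismaperkasa/form_dinamis dataset:

```python
import json

# Hypothetical completion; the actual schema depends on the
# bhismaperkasa/form_dinamis training data and may differ.
completion = (
    '{"title": "Login", "fields": ['
    '{"name": "username", "type": "text", "required": true}, '
    '{"name": "password", "type": "password", "required": true}]}'
)

# Raises json.JSONDecodeError if the model drifted from valid JSON.
form = json.loads(completion)
print(form["title"])                        # Login
print([f["name"] for f in form["fields"]])  # ['username', 'password']
```

Parsing inside a try/except around `json.JSONDecodeError` is advisable in production, since a 270M model can occasionally produce malformed output.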

πŸ“Š Training Details

  • Framework: Unsloth (2x faster, 60% less VRAM)
  • Precision: BF16 (pure, no quantization)
  • Batch Size: 8
  • Learning Rate: 5e-5
  • Optimizer: AdamW 8-bit
  • Final Loss: ~0.23-0.25
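The hyperparameters above map roughly onto a TRL `SFTConfig` like the following. This is a hedged reconstruction (a config fragment), not the actual training script: the Unsloth model setup (e.g. attaching the rank-128 LoRA) is omitted, and `output_dir` is a placeholder:

```python
from trl import SFTConfig

# Assumed reconstruction of the training config from the numbers above.
config = SFTConfig(
    output_dir="gemma-3-270m-form-generator",  # placeholder
    per_device_train_batch_size=8,   # Batch Size: 8
    num_train_epochs=4,              # Epochs: 4
    learning_rate=5e-5,              # Learning Rate: 5e-5
    optim="adamw_8bit",              # Optimizer: AdamW 8-bit
    bf16=True,                       # pure BF16, no quantization
)
```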

πŸŽ“ Why LoRA Adapter?

This repo ships the adapter rather than a merged model because:

  • βœ… Smaller size (~50 MB vs ~540 MB)
  • βœ… Easy to switch base models
  • βœ… Better for experimentation
  • βœ… Can combine multiple adapters

πŸ”— Related Models

  • Merged Version: bhismaperkasa/gemma-3-270m-form-generator-bf16 (if available)

βš–οΈ License

Apache 2.0 (following Gemma license)
