NLLB-200 (distilled-600M) — Mizo Dictionary LoRA
Base model: facebook/nllb-200-distilled-600M
Direction tuned: eng_Latn -> lus_Latn (dictionary-style targets)
Usage (pseudocode): load the base model, wrap it with PeftModel.from_pretrained(base, this_adapter),
set forced_bos_token_id to tok.convert_tokens_to_ids('lus_Latn'), then generate.
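
The pseudocode above can be sketched as a runnable Python function. This is a minimal sketch, not the card's official script: the adapter path argument is a placeholder (pass this repo's id or a local checkout), and the source text "water" is an arbitrary example.

```python
def translate_en_to_mizo(text, adapter_path):
    """Translate English -> Mizo using the NLLB base model plus this LoRA adapter.

    adapter_path is a placeholder: pass this adapter repo's id or a local path.
    Requires the third-party packages transformers, peft, and torch.
    """
    # Imported lazily so the sketch stays importable without the heavy deps.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
    from peft import PeftModel

    base_id = "facebook/nllb-200-distilled-600M"
    # src_lang tells the NLLB tokenizer to prepend the English source tag.
    tok = AutoTokenizer.from_pretrained(base_id, src_lang="eng_Latn")
    base = AutoModelForSeq2SeqLM.from_pretrained(base_id)
    # Attach the LoRA weights on top of the frozen base model.
    model = PeftModel.from_pretrained(base, adapter_path)

    inputs = tok(text, return_tensors="pt")
    out = model.generate(
        **inputs,
        # Force the decoder to start with the Mizo (lus_Latn) language token.
        forced_bos_token_id=tok.convert_tokens_to_ids("lus_Latn"),
        max_new_tokens=48,
    )
    return tok.batch_decode(out, skip_special_tokens=True)[0]


if __name__ == "__main__":
    # "path/to/this-adapter" is a placeholder for the actual adapter location.
    print(translate_en_to_mizo("water", "path/to/this-adapter"))
```

Since the adapter was tuned on dictionary-style targets, outputs lean toward short gloss-like translations rather than free sentences.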
Pushed on 2025-11-12.