MMS-TTS Lamba (Togo) – Eyaa-Tom Fine-tuned

A fine-tuned version of facebook/mms-tts-las, trained on the Eyaa-Tom dataset for Lamba (Togo) (ISO 639-3: las).


Language Details

| Field | Value |
|---|---|
| Language | Lamba (Togo) |
| ISO 639-3 (MMS) | las |
| Region | Togo |
| Family | Gur (Niger-Congo) |
| Base model | facebook/mms-tts-las |

Training Statistics

| Metric | Value |
|---|---|
| Training samples | 45 |
| Validation samples | 8 |
| Best validation mel-L1 | 3.6687 |
| Uploaded variant | best |

Usage

```python
from transformers import VitsModel, VitsTokenizer
import torch
import torchaudio

model = VitsModel.from_pretrained("Umbaji001/eyaa-tom-mms-tts-las")
tokenizer = VitsTokenizer.from_pretrained("Umbaji001/eyaa-tom-mms-tts-las")

# Tokenize the input text and synthesize a waveform.
inputs = tokenizer("your text here", return_tensors="pt")
with torch.no_grad():
    waveform = model(**inputs).waveform[0]  # shape: (num_samples,)

# Save as WAV at the model's native sampling rate.
torchaudio.save("output.wav", waveform.unsqueeze(0), model.config.sampling_rate)
```

Training Details

  • Loss: mel-spectrogram L1 (works around the VITS training restriction)
  • Optimizer: AdamW (lr=2e-4, betas=(0.8, 0.99))
  • Scheduler: ExponentialLR (γ=0.999)
  • Epochs: 6 | Batch size: 4 (effective 16 with gradient accumulation)

Citation

```bibtex
@article{pratap2023mms,
  title={Scaling Speech Technology to 1,000+ Languages},
  author={Pratap, Vineel and others},
  journal={arXiv preprint arXiv:2305.13516},
  year={2023}
}
```

Fine-tuned: 2026-02-25 (Eyaa-Tom project)

Model size: 36.3M parameters (F32, Safetensors)