Model Card for unige-fti/Llama-Aladdin-8B
Built with Llama.
Model Details
Model Description
Llama-Aladdin-8B is a fine-tuned version of meta-llama/Llama-3.1-8B for multidialectal Arabic generation, developed by the Aladdin-FTI team for the AMIYA shared task (closed track), co-located with VarDial at EACL 2026. Parts of this model card were generated automatically and remain to be completed.
- Developed by: Jonathan Mutal, Perla Al Almaoui, Simon Hengchen, and Pierrette Bouillon (University of Geneva, FTI)
- Funded by [optional]: [More Information Needed]
- Shared by [optional]: unige-fti
- Model type: Decoder-only causal language model (Llama 3.1 architecture), fine-tuned for multidialectal Arabic generation
- Language(s) (NLP): Arabic (Modern Standard Arabic and multiple regional dialects)
- License: [More Information Needed]
- Finetuned from model [optional]: meta-llama/Llama-3.1-8B
Model Sources
- Repository: GitHub repository
- Paper: https://arxiv.org/abs/2602.16290
- Demo [optional]: [More Information Needed]
Uses
Direct Use
[More Information Needed]
Downstream Use [optional]
[More Information Needed]
Out-of-Scope Use
[More Information Needed]
Bias, Risks, and Limitations
[More Information Needed]
Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
How to Get Started with the Model
Use the code below to get started with the model.
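The snippet below is a minimal sketch for loading the model with the 🤗 transformers library. The prompt text, decoding settings, and dtype are illustrative assumptions; the prompt/instruction format used during fine-tuning is not documented in this card.

```python
# Minimal sketch: load unige-fti/Llama-Aladdin-8B with 🤗 transformers.
# The prompt and generation settings below are illustrative assumptions,
# not the authors' documented usage.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "unige-fti/Llama-Aladdin-8B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # adjust dtype for your hardware
    device_map="auto",
)

# Example prompt (assumption): ask for a dialectal Arabic rendering of an MSA sentence.
prompt = "اكتب الجملة التالية باللهجة المغربية: كيف حالك اليوم؟"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Print only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

If the repository ships a chat template, `tokenizer.apply_chat_template` can be used to format the prompt instead of the raw string above.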
Training Details
Training Data
Closed-track training data only. The datasets span multiple Arabic dialect regions and domains.
Parallel corpora:
- SauDial
- Casablanca corpus
- JODA
- UFAL Levantine
- DODA
- Atlas
Monolingual dialect corpora:
- MADAR
- Shami
- Saudi Tweets
- EDGAD / EDC
- HABIBI lyrics
Training Procedure
Preprocessing [optional]
[More Information Needed]
Training Hyperparameters
- Training regime: [More Information Needed]
Speeds, Sizes, Times [optional]
[More Information Needed]
Evaluation
Testing Data, Factors & Metrics
Testing Data
[More Information Needed]
Factors
[More Information Needed]
Metrics
[More Information Needed]
Results
[More Information Needed]
Summary
Citation
If you use this model in your research, please cite the following paper:
@inproceedings{mutal2026aladdinfti,
  title     = {Aladdin-FTI @ AMIYA: Three Wishes for Arabic NLP: Fidelity, Diglossia, and Multidialectal Generation},
  author    = {Mutal, Jonathan and Al Almaoui, Perla and Hengchen, Simon and Bouillon, Pierrette},
  booktitle = {Proceedings of the AMIYA Shared Task, co-located with VarDial at EACL 2026},
  year      = {2026},
  address   = {Rabat, Morocco},
  publisher = {Association for Computational Linguistics},
}
Glossary [optional]
[More Information Needed]
More Information [optional]
[More Information Needed]
Model Card Authors [optional]
[More Information Needed]
Model Card Contact
[More Information Needed]