</a>
</p>
**AraMix family:** [AraMix](https://huggingface.co/datasets/AdaMLLab/AraMix) (base) | [AraMix-domain-classified](https://huggingface.co/datasets/AdaMLLab/AraMix-domain-classified) (with domain labels) | [AraMix-HQ](https://huggingface.co/datasets/AdaMLLab/AraMix-HQ) (model-filtered)
AraMix ([arXiv:2512.18834](https://arxiv.org/abs/2512.18834)) is an Arabic pretraining corpus containing 178 billion tokens across 179 million documents (in the `minhash_deduped` subset). Rather than scraping the web again, AraMix combines seven publicly available Arabic datasets, applies Arabic-specific quality filtering, and performs cross-dataset deduplication.
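The cross-dataset deduplication is MinHash-based (hence the `minhash_deduped` subset name). A minimal sketch of the underlying idea, not AraMix's actual pipeline; the shingle size and hash count below are illustrative:

```python
import hashlib

def minhash_signature(text, num_hashes=64, shingle_size=5):
    """MinHash signature over character shingles: for each of num_hashes
    seeded hash functions, keep the minimum hash over all shingles."""
    shingles = {text[i:i + shingle_size]
                for i in range(max(1, len(text) - shingle_size + 1))}
    signature = []
    for seed in range(num_hashes):
        salt = seed.to_bytes(8, "big")  # seeds the hash family via blake2b's salt
        signature.append(min(
            int.from_bytes(
                hashlib.blake2b(s.encode("utf-8"), digest_size=8, salt=salt).digest(),
                "big")
            for s in shingles
        ))
    return signature

def jaccard_estimate(sig_a, sig_b):
    """Fraction of matching signature slots estimates Jaccard similarity;
    documents above a chosen threshold are treated as near-duplicates."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)
```

In a real pipeline the signatures are banded into an LSH index so that only candidate pairs sharing a band are compared, which keeps cross-dataset deduplication tractable at hundreds of millions of documents.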
We train a 1.4B parameter language model with nanotron on 30 billion tokens to show that the `matched` subset of AraMix outperforms the previous state-of-the-art model-free approach, [ArabicWeb24](https://huggingface.co/datasets/lightonai/ArabicWeb24) (see [Appendix A9 in the FineWeb-2 paper](https://arxiv.org/pdf/2506.20920)). Furthermore, the `minhash_deduped` subset performs on par while containing nearly 5 times the total number of tokens.