Model Description

GemmaX2-28-2B-Pretrain is a language model developed through continual pretraining of Gemma2-2B on a mix of 56 billion tokens of monolingual and parallel data covering 28 languages. Please find more details in our paper: Multilingual Machine Translation with Open Large Language Models at Practical Scale: An Empirical Study.

  • Developed by: Xiaomi
  • Model type: GemmaX2-28-2B-Pretrain is obtained by continually pretraining Gemma2-2B on a large amount of monolingual and parallel data. Subsequently, GemmaX2-28-2B-v0.1 is derived through supervised finetuning on a small set of high-quality translation instruction data.
  • Languages: Arabic, Bengali, Czech, German, English, Spanish, Persian, French, Hebrew, Hindi, Indonesian, Italian, Japanese, Khmer, Korean, Lao, Malay, Burmese, Dutch, Polish, Portuguese, Russian, Thai, Tagalog, Turkish, Urdu, Vietnamese, Chinese.

Note that GemmaX2-28-2B-Pretrain is NOT a translation model; for translation, use the instruction-tuned GemmaX2-28-2B-v0.1 instead.
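
As a base checkpoint, GemmaX2-28-2B-Pretrain can still be loaded for free-form text continuation with standard tooling. The sketch below is illustrative only, assuming the Hugging Face transformers API (version 4.42 or later for Gemma2 support) and a bfloat16-capable GPU; it is not an official example from the paper.

    # Minimal sketch: load the base checkpoint for text continuation.
    # Assumes transformers >= 4.42 (Gemma2 support); adjust dtype/device as needed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "ModelSpace/GemmaX2-28-2B-Pretrain"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )

    # A base model continues text rather than following instructions,
    # so prompt it with a prefix to complete, not a command.
    prompt = "Machine translation quality depends on"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=50)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))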

Training Data

We collect monolingual data from CulturaX and MADLAD-400. For parallel data, we collect all Chinese-centric and English-centric parallel datasets from the OPUS collection up to August 2024 and apply a series of filtering steps, including language identification, semantic deduplication, and quality filtering.
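
The exact filtering tooling is not specified here; as one illustration of the language-identification step, the sketch below uses fastText's off-the-shelf lid.176 model (download lid.176.bin from fasttext.cc) as an assumed stand-in for the paper's actual pipeline.

    # Hedged sketch of a language-identification filter for parallel pairs.
    # fastText's lid.176 model is an assumed stand-in for the paper's tooling.
    import fasttext

    lid_model = fasttext.load_model("lid.176.bin")  # path to the downloaded model

    def keep_pair(src, tgt, src_lang, tgt_lang, threshold=0.9):
        """Keep a sentence pair only if both sides match their expected language."""
        for text, lang in ((src, src_lang), (tgt, tgt_lang)):
            labels, probs = lid_model.predict(text.replace("\n", " "))
            if labels[0] != f"__label__{lang}" or probs[0] < threshold:
                return False
        return True

    # Example: an English-Chinese pair from an OPUS-style corpus.
    print(keep_pair("I love machine translation.", "我爱机器翻译。", "en", "zh"))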

Citation

@misc{cui2025multilingualmachinetranslationopen,
      title={Multilingual Machine Translation with Open Large Language Models at Practical Scale: An Empirical Study}, 
      author={Menglong Cui and Pengzhi Gao and Wei Liu and Jian Luan and Bin Wang},
      year={2025},
      eprint={2502.02481},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2502.02481}, 
}