Languages:

  • Source language: English

  • Target language: isiZulu

Model Details:

  • Model: Transformer

  • Architecture: MarianMT

  • Pre-processing: normalization + SentencePiece
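
Since the model follows the MarianMT architecture with a SentencePiece tokenizer, it should load through the standard MarianMT classes in the Hugging Face transformers library. The sketch below is an assumed usage example rather than an official one, and the repository id is a placeholder:

```python
# Minimal inference sketch for a MarianMT English -> isiZulu model.
# The model id below is a placeholder assumption; replace it with this repo's id on the Hub.
from transformers import MarianMTModel, MarianTokenizer

model_id = "your-username/en-zu-marianmt"  # placeholder, not the real repo id

tokenizer = MarianTokenizer.from_pretrained(model_id)  # SentencePiece-based tokenizer
model = MarianMTModel.from_pretrained(model_id)

src_text = ["The weather is beautiful today."]
batch = tokenizer(src_text, return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```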

Pre-trained Model:

Corpus:

Benchmark:

Benchmark    Train    Test
Umsuka       17.61    13.73
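
The Train and Test columns are presumably BLEU scores on the Umsuka English-isiZulu parallel corpus; the evaluation script itself is not given here. A minimal sketch of how such corpus-level scores are typically computed with sacreBLEU (the sentences are placeholders, not data from the corpus):

```python
# Sketch of corpus-level BLEU scoring with sacreBLEU; the actual evaluation
# setup behind the Umsuka numbers above is an assumption.
import sacrebleu

hypotheses = ["model translation of sentence 1", "model translation of sentence 2"]
# One reference stream: the i-th entry is the reference for the i-th hypothesis.
references = [["reference translation of sentence 1", "reference translation of sentence 2"]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}")
```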

GitHub:

Citation:

@article{umair2022geographical,
  title={Geographical Distance Is The New Hyperparameter: A Case Study Of Finding The Optimal Pre-trained Language For English-isiZulu Machine Translation},
  author={Umair Nasir, Muhammad and Amos Mchechesi, Innocent},
  journal={arXiv e-prints},
  pages={arXiv--2205},
  year={2022}
}