Jun 17, 2024
A family of multilingual language models supporting 23 languages, designed to balance breadth and depth by allocating more capacity to fewer languages during pre-training.
A massively multilingual generative language model that follows instructions in 101 languages, trained by finetuning mT5.
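The entry above mentions instruction-following behavior obtained by fine-tuning mT5. A minimal sketch of that recipe, assuming the Hugging Face transformers library, a small mT5 checkpoint (google/mt5-small, chosen purely for illustration), and toy instruction-response pairs rather than the actual training mixture:

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Checkpoint chosen for illustration; the model described above is far larger.
model_name = "google/mt5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Toy multilingual instruction-response pairs (hypothetical data).
pairs = [
    ("Translate to French: Good morning.", "Bonjour."),
    ("¿Cuál es la capital de Perú?", "La capital de Perú es Lima."),
]

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for instruction, response in pairs:
    inputs = tokenizer(instruction, return_tensors="pt", truncation=True)
    labels = tokenizer(response, return_tensors="pt", truncation=True).input_ids
    # Standard seq2seq cross-entropy: the decoder learns to emit the response.
    loss = model(**inputs, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

The key design point is that instruction tuning reuses the ordinary seq2seq training objective; only the data changes, from span-corruption pre-training text to (instruction, response) pairs across many languages.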
A human-curated instruction-following dataset spanning 65 languages, created to bridge the language gap in instruction-tuning data for natural language processing.
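A dataset like the one above is typically consumed through the Hugging Face datasets library. A minimal loading sketch, where the dataset id and field names are assumptions and may differ from the actual release:

```python
from datasets import load_dataset

# Dataset id assumed; adjust to the published Hub id if it differs.
ds = load_dataset("CohereForAI/aya_dataset", split="train")

# One human-written prompt/completion pair; field names (e.g. a language tag)
# are assumptions here.
print(ds[0])
print(ds.unique("language")[:5])  # inspect a few of the covered languages
```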