News
The Aya initiative involves building the Aya collection, the largest multilingual dataset collection to date, comprising 513 million examples, and Aya-101, an AI model covering more than 100 languages.
Africanews on MSN: Africa’s First Multilingual Small Language Model Gets Even Smaller - Thanks to Top African Innovator. In a major leap for inclusive and accessible artificial intelligence, Africa’s first multilingual Small Language Model (SLM), ...
Multilingual Dataset Enhances Language Model Testing (MSN): The BELEBELE dataset offers a comprehensive benchmark for evaluating language models across 122 language variants. It reveals that smaller multilingual models often outperform large, English ...
Apple researchers highlight limitations in large language models’ ability to perform accurate mathematical reasoning, citing token sensitivity and probabilistic output.
Their model fared significantly better than the largest first-generation OLMo models from the Allen Institute for AI, even though the OLMo models have twice as many parameters. On tasks of reasoning ...
Researchers find that large language models process diverse types of data, such as different languages, audio inputs, and images, similarly to how humans reason about complex problems. Like humans, LLMs ...
Strategy: To ensure no Indian language is left behind, India must adopt a scalable AI strategy that combines large, mid-sized, and small models that learn from each other.
Google DeepMind, Google LLC’s artificial intelligence research unit, today unveiled two new AI models capable of advanced mathematical reasoning for solving complex math problems ...
A new study introduces choice engineering, a way to guide decisions using mathematics instead of guesswork. By applying carefully designed mathematical models, researchers found they could ...