Cohere, an enterprise AI company, has released “Tiny Aya,” a multilingual AI model capable of running on personal laptops without an internet connection, according to a report by TechCrunch on February 17th. The model supports over 70 languages, including South Asian languages such as Bengali, Hindi, Punjabi, Urdu, Gujarati, Tamil, Telugu, and Marathi.
Tiny Aya is released as an open-weight model, meaning only the model’s weights are publicly available. The model has 3.35 billion parameters. Cohere has also released regionally focused versions of Tiny Aya: ‘Tiny Aya-Global’ for broad language support, ‘Tiny Aya-Earth’ specializing in African languages, ‘Tiny Aya-Fire’ focused on South Asian languages, and ‘Tiny Aya-Water’ supporting languages from the Asia-Pacific region, West Asia, and Europe.
The company emphasized that Tiny Aya was trained on a cluster of 64 NVIDIA H100 GPUs and is designed to deliver strong performance even with limited computing resources. A key feature highlighted is its offline translation capability, intended for use in areas with unreliable internet access.
The Tiny Aya model is currently available for download via the AI model sharing platform Hugging Face and the Cohere platform. Cohere plans to release the training and evaluation datasets, as well as a technical report, at a later date.
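For readers who want to try an open-weight model like this locally, a minimal sketch using the Hugging Face `transformers` library might look like the following. Note the repository ID `CohereLabs/tiny-aya-global` is an assumption for illustration (check Cohere’s Hugging Face page for the actual repository names), and the libraries must be installed first (`pip install transformers torch`).

```python
def load_tiny_aya(model_id: str = "CohereLabs/tiny-aya-global"):
    """Load an open-weight model and tokenizer from Hugging Face.

    The repo id above is hypothetical; look up the actual Tiny Aya
    repository names on Cohere's Hugging Face page.
    """
    # Imported lazily so the sketch can be read without the libraries
    # installed; requires `pip install transformers torch`.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tokenizer, model


def translate(tokenizer, model, text: str, target_language: str) -> str:
    """Prompt the model for a translation; once the weights are cached
    locally, this runs fully offline."""
    prompt = f"Translate to {target_language}: {text}\nTranslation:"
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

After the first download, the weights are cached on disk, which is what makes the offline usage described above possible on a laptop.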
Cohere also recently launched an LLM supporting 23 languages, including Korean, that is more powerful than its previous versions, according to a report from aitimes.com.