Tuesday, February 17, 2026

Cohere releases Tiny Aya, a family of 3.35B-parameter open-weight models supporting 70+ languages for offline use, trained on a single cluster of 64 H100 GPUs (Ivan Mehta/TechCrunch)

Ivan Mehta / TechCrunch:
Cohere releases Tiny Aya, a family of 3.35B-parameter open-weight models supporting 70+ languages for offline use, trained on a single cluster of 64 H100 GPUs — Enterprise AI company Cohere launched a new family of multilingual models on the sidelines of the ongoing India AI Summit.


