- Aya 23 LLM Series Launched: Cohere introduces a new family of open-source LLMs that understand 23 languages.
- Two Model Sizes: The Aya 23 series includes an 8-billion-parameter model and a 35-billion-parameter model (see the loading sketch after this list).
- Advanced LLM Architecture: Aya-23-35B utilizes the decoder-only Transformer architecture with several enhancements for better performance.
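The snippet below is a minimal sketch of loading and prompting an Aya 23 checkpoint with the Hugging Face `transformers` library. The model id `CohereForAI/aya-23-8B` and the chat-template usage are assumptions about how the release is published on the Hub; verify the exact repository name and license terms before running it.

```python
# Minimal sketch: loading and prompting an Aya 23 model via Hugging Face transformers.
# The model id below is an assumption; confirm it on the Hub before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CohereForAI/aya-23-8B"  # assumed Hub id; the 35B variant would follow the same pattern
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")  # device_map needs `accelerate`

# Aya 23 is multilingual, so the prompt can be written in any of its 23 supported languages.
messages = [{"role": "user", "content": "Translate 'good morning' into Turkish."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```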
Impact
- Multilingual Capabilities: Aya 23 supports 23 languages, enhancing global communication and applications.
- Enterprise Optimization: Cohere’s LLMs are tailored for enterprise needs, delivering high-quality performance and responses.
- Technological Advancements: The Aya-23-35B model features improved context understanding and efficient inference through grouped-query attention (see the sketch after this list).
- Enhanced Text Processing: The rotary positional embeddings (RoPE) in Aya-23-35B improve how token positions are encoded, leading to better output quality (see the second sketch after this list).
- Open-Source Contribution: The Aya dataset, developed with contributions from 3,000 participants, showcases the strength of collaborative development.
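To complement the grouped-query attention bullet above, here is an illustrative sketch of the mechanism: several query heads share a single key/value head, which shrinks the key/value cache and speeds up inference. The head counts and tensor shapes are invented for illustration and are not Aya 23's actual configuration.

```python
# Illustrative sketch of grouped-query attention (GQA): query heads are split into
# groups, and each group shares one key/value head, reducing KV-cache size.
import torch
import torch.nn.functional as F

batch, seq_len, head_dim = 2, 16, 64
n_query_heads, n_kv_heads = 8, 2              # 4 query heads share each KV head (illustrative numbers)
group_size = n_query_heads // n_kv_heads

q = torch.randn(batch, n_query_heads, seq_len, head_dim)
k = torch.randn(batch, n_kv_heads, seq_len, head_dim)
v = torch.randn(batch, n_kv_heads, seq_len, head_dim)

# Expand each KV head so it is reused by its whole group of query heads.
k = k.repeat_interleave(group_size, dim=1)    # -> (batch, n_query_heads, seq_len, head_dim)
v = v.repeat_interleave(group_size, dim=1)

attn = F.scaled_dot_product_attention(q, k, v, is_causal=True)
print(attn.shape)                             # torch.Size([2, 8, 16, 64])
```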
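Similarly, the rotary positional embeddings mentioned above can be sketched as rotating each pair of feature dimensions by a position-dependent angle, so relative positions show up directly in the query/key dot products. The dimensions and base frequency below are illustrative, not Aya 23's actual values.

```python
# Minimal sketch of rotary positional embeddings (RoPE).
import torch

def rotary_embedding(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply RoPE to x of shape (seq_len, dim), with dim even."""
    seq_len, dim = x.shape
    positions = torch.arange(seq_len, dtype=torch.float32).unsqueeze(1)    # (seq_len, 1)
    freqs = base ** (-torch.arange(0, dim, 2, dtype=torch.float32) / dim)  # (dim/2,)
    angles = positions * freqs                                             # (seq_len, dim/2)
    cos, sin = angles.cos(), angles.sin()
    x1, x2 = x[..., 0::2], x[..., 1::2]
    # Rotate each (x1, x2) feature pair by its position-dependent angle.
    rotated = torch.stack((x1 * cos - x2 * sin, x1 * sin + x2 * cos), dim=-1)
    return rotated.flatten(-2)

q = torch.randn(16, 64)           # 16 tokens, 64-dimensional query vectors (illustrative sizes)
print(rotary_embedding(q).shape)  # torch.Size([16, 64])
```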