Palo Alto startup Zyphra has introduced ZAYA1-8B, a new reasoning-focused mixture-of-experts language model with just over 8 billion total parameters, of which only 760 million are active per token. Despite its relatively small size, Zyphra said the model delivers competitive benchmark performance against GPT-5-High, DeepSeek-V3.2, and Claude Sonnet 4.5. The model is available on Hugging Face under the Apache 2.0 open source license, allowing devel…