Ant Group Open-Sources Ling-2.6-Flash Model with Multiple Precision Options
2 Articles
April 29, 2026 — Ant Group’s BaiLing large model team has officially open-sourced Ling-2.6-Flash, offering multiple precision formats—including BF16, FP8, and INT4—to give developers flexibility across hardware environments, inference costs, and deployment needs. Ling-2.6-Flash is an instruction-tuned model with 104 billion total parameters and 7.4 billion activated parameters. Two weeks ago, it was quietly released on OpenRouter under the anony…
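The practical difference between the released precisions is mostly memory footprint. A rough back-of-the-envelope sketch for the 104B-parameter weights (weights only, ignoring KV cache and runtime overhead; the bytes-per-parameter figures are standard for these formats, not taken from the release):

```python
# Approximate weight memory for 104B total parameters under each
# released precision format (weights only).
TOTAL_PARAMS = 104e9

# Standard storage cost per parameter for each format.
BYTES_PER_PARAM = {"BF16": 2.0, "FP8": 1.0, "INT4": 0.5}

def weight_gib(fmt: str) -> float:
    """Weight storage in GiB for the given precision format."""
    return TOTAL_PARAMS * BYTES_PER_PARAM[fmt] / 1024**3

for fmt in BYTES_PER_PARAM:
    print(f"{fmt}: ~{weight_gib(fmt):.0f} GiB")
```

This is why the INT4 variant matters for single-node deployment: it cuts weight storage to roughly a quarter of BF16, at some cost in accuracy.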
Read the original article at the following link: DeepSeek V4: The open AI model that competes with Claude and GPT at lower cost. DeepSeek-V4 once again puts pressure on the market for advanced models, since its case does not rest on performance alone. The family arrives with open weights, support for a 1-million-token context window, and a pricing structure that remains far below high-end proprietary models. The comparison is especially …
