Claude Sonnet's Memory Gets a Big Boost with 1M Tokens of Context
Anthropic's Claude Sonnet 4 now supports prompts of up to 1 million tokens (roughly 750,000 words), boosting performance on complex coding tasks and expanding its enterprise appeal; pricing increases for larger inputs.
9 Articles
Anthropic's Claude Sonnet 4 Model Gets a 1M Token Context Window
Anthropic today announced that Claude Sonnet 4, the company’s mainstream model that sits below its flagship Claude Opus 4, will now support a 1 million token context window. Long context support is now in public beta and available through the Anthropic API and on Amazon Bedrock, with support on Google’s Vertex AI coming soon. A million tokens is the rough equivalent of 750,000 words, allowing the model to reason over a large…
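The tokens-to-words equivalence above is a rule of thumb, not an exact conversion. A minimal sketch of that arithmetic, assuming the ~0.75 words-per-token heuristic the article implies (actual token counts vary with the tokenizer and the text):

```python
# Rough capacity check: does a text plausibly fit in a 1M-token window?
# WORDS_PER_TOKEN = 0.75 is a heuristic (1,000,000 tokens ~= 750,000 words),
# not an exact property of Claude's tokenizer.

WORDS_PER_TOKEN = 0.75
CONTEXT_TOKENS = 1_000_000

def estimated_tokens(text: str) -> int:
    """Estimate token count from a whitespace word count."""
    return round(len(text.split()) / WORDS_PER_TOKEN)

def fits_in_context(text: str) -> bool:
    """True if the estimate is within the 1M-token window."""
    return estimated_tokens(text) <= CONTEXT_TOKENS
```

For a precise count, the provider's own token-counting endpoint or tokenizer should be used instead of this heuristic.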
Claude Gets 1M-Token Support via API To Take On Gemini 2.5 Pro - Cybernoz - Cybersecurity News
Claude Sonnet 4 has been upgraded to handle up to 1 million tokens of context, though for now only when used via the API; this could change in the future. That is five times the previous limit, enough for Claude to hold over 75,000 lines of code, or even hundreds of documents, in a single session. Previously, you had to submit material to Claude in small chunks, which also meant Claude would forget the …
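Since the long context is only reachable through the API, a sketch of what such a call might look like may help. This assumes the official `anthropic` Python SDK; the model ID and beta header value below are assumptions for illustration and should be checked against Anthropic's current documentation:

```python
# Sketch: one long-context query to Claude Sonnet 4 via the Messages API.
# MODEL and LONG_CONTEXT_BETA are assumed values -- verify against
# Anthropic's docs before use.

LONG_CONTEXT_BETA = "context-1m-2025-08-07"  # assumed beta flag name
MODEL = "claude-sonnet-4-20250514"           # assumed model ID

def build_request(document_text: str, question: str) -> dict:
    """Assemble the request body: the whole document plus one question,
    sent in a single message instead of many small chunks."""
    return {
        "model": MODEL,
        "max_tokens": 1024,
        "messages": [
            {
                "role": "user",
                "content": f"{document_text}\n\nQuestion: {question}",
            }
        ],
    }

# With the SDK installed (pip install anthropic), the call would be roughly:
#
#   import anthropic
#   client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the env
#   response = client.messages.create(
#       extra_headers={"anthropic-beta": LONG_CONTEXT_BETA},
#       **build_request(big_codebase, "Where is the retry logic defined?"),
#   )
```

The point of the upgrade is the shape of `content` here: an entire codebase or document set can go into one message, rather than being split across many requests the model cannot see at once.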
Coverage Details
Bias Distribution
- 80% of the sources are Center