CoreWeave Achieves New Record-Breaking AI Inferencing Benchmark with NVIDIA GB200 Grace Blackwell Superchips
- CoreWeave, known as 'the AI Hyperscaler,' unveiled MLPerf Inference v5.0 results on April 2, 2025.
- MLPerf Inference is an industry-standard benchmark suite that measures machine learning performance in realistic deployment scenarios.
- CoreWeave submitted results from an instance built on NVIDIA GB200 Grace Blackwell Superchips, containing two Grace CPUs and four Blackwell GPUs.
- Peter Salanki, CoreWeave CTO, stated the company commits to delivering cutting-edge infrastructure.
- CoreWeave's NVIDIA H200 GPU instances also achieved 33,000 tokens per second (TPS) on the Llama 2 70B benchmark, a 40% throughput gain over H100 instances.
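The 40% figure implies a baseline H100 throughput that is not stated in the source; a quick sanity check (the implied H100 number below is our inference, not a reported result) recovers it:

```python
# Sanity check: if H200 reaches 33,000 tokens/s on Llama 2 70B and that
# is a 40% gain over H100, the implied H100 throughput follows directly.
h200_tps = 33_000
gain = 0.40  # "40% gain over H100" from the summary
implied_h100_tps = h200_tps / (1 + gain)
print(round(implied_h100_tps))  # roughly 23,571 tokens/s
```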
CoreWeave is the first cloud service provider to submit MLPerf Inference v5.0 results for NVIDIA GB200 Superchips
Lawrenceville, United States
Coverage Details
- Total news sources: 15
- Bias distribution: Left 29% (2 sources), Center 43% (3 sources), Right 29% (2 sources)