Microsoft Struggled with Critical Copilot Vulnerability for Months
3 Articles
Cybersecurity researchers have revealed a critical vulnerability in Microsoft 365 Copilot that allowed attackers to steal sensitive business data without a single user click. Named EchoLeak, the flaw exploited the generative AI integrated into Microsoft tools to exfiltrate information via a simple, even unopened, email.
A major security flaw in Microsoft 365 Copilot allowed attackers to access sensitive company data with nothing more than a specially crafted email—no clicks or user interaction required. The vulnerability, named "EchoLeak," was uncovered by cybersecurity firm Aim Security. The article Microsoft struggled with critical Copilot vulnerability for months appeared first on THE DECODER.
The security firm Aim Security discovered a vulnerability in Microsoft's AI assistant Copilot that could expose sensitive company data without any user interaction. The article Microsoft struggled for five months with critical Copilot vulnerability appeared first on THE-DECODER.de.