Microsoft Copilot Chat Error Sees Confidential Emails Exposed to AI Tool
Microsoft rolled out a fix for a bug that let Copilot Chat bypass data loss prevention and access confidential emails in Microsoft 365 apps, including Outlook and Teams.
- On Jan. 21, Microsoft acknowledged a coding bug in Microsoft 365 Copilot Chat that let it access and summarise some users' confidential emails, according to initial reports.
- Microsoft traced the issue to a code-level bug that bypassed data loss prevention and sensitivity labels, allowing Copilot Chat to surface content from Draft and Sent folders in Outlook.
- Within Microsoft 365, the issue affected the Work tab of Copilot Chat across Outlook, Teams, Word, Excel, PowerPoint and OneNote, with the UK's National Health Service among organisations flagged.
- Microsoft deployed a worldwide configuration update for enterprise customers, said it began rolling out a fix earlier this month, and is monitoring the rollout; it told BBC News that patient information was not exposed.
- Gartner's Nader Henein noted that rapid AI feature rollouts raise risks, saying: "Under normal circumstances, organisations would simply switch off the feature and wait till governance caught up."
15 Articles
What happened with the Microsoft 365 Copilot bug?
Copilot accessed and summarized protected email content. Microsoft confirmed a bug in its Copilot feature that caused the assistant to summarize emails from users' Sent and Drafts folders. The issue bypassed data loss prevention (DLP) protections in affected tenants, allowing Copilot to surface…
From HD Technology (www.hd-tecnologia.com): Artificial intelligence is back under the microscope in corporate environments. Microsoft confirmed a critical flaw in Copilot that allowed it to access and summarize confidential emails, even when security policies were configured to prevent this. The problem affected Microsoft 365 business users, where the AI tool ignored data loss prevention restrictions …
Microsoft's Copilot was secretly reading confidential emails for weeks, and what it did with them is every company's worst nightmare
Microsoft’s Copilot was recently caught summarizing confidential emails without the proper permissions, completely bypassing the security policies designed to keep that sensitive information protected. This is a significant concern for companies that rely on AI assistants to handle their data. According to Mashable, the issue specifically affected Copilot Chat for some Microsoft 365 enterprise users. Copilot Chat rolled out to Microsoft 365 …
Coverage Details
Bias Distribution
- 67% of the sources are Center