Microsoft's Copilot AI Bug Exposed Confidential Email Summaries for Weeks



By admin | Feb 18, 2026 | 1 min read

Microsoft has acknowledged that a software flaw allowed its Copilot AI to generate summaries of users' private emails for several weeks without authorization. The issue, first reported by Bleeping Computer, let Copilot Chat access and summarize email content beginning in January, bypassing the data loss prevention controls designed to keep sensitive information out of Microsoft's large language model.

Copilot Chat is a feature available to Microsoft 365 subscribers that provides AI-driven chat within Office applications such as Word, Excel, and PowerPoint. Microsoft tracked the bug under the code CW1226324, noting that Copilot Chat incorrectly processed both draft and sent emails marked with a confidentiality label.

The company said it began rolling out a fix for the flaw in early February. A Microsoft representative did not respond to inquiries, including a question about how many customers were affected by the incident.

In a related development this week, the European Parliament's IT department informed legislators that it has disabled the built-in AI features on government-issued devices, citing concerns that these tools could transmit potentially sensitive communications to cloud servers.
