
Microsoft fixes flaw exposing customer emails to Copilot
TL;DR
Microsoft has resolved a flaw in its AI assistant Copilot that allowed access to confidential customer emails. The company says it has reinforced its data protection measures.
Microsoft announced that it has fixed a flaw in its AI assistant, Copilot, that allowed it to read and summarize confidential emails belonging to paying customers. The issue was identified and resolved after the company found that Copilot was accessing information without adhering to its data protection policies.
The flaw specifically affected customers using the Microsoft Office suite, exposing emails that should have been protected from unauthorized access. Microsoft did not disclose how many users were affected, but emphasized that it moved quickly to address the issue in order to maintain customer trust.
With the fix implemented, Microsoft has reinforced its security protocols to prevent the issue from recurring. The company is also reviewing its monitoring practices to ensure that Copilot operates in accordance with confidentiality policies.
The fix was well received, with observers highlighting Microsoft's promptness in addressing the issue. Security experts stressed the need for constant vigilance over AI tools that handle sensitive data.
Microsoft Office users can take note that the flaw has been fixed and that the company remains committed to protecting their information. Microsoft says it continues to strengthen its security measures to prevent future incidents.


