
AI Risks: Copilot Accessing Millions of Confidential Files

by Rachel Kim – Technology Editor

Microsoft Copilot Use Linked to Increased Sensitive Data Exposure, Study Finds

A recent study reveals a significant risk of sensitive data exposure associated with the growing adoption of Microsoft Copilot within organizations. The research indicates that over half of shared files in businesses contain sensitive information, rising to approximately 70% in sectors such as healthcare and finance.

On average, two million critical files per organization are circulating without access restrictions, with around 400,000 of those shared with personal accounts, a majority of which contain confidential data. This pre-existing vulnerability is compounded by intensive use of Copilot.
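For illustration only, the sketch below shows one way an administrator might surface this kind of overly broad sharing through the Microsoft Graph API. The access token and drive ID are placeholders, the scan covers only the drive's root folder for brevity, and none of these details come from the study itself.

```python
"""Hedged sketch: flag files in a OneDrive/SharePoint drive that carry
anonymous or organization-wide sharing links, via the Microsoft Graph API.
Assumes a valid access token with Files.Read.All and a known drive ID."""
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"   # assumption: obtained elsewhere (e.g. via MSAL)
DRIVE_ID = "<drive-id>"    # assumption: the drive to audit

headers = {"Authorization": f"Bearer {TOKEN}"}

def list_items(drive_id):
    """Yield every item in the drive's root folder, following pagination."""
    url = f"{GRAPH}/drives/{drive_id}/root/children"
    while url:
        page = requests.get(url, headers=headers).json()
        yield from page.get("value", [])
        url = page.get("@odata.nextLink")

def broad_permissions(drive_id, item_id):
    """Return sharing links whose scope is wider than named users."""
    url = f"{GRAPH}/drives/{drive_id}/items/{item_id}/permissions"
    perms = requests.get(url, headers=headers).json().get("value", [])
    return [p for p in perms
            if p.get("link", {}).get("scope") in ("anonymous", "organization")]

for item in list_items(DRIVE_ID):
    risky = broad_permissions(DRIVE_ID, item["id"])
    if risky:
        scopes = {p["link"]["scope"] for p in risky}
        print(f'{item["name"]}: shared with scope(s) {scopes}')
```

A production audit would recurse through subfolders and all document libraries, but even this minimal pass illustrates how broadly scoped links can be enumerated and reviewed.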

The study found that each organization averaged over 3,000 interactions with Copilot, during which sensitive data was potentially modified, analyzed, or exposed. This widespread use expands the exposure surface and increases the risk of data leakage or misuse, even though the tool is implemented to boost productivity.

Beyond Copilot usage, the research highlights underlying issues in information governance. These include approximately ten million duplicate data records per organization, nearly seven million files older than ten years, and a large number of "orphaned" files linked to inactive or deleted users. These factors hinder effective digital resource management and complicate the implementation of coherent security policies.
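As a rough illustration of what cleaning up this kind of technical debt involves, the following Python sketch walks a mounted file share and flags exact duplicates by content hash along with files untouched for more than ten years. The path and threshold are placeholders, and detecting orphaned owners would require directory data not shown here.

```python
"""Hedged sketch of a basic data-hygiene audit: find exact duplicate files
(by SHA-256 of their contents) and files not modified in over ten years."""
import hashlib
import os
import time
from collections import defaultdict

SHARE_ROOT = "/mnt/fileshare"        # assumption: a mounted file share
TEN_YEARS = 10 * 365 * 24 * 3600     # rough threshold, in seconds

def sha256_of(path, chunk=1 << 20):
    """Hash a file's contents in chunks to avoid loading it whole."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

duplicates = defaultdict(list)   # content hash -> list of paths
stale = []                       # files older than the threshold
now = time.time()

for dirpath, _, filenames in os.walk(SHARE_ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            if now - os.path.getmtime(path) > TEN_YEARS:
                stale.append(path)
            duplicates[sha256_of(path)].append(path)
        except OSError:
            continue  # unreadable or vanished file: skip it

dupe_groups = [paths for paths in duplicates.values() if len(paths) > 1]
print(f"{len(dupe_groups)} duplicate groups, {len(stale)} files older than 10 years")
```

At the scale the study describes, such a scan would feed a review and retention workflow rather than automatic deletion, but it shows how the reported figures can be reproduced and acted on locally.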

The combination of obsolete files, overly broad permissions, and the integration of generative AI solutions like Copilot puts companies' information assets at risk. Without stricter control measures, the protection of trade secrets, financial data, and personal information remains inadequate. IT managers face the challenge of balancing innovation with security to prevent AI adoption from becoming a strategic vulnerability.

The problem isn't confined to specific industries, affecting technology, healthcare, the public sector, and finance alike. Successful integration of Copilot and similar AI tools requires careful attention to access management and sustained governance practices. Organizations that fail to address these concerns risk compromising their most valuable information.

Source: TechRadar – Microsoft Copilot has access to three million sensitive data records per organization, wide-ranging AI survey finds – here's why it matters
