In a previous post, we discussed the role of data security posture management (DSPM) in protecting sensitive data throughout the IT environment. DSPM has been called “data-first” security. While traditional tools secure the systems, devices and applications that store, transfer or process data, DSPM tools focus on protecting the data itself.
DSPM is becoming a critical component of the security arsenal due to the scale, frequency and high cost of data breaches. The adoption of AI ups the ante, as organizations begin feeding sensitive data into AI tools to glean new insights that can improve their operations.
Microsoft Copilot is a prime example. Copilot helps users with a range of tasks, such as drafting content, summarizing information, conducting research and automating workflows. To do that, it searches for information across Microsoft 365 (M365) data stores and uses that data to generate accurate, relevant responses to queries.
As powerful as it is, Copilot can also create significant risk: as it searches organizational content, it could expose sensitive information or intellectual property. The DSPM features in Microsoft Purview can help organizations manage that risk.
Why the Use of Copilot Requires DSPM
Copilot uses Microsoft Graph to access contextually relevant data to respond to a user’s queries. Graph gives Copilot the same reach as the user, covering every file, email, chat and other item the user’s permissions allow. If those permissions are overly broad or poorly managed, Copilot can surface sensitive data the user was never meant to see.
It’s a common problem. Many organizations have unlabeled data in SharePoint sites that every user in the organization can access. Traditionally, sensitive data may have been protected by “security by obscurity” — users were unlikely to find data they didn’t know existed. However, that model simply won’t work when gen AI is searching the data stores.
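To see what that means in practice, the rough sketch below uses Python and Microsoft Graph’s documented search endpoint to issue the kind of permission-scoped query that Copilot-style tooling runs on a user’s behalf. The access token and search term are placeholders for illustration; Graph security-trims the results to whatever the signed-in user can reach, so anything shared too broadly comes back in the hits.

```python
import requests

# Placeholder delegated access token for the signed-in user (assumption for illustration).
ACCESS_TOKEN = "<user-delegated-token>"

# Microsoft Graph search endpoint; results are security-trimmed to the caller's permissions.
url = "https://graph.microsoft.com/v1.0/search/query"

payload = {
    "requests": [
        {
            # Search SharePoint and OneDrive files the user can access.
            "entityTypes": ["driveItem"],
            "query": {"queryString": "salary review"},  # illustrative search term
            "from": 0,
            "size": 10,
        }
    ]
}

resp = requests.post(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Every hit is content the user's permissions allow, whether or not
# the user was ever meant to find it.
for hit in resp.json()["value"][0]["hitsContainers"][0]["hits"]:
    print(hit["resource"].get("name"))
```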
DSPM reduces the risk by identifying all data within M365 and classifying it according to predefined criteria. Organizations can then apply sensitivity labels that define how content is protected, whether through access restrictions, conditional access, watermarking or encryption, depending on its sensitivity.
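As a conceptual illustration only (not Purview’s API), the sketch below shows the kind of pattern-based matching that predefined classifiers automate: scan content for sensitive patterns, then map each match to a label and the protections it implies. The patterns and label names are assumptions made up for the example.

```python
import re

# Hypothetical patterns standing in for predefined sensitive information types (assumptions).
PATTERNS = {
    "U.S. Social Security number": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "Credit card number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

# Hypothetical mapping from detected info type to a sensitivity label and its protections.
LABEL_FOR_TYPE = {
    "U.S. Social Security number": ("Highly Confidential", ["encryption", "access restrictions"]),
    "Credit card number": ("Confidential", ["watermarking", "conditional access"]),
}

def classify(text: str):
    """Return the label and protections suggested by the first pattern that matches."""
    for info_type, pattern in PATTERNS.items():
        if pattern.search(text):
            return LABEL_FOR_TYPE[info_type]
    return ("General", [])

label, protections = classify("Employee SSN: 123-45-6789")
print(label, protections)  # Highly Confidential ['encryption', 'access restrictions']
```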
How Microsoft Purview Protects Data
Microsoft Purview includes powerful DSPM capabilities for reducing the risks associated with Copilot and other gen AI applications. It is designed to keep data a user doesn’t have permission to access out of AI-generated responses.
Purview starts by discovering data stored throughout M365 and creating a comprehensive data map. Administrators can view a unified catalog of data through the Purview portal and use natural language search and AI-powered features to gain new insights about the data. Purview tracks the origin and transformation of data so that users can understand where it came from and how it’s been modified.
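For teams that want to work with that catalog programmatically, a rough sketch follows. It assumes the azure-purview-catalog Python package and its discovery query operation; the package, method and account name shown here are assumptions that may differ by SDK version, so treat this as a starting point rather than a reference.

```python
# Assumed SDK surface: the azure-purview-catalog package exposes a PurviewCatalogClient
# with a discovery.query() operation; exact names and versions may differ in your environment.
from azure.identity import DefaultAzureCredential
from azure.purview.catalog import PurviewCatalogClient

# Placeholder Purview account name (assumption for illustration).
endpoint = "https://<purview-account-name>.purview.azure.com"

client = PurviewCatalogClient(endpoint=endpoint, credential=DefaultAzureCredential())

# Search the unified catalog for assets matching a keyword; results include
# the classifications and metadata gathered during scanning.
results = client.discovery.query(search_request={"keywords": "customer", "limit": 5})

for asset in results.get("value", []):
    print(asset.get("name"), asset.get("classification"))
```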
Most importantly, Purview can automatically classify and label personally identifiable information and other sensitive data based on predefined or custom rules. It also integrates with Microsoft 365 sensitivity labels and Azure Information Protection, allowing for a unified approach to data governance.
Data loss prevention and access control features prevent data leaks and ensure that only authorized users can view, modify or delete sensitive information. Administrators can also set retention policies to manage data across its lifecycle and ensure compliance with internal policies and regulatory requirements.
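Conceptually, a DLP or access-control rule pairs a condition on labeled content with an enforcement action. The sketch below is purely illustrative, not Purview’s configuration format, and the label and action names are made up for the example.

```python
from dataclasses import dataclass

@dataclass
class Item:
    label: str              # sensitivity label applied during classification
    shared_externally: bool  # whether the item is being shared outside the organization

# Hypothetical DLP-style rule: block external sharing of anything labeled Highly Confidential.
def evaluate_dlp(item: Item) -> str:
    if item.label == "Highly Confidential" and item.shared_externally:
        return "block and notify"  # prevent the leak and alert the user or admin
    return "allow"

print(evaluate_dlp(Item(label="Highly Confidential", shared_externally=True)))  # block and notify
```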
How Cerium Can Help
The Cerium team has deep expertise in the M365 solution suite, including Copilot and Purview. We also understand the critical importance of protecting sensitive information as organizations adopt and use AI. We can help you develop a data governance and security strategy that leverages DSPM tools such as Purview. Let us help ensure that a costly data breach doesn’t overshadow the benefits of AI.