How To Implement Copilot 365 without Dying in the Process

Written by Sergi Ariño Mayans
April 10, 2024

I. Introduction:

With the hype of AI, many companies seem to have found the solution to all their problems, including ones they don’t even have in the first place. This has evolved to a point where internal meetings like this one happen on almost a weekly basis:

  • There is a new AI tool that we need for our work. It provides plenty of possibilities and opportunities!
  • OK, how will this tool affect our business?
  • Well, there’s a lot we don’t know for sure. For example: how much of what it says is made up… or if it will take away our jobs… or the security risks… or if it could damage our reputation…
  • I see. What do we know for sure?
  • Only that we want to have it everywhere as fast as we can!

It’s exhausting being the one who seems to have a red flag attached to their arm as a natural extension. But it is crucial for the business to understand the need to find the right balance between the opportunities (e.g. increased efficiency and productivity, smarter products and processes), the needs (specific use cases, identified pain points and chances for improvement) and the risks (e.g. legal: contracts, licensing, copyright, privacy; IT and cybersecurity; regulatory: AI Act, GDPR, NIS2, etc.) of new artificial intelligence tools.

II. Microsoft Copilot 365

I bet some of the new AI tools you are being challenged with are the Microsoft Copilots, which include, among others, Copilot 365, Copilot Azure, Copilot Security, Copilot GitHub, Copilot Dynamics, Copilot Fabric and any other unknown Copilot Yet To Come…

(I feel your pain!)

So here is a quick guide on basic things to watch out for from a data protection perspective while planning on implementing Copilot 365 in your company, without dying in the process:

Access rights:

  • Copilot can access all data stored in your Microsoft tenant. In theory, only content that the requesting user already has permission to view will be included in responses.
  • So, make sure you have the right TOMs and safeguards in place to limit access under the need-to-know principle. Otherwise, you run the risk that sensitive or highly confidential information is surfaced internally to unauthorized employees, who could potentially access this content with a simple query.
  • Even if your employees are subject to confidentiality agreements, it’s definitely better to minimize unnecessary risks.
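The need-to-know principle above can be sketched in a few lines. This is an illustrative toy only, not Microsoft’s actual permission model; the names (`Document`, `user_can_read`, the group sets) are hypothetical:

```python
# Toy need-to-know filter: content only reaches the assistant if the user
# could open the underlying document anyway. All names are illustrative.
from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    reader_groups: set[str]  # groups allowed to read this document
    text: str

def user_can_read(user_groups: set[str], doc: Document) -> bool:
    """Need-to-know: the user must belong to at least one authorized group."""
    return bool(user_groups & doc.reader_groups)

def retrievable_context(user_groups: set[str], docs: list[Document]) -> list[Document]:
    """Only documents the user is authorized to see may feed the response."""
    return [d for d in docs if user_can_read(user_groups, d)]

docs = [
    Document("salaries-2024", {"hr"}, "…"),
    Document("handbook", {"hr", "all-staff"}, "…"),
]
visible = retrievable_context({"all-staff"}, docs)  # only "handbook" survives
```

The point of the sketch: if the `reader_groups` on your real documents are too broad (the classic “Everyone” share), the filter lets everything through, which is exactly the risk described above.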

Data classification:

  • It is crucial to keep your data up to date and to classify and label it carefully (e.g. personal data, sensitive, business critical, trade secrets, etc.), which goes hand in hand with the access rights mentioned above. This helps ensure that Copilot only processes information that the respective users are supposed to have access to (authorization and classification).
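As a minimal sketch of how a label taxonomy can gate what an assistant may ingest (the label names and the `may_ingest` helper are made up for illustration, not a real sensitivity-labeling engine):

```python
# Illustrative label gate: every item must carry a known label, and only
# labels explicitly cleared for the assistant pass through.
LABELS = {"public", "internal", "confidential", "trade-secret", "personal-data"}
ALLOWED_FOR_COPILOT = {"public", "internal"}  # labels cleared for ingestion

def may_ingest(label: str) -> bool:
    if label not in LABELS:
        # Unlabeled data is the real-world failure mode: force classification
        # instead of silently letting it through.
        raise ValueError(f"unclassified label: {label!r}")
    return label in ALLOWED_FOR_COPILOT
```

The design choice worth copying is the deny-by-default posture: anything unlabeled raises instead of passing, so classification gaps surface immediately.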

Customize settings:

  • In a recent update (February 2024) to Microsoft’s documentation on data privacy and security for Copilot 365, a new paragraph was added that went unnoticed by many but has significant implications:
“When web grounding is enabled, Copilot for Microsoft 365 may automatically generate a web search query, if Copilot for Microsoft 365 determines that web data can improve the quality of the response. […]
Web search queries might not contain all the words from a user's prompt. They're generally based off a few terms used to find relevant information on the web. However, they may still include some confidential data*, depending on what the user included in the prompt. […]

*Emphasis mine. (Source: Data, Privacy, and Security for Microsoft Copilot for Microsoft 365 | Microsoft Learn.)

  • Your organization’s Microsoft 365 admin can turn off Copilot’s ability to access and include web content when it responds to users’ prompts, which in turn has obvious disadvantages in terms of the quality and completeness of the output. It’s up to you to customize the settings and decide what best fits your purpose, but be aware that this feature is turned on by default.
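To see why the quoted paragraph matters, here is a toy illustration of the risk. Real Copilot behavior is opaque; this naive “query extraction” only shows how terms from a confidential prompt can end up in an outbound web search query:

```python
# Toy query extraction: keep a few salient terms from the prompt, drop
# stopwords. NOT how Copilot actually builds web queries; purely illustrative.
STOPWORDS = {"the", "a", "for", "our", "of", "to", "what", "is", "are"}

def to_web_query(prompt: str, max_terms: int = 4) -> str:
    terms = [w for w in prompt.lower().split() if w not in STOPWORDS]
    return " ".join(terms[:max_terms])

q = to_web_query("What are the terms of our ProjectX acquisition of AcmeCo")
# An internal code name ("projectx") survives into the outbound query,
# leaving the tenant boundary as part of a web search.
```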

Governance:

  • It is advisable from a governance perspective to establish ground rules by defining internal guidelines and setting boundaries (for example, allowed/forbidden prompts, use cases, general prohibitions, etc.)
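Ground rules of the “allowed/forbidden prompts” kind can even be enforced as a simple pre-check. A minimal blocklist-style sketch, where the patterns and the `prompt_allowed` helper are invented for illustration and any real policy would be far richer:

```python
# Minimal prompt gate: reject prompts matching forbidden patterns before
# they reach the assistant. Patterns here are placeholders, not a real policy.
import re

FORBIDDEN_PATTERNS = [
    re.compile(r"\b\d{2}\.\d{2}\.\d{4}\b"),        # e.g. dates of birth
    re.compile(r"salary|payroll", re.IGNORECASE),  # HR data off limits
]

def prompt_allowed(prompt: str) -> bool:
    return not any(p.search(prompt) for p in FORBIDDEN_PATTERNS)
```

A blocklist like this is best treated as a safety net behind training and guidelines, not a substitute for them: regexes catch the obvious cases, not determined circumvention.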

Awareness:

  • In terms of liability (negligence), it is essential to train employees on the proper use of the tool.

Works council:

  • Depending on your location, it may be mandatory to get the works council involved, given the implications the tool might have when it comes to the processing of employees’ data, monitoring, performance, etc.

Accountability:

  • Don’t forget to update the RoPA (records of processing activities) and to conduct a proper DPIA (data protection impact assessment) to identify and minimize risks.

Use your own judgment:

  • The output generated by Copilot is not guaranteed to be 100% accurate. It is intended to be an assistant (a resource), but it shouldn’t be counted on to make the final calls. There is a reason the tool is called Copilot and not Pilot. Therefore, use your own judgment, verify the results and do not assume it is better than it is. We need to acknowledge AI’s limitations, not ignore them.

III. Bonus track:

If you are also busy assessing the implementation of other Microsoft Copilots, check the current status of each tool (General Availability vs Preview). Most of these products are still in the Preview phase, with significant gaps and compliance limitations compared with their Generally Available versions. For instance, during the Preview phase of Copilot 365, its supplementary terms and conditions stated that Microsoft’s Data Processing Addendum was not applicable in full, and some of its sections, such as data security (!), were left out.

Remember to look under the hood.

There are still plenty of open, complex questions from a data protection perspective that are surely keeping the EDPB & Co busy. While it remains to be seen how the entire thing plays out in practice, grab a bucket of popcorn and enjoy the ride, because we’re just getting started!
