Microsoft 365 Copilot is an enormously powerful tool that can supercharge your organisation’s productivity. While much has been written about the risks of getting information security wrong when it comes to AI, the benefits of good governance and security don’t always get the spotlight they deserve. So, as Microsoft 365 Copilot becomes increasingly widespread, we’ve explored why Copilot-appropriate security and governance matters so much. We’ve also set out five ways for you to get the ball rolling in your organisation. Let’s get to work.
Unless you’ve been living under a rock for the last few years, you’ll likely be familiar with ChatGPT, Gemini, Copilot, and other AI tools. What’s less commonly understood is the difference between ‘Copilot’ and ‘Microsoft 365 Copilot’. While ‘Copilot’ is Microsoft’s chat-based, general-purpose AI offering (previously ‘Bing Chat’), ‘Microsoft 365 Copilot’ is a specialised AI assistant that draws data and information from right across your organisation’s suite of M365 applications.
The basic version of ‘Copilot’ is freely available to all employees when you have an M365 licence, whereas ‘Microsoft 365 Copilot’ is a licensed application that will only integrate with other applications – including SharePoint, Word, Outlook and Teams – when you have a qualifying subscription. Throughout the rest of this article, we’ll be talking about the fully licensed version – Microsoft 365 Copilot.
Having access to an intuitive AI tool that pulls data and information from right across the organisation presents a huge opportunity to boost the value of Microsoft 365 and increase productivity. However, it clearly also creates a challenge for all those responsible for information security. So implementing strong security and governance before rolling out Microsoft 365 Copilot is the best way to maximise Copilot’s capabilities while minimising risk.
Copilot security and governance benefits
There are several significant benefits of demonstrating your commitment to data security and user privacy ahead of a Microsoft 365 Copilot introduction.
Perhaps the biggest benefit is the reassurance it offers employees and other important stakeholders. It increases trust in AI and automation and that, in turn, helps build employee confidence – improving adoption and promoting productive use. This matters more than ever: a recent Salesforce survey revealed that levels of mistrust in AI are still very high. With clear guidelines and role-based access controls, Copilot becomes much more transparent and less of a mysterious dark art.
The quality of Copilot’s output can be improved by introducing content filters and clear usage guidelines. These will help prevent the production of misleading or harmful content, safeguarding your organisation’s reputation and further adding to employee confidence. Additionally, having solid governance guidelines will allow you to monitor and correct any biases or inappropriate content in Copilot’s outputs.
Another significant benefit of prioritising good governance and security before introducing Microsoft 365 Copilot to the workplace is the opportunity for proactive alignment with your organisation’s goals. What do we mean by that? Well, good governance policies will help to sync Copilot usage with your business’s objectives and improve productivity, reducing its use on unnecessary or inappropriate tasks. On top of this, Copilot-specific governance and performance monitoring will enable you to capture insights into how AI is enhancing processes, helping you further improve alignment with strategic goals.
There are also substantial information security benefits for those bringing AI governance to the forefront. With the right monitoring and logging practices, any unusual activity can be detected early to prevent data breaches. Pre-emptive identity management and access policies will streamline Copilot-related user access processes, avoiding manual intervention. And good governance and incident response plans will improve the speed and efficiency of security incident response. It’s a win-win.
Native security vs what you need to do
The benefits of getting security and governance right when it comes to AI are pretty clear, which raises questions about what’s already in place and what needs to be done.
Microsoft has already ensured that you have ‘tenant isolation’ and clear training boundaries. This means that Microsoft 365 Copilot will only use data from an employee’s Microsoft 365 tenancy. It will not use data from tenants the employee is a guest of, and it will not use data from tenants with a cross-tenant sync. Nor will Copilot use your proprietary data to train the foundational large language model that it uses for all tenants.
However, everything else is down to you.
That means housekeeping, permissions, labels, policies and people. To help you get started, we’ll look at each of those five areas in a little more detail and set out some key actions.
5 key areas for action
1. Housekeeping
Your organisation may already have a sound approach to data lifecycle management: retaining only active data, archiving inactive data required for compliance or reference purposes, and securely disposing of unnecessary data. If not, this shouldn’t be ignored. It’s also important to consider whether any sites or document libraries in SharePoint need to be excluded from search, such as leader-only materials or HR sites that may contain confidential information.
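As a rough sketch of the retention logic described above, the decision for each item boils down to a simple three-way rule. The thresholds and the compliance flag below are entirely hypothetical placeholders – your real values must come from your organisation’s own retention policy, not from this example:

```python
from datetime import datetime, timedelta

# Illustrative thresholds -- placeholders, not recommendations.
ARCHIVE_AFTER = timedelta(days=365)      # inactive for a year
DISPOSE_AFTER = timedelta(days=365 * 7)  # beyond any assumed compliance need

def lifecycle_action(last_modified: datetime,
                     needed_for_compliance: bool,
                     now: datetime) -> str:
    """Return 'retain', 'archive' or 'dispose' for a single item."""
    age = now - last_modified
    if age < ARCHIVE_AFTER:
        return "retain"                  # still active data
    if needed_for_compliance or age < DISPOSE_AFTER:
        return "archive"                 # inactive but must be kept
    return "dispose"                     # no longer needed anywhere

now = datetime(2024, 6, 1)
print(lifecycle_action(datetime(2024, 5, 1), False, now))  # retain
print(lifecycle_action(datetime(2021, 5, 1), True, now))   # archive
print(lifecycle_action(datetime(2015, 5, 1), False, now))  # dispose
```

Even a simple rule set like this forces the useful conversations – what counts as “inactive”, and who decides what’s needed for compliance – before Copilot starts surfacing forgotten content.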
2. Permissions
Reviewing, and potentially tightening, current permissions should be a key area of focus. Unfortunately, Microsoft 365 permissions are extremely complex and there’s no single easy way of doing this. We often suggest implementing role-based access control and configuring conditional access policies, such as those based on location, device, and duration. Microsoft Entra ID (formerly Azure AD) helpfully enables you to create group permissions based on departments, roles, or projects. You can also apply app-specific permissions and controls, such as SharePoint permissions and Teams settings, to control who can view, edit, or share content. Another handy tool is Privileged Access Management, which helps combat permission sprawl by granting high-privilege roles temporary, time-bound access rather than standing access.
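To illustrate how those checks combine, here’s a minimal sketch of a conditional, time-bound access decision. The role names, locations and eight-hour grant window are made-up examples – this is not how Entra ID or Privileged Access Management are configured, just the shape of the logic they evaluate:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative policy values -- placeholders, not Microsoft defaults.
ALLOWED_ROLES = {"SiteOwner", "RecordsManager"}
ALLOWED_LOCATIONS = {"GB", "IE"}
GRANT_DURATION = timedelta(hours=8)   # time-bound, PAM-style grant

@dataclass
class AccessRequest:
    role: str
    location: str           # country code from the sign-in
    device_compliant: bool  # managed, policy-compliant device?
    granted_at: datetime    # when the temporary grant began

def is_allowed(req: AccessRequest, now: datetime) -> bool:
    """Combine role, location, device and duration checks."""
    return (req.role in ALLOWED_ROLES
            and req.location in ALLOWED_LOCATIONS
            and req.device_compliant
            and req.granted_at <= now < req.granted_at + GRANT_DURATION)

now = datetime(2024, 6, 3, 16, 30)
fresh = AccessRequest("SiteOwner", "GB", True, datetime(2024, 6, 3, 9, 0))
stale = AccessRequest("SiteOwner", "GB", True, datetime(2024, 6, 2, 9, 0))
print(is_allowed(fresh, now))  # True  (grant is 7.5 hours old)
print(is_allowed(stale, now))  # False (8-hour grant has expired)
```

The key design point is that every condition must pass – having the right role is no longer enough on its own once location, device and duration are part of the policy.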
3. Labels
Just as you should review permissions, you also need to review the use of labels. Sensitivity labels are key to Microsoft’s application of data loss prevention policies and encryption. But, in practice, the application of labels is patchy. As a first step, identify and fix sensitive files with no label, sensitive files that are wrongly labelled, and non-sensitive files that have a sensitive label. You can also consider using Microsoft 365’s Information Protection and Data Loss Prevention policies to enforce sensitivity labels. This provides an additional layer of access control by limiting access to highly sensitive data.
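The three mismatch cases above are easy to express as an audit rule. The sketch below assumes a simple two-level label scheme (‘General’ and ‘Confidential’) purely for illustration – real tenants define their own sensitivity taxonomy, and real audits would pull file metadata from your tenant rather than a hard-coded list:

```python
from typing import Optional

# Assumed two-level label scheme -- purely illustrative.
def label_issue(is_sensitive: bool, label: Optional[str]) -> Optional[str]:
    """Flag the three mismatch cases; None means the label fits."""
    if is_sensitive and label is None:
        return "sensitive file with no label"
    if is_sensitive and label != "Confidential":
        return "sensitive file wrongly labelled"
    if not is_sensitive and label == "Confidential":
        return "non-sensitive file with a sensitive label"
    return None

# Hypothetical file inventory: (name, contains sensitive data, label)
files = [
    ("payroll.xlsx", True, None),
    ("payroll-2023.xlsx", True, "General"),
    ("lunch-menu.docx", False, "Confidential"),
    ("newsletter.docx", False, None),
]
for name, sensitive, label in files:
    issue = label_issue(sensitive, label)
    if issue:
        print(f"{name}: {issue}")
```

Running a pass like this over your estate turns “the application of labels is patchy” from a vague worry into a concrete, fixable worklist.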
4. Policies
So far, we’ve looked at the importance of policies, permissions and labels. It’s worth considering how Microsoft Purview can help you manage all of this. Purview is a suite of data governance, compliance, and risk management tools that help manage and protect data, whether on premises or in the cloud. Key capabilities include data discovery, classification, security, and compliance, covering many of the requirements we’ve already touched upon.
5. People
The success of Microsoft 365 Copilot hinges on the people who use it. Their understanding of it, their confidence in it, their correct use of it and their behaviours around it. So, as with any new technology, a well-managed adoption programme is vital. Providing the right training and support is an important part of this, but so too is designing a comms plan that’s geared up to encourage the right behaviours. Generative AI is already producing frighteningly good content, which can sometimes make people complacent. But it’s not always error-free, so employees need to understand their responsibility to check the accuracy of their AI assistant’s output. A solid communication plan should set out your organisation’s ground rules around the usage of AI and define the responsibility of each employee.
Next steps
While there are clear benefits to having sound security and governance for Copilot, the preparatory work can sometimes feel a little daunting for already-busy IT teams. Much depends on the strength of existing measures and the level of resources available.
An easy way of dealing with this is to call on the expertise of a specialist partner, such as Silicon Reef. Our Microsoft 365 Copilot Readiness & Deployment service provides you with a structured approach, a huge step up the learning curve, and the ability to accelerate your deployment.
Learn more about the Microsoft 365 Copilot Readiness & Deployment service or arrange a no-obligation discussion with a subject matter expert.