By Sharon Stewart | September 27, 2024
The world is awash with AI – social posts, webinars, events, white papers and more – all shouting and sharing about the power of generative AI and the exciting potential this innovative technology can apparently offer you and your organisation.
Microsoft is leading the AI charge with Copilot for M365. In this series of blog posts, we’re taking a deep dive into the world of AI and Microsoft Copilot. For our latest blog, Sharon Stewart, one of our consultants here at Metataxis, has been conducting tests on Copilot for Microsoft 365 to evaluate its capabilities and limitations, and to assess its performance in terms of information management and governance. We want to find out if it will really revolutionise the way we use Microsoft 365, and what it means in terms of managing information, particularly for our clients.
These are her findings:
What we asked Copilot to do:
- Understand existing content: summarise documents and meetings, including extracting action items
- Edit content: add images and slides to presentations, and modify existing materials
- Help: find documents authored or co-authored by colleagues or past employees, carry out desk research, and catch up on what’s new in our organisation
- Security: test the boundaries in terms of accessing secure information
1. Productivity
We found that using Copilot did enhance productivity in some areas – but not all. It’s great for starting off the creative process and providing an initial foundation to overcome the blank page block.
However, when summarising meeting notes and pulling out action points, I found myself making notes to check against Copilot’s output due to inaccuracies.
Overcome that blank page block, but watch out for inaccuracies.
2. Permissions and access
Several sources claim that Copilot respects permissions and security settings. I tested this and discovered the following:
- Privacy: Copilot respected privacy and I could not see files that were locked down to certain staff members or specific areas of the business
- Boundaries: Copilot respected the boundaries of external tenants, and I could only see content on the Metataxis tenant despite having access to external ones
- Email: Copilot respected the boundaries of private email accounts – I could not access anyone else’s mailbox elsewhere in the business
- Correspondence: Copilot respected the boundaries of correspondence – I was only shown exchanges in which I was directly involved, not correspondence between two or more colleagues
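For those who want to replicate this kind of boundary testing, file permissions can be inspected directly through Microsoft Graph, since Copilot can only surface content the signed-in user can already open. Below is a minimal sketch, assuming an access token with the Files.Read.All scope has already been acquired (e.g., via MSAL); the drive and item IDs are hypothetical placeholders:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access token with Files.Read.All>"  # assumed already acquired, e.g. via MSAL
DRIVE_ID = "<drive-id>"  # hypothetical placeholder
ITEM_ID = "<item-id>"    # hypothetical placeholder

# List who has access to a single file. Auditing these permission
# entries shows what a given user – and therefore their Copilot
# session – could plausibly draw on.
resp = requests.get(
    f"{GRAPH}/drives/{DRIVE_ID}/items/{ITEM_ID}/permissions",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

for perm in resp.json().get("value", []):
    user = perm.get("grantedToV2", {}).get("user", {}).get("displayName")
    link_scope = perm.get("link", {}).get("scope")  # e.g. "organization" flags broad sharing links
    print(perm.get("roles"), user or f"sharing link: {link_scope}")
```

Sweeping this across a document library is one way to spot overshared content before Copilot starts surfacing it to the wrong audience.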
3. Copyright
I noticed that the images Copilot generated for my PowerPoint presentations were coming from the visual media company Getty Images.
I reached out to Getty and they informed me that they have an agreement with Microsoft, so effectively there are no copyright issues.
In September 2023, Microsoft published their Copilot Copyright Commitment for customers, ensuring that customers using Microsoft’s Copilot AI tools are protected against copyright claims related to the content generated by Copilot. Essentially, Microsoft will defend and assume liability for any copyright issues that arise, as long as users are using the tools in accordance with Microsoft’s responsible AI usage guidelines. Organisations rolling out Copilot need to ensure that their users understand what those guidelines are.
4. Retrieving content and utilising metadata
I set out to understand how Copilot searches for content across our organisation and how it utilises metadata. I know that Copilot uses Microsoft Graph to retrieve information in response to our prompts, but wanted to understand how it was using our metadata to enhance the quality of the output. I asked Copilot for help on a broad range of queries such as: “when is our next meeting with a particular person?” and “find all the documents written by specific authors and co-authors.”
The results were mixed. I knew that certain documents existed, but Copilot omitted key content and the results were inconsistent.
It’s important to apply critical thinking when using Copilot.
With no logical explanation for these inconsistencies, I asked Copilot which metadata it used to perform the search. Copilot replied, “I’m sorry but I can’t discuss the specifics of my search methods or the metadata I use” and “sorry, I can’t chat about this.” When I tried to ask why it couldn’t chat about this, the send button was greyed out and I could not ask any further questions; Copilot then suggested that I start a new chat. Exploring the inner workings of Copilot revealed that it operates as a black box, highlighting the importance of applying critical thinking when using the tool.
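Because Copilot won’t reveal its retrieval logic, one practical cross-check is to run a comparable query directly against Microsoft Graph’s search API and compare what comes back. Here is a minimal sketch, again assuming a delegated access token is in hand; the scope and author name are illustrative only:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access token with Sites.Read.All>"  # assumed already acquired

# Search for documents by author using a KQL property restriction,
# roughly the kind of query Copilot might issue behind the scenes.
body = {
    "requests": [{
        "entityTypes": ["driveItem"],
        "query": {"queryString": 'author:"Sharon Stewart"'},
    }]
}
resp = requests.post(
    f"{GRAPH}/search/query",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=body,
)
resp.raise_for_status()

# Walk the nested response and print each matching file name.
for container in resp.json()["value"]:
    for hits in container["hitsContainers"]:
        for hit in hits.get("hits", []):
            print(hit["resource"].get("name"))
```

If documents show up here but are missing from Copilot’s answer, that is a useful signal that its output is incomplete rather than that the content doesn’t exist.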
Steps for Copilot success
This testing produced some valuable insights, and revealed that there are several steps to undertake prior to adopting Copilot to ensure success:
- Copilot for M365 respects permissions but it is vital to manage security settings effectively
- Copilot for M365 can be a helpful assistant but it still requires human oversight; it can make mistakes, return misinformation, and generate false content. Coupled with the lack of transparency around what Copilot is doing under the bonnet, this means critical thinking and knowledge of your subject area are essential to avoid falling foul of hallucinations
- Microsoft is continuously rolling out Copilot modifications and enhancements, so keep an eye on the M365 roadmap and factor changes into governance, training and change management initiatives
- Copilot for M365 will not fix underlying information management issues, so having a robust framework in place is key to maximising its value and minimising the risk of using incorrect information.