Copilot for Microsoft 365 might boost productivity if you survive the compliance minefield

Loads of governance issues to worry about, and the chance it might spout utter garbage


Microsoft has published a Transparency Note for Copilot for Microsoft 365, warning enterprises to ensure user access rights are correctly managed before rolling out the technology.

Concerns over data governance have held up some Copilot projects as biz customers consider how best to integrate the service into their organizations, and that is precisely the worry the Transparency Note speaks to.

The note makes it clear: "Copilot for Microsoft 365 only accesses data that an individual user has existing access to, based on, for example, existing Microsoft 365 role-based access controls."
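In practice, that puts the onus on admins to know what each user can already reach. One illustrative way to spot-check this (not something from the Transparency Note, just a sketch that assumes a delegated Microsoft Graph access token with an appropriate Files.Read permission) is to list the OneDrive and SharePoint items shared with a given user, since that is exactly the pool of content Copilot can draw on for them:

import requests

ACCESS_TOKEN = "eyJ..."  # placeholder: delegated Microsoft Graph token obtained via MSAL or similar

# List items other people have shared with the signed-in user - the kind of
# content Copilot can pull into its answers for that user.
resp = requests.get(
    "https://graph.microsoft.com/v1.0/me/drive/sharedWithMe",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

for item in resp.json().get("value", []):
    remote = item.get("remoteItem", {})
    print(remote.get("name") or item.get("name"))

Anything surprising in that list, a payroll spreadsheet shared with the whole company for instance, is exactly the kind of over-sharing Copilot will happily surface.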

Copilot for Microsoft 365 is an add-on for which Microsoft expects $30 per user per month on an annual subscription. It combines large language models (LLMs) with data from Microsoft Graph and Microsoft 365 apps and services to summarize, predict, and generate content.

At first glance, the service is innocuous enough. It takes input from a user in an app such as Word. The prompt is then pre-processed to improve the odds of getting something useful out of the service before being sent to the LLM. What comes out of the LLM is post-processed before being returned to the user.

According to Microsoft: "This post-processing includes other grounding calls to Microsoft Graph, responsible AI checks such as content classifiers, security, compliance and privacy checks, and command generation."
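Stripped of product names, the round trip the note describes looks roughly like the sketch below. Every function here is a stubbed placeholder rather than a real Copilot API; it is a conceptual outline of the grounding, generation, and checking stages Microsoft lists, nothing more.

# Illustrative only: stubbed placeholders, not Microsoft's implementation.
def graph_search(prompt, acting_as):
    # Stand-in for grounding calls to Microsoft Graph, scoped to what this user can access
    return ["<documents, mail, and chats this user is permitted to see>"]

def llm_generate(grounded_prompt):
    # Stand-in for the large language model call
    return f"<content generated from: {grounded_prompt!r}>"

def passes_compliance_checks(text):
    # Stand-in for responsible AI, security, compliance, and privacy checks
    return True

def handle_prompt(user_prompt, user):
    # Pre-processing: ground the prompt in data the user already has access to
    context = graph_search(user_prompt, acting_as=user)
    grounded_prompt = f"{user_prompt}\n\nContext: {context}"

    # Generate a response, then post-process it before it reaches the user
    draft = llm_generate(grounded_prompt)
    if not passes_compliance_checks(draft):
        return "Response withheld by policy checks."
    return draft

print(handle_prompt("Summarise last week's project updates", user="alice@contoso.example"))

The important point is the scoping in the first step: Copilot's answers are only as well governed as the access controls on the data it is allowed to ground against.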

In addition to ensuring user access is configured correctly, the Transparency Note warns organizations to consider legal and compliance issues when using the service, particularly in regulated industries.

"Microsoft is examining regulatory requirements that apply to Microsoft as a provider of the technology and addressing them within the product through a process of continuous improvement," the document states.

Then there's the recommendation to allow Copilot for Microsoft 365 to reference web content from Bing to improve "the quality, accuracy, and relevance" of its responses. Allowing Microsoft Graph to be extended with sources like CRM systems, external file repositories, and other organizational data is another recommendation that will require enterprises to take a long, hard look at governance.
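Extending Graph that way is done with Microsoft Graph connectors, which index external content into the tenant so Copilot can reference it. By way of illustration (the connection id, display name, and app-only token below are placeholders, not anything Microsoft prescribes), registering a connection via the Graph connectors API is the first step; every item later pushed into it carries its own access control list, which is where the governance work lies.

import requests

APP_TOKEN = "eyJ..."  # placeholder: app-only token with the ExternalConnection.ReadWrite.OwnedBy permission

# Register an external connection that CRM records could later be indexed into.
resp = requests.post(
    "https://graph.microsoft.com/v1.0/external/connections",
    headers={"Authorization": f"Bearer {APP_TOKEN}"},
    json={
        "id": "contosocrm",            # hypothetical connection id
        "name": "Contoso CRM",         # hypothetical display name
        "description": "Accounts and opportunities from the CRM",
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json().get("state"))  # newly created connections report their provisioning state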

Microsoft's Transparency Note for Copilot for Microsoft 365 is a useful document, if only because it highlights how much enterprises have to think through before deploying the service.

Last month, Jack Berkowitz, chief data officer of Securiti, told us of bigger corporations pausing Copilot deployments because the tool is accessing data and "aggressively summarizing information" that certain employees shouldn't have access to – salaries, for example.

"Now, maybe if you set up a totally clean Microsoft environment from day one, that would be alleviated," he said. "But nobody has that. People have implemented these systems over time, particularly really big companies. And you get these conflicting authorizations or conflicting access to data."

The much-touted productivity gains from the AI service need to be weighed against its risks – even Microsoft notes "users should always take caution and use their best judgment when using outputs from Copilot for Microsoft 365" – and worries over compliance and data governance must be addressed before unleashing the service on an organization. ®
