IT admins will be able to block Copilot from making personal assessments about employees

Users cannot disable the policy.

Microsoft is set to introduce a new policy that lets IT admins block Copilot from making personal assessments about people in Teams meetings entirely.

According to the latest entry in the Microsoft 365 Roadmap, the new policy will prevent Copilot from responding with outputs that might infer emotions, make evaluations, discuss personal traits, or use context to deduce answers.

When this policy is enabled, Copilot won’t respond when users ask it to evaluate people, and users cannot disable the policy. However, the policy applies only to Copilot in Teams meetings.

Here’s what the entry says:

This policy enables IT admins to block Copilot responses that might infer emotions, make evaluations, discuss personal traits, and use context to deduce answers. When this setting is applied it restricts copilot’s ability to make inferences or evaluations about people or groups when prompted to do so by users. Users cannot disable this setting. The setting applies to Copilot in Teams meetings.

The policy will apply across devices, covering Teams on desktop, Mac, iOS, and Android, and Microsoft will introduce it in October.

With it, the Redmond-based tech giant hopes to stop individuals from using Copilot for anything other than getting work done. So you won’t be able to badmouth your coworkers to Copilot and expect the AI model to take your side. It simply won’t answer.

We may see how this policy works at the upcoming Microsoft 365 Copilot: Wave 2 event, scheduled for next week.

What do you think about this new policy?
