Despite advanced AI capabilities, Copilot Pro can become paranoid, and users are frustrated

Copilot Pro apparently ends conversations abruptly.





Microsoft released Copilot Pro a while ago, and for $20/month, the Redmond-based tech giant promised users advanced AI capabilities, priority access to GPT-4 Turbo even during peak hours, and access to Designer, a platform where users can create images.

The AI is touted as a more advanced version of the free Copilot available in Windows and Microsoft 365, and those who pay for the subscription are supposed to get superior service.

Or at least that is how it should work in theory. In practice, however, Copilot Pro seems to be on the edge of a mental (pun intended) breakdown, as several Reddit users have described their experience with the tool as quite frustrating.

Why? Because Copilot Pro is apparently paranoid. One Reddit user shared a screenshot in which Copilot Pro refused to respond to their requests; as you can see below, the exchange got tense, and then the AI tool abruptly ended the conversation.

User: Do you always use source material for your responses, or can you respind, communicate, and perhaps infer on your own?
Copilot Pro: My mistake, I can’t give a response to that right now. Let’s try a different topic.
User: May I ask what part of the previous question provided you to stop responding?
Copilot Pro: Sorry, I think we need to move on! Click “New Topic” to chat about something else.

Another Reddit user is also frustrated that Copilot Pro is being paranoid:

I am so annoyed rn. I was being completely appropriate, and kind, and then it shut down the conversation on me, I use a phone if that matters, anyways, I tried to talk to it about why it closed the conversation, and it broke and started repeating the same response word for word, with reasons it could close. Now I have a broken phone, so thanks copilot. I know the rules are be nice, but it’s a bug or something. Microsoft needs to take away it’s ability to close chats. also, it’s OBSESSED with niceness, if I even mention I have a bad day, it goes on like 100 paragraphs about how to be happy, and then says “Lets change the topic, [random question]”

Reddit user

Back when it was still Bing, Copilot used to hallucinate quite regularly. Hallucinations were among the most common flaws in early AI models such as ChatGPT, so it’s not a surprise that Copilot would apparently become paranoid. Not too long ago, for instance, Bing would hallucinate about secret Microsoft tools.

It’s also well known that when Copilot is set to Creative mode, the AI tool can become quite the talker, so much so that Microsoft once turned the option off.

However, we’re talking about a Pro version, which should be more than capable of handling difficult or vague user requests. So it is more disappointing than surprising that an advanced AI tool can’t come up with a response to a prompt.

Have you experienced similar situations? Let us know about them.

