Microsoft is thinking about turning Copilot into an AI therapist, according to new patent

The system would also be capable of giving medical advice.


Microsoft might turn Copilot into an AI therapist capable of providing psychological therapy and medical advice to users, and even looking out for them in emergencies. The Redmond-based tech giant described the idea in a recently published patent application titled Providing emotional care in a session.

The patent describes an AI-powered system that provides emotional support during chats between a user and a virtual assistant.

The session begins with the AI receiving images from the user. These images could be anything related to the user’s feelings or experiences. The system then accesses a user profile containing emotional information about the user, such as their likes, dislikes, and emotional triggers. This profile helps the AI understand the user’s emotional state better.

The AI analyzes the received images and generates text descriptions. These descriptions are created based on the emotional information in the user profile. For example, if a photo shows a sunset and the user profile indicates that sunsets make the user feel calm, the description might include words like “peaceful” or “serene.”
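To make that step concrete, here is a minimal sketch of what profile-conditioned description generation could look like. This is not Microsoft’s implementation: the detect_labels helper and the profile format are purely hypothetical stand-ins for the vision model and the stored emotional information the patent describes.

```python
# A minimal sketch (not Microsoft's implementation) of conditioning an image
# description on a user's emotional profile, as the patent describes.

def detect_labels(image_path: str) -> list[str]:
    # Hypothetical stand-in for whatever vision model would analyze the photo.
    return ["sunset", "beach"]

def describe_with_emotion(image_path: str, profile: dict[str, str]) -> str:
    """Generate a description that folds in the user's emotional associations."""
    labels = detect_labels(image_path)
    emotional_words = [profile[label] for label in labels if label in profile]
    base = "A photo showing " + ", ".join(labels) + "."
    if emotional_words:
        return base + " It feels " + " and ".join(emotional_words) + "."
    return base

# Example: the user's profile says sunsets make them feel calm.
profile = {"sunset": "peaceful", "rain": "melancholic"}
print(describe_with_emotion("holiday.jpg", profile))
# -> "A photo showing sunset, beach. It feels peaceful."
```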

In the patent, Microsoft says this AI therapist creates a memory record after generating the descriptions. This record includes the images and their corresponding descriptions. Think of it as a session diary, where the AI records what it sees and how it interprets it emotionally.

The user can send more images during the session. The AI therapist generates new descriptions based on the user’s emotional profile and creates additional memory records for the new pictures. These records help the AI build a more comprehensive understanding of the user’s emotions over time.

The system then uses the images and the emotional profile to provide personalized emotional support. It continuously learns and adapts to the user’s emotional needs by creating detailed memory records of their interactions, offering more empathetic and relevant support in future sessions.
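As a rough illustration of that loop, the sketch below models the session diary as a list of memory records that grows with every image and that later responses can draw on. The class and field names are made up for illustration; the patent does not specify any data structures.

```python
# A rough sketch of the "session diary" idea: each incoming image produces an
# emotion-aware description and a memory record, and the growing list of records
# is what the assistant consults to personalize later responses.
# Names and structures are illustrative only, not taken from the patent.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MemoryRecord:
    image_ref: str        # reference to the image the user sent
    description: str      # emotion-aware description of the image
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class EmotionalCareSession:
    def __init__(self, profile: dict[str, str]):
        self.profile = profile                 # likes, dislikes, emotional triggers
        self.diary: list[MemoryRecord] = []    # the session diary

    def receive_image(self, image_ref: str, labels: list[str]) -> str:
        # Build a description using the user's emotional associations.
        feelings = [self.profile[l] for l in labels if l in self.profile]
        if feelings:
            description = ", ".join(labels) + " - feels " + " and ".join(feelings)
        else:
            description = ", ".join(labels)
        self.diary.append(MemoryRecord(image_ref, description))
        return description

    def recall(self) -> list[str]:
        # What the assistant "remembers" about this user so far.
        return [r.description for r in self.diary]

session = EmotionalCareSession(profile={"sunset": "peaceful", "dog": "joyful"})
session.receive_image("img_001.jpg", ["sunset"])
session.receive_image("img_002.jpg", ["dog", "park"])
print(session.recall())
```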

Microsoft says this AI therapist would be helpful to anyone who needs emotional support, but it could also act as a personal assistant that looks after them and their specific needs.

Embodiments of the present disclosure propose to utilize an electronic conversational agent to provide timely and efficient emotional care to users. The users not only include the old people, but also include other people in various ages. Herein, “emotional care” may refer to emotional communications or assistances provided to a user in various approaches, such as emotional chatting, providing knowledge related to diseases, foods and medicines, monitoring a psychological or cognitive condition of the user through conducting psychological or cognitive tests, creating and storing memory records for the user, etc.

Would it be feasible? Given the pace of AI research, most likely yes. The patent says this AI therapist needs to remember details about the user through memory records, which means the system would need either a considerable context length or an effectively unlimited one. The Redmond-based tech giant is no stranger to that: back in 2023, the company backed LongMem, a research prototype designed to give language models seemingly unlimited context through long-term memory.

Combined with other AI advancements, such as the hyperreal, almost-human voice capabilities of the new Copilot (based on OpenAI’s GPT-4o), Microsoft could plausibly build this system.

Only one question remains: would people actually seek psychological help from an AI? And, more importantly, would it be ethical? Just a couple of months ago, an AI chatbot was implicated in a tragedy in which a teenager died by suicide.

While the AI therapist is a concept that comes from a good place, there are plenty of issues to consider: the accuracy of its advice, privacy protections, security safeguards, and so on.

But it could become a reality within a few years.

You can read the full patent here.
