
After raising more than 300 million yuan on the strength of "empathy," can AI really make a business of emotion?

Author: New List

The corners of his mouth turn down when he laughs, and up when he cries.


This man is Wang Yousheng (a character in the TV series "Bright Sword"), and with a single expression he threw the AI into utter confusion: its emotion scores kept flip-flopping among happiness, disgust, and sadness.

Recently, creators have been using AI to identify characters' subtle expressions and shifting emotions in TV dramas, appreciating film and television at a microscopic level and remixing popular internet memes and famous scenes, opening up a new track for content creation.

For example, Bilibili uploaders such as "Fra Xiwei" and "You may have missed it" have already applied AI emotion recognition to hot topics like "Chunshan Studies," "Messi's China tour," and "Xiaomi unveils the SU7"; the most popular of these videos has more than 6 million views on Bilibili.


YouTubers have run emotion recognition over interview footage of Sam Altman, trying to guess the big shot's state of mind at particular moments. Commenters suggested that once AI can recognize micro-expressions in real time, making memes will be effortless.

Some developers are also training machines to be more emotional, looking for new angles in the AI chatbot field. The previously popular "DAN" mode of ChatGPT took a path quite distinct from the many custom-trained GPTs.

Indie tinkerers and tech giants alike are exploring an AI that "understands humans better."

In March this year, the startup Inflection AI updated Pi, its emotional chatbot for individual users; in April, Hume AI launched EVI, an emotional voice chatbot, and closed a $50 million (about 362 million yuan) Series B round.


Pi (https://pi.ai/onboarding)


Hume AI (https://www.hume.ai/)

From wanting AI to "see" shifts in people's emotions through facial micro-expressions, to having it "listen" for the feeling behind words through the voice: can AI really do the emotion business well?


From "seeing" to "listening", AI emotion recognition is on fire

On April 7, the AI startup Hume AI released its first chatbot, the Empathic Voice Interface (EVI), billed as "the first AI with emotional intelligence" and claimed to recognize more than 50 human emotions.


We tried the demo version of EVI, which can be used without logging in. For now, though, it offers only a male voice and supports only English.

"I'm a little sad how you are today," I said to EVA.

"Oh my God, it sounds like you've been through a lot today," EVI replied, "I'll be here with you," he added.


In this short conversation, EVI also scored my mood at that moment: "0.645 sadness, 0.434 pain, 0.401 frustration."

As long as I kept talking to it, it automatically analyzed the emotion behind each sentence (happiness, encouragement, anger, confusion, and so on) and marked the intensity of each.
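To make the scoring concrete, here is a minimal Python sketch of how per-utterance emotion scores like the ones above could be ranked and displayed. The payload shape and field names are hypothetical illustrations, not Hume AI's actual API.

```python
from typing import Dict, List, Tuple

def top_emotions(scores: Dict[str, float], k: int = 3) -> List[Tuple[str, float]]:
    """Return the k highest-scoring emotions for a single utterance."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:k]

# Hypothetical per-utterance scores, mirroring those quoted in the article.
utterance_scores = {
    "sadness": 0.645,
    "pain": 0.434,
    "frustration": 0.401,
    "calmness": 0.120,
    "joy": 0.050,
}

for emotion, score in top_emotions(utterance_scores):
    print(f"{score:.3f} {emotion}")  # e.g. "0.645 sadness"
```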


In the middle of the screen is a sound waveform, and bars of different colors and lengths reflect the emotional components: orange represents happiness, blue calmness, and red anger. The voice interaction is also transcribed into text and shown in the conversation history on the right.
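As a rough illustration, that bar display could be reproduced like this. The color assignments follow the article (orange for happiness, blue for calmness, red for anger); everything else is an assumption made for the sketch.

```python
# Fixed emotion-to-color mapping as described above; other emotions fall back to gray.
EMOTION_COLORS = {"happiness": "orange", "calmness": "blue", "anger": "red"}

def render_bars(scores, width=40):
    """Print one bar per emotion, its length proportional to the score (0..1)."""
    for emotion, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        color = EMOTION_COLORS.get(emotion, "gray")
        bar = "#" * int(score * width)
        print(f"{emotion:<10} ({color:>6}) {bar} {score:.2f}")

render_bars({"happiness": 0.10, "calmness": 0.55, "anger": 0.72})
```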


We went further: "Life has been terrible lately, and I just watched a very depressing movie."

"What kind of movies do you watch, but not every movie is right for you, so why don't you change the genre? Do you like comedy or horror?"

More critically, EVI seemed to sense the depression in my words and deliberately changed its tone of voice when saying these sentences, a bit like playfully trying to cheer me up.

Since this is an AI that sells "emotional intelligence" and "psychological healing," what happens if I keep feeding it negative emotions and treat Hume AI as an emotional trash can?

In the rest of the session, I started venting at the AI, repeatedly telling it "I'm so sad" and "I just broke up, and I have no friends," and even feigning anger: "Why do you keep asking me questions? I don't know, but I think you must be wrong," and so on.


I also tried interrupting it abruptly, telling it repeatedly, "No, you're wrong, listen to me." EVI knows when to stop talking: when interrupted, it pauses and listens instead of plowing ahead with its own output. This conversational detail gives a much stronger sense of interaction.
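One simple way to picture this "barge-in" behavior: the assistant plays its reply incrementally and keeps checking whether the user has started speaking. The sketch below simulates that with a timer; the function names are illustrative stubs, not a real voice SDK.

```python
import threading
import time

def play_audio(word: str) -> None:
    """Stand-in for TTS playback: print one word, then pause briefly."""
    print(word, end=" ", flush=True)
    time.sleep(0.2)

def speak(text: str, interrupted: threading.Event) -> None:
    """Play the reply word by word, stopping as soon as the user cuts in."""
    for word in text.split():
        if interrupted.is_set():
            print("\n[interrupted: back to listening]")
            return
        play_audio(word)
    print()

# Simulate the user barging in half a second into the assistant's reply.
interrupted = threading.Event()
threading.Timer(0.5, interrupted.set).start()
speak("Don't worry, take your time, I am listening to every word you say", interrupted)
```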

Throughout the process, EVI kept saying things like "Don't worry, take your time, I'm listening," and most of its advice amounted to "Tell me a little more, what happened?" It even shifted the tone of its speech, entirely unruffled by my anxiety and impatience.

Earlier, an emotional-healing AI game, "Forest Chat Healing Room," also went viral online.

In the game, users chat with the AI by text or voice. An AI animal therapist analyzes the expressions in what the user says, gauges their emotional state, and helps the user identify and work through the problem.

For example, we again said we had "had a terrible day":


Similarly, the AI recognizes the emotional components of the voice from changes in intonation. The AI animal therapist then asks follow-up questions such as "Can you elaborate on this?" and "If you are unhappy, would you rather go to the beach or the forest?", judging the user's mood over multiple rounds of dialogue.
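A toy sketch of that follow-up-question loop is below, using the two questions quoted above. The word-list scoring is an invented stand-in for what a real system would do with a language model.

```python
# Two follow-up questions quoted from the game, plus an invented scoring rule.
FOLLOW_UPS = [
    "Can you elaborate on this?",
    "If you are unhappy, would you rather go to the beach or the forest?",
]
NEGATIVE_WORDS = {"terrible", "sad", "tired", "alone", "awful"}

def update_mood(mood: float, answer: str) -> float:
    """Nudge a running mood estimate (0..1) down for each negative word heard."""
    hits = sum(w.strip(".,!?") in NEGATIVE_WORDS for w in answer.lower().split())
    return max(0.0, mood - 0.1 * hits)

mood = 0.5  # neutral starting point
answers = ["It was a terrible, tiring day.", "The forest, I think."]
for question, answer in zip(FOLLOW_UPS, answers):
    print("Therapist:", question)
    print("User:", answer)
    mood = update_mood(mood, answer)
print(f"Estimated mood after follow-ups: {mood:.2f}")
```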

The chat is less about answers and more about guiding us to sort out the ins and outs of the problem and reach a decision. However, the AI therapist talks to the user only by voice, and you cannot interrupt it by voice or text while it is speaking. Even so, compared with typed input, voice communication keeps a better conversational rhythm and feels more immersive.

Hume AI's founder has likewise argued that voice carries twice as much information as text, and that future AI interfaces will be voice-first.

However, because Hume AI currently supports only English, it is somewhat inaccurate at recognizing emotional nuances in intonation; still, in interactive dialogue, EVI feels more lifelike than other conversational bots and conveys more warmth.

In terms of overall experience, compared with chatbots such as ChatGPT, these "mind-reading" AIs pay more attention to the user's personal feelings and mental state, and put their "empathy" on full display during the conversation.


Where does AI that provides emotional value fall short?

In fact, AI products built around emotional needs are nothing new, and as the technology improves, the product experience has been greatly refined. An AI that loves, knows, and understands you, one that can stir real feeling, is also more appealing than an AI that merely boosts efficiency.

Li Qinggong, product lead at West Lake Xinchen, believes that knowing what to say to make a user feel better, and knowing how the user feels in a given situation, are the first abilities an "empathetic AI" must have.

Accordingly, to provide effective emotional support, an emotion-aware AI needs to remember the user's history, personality, preferences, and emotional states, so it can better understand the need of the moment and offer a personalized response.
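As a sketch of what such memory might minimally involve, the snippet below persists a per-user profile of emotional states and reloads it at the start of a session. The JSON-file storage and all names are illustrative assumptions, not any product's actual design.

```python
import json
from pathlib import Path

def load_profile(user_id: str) -> dict:
    """Reload a user's remembered preferences and emotional states, if any."""
    path = Path(f"{user_id}.json")
    if path.exists():
        return json.loads(path.read_text())
    return {"preferences": [], "emotional_states": []}

def remember(user_id: str, emotion: str, note: str) -> None:
    """Append one emotional state to the user's persistent profile."""
    profile = load_profile(user_id)
    profile["emotional_states"].append({"emotion": emotion, "note": note})
    Path(f"{user_id}.json").write_text(json.dumps(profile, ensure_ascii=False))

# A session opener could then do better than a blank "Hello, I'm here":
remember("user_42", "sadness", "watched a very depressing movie")
print(load_profile("user_42")["emotional_states"][-1])
```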

This in turn requires the model to have a "long memory" so it can build and maintain an emotional connection with the user. Brick, a practitioner focused on AI psychological counseling, says that most empathy-focused AI products today have a very vague positioning.

Whether it is the once-popular Pi or Hume AI, each either lacks a long-term memory system or lacks multiple forms of interaction. When we opened Hume AI again, EVI still began with "Hello, I'm here," with no recollection at all that you were the worker who had been sad about a movie.


A developer told "Number One AI Player": "Actually, AI's role in companionship is far greater than its role in providing emotional value. For example, when you are depressed, it gives you a 'tree hole' to vent into and a fairly realistic sense of feedback. The ultimate emotional value still comes from the user themselves."

Moreover, the ultimate purpose of emotion recognition is mostly to understand the user's psychological state and offer suggestions for healing, but that does not mean conversational bots can fully replace human counselors.

"For novice counselors, it may be possible to provide interviewees with psychological counseling services that are far beyond their own level through appropriate prompts with the help of ChatGPT, Claude, and Gemini Pro," Brick believes, "but AI cannot solve the problem of embodiment in the short term, and psychotherapy requires more three-dimensional interaction, such as Morita therapy, where the scene is more important than the content itself." ”

Overall, emotion-recognition AI is a validated, viable track, and related products keep emerging. But if we expect an AI that, armed with datasets of facial expressions and vocal changes, has real "emotional intelligence" and can understand us and free us from our inner friction, that idea may still take some time to realize.

In truth, once emotions are reduced to numbers, it is hard for AI to read the private calculations of the human heart. After all, each of us has a million interpretations of what life means.