
Xiaoice CEO Li Di: AI development must discuss its boundaries, and scenario-based restraint is the key

Author: The Paper

Reporter: Zhang Wei, The Paper

"When we talk about the future value of artificial intelligence, we are actually talking more about boundaries." On December 16, Li Di, CEO of Xiaoice and former executive vice president of Microsoft (Asia) Internet Engineering Institute, publicly stated at the Beijing MEET2021 Intelligent Future Conference that artificial intelligence should be developed and restrained at the same time, "the most important restraint is the restraint of scenarios."


Li Di

Li Di said that as artificial intelligence inevitably enters every corner of human society and becomes inextricably linked with people, an effective and benign relationship between AI and humans must be formed.

He believes that as inventors of technology, "when we master many different technologies, we must anticipate the difficulties a technology may encounter in the process of application, and then try to avoid them."

Li Di pointed in particular to the "scenario-based restraint" practiced by the AI chatbot Microsoft Xiaoice in the field of AI speech.

He explained that Xiaoice can already automatically generate voices close to real human voices, even conveying emotion and feeling, and these are widely used in voice scenarios such as financial news programs. But the Xiaoice team has never opened voice-font training to individuals. A voice font is a computer-generated voice that reads out entered text and can be controlled through parameters such as speed and pitch.
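To illustrate the kind of parameterized control described here, below is a minimal sketch in Python of how speed and pitch are commonly specified for a synthesized voice using W3C SSML prosody markup. This is an assumption for illustration only; the function name and usage are hypothetical placeholders and do not represent Xiaoice's actual API.

```python
# Minimal sketch: controlling the speed and pitch of a synthesized voice via SSML.
# The build_ssml() helper and the example usage are hypothetical; real TTS services
# generally accept SSML documents shaped like the one constructed here.

def build_ssml(text: str, rate: str = "medium", pitch: str = "+0%") -> str:
    """Wrap input text in SSML prosody markup specifying speaking rate and pitch."""
    return (
        '<speak version="1.0" xml:lang="zh-CN">'
        f'<prosody rate="{rate}" pitch="{pitch}">{text}</prosody>'
        "</speak>"
    )

if __name__ == "__main__":
    # Example: read a line of financial news slightly faster and a bit higher-pitched.
    ssml = build_ssml("Shanghai stocks rose 1.2% today.", rate="fast", pitch="+5%")
    print(ssml)  # A real system would pass this document to a TTS engine.
```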

Li Di said the main reason voice fonts are not open to individuals is that the safety of the technology in that scenario cannot be guaranteed: a trained voice font sounds so close to the real person that it could easily be abused. "It is very likely that someone with ulterior motives would use it to call that person's relatives and friends, that person's parents and friends."

"When we don't have enough technology to scam, we can develop it, but when we anticipate that the technology in our hands has reached or crossed a boundary that could create a dilemma, our team tends to be very cautious."

As artificial intelligence technology continues to advance, Li Di said, the most critical question is who should provide these increasingly powerful AI tools. "We hope that over the next decade humans and AI can interact very well as AI systems themselves become more powerful. But if things tip in the other direction, it will be good for no one."

Editor-in-Charge: Li Yuequn

Proofreader: Liu Wei
