
Microsoft CTO talks to Bill Gates about GPT-4 and the future of artificial intelligence

According to Bloomberg, Microsoft co-founder Bill Gates has been working closely with OpenAI, the company behind ChatGPT. On March 30, he shared a conversation with Microsoft CTO Kevin Scott about how the company is leveraging this emerging AI technology.

Gates talked about his experience using GPT-4, as he was one of the first people outside of OpenAI to see the technology. Asked about the capabilities and limitations of GPT technology, Gates said it does not understand context the way humans do. He also said that mathematics, "a very abstract type of reasoning," is currently ChatGPT's biggest weakness. OpenAI's technology can "strangely" solve many mathematical problems, especially in abstract form, yet it still makes mistakes "confidently" when answering numerical questions.

In response to concerns that AI might eliminate the need for human labor, Gates said: "Microsoft is talking about making humans more productive. So some things will be automated, but many things will be facilitated, and ultimately the engagement will be very human, with one person able to do more than ever before."

Here are some of the excerpted conversations:

Kevin Scott: There have been some remarkable developments in technology over the last few years, notably GPT-4 and ChatGPT, developed by OpenAI in partnership with Microsoft. In fact, the first demonstration of GPT-4 outside OpenAI was held in August last year. How did you feel after seeing GPT-4?

Bill Gates: Artificial intelligence has always been the holy grail of computer science. Before the advent of machine learning, overall progress in artificial intelligence was quite slow; even speech recognition was barely adequate. Then machine learning brought rapid advances, especially in perception: speech recognition, image recognition, and so on. But those models still fell short of the complex reasoning needed to speak, read, and do things the way humans do.

Early text-generating models lacked contextual understanding. One might generate "Joe is in Chicago" at the beginning and "Joe is in Seattle" a few sentences later. Locally, each sentence is fine, but from a human point of view the whole is inconsistent.

At the time, the teams at OpenAI and Microsoft were extremely enthusiastic about GPT-3 and even early versions of GPT-4. I said to them, "If it can pass AP Biology and give adequate, reasonable answers to questions outside the training set, that will be an important milestone, so please keep up the good work."

I thought it would take them at least two or three years to succeed. Surprisingly, when the OpenAI and Microsoft teams came to my house in early September and showed me the latest model, they had me ask it some AP Biology questions, and it was shocking that it could answer all but one math-related question. I also asked it, "What would you say to the father of a sick child?" It gave a very delicate, considerate answer, better than anyone in the room could have at the time. Later, I got an account and asked it to write college admission essays, write poetry, even write an episode of "Ted Lasso" from a given plot... It's really hard to imagine where its limits are.

Although there are still problems to improve, this is a fundamental change: natural language can now serve as the main interface for human-computer interaction, which is a huge advance.

Kevin Scott: There are a lot of questions about artificial intelligence and GPT-4, so let's talk about what it's not good at. The last impression we want to give people is that it's AGI (artificial general intelligence), that it's perfect, and that it doesn't need a lot of extra work to improve. The math problem you just mentioned is one example, so what do you think AI systems need to improve over time?

Bill Gates: Background knowledge about context is a common problem when putting questions to machines. For example, if you tell a machine a joke and then ask it a serious question, a human can tell from your facial expression that your mood has changed and the joking context is over, but the artificial intelligence keeps joking. This sense of context plays a huge role in interaction.

Also, consider the difficulty of solving a problem. When we work through a mathematical equation, it may take five or six simplifications to convert it into the right form, and we keep learning such simplifications. Machine reasoning proceeds as a flat, step-by-step inference chain, and if a simplification takes ten steps, the machine may not manage it. Mathematics is very abstract reasoning, and that is artificial intelligence's biggest weakness.

Paradoxically, it can solve many mathematical problems. If you ask it to express something in abstract form, essentially giving an equation or a program that matches the problem, it does so well enough to be used as a solver. But if you ask it to do numerical operations, it often goes wrong. Whatever the weak link, it takes time to address and should be taken seriously. We need more innovative approaches that improve the models' mathematics through prompting or training.
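The split Gates describes, reliable in the abstract but unreliable on arithmetic, suggests a common workaround: ask the model for the program that matches the problem, then let ordinary code do the numbers. A minimal sketch of the pattern, where `solve_quadratic` stands in for the kind of program a model might emit:

```python
# Sketch: the model supplies the abstract form (here, the quadratic
# formula as code); deterministic evaluation handles the numerics
# that a model answering "in its head" tends to get wrong.
import math

def solve_quadratic(a: float, b: float, c: float) -> tuple[float, float]:
    """Real roots of a*x^2 + b*x + c = 0 via the quadratic formula."""
    disc = b * b - 4 * a * c
    if disc < 0:
        raise ValueError("no real roots")
    root = math.sqrt(disc)
    return ((-b + root) / (2 * a), (-b - root) / (2 * a))

# x^2 - 5x + 6 = 0  →  x = 3 or x = 2
print(solve_quadratic(1, -5, 6))  # (3.0, 2.0)
```

The division of labor is the point: the hard part (choosing the right equation) goes to the model, the error-prone part (evaluating it) goes to code.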

How should we evaluate GPT-4?

"Those who say it's useless are wrong, and those who say it's AGI are also wrong. Our view is somewhere in between, and the task is to make sure it is used in the right way."

Kevin Scott, Chief Technology Officer of Microsoft

Kevin Scott: We all know you have personally experienced several major technological shifts and have your own unique perspective. At another moment of great change, what advice do you have for those considering new technologies? How should they use them? How does this compare to your thinking in the PC and internet eras?

Bill Gates: Computers were once unavailable for personal use; then microprocessors arrived, a wave of companies entered personal computing, and IBM, Apple, and Microsoft got involved in software development. The Internet then connected all of this, and it evolved further into mobile computing and mobile phones. The digital world has dramatically changed our lives.

The birth of computers that can read and write is as profound as any of those steps. A small percentage of people think we may be overestimating the technology, and that is a fair concern. But in this revolution we underestimated natural language, computers' ability to process it, and the impact that would have on white-collar jobs, including sales, service, and medicine; I thought all of that was many years away.

The new phase of artificial intelligence has just begun, and we are in a phase of enthusiasm about it, just as we were once enthusiastic about the Internet; looking back now, of course, the Internet became an essential tool. This is a huge breakthrough, a milestone for the entire field of digital computing.

Kevin Scott: One thing I've been thinking about is that ever since Ada Lovelace wrote the first computer program, there has been a technical barrier to making a digital machine work for people: you have to be a skilled programmer, you have to understand the customer's needs, and then you build software to make the machine do things for you.

Now, with natural language interfaces, AI can write code that launches whole services and systems, allowing ordinary people to use machines for complex tasks without spending years learning specialized skills. What do you think about this?

Bill Gates: Every advance in technology lowers the barrier to using it. Spreadsheets are one example: you don't have to understand logic or symbolic programming, though you still need to understand formulas. There are many programs that can help you visualize company data or run complex queries to understand attrition and sales performance. You don't have to go to the IT department, wait in line, and have them tell you.

Whether it's a query, a report, or triggering a workflow or an activity, you just describe it in language and a program is generated, putting a full suite of query and programming tools in everyone's hands. Artificial intelligence is letting people interact with machines in the most direct way, and this is what we are working on right now.
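The describe-it-and-get-a-program flow usually comes down to prompt assembly: the user's plain-language request is packed together with the data's schema and handed to a code-generating model. A hypothetical sketch, in which `call_model` is a stand-in for a real model endpoint (stubbed here so the example runs), not any particular API:

```python
# Sketch: turn a natural-language request plus a table schema into a
# prompt for a code-generating model, which returns runnable SQL.

def build_prompt(schema: str, request: str) -> str:
    """Assemble the context a model needs to write one SQL query."""
    return (
        "Given this table schema:\n"
        f"{schema}\n"
        "Write one SQL query for the request below. Return only SQL.\n"
        f"Request: {request}"
    )

def call_model(prompt: str) -> str:
    # Stand-in for a real model call; a deployment would send the
    # prompt to an LLM endpoint and return its completion.
    return "SELECT region, SUM(amount) FROM sales GROUP BY region;"

schema = "sales(region TEXT, amount REAL, closed_at DATE)"
sql = call_model(build_prompt(schema, "total sales by region"))
print(sql)
```

The user never sees the prompt or the SQL; they describe the report they want and the generated query runs against the data, which is exactly the "no waiting for the IT department" shift Gates describes.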

Kevin Scott: From a personal standpoint, what excites you the most? You care deeply about areas such as education, public health, climate and sustainable energy, what impact will AI have on these areas?

Bill Gates: We've been thinking about health and education. In health systems with few doctors, where medical advice is hard to obtain, AI-enabled medical advice will be of great value. And everyone would like to have a private tutor. In some special schools, students get line-by-line feedback from teachers on their writing, but most children never receive one-on-one instruction.

I think education will be the most interesting area of application, followed by health. Of course, these technologies also present great business opportunities in sales and service. In a class of twenty or thirty, for example, a teacher cannot attend to one student individually while tracking everyone's behavior at once. Using AI for dialogue and feedback across multiple subjects could meaningfully raise the level of education.

We must admit that computers still have a long way to go in revolutionizing education. Over the next five to ten years, we need to think about learning from a new perspective and about how to help in education, not just in finding material through computers.

Kevin Scott: This is a global problem. We also see that parental involvement has a great impact on a child's education. Some parents are busy with work and find it hard to connect with their children. Imagine a technology that, whatever language you speak, can build a bridge between parents and teachers, help parents understand what is holding their children back, and even personalize a child's education to solve the problem at hand. That is very exciting.

So, what challenges do you think we will face in the next 5-10 years? What is the direction of our efforts?

Bill Gates: I think there will be a series of innovations in how algorithms are executed, and the transition of many chips from silicon to optics will reduce energy use and cost. NVIDIA currently leads here, and there will be more challengers, because everyone wants the cost of running and training models to be as low as possible. Ideally, we want models to run on-device, operating on a standalone client without going to the cloud.

There will also be significant challenges on the software side. For example, do users need a fixed version, or one that is continuously improving? Microsoft pursues both. Ideally, we want to handle the differing requirements of different domains more precisely through training data, and perhaps even with pre-check and post-check logic tailored to each.

In addition, there are many social issues, including advancing education, medical progress, and so on. Microsoft has been working to improve productivity; in the future some things will be automated, and some tasks may eventually need only one person, but that person will be able to accomplish more than ever before. The challenges and the opportunities will both be numerous. I see the team at OpenAI exploring this, and I believe many other institutions and organizations are pushing in the same direction. Technological innovation will move faster than ever, with far more people, resources, and companies aimed at this goal than ever before.

Kevin Scott: Early in my career I trained as a computer scientist, writing compilers, writing a lot of assembly language, designing programming languages, and doing graduate research on parallel optimization and high-performance computer architecture. After graduate school, I thought I would never use those things again. But today, when we build supercomputers to train models, those techniques come into play. If you were a young programmer in your twenties right now, what technologies would interest you?

Bill Gates: There's quite a bit of math in this. Luckily, I used to do a lot of math-related work, which was my gateway to programming. Some programmers have no background in mathematics, and I recommend they master some, because much of computing is not just a programming problem.

The original Macintosh was a 128K machine, 22K of which went to the bitmapped screen, and almost no one could write a usable program for it; only Microsoft and Apple succeeded. Now you're manipulating models with billions of parameters: can we skip some parameters, simplify some parameters, or precompute? On resource-constrained machines, optimization becomes especially important.
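One common form of "simplify some parameters" is quantization: storing each float weight as a small integer plus a shared scale factor, cutting storage roughly 4x at some cost in precision. A minimal stdlib-only sketch of per-tensor 8-bit quantization (real systems such as on-device inference runtimes add per-channel scales, zero points, and calibration):

```python
# Sketch: symmetric 8-bit quantization with a single per-tensor scale.
# Each float weight becomes an int in [-127, 127]; dequantizing
# recovers it to within one quantization step (the scale).

def quantize(weights: list[float]) -> tuple[list[int], float]:
    """Map floats to signed 8-bit ints sharing one scale factor."""
    scale = max(abs(w) for w in weights) / 127 or 1.0  # avoid scale=0
    return [round(w / scale) for w in weights], scale

def dequantize(q: list[int], scale: float) -> list[float]:
    """Recover approximate float weights from ints and the scale."""
    return [v * scale for v in q]

w = [0.02, -0.5, 0.33, 1.27]
q, s = quantize(w)
approx = dequantize(q, s)
# Every recovered weight is within one quantization step of the original.
assert all(abs(a - b) <= s for a, b in zip(w, approx))
```

The same trade Gates made on the 128K Macintosh, precision and generality exchanged for memory, is what makes billion-parameter models fit on a phone.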

Progress in computing acceleration has been better than expected over the past six months, but how big will the resource bottleneck be in the coming years? How do we ensure that businesses allocate these resources intelligently? In any case, almost every field of computer science, including database technology, programming technique, and more, requires us to think in a completely new way.

Kevin Scott: Finally, what do you do outside of work? We all know you love to read, often carrying books with you in a huge handbag and reading everything from science to fiction wherever you go. What is your reading rhythm?

Bill Gates: I've been playing pickleball for over 50 years, and I love tennis and reading. I've read more than 80 books in the last year, including books by Thomas Sowell, Vaclav Smil, and Steven Pinker, whose minds have shaped my thinking. At the same time, reading relaxes me. I think I should read more novels; people recommend many good ones to me, which is why I share my book list on Gates Notes.

Editor: Chen Jiajing
