
Kai-Fu Lee: 01.AI's single 2C product is expected to exceed 100 million yuan in revenue this year, with product ROI approaching 1

Author: QbitAI

Hengyu, reporting from Aofei Temple

QbitAI | WeChat official account QbitAI

A large model can recognize three people drinking coffee in a picture, which is very interesting. But what is that actually good for?

The sentiment above comes from Dr. Kai-Fu Lee, founder and CEO of 01.AI. At the company's first-anniversary press conference, he said that large models should not just show demos; they should be combined with real scenarios to unlock innovation in 2C applications:

Our models only become truly valuable when they generate value for users.

In March last year, Kai-Fu Lee announced his entry into large models under the banner of Project AI 2.0, and went on to lead 01.AI, the seventh company incubated under Sinovation Ventures' "spire" incubation model, onto the large-model track.

Today, 01.AI has been around for just over a year. Its models have launched one after another and topped leaderboards; open-source and closed-source lines are being pushed forward in parallel; its API platform is open and available globally; and its AI assistant product, Wanzhi, is available for free, including as a mini-program.

Full-stack advancement along the large-model path seems to have become the outside world's shared impression of 01.AI.

On the market side, 01.AI became a unicorn less than a year after its founding and has been called one of the "five tigers" of China's domestic large models.

Its overseas 2C products have accumulated nearly 10 million users in the nine months since launch; single-product revenue is expected to exceed 100 million yuan this year, with product ROI (input-output ratio) close to 1.

Now, as the large-model race enters its second year and competitors shift from a mad dash to a marathon, 01.AI has launched a basket of new models.


An all-new dual-track strategy across open source and closed source

At this press conference, 01.AI announced new moves on both closed-source and open-source models.

Let's start with the closed-source aspect.

The closed-source model in focus this time is Yi-Large.

In the evaluation results released by 01.AI, Yi-Large ranks first on both HumanEval and MATH for reasoning, surpassing GPT-4, Claude 3 Sonnet, Gemini 1.5 Pro, and LLaMA3-70B-Instruct (all leading models in today's large-model field).


According to third-party evaluations, Yi-Large's bilingual ability in Chinese and English is also good.


After Yi-Large comes Yi-XLarge

Yi-XLarge has started training. Although it has not been training for long, run results so far show Yi-XLarge performing better than Yi-Large.

In early training, Yi-XLarge MoE already trades wins with the latest flagship models from international vendors such as Claude-3-Opus and GPT-4-0409.


△ Evaluation of Yi-XLarge during early training (May 12, 2024)

Let's take a look at the open source model.

The Yi-1.5 open-source series is a substantial release: six models have been open-sourced at once, namely:

  • Yi-1.5-34B Base + Chat
  • Yi-1.5-9B Base + Chat
  • Yi-1.5-6B Base + Chat

On evaluation sets, Yi-1.5-34B is not inferior to 70B-parameter models and leads among models of the same size.


Kai-Fu Lee said on site that releasing a model is not the end of open source; maintaining the community is also an important part.

In the international developer community, the Yi-series models have been applied in a variety of ways, from learning, navigation, and sales to API applications and business writing, accumulating early users.

In addition, Kai-Fu Lee is proud to see many public-interest projects built on the Yi-series large models in the open-source community: "So many patients and their families facing the torment of illness can use large models to understand how to get the best treatment. It is particularly gratifying that technology can truly benefit humanity."


Finally, Kai-Fu Lee announced the global launch of the Yi API platform, covering multiple model sizes and scenarios.

He highlighted the Yi-Large API, built on the 100-billion-parameter SOTA base model. The current price is 20 yuan per million tokens, roughly one third of the price of GPT-4-Turbo. He added that those looking for higher cost-effectiveness can choose one of the more economical models shown in the second row of the pricing chart he presented.

"I hope that after today's release, there will be no reason why entrepreneurs, large companies, individual gamers, or non-profit organizations should not try to use our Yi-Large API." Kai-Fu Lee said.


TC-PMF: bringing "Technology" and "Cost" into the picture

In the heyday of the mobile Internet, PMF (Product-Market Fit) was the core goal pursued by many start-ups.

However, the advent of the era of large models has brought many changes.

There are decisive differences between the two eras at the level of entrepreneurial infrastructure:

For example, in the mobile Internet era, the marginal cost of user growth was very low.

But in the era of large models, training and inference costs form a growth trap that every startup must face: user growth requires high-quality applications; high-quality applications depend on powerful base models, which come with high training costs; and on top of that, inference costs grow with the scale of users.
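To make the contrast concrete, here is a rough, illustrative cost model; it is not from the presentation, and the symbols are assumptions introduced here:

$$
\text{Cost}(U) \;\approx\; C_{\text{train}} + c_{\text{token}}\,\bar{t}\,U
\quad\Longrightarrow\quad
\frac{d\,\text{Cost}}{dU} \;\approx\; c_{\text{token}}\,\bar{t} \;>\; 0
$$

where $U$ is the number of users, $\bar{t}$ the average tokens served per user, $c_{\text{token}}$ the inference cost per token, and $C_{\text{train}}$ the fixed training cost. In the mobile Internet era the marginal term was close to zero; with large models it stays roughly constant per additional user, which is the growth trap described above.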

Therefore, Kai-Fu Lee believes that the concept of PMF can no longer fully define AI-first entrepreneurship built on large models, and that the dimensions of Technology and Cost should be added, giving a four-dimensional concept.

This is TC-PMF.

Kai-Fu Lee said:

Achieving Technology-Cost Product-Market Fit (TC-PMF), the fit between technology cost and the product market, is hard, especially because falling inference cost is a "moving target."

This is a hundred times more difficult than traditional PMF.


He then introduced 01.AI's methodology for pursuing TC-PMF.


The first is globalization.

01.AI's goal is to be a global model company. The European and American markets went through their GPT moment last year; users there are familiar with AI applications, and commercialization is progressing rapidly.

This year, 01.AI began adopting its own SOTA base models to continuously improve the product's user experience.

According to the company, single-product revenue is expected to exceed 100 million yuan this year, with product ROI close to 1.

The second is co-building the model and its infrastructure.

01.AI believes that a model's training, serving, and inference design must fit closely with the underlying Infra architecture and model structure. Its full-stack AI Infra has implemented an end-to-end FP8 training framework at the global state of the art, and the company says it is currently the only team in China to have achieved this.

After optimizations on many fronts, the training cost of the 100-billion-parameter model has dropped by as much as a factor of two year-on-year.
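01.AI has not published the details of its FP8 framework, but the core idea of FP8 training is to keep weights and activations in an 8-bit floating-point format with per-tensor scaling, roughly halving memory and bandwidth versus BF16. Below is a minimal, illustrative sketch of per-tensor E4M3 quantization using PyTorch's float8 dtype (requires PyTorch 2.1 or later); the function names and recipe are assumptions for illustration, not 01.AI's framework.

```python
import torch

# E4M3 has a maximum finite value of 448; scale each tensor so its
# largest magnitude maps near that limit before casting to float8.
E4M3_MAX = 448.0

def quantize_e4m3(x: torch.Tensor):
    """Per-tensor FP8 (E4M3) quantization: returns the float8 tensor and its scale."""
    scale = E4M3_MAX / x.abs().max().clamp(min=1e-12)
    x_fp8 = (x * scale).to(torch.float8_e4m3fn)
    return x_fp8, scale

def dequantize(x_fp8: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    """Cast back to float32 and undo the scaling."""
    return x_fp8.to(torch.float32) / scale

# Toy example: quantize a weight matrix and measure the round-trip error.
w = torch.randn(1024, 1024)
w_fp8, s = quantize_e4m3(w)
w_roundtrip = dequantize(w_fp8, s)
rel_err = (w - w_roundtrip).norm() / w.norm()
print(f"relative round-trip error: {rel_err:.4f}")  # typically a few percent
```

A production FP8 framework layers much more on top of this (scaled matmul kernels, delayed scaling, higher-precision master weights), but the per-tensor scale-and-cast step above is the basic building block.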

The third is the integration of model and application, which simply means that the product starts from real user experience and forms a positive cycle with model iteration.

Since September last year, 01.AI has taken the lead in validating TC-PMF overseas; as soon as a model launches, it forms a user flywheel with the product.

At the same time, by using Descartes, its vector database with leading performance and recall, 01.AI's deployment cost is only 18% of the third-party solution it previously purchased.

Today's experience is this smooth and fast "partly thanks to 01.AI's RAG work and to our self-developed vector database."
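The article does not describe Descartes' internals, but the role a vector database plays in a RAG pipeline is standard: embed the user query, retrieve the most similar document chunks, and feed them to the model as context. The sketch below uses plain NumPy cosine similarity as a stand-in for the database and a hypothetical embed() function; it illustrates the flow, not 01.AI's implementation.

```python
import numpy as np

def embed(texts):
    """Placeholder embedding function; in practice this would call an
    embedding model. Returns one unit-normalized vector per text."""
    rng = np.random.default_rng(0)
    vecs = rng.normal(size=(len(texts), 128))
    return vecs / np.linalg.norm(vecs, axis=1, keepdims=True)

# 1. Index: embed document chunks once and store the vectors.
chunks = ["chunk about pricing", "chunk about FP8 training", "chunk about Wanzhi"]
index = embed(chunks)

# 2. Retrieve: embed the query and take the top-k chunks by cosine similarity.
query_vec = embed(["how is the model trained?"])[0]
scores = index @ query_vec
top_k = np.argsort(scores)[::-1][:2]
context = "\n".join(chunks[i] for i in top_k)

# 3. Generate: prepend the retrieved context to the prompt sent to the LLM.
prompt = f"Answer using the context below.\n\nContext:\n{context}\n\nQuestion: how is the model trained?"
print(prompt)
```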


The last point is AI-First.

Kai-Fu Lee said on site that application innovation needs clear answers to When, How, and Who.


What has 01.AI been sprinting on over the past year?

In late March last year, Kai-Fu Lee, chairman and CEO of Sinovation Ventures, officially announced his entry into large models under the banner of Project AI 2.0.

Three months later, Kai-Fu Lee's large-model company 01.AI made its official debut; it is also the seventh company incubated under Sinovation Ventures' "spire" model.

In November, 01.AI's first open-source models, the Yi series, were officially unveiled in two versions, Yi-34B and Yi-6B. At the time, Yi became the only domestic large model to top the Hugging Face leaderboard, with the 34B model outperforming far larger models such as Llama-2 70B and Falcon-180B.


In December, Yi-34B-Chat posted new results: in the Alpaca-verified model category, it achieved a 94.08% win rate, surpassing LLaMA2 Chat 70B, Claude 2, and ChatGPT.

Since the beginning of this year, 01.AI has been moving frequently.

In January, 01.AI delivered its multimodal large models, also part of the Yi series and also in two versions, Yi-VL-34B and Yi-VL-6B, both open-sourced to the world.

According to official test data, Yi-VL-34B scores 41.6% accuracy on the English benchmark MMMU, second only to GPT-4V at 55.7% and ahead of a series of other multimodal large models.


Then, in March, 01.AI kept accelerating.

The 9-billion-parameter Yi-9B was launched. Billed as the "top science student" of the Yi series, it shores up code and math capabilities while keeping its overall performance competitive.

In particular, Yi-9B is notably developer-friendly: both Yi-9B (BF16) and its quantized version Yi-9B (Int8) can be deployed on consumer-grade GPUs.
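As a rough illustration of what Int8 deployment on a consumer GPU might look like, here is a hedged sketch using Hugging Face transformers with bitsandbytes 8-bit loading. The model ID 01-ai/Yi-9B is assumed from the open-source release; check the official model card for the exact identifier and recommended settings.

```python
# requires: pip install transformers accelerate bitsandbytes
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "01-ai/Yi-9B"  # assumed Hugging Face model ID; verify against the official release

# Load weights in 8-bit so the ~9B-parameter model fits in roughly 10 GB of VRAM.
quant_config = BitsAndBytesConfig(load_in_8bit=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",           # place layers on the available GPU(s) automatically
    torch_dtype=torch.bfloat16,  # compute dtype for non-quantized parts
)

prompt = "Write a Python function that checks whether a number is prime."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```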


Also in March, the 01.AI open API platform debuted, offering developers three model versions, 200K context windows, and multimodal capabilities.

Then in May, just a few days ago, 01.AI officially announced Wanzhi, a one-stop AI work platform.


It can handle meeting minutes, weekly reports, and writing assistance, and can also speed-read documents and help you make PPTs.

What's more, it is billed as especially suited to Chinese office workers, a productivity tool tailored for the domestic workplace.


These days, Kai-Fu Lee asks Wanzhi around 100 questions a day to stay on top of the product experience and to report bad cases promptly.

He mentioned that OpenAI has raised a great deal of money, chasing GPUs first and thinking about applications later, whereas 01.AI's approach is to find TC-PMF with fewer chips and lower costs.

To be a great large-model company, the foundation model cannot be weak, but the company cannot be only a foundation.

Building applications also matters: getting product people to understand the model, and getting the people who understand the model to also build applications.

Of course, we believe AGI will happen, and we will help push it forward. But we are also pragmatic: our energy will not go into industry forecasting, nor will betting everything on miracles be our only focus. OpenAI can try that path, but it will not be the path we take.


One More Thing

At the end of the event, Kai-Fu Lee shared that a year ago, he voluntarily promised investors that he would not cash out within 10 years.

He said:

"I think the best way to cash out is to go public as soon as possible, which is what we are trying to do in the future."

— END —

QbitAI | Contracted creator on Toutiao

Follow us and be the first to know about cutting-edge technology trends
