
With more than 100 billion yuan in financing over 10 years, how did OpenAI become a 570-billion-yuan "monster"?

Author: Titanium Media App

The 150th issue of Titanium Degree Pictures

From ChatGPT to Sora, the generative AI products developed by the US company OpenAI have set off a new wave of AI enthusiasm.

On April 1, OpenAI announced that users could use ChatGPT for free without registering. Around the same time, OpenAI also unveiled Voice Engine, a new voice model that can generate audio nearly indistinguishable from a speaker's voice from just a 15-second sound clip, another major breakthrough that further cements OpenAI's position in the AI field.

According to statistics compiled by Titanium Media and Titanium Degree Pictures, OpenAI has raised a total of 14.3 billion US dollars (about 103.43 billion yuan) over the past ten years, and its latest valuation exceeds 80 billion US dollars (about 578.632 billion yuan), making it a veritable "monster" in the AI field. It reached this scale far faster than Amazon, Microsoft, Google, or Apple grew into trillion-dollar giants.

Behind it are plenty of top talent, abundant funding, and no shortage of large technology companies betting on AI technology.

At up to $10 million a year, OpenAI "poached" top talent from Silicon Valley tech giants

OpenAI's ability to continue to create blockbuster products is inseparable from the top team members behind it.

According to OpenAI Chief Operating Officer Brad Lightcap, OpenAI currently has about 1,200 employees, hundreds of whom work in R&D. OpenAI has reportedly attracted outstanding talent from Google and other tech giants with annual compensation packages of $5 million to $10 million, setting off a talent war.

Elon Musk, Tesla CEO and Silicon Valley's "Iron Man", complained on social media on April 4 that Tesla is raising its AI engineers' pay to stop companies like OpenAI from poaching them with "huge salaries". The competition for AI engineers, he said, "is the craziest talent war I've ever seen."

According to the statistics, as of the end of February 2023, more than half of OpenAI's 736 employees (over 368 people) had joined from other companies. Nearly 30% of them (about 110 employees) came from Silicon Valley tech giants such as Google, Meta, Amazon, Uber, and Microsoft. Google contributed the most talent to OpenAI, with 59 former employees, followed by Meta with 34 and Apple with 15.

For example, OpenAI co-founder Ilya Sutskever came from the Google Brain team, where he worked on the open-source deep learning framework TensorFlow, while Andrej Karpathy, one of OpenAI's founding members and an AI researcher at the company, came from Tesla. In addition, many of the "inventors" of ChatGPT came from Google's AI teams, including Barret Zoph, Liam Fedus, Luke Metz, and Rapha Gontijo Lopes.


From ChatGPT to Sora, products are difficult to replicate

For OpenAI, the past decade has been an important decade of ups and downs.

In 2018, OpenAI released GPT, a generative pre-trained language model that can produce articles, code, machine translations, Q&A, and other content. Over the past six years, OpenAI's GPT models have evolved from version 1.0 to version 4.0, popularizing generative AI technology worldwide and triggering a global boom.

In addition, from DALL-E and Whisper to the Sora video generation model, OpenAI's products have drawn unprecedented attention.

In fact, behind OpenAI's continued leadership in large models lies its core belief in the Scaling Law: the more parameters a model has, the more compute it uses, and the more data it is trained on, the better the model performs.
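As a rough illustration of what such a scaling law looks like (the constants below are made up for demonstration and are not OpenAI's actual figures), the idea is usually modeled as a power law: loss falls predictably as model size grows.

```python
# A minimal sketch of a power-law "scaling law", with illustrative constants.
# Real scaling-law studies fit the critical size and exponent empirically;
# the values below are NOT OpenAI's actual numbers.

def predicted_loss(num_params: float, n_critical: float = 8.8e13, alpha: float = 0.076) -> float:
    """Toy scaling law: loss ~ (N_c / N)^alpha, so more parameters -> lower loss."""
    return (n_critical / num_params) ** alpha

# Compare a 175-billion-parameter model with a hypothetical 1.8-trillion-parameter one.
for n in (1.75e11, 1.8e12):
    print(f"{n:.2e} parameters -> predicted loss {predicted_loss(n):.3f}")
```

The point of the sketch is only the shape of the curve: each order-of-magnitude increase in scale buys a smaller but predictable drop in loss, which is why ever-larger models demand ever-larger budgets.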

With more than 100 billion yuan in financing over 10 years, OpenAI was "fattened up" by Microsoft

Whether it is GPT-3.5 with its 175 billion parameters or GPT-4 with a reported 1.8 trillion parameters, enormous amounts of data, computing power, and algorithmic work are required behind the scenes.

According to various industry estimates, a single training run of GPT-4 required about 25,000 NVIDIA A100 GPUs, at a cost of as much as $63 million. For the newer Sora model, inference alone is estimated to require up to 720,000 NVIDIA H100 GPUs, while training is estimated to take 4,000 to 10,500 H100 GPUs running for a month, at a cost of hundreds of millions of dollars.
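As a back-of-the-envelope sketch of how such cost estimates are typically assembled (the training duration and hourly GPU rate below are assumptions for illustration, not confirmed OpenAI figures), the arithmetic is simply GPUs × hours × price per GPU-hour:

```python
# Back-of-the-envelope training-cost estimate: GPUs x hours x hourly rate.
# All inputs are illustrative assumptions, not confirmed OpenAI figures.

def training_cost(num_gpus: int, days: float, usd_per_gpu_hour: float) -> float:
    """Total rental cost in USD for num_gpus running continuously for the given days."""
    return num_gpus * days * 24 * usd_per_gpu_hour

# Example: 25,000 A100s for an assumed ~90 days at an assumed ~$1.1 per GPU-hour
cost = training_cost(num_gpus=25_000, days=90, usd_per_gpu_hour=1.1)
print(f"Estimated training cost: ${cost / 1e6:.0f} million")
```

Under these assumed inputs the estimate lands in the same tens-of-millions range as the $63 million figure cited above, which is why small changes in GPU count or training time swing such estimates by tens of millions of dollars.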

Therefore, OpenAI needs huge amounts of capital to sustain this "brute force" approach to building GPT models.

As of April 2, 2024, OpenAI had raised a total of $14.3 billion in the ten years since its founding, with investors including Microsoft, Sequoia Capital, and other well-known institutions and companies.

Among them, Microsoft has invested a total of $13 billion in OpenAI across three rounds over the past five years: OpenAI received its first $1 billion investment from Microsoft in 2019, and in 2021 and 2023 Microsoft invested a further $12 billion in total.

Reportedly, beyond the investment itself, Microsoft and OpenAI have formed an even closer partnership: all of OpenAI's technology runs on Microsoft's Azure cloud servers, and Microsoft provides OpenAI with a large-model computing center. Microsoft has also secured benefits for itself in terms of profits and equity: until its investment is recouped, Microsoft is entitled to 75% of OpenAI's profits, and after that, Microsoft will hold 49% of OpenAI's shares.

In addition, Microsoft has integrated OpenAI's technology into its Bing search engine, marketing software, GitHub coding tools, and Microsoft 365 office suite to build out an AI service ecosystem.

In effect, Microsoft's tens of billions in investment amount to renting OpenAI, and once OpenAI starts making money, Microsoft shares in the profits.

Now, however, as demand for large-model computing power keeps rising, tens of thousands of NVIDIA H100 GPUs are no longer enough for OpenAI. Microsoft and OpenAI were recently reported to be planning an AI supercomputing data center called "Stargate", costing more than $115 billion and launching as early as 2026, while OpenAI CEO Sam Altman has also reportedly sought as much as $7 trillion to build an AI semiconductor network.

Core technical theory, a top-tier team, and the backing of capital and computing power have enabled OpenAI to keep launching blockbuster products, demonstrating AI's seemingly limitless possibilities and reshaping industries across the board.

OpenAI is reported to be releasing GPT-5 as early as June this year, which is expected to trigger a new round of the global AI arms race.

(This article was first published on the Titanium Media App. Drawing by Chu Yanmo; edited by Liu Yaning and Lin Zhijia; planned and produced by the Titanium Media Visual Center.)
