
Artificial Intelligence Industry In-Depth Report: Industry applications have reached the eve of the outbreak

(Report producer/author: Zheshang Securities, Liu Wenshu)

1 Core view: AI vendors lead industrial transformation, and the wave of industry applications is at hand

Overseas vendors have recently released earnings reports disclosing their progress on AI commercialization. Led by Microsoft, Google, Meta and Amazon, overseas AI vendors are actively embracing the commercialization wave across large models, cloud computing, search and advertising, and productivity tools (office, code, CRM and other scenario applications), driving a new round of AI-led industry transformation. Microsoft and Salesforce have successively announced business models and pricing plans for generative AI, including Microsoft 365 Copilot and GitHub Copilot. Based on our calculations, we believe large AI models can lift user penetration as well as paid volume and price on the back of improved product competitiveness, bringing a meaningful marginal contribution to revenue. Looking at China, we expect AI applications to benefit from a resonance of policy and demand in the second half of the year, with products blossoming across the board. On the one hand, regulatory policy for large models continues to improve and full commercialization is approaching: on July 13, the Cyberspace Administration of China, together with six other departments, issued the Interim Measures for the Administration of Generative Artificial Intelligence Services, which aim to promote the healthy development and standardized application of generative AI, safeguard national security and the public interest, and protect the lawful rights and interests of citizens, legal persons and other organizations; the measures took effect on August 15. On the other hand, domestic software vendors are expected to release large-model application products intensively in the second half of the year: at least five listed companies held AI product launches in late June and early July, indicating that software companies have identified government and enterprise customers' needs around large models and have begun preparation and planning.

2 AI commercialization has driven the growth of the cloud computing industry, and industry giants continue to consolidate barriers

2.1 The scale of the global cloud computing market continues to grow, and AIGC continues to enrich application scenarios

The global cloud computing market is approaching $500 billion and is expected to benefit from accelerating AI commercialization demand. According to T4.ai, the global cloud computing market reached $492 billion in 2022, up 21.2% year-on-year, and is expected to keep growing to $663 billion by 2024, a compound annual growth rate of 18.13% from 2018 to 2024. Structurally, public cloud remains the mainstream, and the public cloud SaaS, PaaS and IaaS segments are all expected to keep growing. T4.ai data show the global public cloud market reached $364 billion in 2022, 73.98% of the overall cloud computing market; within the public cloud, SaaS dominates, with the segment expected to reach $169 billion by 2024.


The global cloud computing market shows a three-way competitive landscape, with Microsoft and Google continuing to gain share. Synergy Research Group data show that in 2023Q1 Amazon, Microsoft and Google held the top three global cloud market shares, at 32%, 23% and 10% respectively. Over the long term, market concentration keeps rising, with leading vendors including Amazon, Microsoft, Google, Alibaba and IBM continuing to gain share. In terms of the competitive landscape, Amazon AWS's leadership remains solid, but Microsoft and Google continue to gain share and may challenge Amazon's leading position in the future.

The cloud computing business is expected to keep benefiting from AI commercialization. On the demand side, AI large-model applications will greatly stimulate enterprises' demand for computing power and data storage. For cloud vendors themselves, large AI models create opportunities to offer more diversified services such as model training, deployment, data analysis and prediction, while also allowing vendors to optimize their own computing power, storage and network security through AI. In downstream application scenarios, AI plus cloud computing has broad application space in the Internet of Things, chatbots and business analytics.

Generative AI is driving explosive growth in computing, data processing and storage, and leading vendors are deploying intensively on the AI-plus-cloud track. Amazon AWS, Google GCP and Microsoft Azure have successively launched foundation models, model inference optimization, fine-tuning and database services on top of their public cloud platforms, competing to attract enterprise users and developers and to build out the AI-plus-cloud business ecosystem. Against the backdrop of the AI commercialization wave, large vendors will benefit from their AI infrastructure advantages, continue to expand cloud services around large AI models, and drive a competitive landscape of ever-rising concentration.

Research shows that combining AI technology with cloud services can greatly improve enterprise efficiency. In April 2023, a Forrester Consulting study based on customer interviews and financial analysis found that Azure AI services can deliver a 284% ROI for enterprise users: over a three-year project period, enterprise users realize roughly $31.5 million in total benefits (including recovery of the original outlay) on an investment of $8.2 million. Azure AI services contribute economic benefits across five areas: business growth, cost optimization, automation of manual processing, operational efficiency, and retirement of legacy technologies.

The results show that AI-plus-cloud services can improve enterprise performance across multiple dimensions. Forrester Consulting's research shows that companies can recover their investment in as little as six months, and can achieve a 150% increase in productivity, a 7% reduction in cost expenditure, and a 60% reduction in internal document-processing errors. AI combined with cloud services is also expected to deliver significant value in decision-making, team collaboration and creativity, and risk identification.
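As a rough consistency check of the Forrester figures above, a minimal sketch of the arithmetic (values taken directly from the cited study):

```python
# Consistency check: a 284% ROI on an $8.2m investment implies roughly
# $31.5m of total realized benefits (net benefit plus recovery of the
# original outlay) over the three-year project period.
investment = 8.2    # $ million
roi = 2.84          # 284%

net_benefit = investment * roi            # ≈ $23.3m
total_benefit = investment + net_benefit  # ≈ $31.5m
print(f"Net benefit: ${net_benefit:.1f}m, total benefits: ${total_benefit:.1f}m")
```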

Cloud vendors are also entering the chip arena, focusing on AI acceleration chips as core AI-plus-cloud infrastructure. Since 2018, Amazon has successively launched three families of self-developed chips, Graviton, Trainium and Inferentia, to serve customers within its own cloud computing ecosystem. Google and Microsoft are also developing their own AI acceleration and server chips, expected to be released in 2024-2025, to consolidate the core competitiveness of their cloud services.

2.2 Amazon: The world's leading cloud computing vendor, stepping up its AI business layout

The company's cloud computing business has long maintained its industry-leading position, with stable growth momentum. In 2022, Amazon's AWS business generated revenue of $80.096 billion, up 28.8% year-on-year, a compound annual growth rate of 35.62% from 2017 to 2022. In a single quarter, AWS generated operating income of $22.140 billion in 2023Q2, up 12.2% year-on-year and 3.68% quarter-on-quarter.


The profitability of Amazon's AWS business remains healthy, capital expenditure continues, and the AI-plus-cloud business is expected to drive further improvement in profitability. In 2022, AWS operating profit was $22.841 billion, up 23.3% year-on-year, a compound annual growth rate of 39.45% from 2017 to 2022. Capital expenditure continues, and the AI-plus-cloud business is expected to consolidate Amazon's global cloud leadership: capital expenditure attributable to AWS reached $27.755 billion in 2022, up 25.9% year-on-year, a compound annual growth rate of 24.74% from 2017 to 2022. Looking ahead, while the company has guided for slower capital expenditure growth for the full year, it will maintain capital expenditure investment in the AWS business.

AWS has significant advantages in SaaS and PaaS infrastructure, and in recent years has continuously enriched its business ecosystem through ecosystem partners. In 2020, Amazon launched Amazon AppFlow, an automated data-flow service; in 2021, it announced that customers can subscribe to SaaS products directly through the Marketplace; and AWS SaaS Boost, launched in 2022, helps vendors build a smooth path to move traditional software to the cloud. AWS PaaS provides a professional, complete aPaaS platform for continuously building and iterating next-generation applications, including core capabilities such as low-code/no-code application platforms, intelligent BPM, and iPaaS data integration. AWS PaaS has provided end-to-end deployment services to customers at more than 70 listed companies across 15 industries, becoming a platform cornerstone supporting different phases of digital transformation.

On April 13, 2023, AWS launched Amazon Bedrock, which provides developers with foundation large language models for building and scaling generative AI applications. Amazon Bedrock can be used to generate text, images, audio and synthetic data. Users can access the Titan family of foundation models developed by AWS as well as large models from vendors such as AI21 Labs, Anthropic and Stability AI. At the same time, the company introduced Amazon CodeWhisperer, a generative AI programming tool that takes over large amounts of undifferentiated coding work, allowing developers to build applications faster and focus on more creative development. According to official information, the tool supports Python, Java, JavaScript, TypeScript and C#, as well as ten new languages including Go, Kotlin, Rust, PHP and SQL.
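To make the developer-facing side of Bedrock concrete, below is a minimal sketch of invoking a text model through boto3's bedrock-runtime client. The region, model ID and request schema are assumptions to be verified against the AWS documentation and your account's model access; this is an illustration, not the report's own example.

```python
# Minimal sketch of invoking a Bedrock text model via boto3 (assumes a boto3
# version that ships the "bedrock-runtime" client and Bedrock model access).
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")  # assumed region

body = json.dumps({
    "inputText": "Summarize the benefits of managed foundation-model services.",
    "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.5},
})

response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed Titan model ID, verify in the console
    contentType="application/json",
    accept="application/json",
    body=body,
)

result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```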

AWS said on July 26 that Bedrock, the AI platform released in April, has been trialed by thousands of customers, including Ryanair and Bridgewater Associates, the world's largest hedge fund. Amazon also added two large language models from AI company Cohere to Bedrock, as well as the latest models from Stability AI and Anthropic. In addition, Amazon is offering a preview of Agents for Bedrock, a ChatGPT-like chatbot capability.

Amazon is laying out both chips and cloud computing on the AI training and inference side, with large room for future business growth. The company disclosed in its earnings call that it began developing custom AI chips for training (Trainium) and inference (Inferentia) several years ago, aiming to give customers who build and run large language models an attractive, cost-effective option. The company believes that in the future a large share of large language model training and inference will run on AWS's Trainium and Inferentia chips. Going forward, the company plans to democratize access to generative AI, reduce the cost of training and running models, enable companies of all sizes to easily customize large language models, and build generative AI applications in a secure, enterprise-grade manner.

2.3 Google: A latecomer catching up, with cloud services expected to become a new growth pole

Google Cloud started later, but revenue has maintained rapid growth. In 2022, Google Cloud generated revenue of $26.280 billion, up 36.8% year-on-year, a compound annual growth rate of 45.66% from 2018 to 2022. In 2023Q2, Google Cloud generated revenue of $8.031 billion, up 28.0% year-on-year and 7.74% quarter-on-quarter, and the cloud business has maintained rapid growth over the past three years.

Google Cloud's profitability continues to improve. In 2023Q2, Google Cloud recorded operating profit of $395 million, up 106.8% quarter-on-quarter. Since the beginning of this year, the cloud business has moved out of loss-making territory, profitability has improved significantly, and it is expected to become a new growth pole for the company beyond search and advertising.


The company remains committed to capital expenditure, and its cloud business is expected to accelerate to close the gap with Amazon and Microsoft. In 2023Q2, Google's capital expenditure was $6.888 billion, up 0.9% year-on-year and 9.53% quarter-on-quarter. Google entered the cloud business later than Microsoft Azure and Amazon AWS; according to company guidance, capital expenditure will keep rising in the second half of 2023 and in 2024, with investment focused on data centers, GPUs and TPUs, and as the company continues to iterate its large AI models and enrich its product and application matrix, it is expected to build a sound AI application ecosystem. With many years of AI experience, Google Cloud has strong capabilities in AI and machine learning: Google launched TensorFlow in 2015, AutoML in 2018 and Vertex AI in 2021, building out models, tools and platforms across the AI and machine learning stack. For developer-facing products, the Google Cloud platform offers a fairly complete service matrix, including Speech-to-Text for speech transcription, Vision AI for image recognition, and Video AI for video understanding. For AI solutions, current products include Contact Center AI, an intelligent customer-service solution, and Discovery AI for Retail, a product-search solution.

Google's cloud business relies on Google Cloud Platform (GCP) services and the Google Workspace collaboration suite, continuously building out its ecosystem with a flexible business model. GCP builds dynamic profiles of different customers' IaaS resource needs to serve long-tail demand and achieve continuous customer acquisition. Google Workspace offers basic features for free, with premium features sold as subscriptions at tiered prices. Google Workspace provides four subscription plans, Business Starter, Business Standard, Business Plus and Enterprise, to meet enterprise customers' different needs, including private deployment.

Looking ahead, GCP is expected to build competitiveness in cloud computing by complementing its ecosystem and differentiating its offering. On the ecosystem side, GCP is expected to strengthen its service capabilities for vertical industries, develop ecosystem partnerships, and establish cooperation with companies downstream in the cloud services value chain. In terms of differentiated competition, Google's multi-cloud deployment and deep application of AIGC are gradually becoming points of differentiation. Led by the AI trend, GCP will continue to benefit from AI technology and consolidate its multi-cloud advantages to differentiate itself from other cloud vendors.

2.4 Microsoft: Azure is deeply tied to OpenAI, and the "AI + data + cloud computing" strategy holds promise

Microsoft's cloud revenue continues to grow, with cloud products such as Azure and Dynamics providing multiple growth engines. In fiscal 2023, Microsoft's Intelligent Cloud segment generated revenue of $87.907 billion, up 16.8% year-on-year, a compound annual growth rate of 22.23% from fiscal 2018 to fiscal 2023. According to the company's financial report, Microsoft Cloud revenue in fiscal 2023 was $111.6 billion, up 22.1% year-on-year, with a compound annual growth rate of 33.22% from fiscal 2018 to fiscal 2023. In fiscal 2023, Microsoft Cloud revenue exceeded 50% of the company's total revenue for the first time, and AI plus cloud computing has become the company's core business.

In a single quarter, Intelligent Cloud revenue growth continued to slow, while profitability is expected to recover amid AI commercialization. In FY23Q4, the segment generated revenue of $23.993 billion, up 13.8% year-on-year, and year-on-year revenue growth has slowed continuously since FY22Q3. FY23Q4 Intelligent Cloud operating profit was $10.526 billion, up 19.5% year-on-year, and profitability has rebounded. We believe that with Microsoft's cloud business deeply tied to OpenAI, it is well placed to benefit from the landing of commercial AI applications and to achieve improvement in both scale and profitability.

Microsoft continues to invest in capital expenditure and AI infrastructure to deepen its layout in AI. Microsoft's capital expenditure in fiscal 2023 was $28.107 billion, up 17.7% year-on-year, a CAGR of 19.30% from fiscal 2018 to 2023. In a single quarter, capital expenditure in FY23Q4 was $8.943 billion, up 30.2% year-on-year and 35.4% quarter-on-quarter, showing accelerating growth amid the AI commercialization wave. As the traditional and AI businesses continue to scale, Microsoft plans to keep increasing investment in data centers, CPUs and GPUs, and we expect FY24 capital expenditure to keep rising sequentially as quarterly demand changes.

Azure is deeply tied to OpenAI and provides large-model services such as GPT. On March 21, 2023, Microsoft launched the enterprise Azure OpenAI ChatGPT service based on Microsoft Azure; Azure users can use OpenAI's large models (including DALL·E 2, GPT-3.5 and Codex), and customers and developers across industries can directly integrate the experience brought by ChatGPT into their own business systems or apps. The deep tie-up between Azure and OpenAI is expected to accelerate business growth.
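As an illustration of how an application can call an Azure OpenAI deployment, here is a minimal sketch using the pre-1.0 openai Python library. The resource name, API key, deployment name and API version are placeholders or assumptions to be replaced with your own Azure resource settings.

```python
# Minimal sketch of calling an Azure OpenAI chat deployment (openai < 1.0 style).
import openai

openai.api_type = "azure"
openai.api_base = "https://<your-resource>.openai.azure.com/"  # placeholder
openai.api_version = "2023-05-15"           # assumed API version
openai.api_key = "<your-azure-openai-key>"  # placeholder

response = openai.ChatCompletion.create(
    engine="gpt-35-turbo",  # your Azure deployment name, not the raw model ID
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Draft a one-sentence product announcement."},
    ],
    temperature=0.7,
)

print(response["choices"][0]["message"]["content"])
```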

On July 19, Microsoft held its Inspire 2023 conference, and Azure's cloud-plus-AI capabilities received another major update. New features in Azure include vector search, the Whisper audio model, the Llama 2 language model, Custom Neural Voice (CNV), and more. Vector search, a key technology for enhancing generative AI capabilities, is currently in preview; the Whisper speech model efficiently transcribes 57 commonly used languages and will soon be available in Azure; Meta's recently open-sourced Llama 2 language model can now be deployed on Azure in 7-billion-, 13-billion- and 70-billion-parameter versions and is in preview. Custom Neural Voice helps users synthesize speech in angry, happy, fearful, serious and other styles, suitable for business scenarios such as audiobooks, news broadcasts, language learning and voice assistants.
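To illustrate what vector search contributes in generative AI scenarios (retrieving semantically relevant data to ground model answers), below is a minimal, self-contained sketch of embed-and-rank retrieval. The toy embed() function stands in for a real embedding model or service and is not Azure's API.

```python
# Illustrative vector-search mechanics: embed documents and a query, rank by
# cosine similarity. embed() is a toy stand-in for a real embedding model.
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy deterministic 'embedding' for illustration only."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

documents = [
    "Azure adds vector search in preview.",
    "Llama 2 models are available for deployment.",
    "Custom Neural Voice synthesizes styled speech.",
]
doc_vectors = np.stack([embed(d) for d in documents])

query = "Which feature helps ground generative AI answers in my own data?"
q = embed(query)

scores = doc_vectors @ q          # cosine similarity: vectors are unit-normalized
best = int(np.argmax(scores))
print(f"Top match: {documents[best]!r} (score={scores[best]:.3f})")
```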

Model training and inference services under the MaaS model can significantly increase the value Azure extracts per customer. According to Microsoft's Azure website, Azure OpenAI services are billed by the number of tokens processed and by the rental time of the hosted inference engine. Referring to Microsoft's Azure materials, consider a consumer e-commerce app with 100,000 monthly active users, each interacting an average of 10 times per month and processing about 250 English words per interaction (assuming 1,000 tokens corresponds to roughly 750 words), and renting Azure's hosted engine for 730 hours per month: the app would pay the Azure OpenAI service about $1,030 per month, or roughly $12,360 annualized.
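The volume arithmetic behind this example can be sketched as follows. Only the usage figures and the annualization of the roughly $1,030 monthly fee come from the report; the per-token and per-hour prices below are placeholders, not Azure's published price list.

```python
# Back-of-the-envelope check of the Azure OpenAI pricing example above.
monthly_active_users = 100_000
interactions_per_user = 10
words_per_interaction = 250
tokens_per_word = 1000 / 750          # report's assumption: 1,000 tokens ≈ 750 words
engine_hours_per_month = 730

monthly_tokens = (monthly_active_users * interactions_per_user
                  * words_per_interaction * tokens_per_word)
print(f"Tokens processed per month: {monthly_tokens:,.0f}")   # ≈ 333 million

price_per_1k_tokens = 0.002    # USD, placeholder
price_per_engine_hour = 0.50   # USD, placeholder
estimated_monthly_cost = (monthly_tokens / 1000 * price_per_1k_tokens
                          + engine_hours_per_month * price_per_engine_hour)
print(f"Estimated monthly cost at placeholder prices: ${estimated_monthly_cost:,.0f}")

reported_monthly_fee = 1_030   # USD, from the report
print(f"Annualized fee at the reported rate: ${reported_monthly_fee * 12:,}")  # $12,360
```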


With the release of the Microsoft Fabric platform, Azure continues to advance its "cloud + AI + data" strategy. Launched in May 2023, Fabric integrates technologies such as Azure Data Factory, Azure Synapse Analytics, and Power BI into a unified offering that lets data and business professionals better explore enterprise data. Fabric has a built-in multi-cloud data lake, "OneLake"; all Fabric workloads are automatically connected to OneLake, and data can be organized in an intuitive data hub and automatically indexed for discovery, sharing, governance and compliance management. OneLake serves developers, business analysts and business users alike, eliminating data silos and providing a single unified storage system for all developers, governed through centrally enforced policies and security settings.

Fabric incorporates Azure OpenAI services at every layer to help customers unlock the full potential of their data. Developers can leverage AI to analyze data and help business users gain insights. Users can use Copilot in Fabric to create dataflows and data pipelines, generate code and complete functions, build machine learning models, or visualize results in conversational language. Microsoft disclosed on its fiscal 2023 earnings call that more than 8,000 users have signed up to try the service, and more than 50% of them use no fewer than four workloads.

More than 11,000 organizations currently use the Azure OpenAI service (up from more than 2,500 in FY23Q3), and data usage across the company's various AI products and applications has surged. According to the company, Azure-related remaining performance obligation (RPO) reached an all-time high of $224 billion, of which 45% is expected to be fulfilled within 12 months. On guidance, the company expects FY2024Q1 Intelligent Cloud revenue of $23.3-23.6 billion, up 15%-16% year-on-year.

3 AI reshapes the search engine and advertising ecosystem and brings innovation to the user experience

3.1 The search engine market structure is stable, and AI empowers accurate matching of search and advertising

The global search engine market structure remains stable, with Google in the leading position. Statcounter data show that in June 2023 Google held 92.64% of the global search engine market, with Bing second at 2.77%. In the Chinese market, Baidu still ranks first in omnichannel search engine share, at 61.47% in June, followed by Bing at 14.44%. We believe the penetration of generative AI in search engines is still at an early stage and a significant shift in the search market structure is unlikely in the short to medium term, but commercial monetization is expected in areas such as enterprise knowledge retrieval and advertising.

The global search engine and advertising markets keep growing. According to Business Research Insights, the global search engine market was $167.02 billion in 2021 and is expected to grow to $348.8 billion by 2028, a compound annual growth rate of about 11.09% over 2021-2028. In advertising and marketing, the key monetization channel for search engines, Precedence Research expects the global digital advertising market to reach $1.25 trillion by 2030, a compound annual growth rate of about 9.22% from 2021 to 2030. As the main scene for digital advertising, search engines are expected to benefit from advertisers' rising willingness to spend, and the market will keep expanding.

Microsoft, Google and other vendors have successively introduced large AI models into the search engine ecosystem, triggering a new round of change in how information is retrieved. Microsoft's GPT-based search engine New Bing keeps iterating and optimizing the user experience. On February 8, 2023, Microsoft announced the launch of New Bing with ChatGPT functionality; on March 15 it announced access to GPT-4 capabilities; and on May 4 it announced that Bing Chat had entered Open Preview, i.e. open to all users, with the New Bing experience continuously improving and the user ecosystem expanding faster. Microsoft Chief Marketing Officer Yusuf Mehdi said in early May that since launch, the new Bing AI had handled more than 1 billion chats, Bing's text-to-image feature had created more than 200 million images, and Bing daily active users had exceeded 100 million. New Bing chat continues to iterate toward multimodal and multifunctional use. The May 4 update presented four core highlights: plain-text chat evolving into multimodal answers, multi-language support for Bing Image Creator, chat history, and plugin support. New Bing can retrieve images from the web based on the user's text input, explain the image sources in text, and include videos in answers; for chat history, users can pause chats and resume them later, and can export chat history locally or share it to social platforms.

Google launched the Search Labs program to provide a generative search experience with more accurate responses. On May 25, 2023, Google announced open access to Search Labs, its AI search program, to test SGE (Search Generative Experience), a new search feature embedded with generative AI. Unlike New Bing, SGE keeps Google's traditional input field: after the user asks a question, the AI-generated results appear near the top of the page, in a shaded area below the search bar and above the traditional search results. The snapshot contains "key information to consider, as well as links to dig deeper", along with a button at the top right of the AI results to expand the snapshot and cards showing the source articles. Google also opened access to other Search Labs features, including Code Tips and Add to Sheets. Code Tips harnesses large language models to provide guidance for writing code, allowing developers to ask introductory questions about programming languages (C, C++, Java, JavaScript, Python, etc.), related tools (Docker, Git, shell) and algorithms. Add to Sheets lets users insert search results directly into Google Sheets.

3.2 Google: The leading position of search engines is solid, and AI+ advertising empowers high-quality business development

Google's search and advertising business is its foundation, and empowerment by large AI models is expected to ease macro pressure and return the core business to growth. In 2023Q2, Google's search business generated revenue of $42.628 billion, up 4.8% year-on-year and 5.6% quarter-on-quarter, while Google's advertising business revenue in the quarter was $58.143 billion, up 3.3% year-on-year and 6.6% quarter-on-quarter. In 2022, boosted by macro factors stimulating online demand, the search and online advertising businesses grew rapidly, but growth slowed significantly in 2023; going forward, Google is expected to use large-model technology to achieve precise matching in search and precision marketing in advertising, delivering high-quality business growth.


Analysis of advertising business indicators shows that brands' willingness to spend has picked up, and demand for precision advertising should grow stronger. Among the core advertising metrics disclosed in the company's financial report, Paid Clicks benefited from the growth of Google Search on mobile devices in 2023Q2, while Cost-per-click (CPC) and Cost-per-impression (CPI) improved from 2023Q1, pointing to a recovery in advertiser willingness over the coming quarters.

The company is actively deploying AI in Google search and advertising scenarios. Google uses AI and related technologies to optimize ad budget control, Smart Bidding, precision targeting, Quality Score and more. Budget control helps advertisers allocate ad resources intelligently, Smart Bidding helps advertisers obtain reasonable ad pricing, AI-powered precision delivery reaches specific target user groups, and machine learning models also play a core statistical and ranking role in Quality Score. On the creative side, Google opened the Automatically Created Assets (ACA) beta to all advertisers; ACA automates asset creation, reducing manual effort and increasing creative freshness.

Large AI models are expected to help advertisers achieve precision marketing, driving high-quality development of Google's advertising business. Several product teams within Google are deploying the PaLM 2 language model to help advertisers quickly create media content and to suggest to YouTube video creators what content to produce. In automated advertising and advertising-related customer services, the Google team will embed generative AI in more business areas to expand revenue and improve profit margins, achieving high-quality business development. Google's application of the Flamingo model on YouTube to generate descriptions for short videos is expected to improve the user experience and enable precision marketing. In May 2023, Google DeepMind announced that its visual language model Flamingo is generating descriptions for YouTube Shorts, enabling videos to be categorized and search results matched to user queries. The model captures the auditory and visual components of short-form content and transforms them into summaries that are easy for users to search and access.

Google's Demand Gen campaigns product connects advertising scenarios such as Gmail, YouTube and search, providing advertisers with more precise marketing recommendations. Demand Gen campaigns help advertisers reach new target audiences, match AIGC-tailored creative portfolios with those audiences to deliver the most relevant messages, and measure brand lift, search lift and conversions.

3.3 Microsoft: Enterprise search business is expected to contribute new revenue increments

Microsoft's fiscal 2023 annual report shows that growth in the search and advertising business continued to slow. In 2023Q2, Microsoft's search and advertising revenue was $3.012 billion, up 2.9% year-on-year, and the single-quarter year-on-year growth rate has slowed continuously since 2022Q3. We believe that with GPT large models embedded in the search engine and the commercialization of services such as Bing Chat Enterprise, AI is expected to add new momentum to Microsoft's search and advertising business. The Bing user base keeps growing while ARPU remains low: Bing users reached 1.2 billion in 2022, up 9.1% year-on-year, and ARPU recovered to $9.66 in 2022 after an earlier decline, up 24.6% year-on-year.

Powered by large AI models, Microsoft's search engine market share is rising and may challenge Google's dominance in the future. Statcounter data show that in April 2023 Bing surpassed Baidu to take the top share of China's desktop search engine market; in June, Bing's share was 32.28% versus Baidu's 26.21%. In the global desktop search engine market, Google still holds absolute dominance, but Bing is trending upward, with an 8.74% share in June 2023, up 1.93 percentage points month-on-month, while Google's share was 83.94%, down 3.71 percentage points month-on-month. Enabled by large AI models, New Bing is expected to attract more users and accelerate growth in the company's search engine and advertising business.

New Bing Chat has begun commercialization, and the enterprise business is expected to contribute new revenue. On July 18, 2023, Microsoft launched Bing Chat Enterprise, which is included at no extra cost in Microsoft 365 E3, E5, Business Standard and Business Premium, and will later be available as a standalone product for $5 per enterprise user per month. Bing Chat Enterprise works on web data and improves the accuracy and efficiency of business users' information retrieval by providing complete, verifiable, cited answers as well as visual answers including graphics, charts and images.

With nearly 400 million Microsoft 365 enterprise seats, Bing Chat Enterprise has considerable revenue potential. Assuming Bing Chat Enterprise charges $5 per enterprise user per month, and taking the 382 million seats disclosed at the end of FY23Q3 as the benchmark, each 1 percentage point of penetration of the AI-powered enterprise search service would add about $229 million of revenue to Microsoft's search and advertising business. Considering that Microsoft 365 enterprise penetration will keep rising, we believe the search engine business is expected to lift user ARPU significantly and accelerate growth of the search business.
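The sensitivity quoted above reduces to a one-line calculation using the report's inputs:

```python
# Bing Chat Enterprise incremental revenue at 1% penetration of the seat base.
seats = 382_000_000            # Microsoft 365 enterprise seats, end of FY23Q3
price_per_user_month = 5       # USD, standalone Bing Chat Enterprise price
penetration = 0.01

annual_increment = seats * penetration * price_per_user_month * 12
print(f"${annual_increment:,.0f}")   # ≈ $229,200,000
```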

4 AI+ Office: A New Revolution in Productivity Tools

4.1 Generative AI-enabled productivity has been significantly improved

Generative AI has the potential to reshape business productivity, with great potential in software development, customer operations, marketing and more. According to McKinsey research, generative AI can improve both the quality and the efficiency of content generation in specific business tasks and thereby add revenue for enterprises, for example by using AIGC to generate creative content in marketing scenarios. McKinsey estimates that 63 AIGC use cases across the 16 business functions it examined could create $2.6-4.4 trillion of economic value per year; if AIGC technology eventually penetrates all productivity links, the potential economic value could reach $6.1-7.9 trillion, a broad market. By business scenario, generative AI is expected to be deployed at scale in software development, sales and other functions: McKinsey finds that in customer operations, marketing and sales, and software engineering and R&D, spending on generative AI could account for more than 75% of the functional spend, with investment scale exceeding $400 billion, and enterprises are also expected to deploy AIGC applications at scale in supply chain management, product development, risk and compliance, and finance and legal.


Research institutions have shown that generative AI can significantly improve user productivity and output quality. In a paper recently published in Science, MIT researchers demonstrated, through controlled experiments with 453 college-educated professionals, that using ChatGPT significantly increases productivity, improves task quality and reduces task completion time. Compared with the control group, the experimental group's overall scores, writing quality, content quality and originality all improved significantly. ChatGPT was particularly effective for people with weaker writing and communication skills, and can improve users' competitiveness in the labor market by providing creative content and generation skills.

Taking marketing as an example, AI has become an important productivity tool integrated across all aspects of the business. According to HubSpot data, the potential market for AI applications in global marketing is estimated to reach $107.54 billion by 2028, a compound annual growth rate of 31.47% over 2020-2028, with the market expected to grow rapidly. BCG's report on generative AI applications shows that 70% of respondents have incorporated AIGC technology into their marketing operations, including but not limited to personalized content and idea generation, predictive analytics and chatbots, and the data show that generative AI can raise such workers' productivity by more than 30%.

Tome can automatically generate slide decks using large AI models; users only need to enter the slide topic. Tome can create a deck from scratch, including a cover page, a table of contents and content pages, with accompanying images automatically generated by OpenAI's image tool DALL·E 2. Tome integrates with a number of third-party applications, including DALL·E, Giphy, Figma, Airtable and Twitter, and can embed web content into user-generated decks. Tome also has a rich template library covering resumes, news, market analysis and more, meeting users' education, marketing and entertainment needs.

For individual users, Tome offers both free and Pro plans. On the free plan, users have 500 credits that can be used to generate slides, with the number of credits consumed per deck varying with the amount of content. The Pro plan costs $8 per month billed annually ($10 per month billed monthly); Pro users have unlimited credits and gain services such as exporting decks and custom logos.

4.2 Major manufacturers continue to integrate AI into existing product ecosystems to empower productivity improvement

Since Microsoft announced an increased investment in OpenAI in January 2023, it has continued to release AI large-model application products and services and to expand its AI application product matrix. The company has successively integrated GPT large models into search engines, browsers, office software, operating systems and other applications and released related products; as generative AI applications are commercialized at scale, this is expected to further strengthen the company's business ecosystem barriers and core competitiveness.

Microsoft builds AI Copilot across the data, security, model and other layers, aiming to become the AI assistant for both business and consumer users. The company integrates large-model capabilities into office applications such as Word, Excel, PowerPoint and Outlook, provides dedicated data services for users and enterprises, and, in terms of user security and protection, continues to deploy AI safety technology, for example helping users determine whether an image was generated by AI based on information contained in its metadata. Microsoft 365 Copilot trial customers currently have access to more than 50 plug-ins, including Adobe, Thomson Reuters and Mural, and users are expected to enjoy AI office experiences based on natural-language interaction in the future.

The company launched Windows Copilot, and large AI models are expected to transform how the operating system is used. On May 23, 2023, Microsoft announced the addition of the Copilot assistant to Windows 11 at its annual developer conference. Windows Copilot is integrated into the operating system sidebar and helps users with tasks such as summarizing, rewriting and explaining. Microsoft says Copilot can make every user a power user, improving work and study efficiency; users can invoke it from any application, and it provides intelligent suggestions and actions according to the user's needs.

Google combines large-model technology with Gmail, Maps and other products to boost user productivity. At the 2023 Google I/O conference, Google released "Help me write" in Gmail, immersive view in Google Maps, and the new Magic Compose feature in Google Messages, integrating generative AI into its application products and markedly improving user efficiency and content-generation quality.

Since the series of large-model applications was unveiled at Google I/O 2023, products have been rolling into the user ecosystem one after another. On May 27, Google began embedding the new Magic Compose feature in the Messages app to help users compose messages intelligently; based on the conversation content and the user's prompts, Magic Compose can suggest replies in different styles and tones, such as formal, casual, excited or Shakespearean. In addition, Google embedded the "Help me write" tool in Gmail to help users quickly compose professional emails and draft replies based on the whole conversation, and users can further adjust the AI-generated email content, for example making it more formal or more concise.

4.3 Microsoft: The leader in AI commercialization, Copilot business is expected to bring volume and price increase

4.3.1 Office Copilot pricing released, the market space is broad

On March 16, 2023, Beijing time, Microsoft officially announced the launch of Microsoft 365 Copilot. Using text commands and natural interaction, users can have Copilot edit Word documents, analyze Excel tables and intelligently create PPT content, which is expected to greatly improve productivity.

Copilot is embedded in office applications such as Word and Excel, making it easy to operate and supporting personalized instructions. In Microsoft's product launch demo, when users work in Word, Excel and other applications, Copilot appears in the interface as a dialog box; users only need to enter their requirements and can specify the style and conciseness of the content. In Excel, Copilot can quickly complete data analysis and present relevant charts and tables according to the problem the user wants to solve, which is expected to greatly improve productivity.


Microsoft 365 Copilot consists of three core modules that work together to address users' various needs. Copilot has a sophisticated processing engine: besides the large AI model, its core modules include the Microsoft 365 apps (Word, Excel, Outlook and other applications) and Microsoft Graph (which contains the user's documents, emails, meetings and other information). During use, the app combines the user's instructions with relevant context and submits the request to the large AI model; after model inference and computation, the results are returned to the application as a response to the user's instructions.
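A conceptual sketch of this three-module flow is shown below, with stubs standing in for the Microsoft 365 app, Microsoft Graph and the model; the function names and data are illustrative assumptions, not Microsoft's actual interfaces.

```python
# Conceptual sketch (not Microsoft's API): app gathers the instruction,
# grounds it with Graph data, calls the large model, returns the result.
from dataclasses import dataclass

@dataclass
class GraphContext:
    emails: list[str]
    meetings: list[str]
    documents: list[str]

def fetch_graph_context(user_id: str) -> GraphContext:
    """Stub standing in for Microsoft Graph retrieval (hypothetical data)."""
    return GraphContext(
        emails=["Q3 budget thread"],
        meetings=["Mon 10:00 planning sync"],
        documents=["Q3_budget_draft.docx"],
    )

def call_llm(prompt: str) -> str:
    """Stub standing in for the large-model call."""
    return f"[generated draft based on a prompt of {len(prompt)} characters]"

def copilot_request(user_id: str, app: str, instruction: str) -> str:
    ctx = fetch_graph_context(user_id)   # grounding data from "Graph"
    prompt = (
        f"App: {app}\nInstruction: {instruction}\n"
        f"Context: emails={ctx.emails}, meetings={ctx.meetings}, docs={ctx.documents}"
    )
    return call_llm(prompt)              # result handed back to the app

print(copilot_request("user-1", "Word", "Summarize the Q3 budget discussion"))
```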

Microsoft 365 Copilot is priced at $30 per user per month and promises to significantly increase user ARPU. On July 18, 2023, Microsoft officially announced that for subscribers to Microsoft 365 E3, E5, Business Standard and Business Premium, Microsoft 365 Copilot is priced at $30 per user per month ($360 per year); existing enterprise subscriptions range from $10 to $38 per user per month. In May, Microsoft announced that it was expanding the Microsoft 365 Copilot paid early-access program to 600 enterprise customers worldwide, including KPMG, Lumen and Emirates NBD. Microsoft 365 Copilot pricing is expected to significantly raise ARPU for business and individual users, and as an AI-powered productivity tool it is also expected to lift the paid conversion rate of suites such as Office, driving growth in both volume and price for the company.

Microsoft Office and related cloud service revenue has maintained a growth trend, and revenue growth has accelerated in the past two quarters. The company's financial report shows that Office products and cloud services generated operating income of $48.728 billion in fiscal 2023, up 8.6% year-on-year, a compound annual growth rate of 11.34% from fiscal 2017 to fiscal 2023. In a single quarter, FY2023Q4 revenue from this business reached $12.905 billion, up 10.9% year-on-year and 3.75% quarter-on-quarter, and growth in the Office-related business has picked up markedly since the start of 2023.

Microsoft Office commercial and consumer subscriptions continued to grow steadily. According to the company's financial report, Office commercial subscription seats increased 11% year-on-year in FY23Q4, with year-on-year growth above 10% for 12 consecutive quarters. The company disclosed at its FY23Q3 earnings call that Office commercial seats reached 382 million at the end of that quarter and are expected to keep growing. On the consumer side, subscribers reached 67 million as of FY23Q4, up 12.2% year-on-year, maintaining steady growth. We believe that as Office Copilot is gradually commercialized at scale, it will benefit from the huge Office usage ecosystem and the growing user base, achieving increases in both volume and price.

Office Copilot is expected to be commercially available within the year. Microsoft disclosed at its earnings exchange that Office Copilot is being trialed by more than 600 large enterprise customers, and according to the technology media The Information, at least 100 customers have each paid an additional annual fee of $100,000 for 1,000 subscription accounts. As product coverage expands further, the Office Copilot business is expected to contribute more significant revenue increments to the company.

We estimate that Office Copilot could contribute significant revenue increments to Microsoft's Office business. For commercial users, taking the 382 million seats disclosed at the end of FY23Q3 as the benchmark, and for individual subscribers the 67 million disclosed in FY23Q4, each 1 percentage point of Office Copilot penetration would create roughly $1.375 billion of annual incremental revenue in the commercial business and about $241 million in the consumer business. Considering that Microsoft Office enterprise seats and individual subscriptions will keep growing, we believe the Office Copilot business is likely to become the core growth driver of Microsoft's Office-related business.
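The per-percentage-point figures above can be reproduced directly from the stated inputs:

```python
# Office Copilot sensitivity: $30 per user per month applied to commercial
# seats and consumer subscribers separately, at 1% penetration.
price_per_user_month = 30   # USD
segments = {
    "commercial seats (end FY23Q3)": 382_000_000,
    "consumer subscribers (FY23Q4)": 67_000_000,
}
for name, base in segments.items():
    annual_increment = base * 0.01 * price_per_user_month * 12
    print(f"{name}: ${annual_increment:,.0f}")
# commercial ≈ $1.375bn, consumer ≈ $241m per percentage point of penetration
```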

4.3.2 GitHub Copilot: Dramatically improve code developer efficiency

GitHub Copilot X combines GPT-4 with GitHub Copilot, extending AI empowerment across the entire developer workflow. Released in technical preview in 2021 and made generally available in June 2022, GitHub Copilot is the first at-scale production AI development tool built on OpenAI's Codex models; it can generate code and make automatic suggestions, greatly improving developers' coding efficiency. GitHub CEO Thomas Dohmke has said that since its release GitHub Copilot writes 46% of the code in files where it is enabled, helps developers code 55% faster, and is used by more than 5,000 enterprises and 1 million developers worldwide. On March 23, 2023, GitHub Copilot X was announced, built on GPT-4 and introducing chat, voice and other features, giving developers a ChatGPT-like experience while programming. Developers can use GitHub Copilot Chat to generate code through natural-language interaction, while GitHub Copilot Voice integrates speech-to-code AI so users can dictate prompts and "write code by voice". In addition, the new version of GitHub Copilot adds capabilities such as bug detection, code explanation, and identification of anomalous code in developer input. According to Dohmke, GPT-4-based GitHub Copilot will increase developer productivity by a factor of 10.

GitHub has also launched several feature previews aimed at deeply empowering developer efficiency. Copilot for Pull Requests, based on the GPT-4 model, automatically generates pull-request descriptions and tags based on the changed code. GitHub Copilot for Docs automatically answers users' questions about project documentation through a chat interface. Copilot for the command-line interface can compose commands and loops and look up obscure flags to satisfy developers' queries. GitHub Copilot uses a subscription-based pricing model, and the AI model is expected to lift paid-user penetration. Currently, GitHub Copilot paid plans come in individual and business editions: the individual subscription costs $10 per month or $100 per year, and the business edition is $19 per user per month. On July 21, GitHub launched a beta of Copilot Chat, currently limited to enterprise users. As functionality is gradually enriched and fully opened, GitHub is expected to build an even richer software development ecosystem.

GitHub's developer base is growing rapidly, giving Copilot a strong application ecosystem. As of 2022, GitHub's developer community reached 94 million users, a compound annual growth rate of 31.40% from 2017 to 2022. With the growth of GitHub's user base, platform revenue has also kept growing rapidly: SignHouse data show GitHub's annual recurring revenue reached $1 billion in 2022, and the platform has formed a healthy paid-user ecosystem on which GitHub Copilot is expected to penetrate rapidly.


GitHub Copilot has already been used by more than 1 million developers, and we are optimistic that the new business will drive growth in both the community's user base and paid penetration. With AI commercialization accelerating and the open-source ecosystem flourishing, based on the platform's 94 million developer users at the end of 2022, and considering that GitHub Copilot charges $100 per year for individual users and $228 per year for business users, we take the average of the two as the ARPU value and estimate that each 1 percentage point of GitHub Copilot penetration could bring about $154 million of annual incremental revenue.
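The estimate follows directly from averaging the two annual prices:

```python
# GitHub Copilot sensitivity using the report's simple ARPU averaging approach.
developers = 94_000_000                 # GitHub users at end of 2022
individual_annual = 100                 # USD per year
business_annual = 19 * 12               # USD per year = 228
arpu = (individual_annual + business_annual) / 2   # simple average = 164

increment_at_1pct = developers * 0.01 * arpu
print(f"${increment_at_1pct:,.0f}")     # ≈ $154,160,000
```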

4.4 Salesforce: Years of hard work, the era of big models is expected to continue the company's growth trend

4.4.1 Actively deploy AI+CRM to empower enterprise productivity transformation

In 2016, the company officially launched Salesforce Einstein, laying out an AI+CRM business ecosystem. At the 2016 Dreamforce conference, the company launched the Salesforce Einstein AI platform, positioned as "everyone's data scientist", combining AI with its five application modules: Sales Cloud, Service Cloud, Marketing Cloud, Commerce Cloud and App Cloud. Through Einstein, the company aims to use deep learning, NLP and multimodal data-mining capabilities to customize analytical models for customers and embed AI into business scenarios for business insight, opportunity mining, customer-behavior prediction and strategy recommendation, with model capabilities improving as data continues to accumulate.

The company continues to expand the Einstein platform's AI capabilities. In 2020, the company announced its RPA product, Einstein Automate, providing end-to-end intelligent automation: low-code and no-code design tools let users automate cumbersome, complex and repetitive digital business processes to increase efficiency, save time and reduce operating costs. Einstein Automate consists of two tools, Flow Orchestrator and MuleSoft Composer; Flow Orchestrator builds automated business processes, while MuleSoft Composer is a data integrator that quickly connects disparate applications and data to Salesforce for automation. In 2021, the company launched Einstein Relationship Insights (ERI), a tool that autonomously explores the Internet and internal data sources to discover relationships among customers, prospects and the company, helping sales reps close deals faster through online resources such as social media and collaboration applications. The company continues to build application services on the Einstein AI platform and to tap the potential of AI+CRM applications.

The company successively launched Slack GPT and Tableau GPT, continuing to expand its AIGC product and service matrix. On May 4, 2023, Salesforce launched Slack GPT, which embeds Salesforce's Einstein GPT functionality into the Slack app, allowing business users to summarize meetings, write meeting notes and pull sales data directly from Sales Cloud. Slack GPT also provides native platform integrations, bringing OpenAI's ChatGPT and Anthropic's Claude into the platform, strengthening cooperation with AI vendors and giving enterprise users a better experience when deploying AI capabilities. Tableau GPT uses large-model technology to transform how users process and analyze data: built on Einstein GPT, it helps users understand and interact with data in a trusted and secure way, enabling intelligent analysis and processing of operational data through natural-language conversation. On this basis, the company's Tableau Pulse further improves data-processing efficiency and builds personalized business experiences: automated analytics in Tableau Pulse provide an overview of metrics that need attention, while personalized metrics show the latest values, trend visualizations and AI-generated analytical insights.

4.4.2 Large AI models drive price increases and the pricing of new services

Salesforce announced official pricing for GPT-powered sales and service capabilities, taking its large AI models into a new stage of commercialization. On July 19, Salesforce announced general availability and pricing for Service GPT, Sales GPT and the Einstein GPT Trust Layer, with features including service responses, email generation and data security. Sales GPT and Service GPT are included in Sales Cloud Einstein and Service Cloud Einstein respectively at $50 per user per month, and currently only users who purchase the Unlimited edition have access to the GPT-powered features.

AI technology has strengthened the competitiveness of the company's products and services, and the announced price increases are expected to bring incremental revenue. On July 11, Salesforce announced that it would raise list prices for Sales Cloud, Service Cloud, Marketing Cloud, Industries and Tableau by an average of 9%. For Sales Cloud and Service Cloud, pricing for the Professional, Enterprise and Unlimited editions will rise from $75/$150/$300 to $80/$165/$330 per user per month, with the new pricing effective worldwide in August 2023. The company's revenue growth has slowed in recent years, and AIGC is expected to become a new growth engine. Total revenue in fiscal 2023 was US$31.352 billion, up 18.3% year on year, the first time growth fell below 20%, while the compound growth rate for fiscal 2014-2023 was 25.46%. On a single-quarter basis, revenue in the first quarter of fiscal 2024 was US$8.247 billion, up 11.28% year on year, the sixth consecutive quarter of decelerating growth. With the support of AI large-model technology, the company is expected to lift both enterprise user penetration and customer unit price through value-added services, helping it return to high growth.
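
As a quick sanity check on the growth figures above, the short calculation below recomputes the fiscal-2014 revenue implied by the FY2023 revenue of US$31.352 billion and the quoted 2014-2023 compound growth rate. This is a back-of-the-envelope illustration; the FY2014 figure is derived here, not quoted from the report.

```python
# Back-of-the-envelope check of the growth figures cited above.
fy2023_revenue_busd = 31.352      # Salesforce FY2023 total revenue, USD billions
cagr_2014_2023 = 0.2546           # compound annual growth rate cited above
years = 2023 - 2014               # 9 compounding periods between FY2014 and FY2023

# Implied FY2014 revenue consistent with the quoted CAGR.
implied_fy2014 = fy2023_revenue_busd / (1 + cagr_2014_2023) ** years
print(f"Implied FY2014 revenue: ${implied_fy2014:.2f}B")    # roughly $4.1B

# Conversely, recover the CAGR from the two endpoints.
recovered_cagr = (fy2023_revenue_busd / implied_fy2014) ** (1 / years) - 1
print(f"Recovered CAGR: {recovered_cagr:.2%}")              # ~25.46%
```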


5 The overseas large-model landscape is a three-way contest, and the "big fish eating small fish" trend is clear

5.1 Microsoft and OpenAI: GPT-4 holds a solid leading position

In the early morning of March 15, 2023, Beijing time, OpenAI officially released the GPT-4 large model. Compared with GPT-3 and GPT-3.5, GPT-4 is multimodal: it accepts image and text input and outputs text. GPT-4 can recognize the information contained in an image and generate textual feedback. In paper comprehension, it can identify the text and chart content in a screenshot of a paper and summarize and analyze it, enabling paper summarization and synthesis. In complex reasoning, GPT-4 can accurately solve physics (mechanics) problems and output the reasoning and calculation process as the question requires, a significant improvement over GPT-3 and GPT-3.5.
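
To illustrate the image-plus-text input described above, the sketch below shows how a multimodal request could be sent through OpenAI's chat-completions API. This is a minimal sketch, not taken from the report: the model name, the image URL and the availability of vision access are assumptions.

```python
# Minimal sketch of sending an image + text prompt to a GPT-4-class multimodal model
# via the OpenAI Python SDK (v1.x). The model name and image URL are illustrative
# assumptions; requires OPENAI_API_KEY in the environment and vision-capable access.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4-vision-preview",  # assumed vision-capable model name
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "Summarize the chart in this paper screenshot in two sentences."},
            {"type": "image_url",
             "image_url": {"url": "https://example.com/paper_figure.png"}},
        ],
    }],
    max_tokens=200,
)
print(response.choices[0].message.content)
```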

The price of large-model APIs continues to fall, further maturing the conditions for commercialization. On June 13, OpenAI officially announced function calling and other API updates; among them, gpt-3.5-turbo-16k offers four times the context length of gpt-3.5-turbo, priced at $0.003 per 1K input tokens and $0.004 per 1K output tokens. The 16K context means the model can now handle roughly 20 pages of text in a single request.

ChatGPT quickly became a hit application, and OpenAI continues to iterate on its closed-source large-model ecosystem. Since ChatGPT's release at the end of November 2022, OpenAI has kept iterating and has built a product matrix represented by ChatGPT that includes GPT-3.5, GPT-4, DALL·E 2, Whisper, CLIP and more.

Microsoft has invested in OpenAI and is expected to benefit deeply from the commercialization of AI large models. In January 2023, Microsoft announced an additional investment in OpenAI, valuing OpenAI at approximately $29 billion. According to the OpenAI equity model disclosed by Fortune, returns on the OpenAI investment will be distributed in the order FCP, Microsoft, employees, and finally the OpenAI non-profit. Once OpenAI turns profitable and returns $1 billion to First Close Partners (FCP), Microsoft will receive 75% of OpenAI's profits until it recovers its $13 billion investment; Microsoft will then hold 49% of OpenAI, and when OpenAI's cumulative profits reach $150 billion, the equity held by all investors (including Microsoft) will revert to the OpenAI non-profit. As AI commercialization accelerates, Microsoft is expected to benefit deeply from the business growth generated by its investment in OpenAI.
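
Using the gpt-3.5-turbo-16k prices quoted above ($0.003 per 1K input tokens and $0.004 per 1K output tokens), the short calculation below estimates the cost of a single long-document request. This is a rough illustration; the token counts are assumptions, not figures from the report.

```python
# Rough cost estimate for one gpt-3.5-turbo-16k request at the prices quoted above.
PRICE_INPUT_PER_1K = 0.003    # USD per 1K input tokens
PRICE_OUTPUT_PER_1K = 0.004   # USD per 1K output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request given input/output token counts."""
    return (input_tokens / 1000) * PRICE_INPUT_PER_1K + \
           (output_tokens / 1000) * PRICE_OUTPUT_PER_1K

# Example: a ~20-page document (assumed ~14K tokens) summarized into ~1K output tokens.
cost = request_cost(input_tokens=14_000, output_tokens=1_000)
print(f"Estimated cost per request: ${cost:.4f}")   # ~$0.046
```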

GPT large models are being adopted by enterprises worldwide. With GPT-4's much-improved performance, many overseas organizations have deployed GPT-based applications: governments and companies including Morgan Stanley and Duolingo have adopted GPT-4 technology for a wide range of functions. In mid-June, Mercedes-Benz announced a partnership with Microsoft, becoming the world's first carmaker to integrate ChatGPT into an in-vehicle voice control system.

5.2 Google's PaLM 2 is deeply integrated with its business ecosystem, with active deployment of multimodal large models

On May 9, 2023, Google held its I/O developer conference and officially launched PaLM 2, a large language model positioned against GPT-4. PaLM 2 powers Google's conversational chatbot Bard, and Google Lens functionality has been integrated into Bard. Meanwhile, the AI features embedded across the Google Workspace suite have been upgraded to Duet AI, with capabilities including assisted writing, generating images for slides from text, and intelligent entry and analysis of spreadsheet data. In addition, Google launched Studio Bot, a coding assistant for Android development.


Bard continues to iterate and is now multilingual and multimodal. On July 13, 2023, Google announced another Bard upgrade, adding features such as image-based dialogue and code export. Users can now converse with Bard in dozens of languages, including Arabic, Chinese, German and Spanish. On the productivity side, in addition to Google Colab, users can now export the Python code generated by Bard to Replit, improving developer efficiency. Bard's multimodal capabilities further improve the user experience: the updated Bard supports voice interaction, users can choose to read or listen to AI-generated responses, and they can shape the tone of replies by choosing among five styles: simple, long, short, professional or casual. Bard also has image understanding: the model can interpret images uploaded through the prompt field, and users can ask it to extract more information from an image or generate captions based on it.

Google released the RT-2 large model, entering the field of embodied intelligence. On July 28, Google DeepMind unveiled Robotics Transformer 2 (RT-2), a new vision-language-action (VLA) model that learns from web and robot data and translates that knowledge into general instructions for robot control. RT-2 shows that vision-language models (VLMs) can be turned into powerful VLA models that directly control robots by combining VLM pre-training with robot data. Instantiated as two VLA models based on PaLM-E and PaLI-X, RT-2 produces markedly better robot policies and, more importantly, significantly better generalization and emergent capabilities inherited from web-scale vision-language pre-training. RT-2 is not only a simple but effective modification of existing VLM models; it also points toward general-purpose physical robots that can reason, solve problems and interpret information to perform a variety of real-world tasks.
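
To make the vision-language-action idea more concrete, the sketch below shows the core trick such models rely on: representing a robot action as a short string of discretized integer tokens, so a vision-language model can emit actions the same way it emits text. This is our own simplified illustration, not DeepMind code; the bin count, action ranges and token format are assumptions.

```python
# Simplified illustration of the action-as-text-tokens idea behind VLA models like RT-2.
# Not DeepMind code: bin count, action ranges and token format are illustrative assumptions.
import numpy as np

N_BINS = 256  # each continuous action dimension is discretized into 256 bins

def encode_action(delta_xyz, gripper_open: bool) -> str:
    """Map a continuous end-effector action to a string of integer tokens."""
    # Clip displacements to a +/-5 cm range, then discretize each axis into N_BINS bins.
    bins = np.clip((np.asarray(delta_xyz) + 0.05) / 0.10 * (N_BINS - 1), 0, N_BINS - 1)
    tokens = [str(int(b)) for b in bins] + ["255" if gripper_open else "0"]
    return " ".join(tokens)           # e.g. "153 127 76 0", emitted like ordinary text

def decode_action(token_str: str):
    """Invert the mapping: turn the model's text output back into robot commands."""
    *xyz_tokens, gripper_token = token_str.split()
    delta_xyz = [int(t) / (N_BINS - 1) * 0.10 - 0.05 for t in xyz_tokens]
    return delta_xyz, gripper_token == "255"

if __name__ == "__main__":
    text = encode_action([0.01, 0.0, -0.02], gripper_open=False)
    print(text, "->", decode_action(text))
```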

5.3 Meta sticks to an open-source large-model route and builds competitiveness through multimodal capabilities

On July 19, Meta open-sourced the Llama 2 family of large models, which comes in three parameter sizes: 7 billion, 13 billion and 70 billion. Compared with Llama 1, Llama 2's pre-training corpus is about 40% larger (roughly 2 trillion tokens), the context length is doubled, and grouped-query attention is adopted; the fine-tuned Chat models are trained with about 1 million human-annotated examples. The Llama 2 models outperform Llama 1: compared with Llama 1-65B, Llama 2-70B improves its MMLU and BBH scores by roughly 5 points each. Outside of code benchmarks, Llama 2 7B and 34B outperform Falcon 7B and 40B across benchmark categories, and Llama 2-70B outperforms all other open-source models.
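
Because the Llama 2 weights are open, the models can be run directly with standard open-source tooling. The sketch below shows a minimal way this might look with Hugging Face Transformers; it is an illustration, not from the report, and assumes the Meta license has been accepted on the Hub, a GPU is available, and the commonly used "meta-llama/Llama-2-7b-chat-hf" checkpoint name.

```python
# Minimal sketch of running the open-source Llama 2 chat model with Hugging Face Transformers.
# Assumes the Meta license has been accepted on the Hub and a GPU is available;
# the checkpoint name below is the commonly used 7B chat variant.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # half precision to fit the 7B model on a single GPU
    device_map="auto",
)

prompt = "Explain grouped-query attention in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```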

Meta has rich scenario and data advantages in the multimodal field, and multimodal large models are a core development direction. Built on popular social products such as Facebook and Instagram, Meta has accumulated vast video and image resources, giving it an innate data advantage for model training. On July 16, Meta announced an artificial intelligence model called CM3Leon, which can generate high-quality images from text, produce text descriptions for images, and edit images according to text instructions. According to Meta, the model reaches industry-leading performance in text-to-image generation, surpassing products from Google, Microsoft and others. CM3Leon is a Transformer-based model, a neural network architecture that uses attention mechanisms to process input data; compared with diffusion-based models, Meta says the Transformer approach is more efficient, faster to train and less computationally expensive.

5.4 Large vendors continue to invest in AI startups, and large-model competition shows a "big fish eating small fish" pattern

Microsoft invested in the artificial intelligence startup Inflection AI, continuing to stockpile AI technology. On June 29, 2023, Inflection AI announced a new funding round totaling $1.3 billion, with Microsoft among the lead investors. Inflection AI focuses on personal AI assistants; its core product Pi, launched in May, aims to be a friendly and supportive companion, offering text and voice conversation, friendly suggestions and concise information in a natural, fluid way. Pi is powered by Inflection-1, the company's in-house large language model, which was trained on thousands of NVIDIA H100 GPUs. According to the company, Inflection-1 outperforms GPT-3.5, LLaMA, Chinchilla and PaLM-540B on various large-language-model benchmarks. Inflection AI is currently working with partners CoreWeave and NVIDIA to build what it describes as the world's largest AI cluster, containing 22,000 NVIDIA H100 Tensor Core GPUs. Microsoft's investment in Inflection AI signals how much importance it attaches to the application prospects and business value of AI large models in personal digital assistant scenarios. Inflection AI uses NVIDIA AI technology to develop, train and deploy large-scale generative AI models, which may be integrated into Microsoft's consumer products in the future to build a complete personal AI application ecosystem.

Beyond Microsoft, overseas vendors such as Google, NVIDIA and Salesforce have successively invested in AI startups such as Cohere and Runway. Since the beginning of this year, a large number of strong AI startups have emerged overseas, building generative-AI product applications in niche scenarios such as emotion recognition, search engines and model security. Large overseas vendors continue to absorb startups' technologies and products through acquisition or investment, forming a "big fish eating small fish" industry pattern.

(This article is for informational purposes only and does not constitute investment advice. For details, please refer to the original report.)
