
Artificial intelligence development has entered a new phase, and generative AI will change the future

Author: Computing Miscellaneous

The WAIC 2023 World Artificial Intelligence Conference is the first national-level exhibition themed on artificial intelligence. Hundreds of companies from around the world took part, and Sherry Marcus, head of the Amazon Web Services Generative AI Product Research Institute, delivered a keynote on building a generative AI ecosystem that attracted wide attention.


Across the entire IT industry, the hottest topics this year revolve around AI. Some fields have already seen strong results from AI applications, and deeper applications are gradually being developed.

So why has AI taken off so rapidly in recent years?

Sherry Marcus believes that machine learning is now at an inflection point and is influenced by three major drivers.

The first is the massive explosion of online data, and that data is now readily available. The second is the dramatic increase in computing power, which allows AI to take on far more complex tasks. These two factors alone are not enough, however: the arrival of the Transformer model and the growing sophistication of model architectures brought these trends together and pushed AI development forward.

Building on this, generative AI lowers the barrier to entry, makes it easier to integrate AI with real business, and puts AI development on the fast track.

Amazon Web Services is prepared for generative AI

On one side are rapidly evolving large AI models; on the other are enterprises that urgently need AI to empower their businesses. How can a bridge be built between the two?

Sherry Marcus believes four key factors can help enterprises accelerate their generative AI journey: building great AI applications on best-in-class foundation models; ensuring the security and privacy of those models; reducing costs while keeping latency low; and eliminating heavy lifting and raising efficiency with professional code generation tools.

Amazon Web Services recently launched a generative AI service called Amazon Bedrock. One of its most important capabilities is easy model customization: customers simply point Amazon Bedrock at a handful of labeled examples in Amazon S3, and Bedrock can fine-tune a model for a specific task with as few as 20 examples, without labeling large amounts of data.
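As a rough illustration of that customization workflow, the sketch below starts a fine-tuning (model customization) job with boto3. The job name, S3 URIs, IAM role, base model identifier, and hyperparameters are all placeholders, and the exact API shape may differ depending on your SDK version and region.

```python
import boto3

# Control-plane client for Amazon Bedrock (region and credentials are assumed
# to be configured through the usual boto3 mechanisms).
bedrock = boto3.client("bedrock", region_name="us-east-1")

# Start a fine-tuning (model customization) job. All names, ARNs, and S3 URIs
# below are placeholders used only for illustration.
response = bedrock.create_model_customization_job(
    jobName="product-copy-finetune-001",
    customModelName="product-copy-titan",
    roleArn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",
    baseModelIdentifier="amazon.titan-text-express-v1",
    trainingDataConfig={
        # A small JSONL file of labeled prompt/completion pairs in S3; per the
        # article, as few as roughly 20 examples can be enough.
        "s3Uri": "s3://my-bucket/train/examples.jsonl"
    },
    outputDataConfig={"s3Uri": "s3://my-bucket/output/"},
    hyperParameters={"epochCount": "2", "batchSize": "1", "learningRate": "0.00001"},
)

print(response["jobArn"])
```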

In addition, Amazon Web Services offers an AI tool called Amazon CodeWhisperer, which is trained on billions of lines of open source code and generates code suggestions in real time based on the user's code comments and existing code. It also features security vulnerability scanning.

Currently, Amazon CodeWhisperer supports 15 programming languages (including Python, Java, and JavaScript) and can be integrated into development tools such as VS Code, IntelliJ IDEA, and PyCharm.
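CodeWhisperer itself runs inside the IDE rather than behind a public API, so there is no service call to show here. The hypothetical Python snippet below simply illustrates the comment-to-code flow: a developer writes the comment, and the assistant proposes an implementation along these lines.

```python
# Hypothetical illustration of comment-driven code generation: the developer
# types the comment below, and the assistant suggests the function that follows.

# function to parse an ISO 8601 date string and return the day of the week
from datetime import datetime

def day_of_week(date_string: str) -> str:
    """Return the weekday name (e.g. 'Thursday') for an ISO 8601 date string."""
    parsed = datetime.fromisoformat(date_string)
    return parsed.strftime("%A")

print(day_of_week("2023-07-06"))  # "Thursday"
```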

In terms of practical applications, some news organizations use generative AI to help their teams collaborate and produce stories more quickly. In e-commerce, thousands of Chinese sellers have used Amazon's generative AI capabilities to accelerate product image generation on the platform, achieving 270% faster inference.

Amazon Bedrock is changing the world

Deloitte leverages the fully managed Amazon Bedrock service from Amazon Web Services to extend its generative AI capabilities. The service provides easy API access to pre-trained foundation models from leading AI companies and from Amazon. By partnering with Amazon Web Services, Deloitte helps users build new AI applications faster and at scale, unlocking the value of generative AI. With Amazon Bedrock, Deloitte can offer customers more cost-effective services, helping them examine large amounts of data, customize models, and apply natural language to a variety of use cases.
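To make that "easy API access" concrete, here is a minimal sketch of invoking a hosted foundation model through the Bedrock runtime with boto3. The model ID and request body follow the Amazon Titan text format as an assumption; other providers available on Bedrock define their own request schemas.

```python
import json
import boto3

# Data-plane client used to invoke hosted foundation models.
runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Request body in the Amazon Titan text format (an assumption for this example;
# each model provider on Bedrock uses its own schema).
body = {
    "inputText": "Summarize the key risks mentioned in the following report excerpt: ...",
    "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.2},
}

response = runtime.invoke_model(
    modelId="amazon.titan-text-express-v1",
    contentType="application/json",
    accept="application/json",
    body=json.dumps(body),
)

result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```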

The Accenture Velocity team is using Amazon CodeWhisperer to speed up coding tasks. They found that Amazon CodeWhisperer helps reduce development effort by 30 percent, allowing their clients' developers to focus more on security, quality, and performance improvements.

In finance, generative AI has a wide range of applications. Data show that 85% of financial services leaders have developed strategies for using generative AI in product and service development, spanning marketing content creation, customer service chatbots, text summarization, risk management, and more. By analyzing large amounts of financial data, generative AI can provide investors with assessments, forecasts, investment strategies, and recommendations. By summarizing equity research reports, it gives investment analysts a useful reference. It can also generate suitable insurance solutions based on customer needs, preferences, and scenarios.

In healthcare, generative AI will have a huge impact on the entire value chain of pharmaceuticals, clinical trials, and medical practice, including designing and synthesizing new protein sequences, predicting drug effects and side effects, identifying patient risk factors, providing personalized care approaches, and even synthesizing patient and medical data for simulation studies.

For example, Philips HealthSuite Imaging, Philips' cloud-based medical imaging system, uses foundation models on Amazon Bedrock to accelerate the development of cloud-based generative AI applications, providing clinical decision support and enabling more accurate diagnoses.

The cloud platform empowers users with customized models

The purpose of large AI models is to simplify the complex, yet many enterprises face cost and time pressure when training them.

In contrast, customizing models on a cloud platform is a better fit for most enterprises.

Sherry Marcus said that most users do not need to train models from scratch, nor can they rely on a single large language model to handle every task; instead, they want to access multiple models at once and then customize those models with their own requirements and data.

This means building on large models and ensuring that users can customize their own models in a private and secure way.

Amazon Web Services provides users with pre-trained models, including the Amazon Titan family of foundation models, models from third-party partners such as Stability AI and Anthropic, and open source models available through Amazon SageMaker JumpStart.

Users can use these models directly, or build specialized models on top of large language models using their own data and requirements. In this way, customers get both the richness of large models and the rapid iteration that smaller, specialized models can bring.
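As a minimal sketch of the open-source-model path mentioned above, the snippet below deploys a SageMaker JumpStart model with the SageMaker Python SDK and sends it a single prompt. The model ID is only an example, and the payload format depends on the specific model chosen.

```python
from sagemaker.jumpstart.model import JumpStartModel

# Example model ID; SageMaker JumpStart catalogs many open source LLMs.
model = JumpStartModel(model_id="huggingface-llm-falcon-7b-instruct-bf16")

# Deploy to a real-time endpoint (this provisions billable infrastructure
# in your AWS account).
predictor = model.deploy()

# Payload format varies by model; many text-generation models accept "inputs".
response = predictor.predict(
    {"inputs": "Write a one-sentence product description for a ceramic travel mug."}
)
print(response)

# Clean up the endpoint when finished.
predictor.delete_endpoint()
```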

Generative AI accomplishes development tasks

At present, enterprises are focused on reducing costs and increasing efficiency. Can generative AI deliver this on the development side?

According to Sherry Marcus, in testing during the preview period, participants using Amazon CodeWhisperer completed tasks 57 percent faster on average and were 27 percent more likely to complete them successfully than those who did not use it. It raises the level of software development across the industry; even college students can use such tools to develop more complex software.


In addition, programmers usually spend a lot of time searching for relevant information online while writing code. Amazon CodeWhisperer can also serve as an intelligent AI assistant, automatically generating code from comments and greatly reducing development time.

Amazon CodeWhisperer recommends one or more code snippets directly in the code editor, helping developers stay productive while coding. Its recommendations come from a large language model (LLM) trained on billions of lines of Amazon and open source code. Developers can accept the top recommendation by pressing the Tab key, use the arrow keys to browse additional recommendations, or simply continue writing code.

Amazon CodeWhisperer analyzes its training data for security vulnerabilities and filters out as many as possible. This filtering excludes unsafe code patterns from the training data and prevents the model from learning and reproducing similar code. To help developers build applications responsibly, Amazon CodeWhisperer also provides a reference tracker that displays license information for recommended code and, where applicable, links to the corresponding open source repositories.

As for the challenges ahead, Sherry Marcus believes that generative AI must develop a deeper understanding of individual industries, give a stronger push to digital transformation, and operate in a safe, legal, and responsible environment; only then can it truly achieve disruptive innovation.
