Zhongguancun Kejin launched a new generation of generative BI based on large models

Author: Chopping Wood Net

Summary:

Data has become a key production factor and an important strategic resource in the deepening development of the digital economy. However, as industry informatization advances and products and services shift from "few varieties, large volumes" to "many varieties, small batches", data is accumulating massively while its life cycle shortens significantly. Traditional report-based descriptive business intelligence (BI) tools struggle to uncover the correlations and latent value in massive, multi-source, heterogeneous data. Large models, which combine big data, strong algorithms, and large-scale computing power, show great potential here, and are gradually shaping a new data governance paradigm centered on predictive and prescriptive analysis. Against this backdrop, Zhongguancun Kejin, drawing on its self-developed financial large model and its own data-driven decision-making practice, has comprehensively upgraded its new-generation conversational intelligent analysis and decision-making platform. The platform establishes a trusted source with enterprise-wide data quality standards and governance, human-machine collaborative augmented root cause analysis and predictive decision support, and a natural-language conversational analysis and insight ecosystem with full decision feedback. Together these promote effective governance of data assets and accurate data-driven decisions, build a top-to-bottom organizational practice of multi-dimensional intelligent decision-making, and genuinely release the full factor value of data.

1. Complex decision-making challenges and optimal management of massive data in the digital age

At present, digital technology and digital infrastructure, with data resources as the key production factor, have become an important driving force for intelligent industrial transformation. However, as the market environment changes and consumer needs diversify, traditional business intelligence (BI) that relies only on report statistics can no longer meet the demands of automation and intelligence. In particular, as data accumulates massively while its life cycle shortens significantly, achieving more comprehensive, more balanced, and more universal data insight and accurate intelligent decision-making is becoming an ever more severe challenge.

First, the shortening data life cycle and the personalization of user needs expose contradictions in data management, and enterprises urgently need to optimize from a global perspective. Demand for personalized, customized products and services keeps rising, and shorter product and service life cycles have in turn significantly shortened the data life cycle: data is generated, flows, and expires ever faster. According to IDC, the total volume of data worldwide will grow to 175 zettabytes by 2025, a compound annual growth rate of 61%. This rapid growth is accompanied by rising data diversity and complexity. More than 80% of enterprises face challenges with inconsistent data calibers and find it difficult to ensure data quality and consistency, or even suffer from data silos; one report finds that 27.1% of enterprises say that, even after breaking down data silos, they struggle to form business collaboration. According to a DAMA survey, fewer than 30% of enterprises are able to fully standardize and manage their data. Data redundancy and inconsistency expose companies to greater risk when making decisions. Enterprises therefore need to establish a comprehensive, top-down data management system, strengthen the monitoring and management of data quality, and realize the effective use and protection of data assets.

Second, traditional descriptive data analysis is heavily templated and homogeneous, and cannot meet needs for specific insight and intelligent decision-making such as prediction and attribution. Big data application has three levels: descriptive, predictive, and prescriptive analysis. Traditional BI tools, however, mostly offer descriptive analysis based on empirical or theoretical paradigms, providing standardized reports and generic analysis templates that cannot satisfy personalized, customized, and diversified analysis needs. More than 70% of enterprises say that, facing complex and dynamic business scenarios, this one-size-fits-all approach often fails to provide sufficient insight, limiting the potential value of data in complex decision-making. According to a McKinsey report, only a handful of organizations can effectively leverage in-depth analytics such as predictive and attribution analysis to support specific intelligent decision-making needs. The core of the new generation of data governance is therefore dynamic, multi-dimensional data insight and intelligent decision-making, focusing on correlation analysis, causal analysis, and driver analysis.

Third, insufficient penetration and accessibility of data tools restrict crowd-intelligence decision optimization with full participation. Surveys show that although more than 98.6% of enterprises have adopted at least one data analysis tool, only 28.6% have fully implemented it across their application scenarios. The technical threshold is the main factor preventing all employees from participating in crowd-intelligence decision optimization. According to Gartner, more than 70% of data analytics projects lack the data culture and talent needed to effectively use data tools for decision support. Enterprises should therefore focus on improving employees' data literacy and adopt more intelligent analysis tools: lower the technical threshold, simplify the interface, and improve the accessibility and ease of use of tools, so that more employees can take part in data-driven decision optimization and drive crowd-intelligence decision-making with full participation.

2. The construction practice of a new generation of data governance and intelligent decision-making platform based on large models

To manage massive data with crowd intelligence in the digital era, Zhongguancun Kejin has comprehensively upgraded its new-generation conversational intelligent analysis and decision-making platform, based on its self-developed financial large model and its data-driven decision-making practice. By improving the standardization and consistency of data management, increasing the penetration and accessibility of data tools, and developing customized analysis models, enterprises can better leverage data assets, support complex decision-making processes, and achieve innovative growth.

(1) Consolidate the foundation of data governance with the intelligent indicator middle platform and drive consistent, standardized, and multi-reusable trusted sources

The fundamental problem of data governance is resolving the underlying conflicts at the source of data management in traditional descriptive analytics. Data fragmentation, inconsistent standards, and redundant construction increase the complexity of data management and reduce the efficiency and accuracy of decision-making. Data silos between departments and inconsistent statistical calibers for indicators hinder effective collaboration and prevent managers from practicing data-driven refined management and rapid decision-making in response to market changes. Meanwhile, traditional BI tools rely on graphical user interfaces (GUIs), which limits their flexibility and scalability and makes it hard to keep up with rapidly evolving business needs.

In this context, next-generation data governance and analytical decision-making emphasizes the processing power of the data back end rather than front-end visual presentation, so that analytical data can be processed in an automated, programmatic way and data analysis can be woven tightly into business processes. In particular, systematic construction around an indicator middle platform effectively addresses the data-efficiency problem. It decouples the data semantic layer from the application layer and, through a unified semantic-layer model, lets indicators be defined once and reused many times. This design not only ensures consistency and standardization of data but also improves the reusability of data services, giving enterprises a trusted data source. On such an intelligent indicator middle platform, enterprises can build a unified data model and indicator system and centrally manage and maintain their data. Through APIs and a microservice architecture, these metrics can be called by different business systems and applications to support real-time analysis and decision-making.
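The define-once, reuse-many idea behind the semantic layer can be sketched as a small metric registry. This is a minimal illustration, not the platform's actual implementation; the class and field names are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch of a unified semantic layer: each metric is defined
# exactly once, with a single authoritative formula, and every consumer
# resolves it through the registry instead of re-implementing it.

@dataclass(frozen=True)
class Metric:
    name: str   # unique business name, e.g. "monthly_revenue"
    sql: str    # the single authoritative definition
    owner: str  # team accountable for the definition

class MetricRegistry:
    def __init__(self):
        self._metrics = {}

    def define(self, metric: Metric) -> None:
        # Refuse duplicate definitions: one caliber per metric name.
        if metric.name in self._metrics:
            raise ValueError(f"metric '{metric.name}' already defined")
        self._metrics[metric.name] = metric

    def resolve(self, name: str) -> Metric:
        # Every dashboard, report, or API call goes through here,
        # guaranteeing a consistent statistical caliber.
        return self._metrics[name]

registry = MetricRegistry()
registry.define(Metric("monthly_revenue",
                       "SELECT SUM(amount) FROM orders WHERE status = 'paid'",
                       "finance"))
print(registry.resolve("monthly_revenue").sql)
```

In a real deployment the registry would sit behind an API gateway so that different business systems call the same definition rather than copying the SQL.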

Figure: A new indicator center built to solve the data-caliber problem

In Zhongguancun Kejin's practice, the intelligent indicator middle platform improves the integration and reusability of indicator sources in both data processing and visualization, effectively consolidating enterprise data assets and supporting decision-making. Through a hierarchical management model of "atomic indicators, derived indicators, and calculated indicators", it has established a comprehensive indicator system containing thousands of indicators across business lines, operations, finance, risk management, and other domains. For data interpretation, the platform can turn complex data into easy-to-understand visualizations in as little as one minute, greatly accelerating the grasp of the business meaning behind the data, speeding up decision-making, and supporting more accurate business insight.
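The three-tier "atomic, derived, calculated" hierarchy can be illustrated with a toy example. The indicator names and order fields below are hypothetical, chosen only to show how each tier builds on the one beneath it.

```python
# Hypothetical sketch of the atomic -> derived -> calculated hierarchy:
# atomic indicators are raw aggregates; derived indicators add a filter
# or time window; calculated indicators combine other indicators.

def atomic_gmv(rows):
    """Atomic indicator: total order amount over all rows."""
    return sum(r["amount"] for r in rows)

def derived_gmv_paid(rows):
    """Derived indicator: atomic GMV restricted to paid orders."""
    return atomic_gmv([r for r in rows if r["status"] == "paid"])

def calculated_paid_ratio(rows):
    """Calculated indicator: composed from two other indicators."""
    total = atomic_gmv(rows)
    return derived_gmv_paid(rows) / total if total else 0.0

orders = [
    {"amount": 100.0, "status": "paid"},
    {"amount": 50.0,  "status": "pending"},
    {"amount": 150.0, "status": "paid"},
]
print(calculated_paid_ratio(orders))  # 250 / 300 ≈ 0.833
```

Because each tier reuses the one below it, a correction to an atomic definition propagates automatically to every derived and calculated indicator built on it.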

Figure: A new generation of generative BI based on the metrics middle platform

(2) Integrate the hybrid intelligence of large model + small model, and build in predictive and guiding analysis such as abnormal analysis and attribution analysis

At the heart of next-generation data insight and intelligent decision-making is the ability to quickly discover value, identify anomalies, and make sound decisions from massive data. Traditional BI tools can report on and analyze historical data, but they often cannot handle complex data relationships, in-depth analysis, or prediction. Especially in scenarios that require rapid attribution analysis and forward-looking prediction over large volumes of heterogeneous data, traditional methods fall short of enterprises' needs for agile decision-making. Moreover, without sufficient intelligent support, users must invest substantial time in manual analysis, which is not only inefficient but also prone to missing important information due to limited experience or perspective.

Guided by the concept of human-machine collaborative intelligence, the path to genuine intelligence is to introduce techniques such as root cause analysis, anomaly analysis, attribution analysis, trend prediction, and machine learning, building a platform capable of in-depth analysis and intelligent decision support that maximizes the release of data value. With these techniques working together, the platform can automatically identify abnormal fluctuations in data (anomaly analysis), dig into the underlying causes of problems (root cause analysis), identify the key factors driving changes in business metrics (attribution analysis), and make accurate predictions from historical trends (trend forecasting). Machine learning further allows the analysis models to keep learning and optimizing, improving their accuracy and adaptability. Most importantly, the platform combines human intuitive judgment with machine computing power to deliver decision support with integrated intelligence, playing to the respective strengths of humans and machines: decisions remain scientific and accurate while preserving human intuition and creative thinking on complex problems.
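The anomaly-analysis step can be sketched with a simple rolling-baseline detector: flag any value that deviates from the recent window by more than k standard deviations. This is an illustrative stand-in for the platform's detection logic; the window size and threshold are assumptions.

```python
import statistics

# Minimal anomaly-detection sketch: compare each point against the mean
# and standard deviation of the preceding `window` points, and flag it
# if it deviates by more than `k` standard deviations.

def detect_anomalies(series, window=7, k=3.0):
    anomalies = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline)
        if stdev and abs(series[i] - mean) > k * stdev:
            anomalies.append(i)  # index of the abnormal point
    return anomalies

# A sudden spike on day 7 stands out against a stable baseline.
daily_active_users = [100, 102, 98, 101, 99, 103, 100, 180, 101, 99]
print(detect_anomalies(daily_active_users))  # [7]
```

A production system would layer root-cause and attribution analysis on top of each flagged index, drilling into the dimensions (channel, region, product) that contributed most to the deviation.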

Figure: Based on the large model + small model, data-driven decision-making is truly realized

Practice has shown that augmented analysis based on "intelligent models + complex computation" can effectively help enterprises make faster, more accurate decisions in complex business environments. For example, using this augmented analysis, Zhongguancun Kejin automatically surfaced 15 anomalies within just half a year of the platform's launch; detecting and handling these problems in time not only safeguarded operational stability but also provided real-time data support for investment decisions. Moreover, as the intelligent models are used more frequently, they continuously self-optimize and improve their prediction accuracy, letting enterprises face market uncertainty with more confidence. Another advantage of intelligent models is that they can automatically select a suitable machine learning algorithm for the specific business scenario and data characteristics: whether a decision tree, a random forest, or another algorithm, the model adjusts flexibly to achieve the best analytical result. This flexibility and adaptability allows companies to predict future business performance more accurately, such as forecasting sales for 2024, so that they can set more reasonable annual goals and budgets.
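The trend-forecasting step can be sketched with a least-squares line fitted to historical sales and extrapolated one period ahead. A real deployment would compare candidate models (tree ensembles and others, as described above) and pick the best performer; the data here is invented for illustration.

```python
# Illustrative trend-forecasting sketch: fit a least-squares line to a
# series of monthly sales figures and extrapolate the next month.

def fit_trend(values):
    n = len(values)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(values) / n
    # Ordinary least-squares slope and intercept.
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return slope, intercept

def forecast_next(values):
    slope, intercept = fit_trend(values)
    return slope * len(values) + intercept

monthly_sales = [120.0, 132.0, 141.0, 155.0, 160.0, 174.0]
print(round(forecast_next(monthly_sales), 1))  # 183.8
```

The forecast value is what a planner would feed into the annual-target and budget discussion shown in the figure below.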

Figure: Annual target and budget decisions based on sales forecasts

(3) Integrate the conversational guidance interaction of large models, and drive the positive cycle of intelligent decision-making in real time

Integrating the conversational interaction of large models, with the goal of application universality, it effectively solves the problems of high technical threshold, complex data model, and difficult to understand analysis results. In traditional business intelligence (BI) practice, enterprises often face challenges such as information overload and high technical thresholds. It not only leads to a decrease in decision-making efficiency, but also makes it difficult to fully tap and utilize the value of data. Users need to go through a complex process of query and analysis to get the information they need, which is often time-consuming and labor-intensive. In addition, the static reports and dashboards of traditional BI tools often fail to meet rapidly changing business needs, making the decision-making process rigid and difficult to adapt to dynamic market environments.

The next-generation conversational decision-making platform combines large models with generative AI to provide a more intuitive and flexible way to interact with and analyze data, fostering an ecosystem of feedback around data decisions. Users state their query and analysis needs by talking to the system in natural language. Backed by powerful large-model algorithms, the system processes these requests in real time, returning accurate analysis results and, through the user's follow-up questions, progressively deeper analysis for step-by-step intelligent decision support. This design greatly lowers the technical barrier to entry, making it easy for users without technical backgrounds to participate in analysis and decision-making. Better still, it responds to changing user needs in real time with dynamic analysis and forecasting, accelerating the decision process while improving the quality and accuracy of decisions.
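The routing from a natural-language question to a metric query can be sketched as follows. In the real platform a large model performs the translation; here a toy pattern matcher stands in for the model so the flow stays visible, and the metric table and question format are invented for illustration.

```python
import re

# Hypothetical sketch of conversational BI routing: map a free-text
# question to (metric, year), look the value up, and answer in text.
# A toy regex stands in for the large model's semantic parsing.

METRICS = {
    "revenue": {"2023": 1200.0, "2024": 1450.0},
    "orders":  {"2023": 310.0,  "2024": 405.0},
}

def answer(question: str) -> str:
    m = re.search(r"(revenue|orders).*?(\d{4})", question.lower())
    if not m:
        return "Sorry, I could not map that question to a known metric."
    metric, year = m.group(1), m.group(2)
    value = METRICS.get(metric, {}).get(year)
    if value is None:
        return f"No data for {metric} in {year}."
    return f"{metric} in {year}: {value}"

print(answer("What was our revenue in 2024?"))  # revenue in 2024: 1450.0
```

Swapping the regex for a model call (and the dictionary for the indicator middle platform's API) turns this skeleton into the conversational loop described above, while keeping the same structured intermediate query.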

In a fast-moving business environment, efficient decision-making is key to business operations. Practice shows that an intelligent analysis platform built on conversational guidance not only accelerates the spread of a data-driven culture but also forms an efficient, positive decision-making cycle within the enterprise. Through conversational interaction, enterprises can quickly analyze key operational and management questions in depth, such as business trends, channels, progress on key initiatives, talent placement, and potential business issues.

Through the intelligent analysis platform, enterprises can obtain real-time analysis of operational indicators such as daily, monthly, and quarterly reports, shrinking data processing that once took days into seconds. For example, one financial institution used the new-generation conversational intelligent analysis and decision-making platform to achieve real-time query and analysis across domains including capital, marketing, credit business, customer operations, risk control, and finance, significantly improving the timeliness and accuracy of its decisions. By contrast, a traditional data analysis workflow, involving both front-end and back-end development, generally takes 3-5 days from request to delivery; the conversational platform needs only about 3 minutes to produce a report for visualization and embedded use, greatly shortening the cycle from request to launch. This rapid processing and analysis capability not only improves operational efficiency but also multiplies the overall benefit for the enterprise.

Figure: Semantic understanding and interactive decision-making based on large models

3. Prospects

As data governance based on large models becomes widespread, the value of data is being released as never before, and data insight will move toward a new stage that is more intelligent, more interactive, and open to everyone. Data-intensive innovation is no longer the preserve of senior executives or data analysts; every employee can take part, and enterprises will build a data-centric decision-making ecosystem in which data is not only an aid to decisions but a driving force for continuous optimization and progress. As usage grows and user feedback accumulates, the conversational intelligent decision-making platform keeps learning and evolving, so that its analysis models and prediction algorithms meet actual business needs ever more accurately. This two-way, top-down and bottom-up optimization will further release the value of data elements, become an important force in enterprises' digital transformation, and open a new era of intelligent decision-making.
