
If generative AI is the new operating system, then agents are the new applications

For decades, the computing world has been dominated by established operating systems such as Windows, Linux, and Unix. These platforms form the backbone of how we interact with computers, enabling software and services to run smoothly. However, the rise of generative AI is triggering a paradigm shift in the technology ecosystem.

Generative AI is no longer just an advanced tool for specialized use cases. It is maturing into a parallel technology stack, analogous to the foundational layers of a traditional operating system, that fundamentally redefines how organizations operate, innovate, and compete. As we enter this new era, agents are beginning to emerge as the new applications, and the shift will reshape fundamental business strategies across industries.

In this article, we explore how generative AI has rapidly evolved into a parallel technology stack that mirrors the traditional enterprise IT ecosystem. Grasping this shift will be critical for senior executives, because it has the potential to disrupt operations and redefine the basis of competitive advantage in the marketplace.

Generative AI: The New Operating System Technology Stack

The role of an operating system is to provide the basic interface between hardware and software. Similarly, generative AI is now emerging as a parallel operating system that orchestrates the interaction between large-scale computing infrastructure and a new class of applications: AI agents. This transformation is not simple automation or incremental improvement, but a fundamental reimagining of the digital technology stack, with AI integrated at every level from hardware to software.

The traditional computing stack consists of the underlying hardware, the operating system kernel, utilities, shells, and the applications that serve the end user. Each layer is indispensable to the overall environment that supports digital interaction.

However, generative AI is introducing a wave of disruption:

  • Hardware layer: Traditionally dominated by Intel, ARM, and SPARC, the hardware layer is evolving to meet the needs of AI workloads. Companies such as NVIDIA, Groq, Cerebras, Graphcore, and SambaNova are developing and delivering next-generation hardware focused on AI-specific processors. These AI accelerators power the core infrastructure and the intensive computation that AI models require.
  • Infrastructure providers: In the traditional model, OEMs such as Dell, HPE, and IBM formed the backbone of enterprise infrastructure, providing the servers, storage, and networking needed to run applications on top of operating systems. Today, cloud platforms such as Amazon Web Services, Microsoft Azure, and Google Cloud lead the AI stack, providing infrastructure and services optimized for generative AI. As in the PC and server markets, a range of players beyond the hyperscalers offer generative AI as a service. Their scalability and flexibility are essential for supporting AI models that demand substantial processing power.
  • AI model ecosystem: Traditional operating system kernels manage low-level functions such as memory and processes. Language models such as Llama, Gemma, Mistral, and Phi act as the core of the generative AI stack, enabling systems to generate text, images, and even code that resembles human expression. Apple Intelligence and Microsoft's Copilot+ PCs are early examples of generative AI becoming an integral part of the operating system. Just as the kernel is essential to the functioning of the operating system, these models become the building blocks of AI systems.
  • Core utilities and frameworks: Corresponding to core operating system utilities, tools such as embedding models, vector databases, and orchestration frameworks, including LangChain and LlamaIndex, are the new key components. They simplify complex workflows, enabling agents to retrieve knowledge and perform tasks efficiently, and they provide a "shell" through which developers and organizations build, train, and deploy AI agents; a minimal sketch of this retrieval pattern follows the list.
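
To make the role of these utilities concrete, the sketch below shows the retrieval pattern they support: documents are embedded, stored in a vector index, and retrieved to ground a model prompt. It is a minimal, library-agnostic illustration; the embed function and VectorStore class are toy stand-ins for a real embedding model and vector database, not the API of LangChain or LlamaIndex.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy embedding: hashes characters into a fixed-size unit vector.
    Illustration only; a real system would call an embedding model."""
    vec = np.zeros(64)
    for i, ch in enumerate(text.lower()):
        vec[(i + ord(ch)) % 64] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

class VectorStore:
    """Minimal in-memory vector database holding (embedding, text) pairs."""
    def __init__(self):
        self.items = []

    def add(self, text: str):
        self.items.append((embed(text), text))

    def search(self, query: str, k: int = 2):
        q = embed(query)
        # Dot product of unit vectors is cosine similarity.
        scored = sorted(((float(np.dot(q, v)), t) for v, t in self.items), reverse=True)
        return [t for _, t in scored[:k]]

def build_prompt(question: str, store: VectorStore) -> str:
    """Orchestration step: retrieve relevant knowledge and assemble a grounded prompt."""
    context = "\n".join(store.search(question))
    # A real system would now send this prompt to a language model.
    return f"Answer using this context:\n{context}\n\nQuestion: {question}"

store = VectorStore()
store.add("Refunds are processed within five business days.")
store.add("Order status is available in the customer portal under 'My Orders'.")
print(build_prompt("How long do refunds take?", store))
```

In a production deployment the embedding model, vector store, and model call would come from the frameworks and services named above, but the orchestration flow is the same.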

Agents: New applications

In a traditional technology stack, applications are built on top of the operating system to perform specific tasks and deliver value to users. In the generative AI stack, AI agents take on the role of these applications, but their capabilities far exceed those of traditional software. AI agents are autonomous: they interact with their environment, learn from data, and perform tasks across domains, from customer service to software development.
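
The sketch below illustrates, in simplified form, what separates an agent from a conventional application: a loop in which the agent observes its task, asks a model to choose the next action, executes a tool, and repeats. The call_model function and the lookup_order tool are hypothetical placeholders standing in for a real model API and real business systems, not any particular framework's design.

```python
from typing import Callable, Dict

def call_model(prompt: str) -> str:
    """Placeholder for a generative model call; a real agent would send the prompt
    to a model API and parse its chosen action (hypothetical logic)."""
    if "shipped" in prompt.lower():
        return "finish"
    return "lookup_order" if "order" in prompt.lower() else "finish"

# Illustrative tools; real agents would call CRMs, databases, or other systems here.
TOOLS: Dict[str, Callable[[str], str]] = {
    "lookup_order": lambda task: "Order #1234 shipped yesterday.",
}

def run_agent(task: str, max_steps: int = 5) -> str:
    """Observe the task, ask the model for the next action, act, and repeat until done."""
    observation = task
    for _ in range(max_steps):
        action = call_model(f"Task: {task}\nObservation: {observation}\nTools: {list(TOOLS)}")
        if action == "finish" or action not in TOOLS:
            break
        observation = TOOLS[action](task)
    return observation

print(run_agent("Where is my order?"))
```

The observe-decide-act loop is what lets agents pursue goals over multiple steps rather than execute a single fixed function.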

For top managers, the rise of AI agents brings new strategic opportunities. These agents can act as personalized assistants that streamline workflows, manage operations, and even make decisions based on data-driven insights. Unlike traditional software, which typically depends on human intervention, AI agents can operate autonomously, continuously learning and improving. This shift has significant implications for both operational efficiency and customer engagement.

Take customer support, for example. In the past, businesses relied on a combination of scripted chatbots and human agents to handle customer inquiries. Today, AI agents powered by generative models can not only understand and answer questions phrased in natural language, but also anticipate customer needs and solve problems before they arise. This added sophistication allows companies to improve the customer experience while reducing operational costs.

Strategic implications for top management

For top executives, the shift from traditional OS-driven applications to the generative AI technology stack represents not only a technical evolution, but also a strategic change in how enterprises operate, innovate, and compete.

To lead this shift, leaders should consider the following:

Embracing the new technology stack: Adopting generative AI as a parallel technology stack is no longer optional; it is a necessity. Organizations that cannot integrate these technologies into their operational processes are likely to fall behind competitors that succeed in doing so. Early adopters are already seeing benefits such as sharper decision-making, greater operational efficiency, and improved customer experience.

Redefining IT infrastructure: The transition to AI-driven hardware and cloud-based infrastructure requires organizations to re-evaluate their IT architecture. Traditional servers and on-premises data centers may no longer be suitable for AI workloads. Instead, organizations need specialized AI hardware and cloud platforms that provide the scalability and flexibility generative AI systems require.

Automating with AI agents: AI agents are not only advanced applications but also autonomous systems that can handle complex tasks across functions. By deploying these agents across marketing, operations, and finance, organizations can reduce operational costs while improving decision-making and service delivery.

Investing in talent and training: While generative AI has great potential, realizing it requires specialized skills. Top management must ensure that their teams are capable of working with AI models, embeddings, and agents. Investing in AI training and bringing in talent with expertise in AI development will be key to maintaining a competitive edge.

Fostering a culture of innovation: The rise of generative AI as a new operating system technology stack offers unprecedented opportunities for innovation. Companies that encourage experimentation with AI will gain a first-mover advantage and be better equipped to develop new products, services, and business models. Top executives should lead by fostering a culture that supports AI-driven exploration.

Looking to the future

The evolution of generative AI into a parallel operating system technology stack is already underway. As the technology matures, we will see further development of AI models, agents, and infrastructure. For top managers, this shift presents both challenges and opportunities. Those who embrace the new stack are positioned to lead their industries, push the limits of efficiency, deepen customer engagement, and create more innovative products and services.

In the coming years, AI agents are likely to become as ubiquitous as traditional applications, and the businesses that deploy them effectively will gain a significant competitive advantage. Generative AI is shaping a new future, and the market players that harness its potential are poised to thrive in this unprecedented digital age.
