AI Growth Now Limited by Computing Power, Not Model Quality: Goldman Sachs

A Goldman Sachs report indicates that AI expansion is now limited by the availability and cost of computing power, not model quality. The focus is shifting from building better models to efficiently deploying them in production environments. AI-native companies are targeting gaps in traditional SaaS by building outcome-driven systems. Control over compute resources and efficient deployment will shape the next phase of industry growth.

Key Points

  • AI growth is constrained by computing power and cost, not model quality
  • Focus shifts from building better models to efficient deployment in production
  • AI-native companies target gaps between existing SaaS solutions, monetising on impact
  • Control over compute resources is central to the next phase of industry growth

New Delhi, April 22

The rapid expansion of artificial intelligence is now being constrained not by the quality of models but by the availability and cost of computing power, according to a report by Goldman Sachs.

The report highlighted that as demand for AI applications accelerates, particularly in real-world deployment or "inference" use cases, the need for compute infrastructure such as high-performance chips and data centres is growing faster than supply. This has made access to reliable and cost-efficient compute a key differentiator for companies operating in the AI space.

It stated, "Compute has emerged as the binding constraint in scaling AI. As inference demand grows faster than available capacity, the key differentiator is no longer just model quality, but the ability to reliably access and finance compute in the most performant way".

It noted that the focus in the industry is shifting away from just building better models to how effectively these models are run in production environments. Factors such as cost efficiency, reliability, performance, and the ability to route workloads across systems are becoming critical in determining competitive advantage.

According to the report, value in the AI ecosystem is increasingly moving towards the execution layer, including technologies that manage model deployment, optimise compute usage, and ensure consistent performance. Companies that can secure and efficiently utilise compute resources are better positioned to scale their AI offerings.

The report also pointed to a broader structural shift in the software industry. AI-native companies are not directly competing with traditional software-as-a-service (SaaS) firms but are instead targeting gaps between existing solutions. While SaaS platforms have historically focused on systems of record and functional silos, AI-native firms are building systems of action that deliver end-to-end outcomes.

These AI-driven solutions are being deployed faster, often going live within weeks, and are seeing higher conversion rates from pilot projects. Unlike traditional models that charge based on user seats or features, AI-native companies are increasingly monetising based on business impact and productivity gains.

Goldman Sachs further noted that applied AI is approaching an inflection point similar to the generative AI boom seen in 2022. While large-scale commercialisation may still take 5-10 years, advancements in foundation models, simulation, and edge technologies are enabling AI systems to move beyond decision support into real-world execution.

This includes applications across logistics, labour automation, and defence, where AI systems can act, learn, and improve continuously. Companies with sustained real-world deployments are expected to gain an advantage by building proprietary data ecosystems that enhance performance over time.

The report concluded that as AI continues to evolve, control over compute resources and the ability to deploy models efficiently will play a central role in shaping the next phase of industry growth.

- ANI

Reader Comments

Sneha F
Finally someone saying it! We've been obsessed with making bigger models in India but the real bottleneck is getting the GPUs and cloud credits. My startup spent 3 months just negotiating with AWS for compute. The "inference" demand they mention is real - we need cheaper, faster deployment options here. 😤
Vikram M
Interesting perspective but I think Goldman Sachs is missing the Indian context. Our strength in IT services isn't about building frontier models but applying AI efficiently for clients. The "systems of action" concept they mention actually fits our outsourcing model well. We should double down on deployment and cost optimization rather than competing with OpenAI.
James A
From a global perspective, this is a huge opportunity for India. With our engineering talent and lower operational costs, we could become the world's "AI execution hub." But we need massive investment in power infrastructure - data centers are energy hogs and our grid is already struggling. 5-10 year timeline seems optimistic for commercialisation.
Kavya N
As someone working in AI deployment for Indian manufacturing clients, this resonates deeply. We've got great models but deploying them on the shop floor with limited compute is a nightmare. The shift from "systems of record" to "systems of action" is happening, but our SMEs can't afford the cloud costs. Need more edge computing solutions made in India! 🇮🇳
Sarah B
The bit about AI-native companies monetising based on business impact is interesting. In India, we're still stuck on the SaaS model of per-seat pricing. But I worry this could lead to AI only being accessible to big corporations. We need public investment in compute infrastructure - maybe
