Field to foundry to factory: The evolving role of cloud in enterprise AI


  • Companies are moving past the fun experimentation phase of AI and starting the hard work of building practical applications that can be deployed at scale.
  • Automating routines and augmenting known processes will deliver more immediate value than AI-generated content.
  • Preparing enterprise data for AI will be critical.
  • Enterprise AI development will be conducted in the cloud, but where functional AI and its related data will reside is yet to be determined.

The fun first phase for generative AI is done. Now it’s time to get to work.

Cloud has been a terrific playing field for early generative AI, allowing tech execs, business leaders and digital novices to conjure potential uses. In the development of Topaz, its artificial intelligence suite, Infosys has compiled more than 10,000 potential AI use cases.

“Cloud has been one of our biggest enablers in this AI journey,” says Vivek Sinha, global head of AI and automation at Infosys.

How companies ultimately put generative AI to work remains to be seen. But where that discovery takes place is clear: Cloud served as the exploratory playground for generative AI. Now cloud will convert to forge, foundry and workshop for tech leaders to hammer out useful things from generative AI.

Whether the cloud AI foundry becomes a cloud AI factory will be a question for companies to answer in the future as generative AI tools reach production. That’s the consensus from technology leaders who gathered at Infosys’ Richardson Hub in February to explore how enterprises can behave in a more AI-first manner.

From possibilities to practicalities

Generative AI and its practitioners need to prove their relevance. To do so, they must move from wow and possibilities to work and practicalities.

“2024 is the year where we want to take that big leap of going from experimentations and cool demos to very boring products being deployed at scale, powered by generative AI,” says Miku Jha, director of AI, machine learning and generative AI at Google Cloud.

“It’s absolutely fun and easy to put use cases in LLMs for pilot deployments,” Jha says. “It’s exceptionally hard to take that same use case and get it into deployment.”

Cloud is an ideal platform for the first phases of generative AI development because cloud platforms can answer enterprise-spanning questions, from technology architecture and data governance to security, compliance, and regulation.

AI and generative AI are like cloud in that they all hold the potential for enterprise-spanning impact. While most companies and industries are still in an early phase of putting generative AI to work, Nvidia’s Nelson observed that industries as varied as finance, telecom, retail, healthcare, and life sciences appear to be leading the way toward production.

Pockets of value

Regardless of industry, the most promising pockets of value for generative AI begin with existing business functions.

“Anything that is repetitive can benefit from generative AI,” Google Cloud’s Jha says.

This matches findings from the Generative AI Radar research by the Infosys Knowledge Institute last year. North American business leaders see generative AI having its greatest impact in improving user experience, followed by boosting efficiency and automation. Functions such as improving design and creating content – the things that generated so much buzz in the free playground era of generative AI – will have lesser impact, the survey found.

Randall Witt, global IT infrastructure director for industrial tech manufacturer Littelfuse, says the company is focused on using AI to improve the customer service experience. As a manufacturer, that means using AI to reduce defects by studying repetitive processes and analyzing data to discover where issues could develop in the supply chain and operations. Its customers, meanwhile, are upgrading their sensors, software, and controls so they can add capabilities to support AI.

“We’re seeing a lot more in the area of predictive analytics and smart automation,” he says.

Increased productivity is at the core of generative AI’s economic potential, Google Cloud’s Jha noted. A 2023 McKinsey & Co. report indicated that about 70% of the $4.4 trillion economic potential of generative AI will come from increased labor productivity.

Global engineering and project management firm AECOM is focusing its AI efforts on labor productivity, says Abhay Dharkar, senior IT director. Finding ways to achieve labor efficiency begins with identifying where the labor force is collecting lots of data, he notes.

“We are looking at our data. To us, that's going to be the key factor and the differentiating factor. How can we use our data to be more efficient or make us more efficient in how we do our business?” Dharkar says.

Data + generative AI is a critical equation

But that will only be possible with quality data, notes Anant Adya, head of cloud and Infosys Cobalt.

“More and more enterprises are saying, how can we get our data ready for AI? That's where the gap is,” he says. “One of our preparations when we talk about AI to customers is, ‘How can you get your enterprise data ready for AI?’”
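As an illustration only (the article names no specific tooling), getting enterprise data "ready for AI" often starts with scrubbing obvious personal information and dropping duplicate records before any model sees the corpus. Everything in this sketch is a hypothetical stand-in: the regex patterns are crude examples, not a real PII detector, and the sample documents are invented.

```python
import hashlib
import re

# Illustrative patterns only; a production pipeline would use a proper
# PII-detection service rather than two regexes.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b")

def scrub(text: str) -> str:
    """Mask obvious PII before the text is exposed to an AI model."""
    text = EMAIL.sub("[EMAIL]", text)
    return PHONE.sub("[PHONE]", text)

def prepare(docs: list[str]) -> list[str]:
    """Scrub and deduplicate documents to produce an AI-ready corpus."""
    seen, out = set(), []
    for doc in docs:
        clean = scrub(doc.strip())
        digest = hashlib.sha256(clean.encode()).hexdigest()
        if digest not in seen:  # drop exact duplicates after scrubbing
            seen.add(digest)
            out.append(clean)
    return out

corpus = prepare([
    "Contact jane@example.com for invoices.",
    "Contact jane@example.com for invoices.",  # duplicate record
    "Support line: 555-010-1234, open 9-5.",
])
```

The design point is ordering: scrubbing happens before deduplication and indexing, so sensitive values never enter the downstream store in the first place.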

Generative AI is leading companies to think differently about data and how they can apply AI to their own data, Nvidia’s Nelson says. The key question, he says, is “how do I get this thing to have knowledge of my own data, in a way that's safe?”

Nelson observed that many pilot programs moving toward production take the form of chatbots or digital helpers. These prototypes apply an AI model to proprietary company data to produce an AI assistant for a specific use, such as HR processes or internal code generation.
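As a hedged sketch of the pattern Nelson describes (not any vendor's actual stack), such an assistant typically retrieves relevant company documents first, then grounds the model's prompt on them. The document store, the keyword-overlap retrieval, and the prompt format below are all invented stand-ins; a production system would use embedding search and a hosted LLM.

```python
from collections import Counter

# Hypothetical proprietary documents; in practice these would come from
# an indexed enterprise data store.
DOCS = {
    "hr-leave": "Employees accrue 1.5 vacation days per month of service.",
    "hr-remote": "Remote work requires manager approval and a signed policy form.",
}

def retrieve(question: str, docs: dict[str, str]) -> str:
    """Pick the document sharing the most words with the question
    (a toy stand-in for embedding search in a real RAG pipeline)."""
    q = Counter(question.lower().split())
    def overlap(text: str) -> int:
        return sum((q & Counter(text.lower().split())).values())
    return max(docs.values(), key=overlap)

def build_prompt(question: str) -> str:
    """Ground the model on company data: context first, then the question."""
    context = retrieve(question, DOCS)
    return f"Answer using only this context:\n{context}\n\nQ: {question}"

prompt = build_prompt("How many vacation days do employees accrue?")
```

Because the model only ever sees retrieved snippets inside the prompt, proprietary data stays in the company's own store, which is one way teams address the "knowledge of my own data, in a way that's safe" question.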

Google Cloud’s Jha argues that companies looking to bring AI to production should use a platform with robust data governance because legal and copyright issues around data keep good AI models out of production.

“Every enterprise should be asking for it: Your data is your data,” she says. “So at least in the context of generative AI, any training data is your data and any prompt that you engineer, that’s your data. Any output that comes from the model, that’s your data.”

Where to site the AI factory?

As their AI foundries begin producing useful things, companies must decide where to locate their AI factories: in cloud or on-premises?

AECOM follows a “cloud with purpose” philosophy, Dharkar says. While the company will use cloud for AI, it must balance that with its broader tech estate and where data and applications reside: in the cloud or on-premises.

“We need to make sure however we build our landscape for AI, it incorporates both,” he says.

Nvidia’s Nelson says how an enterprise thinks about the role of cloud in AI will shift as it moves from exploratory AI to proof-of-concept to production and scale.

AI applied in business tends to be cloud first, particularly in the exploration phase, but most companies run hybrid, multicloud technology estates that include on-premises systems. As companies near production with AI tools, that becomes the moment to re-evaluate cloud’s role in production AI in terms of security, scalability, regulation, and compliance.

“Do an exploration in the cloud, in an easy way. And then there's a bunch of important discussions to be had,” Nelson says.
