- GenAI is throwing a monkey wrench into the gears of the enterprise multi-cloud strategy
- Experts told Fierce Network that enterprise data strategies need to cover more than just data storage
- Hyperscalers also have a role to play in making data more accessible to cloud-based models
The data landscape is about to get even more complicated. Enterprises have spent the past several years trying to wrap their arms around what multi-cloud means for their data strategy, but generative artificial intelligence (GenAI) is about to shift the sands under their feet all over again, a pair of analysts and an executive from Kyndryl told Fierce.
According to Kyndryl’s Global Cloud Practice Leader Nicolas Sekkaki, the problem enterprises face is ensuring their data is in the right place so it can be accessed by the AI they intend to use. When the AI runs on one cloud and the data is stored in another, that means one or the other has to move. And that, he added, can be quite costly, depending on the volume of data.
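To put rough numbers on that, here is a minimal back-of-the-envelope sketch (in Python) of what repeatedly pulling a training corpus from one cloud into another can cost. The per-gigabyte egress rate and the data volumes are illustrative assumptions, not any provider's actual pricing.

```python
# Back-of-the-envelope egress cost estimate (all figures are illustrative assumptions).
EGRESS_RATE_PER_GB = 0.09   # assumed inter-cloud egress price, USD per GB


def egress_cost(dataset_gb: float, transfers_per_month: int) -> float:
    """Monthly cost of moving a dataset between clouds a given number of times."""
    return dataset_gb * transfers_per_month * EGRESS_RATE_PER_GB


# Example: refreshing a 50 TB training corpus into another cloud four times a month.
if __name__ == "__main__":
    cost = egress_cost(dataset_gb=50_000, transfers_per_month=4)
    print(f"Estimated monthly egress bill: ${cost:,.0f}")  # ~$18,000 at the assumed rate
```

The point of the arithmetic is Sekkaki's: once datasets reach tens of terabytes, routinely shuttling them to wherever the model runs becomes a recurring line item rather than a one-off migration cost.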
Add into the mix additional data coming from an enterprise’s software-as-a-service providers hosted on still other platforms – plus regulatory compliance considerations to top it off – and you have a recipe for chaos.
“Now with generative AI data is becoming even more important and this kind of question is rising. But industry hasn’t fixed it. I think this is the next complexity,” Sekkaki said. “Data architecture is becoming really complicated…If you’re not thinking about it now, down the road you will face this problem.”
Dealing with data
AvidThink Founder Roy Chua told Fierce that some enterprises have come out with what they call data strategies or data policies. But for the most part, he said, they’ve not done a great job. Why? Well, because “to the extent they have [implemented a data strategy], they’ve just dealt with the data storage problem.”
With generative AI in the picture, Chua said they need to expand their horizons.
“In terms of strategies it’s more than ‘how do I deal with data storage,’” he said. “The data transport, the data mobility question will become a bigger element that the CIOs will have to deal with as well.”
Sid Nag, VP for Gartner’s Technology and Service Provider Group, agreed. He noted the analyst firm in June came up with a template – called the Cross-Cloud Integration Framework – to help enterprises address the issue. The framework basically takes a “stackwise approach” to the problem, laying out the necessary network connectivity, security, services, applications and business outcomes so that data flows as it needs to across all environments.
If this sounds a little familiar, that’s because Gartner worked with Google on the hyperscaler’s Cross-Cloud Network. But Nag noted that offering only addresses the connectivity layer, leaving more work to be done. And with a nudge from GenAI-hungry customers, hyperscalers will likely be forced to move on this front.
Indeed, DE-CIX CEO Ivo Ivanov recently told Fierce that “AI cannot function” without data. “Anything that undermines the flow of data to and from an AI model will impact the value you can get out of the model,” he said. “All data sources – be they data lakes/warehouses or live data from IoT devices, the production infrastructure, customer networks, etc. – must flow unhindered to the location of the AI model for both training and inference purposes.”
Beware cloud silos
But today, there are still roadblocks in the form of cloud silos. Nag told Fierce that Gartner’s report outlines a way to fix them. The document calls on public cloud providers to “address cross-cloud GenAI-federated functionality by collaborating on standardized protocols and interfaces that allow AI models and data to be seamlessly distributed and executed across different cloud environments.”
How, exactly? Well, the report states, “This would involve establishing interoperability standards, data security and privacy protocols, and efficient communication frameworks to enable federated learning and inference across multiple clouds. Additionally, developing tools and services that facilitate the management, deployment and monitoring of federated AI systems and associated data across clouds would be essential.”
“Ultimately, the goal would be to create a unified ecosystem where users can leverage the strengths of multiple cloud providers while seamlessly accessing and deploying AI capabilities,” it concluded.
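To make the report's reference to "federated learning and inference across multiple clouds" a bit more concrete, here is a minimal sketch of the core idea – federated averaging – in Python. The model is a plain linear regressor and the "clouds" are stand-in local datasets; a real cross-cloud deployment would need the interoperability standards, security protocols and communication frameworks the report describes.

```python
import numpy as np

# Minimal federated-averaging sketch: each "cloud" trains on its own local data,
# and only model weights (not raw data) cross cloud boundaries.


def local_update(weights: np.ndarray, X: np.ndarray, y: np.ndarray, lr: float = 0.1) -> np.ndarray:
    """One gradient-descent step on local data for a linear model (illustrative)."""
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad


def federated_round(weights: np.ndarray, cloud_datasets) -> np.ndarray:
    """Each cloud computes an update locally; a coordinator averages the results."""
    updates = [local_update(weights, X, y) for X, y in cloud_datasets]
    return np.mean(updates, axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])

    # Stand-ins for datasets that stay resident in two different clouds.
    clouds = []
    for _ in range(2):
        X = rng.normal(size=(100, 2))
        y = X @ true_w + rng.normal(scale=0.1, size=100)
        clouds.append((X, y))

    w = np.zeros(2)
    for _ in range(50):
        w = federated_round(w, clouds)
    print("Learned weights:", w)  # approaches [2, -1] without ever pooling the raw data
```

The design choice the sketch illustrates is the one the report is driving at: instead of hauling every dataset to a single cloud, the model (or its parameters) travels to the data, which sidesteps both the egress costs and some of the data-residency concerns raised earlier in the piece.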