- Data traffic will skyrocket when inference takes over AI workloads.
- Moving information securely and efficiently will require AI exchanges, which are like cloud exchanges.
- AI exchanges require standardized interconnects, which don't exist today.
Developments in artificial intelligence (AI) are moving so fast that it’s hard to tell what might be coming around the next corner. But there’s one thing that everyone seems to agree on: at some point, AI will shift from training to inference, with machines talking to machines – and that’s going to be a complete game changer.
“When AI starts to talk to AI, we really think that's going to represent another fundamental shift in the volume of data required to run businesses that represents another step function change in terms of opportunity for this company,” Lumen CEO Kate Johnson told Fierce in a recent interview for the Five Nine podcast.
Digital Realty CTO Chris Sharp agreed that once machines start talking to machines, we’ll be “in that next era of AI capabilities.”
So, why does this matter? Well, leaving aside the implications for the AI itself, it means networks will have to evolve into data superhighways optimized for AI. Think cloud interconnection hubs on steroids.
In his previous roles, Sharp helped pioneer private interconnect to hyperscale clouds and built the first cloud exchange that allowed enterprise clients to hook into multiple clouds privately. He told Fierce that he sees parallels between what’s happening now with AI and what happened back in those days with the cloud. The industry will need “AI exchanges” to enable the machine-to-machine future.
“Enterprises are going to want to leverage the right LLM for their outcome, and they’ll want to have that orchestration of those capabilities because they want to be able to pick and choose just like the clouds,” he explained, noting Amazon, Google, Microsoft and other AI players each have unique strengths. “You want to be able to pull all that together and not have to have a monumental engineering feat to do that.”
AI exchanges would remove the complexity of connecting AI capabilities scattered across different locations. Sharp envisions them providing both a control/orchestration layer and physical-layer connections.
The control/orchestration layer would be software that routes data to the right destination and lets customers track how their bits are flowing. This will be especially critical given the sensitivity of the data used to inform AI models.
And the physical layer is the actual high-density fiber connectivity needed to transport data from one location to another.
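To make the idea concrete, here is a minimal, hypothetical sketch of what the control/orchestration layer's routing and audit logic might look like. It is not based on any vendor's actual product; the provider names, endpoint fields and routing rules are all invented for illustration. The point is simply that the software layer decides which model a workload goes to and keeps a record of where the data traveled over the physical interconnects.

```python
# Hypothetical sketch of an AI exchange's control/orchestration layer.
# All provider names, endpoints and routing rules are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ModelEndpoint:
    provider: str      # e.g. "aws", "google", "microsoft"
    model: str         # the specific LLM offered over the exchange
    private_link: str  # identifier of the private interconnect (physical layer)

@dataclass
class ExchangeRouter:
    # Maps a workload type to the endpoint an enterprise has chosen for it.
    routes: dict[str, ModelEndpoint] = field(default_factory=dict)
    audit_log: list[dict] = field(default_factory=list)

    def register(self, workload: str, endpoint: ModelEndpoint) -> None:
        self.routes[workload] = endpoint

    def dispatch(self, workload: str, payload_bytes: int) -> ModelEndpoint:
        """Pick the chosen model for a workload and record where the data went."""
        endpoint = self.routes[workload]
        self.audit_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "workload": workload,
            "provider": endpoint.provider,
            "model": endpoint.model,
            "link": endpoint.private_link,
            "bytes": payload_bytes,
        })
        return endpoint

# Usage: one router, several providers, full visibility into the data journey.
router = ExchangeRouter()
router.register("code-generation", ModelEndpoint("aws", "model-a", "link-101"))
router.register("document-search", ModelEndpoint("google", "model-b", "link-202"))
chosen = router.dispatch("document-search", payload_bytes=4_096)
print(chosen.provider, len(router.audit_log))  # -> google 1
```

In this sketch, the enterprise picks and chooses models per workload, exactly as it would pick clouds on a cloud exchange, while the audit log gives it the visibility into data flows that Sharp says will matter for sensitive training and inference data.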
DE-CIX CEO Ivo Ivanov also recently highlighted the importance of strong interconnection and AI exchanges in an interview with Fierce.
"The very essence of AI is data – AI cannot function without it," he explained. "Anything that undermines the flow of data to and from an AI model will impact the value you can get out of the model. All data sources – be they data lakes/warehouses or live data from IoT devices, the production infrastructure, customer networks, etc. – must flow unhindered to the location of the AI model for both training and inference purposes."
Ivanov continued: "If a company connects its own network to an exchange platform, and then connects directly with the chosen networks and cloud providers, they have control of the data journey, ensuring the shortest pathway, lowest latency and greatest level of security, while keeping the costs in check."
If this is all sounding kind of familiar, that’s because what Sharp and Ivanov are essentially describing is a data center. Indeed, Sharp said data centers seem to be a natural integration point for AI, networking and software.
There’s only one problem. Almost no one is thinking this far ahead yet.
“The role of interconnection and data exchanges has to be standardized and contemplated, and that’s not happening today,” Sharp said. “Customers and companies aren’t thinking about this exchange of data and the magnitude and capacity required that’s going to surpass anything we’ve seen on the internet…We need to be mindful of how AI evolves and how the interconnection becomes a critical success factor.”
But there’s a little time to get it together. Analysts previously told Fierce that inference is expected to surpass training as a share of AI workloads within the next two to three years.
Fierce reporter Julia King contributed to this story.
Update 9/30/2024 9:21 am ET: This story has been updated to add comments from DE-CIX CEO Ivo Ivanov.