Fierce Network TV

Network evolution is critical for AI at the edge

As enterprises increasingly deploy AI at the edge, the network infrastructure supporting these applications becomes ever more fundamental to success.  
 

Speaking ahead of Mobile World Congress in Barcelona, Kangwarn Chinthammit, Director and Technology Product Manager at VeloCloud, shares insights on this transformation. 
 

The continued migration of intelligent applications from the data center to the edge isn't just about technology. According to Chinthammit, it's about improving real-world operations, from factory quality control to customer experiences in retail and healthcare. And with Gen-AI applications now being deployed where business actually happens, network requirements are evolving rapidly.
 

This evolution brings new challenges, including increased upload traffic and encrypted applications that require sophisticated traffic management. Here, VeloCloud's SD-WAN solutions have adapted to ensure AI applications receive optimal performance while maintaining security and user experience. 
 

Chinthammit emphasizes that edge AI deployment creates a constant cycle of model training and updates between data centers and edge locations. This distributed approach requires networks capable of managing both application performance and model distribution effectively. 
 

VeloCloud is responding by putting AI at the center of its product strategy. The company is focusing on delivering cloud-based solutions that optimize performance for both end-users and applications, all while simplifying management of increasingly complex edge environments. 
 


Steve Saunders:

The enterprise landscape is undergoing a significant transformation with the rise of generative AI and agentic workloads at the edge. As organizations harness these technologies, they're redefining how they operate and interact with customers. 

Kangwarn Chinthammit:

So, we have been seeing growth at the edge for quite some time with new use cases, whether it's to improve quality in the factory or to improve the end-user experience when interacting at the retail store or the hospital. And now with the growth of AI, we're going to see massive change as enterprises incorporate this new technology to improve their operations, the end-user experience, and the efficiency of how they conduct business.

You actually want to use AI, or consume AI, outside of the data center because that's where the business is taking place. That's where you interact with your consumer, your customer, and AI is playing a big role in improving that user experience. And then in the factory, you want to use AI to improve the quality of the goods being produced and to improve working conditions and the safety of the worker.

When you actually deploy AI at the edge, the network becomes a critical component, because in the end, either you have the model deployed remotely or you have an application that makes use of AI. They have to communicate not only with the services that may be sitting in the data center, but also with one another. So, the network now becomes a very critical component in making that whole experience good for the end user.

The traffic flow is also changing, so there will be a lot more upload traffic. Applications are encrypted, so we need to be able to identify all these different traffic flows. And it's not just one, but many of them that constitute an application or use case, and they need to be protected and secured while making sure that the user experience is still good. SD-WAN has been focusing a lot on providing quality of service to the end user. It needs to be able to do the same for AI applications, making sure that those applications have the best performance in order to do their jobs.

So at VeloCloud, we recognize the trend that is happening, and we are putting AI front and center in our product: to be not only the network solution for AI, but also to use AI in every aspect of the solution, and to have a cloud-delivered product that provides performance and ease of use, and delivers the best possible experience not only to the end user but also to the applications.

As we push more intelligence and applications closer to the data source, and as you constantly deploy and improve your models, you need to train the model in the data center and then push it back out to the edge where the data source is. This constant cycle relies on the network to, number one, let you manage these highly distributed applications, and number two, update the model, get information back, and constantly improve it.

Steve Saunders:

As enterprises embrace AI and modern applications, the role of network infrastructure outside the data center becomes increasingly crucial. By leveraging these technologies, organizations can enhance efficiency, improve customer experiences, and maintain a competitive edge. The future is clear. Is your network infrastructure AI-ready?

The editorial staff had no role in this post's creation.