Artificial intelligence is poised to take a large step out of the cloud and into edge computing, which will benefit edge AI hardware vendors.
The shift from the cloud to edge AI, which spans devices, gateways and on-premises services, will be driven first by machine learning inference and later by training, according to a report by ABI Research. The share of AI inference performed at the edge will grow from just 6% last year to 43% by 2023, ABI projects.
For the telecom industry, moving network technologies from the core to the edge means faster provisioning of new services and applications. Paired with analytics, AI at the edge holds promise for Internet of Things applications, virtual reality, autonomous vehicles, and the rollout of 5G services.
While there's still a lot of ground to cover before autonomous vehicles and virtual reality reach mass deployment, the report highlighted how AI can push adoption rates forward across various verticals.
Vendors of power-efficient chipsets, such as Intel, NVIDIA and Qualcomm, are pushing to deliver hardware that will let automotive OEMs experiment with on-device AI training for autonomous driving.
In partnership with Cambricon Technologies, Huawei has already rolled out on-device AI training for battery power management on its P20 Pro handset. While on-device training at the edge is starting to gain momentum in R&D, it could be a while before it becomes a realistic approach for most segments, according to the report.
"The massive growth in devices using AI is positive for all players in the ecosystem concerned, but critically those players enabling AI at the edge are going to see an increase in demand that the industry to date has overlooked," said ABI Research industry analyst Jack Vernon, in a prepared statement. "Vendors can no longer go on ignoring the potential of AI at the edge.
"As the market momentum continues to swing toward ultra-low latency and more robust analytics, end users must start to incorporate edge AI in their roadmap. They need to start thinking about new business models like end-to-end integration or chipset as a service."
RELATED: AI, machine learning integrations will help push off-premise cloud service market to $374B by 2022
Vernon said the shift to the edge for AI processing is being driven by low-cost hardware, mission-critical applications, and the desire to avoid more expensive cloud applications.
"Consumer electronics, automotive, and machine vision vendors will play an initial critical role in driving the market for edge AI hardware," Vernon said. "Scaling said hardware to a point where it becomes cost effective will enable a greater number of verticals to begin moving processing out of the cloud and on to the edge."
ABI Research identified 11 verticals it said were ripe for the adoption of AI: automotive, mobile devices, wearables, smart home, robotics, small unmanned aerial vehicles, smart manufacturing, smart retail, smart video, smart building, and oil and gas. The firm projects 1.2 billion shipments of devices capable of on-device AI inference in 2023, up from 79 million last year.
ABI Research said cloud providers will still play a large role in AI training: of the 3 billion AI device shipments projected for 2023, more than 2.2 billion will rely on cloud service providers for training. Even so, cloud providers' share of the AI training market will drop from 99% today to 76% by 2023.