- AI workloads are expected to move from centralized training to inferencing at the edge
- Starlink might have more trouble serving these workloads than terrestrial telcos
- Latency, power and compute are key issues
Starlink burst onto the broadband scene in recent years, lighting up industry conversations like a supernova. But could the hype around the company end up flaming out in the era of edge AI?
The question is an interesting one, and the answer, according to several analysts, is: it depends.
Certainly, Starlink has a lasting place in the broadband conversation due to its unique ability to quickly connect folks in remote areas. And depending on how the next few years play out – that is, if the company secures money from the Broadband Equity, Access, and Deployment (BEAD) Program for expansions and gets its hands on more spectrum – it could become an even more prominent figure in the broadband scene.
But there are two things that artificial intelligence (AI), and edge AI in particular, requires that Starlink lacks: low latency and compute.
As Colin Campbell, SVP of Technology for North America at Cambridge Consultants, noted, “If you want to be truly on the edge, you want to be as close as possible [to end users], and space networks aren’t close” by definition. Additionally, satellites by design have limited space for the compute power required to process edge AI workloads at scale.
Sure, you could send those workloads to ground for processing, he said, but that would just add to the latency issue.
According to Starlink’s website, the company currently provides typical latency of 25 to 60 milliseconds (ms), with latency climbing to over 100 ms for certain extremely remote locations. While that sounds high compared to the 10-20 ms of latency fiber providers like AT&T and Frontier provide, it’s not actually that much of a problem – at least not yet.
Why? Well, as Recon Analytics founder Roger Entner told Fierce, “We are still looking for the use case where a few milliseconds or even 10 or 20 milliseconds of additional latency make a difference.”
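For a rough sense of when that gap starts to matter, here’s a back-of-the-envelope sketch in Python. The link latencies come from the figures above; the inference time and the 100 ms “interactive response” budget are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope: does network latency blow an edge AI response budget?
# Link round-trip times are the figures cited above; the inference time and
# the 100 ms interactivity budget are illustrative assumptions.

LINKS_MS = {
    "fiber (typical)": 15,      # midpoint of the 10-20 ms fiber range
    "Starlink (typical)": 40,   # midpoint of Starlink's 25-60 ms range
    "Starlink (remote)": 110,   # "over 100 ms" for extremely remote locations
}

INFERENCE_MS = 30  # assumed model inference time on the far-end server
BUDGET_MS = 100    # assumed budget for a response that still feels instant

for name, rtt in LINKS_MS.items():
    total = rtt + INFERENCE_MS
    verdict = "within budget" if total <= BUDGET_MS else "over budget"
    print(f"{name:20s} -> {total:3d} ms total ({verdict})")
```

Under those assumptions, Starlink’s typical latency clears the bar just as fiber does, which is Entner’s point; it’s only the remote-location figures that start to break interactive use cases.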
But given that AI is expected to evolve from primarily training today to inferencing at the edge in a few years’ time, that extra latency could soon become a problem for Starlink.
Houston, we have some problems
Jack Gold, of J. Gold Associates, pointed out there are a few other factors that likely won’t work in Starlink’s favor when it comes to serving edge AI. First, satellites don’t exactly get updated that often for obvious reasons. And second, they’re expensive.
In contrast, “edge systems on the planet are much less expensive, relatively easy to deploy and maintain, and can be updated as often as needed and at a reasonable cost,” he noted.
Then there’s just the nature of how satellite networks function. “You are not always connected to the same one even for the same communications,” Gold added. “The satellites will pass you off to the next one as they fly by overhead. So, if you are computing something on one, it may not even finish by the time you move to the next satellite…So it seems impractical to run AI on the satellite.”
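To make the handoff problem concrete, here’s a minimal simulation sketch. The pass duration and job lengths are illustrative assumptions (a LEO satellite serves a given spot for only a few minutes at a time), not Starlink specifications; the point is simply that any job longer than the remaining window has to be migrated or restarted.

```python
import random

# Illustrative sketch of the handoff problem Gold describes: a compute job
# pinned to one LEO satellite only has until the next handoff to finish.
# All numbers below are assumptions for illustration, not Starlink specs.

PASS_SECONDS = 180  # assumed time before the serving satellite changes
random.seed(1)

jobs = [random.uniform(10, 400) for _ in range(6)]  # assumed job lengths (s)

for job in jobs:
    remaining = random.uniform(0, PASS_SECONDS)  # time left in current pass
    if job <= remaining:
        print(f"job of {job:5.1f}s finishes on this satellite")
    else:
        # The job outlives the pass: its state must be handed to the next
        # satellite (or restarted), adding delay and complexity each time.
        handoffs = int((job - remaining) // PASS_SECONDS) + 1
        print(f"job of {job:5.1f}s needs ~{handoffs} handoff(s) to complete")
```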
And then there’s the question of what running any real AI compute on a satellite would do to the bird’s power load. If compute pushes the draw too high, that “may be a problem in getting enough solar power or at the least would add more costs,” Gold said.
Long story short, there seem to be a lot of drawbacks to the prospect of running edge AI applications on satellites. But there may be one use case that ends up being Starlink’s Goldilocks zone.
“For some applications, satellites may be the way to interact with terrestrial-based edge systems, especially for remote use cases where latency is not a critical issue (e.g., not health or safety related, where a few seconds or even minutes won’t affect outcomes negatively), or if there is no real terrestrial network to access,” Gold concluded.
It’s hard to know how, or even if, Starlink is thinking about offering edge AI services. Fierce tried to reach out to the company via its parent SpaceX (since Starlink apparently lacks a media contact) but got no response. Starbase, we’re here if you want to talk.