Ciena's CEO says company's growth is linked to AI

  • Ciena reported fourth-quarter earnings today that were better than expected
  • The improved outlook stems from the boom in artificial intelligence
  • AI is lifting all boats, with service providers and cloud providers collaborating to meet the demand

Ciena surprised this morning with a fiscal fourth-quarter 2024 earnings report that was much better than expected. After the earnings call, Fierce Network spoke with Ciena CEO Gary Smith, who said that much of the positivity derived from the activity around artificial intelligence (AI), which is bringing Ciena new business from service providers as well as cloud providers.

Smith kicked off today’s earnings call by reporting revenue of $1.12 billion for the quarter. He said orders in the quarter once again exceeded revenue, even though the company had expected orders to come in below revenue just a few months ago.

"Our business is linked heavily into the growth of bandwidth around the world," Smith told Fierce after the earnings call.

Gary Smith, CEO, Ciena. Source: Ciena

He attributed the uptick to increased demand for the company’s Reconfigurable Line Systems (RLS), primarily from large cloud providers. And he said the company was also doing well selling its WaveLogic coherent optical pluggables, which optimize performance in data centers as they support traffic from AI and machine learning.

The company also said it now expects average annual revenue growth of approximately 8% to 11% over the next three years.

One of the more interesting points from today’s call was the mention of the company’s Managed Optical Fiber Networks (MOFN) technology. MOFN is designed for global service providers that are building dedicated private optical networks for cloud providers.

Smith told Fierce that MOFN came about a few years ago when cloud providers wanted to enter countries where they weren’t allowed to build their own fiber networks. “They had to go with the incumbent carrier, but they wanted to have control of their network within country,” he said. “It was sort of a niche-type play. But we’ve seen more recently, over the last 6-9 months, that model being more widely adopted.”

He said the model is also being adopted in North America, with the most prominent example being Microsoft’s announcement that it was using Lumen Technologies to expand its network capacity.

MOFN is becoming more widely utilized, and the good news for Ciena is that cloud providers often request that Ciena equipment be used so that it matches the rest of their network, according to Smith.

The trickle-down effect

It’s been a rough couple of years in the telecom sector, with the hype around 5G fizzling out, cable broadband hitting the skids, and fiber broadband planning big things but not yet ready for large-scale deployments.

So the boom around generative AI couldn’t have come at a better time. And it looks like telecom service providers will benefit.

“All the investment so far has been in the super compute cycle. But now, it’s rolling out of the data center and becoming traffic. It’s an opportunity for service providers to extract value from that,” Smith said.

“As omnipotent as cloud providers are, they can’t provide everything,” Smith said.

It might also ease some of the tension between service providers and cloud providers. In Europe, for instance, service providers have complained that it isn’t fair that they must invest so much in their networks while large traffic generators such as Amazon and Netflix flood those networks without contributing to operations or maintenance.

Smith said MOFN is a way for service providers and hyperscalers “to collaborate and get value from the growth in cloud.”

From Ciena’s perspective, AI comes at a fortuitous time as well. “You’re having to connect these GPU clusters over greater distances. We’re beginning to see general, broader traffic growth in things like inference and training. And that’s going to obviously drive our business, which is why we’re forecasting greater than normal growth,” Smith said.