Op-ed: All hope is not lost for Intel in the AI market

AI has become the hot new arena for competition in the technology world. But Intel, despite being a major force in processors, has lagged in this space, reacting more slowly than rivals like Nvidia and AMD.

Indeed, companies like Nvidia have seen their value grow astronomically as they provided the raw processing power needed to pursue the AI solutions that promise to revolutionize computing technology and how we use it. Others, like AMD and a slew of startups, have been a bit slower to produce products for this space, but are also reaping the benefits of the demand for AI processors.

So, can Intel recover and capture a significant piece of the AI market? I believe they can.

Missteps

Intel does have its AI accelerator Gaudi, which it acquired when it purchased Habana Labs several years ago. And Gaudi does offer some higher-end capability for training solutions beyond the core CPU. But Intel's missteps in AI over the past four to five years, including some acquisitions it was never able to take full advantage of, have put it behind in the race to be a leader in AI solutions.

While Intel claims Gaudi can run AI workloads at a fraction of the cost of Nvidia chips on a price/performance basis, it still has a minor share of the market. Much of this is because Nvidia leveraged not only its hardware assets but also its software assets, with a large percentage of AI solutions built on its proprietary CUDA platform. There are many initiatives to move to open AI programming models (e.g., PyTorch, OpenVINO), but it will take some time before they overtake the CUDA environment, to the advantage of competitors like Intel.

Changing AI workloads

In the next two to three years, I expect to see 75% to 85% of AI workloads move to production inference-based systems, rather than the massive new model training systems of today. That doesn't mean the high-end training systems will go away. It does mean that as models are produced and sent to production, those solutions will be running on more traditional server and cloud systems that pair an AI accelerator with a typical CPU that can handle not only AI but traditional workloads as well.

This is an area where Intel can excel (and potentially AMD as well) and an area where Nvidia is at a disadvantage. And a move to more open standards for AI software will make it easier for Intel and others to compete as proprietary systems lose favor with enterprises. But it means that Intel has a couple of years before its efforts in powering AI solutions really take off and have a positive effect on its revenues.

Will Intel persevere, or will market pressure push it to shut down programs or divest resources in ways that undermine this progress? Either move would be a mistake in my opinion, but Intel, as a public company, will face pressure to act. Hopefully it can resist doing so until its emerging strengths and revenue opportunities become apparent.

All about the software, baby!

The AI software and frameworks being used to build models are a critical matter. To date, these have been optimized mostly for proprietary GPUs. But going forward, there are significant initiatives both to create open software (e.g., open AI frameworks like PyTorch) and to port and optimize solutions away from proprietary software like CUDA. While Nvidia has an interest in keeping its customers on its proprietary CUDA software for its GPUs, Intel and others have an overwhelming desire to open up this software ecosystem to become cross-platform compatible.

To that end, Intel has invested a significant amount of resources to promote open source, compatible AI software that enables models to run on virtually any processing platform. Not only is this a good approach to keep enterprises from becoming bound to a single processing supplier, but it's also critical for Intel's and other players' long-term success.
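To make the cross-platform point concrete: open frameworks like PyTorch address a model to a backend by a simple device string ("cuda" for Nvidia GPUs, "xpu" for Intel accelerators, "cpu" as the universal fallback), so the same model code can run on any vendor's hardware. A minimal sketch of that selection logic follows; the `pick_device` helper is illustrative, not part of any framework.

```python
def pick_device(cuda_available: bool, xpu_available: bool) -> str:
    """Choose the best available backend for running a model.

    "cuda" and "xpu" are the device strings PyTorch uses for Nvidia
    GPUs and Intel accelerators, respectively; "cpu" runs everywhere.
    """
    if cuda_available:
        return "cuda"
    if xpu_available:
        return "xpu"
    return "cpu"

# In real PyTorch code the availability flags come from the framework
# itself, and the model moves to whichever device is found, e.g.:
#   device = pick_device(torch.cuda.is_available(),
#                        hasattr(torch, "xpu") and torch.xpu.is_available())
#   model.to(device)
```

Because the vendor-specific detail is confined to that one string, porting a workload off a proprietary stack becomes a deployment decision rather than a rewrite, which is exactly the dynamic that favors Intel and other challengers.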

Intel has been making some significant progress in this regard. I expect its focus on cross-platform software infrastructure, and on not requiring customers to be on proprietary systems, will ultimately become a major advantage for Intel, as well as for the growth and diversification of the AI ecosystem.

Enterprises should be examining what it means to run on a single-source proprietary software framework, whether on-prem or in the cloud, and how that affects any possibility of moving to emerging processing products for potentially better TCO and ROI. It also has a major impact on the creation and use of open application models at repositories like Hugging Face. This is an area that Intel and others can exploit.

Intel has wisely focused on its own Intel Developer Cloud to give customers a platform to try out AI solutions with Intel software and hardware assets. This can kick-start developers primarily, but also large enterprises, on the road to deploying AI with relatively low risk, giving them a "try before you buy" capability.

Bottom Line: I expect Intel to ultimately gain a major share of the inference-based workload market as it pushes more aggressively on its performance CPUs and accelerators, and especially in its drive to promote open software and non-proprietary solutions that can run on its platforms.

Enterprises would do well to stay abreast of what Intel is up to, and watch carefully as new products are introduced. Intel clearly has regained momentum in AI and will be a good option in the near future.


Industry Voices are op-eds from industry experts or analysts invited to contribute by Fierce staff. They do not represent the opinions of Fierce.

Jack Gold is the founder and principal analyst at J.Gold Associates, LLC. With more than 45 years of experience in the computer and electronics industries, and as an industry analyst for more than 25 years, he covers the many aspects of business and consumer computing and emerging technologies.