- New company Articul8 will offer a GenAI platform based on Intel-developed technology
- Analysts told Silverlinings Articul8’s approach is similar to Amazon’s Bedrock in that it will allow enterprises to tap into multiple LLMs
- Orchestration will become a key capability for all AI players as the number of models increases, they added
Intel is jumping headlong into the generative artificial intelligence (GenAI) race, teaming with private equity backers to launch Articul8. The company is essentially a spin-off that will offer a full-stack GenAI platform based on Intel-developed technology.
Intel’s VP and GM of Data Center and AI Arun Subramaniyan, who formerly led AWS’s machine learning, quantum computing and HPC efforts, is jumping ship to lead the charge as Articul8’s first CEO.
Articul8's platform includes Intel Xeon Scalable processors and Intel Gaudi accelerators, but also supports a range of hybrid infrastructure alternatives, according to the announcement. Boston Consulting Group (BCG) first deployed the technology back in May 2023.
AvidThink Founder and Principal Roy Chua told Silverlinings the move “demonstrates Intel's willingness to spin out a fast and agile team that has demonstrated early market traction with credible partners like BCG (and their clients).”
He added that since “Intel is still working to improve its internal software culture and capabilities,” the new venture and external funding will allow Articul8’s team to “pivot, scale and succeed” faster than Intel might be able to alone.
Articul8 is aiming to differentiate itself in the GenAI space with its ModelMesh technology, which will sift through a collection of large language models (LLMs) and probabilistic models to find the one with the right functionality, size and cost performance for a given enterprise’s needs. The company’s platform can be deployed on any of the three major hyperscale clouds or on-premises.
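Articul8 has not published how ModelMesh makes that choice, but the general idea of routing a request to the best-fit model can be sketched in a few lines. The catalog entries, attributes and selection rule below are hypothetical placeholders for illustration, not Articul8's implementation.

```python
# Hypothetical sketch of capability/size/cost-based model routing.
# Model names, attributes and prices are invented for illustration only.
from dataclasses import dataclass

@dataclass
class ModelProfile:
    name: str
    capabilities: set          # e.g. {"summarization", "code"}
    params_billions: float     # rough proxy for model size/latency
    cost_per_1k_tokens: float  # assumed pricing, illustrative only

CATALOG = [
    ModelProfile("small-llm", {"summarization"}, 7, 0.0002),
    ModelProfile("large-llm", {"summarization", "code", "reasoning"}, 70, 0.002),
    ModelProfile("forecast-prob-model", {"forecasting"}, 0.1, 0.0001),
]

def pick_model(required_capability: str, max_cost_per_1k: float) -> ModelProfile:
    """Return the cheapest catalog model that supports the required capability."""
    candidates = [m for m in CATALOG
                  if required_capability in m.capabilities
                  and m.cost_per_1k_tokens <= max_cost_per_1k]
    if not candidates:
        raise ValueError("No model satisfies the request within budget")
    return min(candidates, key=lambda m: m.cost_per_1k_tokens)

print(pick_model("summarization", max_cost_per_1k=0.001).name)  # -> small-llm
```

In practice a router like this would weigh accuracy, latency and data-residency constraints alongside cost, but the principle is the same: the platform, not the enterprise, decides which model handles each request.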
AI buffet
Gartner VP of Cloud Services and Technologies Sid Nag told Silverlinings that Articul8’s offering is aimed squarely at “enterprises that require deployment of Generative AI in a secure manner and for specific industries that have regulatory requirements around data protection and data privacy.” Think entities in finance, telecommunications, aerospace and government, among other verticals.
He added, “deployment of vertical and domain specific GenAI is increasingly going to be in demand by enterprises and some of these deployments may or may not need the public cloud and could be deployed on-premises given the size of these curated LLMs - which is why this technology offers the option of both on premises and cloud based deployment models.”
Both Nag and Chua noted that ModelMesh appears to be similar to Amazon’s Bedrock offering in that it supports multiple LLMs. But Nag also highlighted Articul8’s inclusion of probabilistic models. These, he said, are a type of machine learning model that can deliver better outcomes than LLMs in certain instances.
“Probabilistic models are a great way to understand the trends that can be derived from the data and create predictions for the future,” he explained.
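Nag's point can be made concrete with a minimal sketch: a probabilistic model treats a quantity as a distribution rather than a single value, so it can return a forecast together with an uncertainty range. The sales figures and the normal-distribution assumption below are purely illustrative and are not drawn from Articul8's platform.

```python
# Toy probabilistic forecast: fit a normal distribution to historical
# month-over-month growth rates, then project the next value with a range.
# The data points are made up for illustration.
import statistics

monthly_sales = [100, 108, 115, 125, 131, 140]  # hypothetical history

# Month-over-month growth rates
growth = [b / a - 1 for a, b in zip(monthly_sales, monthly_sales[1:])]
mu = statistics.mean(growth)
sigma = statistics.stdev(growth)

last = monthly_sales[-1]
point_forecast = last * (1 + mu)
# Roughly a 95% interval under the normal assumption on growth
low, high = last * (1 + mu - 2 * sigma), last * (1 + mu + 2 * sigma)

print(f"Next month: {point_forecast:.1f} (range {low:.1f} to {high:.1f})")
```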
Chua pointed to ModelMesh’s orchestration capabilities as another key offering from Articul8 and tipped players of all shapes and sizes to increase investments on this front as the number of LLMs and foundation models increases.
To date, Intel’s role in the AI realm has primarily been focused on chips. Last month, the company trotted out a trio of new processors and accelerators during its AI Everywhere event in New York City. Articul8, though, gives Intel a path to get into the AI model game. The move comes after Intel joined the IBM and Meta-led AI Alliance in December.
Investors in the new venture include DigitalBridge (which has also invested in DataBank, Vantage Data Centers and Zayo, among other companies), Fin Capital, Mindset Ventures, Communitas Capital, GiantLeap Capital (which contributed $40 million), GS Futures and Zain Group.
Silverlinings reached out to Intel for comment on the announcement but did not receive a response in time for publication.