ChatGPT, the much-hyped generative AI chatbot from OpenAI, is attracting the attention of the largest tech companies in the world, with Amazon Web Services (AWS), Google Cloud, Microsoft and Nvidia standing to benefit from the tool’s popularity and capabilities.
In fact, Microsoft announced this week that ChatGPT would soon be added to Azure OpenAI Service, which provides access to advanced coding and language AI models and became generally available on Monday as part of Microsoft's partnership with OpenAI. Customers will gain access to a "fine-tuned" version of GPT-3.5 that has been trained on, and runs inference on, Azure AI infrastructure, according to Microsoft.
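For developers, access generally flows through the OpenAI Python SDK pointed at an Azure endpoint rather than OpenAI's own servers. The sketch below illustrates that general pattern; the resource name, deployment name, key and API version are placeholders for illustration, not values taken from Microsoft's announcement.

```python
import openai

# Illustrative sketch of calling Azure OpenAI Service via the OpenAI Python SDK
# (pre-1.0 style). The resource name, deployment name, key and API version are
# placeholders; substitute values from your own Azure subscription.
openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE-NAME.openai.azure.com/"
openai.api_version = "2022-12-01"
openai.api_key = "YOUR-AZURE-OPENAI-KEY"

response = openai.Completion.create(
    engine="my-gpt35-deployment",  # the Azure deployment name, not the model name
    prompt="Write a two-line summary of what a large language model does.",
    max_tokens=60,
    temperature=0.7,
)
print(response.choices[0].text.strip())
```

The key difference from calling OpenAI directly is that requests target a deployment the customer has created in their own Azure resource, which is how the workload lands on Azure infrastructure and shows up as Azure consumption.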
“While several tech players have been working on this for years, a combination of better compute, more data and advancements in architecture has improved generative AI to the point where it can now create content and interaction that feels close to human,” UBS equity analyst Lloyd Walmsley wrote in a report on ChatGPT released this week. “Productization of this tech is seeing an inflection point with OpenAI providing the missing piece of the puzzle: widespread access to developers and the public.”
A large language model (LLM) that’s trained on massive amounts of text to interact conversationally with users, ChatGPT has taken off since its public debut on Nov. 30. People are using it to generate code, write everything from songs to research grants and tell bad jokes. Silverlinings employs ChatGPT – aka Chatty ChatGPT – as its poet laureate.
Microsoft invested $1 billion in OpenAI in 2019 to become its "preferred partner" for commercializing new AI technology and, according to reports, plans an additional $10 billion investment.
Walmsley expects growing use of ChatGPT to boost the tech giant through increased consumption of its Azure cloud, a "leg up" on broader AI workloads through the Azure OpenAI Service, and the embedding of the technology into its Bing web search engine and Office 365 collaboration and productivity tools.
“Microsoft has begun embedding OpenAI technology into its products, such as GitHub Copilot,” he wrote, referring to the service that uses AI to turn natural language prompts into coding suggestions for programmers. “These technologies have the potential to continually differentiate Microsoft's products across a variety of products.”
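In practice the Copilot workflow is simple: a programmer writes a natural-language comment describing the desired behavior, and the assistant proposes an implementation inline. The snippet below is an illustrative mock-up of that pattern, not output captured from Copilot itself; actual suggestions vary with the surrounding code and context.

```python
from collections import Counter
import re

# Prompt a developer might type as a comment:
# "Return the n most common words in a block of text, ignoring case."
#
# A Copilot-style suggestion could then resemble the function below.
def most_common_words(text: str, n: int = 10) -> list[tuple[str, int]]:
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words).most_common(n)

print(most_common_words("To be, or not to be: that is the question.", 3))
```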
For Nvidia, a chipmaker and computing company, increased use and commercialization of LLMs are expected to drive demand for its graphics processing units (GPUs).
A ‘near-term threat’ to Google
But ChatGPT could be a "near-term threat" to Google, potentially costing it Google Search query share and posing revenue risks tied to changes in search formats, according to Walmsley, who also acknowledged a potential upside.
“On the (+) side, recent use cases (e.g. ChatGPT, GitHub Copilot) help bring to life apps that could spin out of Google's own advanced work in the space,” Walmsley wrote. “We do not think that Google lags OpenAI from a technology standpoint. In fact, our research and checks have suggested that Google's LLM tech (based on its Pathways architecture) is best in class, but the company has just not yet opened it up to the public in the same way that OpenAI has.”
And, as with Microsoft Azure, AWS and Google Cloud Platform should benefit from ChatGPT through increased use of their cloud compute and storage services, he added.