AI

Sweltering in the heat? So is Oracle's new GenAI suite

  • Oracle launched HeatWave GenAI, a suite of generative AI capabilities integrated into its cloud database

  • HeatWave GenAI includes in-database large language models (LLMs), an automated in-database vector store and scale-out vector processing 

  • Oracle pitched itself as an alternative to Snowflake, Google BigQuery and Databricks for vector processing

Oracle lived up to its name this week with a prophetic launch that timed perfectly with national weather patterns.

As millions of Americans sizzle in a record-breaking heatwave, Oracle debuted its HeatWave GenAI, a suite of generative artificial intelligence (AI) capabilities integrated directly into its cloud database. The HeatWave lineup includes in-database large language models (LLMs), an automated in-database vector store and scale-out vector processing capabilities.

Gartner analyst Balaji Abbabatulla said the portfolio follows a trend toward GenAI offerings that has become “prevalent among software providers.” Indeed, Oracle pitched itself as an alternative to the likes of Snowflake, Google BigQuery and Databricks.

By embedding LLMs and automated vector processing within the database itself, HeatWave aims to help businesses build GenAI applications without specialized AI expertise, without moving data and without incurring additional costs.

Vector processing specifically involves performing the same operation on multiple data points simultaneously, which significantly speeds up data handling and computation.

This approach is especially useful for tasks like data analysis, machine learning and semantic search where large volumes of data need to be processed quickly and efficiently.
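
To make that idea concrete, here is a minimal, generic sketch in Python with NumPy. It is purely illustrative and unrelated to Oracle's implementation, and the embedding sizes are made up; the point is that a single matrix operation scores every stored embedding against a query at once, rather than looping over documents one by one.

```python
# Illustrative only: vectorized similarity scoring in NumPy, not HeatWave's
# implementation. One matrix-vector product applies the same dot-product
# operation to every document embedding simultaneously.
import numpy as np

rng = np.random.default_rng(0)
doc_vectors = rng.normal(size=(10_000, 384))   # hypothetical document embeddings
query = rng.normal(size=384)                   # hypothetical query embedding

# Normalize rows so dot products equal cosine similarity.
doc_norms = doc_vectors / np.linalg.norm(doc_vectors, axis=1, keepdims=True)
query_norm = query / np.linalg.norm(query)

# One vectorized operation scores all 10,000 documents at once.
scores = doc_norms @ query_norm

# Indices of the five most semantically similar documents.
top5 = np.argsort(scores)[-5:][::-1]
print(top5, scores[top5])
```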

Those most likely to benefit from Oracle's HeatWave GenAI offerings, Abbabatulla told Fierce, are global enterprise customers with a multi-region, multi-cloud footprint who want to move their GenAI proofs of concept to deployment at scale with a “consistent experience.”

HeatWave GenAI components

Headline features of the new HeatWave portfolio include the following:

  • In-Database LLMs: HeatWave GenAI includes two LLMs – Llama 3 and Mistral – that operate within Oracle's database. These enable the completion of tasks like data search, content generation and retrieval-augmented generation (RAG), meaning customers don’t need external GPUs or AI services (a conceptual sketch of the RAG flow follows this list).
  • Automated in-database vector store: This feature allows users to create and manage vector stores for unstructured content directly within the database, simplifying the process and enhancing security. Abbabatulla said the concept of vector storage is not new, but aspects of HeatWave GenAI such as the GenAI-driven automation of the vector store and the continuous optimization of compute and storage costs are.
  • Scale-out vector processing: This provides fast and accurate semantic search results using standard SQL, leveraging HeatWave’s scale-out architecture for high performance.
  • HeatWave Chat: A user-friendly tool that lets developers interact with the database using either natural language or SQL, maintaining context and history of queries to get better insights.
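
For a sense of how those pieces fit together, below is a toy, self-contained Python sketch of the general RAG pattern the list describes. The embed(), vector_search() and generate() functions are hypothetical stand-ins invented for illustration; they are not HeatWave GenAI APIs, and a real deployment would rely on the database's own vector store and in-database LLMs instead.

```python
# Toy sketch of the retrieval-augmented generation (RAG) pattern: retrieve
# relevant documents from a vector store, then ground an LLM prompt in them.
# All helpers here are stand-ins, not HeatWave GenAI APIs.
import math
import hashlib
from typing import List

def embed(text: str, dim: int = 16) -> List[float]:
    """Stand-in embedding: hash words into a fixed-size vector."""
    vec = [0.0] * dim
    for word in text.lower().split():
        h = int(hashlib.md5(word.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    return vec

def cosine(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Stand-in "vector store": documents embedded up front and kept in memory.
documents = [
    "HeatWave GenAI adds in-database LLMs to the cloud database.",
    "Vector processing applies one operation to many data points at once.",
    "Semantic search finds documents by meaning rather than keywords.",
]
store = [(doc, embed(doc)) for doc in documents]

def vector_search(question: str, k: int = 2) -> List[str]:
    """Return the k stored documents most similar to the question."""
    q = embed(question)
    ranked = sorted(store, key=lambda item: cosine(q, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

def generate(prompt: str) -> str:
    """Stand-in for an LLM call (e.g., an in-database Llama 3 or Mistral)."""
    return f"[LLM answer grounded in a prompt of {len(prompt)} characters]"

def answer(question: str) -> str:
    context = vector_search(question)  # retrieve relevant content
    prompt = "Context:\n" + "\n".join(context) + f"\n\nQuestion: {question}"
    return generate(prompt)            # generate a grounded answer

print(answer("What is vector processing?"))
```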

HeatWave GenAI is built on Oracle's existing HeatWave platform, which already unifies transactional and analytical processing within a MySQL-compatible service.

The Oracle GenAI suite is now available across all Oracle Cloud regions and OCI Dedicated Regions. Abbabatulla noted the new GenAI offering “should resonate with Oracle’s OCI customer base.”

However, he said while the suite “has potential to deliver business value, we will need to wait for deployment at scale, across multiple cloud and customer environments to validate the real business impact.”

Competitive landscape

Oracle made bold claims that HeatWave GenAI offers significant speed and cost advantages over competitors: the technology is said to be 30 times faster than Snowflake, 18 times faster than Google BigQuery and 15 times faster than Databricks for vector processing, while also being cheaper.

While we can neither confirm nor deny those comparisons, here’s what we know about these other GenAI platforms:

Snowflake AI Data Cloud: A platform designed to integrate AI and machine learning capabilities into the data management process. Informatica this month announced Native SQL ELT support for Snowflake’s Cortex AI Functions, as well as the launch of Enterprise Data Integrator (EDI) and Cloud Data Access Management (CDAM) for Snowflake.

Databricks Lakehouse platform: Unifies data management and analytics by combining data engineering and AI. Built on open source and open standards, the lakehouse architecture combines elements of data lakes and data warehouses.

Google BigQuery: A serverless data warehouse platform that enables scalable data analysis and AI model integration. It integrates with various AI and machine learning tools in Google Cloud, but also supports querying data stored on other clouds.

GlobalData analyst Charlotte Dunlap said application platform leaders including Oracle, IBM, Red Hat, Microsoft and Salesforce have an edge over pure plays in the GenAI game, because they’re "laser focused" on integrating AI capabilities into their comprehensive app development solutions.

"This addresses enterprise DevOps teams’ need to quickly build modern apps and deploy them in highly complex distributed cloud environments with greater ease," Dunlap told Fierce. 

"It makes sense Oracle is playing to its database strengths through GenAI-injected improvements. We’ll see application platform rivals continually integrating GenAI across their solutions including automation, observability, security, and FinOps."

6/2/24 11:00 am EST: This article was updated with additional comments from GlobalData analyst Charlotte Dunlap.