Move over, data platforms – this is the dawning of the 'Age of Intelligence'

  • The observability industry is about to make its second major pivot, shifting focus from data platforms to AI
  • An increase in data volume and complexity is pushing beyond the bounds of what a human can manage alone
  • AI-enabled observability systems can help spot issues even when alerts haven’t been set up

To hear New Relic’s Chief Technology Strategist Nic Benders tell it, the observability industry is on the cusp of a monumental shift.

According to Benders, the early days of observability were about instrumentation – that is, ensuring the right tools were available to help IT teams spot anomalies and gain full visibility into their systems. But with the rise of the cloud, the focus shifted to data around 2014. The goal, he said, was to be the platform of choice for data storage to make queries and analysis that much easier. And now, things are about to shift again.

As Benders put it, the observability market is entering the Age of Intelligence, wherein observability tools aren’t just used to monitor artificial intelligence (AI) deployments but also tap into the power of AI themselves to see and do more than a human could alone.

“The future is being the place where the reasoning occurs and some of the data lives,” he told Fierce.

Shifting sands

To be clear, this doesn't mean that data platforms will disappear any more than instrumentation disappeared at the previous shift. It just means that data platform offerings have become table stakes. The real differentiator going forward will be intelligence.

Why? Well, Benders explained that in a multi-cloud, multi-platform world, it’s just not practical to strive to be the only place data is stored. And besides, these days there’s just too much data and too much complexity for a human to manually manage alone.

That means it’s basically impossible to dream up and configure alerts for every possible anomaly that deserves attention. But AI can help.

In the Age of Intelligence, “it’s no longer about the ability to answer questions but to know what questions to ask” in the first place, Benders said.

“The long strategy of ‘I’m going to create a dashboard for everything, I’m going to write an alert for everything’ is not a race we can win,” he continued. “Computers have increased in complexity, software has increased in complexity, our jobs have increased in complexity so much that we have to shift some of that burden back into the computer.”
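To make that concrete, here is a minimal Python sketch of the kind of rule-free detection Benders is describing: rather than a human writing an alert for a specific threshold, a rolling statistical baseline flags any metric that drifts sharply from its own recent behavior. The metric stream, window size and threshold are illustrative assumptions, not any vendor's actual implementation.

```python
# Minimal sketch: surfacing an anomaly with no hand-written alert rule.
# The metric stream, window size, and threshold are illustrative
# assumptions, not any vendor's actual implementation.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(stream, window=60, threshold=4.0):
    """Flag points that deviate sharply from the recent baseline.

    Instead of a human pre-configuring "alert if latency > 500 ms",
    the detector learns a rolling baseline and flags outliers on any
    metric it sees -- the rule-free detection Benders describes.
    """
    history = deque(maxlen=window)
    for t, value in enumerate(stream):
        if len(history) >= window // 2:  # wait for a minimal baseline
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                yield t, value, mu  # timestamp, observed, expected
        history.append(value)

# Usage: a steady latency series with one unannounced spike.
latencies = [100.0 + (i % 5) for i in range(120)]
latencies[90] = 480.0  # a regression nobody wrote an alert for
for t, observed, expected in detect_anomalies(latencies):
    print(f"t={t}: {observed} ms vs baseline ~{expected:.0f} ms")
```

In a real deployment, a detector like this would run across thousands of metrics at once, which is exactly the scale argument Benders makes: nobody can write that many alert rules by hand.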

Benders isn’t alone in highlighting this issue. In its 2024 State of Observability report, Dynatrace noted that 86% of 1,300 technology leaders surveyed said cloud-native technology stacks have produced an “explosion of data that is beyond humans’ ability to manage.” The report also found that 93% of organizations have adopted AIOps, or plan to do so within the next 12 months, to help manage complex multi-cloud environments.

On the horizon

In practice, Benders said solving this problem requires two things. First, it requires the ability to retrieve data from a wide variety of sources in a non-competitive manner. On this front, APIs and partnership arrangements will be key, he said.

Second, it will require a kind of agentic AI known as a mixture of experts. Specifically, Benders said there will be an orchestrator AI with a natural language user interface (something like a large language model) that will figure out what problem it’s dealing with and send requests to different systems for data. These might include statistics repositories, time series predictors or even other LLMs.
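Here is one way that orchestration pattern could look in code, reduced to its skeleton. Everything in the sketch (the expert classes, the keyword routing, the sample answers) is a hypothetical stand-in for illustration, not New Relic's architecture; in practice the routing step would itself be an LLM rather than keyword matching.

```python
# Hedged sketch of the orchestrator pattern Benders describes: a
# front-end router interprets a question, then fans requests out to
# specialist back ends. Every class and routing rule here is a
# hypothetical stand-in, not any vendor's actual architecture.
from dataclasses import dataclass

@dataclass
class Answer:
    expert: str
    detail: str

class StatsRepository:
    def query(self, question: str) -> Answer:
        return Answer("stats", "p99 checkout latency: 412 ms (illustrative)")

class TimeSeriesPredictor:
    def query(self, question: str) -> Answer:
        return Answer("forecast", "disk fills in ~6 days at current rate (illustrative)")

class DiagnosticLLM:
    def query(self, question: str) -> Answer:
        return Answer("llm", "likely cause: connection-pool exhaustion (illustrative)")

class Orchestrator:
    """Decides which expert(s) a natural-language question needs.

    A production router would itself be an LLM; keyword matching
    stands in for that classification step here.
    """
    def __init__(self):
        self.experts = {
            "current": StatsRepository(),
            "will": TimeSeriesPredictor(),
            "why": DiagnosticLLM(),
        }

    def ask(self, question: str) -> list[Answer]:
        q = question.lower()
        hits = [e for kw, e in self.experts.items() if kw in q]
        # Fall back to the diagnostic LLM when no specialist matches.
        return [e.query(question) for e in (hits or [self.experts["why"]])]

for a in Orchestrator().ask("Why is checkout slow, and will disk usage keep growing?"):
    print(f"[{a.expert}] {a.detail}")
```

The design point is the fan-out: the orchestrator doesn't answer anything itself. It decides which specialist systems a question needs and assembles their answers.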

The idea is to “stitch together multiple systems” with AI. Benders noted New Relic isn’t the only vendor thinking along these lines.

And according to Gartner’s August 2024 Magic Quadrant ranking for Observability Platforms, it’s up against some stiff competition. New Relic is a top three leader facing off with Dynatrace and Datadog. But Amazon Web Services and Microsoft are hot on its heels as Challengers in the market.

Moor Insights and Strategy's Jason Andersen told Fierce there are "hurdles aplenty" to getting customer buy-in.

"The biggest one I think is that there is a lack of trust on many dimensions. Do I trust the model to be accurate? Do I trust the vendors expertise in qualifying and supporting the AI solution? Do I trust my team to be able to manage this thing once it is deployed?" he said. Andersen added different industries will also have to sort through and agree upon "what is and what is not proprietary data" to determine what is fed into the AI and how. 

As we've heard before, it'll also require cultural changes within each organization. Specifically, "process re-engineering and people changing," Andersen said.

"What I am seeing now is what I am calling 'Turbo Button AI,' where a person can do an existing task the same as they used to, just faster. That is nice for now, but real ROI comes from transforming, not just speeding up," he concluded.