UPDATED 13:42 EDT / FEBRUARY 01 2024


Riding the wave: How Dynatrace looks to capitalize on growth of AI and need for transparency

Companies are moving rapidly to integrate artificial intelligence into key workloads, but can anyone really be sure of what’s happening inside the AI engine?

This was the central question in Las Vegas this week as Dynatrace Inc. unveiled a series of announcements at its annual Perform event designed to add a new dimension to the unified observability and security market. The company introduced observability for large language models and launched a new core platform that provides a single pipeline for managing petabyte-scale data ingestion.

Dynatrace is seeking to separate itself from competing observability firms by offering a better understanding of data-hungry AI application performance using predictive and causal machine learning. As Chief Executive Rick McConnell (pictured) noted in his opening keynote remarks on Wednesday, Dynatrace is seeking to add value through insight.

“Our approach is radically different,” McConnell said. “The Dynatrace advantage is delivering answers, not just data. Those answers enable resolution.”

Appeal for developers

Dynatrace’s AI Observability solution is designed to cover the end-to-end AI stack, including Nvidia GPUs, foundation models such as GPT-4, orchestration frameworks and vector databases. That should appeal to developers who are under increased pressure to build highly accurate models without breaking the bank, according to Andreas “Andi” Grabner, DevOps activist at Dynatrace and a Cloud Native Computing Foundation ambassador.

“Cost, efficiency and accuracy are three pillars for me that are very important to highlight for developers,” Grabner said in an exclusive interview with SiliconANGLE. “We can help developers build new and different systems using AI.”
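To make those three pillars concrete, the sketch below shows the kind of per-call telemetry an LLM observability tool might collect: token counts, latency and an estimated dollar cost. This is an illustrative sketch only, not Dynatrace’s product or API, and the `price_per_1k` rates are hypothetical placeholders — real pricing varies by model and provider.

```python
from dataclasses import dataclass, field


@dataclass
class LLMCallRecord:
    """Telemetry captured for a single model invocation."""
    model: str
    prompt_tokens: int
    completion_tokens: int
    latency_s: float
    cost_usd: float


@dataclass
class LLMUsageTracker:
    """Accumulates cost and latency signals across LLM calls.

    price_per_1k maps a model name to a hypothetical price per
    1,000 tokens; any real deployment would use the provider's
    actual rate card.
    """
    price_per_1k: dict
    records: list = field(default_factory=list)

    def record(self, model: str, prompt_tokens: int,
               completion_tokens: int, latency_s: float) -> LLMCallRecord:
        rate = self.price_per_1k.get(model, 0.0)
        cost = (prompt_tokens + completion_tokens) / 1000.0 * rate
        rec = LLMCallRecord(model, prompt_tokens, completion_tokens,
                            latency_s, cost)
        self.records.append(rec)
        return rec

    def total_cost(self) -> float:
        return sum(r.cost_usd for r in self.records)
```

Aggregating these records over time is what would let a developer see whether a cheaper model or a shorter prompt meaningfully moves the cost needle without hurting accuracy.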

The company’s observability solutions leverage Davis AI, a causation engine that can automatically detect performance anomalies in apps, services and infrastructure. Davis was first introduced by the company in 2017.
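The core idea behind automatic anomaly detection — flagging a metric that deviates sharply from its learned baseline — can be sketched with a simple rolling z-score. To be clear, this is a toy illustration of the general technique, not Davis AI’s actual algorithm, which layers causal analysis on top of detection.

```python
from collections import deque
from statistics import mean, stdev


class BaselineAnomalyDetector:
    """Flags metric samples that deviate sharply from a rolling baseline.

    Illustrative only: a real causation engine correlates anomalies
    across apps, services and infrastructure rather than scoring a
    single metric in isolation.
    """

    def __init__(self, window: int = 30, threshold: float = 3.0):
        self.window = deque(maxlen=window)  # recent "normal" samples
        self.threshold = threshold          # z-score cutoff

    def observe(self, value: float) -> bool:
        """Return True if the sample looks anomalous.

        Anomalous samples are excluded from the baseline so a burst
        of bad readings does not drag the baseline toward them.
        """
        is_anomaly = False
        if len(self.window) >= 10:  # require a warm-up period
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                is_anomaly = True
        if not is_anomaly:
            self.window.append(value)
        return is_anomaly
```

Feeding the detector steady response times around 100 milliseconds and then a 500-millisecond spike would trip the threshold, while normal jitter would not.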

“Dynatrace has been doing AI for many years, with Davis AI, so it makes sense that they would understand the AI value chain,” said Rob Strechay, managing director and lead analyst at theCUBE Research. “Observability of AI applications, not just the data, will become increasingly important as large language models and segmented language models become sources of record for organizations and their customers to rely upon.”

Tackling sprawl

Dynatrace’s newly announced OpenPipeline technology is designed to address the growing challenge of “pipeline sprawl” that can plague observability environments. There is evidence of a market need for this type of offering. Survey data recently provided by Enterprise Technology Research to SiliconANGLE showed that nearly half of respondents in the observability market were looking to consolidate tooling in 2024.
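The single-pipeline idea can be sketched in miniature: every record enters through one path, passes shared transforms such as redaction, and is routed to a destination by matching rules, instead of each tool running its own ingestion path. This is a conceptual toy, not the actual OpenPipeline API; the class and method names below are invented for illustration.

```python
class IngestPipeline:
    """A toy single-pipeline router for observability data.

    Records flow through shared transforms (e.g. redaction), then
    the first matching route decides which bucket receives them.
    Illustrative only -- not Dynatrace's OpenPipeline API.
    """

    def __init__(self):
        self.transforms = []  # functions applied to every record
        self.routes = []      # (predicate, bucket_name) pairs
        self.buckets = {}     # bucket_name -> list of records

    def transform(self, fn):
        self.transforms.append(fn)
        return self

    def route(self, predicate, bucket: str):
        self.routes.append((predicate, bucket))
        self.buckets.setdefault(bucket, [])
        return self

    def ingest(self, record):
        """Run transforms, then deliver to the first matching bucket.

        Returns the bucket name, or None if the record was dropped
        by a transform or matched no route.
        """
        for fn in self.transforms:
            record = fn(record)
            if record is None:  # a transform dropped the record
                return None
        for predicate, bucket in self.routes:
            if predicate(record):
                self.buckets[bucket].append(record)
                return bucket
        return None
```

The consolidation benefit in this sketch is that policies such as masking sensitive fields are written once at the front of the pipeline rather than re-implemented in every downstream tool.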

“I suffer myself from tool sprawl in my team,” Bernd Greifeneder, chief technology officer at Dynatrace, said during his keynote remarks on Wednesday. “I’ve spoken to customers who have over 100 tools just for a domain like security.”

This problem has been amplified for businesses seeking to manage AI deployments in cloud-native environments. An example can be found in the case of albelli-Photobox Group, a digital-only provider of photo printing and gifting services.

Photobox’s platform was built on a custom technology stack powered by EC2 on Amazon Web Services Inc. and microservices running on Kubernetes. The company had to rely on multiple monitoring and logging solutions, so it turned to Dynatrace for AIOps capabilities that would reduce both the number of tools needed and the time to implement workflows across a complex stack.

“We see our engineers wanting to run workloads across different technologies,” Alex Hibbitt, an engineering director at Photobox, told SiliconANGLE. “There is a huge amount of tooling that is all quite different in terms of how AI solves problems.”

Measured AI adoption

For a company such as Photobox, the adoption of AI and technology to manage it was a given. That is not necessarily the case in the highly regulated banking industry where data sharing can be limited.

TD Bank has been using Dynatrace’s observability tools to monitor a massive amount of its own data. This involves the measurement of 19.5 billion critical customer transactions annually, according to Chris Conklin, a technology executive in enterprise monitoring at TD Bank, who spoke at Perform.

The bank relies on this capability to spot problems and take rapid action before they become irritants for customers. TD Bank is evaluating future use of generative AI, but data controls remain the primary focus.

“We as a bank struggle with a little bit of the AI space because of the risk,” Conklin said in an exclusive interview with SiliconANGLE. “We have to protect the data. We’re scratching the surface, there are a lot of test beds.”

This linkage between data and AI is driving a rising number of use cases. Yet one Microsoft Corp. executive sounded a note of caution that a mushrooming portfolio of AI applications may not be the best course of action for many firms.

“I see a lot of customers that are very excited about AI and have 65 use cases,” said Eve Psalti, senior director of artificial intelligence at Microsoft Corp. “Well, you may not want 65 use cases. Start small.”

Dynatrace’s announcements this week highlighted a continued mission in the enterprise world to move beyond the “black box” of opaque AI. As companies increasingly integrate new AI applications into business models, the need for visibility and transparency becomes more critical, and this will drive competition.

“2024 is shaping up to be a landmark year in the observability space,” said Strechay. “Dynatrace’s strategic moves, reflected in their recent announcements, are in line with where we see the market is going: a platform-centric observability market. As we continue to monitor these developments, it’s clear that the race for the ultimate observability platform is far from over.”

Photo: Mark Albertson/SiliconANGLE
