Dynatrace Bolsters Platform with New Technologies for AI, Data, and Pipeline Management

Dynatrace, the unified observability and security provider, is debuting a set of enhancements and core technologies for its analytics and automation platform, designed to increase the security and quality of the platform's analytics, AI, and automation capabilities. The release centers on three advancements: AI Observability, Data Observability, and OpenPipeline. Together, they underscore the company's focus on improving the confidence and transparency of AI initiatives.

With the popularity of GenAI and LLMs increasing exponentially, organizations looking to implement these technologies must reckon with the major complexities they bring, such as the risk of unpredictable behavior, poor end-user experiences, and hallucinations.

Dynatrace AI Observability for large language models (LLMs) and generative AI (GenAI) is a comprehensive solution that leverages Dynatrace’s hypermodal Davis AI and other technologies to deliver a holistic and accurate view of AI-powered apps.

This level of visibility allows enterprises to automatically identify performance bottlenecks and root causes, as well as drive compliance with privacy, security, and governance standards through output origin tracing. AI Observability also empowers businesses to predict and control costs by monitoring token consumption (tokens are the basic units GenAI models use to process queries).
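To make the token-cost idea concrete, here is a minimal, purely illustrative sketch of token-based cost estimation. The per-token prices and the four-characters-per-token heuristic are assumptions for illustration; real deployments use the model provider's tokenizer (such as tiktoken for GPT models) and actual pricing.

```python
# Hypothetical illustration of token-based cost estimation for a GenAI model.
# Real tokenizers split text into subword tokens; here we approximate with a
# crude rule of roughly 4 characters per token. Prices below are invented.

PRICE_PER_1K_INPUT = 0.01   # hypothetical USD per 1,000 prompt tokens
PRICE_PER_1K_OUTPUT = 0.03  # hypothetical USD per 1,000 completion tokens

def approx_token_count(text: str) -> int:
    """Rough heuristic: about one token per 4 characters of English text."""
    return max(1, len(text) // 4)

def estimate_query_cost(prompt: str, completion: str) -> float:
    """Estimate the cost of a single LLM query from approximate token counts."""
    input_tokens = approx_token_count(prompt)
    output_tokens = approx_token_count(completion)
    return (input_tokens / 1000) * PRICE_PER_1K_INPUT + \
           (output_tokens / 1000) * PRICE_PER_1K_OUTPUT

cost = estimate_query_cost("Summarize our Q3 incident report.", "The report shows...")
print(f"Estimated cost: ${cost:.6f}")
```

Aggregating such per-query estimates over time is what lets a monitoring platform forecast spend before the provider's invoice arrives.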

AI Observability is an end-to-end solution spanning the full AI stack: infrastructure such as NVIDIA GPUs, foundation models like GPT-4, semantic caches and vector databases, and orchestration frameworks. The solution also supports the major platforms for building, training, and delivering AI models, including Microsoft Azure OpenAI Service, Amazon SageMaker, and Google AI Platform.

“If you look at most [AI observability] solutions, they focus very much on the infrastructure component—which we have in there, as well—but they don't put that information into the context that I would need to make those decisions,” said Alois Reitbauer, chief technology strategist at Dynatrace. “We can provide a unique view that combines the business view and the technology view and help you to tweak and tune on a much more fine-grained level…[our] uniqueness really comes from this end-to-end visibility where we combine the technical and the business aspects.”

Data Observability further drives confidence in AI, enabling business units to ensure that the data within the Dynatrace platform is of high quality. Working in tandem with other core Dynatrace platform technologies—including Davis AI—Data Observability provides:

  • Freshness checks that ensure data for analytics and automations is up-to-date, alerting users to any delays or anomalies
  • Monitoring for unplanned changes in data volume, such as any increases, decreases, or gaps, that may reflect undetected issues
  • Monitoring for patterns, deviations, or outliers in datasets
  • Trackable data structure that alerts to unexpected changes, including new or deleted fields
  • Data lineage that reveals root-cause details about where data originates and which services will be impacted downstream
  • Observable digital services’ usage of servers, networks, and storage, driving greater availability
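The first, second, and fourth checks above (freshness, volume, and structure) can be sketched in a few lines. This is an illustrative outline of the general technique, not Dynatrace's implementation; thresholds and field names are assumptions.

```python
# Illustrative data-observability checks: freshness, volume anomalies,
# and schema (structure) changes. Not Dynatrace's actual implementation.
from datetime import datetime, timedelta, timezone

def check_freshness(last_arrival: datetime, max_age: timedelta) -> bool:
    """Flag data as stale if the newest record is older than max_age."""
    return datetime.now(timezone.utc) - last_arrival <= max_age

def check_volume(current_count: int, baseline: int, tolerance: float = 0.5) -> bool:
    """Flag unexpected increases or decreases relative to a historical baseline."""
    if baseline == 0:
        return current_count == 0
    return abs(current_count - baseline) / baseline <= tolerance

def check_schema(fields: set, expected: set) -> tuple:
    """Return newly added and deleted fields relative to the expected schema."""
    return fields - expected, expected - fields

added, deleted = check_schema({"host", "cpu", "region"}, {"host", "cpu", "mem"})
print(added, deleted)  # the fields that appeared and disappeared
```

In practice, each check would run continuously against incoming streams and feed its alerts into the same anomaly-detection layer (Davis AI, in Dynatrace's case) that monitors application health.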

“Where Data Observability comes into play is really understanding how the data works—and we are already using this internally at Dynatrace,” said Reitbauer. “As we opened up the Dynatrace platform where you can send any type of third-party data…we realized that some data points do not make sense or the data distribution changes significantly, or we see a lot of missing data. [Customers] want to know this because [there are] most likely going to be other processes in their company that rely on the accuracy of the data and also the data analysis.”

The final addition to Dynatrace’s platform, OpenPipeline, is designed to give users a single pipeline for managing petabyte-scale data ingestion. This not only improves the overall security of Dynatrace’s analytics, AI, and automation but also drives greater cost-effectiveness.

OpenPipeline gives customers greater visibility into and control over the data being ingested into the Dynatrace platform, all while ensuring that the context of the data and its cloud ecosystem remains intact. According to the company, the solution evaluates data streams five to 10 times faster than legacy technologies, enabling better management of the voluminous, dynamic data coming from customers’ hybrid and multi-cloud environments.

OpenPipeline works with other Dynatrace technologies—including the Grail data lakehouse, Smartscape topology, and Davis AI—to resolve the various challenges associated with creating unified pipelines for data management. By offering petabyte-scale data analytics, unified data ingest, real-time data analytics on ingest, full data context, data privacy and security controls, and cost-effective data management, OpenPipeline further expands the Dynatrace platform’s utility and value.
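As a rough illustration of what evaluating data on ingest can look like, the sketch below masks sensitive values in records before storage while leaving context fields untouched. The field names, the email-only masking rule, and the single-stage design are assumptions for the example; OpenPipeline's actual processing model is not public in this level of detail.

```python
# Hedged sketch of a generic ingest-pipeline stage: records are evaluated
# once on ingest, sensitive values are masked, and context metadata (e.g.,
# the originating host) is preserved. Field names are illustrative only.
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def mask_pii(record: dict) -> dict:
    """Redact email addresses in string fields before the record is stored."""
    return {k: EMAIL.sub("<redacted>", v) if isinstance(v, str) else v
            for k, v in record.items()}

def ingest(records: list) -> list:
    """Single-pass evaluation on ingest: mask PII, keep context fields intact."""
    return [mask_pii(r) for r in records]

out = ingest([{"msg": "login failed for alice@example.com", "host": "web-01"}])
print(out[0]["msg"])   # login failed for <redacted>
print(out[0]["host"])  # web-01
```

Applying such rules once, at ingest, is what allows privacy and security controls to hold across every downstream analytics and automation consumer of the data.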

“One of the key differentiators…is simply the scale of data processing [that OpenPipeline provides] ...Most data pipelines, at some point, are limited by the capacity that they can process,” explained Reitbauer. “[OpenPipeline’s] flexibility of how you can work with this data, combined with a massive scale of processing of those data pipelines—because you don't want your data pipelines to eventually become the bottleneck for data ingestion—[is crucial].”

Dynatrace AI Observability is now available for all Dynatrace customers. Data Observability and OpenPipeline are expected to be generally available within 90 days of this announcement.

To learn more about Dynatrace and its latest advancements, please visit