The evolution of data architecture is accelerating. In 2025, 85% of DBTA subscribers reported plans to modernize their data platforms—driven largely by the explosive rise of GenAI and large language models.
Modernization has shifted from a strategic advantage to a business necessity. To power AI-driven innovation, organizations need intelligent, interconnected data platforms that unify information, enable real-time insights and scale with agility.
DBTA recently held Top Trends in Modern Data Architecture for 2026, a dynamic roundtable webinar with leading technology innovators exploring the key trends defining modern data architecture in the year ahead.
Data management is strategic and complex, said Errick Coughlin, chief architect at Informatica. Modern data architecture needs to be:
- Modular and flexible
- Unified and interoperable
- Versatile in its integrations
- Production-oriented
- AI- and agent-driven
Informatica offers the Intelligent Data Management Cloud, a unified data ecosystem for AI and analytics. The platform delivers easy hybrid and multi-cloud integration with broad connectivity across apps and data sources, Coughlin noted, extending trust and metadata across the enterprise.
There are 4 critical layers of GenAI, explained Conor Jensen, field CDO at Dataiku:
- The generative model layer, where the AI model is trained, validated, and fine-tuned based on the use case and data.
- The feedback and improvement layer, which utilizes user feedback and interaction analysis to identify errors and provide corrective inputs, enabling models to learn.
- The deployment and integration layer, which focuses on setting up the infrastructure (computing resources, model serving, security) and integrating the model with front-end and back-end systems.
- The monitoring and maintenance layer, which tracks performance metrics (accuracy, precision, recall) and involves retraining or updating models as needed.
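The monitoring and maintenance layer's metric tracking can be sketched in a few lines. This is a minimal, illustrative example of computing accuracy, precision, and recall from predictions versus labels; the recall floor and retraining trigger are hypothetical policy choices, not anything prescribed by Dataiku.

```python
# Minimal sketch of GenAI monitoring-layer metric tracking.
# The 0.8 recall floor is an illustrative, hypothetical policy.

def classification_metrics(y_true, y_pred):
    """Return (accuracy, precision, recall) for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    accuracy = correct / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return accuracy, precision, recall

def needs_retraining(recall: float, floor: float = 0.8) -> bool:
    # Hypothetical policy: retrain when recall drifts below the floor.
    return recall < floor

acc, prec, rec = classification_metrics([1, 0, 1, 1], [1, 0, 0, 1])
print(acc, prec, rec)  # → 0.75 1.0 0.6666666666666666
```

In practice these numbers would be computed on a rolling window of production traffic and fed back into the feedback-and-improvement layer.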
The 3 critical failure points include rigid architecture failure, model selection failure, and security and cost governance failure, Jensen said.
There are 3 critical success points to combat these failures:
- Flexibility is key: As models evolve at breakneck speed, your architecture must adapt without rebuilding. Connect new foundation models to existing layers without disrupting operations.
- Knowing the best LLM for each use case: Match models to business needs, whether API-based or locally hosted for sensitive data. Systematic evaluation across your architecture layers optimizes both performance and cost.
- Full control over usage and cost: Balance centralized governance with innovation. Implement controls that provide visibility and predictability while still enabling teams to move faster across all four layers.
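The second and third points, matching models to use cases while keeping visibility over spend, can be sketched as a simple routing layer. Everything here is an illustrative assumption: the model names, prices, use cases, and the rule that sensitive data must stay on a locally hosted model are hypothetical, not any vendor's actual catalog or policy.

```python
# Minimal sketch of per-use-case LLM routing with cost tracking.
# Model names, prices, and the routing policy are hypothetical.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ModelChoice:
    name: str
    locally_hosted: bool       # True → data stays on-prem
    cost_per_1k_tokens: float

REGISTRY = {
    # use case → preferred model (illustrative entries)
    "customer-support": ModelChoice("hosted-small", True, 0.0),
    "marketing-copy": ModelChoice("api-large", False, 0.01),
}

@dataclass
class Router:
    spend: dict = field(default_factory=dict)

    def route(self, use_case: str, sensitive: bool) -> ModelChoice:
        choice = REGISTRY[use_case]
        # Governance rule: sensitive data never goes to an API model.
        if sensitive and not choice.locally_hosted:
            choice = ModelChoice("hosted-small", True, 0.0)
        return choice

    def record(self, choice: ModelChoice, tokens: int) -> float:
        # Track spend per model for centralized visibility.
        cost = choice.cost_per_1k_tokens * tokens / 1000
        self.spend[choice.name] = self.spend.get(choice.name, 0.0) + cost
        return cost

router = Router()
m = router.route("marketing-copy", sensitive=True)
print(m.name)  # → hosted-small (sensitive request redirected locally)
```

The design point is that routing and metering live in one place, so teams can add new models to the registry without touching the governance logic.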
According to Clive Bearman, senior director of product marketing at Qlik, there are 3 trends in data architecture for 2026.
- Real-Time/Streaming Data
- Canonical Data Lakehouse Architecture
- Trusted Data Products for AI
“Agentic AI will transform data engineering from a world of brittle pipelines and reactive tasks into autonomous, goal driven systems that will (eventually) monitor, diagnose, fix, and optimize themselves... The result is faster delivery, fewer incidents, and lower TCO—without sacrificing control or compliance,” Bearman said.
Modern enterprises have invested billions in data infrastructure, yet their AI initiatives hit a wall: most organizational data remains invisible to AI systems. This creates a critical gap between AI potential and AI reality, explained Jerod Johnson, director, technology evangelism at CData Software.
CData Software bridges this divide with architecture designed specifically for the security requirements and scale demands of enterprise AI deployment.
Across enterprise deployments, successful AI data architectures share common characteristics that emerge from organizations that have solved the problem at scale. They include the following steps:
- Centralize around a lakehouse: Choose the platform that fits your existing stack—Fabric, Snowflake, Databricks—and make it the unified source of truth for analytics and AI consumption.
- Bridge legacy, don't replace: Incrementally connect existing systems to modern architecture. You don't need to rip and replace working systems to enable AI access.
- Security enables, not blocks: The right architecture eliminates trade-offs between security and speed. Governed access should accelerate deployment, not slow it down.
- One architecture for BI and AI: Use the same unified data layer, the same security model, and the same governance framework for both dashboards and AI agents.
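The last point, one governed data layer serving both dashboards and AI agents, can be sketched as a single enforcement function that every consumer calls. The table names, roles, and policy below are illustrative assumptions, not CData's actual implementation.

```python
# Minimal sketch of "one architecture for BI and AI": a dashboard and an
# AI agent share the same governed access layer. Roles, tables, and the
# policy are hypothetical.

POLICY = {
    "analyst": {"sales", "inventory"},
    "agent": {"sales"},  # AI agents receive a narrower grant
}

DATA = {
    "sales": [{"region": "EMEA", "total": 120}],
    "inventory": [{"sku": "A-1", "on_hand": 40}],
}

def governed_read(principal: str, table: str):
    """Single enforcement point shared by dashboards and AI agents."""
    if table not in POLICY.get(principal, set()):
        raise PermissionError(f"{principal} may not read {table}")
    return DATA[table]

# Both consumers make the identical call — same security model,
# same governance framework.
print(governed_read("analyst", "inventory"))
print(governed_read("agent", "sales"))
```

Because access decisions sit in one layer, adding an AI agent does not require a second, parallel security model alongside the BI one.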
For the full webinar, featuring a more in-depth discussion, Q&A, and more, you can view an archived version here.