How AI is Changing the Conversation Around Data Architecture for 2026


Right now, AI technologies are not just enhancing existing technology portfolios—they are reshaping how organizations think about modern data architecture, elevating modernization from a competitive advantage to an essential business requirement. 

To build integrated AI-enabling infrastructure stacks, organizations are evaluating and adopting a variety of supporting components that collectively enable the diverse data integration, contextual understanding, and flexible processing capabilities required. These components include semantic layers and unified data layers, data lakehouses, and data fabric, as well as active metadata and data catalogs.

DBTA recently held a webinar, "What's Ahead in Data Architecture for 2026," with several experts who discussed key technologies and trends paving the road ahead.

The future of data will be defined by organizations’ ability to make data reliably AI-ready, said Eric Carr, lead partner sales engineer, Fivetran.

Sixty percent of organizations are actively researching GenAI, including LLMs, RAG, and knowledge graphs, further reflecting a growing demand for real-time, AI-ready data.

Data lakehouses have emerged to solve many of the challenges associated with data lakes turned swamps. Improved data lake management and maintenance are reviving data lakes by:

  • Addressing management and compliance challenges through open table formats
  • Offering scalable, affordable storage for large data volumes
  • Interoperating with data warehouses, catalogs, and other modern data stack technologies
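The core idea behind open table formats can be sketched with a toy example: a table is a set of immutable data files plus versioned metadata snapshots that record which files belong to each table version, which is what enables consistent reads and time travel over plain object storage. All class, method, and file names below are illustrative only; real formats such as Apache Iceberg and Delta Lake use far richer metadata than this.

```python
# Toy illustration of how open table formats layer versioned metadata
# over immutable data files. Not any real format's specification.
import json
import os
import tempfile

class ToyTable:
    def __init__(self, root):
        self.root = root
        os.makedirs(root, exist_ok=True)

    def _snapshots(self):
        # Snapshot files sort by zero-padded version number.
        return sorted(f for f in os.listdir(self.root) if f.startswith("snap-"))

    def commit(self, rows):
        """Append rows as a new immutable data file plus a new snapshot."""
        version = len(self._snapshots())
        data_file = os.path.join(self.root, f"data-{version}.json")
        with open(data_file, "w") as f:
            json.dump(rows, f)
        # New snapshot = files of the previous snapshot + the new file.
        prev = self._files() if version else []
        snap = os.path.join(self.root, f"snap-{version:06d}.json")
        with open(snap, "w") as f:
            json.dump({"files": prev + [data_file]}, f)

    def _files(self, version=None):
        snaps = self._snapshots()
        snap = snaps[-1] if version is None else snaps[version]
        with open(os.path.join(self.root, snap)) as f:
            return json.load(f)["files"]

    def scan(self, version=None):
        """Read every row visible in the chosen snapshot (time travel)."""
        rows = []
        for path in self._files(version):
            with open(path) as f:
                rows.extend(json.load(f))
        return rows

root = tempfile.mkdtemp()
table = ToyTable(root)
table.commit([{"id": 1}])
table.commit([{"id": 2}])
latest = table.scan()        # both commits visible
as_of_v0 = table.scan(0)     # time travel to the first snapshot
```

Because data files are immutable and each snapshot is a complete file listing, any engine that can read the metadata can query the table, which is what makes these formats pluggable across warehouses, catalogs, and compute engines.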

The foundation for an AI-ready data architecture is leveraging an open data infrastructure, Carr explained. An open data infrastructure is:

  • Automated: Streamlining data management end-to-end.
  • Pluggable: Works with any compute engine, catalog, BI tool, or AI model.
  • Standards-based: Built on open formats and protocols for interoperability.
  • Flexible: Preserves optionality, avoids lock-in, and scales with future workloads.

According to Bharath Vasudevan, vice president of product and GTM, Quest, today's data management landscape is "siloed," with too many manual, time-consuming tasks and workflows performed by data modelers, data stewards, data analysts, and data engineers.

This process can be automated, he explained, with AI autonomously turning raw data into curated, analysis-ready "data products," accelerating insight.

2026 will be the year when companies move from "giving tools and assistants" to "building an agentic workforce," predicted Ronen Schwartz, CEO, K2view.

K2view offers data products that deliver just the right data to an agent, Schwartz said. The K2view Agentic Data Product Platform provides:

  • Real-time response at scale – millions of agents can reason and act quickly across vast amounts of enterprise data
  • Cost efficiency – AI tokens are spent on reasoning, not data integration
  • Security – agent access is limited to approved data products; everything else stays encrypted
  • Fresh data – the customer defines what needs to be up-to-date

For the full webinar, featuring a more in-depth discussion, Q&A, and more, you can view an archived version here.