AI PROJECT CONSOLIDATION
What’s hot: The year ahead will see efforts to rein in the huge volume of AI projects now proliferating outside the scope of IT departments. “IT leaders are being called in to fix or unify fragmented, business-led AI projects, signaling a clear shift toward CIOs—like myself,” said Shelley Seewald, CIO at Tungsten Automation. The onus is on IT leaders and managers to be “more involved much earlier in shaping AI strategy and governance. Time and time again, we see CIOs being thrown into efforts to stabilize AI projects that were launched outside of IT—often stemming from pressure by business units looking to innovate quickly,” she said. “That’s when the CIO gets the call to clean up, unify, and retrofit governance and integration.”
Current status: Just getting started: “It provides a huge opportunity for CIOs to reclaim a leadership role in innovation, not just execution,” said Seewald. “I like to think of it as bringing structure to the chaos by focusing on business-first AI and avoiding the big, shiny tool in front of me.”
Potential roadblocks: Seewald doesn’t see any downsides or roadblocks to this framework, “as it provides a great opportunity to elevate the role of IT from being reactive to proactive. When we take the time to slow things down, we can avoid the buildup of technical debt that brings unexpected roadblocks from rushed experimentation, allowing us to actually move faster in the long run. I strongly emphasize ‘go slow to go fast.’ It’s easy to just speed ahead with any new technology, but remaining focused first on clarity, alignment, and strategic intent wins competitive advantage.”
HYBRID AI
What’s hot: Generative AI (GenAI) architectures are becoming inherently hybrid: “relational in integrity, semantic in intelligence,” said Allen Terleto, VP of partners and alliances at Cockroach Labs. “The transition from read-only RAG [retrieval-augmented generation] to agentic, read-write GenAI will mark a fundamental shift in how systems reason and act. These next-generation architectures will transact on knowledge—not just retrieve it.” In the process, “AI will cease to be a standalone solution or a bolt-on layer of intelligence.”
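To make the read-write distinction concrete, here is a minimal Python sketch: a retrieval step (the read-only RAG part) followed by a decision written back inside a transaction. It uses the standard library’s sqlite3 as a stand-in transactional store and a naive cosine-similarity lookup in place of a real vector index; the table names, toy embeddings, and agent action are hypothetical, not drawn from any specific product.

```python
# Illustrative sketch: an "agentic" step that retrieves context, decides,
# and writes the decision back inside a single transaction.
# sqlite3 stands in for a transactional (OLTP) store; the cosine lookup
# stands in for a real vector index. All names here are hypothetical.
import sqlite3
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE facts (id INTEGER PRIMARY KEY, text TEXT)")
conn.execute("CREATE TABLE decisions (id INTEGER PRIMARY KEY, fact_id INTEGER, action TEXT)")
conn.executemany("INSERT INTO facts (id, text) VALUES (?, ?)",
                 [(1, "Order 42 is delayed"), (2, "Customer prefers email")])

# Toy "embeddings" keyed by fact id; a real system would store vectors in the database.
embeddings = {1: [0.9, 0.1, 0.0], 2: [0.1, 0.8, 0.3]}
query_vec = [0.85, 0.2, 0.05]  # embedding of "why is order 42 late?"

# Retrieve: pick the most relevant fact (read-only RAG would stop here).
best_id = max(embeddings, key=lambda i: cosine(embeddings[i], query_vec))

# Act: write the agent's decision back transactionally (the read-write step).
with conn:  # commits on success, rolls back on exception
    conn.execute("INSERT INTO decisions (fact_id, action) VALUES (?, ?)",
                 (best_id, "notify customer of delay"))

print(conn.execute("SELECT fact_id, action FROM decisions").fetchall())
```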
Current status: The convergence toward hybrid AI is accelerating, Terleto said. “[Witness] Databricks’ agreement to acquire Neon, the serverless PostgreSQL platform, for roughly $1 billion. This highlights the push to bring transactional integrity into AI-native ecosystems.” Across the data world, “Vector search has become table stakes, signaling a structural shift in the data landscape.”
Potential roadblocks: “Regulatory and audit pressures will intensify as AI agents begin to act, not just infer,” said Terleto. “Operational AI will face the same scrutiny as financial systems, with regulators demanding explainability, traceability, and strict data governance for agentic actions. Enterprises will need to embed observability, data lineage, and deterministic rollback into their AI and OLTP stacks to meet audit standards without stifling innovation.”
In addition, he noted, “skills and operating models continue to lag behind ambition. Many teams lack hands-on experience running AI and OLTP patterns safely at scale.”
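As a rough illustration of what “deterministic rollback” with data lineage might look like at the application level, the hedged Python sketch below uses sqlite3: an agent action and its lineage record commit together only if a governance check passes; otherwise the whole transaction rolls back. The policy rule, table names, and amounts are assumptions for illustration, not a description of any particular product.

```python
# Illustrative sketch: an agent action is committed only if a governance
# check passes; otherwise the whole transaction rolls back and nothing
# (including the lineage record) is persisted. Names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.execute("CREATE TABLE lineage (id INTEGER PRIMARY KEY AUTOINCREMENT, "
             "actor TEXT, source TEXT, action TEXT)")
conn.execute("INSERT INTO accounts (id, balance) VALUES (1, 100.0)")

def policy_allows(amount):
    # Stand-in governance rule: agents may not move more than 50 per action.
    return amount <= 50

def agent_transfer(amount):
    try:
        with conn:  # one atomic unit: action plus lineage, or neither
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = 1", (amount,))
            conn.execute("INSERT INTO lineage (actor, source, action) VALUES (?, ?, ?)",
                         ("refund-agent", "model output", f"debit {amount}"))
            if not policy_allows(amount):
                raise ValueError("governance check failed")
    except ValueError:
        pass  # rolled back; the account and lineage tables are unchanged

agent_transfer(80)   # blocked and rolled back
agent_transfer(30)   # allowed and committed
print(conn.execute("SELECT balance FROM accounts WHERE id = 1").fetchone())  # (70.0,)
```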
INTELLIGENT CONVERGENCE
What’s hot: The year ahead will see a convergence of emerging intelligent data infrastructure innovations—tiered storage, vector search, and real-time columnar analytics—predicted Anil Inamdar, global head of data services at NetApp Instaclustr. “Specifically, tiered storage that automatically moves data between hot and cold tiers, vector search that integrates into the databases businesses are already using, and more growth of real-time columnar analytics. Together, these architectures will help collapse the cost/performance trade-off. In 2026, I expect we’ll see a lot more orgs realize they don’t need separate systems for real-time, analytical, and AI workloads.”
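As a rough sketch of the tiered-storage pattern Inamdar describes, the following Python class demotes items that have not been accessed recently from a hot in-memory tier to a cold tier and promotes them back on read. The TTL threshold, in-memory backends, and class name are simplified assumptions for illustration.

```python
# Illustrative sketch of automatic hot/cold tiering: recently accessed
# items live in a fast in-memory dict; stale items are demoted to a
# slower "cold" store. Thresholds and backends are simplified assumptions.
import time

class TieredStore:
    def __init__(self, hot_ttl_seconds=60.0):
        self.hot = {}          # key -> (value, last_access); the fast tier
        self.cold = {}         # stand-in for object storage / cheaper tier
        self.hot_ttl = hot_ttl_seconds

    def put(self, key, value):
        self.hot[key] = (value, time.monotonic())

    def get(self, key):
        if key in self.hot:
            value, _ = self.hot[key]
        else:
            value = self.cold.pop(key)       # promote on access
        self.hot[key] = (value, time.monotonic())
        return value

    def demote_stale(self):
        now = time.monotonic()
        for key in list(self.hot):
            value, last = self.hot[key]
            if now - last > self.hot_ttl:
                self.cold[key] = value       # move to the cheaper tier
                del self.hot[key]

store = TieredStore(hot_ttl_seconds=0.1)
store.put("q3_report", b"...")
time.sleep(0.2)
store.demote_stale()                     # q3_report now sits in the cold tier
print("q3_report" in store.cold)         # True
print(store.get("q3_report") == b"...")  # True, and promoted back to hot
```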
Current status: Emerging but accelerating rapidly. “Some industries where milliseconds are most directly tied to dollars—financial services and retail, in particular—are already running these architectures in production,” said Inamdar.
Potential roadblocks: “As with just about anything in the data stack, there’s operational complexity in ensuring you’re getting the most out of the technology you’re choosing,” said Inamdar. “Each piece requires specialized expertise, whether that’s vector indexing, columnar data modeling, or tiered storage tuning. Sequence adoption carefully, starting with high-impact but low-risk implementations before attempting full-on architectural transformation.”
THE RISE OF CONTEXT
What’s hot: There’s a shift underway “toward valuing data not only for the knowledge it stores, but for what it remembers,” said John Capello, VP of strategy at Nasuni. “AI doesn’t just need raw data; it needs context from historical data and stored knowledge.” Version history, in particular, represents data with context. “We’re beginning to recognize that context-rich, time-stamped data, particularly unstructured data, offers unparalleled insight into how knowledge evolves over time.” Early adopters who have already built an infrastructure capable of tracking, versioning, and indexing their unstructured data “will find themselves ahead, armed with unique, high-value datasets.”
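As a small illustration of the “data memory” idea, the sketch below stores every save of a document as a time-stamped version in sqlite3 so that the history itself can be queried as context. The schema and helper functions are hypothetical and are not a description of Nasuni’s implementation.

```python
# Illustrative sketch: keep every version of a document with a timestamp
# so the history itself can be queried as context. The schema is a
# hypothetical illustration, not any vendor's implementation.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE versions ("
             "path TEXT, version INTEGER, saved_at TEXT, content TEXT, "
             "PRIMARY KEY (path, version))")

def save(path, content):
    row = conn.execute("SELECT MAX(version) FROM versions WHERE path = ?", (path,)).fetchone()
    next_version = (row[0] or 0) + 1
    conn.execute("INSERT INTO versions VALUES (?, ?, ?, ?)",
                 (path, next_version, datetime.now(timezone.utc).isoformat(), content))
    conn.commit()

def history(path):
    # Returns the full evolution of a file: context an AI system can mine.
    return conn.execute("SELECT version, saved_at, content FROM versions "
                        "WHERE path = ? ORDER BY version", (path,)).fetchall()

save("plans/roadmap.txt", "v1: ship feature A")
save("plans/roadmap.txt", "v2: ship feature A, defer feature B")
for version, saved_at, content in history("plans/roadmap.txt"):
    print(version, saved_at, content)
```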
Current status: “The idea of data memory as competitive advantage is still being uncovered,” said Capello. “Most enterprises are adept at collecting data, but few have developed the infrastructure or mindset to harness historical and contextual layers of that data for strategic purposes. They treat unstructured data as static: something to be stored, queried, and reported on, rather than as an evolving record of how work actually happens.”
Potential roadblocks: “There’s the challenge of infrastructure and cost,” said Capello. “Capturing billions of restore points requires scalable, cloud-native storage solutions capable of handling massive data throughput without sacrificing accessibility or security.” Data governance and compliance also represent major hurdles, he added. “Version histories often contain sensitive or proprietary information, and maintaining them securely while ensuring regulatory compliance demands sophisticated policies and tooling.”