EnterpriseDB (EDB), the leading sovereign AI and data company, is releasing a suite of validated performance efficiencies within EDB Postgres AI (EDB PG AI), designed to drastically reduce data center power consumption, lower token usage, and deliver an unprecedented "intelligence per watt" standard for the enterprise.
"The AI energy conversation has been about what happens with the models and GPUs. Almost nobody is talking about what happens at the data layer that every agent, every model, every inference call depends on," said Quais Taraki, CTO at EDB. "You can't control consumption at the model layer. Agents consume what they consume. But you can control efficiency at the data layer, and for most enterprises, that's the only lever they actually have."
EDB PG AI addresses the agentic energy challenge on two complementary fronts: first, by shrinking the core infrastructure footprint required to run enterprise applications; and second, by making the data-layer operations that power agentic AI—especially search, retrieval, and vector indexing—far more efficient. Together, those gains reduce the amount of infrastructure enterprises need and improve how effectively that infrastructure is used per unit of energy, according to the company.
At the infrastructure level, EDB PG AI helps enterprises reduce the servers and cores required to run applications, lowering both data center energy use and emissions.
At the workload level, EDB is targeting one of the most underappreciated drivers of AI energy cost: the intensive data-layer operations generated as agents create databases, adjust queries, and move data across enterprise environments 24/7/365.
Building and maintaining vector indexes is among the most resource-intensive activities in modern databases—and one that scales directly with the number of agents in production.
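To make that cost concrete, here is a minimal illustration (not EDB's implementation, and using toy data): brute-force vector retrieval compares a query against every stored embedding, so per-query work grows linearly with corpus size—the scaling that vector indexes such as HNSW exist to avoid, at the price of an expensive index build.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def brute_force_search(query, corpus, k=3):
    # Score every vector in the corpus: O(n * d) work per query.
    # With many agents issuing retrievals around the clock, this
    # linear cost is why indexes (and their build/maintenance cost)
    # dominate the data layer's energy profile.
    scored = sorted(corpus.items(), key=lambda kv: cosine(query, kv[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

corpus = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}
print(brute_force_search([1.0, 0.0, 0.0], corpus, k=2))
```

An approximate index trades a one-time, compute-heavy build for cheap lookups; when autonomous agents create and rebuild indexes continuously, that build cost recurs, which is the efficiency lever the platform targets.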
Additionally, EDB PG AI establishes an "intelligence per watt" standard that global enterprises can use to measure, improve, and operationalize AI efficiency at scale as autonomous systems create more databases, pipelines, and queries over time, the company said.
The platform is built around three principles that compound as agentic workloads scale:
Measure: Quantify the energy and infrastructure cost per unit of AI intelligence produced, extending the Incendium-validated methodology to agentic, RAG, and multi-agent workloads.
Optimize: Reduce compute, storage, and network demand per AI operation through database consolidation, storage tiering, query acceleration, vector indexing, and token reduction.
Govern: Maintain visibility and control over data layer operations as autonomous agents create databases, indexes, pipelines, and queries at machine speed.
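As a sketch of the "Measure" principle—with hypothetical numbers and a metric definition not published by EDB—intelligence per watt can be framed as useful AI operations completed per unit of energy consumed at the data layer:

```python
def intelligence_per_watt(ai_operations, energy_kwh):
    # Hypothetical metric: useful AI operations (e.g. completed
    # retrievals or inference calls) per kilowatt-hour consumed
    # at the data layer. Higher is better.
    if energy_kwh <= 0:
        raise ValueError("energy_kwh must be positive")
    return ai_operations / energy_kwh

# Illustrative comparison: the same workload before and after
# data-layer optimization (consolidation, tiering, better indexing).
baseline = intelligence_per_watt(ai_operations=1_000_000, energy_kwh=500)
optimized = intelligence_per_watt(ai_operations=1_000_000, energy_kwh=200)
print(baseline, optimized)  # → 2000.0 5000.0
```

Under this framing, the "Optimize" lever raises the ratio by shrinking the denominator for a fixed workload, while "Govern" keeps the numerator honest as agents multiply operations at machine speed.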
"Enterprises succeeding with AI at scale are 275% more likely to prioritize energy-efficient data infrastructure than the rest of the market. They're also seeing 5x the ROI. That's the connection most of this industry is missing. This idea of 'intelligence per watt' isn't just an environmental metric—it's a performance indicator. The companies getting the most from AI are the ones demanding the most from their data layer," said Kevin Dallas, CEO of EDB.
For more information about this news, visit www.enterprisedb.com.