
The Tools, Platforms, and Strategies Shaping Today’s Data Stacks


BEYOND CLOUD

While cloud will continue to dominate as the underpinning technology of choice, its growth appears to be plateauing. That's the word from Redgate's survey, which found a slight decline in the percentage of customers migrating to the cloud.

Those classifying themselves as hosting databases “mostly” or “all” in the cloud dropped from 36% in 2023 to 30% this year, the Redgate survey shows. Cost management was cited as the main challenge (63%). “While nearly every organization continues to have a cloud strategy, many IT teams have grown more knowledgeable about the cloud’s inherent limitations,” said Gorman. “They now recognize that every resource or service has built-in constraints to prevent excessive consumption, making cloud migrations particularly challenging for the largest databases.”

The plateauing of cloud is particularly evident in regulated industries such as financial services and healthcare, which are increasingly self-hosting critical applications and data, said Steve Zisk, senior product marketing principal at Redpoint Global. Due to security and privacy requirements, these organizations are "opting to keep customer or patient data behind the firewall in on-prem or private cloud deployments to reduce the risk of a data breach."

On-prem approaches “no longer have to sacrifice cloud-like agility and scalability,” said Zisk. “Tools like hyper-converged infrastructure, private clouds, and containerization technologies—Kubernetes—empower organizations to deploy and scale applications quickly while retaining full control over their environments. These advancements make on-premises solutions an attractive alternative to traditional SaaS platforms, combining the best of both worlds and allowing for the highest levels of data security.”
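To make the agility point concrete, consider how little code it takes to scale a containerized service on an on-prem Kubernetes cluster. The following is a minimal sketch using the official Kubernetes Python client; the deployment name and replica count are illustrative assumptions, not details from the article.

from kubernetes import client, config

# Load credentials from the local kubeconfig (~/.kube/config); this works
# the same whether the cluster runs on-prem or in a public cloud.
config.load_kube_config()

apps = client.AppsV1Api()

# Scale a hypothetical deployment to 5 replicas. The cluster's scheduler
# handles placement; there is no separate hardware-provisioning step.
apps.patch_namespaced_deployment_scale(
    name="orders-db-proxy",   # hypothetical deployment name
    namespace="default",
    body={"spec": {"replicas": 5}},
)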

There are many cases for keeping data systems on-prem or hybrid, said Shankar. “Critical workloads, regulatory, security, and cost considerations are important factors favoring on-premise and hybrid solutions. The advent of edge AI use cases has moved databases closer to the customers where inference can be done in real time. I have seen companies cost-optimize after they realize that long-term cloud costs—especially for high-throughput databases—can outweigh the benefits of elasticity.”

The industry in general "has now reached the understanding that not everything will move to the cloud, and especially not to proprietary DBaaS solutions," said Zaitsev. "I'm not sure if we've reached 'peak cloud,' but I think we're getting close."

Two fundamentally different approaches to cloud have emerged, Zaitsev continued. "One is proprietary cloud, the one which tends to be heavily promoted by cloud vendors, where we use as many proprietary, differentiated cloud functions as possible, creating vendor lock-in and giving the cloud vendor all the pricing power."

The other approach is commodity cloud, "where we use cloud building blocks, which are essentially the same across all the clouds, and deploy open source solutions on top, making it easy to change cloud vendors and also to run in multi-cloud and hybrid environments."
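In practice, the commodity-cloud approach often means coding to APIs that every provider, as well as on-prem software such as MinIO or Ceph, implements. As a minimal sketch using the boto3 library, with a hypothetical internal endpoint and bucket name, the same S3 calls can target AWS or an on-prem object store by changing only the endpoint and credentials:

import boto3

def make_s3_client(endpoint_url=None):
    # With no endpoint_url, boto3 talks to AWS S3 itself; with one, the
    # identical API calls go to any S3-compatible store (MinIO, Ceph,
    # or another cloud's S3-compatible service).
    return boto3.client("s3", endpoint_url=endpoint_url)

aws_s3 = make_s3_client()
onprem_s3 = make_s3_client("https://objects.internal.example:9000")  # hypothetical

# The application code is unchanged regardless of which client is used.
for s3 in (aws_s3, onprem_s3):
    s3.put_object(Bucket="analytics", Key="healthcheck.txt", Body=b"ok")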

With increasing understanding of cloud architecture comes a realization that “the largest systems may need to remain on-premises to maintain the necessary performance levels,” said Gorman. “This, in turn, can lead to smaller, interconnected systems being pulled back on-premises due to their reliance on these large databases and the gravitational force of their data.”

Companies face "different choices because they have different requirements," said Porter. "For most businesses, the public clouds are the obvious choice, especially if they can build the security perimeter they need in the public services. However, more and more, companies with isolation requirements are moving their data from on-premises to special parts of the cloud, either special cloud regions or locked-down compartments backed by the public cloud infrastructure. We see very few companies remaining totally on-premises."

DON’T IGNORE STORAGE

With the resurgence of on-prem or hybrid computing comes additional attention on storage requirements, both for data access and security. Storage goes hand in hand with any platform and application hosting considerations. "Potential cost benefits of keeping data behind the firewall can include limiting ingress/egress and storage charges for cloud data, along with indirect costs for monitoring and managing data compliance and usage," said Zisk. "By retaining control over privacy exposure, security, and usage patterns (i.e., who has the right to look at the data, use it, export it, etc.), an organization ensures more direct and often simpler data usage and management costs."

Object storage has become a critical component of this emerging architecture—“not a nice-to-have, but indispensable to the AI data pipeline,” Levens said.

Object storage is critical "to host evolving data lakes, fold into larger archiving and preservation workflows, and, of course, support on-demand, anytime dataset processing and retrieval wherever needed," said Levens. "These tools and workflows need to fit the way the solutions team adds and refines metadata management, which is critical in analyzing data at scale: organizations can tag and catalog data to enrich and prepare it for analysis and AI."
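As an illustration of the tagging and cataloging step Levens describes, the sketch below attaches metadata at upload time and revises tags later, using boto3 against any S3-compatible store; the bucket, key, and tag names are hypothetical:

import boto3

s3 = boto3.client("s3")  # or an on-prem S3-compatible endpoint, as above

# Attach descriptive metadata at upload time so catalog and AI-preparation
# jobs can discover and filter the dataset without opening it.
with open("q1.parquet", "rb") as f:
    s3.put_object(
        Bucket="data-lake",               # hypothetical bucket
        Key="claims/2025/q1.parquet",     # hypothetical key
        Body=f,
        Metadata={"source": "claims-system", "contains-pii": "true"},
    )

# Tags can be added or refined later without rewriting the object itself.
s3.put_object_tagging(
    Bucket="data-lake",
    Key="claims/2025/q1.parquet",
    Tagging={"TagSet": [{"Key": "ready-for-training", "Value": "false"}]},
)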

Decisions about hosting "your database technology and storage [are] driven by what serves the goals and mission best," said Levens. "Yet, for many customers, on-premises storage remains an essential element in business for its performance, cost, and security strengths. Many are adopting a hybrid approach, leveraging cloud for the scalability it offers and on-premises for its accessibility and long-term data lifecycle capabilities."
