Exploring The Key Pillars of a Modern Resilient Data Architecture

Page 2 of 5

The rise of digital organizations means greater attack surfaces for cybercriminals and more data put at risk. “Data is increasingly being stored across multiple clouds, SaaS applications, and the edge,” said W. Curtis Preston, chief technical evangelist at Druva. “This means the data center is no longer the center of data, which is creating more complexity and risk. Organizations have been faced with trying to keep pace with a relentless stream of cyberattacks. These challenges are driving a need to build more resilient architectures that can keep data secure and ensure recovery is fast and easy to manage when an attack does hit.”

The shifting of data between centralized and decentralized environments is another pervasive factor in the need for resiliency. Data gravity is constantly shifting from centralized to decentralized, said Tapan Patel, senior manager for data management at SAS. “This creates barriers and stresses as organizations struggle with conflicting data connectivity, integration, and governance needs. With growing data complexities, delivering reliable and trusted data has become more time-consuming, expensive, labor-intensive, and siloed.”

Data resiliency is essential in avoiding cascading failures, which are an ongoing threat to today’s highly networked enterprises. “This is when a small, local failure, propagating through different types of dependencies, takes down an entire system,” said Sush Apshankar, principal consultant for cognitive and analytics at ISG. For example, he said, overload occurs “when one cluster goes down and all its traffic moves to another cluster. A resilient architecture enables the business to be more self-sufficient and quicker to respond to changes.”
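The overload scenario Apshankar describes can be sketched in a toy simulation. This is a hypothetical illustration, not a model of any real system; the cluster names, loads, and capacities are invented:

```python
# Hypothetical sketch: how one cluster failure can cascade when its
# traffic is redistributed to survivors that lack spare capacity.

def redistribute(clusters, failed):
    """When `failed` goes down, spread its load evenly over the survivors.
    Any survivor pushed past capacity fails in turn, cascading."""
    load = clusters.pop(failed)["load"]
    while load and clusters:
        survivors = list(clusters)
        share = load / len(survivors)
        load = 0
        for name in survivors:
            clusters[name]["load"] += share
            if clusters[name]["load"] > clusters[name]["capacity"]:
                load += clusters[name]["load"]  # this cluster fails too
                del clusters[name]
    return clusters

clusters = {
    "a": {"load": 60, "capacity": 100},
    "b": {"load": 70, "capacity": 100},
    "c": {"load": 80, "capacity": 100},
}
# Cluster "a" fails: its 60 units split between b and c (30 each),
# pushing c past capacity, whose load then overwhelms b as well.
surviving = redistribute(dict(clusters), "a")
print(surviving)  # {} -- a small, local failure took down the whole system
```

With enough headroom on each surviving cluster (say, capacities of 200), the same failure is absorbed and the cascade never starts, which is exactly the self-sufficiency a resilient architecture buys.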

It’s important to look at data resiliency from both a user and a technology perspective. “From a user’s perspective, resiliency is typically defined by how well an application continues to perform in the event of unplanned interruptions,” said Carsten Baumann, director of strategic initiatives and solution architect at Schneider Electric. “From a hardware perspective, the network, storage, and compute platforms must be operational or provide redundancy levels that ensure the applications continue to perform. Looking further down the hardware stack, power is of the utmost criticality. Without it, none of the required services can be offered. These fundamental requirements are often overlooked.”

Support for real-time computing needs to be at the heart of data resiliency initiatives. “As users adopt online business apps, mobile apps, and streaming apps, harnessing real-time data is a top analytics requirement,” said Patel. The next generation of data architecture “needs to support real-time data processing by default. Traditional data architectures tend to lock up data assets in repositories, slowing down insights and application development.”
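The contrast Patel draws can be made concrete with a minimal sketch: instead of waiting for a batch job to sweep a repository, a streaming consumer emits an updated result after every event. The data values here are invented for illustration:

```python
# Hypothetical sketch: streaming (process-as-it-arrives) analytics.
# A generator yields a fresh result per event, rather than computing
# one answer after all data has landed in a repository.

def rolling_average(events):
    """Yield an up-to-date average after every incoming event."""
    total = count = 0
    for value in events:
        total += value
        count += 1
        yield total / count

readings = [10, 20, 30, 40]  # invented sensor readings
print(list(rolling_average(readings)))  # [10.0, 15.0, 20.0, 25.0]
```

A batch pipeline would produce only the final 25.0 after the data was at rest; the streaming version makes every intermediate insight available the moment its event arrives.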

A resilient data architecture “must be built around the ability to provide continuous service—no matter what,” said Carlos Rivero, vice president, data and analytics at GCOM, and former CDO for the state of Virginia. “This means that opportunities for networking or connectivity failures must be minimized, as they are the most common source of service interruptions.” Rivero also pointed to the importance of managing cloud for resiliency, noting that “care must be taken to choose multiple availability zones for backups and data storage, and these different zones must span multiple geographical regions with underlying infrastructure that is both mutually exclusive and redundant.”
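Rivero’s zone-placement advice can be sketched as a simple selection rule: prefer backup zones whose regions differ from the primary’s and from each other. The zone names below mimic common cloud naming conventions but are purely illustrative, and the one-letter-suffix parsing is an assumption for this sketch:

```python
# Hypothetical sketch: choosing backup availability zones that span
# mutually exclusive regions, so a region-level outage cannot take
# out all copies at once.

def pick_backup_zones(zones, primary, copies=2):
    """Pick up to `copies` zones, each from a region not yet used
    (including the primary's region)."""
    region = lambda z: z[:-1]  # assumed format: "us-east-1a" -> "us-east-1"
    chosen, used = [], {region(primary)}
    for z in zones:
        if len(chosen) == copies:
            break
        if z != primary and region(z) not in used:
            chosen.append(z)
            used.add(region(z))
    return chosen

zones = ["us-east-1a", "us-east-1b", "us-west-2a", "eu-west-1a"]
print(pick_backup_zones(zones, primary="us-east-1a"))
# -> ['us-west-2a', 'eu-west-1a']
```

Note that "us-east-1b" is skipped even though it is a distinct zone: it shares the primary’s region, so a regional failure there would hit both copies, which is precisely what Rivero warns against.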

Underestimating “region-level failure protection in public clouds can be problematic,” agreed Ranganathan. “Regions can fail for many reasons, such as snowstorms, building fires, and extended power outages.”
