Automation Takes on the Heavy Lifting of Data Management

Even a seemingly straightforward process such as performance tuning may be difficult to turn over to automation at this time, said Brunswick. “Performance-tuning, particularly for high-transaction-rate systems, is difficult to fully automate because of the complexity and variance between organizations.”

Data managers and administrators need to ask “when it’s smart to automate, and when it’s not,” said Sumeet Singh, VP of engineering for Juniper Networks. “Projects that require more strategy, innovation, or creativity are best left to the human touch. For companies implementing automation, it’s crucial they have specialists who can help troubleshoot or intervene if a problem arises.”

There are still “many situations in which we do things manually based on ideas and information that aren’t captured, standardized, or maintained in a way that automated processes can use,” said Freivald. “Even AI and machine learning processes are only as good as the data we feed them. It’s hard enough to capture all of that data about highly structured IT processes, but data management requires businesspeople and IT to collaborate—so those combined business/IT processes, heuristics, and ideas are even harder to nail down in a way that doesn’t leave us open to high risks. We’re getting better, but these things take time.”

BEST APPROACHES

To achieve successful data automation, data managers and administrators need to take a holistic or strategic approach, industry observers advise. “The key aspect for database automation is the company’s data strategy,” said Golombek. “All initiatives and applications should be aligned with that strategy.”

Cloud computing offers a way to make this transition—37% of respondents to the ITI-IOUG survey said moving to the cloud has aided their efforts to automate data operations. “The most direct way to address data center automation—such as auto-healing, predicted failures, and performance degradation—is to decouple the physical mapping between database assets and physical servers,” said Kevin McNamara, CTO and co-founder of HiveIO.

“This can be accomplished through cloud computing or software-defined virtualization solutions,” said McNamara. Traditionally, there has been resistance to moving to the cloud because performance was tied to hardware physics. But, McNamara said, with advances in storage acceleration techniques such as inline deduplication, compression, and caching, many administrators have shifted to software-defined processes rather than hardware acceleration. “In doing so, cloud features, such as live migration, erasure coding, and API metrics, offer significant value.”
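To illustrate the kind of metrics-driven auto-healing McNamara describes, here is a minimal sketch that polls a hypothetical metrics API for a database node and requests a live migration after sustained latency degradation. The endpoint, thresholds, and migration call are illustrative assumptions, not any particular vendor's API.

```python
"""Minimal sketch: auto-healing driven by API metrics.

The metrics endpoint, threshold, and migration call are hypothetical;
a real deployment would use the platform's own SDK or API.
"""
import json
import statistics
import time
import urllib.request

METRICS_URL = "https://metrics.example.internal/api/v1/db-nodes/{node}/latency"
LATENCY_THRESHOLD_MS = 250   # sustained p95 latency treated as degradation
SAMPLES = 5                  # consecutive samples required before acting


def fetch_p95_latency(node: str) -> float:
    """Read p95 query latency (ms) from the hypothetical metrics API."""
    with urllib.request.urlopen(METRICS_URL.format(node=node)) as resp:
        return float(json.load(resp)["p95_ms"])


def request_live_migration(node: str) -> None:
    """Placeholder for the platform's live-migration / auto-healing call."""
    print(f"requesting live migration of {node} to a healthier host")


def watch(node: str, interval_s: int = 60) -> None:
    """Poll metrics and trigger remediation when the rolling average degrades."""
    window: list[float] = []
    while True:
        window.append(fetch_p95_latency(node))
        window = window[-SAMPLES:]  # keep a rolling window of recent samples
        if len(window) == SAMPLES and statistics.mean(window) > LATENCY_THRESHOLD_MS:
            request_live_migration(node)
            window.clear()          # reset after remediation
        time.sleep(interval_s)
```

The point is the decoupling: because the database node is addressed through an API rather than a specific physical server, the remediation step can be a live migration rather than a maintenance window.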

Others advocate a workflow-by-workflow approach to automating database tasks. “Automation matters most when you need to do something fast, like fix something that is broken or roll something out that has value,” said Singh. “Organizations ought to identify their key troubleshooting workflows, as those will lend themselves quite naturally to automated outcomes. It’s also important to focus on targeting one workflow at a time, diligently building automated outcomes into data center processes.”
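As a concrete, if hypothetical, example of that advice, the sketch below automates a single troubleshooting workflow: finding sessions that have been blocking other queries for too long and terminating them. It uses PostgreSQL's pg_stat_activity and pg_blocking_pids; the connection string, the five-minute threshold, and the decision to terminate automatically rather than page an operator are all assumptions.

```python
"""Minimal sketch: automating one troubleshooting workflow at a time.

Target: clear sessions that have been blocking other queries for too long.
Uses PostgreSQL's pg_stat_activity / pg_blocking_pids via psycopg2; the
DSN and threshold are assumptions for illustration.
"""
import psycopg2

DSN = "dbname=appdb user=dba"   # hypothetical connection string
BLOCKED_FOR = "5 minutes"       # how long a victim must wait before we act

FIND_BLOCKERS = """
    SELECT DISTINCT unnest(pg_blocking_pids(pid)) AS blocker_pid
    FROM pg_stat_activity
    WHERE wait_event_type = 'Lock'
      AND query_start < now() - %s::interval
"""


def clear_blockers() -> None:
    """Find long-standing lock blockers and terminate their sessions."""
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        cur.execute(FIND_BLOCKERS, (BLOCKED_FOR,))
        for (blocker_pid,) in cur.fetchall():
            # In a cautious rollout, log or page a human here instead.
            print(f"terminating blocking session {blocker_pid}")
            cur.execute("SELECT pg_terminate_backend(%s)", (blocker_pid,))


if __name__ == "__main__":
    clear_blockers()
```

Keeping the scope this narrow, one workflow and one decision, makes it easier to verify the automation before moving on to the next workflow.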

Vendors are building greater automation into their solutions, but this is only part of the story. “Vendors supply some automation today, including the cloud vendors, however those hoping that the cloud will automate the bulk of their larger database operations will be sadly mistaken, and will find both missing functionality and a lack of existing functional depth,” said Schumacher. It’s important, then, for enterprises to “look at databases in the same way as other DevOps approaches, and also to look beyond the tooling provided by the database vendors,” said Brunswick. “Third-party integration providers can lay the foundation for effective automation of processes involving different database vendors and hybrid ground/cloud environments.”

ENTER AI

Of course, these days, no discussion about data automation is complete without mentioning the growing role of artificial intelligence (AI). Schumacher noted that current AI is still in its “baby-steps phases,” with auto-scaling and best-practices enforcement that learns its specific environment, but he expects AI to be increasingly applied to application-oriented data and to building out recommendation engines.

“Like all AI and machine learning domains, the key is to successfully model the problems you are trying to solve,” said Brunswick. “Aspects like future capacity forecasting or log anomaly detection, for example, may be well-suited to an AI approach, whereas others, such as performance tuning, may be much harder to model.”
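To make those examples concrete, the sketch below shows how cleanly the “well-suited” problems model: a naive linear-trend fit for capacity forecasting and a simple z-score check for log anomaly detection. The data is made up and the methods are deliberately simplistic; production systems would use proper time-series and anomaly-detection models.

```python
"""Minimal sketch: two problems that model cleanly for AI/ML.

Toy data and naive methods only; real deployments would use dedicated
forecasting and anomaly-detection tooling.
"""
import statistics


def forecast_capacity(daily_gb: list[float], days_ahead: int) -> float:
    """Fit a simple linear trend to storage usage and extrapolate."""
    xs = range(len(daily_gb))
    x_mean, y_mean = statistics.mean(xs), statistics.mean(daily_gb)
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, daily_gb))
             / sum((x - x_mean) ** 2 for x in xs))
    future_x = len(daily_gb) - 1 + days_ahead
    return y_mean + slope * (future_x - x_mean)


def anomalous_minutes(errors_per_min: list[int], z_cutoff: float = 3.0) -> list[int]:
    """Flag minutes whose error counts sit far outside the mean."""
    mean = statistics.mean(errors_per_min)
    stdev = statistics.pstdev(errors_per_min) or 1.0   # avoid division by zero
    return [i for i, e in enumerate(errors_per_min)
            if abs(e - mean) / stdev > z_cutoff]


if __name__ == "__main__":
    usage = [500 + 12 * day for day in range(30)]       # ~12 GB/day growth
    print("projected GB in 90 days:", round(forecast_capacity(usage, 90)))

    errors = [2, 3, 1, 2, 4, 2, 3, 2, 1, 3, 2, 4, 3, 2, 1, 2, 3, 250, 2, 3]
    print("anomalous minutes:", anomalous_minutes(errors))
```

Performance tuning, by contrast, depends on workload shape, schema design, and business priorities that are much harder to capture as model inputs, which is Brunswick’s point about modeling.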

AI and machine learning offer a powerful option for getting new insights or predictions out of data, Golombek noted. “Especially for automation, these algorithms can make a data scientist’s job easier.” However, he cautioned that organizations have to be aware that AI is just another option, another tool. Teams should try to integrate AI and data science seamlessly into their organizational structure and their technology stack and landscape. “If you are able to integrate business intelligence tools with AI concepts, you are on a good path to being future-ready,” said Golombek.
