Page 3 of 3

Building a Competitive Data Architecture, One Technology at a Time

As a result of these capabilities, machine learning is fast becoming a technology of choice for data professionals—and may help automate sophisticated data science tasks. “Machine learning helps predictive models to become smarter over time, process and generate natural language, and uncover patterns in large datasets,” said Helena Schwenk, vice president at Exasol. “A data science and ML platform can accelerate this process by helping data scientists and, most importantly, other data-savvy users—citizen data scientists—build predictive models, derive advanced insights, and infuse AI into applications at scale, ultimately helping to alleviate the data scientist bottleneck.” In the end, the entire enterprise benefits, Schwenk continued. “Domain experts on the platform, such as quants, risk analysts, and business analysts, can become directly involved in the process of developing models without a deep knowledge of machine learning.”
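To make Schwenk’s “smarter over time” point concrete, here is a minimal sketch of online learning: a linear model whose weights are updated incrementally with each new observation via stochastic gradient descent, rather than retrained from scratch. The data and learning rate are illustrative, not drawn from any vendor platform.

```python
# A minimal online-learning sketch: a linear model y ≈ w·x whose weights
# improve with every new example it sees (stochastic gradient descent).
# The toy data satisfies y = 1 + 2x, so the weights should approach [1, 2].

def sgd_update(weights, x, y, lr=0.01):
    """One incremental learning step for a linear model y ≈ w·x."""
    prediction = sum(w * xi for w, xi in zip(weights, x))
    error = prediction - y
    return [w - lr * error * xi for w, xi in zip(weights, x)]

# Stream of (features, target) pairs; the first feature is a constant bias term.
stream = [([1.0, 2.0], 5.0), ([1.0, 3.0], 7.0), ([1.0, 4.0], 9.0)]
weights = [0.0, 0.0]
for _ in range(5000):             # replay the stream to simulate ongoing data
    for x, y in stream:
        weights = sgd_update(weights, x, y)

print([round(w, 2) for w in weights])  # approaches [1.0, 2.0] for y = 1 + 2x
```

The same update rule keeps running as fresh data arrives, which is what lets such a model keep improving in production without a full retraining cycle.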


AIOps—AI operations—leverages AI and machine learning to acquire enterprise IT data, analyze it, and take required actions for autonomous IT operations. AIOps “helps transform enterprise IT operations from being slow and reactive to agile and proactive, thus addressing key IT operational and business challenges,” said Akhilesh Tripathi, CEO of Digitate. AIOps “combines big data, analytics, and automation to help gain full-stack visibility across hybrid environments, predict failures and their direct impact on business, and provide resilient and efficient IT operations.”
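One statistical building block beneath the failure prediction Tripathi describes is anomaly detection on metric streams. The sketch below flags readings that fall far outside a rolling baseline using a mean/standard-deviation (z-score) test; the metric values and threshold are illustrative.

```python
# A minimal AIOps-style anomaly detector: flag metric readings that are
# more than `threshold` standard deviations from the recent rolling baseline.

from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=5, threshold=3.0):
    """Yield (index, value) for readings far outside the recent baseline."""
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                yield i, value
        recent.append(value)

# Hypothetical service latencies in milliseconds, with a spike at index 7.
latency_ms = [20, 22, 21, 19, 20, 21, 20, 95, 22, 21]
print(list(detect_anomalies(latency_ms)))  # -> [(7, 95)]
```

In a real AIOps pipeline this kind of detector would feed an alerting or remediation workflow rather than a print statement, and the baseline model would typically account for seasonality as well.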


Data stacks are getting more complex, making workflow orchestration and monitoring increasingly important. “Every handoff from one system to another carries some potential for error or data loss, even if the two systems in question have strong internal guarantees,” said Jeremiah Lowin, CEO and founder of Prefect Technologies.

“Though these errors are infrequent, they are disproportionately disruptive as they evade traditional monitoring systems and require all-hands-on-deck searches for the culprit,” Lowin continued. Workflow orchestration systems “may not prevent errors in all cases but can usually reduce the time-to-error-discovery from hours to minutes.” The benefits of workflow orchestration stem from a combination of reclaimed infrastructure spending and reduced maintenance effort, he added.
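The orchestration idea Lowin describes can be sketched in a few lines: every task’s outcome at each handoff is recorded, failures are retried, and a task that keeps failing is surfaced immediately by name instead of silently corrupting downstream systems. The pipeline and task names below are hypothetical, not Prefect’s API.

```python
# A minimal orchestrator sketch: run tasks in order, pass each result to the
# next, retry failures, and stop at the first task that exhausts its retries,
# reporting per-task state so the culprit is visible in minutes, not hours.

def run_workflow(tasks, retries=2):
    """Run (name, callable) tasks in sequence and return their final states."""
    state, result = {}, None
    for name, task in tasks:
        for _ in range(retries + 1):
            try:
                result = task(result)
                state[name] = "success"
                break
            except Exception as exc:
                state[name] = f"failed: {exc}"
        else:
            return state  # halt: downstream tasks never see bad data
    return state

pipeline = [
    ("extract", lambda _: [3, 1, 2]),
    ("transform", lambda rows: sorted(rows)),
    ("load", lambda rows: len(rows)),
]
print(run_workflow(pipeline))  # every handoff observed, not assumed
```

Production orchestrators add scheduling, persistence, and alerting on top of this loop, but the core contract is the same: each handoff between systems is observed and recorded rather than assumed to succeed.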



“Data governance and data quality software is fundamentally changing the way organizations compete on data analytics,” said Amy O’Connor, chief data and information officer of Precisely. “Previously, this type of work was done by hand, making it incredibly time-consuming, tedious, and oftentimes inaccurate. With data governance and data quality solutions, businesses can more easily and efficiently assess and explore their data.”

While the benefits of good data governance and data quality are obvious, most organizations are still struggling to implement them, said O’Connor. “For the last decade or so, everyone has been so focused on creating and using data, while just assuming the quality was good.” The implementation of data governance and data quality tools requires organizations to pause and re-evaluate their data, O’Connor noted. However, once companies are able to do this, the concept of data engineering, or building sound data processes so that data can be reused, “will allow companies to have total control and be able to repurpose their data without inaccuracies.”
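The hand inspection O’Connor describes being replaced looks, in automated form, like a profiling pass over incoming records. The sketch below counts three common data quality problems—missing required fields, duplicate records, and out-of-range values; the records and validation rules are illustrative.

```python
# A minimal data quality profiling sketch: scan a batch of records and count
# missing required fields, exact-duplicate records, and out-of-range values.

def profile(records, required, valid_range):
    """Return counts of common data quality problems in a list of dicts."""
    issues = {"missing": 0, "duplicates": 0, "out_of_range": 0}
    seen = set()
    for rec in records:
        if any(rec.get(field) is None for field in required):
            issues["missing"] += 1
        key = tuple(sorted(rec.items()))  # canonical form for duplicate check
        if key in seen:
            issues["duplicates"] += 1
        seen.add(key)
        age = rec.get("age")
        if age is not None and not valid_range[0] <= age <= valid_range[1]:
            issues["out_of_range"] += 1
    return issues

rows = [
    {"id": 1, "age": 34}, {"id": 2, "age": None},
    {"id": 1, "age": 34}, {"id": 3, "age": 212},
]
print(profile(rows, required=["id", "age"], valid_range=(0, 120)))
```

Governance tooling layers rule management, lineage, and remediation workflows on top of checks like these, so they run continuously instead of as one-off manual reviews.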

Demonstrating the importance of data integrity requires greater adoption of data governance and data quality tools, said O’Connor. “It’s likely that these technologies will become pervasive across industries.”


Big data will just keep getting bigger, and new approaches to big data architectures need to keep evolving as well. As the 2020s progress, we are likely to see additional approaches evolve, forming the foundation of data-driven enterprises positioned to compete in a hypercompetitive global economy.


