The Data Scene in 2017: More Cloud, Greater Governance, Higher Performance


“Governance has gotten sexy,” said Joe Pasqua, executive vice president of products at MarkLogic, noting that previously, organizations viewed data governance as a tax, something that wasn’t adding value to the business. With the rise of data lakes, governance has gained more urgency. “Data lakes don’t have the lineage and provenance of the data they’re analyzing,” Pasqua said. “When they put bad or misleading data into their analysis, they’re going to get unreliable results back out. That’s a lack of data governance.”

In addition, “perhaps even worse, organizations are afraid to share the data they’ve gone to great expense to create. They can’t answer questions such as: Under what agreements was the data collected? Which pieces are personal information? Who’s allowed to see it? In which geographies? With what redistribution rights? If you can’t answer these questions, you can’t share the data,” Pasqua said.
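To make the governance questions above concrete, here is a minimal sketch (not tied to MarkLogic or any particular product) of the kind of metadata record a data lake entry would need to carry in order to answer them before the data is shared. All field and function names are hypothetical illustrations.

```python
# A hypothetical governance metadata record: capture lineage, consent,
# and permission answers alongside the data so sharing decisions can be made.
from dataclasses import dataclass
from typing import List

@dataclass
class GovernanceRecord:
    dataset_id: str
    source_system: str                 # lineage: where the data came from
    collection_agreement: str          # under what agreement it was collected
    personal_data_fields: List[str]    # which pieces are personal information
    permitted_roles: List[str]         # who is allowed to see it
    permitted_geographies: List[str]   # in which geographies it may be used
    redistribution_allowed: bool       # what redistribution rights apply

def can_share(record: GovernanceRecord, role: str, geography: str) -> bool:
    """Only share data whose provenance and permissions are fully known."""
    return (
        record.redistribution_allowed
        and role in record.permitted_roles
        and geography in record.permitted_geographies
    )

# Example: a marketing analyst in the EU asks to reuse a customer dataset.
record = GovernanceRecord(
    dataset_id="customer_events_2017",
    source_system="web_clickstream",
    collection_agreement="terms_of_service_v3",
    personal_data_fields=["email", "ip_address"],
    permitted_roles=["marketing_analyst"],
    permitted_geographies=["EU"],
    redistribution_allowed=True,
)
print(can_share(record, "marketing_analyst", "EU"))  # True
```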

“We’re seeing accelerated adoption of data quality, master data management, and information governance solutions,” said Philip On, vice president of product marketing at SAP. There is also a notable trend toward hiring chief data officers, reflecting a growing recognition of the role quality data plays in driving enterprise strategies. “The next phase will be a growing maturity in the use of these solutions to demonstrate clear linkage of information policies, rules and quality with business processes and outcomes.”

Self-Service on the Rise

Another trend is the increasing demand from users for self-service applications. Use of self-service—via data preparation tools—is likely to triple over the next few years, On said. Giving information workers and analysts self-service access to a variety of data sources improves visibility into critical information and enables fast, decisive action. The challenge, he continued, is ensuring that “the underlying data is reliable and relevant for users.” To build this trust, IT and data departments should “ensure these self-service applications are connected to or integrated with an enterprise-class data integration and quality platform so that data rules, definitions, and policies can be reused and operationalized.”
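A minimal sketch of the reuse idea On describes: define data rules once in a shared place, then apply the same rules both in a self-service preparation step and in an enterprise pipeline. The rule names and sample records below are hypothetical, not drawn from any SAP product.

```python
# Shared, reusable data-quality rules applied to individual records.
import re

DATA_RULES = {
    "email_is_valid": lambda row: bool(
        re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+", row.get("email", ""))
    ),
    "amount_is_positive": lambda row: row.get("amount", 0) > 0,
}

def validate(row: dict, rule_names: list) -> dict:
    """Apply shared rules to one record and report which ones failed."""
    failures = [name for name in rule_names if not DATA_RULES[name](row)]
    return {"row": row, "valid": not failures, "failed_rules": failures}

# The same rules an analyst sees in a self-service tool can run in a batch job.
print(validate({"email": "ana@example.com", "amount": 42.0},
               ["email_is_valid", "amount_is_positive"]))
print(validate({"email": "not-an-email", "amount": -5},
               ["email_is_valid", "amount_is_positive"]))
```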

Real-Time Realities

With the increasing reliance on data analytics and the rise of more widespread networks such as the Internet of Things (IoT), there is also a need for real-time delivery of data insights. “The need for faster data analytics and databases is driving new requirements for technologies to optimize performance and meet the demand for real-time responses,” said George Teixeira, CEO and co-founder of DataCore Software. However, he cautioned, while computers have become steadily faster, organizations have done a poor job of putting them to work in parallel to attack the persistent problems of latency and response time. In-memory technology promises blazing speeds, but that promise is muddied when I/O processing and response times are left unaddressed, he added. Teixeira sees parallel I/O technologies, introduced in the past year, coming to the fore. This is advantageous as they “don’t disrupt current business workloads,” he said. Instead, he explained, the full power of multicore processors and the memory bandwidth available in servers can drive lower latency and faster response times, because processors servicing I/O requests in parallel allow more workloads to be completed faster.
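A minimal sketch of the parallel-I/O idea Teixeira describes, not of any DataCore product: instead of servicing I/O requests one at a time, fan them out across worker threads so multiple cores keep the I/O path busy. The read_block function and the simulated latency are hypothetical stand-ins for real storage reads.

```python
# Serial vs. parallel servicing of simulated I/O requests.
import time
from concurrent.futures import ThreadPoolExecutor

def read_block(block_id: int) -> bytes:
    time.sleep(0.01)          # simulate storage latency for one I/O request
    return bytes(block_id)

block_ids = range(64)

# Serial servicing: total time is roughly 64 x the per-request latency.
start = time.perf_counter()
serial = [read_block(b) for b in block_ids]
print(f"serial:   {time.perf_counter() - start:.2f}s")

# Parallel servicing: requests overlap, so total time shrinks sharply.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=16) as pool:
    parallel = list(pool.map(read_block, block_ids))
print(f"parallel: {time.perf_counter() - start:.2f}s")
```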

Time for Data-Centric Security

Enterprises are pivoting from simply protecting infrastructure to more focused, data-centric security. With the vast majority of enterprises “vulnerable to data threats, focusing on data as the core of the security space is vital to staying one step ahead of vulnerabilities,” said Amit Walia, executive vice president and chief product officer for Informatica. “Recently leaked exploits are part of ever-growing evidence that relying on the security of a network is not a smart or effective way to keep data safe. As we see more exploits over the coming year, there will be more pointed movement away from network security toward data-centric security.” The key, Walia said, is to start early, and “ensure that data is secured and locked at the point of inception, tracked across its lifecycle, and unlocked with confidence at the point of consumption—before being forced to do so after data has been compromised.”
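A minimal sketch of the “lock at inception, unlock at consumption” idea using field-level encryption. It assumes the third-party cryptography package; in practice, key management and access checks would live in a dedicated service rather than in application code, and this is an illustration of the pattern, not of Informatica’s products.

```python
# Encrypt sensitive fields at ingest; decrypt only for authorized consumers.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in production, held by a key-management service
cipher = Fernet(key)

def ingest(record: dict, sensitive_fields: set) -> dict:
    """Lock sensitive fields the moment data enters the pipeline."""
    return {k: cipher.encrypt(v.encode()) if k in sensitive_fields else v
            for k, v in record.items()}

def consume(record: dict, sensitive_fields: set, authorized: bool) -> dict:
    """Unlock sensitive fields only at the point of consumption, for authorized users."""
    if not authorized:
        return {k: ("<locked>" if k in sensitive_fields else v)
                for k, v in record.items()}
    return {k: cipher.decrypt(v).decode() if k in sensitive_fields else v
            for k, v in record.items()}

stored = ingest({"customer_id": "C-1001", "ssn": "123-45-6789"}, {"ssn"})
print(consume(stored, {"ssn"}, authorized=False))   # ssn stays locked
print(consume(stored, {"ssn"}, authorized=True))    # ssn unlocked for this reader
```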

Sensors as a Service

Another area shaking up organizations is IoT, which has staggering implications for data collection methodologies. “SaaS can now mean sensors as a service, not just software as a service,” said Selfridge. The organizations succeeding in leveraging IoT data are automating the data collection process; adding context to sensor data by augmenting it with sales, campaign, marketing, maintenance, and other relevant data; and prioritizing flexibility and improving device intelligence to accommodate ever-expanding sources and types of data, Selfridge added.
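A minimal sketch of the augmentation step Selfridge describes: join raw sensor readings with contextual business data, here a hypothetical maintenance log, so the readings can be analyzed in terms of outcomes rather than raw values. It assumes the pandas package; device IDs and values are invented.

```python
# Enrich sensor readings with maintenance context for each device.
import pandas as pd

sensor = pd.DataFrame({
    "device_id": ["pump-1", "pump-2", "pump-1"],
    "timestamp": pd.to_datetime(["2017-03-01", "2017-03-01", "2017-03-02"]),
    "vibration": [0.21, 0.47, 0.52],
})

maintenance = pd.DataFrame({
    "device_id": ["pump-1", "pump-2"],
    "last_service": pd.to_datetime(["2017-01-15", "2016-11-30"]),
    "service_plan": ["quarterly", "annual"],
})

# Contextualize each reading with the maintenance history of the same device.
enriched = sensor.merge(maintenance, on="device_id", how="left")
enriched["days_since_service"] = (enriched["timestamp"] - enriched["last_service"]).dt.days
print(enriched)
```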

A range of open source tools and platforms is helping enterprises adapt to the flood of IoT data. “The IoT tool space is quickly becoming crowded,” said Rado Kotorov, vice president of product marketing for Information Builders. “Columnar and time-series databases are being launched all the time. But, in my opinion, it will be Spark that will drive significant adoption and applications in this space. Spark has matured and companies can run very advanced analytics on it.” Cloud platforms are also paving the way for the advanced analytics that can handle these new data loads, he added.
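A minimal PySpark sketch of the kind of analytics Kotorov points to: running an aggregation over IoT readings with Spark. The schema and values are hypothetical; a real job would read from a stream or a columnar store rather than an in-memory list.

```python
# Aggregate hypothetical IoT temperature readings with Spark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("iot-analytics-sketch").getOrCreate()

readings = spark.createDataFrame(
    [("sensor-1", "2017-03-01 10:00", 21.4),
     ("sensor-1", "2017-03-01 10:05", 22.1),
     ("sensor-2", "2017-03-01 10:00", 19.8)],
    ["device_id", "reading_time", "temperature"],
)

# Average temperature per device: a stand-in for richer time-series analytics.
summary = readings.groupBy("device_id").agg(F.avg("temperature").alias("avg_temp"))
summary.show()

spark.stop()
```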
