
Seven Trends Shaping ‘Big Data’ into ‘All Data’


Has the meaning of big data changed? Many agree that data no longer has to be “big” to meet today’s evolving requirements. Perhaps a better way to describe big data would be “all data,” suggests Saptarshi Mukherjee, global head of product and solutions marketing, data analytics, at Google Cloud. “Today, we’re seeing less of a distinction between ‘big data’ analytics and data analytics, because, inherently, businesses are exposed to massive data growth, which is coming from a variety of systems and applications,” said Mukherjee. “Data analytics needs to address the traits of large amounts of data at all times.”                    

In particular, open source and cloud tools and platforms have brought data-driven sensibilities into organizations that previously did not have such expertise, making big data more accessible. “Hadoop helped make it easy to collect data quickly,” said Madhukar Kumar, VP of product and developer marketing at Redis Labs. “It was bundled with MapReduce to enable a way to crunch data—and the result was a new data ecosystem that grew the initial focus of the big data conversation. Apache Spark ran much faster by keeping all of the data in memory and helped alleviate some of the timeliness problem.”
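To illustrate the in-memory approach Kumar describes, here is a minimal PySpark sketch: a dataset is cached once, and repeated analyses then run against memory rather than re-reading from storage, which is where Spark gained its speed over disk-bound MapReduce. The file path and column names are hypothetical.

```python
# Minimal PySpark sketch of Spark's in-memory model (paths and columns are illustrative).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("in-memory-demo").getOrCreate()

# Load raw event data from distributed storage (hypothetical path).
events = spark.read.json("hdfs:///data/events/*.json")

# cache() keeps the dataset in memory after the first action runs,
# so later aggregations avoid another full pass over storage.
events.cache()

# Two separate analyses reuse the same in-memory data.
daily_counts = events.groupBy("event_date").count()
top_users = (events.groupBy("user_id").count()
             .orderBy(F.desc("count")).limit(10))

daily_counts.show()
top_users.show()
```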

Cloud is opening up a whole new way of approaching tried-and-true systems, such as data warehouses, and blending them with new layers, such as logical data warehouses. “You are taking the traditional analysis engine and running federation over the data lake data and bringing in big data processing technologies, such as Apache Spark, and processing data in the warehouse,” said Mukherjee. “This is enabling enterprises to break down data silos and take advantage of distributed computing to analyze any data at scale. Public cloud is accelerating this.” When organizations go to the cloud, they are operating in an unconstrained environment where they can commission compute and storage capacity, making data warehouses far more powerful, Mukherjee noted.
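A rough sketch of that "logical data warehouse" pattern, using Spark SQL to federate a single query across a curated warehouse extract and raw files sitting in a data lake, might look like the following. The bucket paths, table names, and columns are assumptions for illustration, not a reference to any specific deployment.

```python
# Sketch of federating a query over warehouse and data lake sources with Spark SQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("federation-demo").getOrCreate()

# Curated warehouse extract exported as Parquet (hypothetical path).
orders = spark.read.parquet("gs://warehouse-exports/orders/")
orders.createOrReplaceTempView("orders")

# Raw clickstream data kept in the data lake, never loaded into the warehouse.
clicks = spark.read.json("gs://data-lake/clickstream/*.json")
clicks.createOrReplaceTempView("clicks")

# One query spans both sources, giving analysts a single logical view.
result = spark.sql("""
    SELECT o.customer_id,
           COUNT(DISTINCT o.order_id) AS orders,
           COUNT(c.click_id)          AS clicks
    FROM orders o
    LEFT JOIN clicks c ON o.customer_id = c.customer_id
    GROUP BY o.customer_id
""")
result.show()
```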

In addition, the evolving world of big data has taken on a new meaning beyond simply formats and storage capacity. “It’s not the size of the data that matters; it’s how you use it. Big data was a marketing term built to describe data with volume, velocity, and variety,” said Thomas LaRock, head geek for SolarWinds. “It doesn’t have to be large. It could be tiny amounts of data coming from millions of IoT devices.”

What are the major trends shaping data management as it takes on a larger role in enterprise decisions and operations? Here are some of the key developments seen this year.

Real Time

Real-time data and analytics, amplified by the Internet of Things, are forces altering the big data landscape. The need for real-time analytics to improve business decisions is fueling demand for technologies such as in-memory computing, a trend being seen at many of the world's largest companies, said Abe Kleinfeld, CEO of GridGain Systems.

“As use of automation, machine learning, and AI continues to increase, and as companies amass more and more types of data, the need to quickly analyze data will only become more pronounced,” said Kleinfeld, who added that in-memory computing adoption will follow the upward trajectory of machine learning and AI.

Real-time analytics on streaming data is becoming accepted, said Mukherjee. “Enterprises are exposed to massive growth in streaming data. Streaming data is being generated from connected devices and connected applications. Streaming analytics is another technological development that, granted, has been around for many years, but is now ready for mainstream adoption. Specifically, you can analyze this data at a particular time and act on whatever insight you find at that moment, which requires a lot of compute power.”
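As a hedged sketch of the streaming pattern Mukherjee outlines, the Spark Structured Streaming example below aggregates device events in short windows as they arrive, so an insight can be acted on in the moment. The Kafka broker address, topic, threshold, and schema are all hypothetical, and the Kafka connector package must be on the Spark classpath.

```python
# Sketch of windowed streaming analytics on device events (broker, topic, schema are illustrative).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("streaming-demo").getOrCreate()

schema = StructType([
    StructField("device_id", StringType()),
    StructField("reading", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read a continuous stream of events from Kafka.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "device-events")
       .load())

events = (raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Aggregate readings in one-minute windows so anomalies surface while they still matter.
alerts = (events
          .withWatermark("event_time", "2 minutes")
          .groupBy(F.window("event_time", "1 minute"), "device_id")
          .agg(F.avg("reading").alias("avg_reading"))
          .filter(F.col("avg_reading") > 100.0))

# Write flagged windows to the console as they are updated.
query = alerts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```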


