Today, business users “can get data-driven applications delivered directly to them in real time—complete with insights relevant to their jobs, as well as recommended next-best actions,” said Ajay Khanna, vice president of product marketing at Reltio. “We’re entering the era where technology is finally able to deliver the value of data and analytics to the business users who need it most.”
Ultimately, analytics needs to be a part of everyone’s job in one form or another. “Everyone needs to understand their business, and data analytics is a great place to start,” said Simpson. There are still specialized teams with data scientists, he noted, but the real power play for the enterprise is to make analytics available to everyone.
Data at the Ready
The ability to move data rapidly to where and when it is needed is important, but that raises the question: Is enterprise data itself ready for widespread analytics? It may be a showstopper, Altshuller cautioned. “With data all around us in various forms, it is not surprising that we still find ourselves in a bottleneck when it comes to data preparation,” he said. Enterprises need to employ smart analytics tools that can serve up suggestions on how to improve data for better outcomes, as well as detect anomalies, he added.
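The anomaly detection Altshuller describes can be as simple as flagging values that sit far from the rest of a column. A minimal sketch, using a z-score rule over standard-library functions (the threshold and sample readings are illustrative assumptions, not from any specific product):

```python
from statistics import mean, stdev

def flag_anomalies(values, threshold=2.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # all values identical: nothing to flag
    return [v for v in values if abs(v - mu) / sigma > threshold]

# One sensor reading is wildly out of line with the others.
readings = [10.1, 9.8, 10.3, 10.0, 9.9, 58.7, 10.2]
print(flag_anomalies(readings))  # [58.7]
```

Commercial "smart" preparation tools layer far more sophisticated statistics and machine learning on top, but the underlying idea is the same: surface the records that deviate so a person can decide what to do with them.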
Making data analytics-ready still requires data and IT managers to roll up their sleeves to ensure quality and consistency. “Data preparation activity and tasks have always been messy, filled with errors, labor-intensive, and time-consuming,” said Patel. “This does not bode well, especially for business units that are demanding faster time to insights to remain competitive or for business users who are enabled with self-service discovery, reporting, and analytics capabilities but do not want to rely on IT to prepare and manage data.”
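The messy, labor-intensive tasks Patel refers to usually boil down to a pipeline of normalization, deduplication, and filtering rules. A toy sketch of that shape, assuming hypothetical customer records with `name` and `email` fields (real pipelines cover many more fields and rules):

```python
def prepare_records(records):
    """Normalize, deduplicate, and drop incomplete customer records."""
    seen_emails = set()
    clean = []
    for rec in records:
        # Normalize: trim whitespace, standardize case.
        name = (rec.get("name") or "").strip().title()
        email = (rec.get("email") or "").strip().lower()
        if not name or not email:
            continue  # incomplete record; in practice, route to a review queue
        if email in seen_emails:
            continue  # duplicate by email key
        seen_emails.add(email)
        clean.append({"name": name, "email": email})
    return clean

raw = [
    {"name": "  ada lovelace ", "email": "ADA@example.com"},
    {"name": "Ada Lovelace", "email": "ada@example.com "},  # duplicate
    {"name": "", "email": "noname@example.com"},            # incomplete
]
print(prepare_records(raw))
# [{'name': 'Ada Lovelace', 'email': 'ada@example.com'}]
```

Self-service preparation tools essentially let business users compose rules like these interactively instead of waiting for IT to script them.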
The Internet of Things and connected devices also represent a shift away from traditional thinking about what constitutes a data environment, said Mike McNamara, senior manager of product and solution marketing at NetApp. “Getting value and insights quickly from a range of mixed-workload environments can truly differentiate an organization and propel its business. Data analytics and IoT, which extends compute and network capabilities to objects, sensors, and everyday items not normally considered computers, is reshaping our world.”
Data lakes, another emerging data architecture, also constitute “a powerful architectural approach for managing the growing variety and volume of data, especially as companies turn to mobile, cloud-based applications and the Internet of Things as right-time delivery mediums for big data,” McNamara continued. “The ability to manage the lifecycle of data in the lake through automated policies based on age and relevancy is the difference between an efficient and useful repository of valuable data assets and a costly one.”
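An automated lifecycle policy of the kind McNamara describes typically assigns each object to a storage tier based on how recently it was used and how old it is. A minimal sketch, with illustrative thresholds that are assumptions on my part (real policies would also weigh relevancy signals such as query frequency or lineage):

```python
from datetime import datetime, timedelta

def lifecycle_tier(last_accessed, created, now, hot_days=30, archive_days=365):
    """Assign a data-lake object to a storage tier by recency and age."""
    if now - last_accessed <= timedelta(days=hot_days):
        return "hot"      # recently used: keep on fast storage
    if now - created <= timedelta(days=archive_days):
        return "warm"     # cheaper tier, still online
    return "archive"      # cold storage or deletion review

now = datetime(2024, 6, 1)
print(lifecycle_tier(datetime(2024, 5, 20), datetime(2023, 1, 1), now))  # hot
print(lifecycle_tier(datetime(2024, 1, 1), datetime(2023, 9, 1), now))   # warm
print(lifecycle_tier(datetime(2023, 1, 1), datetime(2022, 1, 1), now))   # archive
```

Running such a classifier on a schedule, and acting on its output, is what keeps a lake from decaying into the “costly” repository McNamara warns about.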
To be prepared for the new world of big data and business intelligence, “IT leaders need to build a system of trust—or collaboration and engagement of employees throughout the organization—to create a culture wherein data is utilized more pervasively,” Franco said. “They need to have a modern collaborative, governed approach to data preparation as opposed to a traditional, restricted access form of data governance. The ultimate goal of implementing self-service data preparation is not to improve only personal productivity, but rather to institute a collaborative platform that makes enterprise data cleansed, trusted, and reusable so that more employees can have access to, trust the quality of, and put into use enterprise information, which ultimately helps companies become data-driven.”
Factors such as “data ingestion, harmonization, cleansing, preparation, persistence, and access contribute to a very complicated process when you consider the attributes associated with data—volume, variety, speed of change, distribution (geo and platforms), context, access frequency, and veracity,” said Gupta. “The new BI world that exists in hybrid settings is collaborative across functions and can deal with these data attributes but has to be supported by the next-generation approaches, including in-memory, schema-less database, polyglot persistence, and high-volume ingestion engines that can deal with batch as well as real-time streaming inputs. On top of these, data quality and harmonization tools need to be implemented to prevent the garbage-in garbage-out phenomenon.”
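The quality gate Gupta calls for, the one that prevents garbage-in garbage-out, sits in front of the ingestion engine and works the same way whether records arrive as a batch list or a streaming generator. A hedged sketch, with a hypothetical record shape (`id`, `value`, `source`) invented for illustration:

```python
def validate(record):
    """Minimal quality gate: required fields present and types sane."""
    return (
        isinstance(record.get("id"), int)
        and isinstance(record.get("value"), (int, float))
        and record.get("source") in {"batch", "stream"}
    )

def ingest(records):
    """Split an iterable of records (a batch list or a streaming
    generator) into accepted rows and a quarantine for later repair."""
    accepted, quarantined = [], []
    for rec in records:
        (accepted if validate(rec) else quarantined).append(rec)
    return accepted, quarantined

good, bad = ingest([
    {"id": 1, "value": 3.2, "source": "batch"},
    {"id": "x", "value": None, "source": "stream"},  # fails the gate
])
print(len(good), len(bad))  # 1 1
```

Quarantining rather than silently dropping bad rows matters: the harmonization tools Gupta mentions need the rejects in order to diagnose and fix the upstream sources producing them.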