The months and years ahead promise to be an interesting time in the data analytics space.
AI and machine learning are already making their mark, elevating analytics—and its front-end sibling, business intelligence. End users are seeing their capabilities expand in new directions, gaining the ability to ask questions they could not even have considered with earlier generations of the technology.
Industry leaders are tracking a range of developments that are defining analytics and BI in the year ahead:
LARGE LANGUAGE MODELS RECHARGE DATA ANALYTICS
The trend most likely to upend the analytics and business intelligence space is centered on generative AI (GenAI) and large language models (LLMs), which are poised to take understanding and predicting customers, markets, and operations to a whole new level.
GenAI and LLMs are “changing how we capture and analyze data, allowing analytics and technology teams to churn through large quantities of data quickly,” Honey Williams, VP and senior director, enterprise data and analytics technology, for Liberty Mutual Insurance, explained. “This takes simple but time-consuming activities off their plate. They can instead focus on generating value and insights.”
LLMs “are reshaping data analytics by expanding access, enabling sophisticated capabilities, and addressing challenges in data integration and customer insights,” said Srikar Bellur, Ph.D., assistant professor of data analytics at Harrisburg University of Science and Technology. “Their influence on BI tools, data democratization, and the pursuit of comprehensive customer views signifies an era where data insights are more accessible, actionable, and ethical, leading to empowered end users across industries.”
Analytics and business intelligence vendors are incorporating various forms of LLMs into their products. “They are using LLMs to provide better in-product help, to build out solutions automatically, to analyze data from natural language input, and to output insights from their products in natural language,” said Alan Jacobson, chief data and analytics officer at Alteryx. “While many products had these features before LLMs, using small language models and other techniques, there will be better implementations with higher customer usage now that the underlying algorithms are better.”
The rise of LLMs also opens up natural language processing as a means to interact with data. Previously, most, if not all, data access was via coding, visualization, and low-code/no-code methods, Jacobson pointed out. “Now, natural language is bringing a sense of balance to the data/interaction landscape. It will not replace the other modes, but it will ensure that all methods have their purpose.”
For instance, “LLMs will play a crucial role in facilitating transitions between these modes as a knowledge worker moves from asking a question to wanting to explicitly filter and sort data with low-code methods or interact with the data visually,” Jacobson explained.
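Jacobson’s point about moving from a plain-English question to explicit filter-and-sort operations can be sketched concretely. In the toy Python example below, the structured `spec` stands in for what an LLM might emit from a natural language question; the spec format, column names, and figures are all illustrative and not any vendor’s actual API:

```python
from operator import itemgetter

# A structured query spec of the kind an LLM might emit from the question
# "Which regions had revenue over 1M, highest first?" (hypothetical format)
spec = {
    "filter": {"column": "revenue", "op": ">", "value": 1_000_000},
    "sort": {"column": "revenue", "descending": True},
}

rows = [
    {"region": "EMEA", "revenue": 1_400_000},
    {"region": "APAC", "revenue": 900_000},
    {"region": "AMER", "revenue": 2_100_000},
]

OPS = {">": lambda a, b: a > b, "<": lambda a, b: a < b, "==": lambda a, b: a == b}

def run_spec(rows, spec):
    """Apply an LLM-emitted filter/sort spec as explicit, inspectable steps."""
    f = spec["filter"]
    kept = [r for r in rows if OPS[f["op"]](r[f["column"]], f["value"])]
    s = spec["sort"]
    return sorted(kept, key=itemgetter(s["column"]), reverse=s["descending"])

print([r["region"] for r in run_spec(rows, spec)])  # ['AMER', 'EMEA']
```

The value of the handoff is that the user asks in natural language, but the resulting filter and sort remain visible and editable in a low-code form rather than hidden inside the model.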
The evolution of LLMs is also amplifying the possibilities of the scale and depth of analysis. “While natural language processing already made data more accessible, generative AI has taken it further by enabling vast and complex comparisons that would be nearly impossible for humans alone to achieve,” said Diby Malakar, VP of product management at Alation. “LLMs are the catalyst for making data-driven decisions faster and with greater precision, uncovering correlations and insights.”
Ultimately, LLMs will help elevate the jobs of data managers above the rote tasks that occupy much of their time. One of the most significant impacts of LLMs such as GPT-4 and PaLM “is automating complex language-based analytics, making tasks like data cleaning, summarization, and entity recognition faster and more accessible,” said Bellur.
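The rote cleanup work Bellur describes can be illustrated with a small, deterministic sketch. The rule-based normalization below stands in for the far messier variants an LLM can resolve at scale; the record values are invented for the example:

```python
import re

def clean_records(records):
    """Normalize and deduplicate free-text name records.

    A deterministic pass standing in for the kind of cleanup an LLM can
    automate; rules catch spacing and case, while an LLM would also catch
    abbreviations, misspellings, and aliases that rules miss.
    """
    seen, out = set(), []
    for rec in records:
        name = re.sub(r"\s+", " ", rec.strip()).title()  # collapse whitespace, fix case
        if name.lower() not in seen:  # drop duplicates case-insensitively
            seen.add(name.lower())
            out.append(name)
    return out

raw = ["  acme  corp ", "ACME CORP", "Globex Inc", "globex   inc"]
print(clean_records(raw))  # ['Acme Corp', 'Globex Inc']
```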
In addition, “LLMs improve the integration and interpretation of unstructured data by handling natural language processing tasks with unprecedented accuracy, which is essential in fields like customer service, finance, and healthcare, where data variety and volume are high,” Bellur added. “LLMs also enable better semantic understanding of data, facilitating complex tasks like schema matching, data discovery, and query synthesis, which were challenging with traditional models. This deep understanding allows LLMs to generate insights from data that can answer questions more interactively. The improved data accessibility and streamlined workflow offered by LLMs are changing how businesses use analytics, significantly reducing the time and expertise required to gain actionable insights from vast datasets.”
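To give a minimal sense of what schema matching involves, the sketch below pairs columns across two hypothetical schemas using plain lexical similarity from Python’s standard library:

```python
from difflib import SequenceMatcher

def match_columns(source_cols, target_cols, threshold=0.6):
    """Pair each source column with its most similar target column by name.

    A toy lexical matcher; the column names are hypothetical. LLM-based
    schema matching goes further by using the semantic meaning of names
    and sample values, not just spelling.
    """
    matches = {}
    for s in source_cols:
        best, score = None, 0.0
        for t in target_cols:
            r = SequenceMatcher(None, s.lower(), t.lower()).ratio()
            if r > score:
                best, score = t, r
        if score >= threshold:
            matches[s] = best
    return matches

src = ["cust_name", "order_dt", "total_amt"]
tgt = ["customer_name", "order_date", "amount_total"]
print(match_columns(src, tgt))
```

Note that `total_amt` fails to reach `amount_total` under pure string similarity—their spellings diverge too much—which is precisely the gap that the semantic understanding Bellur describes is meant to close.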
LLMs, when integrated with enterprise data, will steer the course of analytics in the year ahead. “Because commercial LLMs are only trained on publicly available information, they often struggle to provide the necessary quality that businesses need,” Joel Minnick, VP of marketing at Databricks, explained. “Instead, companies are combining LLMs with their enterprise data and governance to build data intelligence—an advanced form of AI with a unique understanding of specific domains. This concept of data intelligence will further the reach of data analytics and make it possible for everyone within an organization to get accurate insights from their companies’ unique data.”
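Minnick’s pattern of combining an LLM with enterprise data and governance resembles retrieval-augmented generation: fetch the relevant company documents first, then ground the model’s answer in them. The sketch below uses naive word overlap in place of real vector search; the documents and prompt wording are invented for illustration:

```python
def retrieve(question, documents, k=2):
    """Rank internal documents by word overlap with the question.

    A stand-in for the embedding-based search used in real retrieval-
    augmented setups; the goal is the same: ground the model in company
    data it was never trained on.
    """
    q = set(question.lower().split())
    return sorted(
        documents,
        key=lambda d: len(q & set(d.lower().split())),
        reverse=True,
    )[:k]

def build_prompt(question, documents):
    """Prepend retrieved context so the model answers from company data."""
    context = "\n".join(retrieve(question, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

docs = [
    "Fiscal year 2025 begins in February, not January.",
    "Product line Atlas targets mid-market retail customers.",
    "Office recycling policy updated in March.",
]
print(build_prompt("When does the fiscal year begin?", docs))
```

With the fiscal-year document retrieved into the prompt, the model can answer “February” correctly—exactly the company-specific fact Minnick notes a publicly trained LLM would otherwise get wrong.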
Front-end BI and analytics tools “are evolving rapidly, with emphasis on intuitive, user-friendly interfaces and enhanced interactivity,” Bellur observed. “Traditional BI tools are being augmented by LLM-powered analytics that allow for real-time, natural language interaction with data. This shift makes BI accessible to users without technical expertise by allowing them to communicate with data systems conversationally and gain insights quickly. This interactive approach is particularly impactful in fields like customer service, where instant access to data insights can dramatically improve response times and personalization.”
In addition, “LLMs contribute to advancements in augmented analytics, where AI-driven insights assist users by suggesting trends, forecasting, and alerting based on detected patterns,” Bellur continued. “As predictive and prescriptive analytics become more integrated into BI platforms, these tools can offer recommendations that empower end users to make data-informed decisions autonomously. This new landscape suggests that BI tools will not only serve as data viewers but as proactive agents that guide decision making in real time.”
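The pattern-detection layer behind the proactive alerting Bellur describes can be sketched with a simple trailing z-score check. Production BI platforms use far richer models, and the data and threshold here are purely illustrative:

```python
from statistics import mean, stdev

def detect_anomalies(series, window=5, z_threshold=2.0):
    """Flag points that deviate sharply from the trailing window.

    Illustrates the alerting shape: watch a metric, compare each new
    value to recent history, and surface only the surprises.
    """
    alerts = []
    for i in range(window, len(series)):
        history = series[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma and abs(series[i] - mu) / sigma > z_threshold:
            alerts.append((i, series[i]))  # (index, anomalous value)
    return alerts

daily_orders = [100, 102, 98, 101, 99, 100, 240, 101]
print(detect_anomalies(daily_orders))  # [(6, 240)]
```

In an augmented-analytics tool, a detection like this would be surfaced as a plain-language alert—“orders spiked well above recent levels on day 7”—rather than left for an analyst to find in a chart.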
GenAI and LLMs are having “a transformative impact on BI tools, as now, anyone across an organization can use natural language to access data and gain real insights hidden underneath the old world of limited charts and reports,” said Minnick. “While self-serve access is important for BI tool adoption, the tools are only as good as the insights they provide. The intelligence needs to come from the bottom of the stack, not bolted on at the top. BI tools need to align with companies’ governance framework and adhere to any global policies set by administrators. Data teams need the ability to track the usage of their data assets across BI tools, as traceability helps instill confidence in the analysis results.”
ANOTHER SURGE IN DATA DEMOCRATIZATION COMING
This enhanced accessibility to data will be the hallmark of analytics in the year ahead. “While I would largely consider data democratized already, it still needs to be made more accessible,” said Lou Flynn, senior manager for AI and analytics at SAS. “Focus on simplifying data access for users with varied technical backgrounds. Organizations that do it well are seeing massive benefits for their expanding analyst personas.”
The new era of BI “is about embedding insights directly into everyday workflows, making data-driven decisions accessible to everyone—not just analysts,” said Malakar. “Generative AI enables users to uncover and act on insights within their existing tools, transforming productivity and amplifying what’s possible. While search-based and augmented analytics allow users to ask questions without being data experts, generative AI opens doors to insights users might not have known to seek. But as BI becomes more accessible, the insights are only as reliable as the data behind them, making trusted, governed data foundational to creating a truly data-driven organization.”
“BI and analysis tools are not just making it easier to create reports and insights, they are making it easier to understand the insights provided,” said Williams. “The ability to ask questions of a BI report or provide an easy summary of key insights helps you gain more value from the tools. Additionally, low-code/no-code BI tools are putting more capabilities in the hands of business users to tackle complex problems with sophisticated solutions.”
Advances in GenAI are rapidly democratizing access to data analytics, industry experts concurred. “But a key challenge remains,” Minnick cautioned. “If AI doesn’t understand the underlying data, it doesn’t understand the uniqueness of a company’s domain. For example, AI may not understand the organization’s specific product lines or that its fiscal year happens to begin in February instead of January.”
As a result, “the LLM will hallucinate and make up inaccurate responses,” Minnick continued. Natural language processing and democratized data access may mitigate this risk. “With a data intelligence approach, teams can simply talk to their organization’s data and get a high-quality response. Knowledge workers could converse with their data to ask questions and discover insights on their own. With a natural language interface, combined with AI that truly understands the user’s unique data, deep technical knowledge or SQL skills are no longer a requirement.”
Data analytics—fueled by AI—“now extends well beyond data scientists and specific business functions,” Flynn said. “Analytics are now an executive-level priority and essential for staying competitive and unlocking the benefits of enhanced operational efficiency, innovation, customer experience, and workforce productivity.”
However, a risk in democratizing data is that “too often it results in a flood of dirty, unformatted, and ultimately unusable data,” Flynn warned. “All too frequently, I see these datasets hastily uploaded into enterprise systems that don’t align with their users’ intended purpose. The outcome is a familiar one: the same data processed in different ways across the organization to meet varying needs, leading to inefficiencies, errors, and slower time-to-value.”
A well-crafted internal “data products” strategy is essential to “provide seamless access to data services, tailored to the specific needs and workloads of diverse users across the organization,” he added.
There is a significant shift toward democratizing data analytics that will only gain traction over the coming year. “Tools leveraging natural language search and data governance foster a data-driven culture where insights become a shared language across the organization,” said Malakar.

“We recognize that data is a shared asset and that’s essential to every function within our organization,” said Williams. “Whether you’re in claims, finance, or technology, data is a cornerstone of our collective accountability. And thus, every team member becomes a responsible steward of the data we access.”
As end users continue to have more access to the power of data analytics, “it places greater importance on the need for active data literacy, stewardship, and governance programs,” said Williams. “To this end, we’ve instituted an executive education program, complemented by extensive training initiatives organization-wide, to deepen our understanding of data. By demystifying data and going beyond its abstract nature, we empower ourselves to harness it effectively. This commitment ensures that every team member becomes a responsible steward of the data we access.”