With the ever-growing volumes of data that every enterprise must contend with, adopting the latest cloud platforms and analytics tools seems a logical step in the modernization process. Yet, despite these tools, organizations far and wide struggle to truly unlock the value of all their data.
Experts in next-generation solutions and architectures joined DBTA’s webinar, The Race to Unified Analytics: Next-Gen Data Platforms and Architectures, to examine both the business potential of new technologies and the common hurdles—such as data silos, complex data and analytics systems, lingering legacy technologies, integration, and governance and security—that prevent the fruition of a modern, data-accessible enterprise.
Paige Roberts, open source relations manager at OpenText Analytics and AI, aimed the discussion toward the importance of unifying business intelligence (BI) and data science through a next-generation data lakehouse.
Roberts explained that there are two common enterprise strategies for analytics: a data warehouse for BI, and a data lake for machine learning (ML) and AI. The two can be adopted side by side as separate entities, but doing so incurs an expensive, complicated tech stack, low data accessibility, a lack of data trust, slow delivery of ML projects, and more.
The remedy for this challenge, Roberts argued, is the data lakehouse, where a high-performance, high-concurrency analytics engine meets affordable, highly scalable storage for any kind of data.
OpenText Vertica offers smarter analytics for any data, anywhere, by integrating a data warehouse and a data lake. With isolated compute workloads, OpenText Vertica lets enterprises run many jobs at once without slowing other analysts’ queries or workloads. This solution also offers:
- BI and ML processes in a single place
- Rapid ML project production
- ML functions using SQL or Python, as well as R for BI functions
- Powerful analytics application builds with BI and AI capabilities
- Efficient infrastructure usage
- Fast data access that’s always up-to-date
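The “BI and ML processes in a single place” idea can be reduced to a minimal sketch: one store answers an aggregate BI-style query and also supplies training data for a simple model. In the sketch below, Python’s built-in sqlite3 merely stands in for the analytics engine, and the table and column names are invented for illustration; Vertica’s actual in-database ML is exposed through SQL functions, which this sketch does not reproduce.

```python
# One store serves both a BI aggregate and an ML training set.
# sqlite3 is a stand-in for a lakehouse analytics engine.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, ad_spend REAL, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("east", 10.0, 25.0), ("east", 20.0, 45.0),
     ("west", 30.0, 65.0), ("west", 40.0, 85.0)],
)

# BI: an aggregate report straight from SQL.
report = conn.execute(
    "SELECT region, SUM(revenue) FROM sales GROUP BY region ORDER BY region"
).fetchall()

# ML: fit a least-squares line revenue ~ a * ad_spend + b on the same table,
# with no data movement to a separate ML system.
rows = conn.execute("SELECT ad_spend, revenue FROM sales").fetchall()
n = len(rows)
sx = sum(x for x, _ in rows)
sy = sum(y for _, y in rows)
sxx = sum(x * x for x, _ in rows)
sxy = sum(x * y for x, y in rows)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
b = (sy - a * sx) / n                           # intercept
```

The point of the sketch is the workflow, not the math: both workloads read the same governed tables, which is what removes the copy-and-sync step between a warehouse and a lake.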
Stephen Darlington, principal consultant at GridGain, emphasized that technology trends and the demand for data have led to a variety of architectural issues, including siloed data, latency, limited scalability, and the need for high-performance compute. While there are solutions that solve one or two of these challenges, rarely does one solve them all.
Fortunately, GridGain has seized the opportunity to innovate on this technological gap with the GridGain Unified Real-Time Data Platform, a platform designed to correct multi-dimensional data problems. The solution tackles numerous challenges facing the modern enterprise, offering:
- High-speed transaction processing with an in-memory database and data grid
- Execution of advanced analytics
- Continuous training and learning of AI and ML models
- Single view of data through the data hub
- System of record for managing and storing data
- Stream processing for real-time data
The use cases for GridGain’s platform are numerous, ranging from real-time risk management to smart decisioning, real-time transactional analytics, high-performance OLTP, and more. Darlington highlighted that, with its unmatched speed and scale, support for both transactional and analytical workloads, and cloud-native architecture, GridGain’s data platform can be the key to efficiently—and successfully—modernizing.
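The in-memory data grid pattern underlying the bullets above can be illustrated conceptually: a fast in-memory cache sits in front of a slower system of record, loading entries on a miss and writing through on updates. This is a generic sketch in plain Python, not GridGain’s API, and the account data is invented for illustration.

```python
# Conceptual in-memory data grid: read-through on a miss,
# write-through on updates, with the system of record kept in sync.
class ReadThroughCache:
    def __init__(self, backing_store):
        self.store = backing_store  # system of record (slow, durable)
        self.cache = {}             # in-memory copy (fast, volatile)
        self.misses = 0

    def get(self, key):
        if key not in self.cache:   # miss: load from the system of record
            self.misses += 1
            self.cache[key] = self.store[key]
        return self.cache[key]      # hit: served entirely from memory

    def put(self, key, value):
        self.cache[key] = value     # write-through keeps both copies in sync
        self.store[key] = value

accounts = {"acct-1": 100, "acct-2": 250}  # stand-in system of record
grid = ReadThroughCache(accounts)
balance = grid.get("acct-1")  # first read loads from the store
balance = grid.get("acct-1")  # repeat read is served from memory
```

In a real data grid the cache is partitioned and replicated across a cluster, which is where the speed and scale claims come from; this single-process version only shows the read/write-through contract.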
According to Brett Hansen, chief growth officer at Semarchy, culture remains the key impediment to data transformation initiatives. Whether it takes the form of evolving business processes, outdated organizational structures, or natural human resistance to change, this cultural barrier can be the main inhibitor of modern enterprise success.
Hansen also pointed to the common success factors associated with delivering value in data products, which include:
- A measurable business objective
- Single domain focus
- Great user experience
- Quick value delivery
- Ability to adapt and evolve to business requirements
Despite this clear outline of a successful data product, 90% of businesses fail when first attempting to implement and maintain a master data management (MDM) project, according to a 2021 Gartner report.
In response to both the cultural barriers and the success factors for data products, Hansen recommended the Semarchy Unified Data Platform, which orchestrates and masters data through a no-code, business-driven configuration approach to rapidly generate custom applications and deliver high-quality records across an organization.
Deployable on-prem, in the cloud, or in hybrid environments, Semarchy’s platform excels in ease of use, seamless scalability, reduced TCO, and rapid time-to-value. Addressing both cultural barriers and common MDM pitfalls, Hansen emphasized that Semarchy’s platform can make solving business problems as simple as defining them.
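The mastering problem Hansen describes can be boiled down to a toy sketch: duplicate records about the same customer arrive from several systems, and a “golden record” keeps, per field, the most recently updated non-null value. This is one simple survivorship rule written in plain Python for illustration; in Semarchy’s platform the matching and survivorship logic is configured, not hard-coded, and the record fields below are invented.

```python
# Toy MDM survivorship: merge duplicate records into one golden record,
# letting the most recently updated non-null value win per field.
def golden_record(records):
    """records: dicts describing the same entity, each with an
    "_updated" timestamp used to order survivorship."""
    merged = {}
    for rec in sorted(records, key=lambda r: r["_updated"]):
        for field, value in rec.items():
            if field != "_updated" and value is not None:
                merged[field] = value  # later non-null values overwrite
    return merged

crm = {"customer_id": "C42", "email": "a@old.example",
       "phone": None, "_updated": 1}
erp = {"customer_id": "C42", "email": "a@new.example",
       "phone": "555-0100", "_updated": 2}
master = golden_record([crm, erp])
```

Note how the merged record takes the newer email from the second source but keeps the phone number the first source lacked; choosing and tuning rules like this per domain is where the “single domain focus” success factor comes in.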
To view a detailed examination of next-gen data solutions and challenges, an archived version of this webinar is available here.