The ongoing hubbub surrounding AI and ML promises a wealth of opportunities for modernizing data-driven enterprises. Yet while AI and ML do present exciting possibilities for modernization, successful implementation requires an intelligent mix of the right platforms, tools, and practices.
Experts in AI and ML joined DBTA’s webinar, Enabling Broader AI and ML Analytics Adoption: Data Platforms, Tools, and Practices, to delve into the myriad ways that enterprises seeking to implement these wildly popular technologies can do so with agility, governance, integration, automation, and best practices in mind.
Lucy Zhu, product marketing manager at Snowflake, kicked off the conversation by exploring the great rewards—and simultaneously, the great risks—associated with large language model (LLM) implementation.
According to Zhu, an LLM strategy centers on two questions: how the model is trained and where it is hosted. Currently, many LLMs are trained on internet data and hosted externally. While this is certainly a valid path, Zhu emphasized an area of untapped opportunity: self-hosted LLMs trained solely on an enterprise’s unique proprietary data.
Naturally, there are risk-reward tradeoffs with both of these LLM strategies. Self-hosted, open source LLMs keep data and prompts private and secure, but they introduce a heavy operational burden and limit access to leading models. Externally hosted LLMs, on the other hand, risk exposing data and prompts, yet they require minimal infrastructure operations and grant access to some of the leading third-party LLM models.
Zhu pointed to a way for enterprises to secure an LLM platform without these tradeoffs. With Snowflake, enterprises can do the following, all within Snowflake’s security perimeter:
- Get smarter with your data via Document AI, a purpose-built, multimodal LLM for document intelligence that analyzes documents and surfaces insights as structured data.
- Enhance user productivity with text-to-code, which lets users search and discover answers in natural language, and with semantic search, an AI assistant for the Snowflake Marketplace.
- Bring LLMs directly to your data via secure and customizable third-party LLM hosting and an LLM user interface with Streamlit, which allows businesses to build unique conversational experiences with little code.
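Semantic search of the kind described above is typically implemented by ranking items on embedding similarity to the query. The following is a minimal, library-agnostic sketch, not Snowflake's implementation; the tiny hand-written embedding vectors and listing names are hypothetical stand-ins for what a real embedding model would produce.

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical precomputed embeddings for marketplace listings.
corpus = {
    "weather-data": [0.9, 0.1, 0.0],
    "retail-sales": [0.1, 0.8, 0.2],
    "flight-delays": [0.7, 0.2, 0.3],
}

def semantic_search(query_vec, corpus, top_k=2):
    # Rank listings by similarity to the query embedding.
    ranked = sorted(corpus.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

print(semantic_search([1.0, 0.0, 0.1], corpus))
# → ['weather-data', 'flight-delays']
```

In production, the vectors would come from an embedding model and the ranking from a vector index rather than a full scan, but the query-to-nearest-items flow is the same.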
Dan DeMers, CEO of Cinchy, opened with a succinct argument: Your AI initiatives will fail to scale if you don’t solve for data collaboration.
He further explained that data collaboration addresses scenarios where data product owners and business owners benefit from “coproduction” through federated collaboration. He emphasized that sharing is not collaboration: sending email attachments as a means of sharing and integration lacks the agility of real-time collaboration, as exemplified by Google Drive/Docs.
Data integration is antiquated, according to DeMers: spreadsheets feed into SaaS, legacy, and custom applications, all interconnected by numerous pipelines and underpinned by several advanced analytics engines and governance and control platforms. This architecture is fundamentally inadequate and difficult to scale.
Data collaboration, conversely, simplifies data pipelines among SaaS, custom, and legacy apps through a data collaboration platform, which does the following:
- Liberates data, allowing app owners to make business data available for federation with real-time, 2-way synchronization
- Federates data, where data product owners can create and link data products with granular data access policies
- Introduces true collaboration, as project teams can use data products to access and change data to meet business requirements
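The second bullet above, granular data access policies on federated data products, can be sketched in a few lines. This is a hypothetical illustration, not Cinchy's actual API: the `DataProduct` class, `grant`, and `read` names are invented for the example, which enforces column-level policies per role at read time.

```python
# Hypothetical sketch of a data product enforcing granular,
# column-level access policies. Names are illustrative only.
class DataProduct:
    def __init__(self, rows):
        self.rows = rows          # list of dicts (one per record)
        self.policies = {}        # role -> set of readable columns

    def grant(self, role, columns):
        # Data product owner defines which columns a role may read.
        self.policies[role] = set(columns)

    def read(self, role):
        # Project teams see only the columns their role is granted.
        allowed = self.policies.get(role, set())
        return [{k: v for k, v in row.items() if k in allowed}
                for row in self.rows]

customers = DataProduct([
    {"id": 1, "name": "Acme", "revenue": 120_000},
    {"id": 2, "name": "Globex", "revenue": 95_000},
])
customers.grant("analyst", ["id", "revenue"])
print(customers.read("analyst"))
# → [{'id': 1, 'revenue': 120000}, {'id': 2, 'revenue': 95000}]
```

A real collaboration platform would also propagate writes back to the source applications in real time; the sketch only covers the policy-filtered read path.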
Lalit Ahuja, chief customer and product officer at GridGain, explained the current barriers to enterprise AI implementation, offering the following statistics:
- 15% of enterprises cited a lack of skills and technology understanding
- 38% cited an inability to quantify the business value of AI
- 47% cited challenges with data and technology, where accessibility, quality, processing, and security pose major issues
The AI lifecycle, Ahuja explained, is a circular process that spans data acquisition, modeling, AIOps, and execution, where each of these stages requires significant technological components to implement AI successfully. For example, data acquisition requires efficient data stores, pipelines, ingestion, and visualization, while execution requires in-transaction invocation, event-driven analytics, and data-driven decision-making.
Ahuja narrowed down some key success factors for AI implementation, which include:
- Timely data availability that supports different data types and formats
- Low-latency data processing that unites data at rest, data in motion, and contextualization
- Model training and execution against streaming or transactional data
- AIOps implementation that enables model lifecycle management, model retraining, and model deployment
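The last success factor, AIOps-driven model retraining, usually amounts to watching a model's rolling error on streaming or transactional data and triggering a retraining pipeline when it drifts. A minimal sketch, with hypothetical class and method names and a counter standing in for an actual retraining job:

```python
from collections import deque

class RetrainingMonitor:
    # Hypothetical AIOps component: triggers retraining when the
    # rolling mean absolute error on a stream exceeds a threshold.
    def __init__(self, window=5, threshold=0.5):
        self.errors = deque(maxlen=window)
        self.threshold = threshold
        self.retrain_count = 0

    def observe(self, prediction, actual):
        self.errors.append(abs(prediction - actual))
        full = len(self.errors) == self.errors.maxlen
        if full and sum(self.errors) / len(self.errors) > self.threshold:
            self.retrain()

    def retrain(self):
        # A real platform would launch a training pipeline and
        # redeploy the model; here we just count invocations.
        self.retrain_count += 1
        self.errors.clear()

monitor = RetrainingMonitor(window=3, threshold=0.2)
stream = [(1.0, 1.0), (1.0, 1.1), (1.0, 1.9), (1.0, 2.0), (1.0, 2.1)]
for pred, actual in stream:
    monitor.observe(pred, actual)
print(monitor.retrain_count)
# → 1
```

The windowed average makes the trigger robust to single outliers while still reacting within a few observations once the stream's behavior genuinely shifts.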
Ultimately, Ahuja pointed to GridGain’s unified, real-time data platform as the answer to these various AI adoption barriers, where such a platform “simplifies and optimizes the data architecture for companies that require extreme speed, massive scale, and high availability from their ecosystem.”
GridGain’s platform enables enterprises to successfully expedite the AI lifecycle implementation processes, offering high-speed transaction processing, advanced analytics, ML/AIOps, stream processing, a single-view data hub, and a system of record from a unified pane of glass.
For an in-depth discussion of successful AI and ML adoption featuring live demos, you can view an archived version of the webinar here.