Kinetica Integrates with Native LLM for Accessible, Secure, and Optimized SQL Queries


Kinetica, the speed layer for generative AI and real-time analytics, is integrating a native large language model (LLM) that lets users perform ad-hoc, natural-language analysis on real-time, structured data, with security as a central design goal.

Public LLMs—such as OpenAI’s GPT-3.5—introduce a variety of privacy and security concerns for enterprises dealing with highly sensitive data. Kinetica’s native LLM, dubbed SQL-GPT, is engineered to keep data within the customer’s environment and network perimeter—no external API call required.

In addition to its security advantage, SQL-GPT is built around industry-specific vocabulary—such as that of telecommunications, financial services, automotive, and logistics—so that SQL generation is more reliable and accurate for business users. Its reach also extends beyond standard SQL: the LLM can handle time-series, graph, and spatial analytics tasks to support decision-making.
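To illustrate the kind of schema-aware, domain-aware prompting a text-to-SQL system relies on, here is a minimal sketch in Python. The prompt format, table names, and helper function are hypothetical—this is not Kinetica's actual API, just a generic illustration of embedding schema and domain context so the model emits grounded SQL.

```python
# Hypothetical sketch of schema-aware text-to-SQL prompting.
# All names here (build_sql_prompt, the telecom schema) are illustrative.

def build_sql_prompt(schema: dict[str, list[str]], question: str) -> str:
    """Embed table definitions and the user's question into an LLM prompt."""
    ddl = "\n".join(
        f"CREATE TABLE {table} ({', '.join(cols)});"
        for table, cols in schema.items()
    )
    return (
        "You are a SQL assistant. Use only the tables defined below.\n"
        f"{ddl}\n"
        f"Question: {question}\n"
        "SQL:"
    )

# Example with a telecom-flavored schema (made up for illustration).
schema = {
    "cell_towers": ["tower_id INT", "lat DOUBLE", "lon DOUBLE", "dropped_calls INT"]
}
prompt = build_sql_prompt(schema, "Which towers had the most dropped calls?")
print(prompt)
```

Grounding the prompt in the actual table definitions is one common way such systems reduce hallucinated column names and keep generated SQL executable.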

“Kinetica has led the market with next-level capabilities for analyzing sensor and machine data with our vectorized, real-time analytic database,” said Nima Negahban, co-founder and CEO of Kinetica. “With the integration of SQL-GPT, we extend this capability to an entirely new horizon, empowering organizations to unleash the true potential of their real-time, structured data like never before.”

Kinetica uses a fine-tuned approach that optimizes SQL generation for consistency and accuracy. Unlike approaches that favor creativity—and, in turn, unpredictability—Kinetica prioritizes repeatable, reliable SQL query outcomes, according to the company.

“At Kinetica, we believe in fostering openness and embracing the diversity of generative AI models,” said Amit Vij, co-founder and president of Kinetica. “We expect there will be different LLM platforms that emerge, and we want to provide our customers with choice. While currently supporting two models, our commitment lies in continuously expanding our offerings to accommodate client-driven preferences and seamlessly integrate with a wide array of future models. Toward that end, Kinetica will roll out integration with other LLM platforms like NVIDIA NeMo later this year for language to SQL as new state-of-the-art models emerge.”

To learn more about Kinetica and SQL-GPT, please visit https://www.kinetica.com/.

