Braving the AI Era: Tech and Strategies for Surfacing Tangible AI Value


The AI era in which an array of industries suddenly finds itself is overflowing with lofty promises of greater data efficiency, agility, and innovation. Turning those promises into tangible reality, however, is the foremost challenge facing enterprises, which are at various stages of their IT modernization projects.

Experts in data and AI joined DBTA's webinar, Modernizing Your Data Management Strategy for the AI Era, to examine how IT leaders and data management teams can effectively implement advanced technologies in cloud computing, analytics, and AI as their data ecosystems grow in size and complexity.

Jon Osborn, field CTO at Ascend.io, noted that roughly 80% of AI projects fail to produce business value. With that in mind, he explained that data maturity is a necessary prerequisite for any organization attempting to integrate AI and LLM technologies into its existing data infrastructure.

While most organizations want to believe they are ready for AI, their data maturity is often not at the appropriate stage. If data maturity is low, enterprises will be unable to determine how an AI/ML initiative will impact the business, an assessment that is crucial when embarking on an AI journey.

“The model itself is not the only outcome, and that’s not your actual business product,” said Osborn. “Your business product has to be assembled from other data that you’re ingesting and combining with the outcomes of your models to actually produce something valuable that your business partners can use.”

Operationalizing this business product at scale is where Ascend's platform steps in, according to Osborn. Ascend enables organizations delivering pipelines with AI/ML models to do so with speed and efficiency, ultimately shortening the time it takes to achieve business value from these initiatives.
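To picture Osborn's point that the model output alone is not the business product, here is a minimal sketch in plain pandas (not Ascend's platform or API; all column names and thresholds are hypothetical) showing model outcomes being joined with other ingested data to produce something a business partner can actually use:

```python
# Minimal sketch: the model score alone is not the business product;
# it must be combined with other ingested data. Generic pandas, not
# Ascend's API; all names and thresholds are hypothetical.
import pandas as pd

# Ingested reference data (e.g., from an upstream pipeline stage).
customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "segment": ["enterprise", "smb", "smb"],
})

# Raw model outcomes, e.g., churn-risk scores from an ML model.
scores = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "churn_risk": [0.12, 0.87, 0.45],
})

# The "business product": model outcomes joined with business context
# and shaped into something a partner team can act on directly.
product = customers.merge(scores, on="customer_id")
product["action"] = product["churn_risk"].apply(
    lambda r: "outreach" if r > 0.5 else "monitor"
)
print(product)
```

In a real pipeline, each of these frames would arrive from an upstream ingestion stage rather than being defined inline.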

Dan DeMers, chief simplification officer at Cinchy, explained that Cinchy’s “unique philosophy is that as we introduce technology innovation—things like AI, generative AI, ChatGPT—it better remove more complexity than it adds—and that’s not how it used to work.”

“Our purpose is to change that by enabling simplifying technology as technology innovation comes to life,” said DeMers.

While AI has certainly changed the tech landscape, the barriers to its success—integration and governance complexity—remain prevalent. According to DeMers, these barriers highlight the need for a simpler, more scalable approach to enterprise data management.

Collaboration technology is the key to making workflows, and the data they handle, simple. This in turn unlocks AI-enabled data collaboration at scale, which federates and liberates data for the modern enterprise.

The Cinchy platform is purpose-built for data products that are collaborative (co-produced across teams, legacy and new platforms, or organizations) and dynamic (built on data models that change over time). The simplification of data collaboration, according to DeMers, is the foundation for delivering innovation in the era of AI.

According to John de Saint Phalle, senior product manager at Precisely, data is still fueling the entire business.

“In the age of AI, data is still the coin of the realm,” he explained. “Data is still the currency by which we all operate. Those who are ‘wealthy’—and have troves of data to work with—are going to be much more successful and have much more means when we talk about these specific projects.”

However, it can’t be just any data pulled from a mass of sources; trusted data, de Saint Phalle remarked, should be the number one priority when attempting to implement AI projects.

While these projects may sound simple when described, asking questions of proprietary data in natural language, for instance, they depend on a host of different processes and operations falling perfectly into place. Trusted data, ultimately, is the key to making that happen.

As data journeys grow more complex, compounded by the heavy operational burdens that innovation incurs, Precisely produces trusted data that is accurate, consistent, and contextual, helping enterprises cultivate a proper foundation for any AI project.

Helping customers at every step of their data integrity journey, from integrating data and enforcing data quality to applying tight governance standards and monitoring data over time for potential dips in integrity, Precisely empowers organizations to embark on AI initiatives with a robust data foundation in place.
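As a loose illustration of what monitoring data over time for dips in integrity can mean in practice, here is a minimal sketch in plain pandas (not Precisely's tooling; the threshold and file name are hypothetical) that flags columns whose null rate has drifted past a limit:

```python
# Loose illustration of monitoring for "dips in integrity" across
# loads; plain pandas, not Precisely's products, and the threshold
# and input file are hypothetical.
import pandas as pd

def integrity_report(df: pd.DataFrame, max_null_rate: float = 0.05) -> dict:
    """Flag columns whose null rate exceeds a simple threshold."""
    null_rates = df.isna().mean()  # per-column fraction of missing values
    return {
        "rows": len(df),
        "failing_columns": null_rates[null_rates > max_null_rate].to_dict(),
    }

# Comparing reports across daily loads surfaces sudden integrity dips.
df = pd.read_csv("daily_load.csv")  # hypothetical input file
print(integrity_report(df))
```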

Yori Lavi, field CTO at SQream, echoed de Saint Phalle's concerns, remarking that the quality of the foundations and infrastructure from which enterprises launch AI initiatives is critical to those projects' success.

“At the end of the day, if the underlying infrastructure that processes all the data is not able to do this at scale, then everything stalls,” said Lavi.

When SQream asked its customers what they needed, one large challenge that emerged was actually becoming a data-driven organization. Despite their investments in this arena, these organizations find transforming raw data into actionable decisions a major obstacle.

SQream's value proposition is leveraging GPUs to process data faster than other, as Lavi put it, more incumbent methods. Using GPUs to achieve higher throughput while driving greater data democratization gives data scientists more synergy in their workflows; it also improves yield and reduces runtime, costs, and data loading time.

“This works in the development phase of machine learning,” said Lavi. “If you can get 5x more productivity out of the same data scientist because he’s not waiting for someone else...he can just do it [data processing] from his laptop, and it just works.”
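SQream itself is accessed through SQL, and no code was shown in the webinar; purely to illustrate the general pattern of GPU-accelerated data processing that Lavi describes, here is a minimal sketch using the open source RAPIDS cuDF library (an assumption chosen for illustration, not SQream's product), where a data scientist runs an aggregation on the GPU directly from a laptop-style workflow:

```python
# Minimal sketch of GPU-accelerated dataframe processing, using the
# open source RAPIDS cuDF library purely as an illustration; SQream's
# own engine is accessed via SQL and is not shown here.
import cudf  # GPU dataframe library (requires an NVIDIA GPU)

# Load a large CSV directly into GPU memory (file path is hypothetical).
df = cudf.read_csv("events.csv")

# Filtering and aggregation run on the GPU end to end; results stay
# in GPU memory until they are needed on the host.
summary = (
    df[df["status"] == "complete"]
      .groupby("customer_id")
      .agg({"amount": "sum", "event_id": "count"})
)

print(summary.head())
```

The appeal of the pattern is that the whole filter-and-aggregate step stays in GPU memory, which is where throughput gains of the kind Lavi describes come from.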

For an in-depth review of strategies and technologies for effectively braving the AI era, you can view an archived version of the webinar here.

