Embedding AI into the Modern Business at Data Summit 2023


Embedding AI into the business has the attention of many enterprises, though how to do so in a way that enables agility and scalability while remaining cost-effective is still an open question.

Kaladhar Voruganti, senior technologist in the office of the chief revenue officer at Equinix, and Rory Kelleher, director of global business development for healthcare at NVIDIA, led the Data Summit session "Succeeding With AI in the Cloud" to discuss strategies and methods for effectively utilizing AI within business operations.

The annual Data Summit conference returned to Boston, May 10-11, 2023, with pre-conference workshops on May 9.

Most companies' AI initiatives consist of many steps, Voruganti explained, which include the following (a rough sketch of such a pipeline follows the list):

  • Data ingestion
  • Data processing and curation
  • AI model inference and AI model training
  • Data warehouse and databases
  • Outcomes
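
To make those stages concrete, here is a minimal, self-contained Python sketch of such a pipeline. The stage names mirror Voruganti's list; the toy data, the simple threshold "model," and all function names are hypothetical illustrations rather than anything presented in the session, and a real deployment would use dedicated ingestion, curation, training, and serving infrastructure.

# Illustrative sketch of the pipeline stages listed above.
# All names and the toy "model" are hypothetical.
import sqlite3
import statistics


def ingest() -> list[dict]:
    """Data ingestion: pull raw records from a source (hard-coded here)."""
    return [
        {"sensor_id": 1, "reading": 20.5},
        {"sensor_id": 2, "reading": None},   # incomplete record
        {"sensor_id": 3, "reading": 21.1},
        {"sensor_id": 4, "reading": 54.0},
    ]


def curate(records: list[dict]) -> list[dict]:
    """Data processing and curation: drop incomplete records."""
    return [r for r in records if r["reading"] is not None]


def train(records: list[dict]) -> dict:
    """AI model training: here, just learn a mean/stdev threshold."""
    readings = [r["reading"] for r in records]
    return {"mean": statistics.mean(readings),
            "stdev": statistics.pstdev(readings)}


def infer(model: dict, records: list[dict]) -> list[dict]:
    """AI model inference: flag readings far from the learned mean."""
    return [
        {**r, "anomaly": abs(r["reading"] - model["mean"]) > 2 * model["stdev"]}
        for r in records
    ]


def store(results: list[dict]) -> None:
    """Data warehouse and databases: persist scored results for outcomes."""
    with sqlite3.connect(":memory:") as db:
        db.execute("CREATE TABLE outcomes (sensor_id INT, reading REAL, anomaly INT)")
        db.executemany(
            "INSERT INTO outcomes VALUES (:sensor_id, :reading, :anomaly)", results
        )
        # Outcomes: downstream consumers would query this table.
        print(db.execute("SELECT * FROM outcomes").fetchall())


if __name__ == "__main__":
    curated = curate(ingest())
    store(infer(train(curated), curated))
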

Voruganti noted that AI model training and model inference are the largest drains on an organization's finances, posing a challenge for any enterprise in the current economic climate.

“Historically, AI has been done mostly in a central location. However, this is changing; we are seeing inference being pushed to the edge,” explained Voruganti. Enterprises are moving AI to the edge for use cases involving large datasets, latency requirements, and privacy concerns, as well as for cloud placement preferences.

Voruganti offered a statistic: 75% of enterprise-generated data will be processed outside the cloud by 2025. He further emphasized that “if data is being generated outside of the cloud, process it outside of the cloud. It is too expensive to migrate data to the cloud if it's being generated elsewhere.”

Kelleher continued the conversation by focusing on AI/ML trends that may impact the way organizations implement these technologies for positive business outcomes. The main trends were boiled down to the following:

  • AI requires full stack computing, where systems are integrated and work in concert with one another
  • AI requires access to various data types and sources, which puts a premium on security, compliance, and privacy
  • AI leverages large language models (LLMs) and transformer AI

He further explained that AI is moving from centralized to distributed hybrid cloud architectures, where workloads move to the edge for latency and availability benefits, data is drawn from multiple external sources, and architectures span multiple platforms, including a mix of SaaS, PaaS, and data lakes.

Many Data Summit 2023 presentations are available for review at https://www.dbta.com/DataSummit/2023/Presentations.aspx.

