Domino Data Lab, the enterprise MLOps platform company, is announcing updates to its platform that broaden access to open source tools and techniques, including Ray 2.0, MLflow, and the Feast feature store for machine learning (ML), helping enterprises see tangible value from their AI sooner. The announcement is accompanied by the launch of Domino Cloud, a fully managed MLOps platform-as-a-service, and the general availability of Domino's hybrid and multi-cloud Nexus capability.
With new support for Ray 2.0, an open source distributed compute framework, Domino's platform accelerates the development and training of generative AI models at scale. Development is further streamlined by Domino's auto-scaling compute clusters, which pair data preparation via Apache Spark with ML and deep learning in PyTorch, TensorFlow, and XGBoost.
“The incorporation of on-demand, auto-scaling clusters and Ray 2.0 support in Domino’s spring release accelerates both development and data preparation for teams at any scale. Ray speeds up the process by providing a unified, distributed compute framework that makes it easy to scale AI and Python workloads—from reinforcement learning to deep learning to model tuning,” explained Chris Lauren, SVP of product at Domino Data Lab. “This single-platform integration enables data scientists to be more productive by streamlining data preparation and model training from end to end.”
The MLflow integration targets ML lifecycle management, enabling data scientists to track, reproduce, and share experiments and artifacts more easily, directly within their Domino projects. Domino's security protocols are maintained across artifacts, metrics, and logs.
“As data scientists iteratively explore which new breakthroughs in algorithms, fine-tuning foundational models, and tuning hyperparameters yield the best results, it’s important to track their progress in a consistently sharable way for model review and audits,” said Lauren. “Our customers can now leverage MLflow to automatically log key metrics and artifacts that help them manage experimentation at scale, streamline their work, and increase collaboration with other team members, team leads, or auditors.”
Native integration of Feast within Domino streamlines how data scientists query and transform ML features. This introduces cost-saving, reusable feature logic across data science projects, traces feature lineage, and ensures data accuracy and security.
The launch of Domino Cloud focuses on accelerating time-to-value for AI projects with scalable resources and a secure, governed, enterprise-grade platform, according to the company. The platform requires no setup or management investment, so data science teams can stay focused on higher-value work. The solution lets teams do more with less, reducing operational burden; customers pay only for the compute they use and can access GPUs and distributed compute frameworks.
“In contrast to more limited fully managed data science platforms offered by cloud providers, Domino Cloud allows teams to integrate workflows and accelerate the full lifecycle from experiment to production. This means that teams can use end-to-end workflows with common patterns and practices, even if different users have different tool preferences,” said Lauren. “In this way, Domino Cloud is ideal for teams with an urgent need to scale AI while maintaining full access to the complete ecosystem of professional data science tools and scalable infrastructure they need to drive immediate business impact.”
Domino is additionally announcing the general availability of Domino Nexus, built for enterprises whose generative AI projects demand complex accelerated computing across hybrid and multi-cloud environments. Workloads can be deployed anywhere from data centers to the edge, giving enterprises seamless workload migration.
As a member of the NVIDIA AI Accelerated program, Domino assures enterprise customers that its platform supports the latest GPU and DPU technologies.
Further easing cost pressures, Domino Nexus has also been validated on Vultr, enabling Nexus customers to burst to Vultr Cloud with virtualized, fractional NVIDIA A100 Tensor Core GPUs.
“Our partnership with Vultr Cloud provides an exciting new opportunity for our customers to access the latest NVIDIA GPUs at a competitive price point. This further increases customers’ flexibility and choice in their hybrid and multi-cloud data science strategies,” said Lauren. “By offering more options from companies like Vultr, we're continuously evolving Nexus to support our customers' changing needs and helping them streamline their migration to the cloud or hybrid/multi-cloud strategies.”
This Vultr infrastructure, working in combination with the NVIDIA NGC catalog and the NVIDIA AI Enterprise software suite, helps enterprises reduce costs while driving innovative generative AI projects.
The Spring release of Domino's platform (Domino 5.5), along with Domino Cloud and Domino Nexus, is available today. Integrations with MLflow and the Feast feature store are available in preview. The Domino and Vultr solution will be released later this year.
“Domino’s Spring ‘23 release unveils powerful new capabilities which give every enterprise access to cutting-edge, open source tools and techniques to achieve real business value from AI in a fraction of the time,” concluded Lauren. “No company, regardless of scale, needs to be without the tools to unlock new insights and capabilities and to drive innovation through new AI-infused products and services that were previously beyond reach.”
For more information about this news, visit www.dominodatalab.com.