GMI Cloud, a fast-rising provider of GPU-as-a-Service infrastructure purpose-built for AI, announced it is one of the first GPU cloud providers to contribute to NVIDIA DGX Cloud Lepton, a recently announced AI platform and marketplace designed to connect developers with global compute capacity.
As an NVIDIA Cloud Partner, GMI Cloud will bring high-performance GPU infrastructure, including NVIDIA Blackwell and other leading architectures, to NVIDIA DGX Cloud Lepton.
According to the vendors, this integration gives developers access to GMI Cloud’s globally distributed infrastructure, supporting everything from low-latency real-time inference to long-term, sovereign AI workloads.
DGX Cloud Lepton addresses a critical challenge for developers: securing reliable, high-performance GPU resources at scale, NVIDIA said. The platform meets this need by unifying the development, training, and deployment of AI in one place.
The platform integrates directly with NVIDIA’s software stack, including NVIDIA NIM microservices, NVIDIA NeMo, NVIDIA Blueprints, and NVIDIA Cloud Functions, to make the journey from prototype to production faster and more efficient.
GMI Cloud is contributing to DGX Cloud Lepton by offering:
- Direct access to NVIDIA GPU clusters optimized for cost, scale, and performance
- Strategic regional availability to meet compliance and latency needs
- Full-stack infrastructure ownership that allows GMI Cloud to deliver highly competitive economics to its customers
- Fast deployment pipelines powered by a robust toolchain and NVIDIA’s integrated software stack
“DGX Cloud Lepton reflects everything we believe in at GMI Cloud: speed, sovereignty, and scale without compromise,” said Alex Yeh, CEO of GMI Cloud. “We built our infrastructure from the silicon up to help developers build AI without limits. This partnership accelerates that vision.”
For more information about this news, visit www.gmicloud.ai.