NVIDIA is releasing the NVIDIA GPU Cloud (NGC) container registry for AI developers, giving them a fast way to begin deep learning development.
The cloud-based service is available immediately to users of the just-announced Amazon Elastic Compute Cloud (Amazon EC2) P3 instances featuring NVIDIA Tesla V100 GPUs. NVIDIA plans to expand support to other cloud platforms soon.
After signing up for an NGC account, developers can download a containerized software stack that integrates and optimizes a wide range of deep learning frameworks, NVIDIA libraries and CUDA runtime versions — which are kept up to date and run seamlessly in the cloud or on NVIDIA DGX systems.
“The NVIDIA GPU Cloud democratizes AI for a rapidly expanding global base of users,” said Jim McHugh, vice president and general manager of Enterprise Systems at NVIDIA. “NGC frees developers from the complexity of integration, allowing them to move quickly to create sophisticated neural networks that deliver the transformative powers of AI.”
Developers who want to get started with deep learning right away using the NGC container registry can follow a simple three-step process: sign up for a free NGC account, deploy a supported NVIDIA GPU instance in the cloud or on a DGX system, and pull the desired framework container from the registry.
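Assuming Docker and the NVIDIA container tooling are already installed on the GPU instance, the getting-started flow might look like the sketch below. The framework image and version tag shown here are illustrative; check the registry for the containers and tags actually available to your account.

```shell
# Authenticate against the NGC container registry.
# NGC uses the literal username "$oauthtoken"; the password is the
# API key generated from your NGC account.
docker login nvcr.io

# Pull a GPU-accelerated framework container (tag is illustrative).
docker pull nvcr.io/nvidia/tensorflow:17.10

# Launch an interactive session inside the container with GPU access.
nvidia-docker run -it --rm nvcr.io/nvidia/tensorflow:17.10
```

Because the frameworks, NVIDIA libraries, and CUDA runtime ship inside the container, the same image runs without modification on a cloud GPU instance or an NVIDIA DGX system.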
Key benefits of the NGC container registry include instant access to the most widely used GPU-accelerated frameworks, maximum deep learning performance, pre-integrated software, and containers that are kept up to date.
For more information about this news, visit www.nvidia.com.