ServiceNow, Hugging Face, and NVIDIA Collaborate to Create Open-Access LLMs for Building Enterprise Applications


ServiceNow, Hugging Face, and NVIDIA are releasing StarCoder2, a family of open-access large language models for code generation, aiming to set new standards for performance, transparency, and cost-effectiveness.

“Since every software ecosystem has a proprietary programming language, code LLMs can drive breakthroughs in efficiency and innovation in every industry,” said Jonathan Cohen, vice president of applied research at NVIDIA. “NVIDIA’s collaboration with ServiceNow and Hugging Face introduces secure, responsibly developed models and supports broader access to accountable generative AI that we believe will benefit the global community.”

StarCoder2 was developed in partnership with BigCode, a community project managed by ServiceNow and Hugging Face in which the machine learning community collaborates on models, datasets, and applications, according to the vendors.

Trained on 619 programming languages, StarCoder2 can be further trained and embedded in enterprise applications to perform specialized tasks such as application source code generation, workflow generation, text summarization, and more.

Developers can use its code completion, advanced code summarization, code snippets retrieval, and other capabilities to accelerate innovation and improve productivity, according to the companies.
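As a rough illustration of the code-completion use case, the sketch below loads a StarCoder2 checkpoint through the Hugging Face Transformers library and asks it to finish a partially written Python function. The model ID bigcode/starcoder2-3b and the generation settings are assumptions made for this example, not details from the announcement.

```python
# Minimal sketch of code completion with a StarCoder2 checkpoint.
# Assumes the bigcode/starcoder2-3b model ID on the Hugging Face Hub
# and the standard transformers text-generation APIs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bigcode/starcoder2-3b"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# A partial function the model should complete.
prompt = 'def fibonacci(n):\n    """Return the n-th Fibonacci number."""\n'

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```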

StarCoder2 offers three model sizes: a 3-billion-parameter model trained by ServiceNow; a 7-billion-parameter model trained by Hugging Face; and a 15-billion-parameter model built by NVIDIA with NVIDIA NeMo and trained on NVIDIA accelerated infrastructure.

According to the vendors, the smaller variants deliver strong performance at lower compute cost, since fewer parameters mean less computation at inference time. The new 3-billion-parameter model matches the performance of the original 15-billion-parameter StarCoder model.

“StarCoder2 stands as a testament to the combined power of open scientific collaboration and responsible AI practices with an ethical data supply chain,” said Harm de Vries, lead of ServiceNow’s StarCoder2 development team and co-lead of BigCode. “The state-of-the-art open-access model improves on prior generative AI performance to increase developer productivity and provides developers equal access to the benefits of code generation AI, which in turn enables organizations of any size to more easily meet their full business potential.”

StarCoder2 models share a state-of-the-art architecture and carefully curated data sources from BigCode that prioritize transparency and open governance to enable responsible innovation at scale. 

StarCoder2 advances the potential of future AI-driven coding applications, including text-to-code and text-to-workflow capabilities, according to the companies.

With broader and deeper training on programming data, StarCoder2 can draw on repository context, enabling accurate, context-aware predictions. These advancements serve seasoned software engineers and citizen developers alike, accelerating business value and digital transformation, according to the companies.

ServiceNow’s text-to-code Now LLM was purpose-built on a specialized version of the 15-billion-parameter StarCoder LLM, fine-tuned and trained for its workflow patterns, use cases, and processes. Hugging Face has also used the model to create its StarChat assistant.

All StarCoder2 models will also be available for download from Hugging Face, and the StarCoder2 15-billion-parameter model is available on NVIDIA AI Foundation models for developers to experiment with directly from their browser, or through an API endpoint.
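For readers who want to try a checkpoint locally, the lines below show one way to fetch a StarCoder2 model from the Hugging Face Hub with the huggingface_hub client. The repository name is an assumption based on the BigCode organization page linked below, and the full 15-billion-parameter checkpoint requires substantial disk space.

```python
# Sketch: download a StarCoder2 checkpoint from the Hugging Face Hub.
# The repo ID bigcode/starcoder2-15b is assumed; swap in the 3B or 7B
# variants if local resources are limited.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="bigcode/starcoder2-15b")
print(f"Model files downloaded to: {local_dir}")
```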

For more information about this news, visit https://huggingface.co/bigcode.

