CircleCI, a continuous integration and continuous delivery (CI/CD) platform, announced that it has implemented a gen2 GPU resource class built on Amazon Elastic Compute Cloud (Amazon EC2) G5 instances, offering the latest generation of NVIDIA GPUs, along with new images tailored for artificial intelligence/machine learning (AI/ML) workflows.
According to the company, these under-the-hood enhancements put cost-effective, powerful resources at developers' fingertips to accelerate AI innovation.
“Software teams are building the next wave of AI-powered applications that solve specific customer pain points,” said Rob Zuber, CTO at CircleCI. “While many teams find it difficult to get started, at the end of the day, we’re still building software. You already have 95% of the tools needed to do it. By supporting AI product builders with CircleCI’s comprehensive CI/CD tooling, engineering teams can confidently build upon years of key learnings while also addressing the novel changes AI introduces.”
Additionally, CircleCI launched key new features for teams building Large Language Model (LLM)-powered applications: inbound webhooks, including support for tools like Hugging Face, and an integration with evaluation platform LangSmith.
The company also released a CircleCI Orb for Amazon SageMaker to help software teams deploy and monitor ML models at scale.
With its new inbound webhooks and evaluation platform integrations, CircleCI is redefining what CI/CD tools can do to manage the novel complexity that AI/ML introduces, so that teams can confidently go from idea to innovation, according to the company.
CircleCI’s latest features are pivotal in helping teams manage the complexity of developing AI-powered applications by providing a structured, automated pipeline that spans building and testing through training and monitoring.
Its new inbound webhooks are adaptable to any source of change, making it the most change-agnostic CI/CD tool on the market, the company says. This also marks a necessary departure from a version control-centric approach, allowing users to trigger pipelines from a variety of sources.
This shift is crucial as AI-powered applications live outside the repository, with code, data, and the LLM all interacting to drive novel product experiences for end consumers. Therefore, engineering teams must rethink how they test, release, and retrain their applications, according to the company.
The custom Hugging Face integration provides a new trigger for kicking off pipelines: developers can now run automated workflows any time a model on Hugging Face changes, giving engineering teams confidence that their application continues to behave as expected.
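As an illustration of what triggering a pipeline from outside version control can look like, the sketch below calls CircleCI's v2 API to start a pipeline in response to an external event such as a model update. The project slug, token environment variable, and `trigger-source` parameter are placeholders for this example rather than details confirmed by the announcement; inbound webhook URLs themselves are provisioned in CircleCI's project settings.

```python
"""Sketch: starting a CircleCI pipeline from a non-VCS event.

Uses the CircleCI v2 pipeline-trigger endpoint; PROJECT_SLUG and the
`trigger-source` pipeline parameter are illustrative placeholders.
"""
import json
import os
import urllib.request

CIRCLECI_API = "https://circleci.com/api/v2"
PROJECT_SLUG = "gh/example-org/example-repo"  # placeholder project

def build_trigger_payload(branch: str, source: str) -> dict:
    """Build the JSON body for a pipeline trigger.

    `source` is passed through as a pipeline parameter so the CircleCI
    config can branch on what caused the run (e.g. a model update).
    The parameter must also be declared in .circleci/config.yml.
    """
    return {
        "branch": branch,
        "parameters": {"trigger-source": source},
    }

def trigger_pipeline(branch: str, source: str) -> None:
    """POST the trigger request, authenticated with a personal API token."""
    payload = build_trigger_payload(branch, source)
    req = urllib.request.Request(
        f"{CIRCLECI_API}/project/{PROJECT_SLUG}/pipeline",
        data=json.dumps(payload).encode(),
        headers={
            "Circle-Token": os.environ["CIRCLE_TOKEN"],
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:  # fires the actual request
        print(resp.status)

# trigger_pipeline("main", "model-updated")  # requires a real CIRCLE_TOKEN
```

In this pattern, an external system (a model registry, a data pipeline, a scheduler) calls the trigger, and the pipeline configuration decides what to run based on the `trigger-source` parameter.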
The LangSmith integration is the first of many evaluation platforms to be supported on CircleCI, enabling robust testing for non-deterministic outcomes. Testing AI-enabled applications is new territory for professional software teams, and these new capabilities from CircleCI will dramatically up-level developer confidence when building and validating LLM-powered software, according to the company.
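Testing non-deterministic output generally means asserting on properties of a response rather than exact strings. The sketch below is a generic illustration of that idea, not the LangSmith integration itself; the check names and thresholds are made up for the example.

```python
"""Sketch: property-style checks for non-deterministic LLM output.

Instead of comparing to one exact expected string, assert properties
that any acceptable answer should satisfy.
"""

def check_answer(text: str, required_terms: list[str], max_words: int = 120) -> list[str]:
    """Return a list of failed-property descriptions (empty means pass)."""
    failures = []
    lowered = text.lower()
    for term in required_terms:
        if term.lower() not in lowered:
            failures.append(f"missing required term: {term}")
    if len(text.split()) > max_words:
        failures.append(f"answer longer than {max_words} words")
    return failures

# Two differently worded answers can both pass the same checks.
a1 = "CircleCI pipelines can be triggered by webhooks from external tools."
a2 = "External tools can start CircleCI pipelines through inbound webhooks."
assert check_answer(a1, ["webhook", "pipeline"]) == []
assert check_answer(a2, ["webhook", "pipeline"]) == []
```

Evaluation platforms extend this idea with scored rubrics and model-graded checks, but the principle is the same: the test passes for a family of acceptable outputs rather than a single fixed one.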
CircleCI also introduced its new Orb for Amazon SageMaker to help teams using Amazon SageMaker ship their models to production. The Orb enables basic deployment to Amazon SageMaker, monitors deployments for problems, and quickly rolls back endpoints should something go awry. It also supports different deployment strategies, namely canary and blue/green-style deployments.
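To make the canary strategy concrete, the sketch below shows the traffic-shifting logic such a rollout follows. This is an illustration of the general technique, not the Orb's implementation: traffic moves to the new endpoint in steps, and a failed health check rolls everything back to the old one.

```python
"""Sketch: the traffic-shifting logic behind a canary deployment."""

def canary_rollout(steps: list[int], healthy) -> tuple[int, bool]:
    """Shift traffic to the new endpoint through `steps` (percentages).

    `healthy(pct)` is a health check run after each shift. Returns
    (final_new_endpoint_traffic, succeeded).
    """
    new_traffic = 0
    for pct in steps:
        new_traffic = pct
        if not healthy(pct):
            return 0, False  # roll back: all traffic to the old endpoint
    return new_traffic, True

# A healthy rollout reaches 100%; a failure at 50% rolls back to 0%.
assert canary_rollout([10, 50, 100], lambda p: True) == (100, True)
assert canary_rollout([10, 50, 100], lambda p: p < 50) == (0, False)
```

A blue/green deployment is the degenerate case of a single step: all traffic switches at once from the old (blue) environment to the new (green) one, with the old kept warm for instant rollback.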
For more information about this news, visit https://circleci.com.