Tecton Collaborates with Databricks to Quickly Deploy Machine Learning Applications to Production


Tecton, the enterprise feature store company, is partnering with Databricks, the Data and AI Company, to help organizations build and automate their machine learning (ML) feature pipelines from prototype to production.

Tecton is integrated with the Databricks Lakehouse Platform so data teams can use Tecton to build production-ready ML features on Databricks in minutes.

“We are thrilled to have Tecton available on the Databricks Lakehouse Platform,” said Adam Conway, SVP of products at Databricks. “Databricks customers now have the option to use Tecton to operationalize features for their ML projects and effectively drive business with production ML applications.”

Databricks and Tecton are collaborating to accelerate and automate the many steps involved in transforming raw data inputs into ML features and serving those features to fuel predictive applications at scale.

Built on an open lakehouse architecture, Databricks allows ML teams to prepare and process data, streamline cross-team collaboration, and standardize the full ML lifecycle from experimentation to production.

With Tecton, these same teams can now automate the full lifecycle of ML features and operationalize ML applications in minutes without having to leave the Databricks workspace, according to the vendors.

“Building on Databricks’ powerful and massively scalable foundation for data and AI, Tecton extends the underlying data infrastructure to support ML-specific requirements. This partnership with Databricks enables organizations to embed ML into live, customer-facing applications and business processes, quickly, reliably and at scale,” said Mike Del Balso, co-founder and CEO of Tecton.

Available on the Databricks Lakehouse Platform, Tecton acts as the central source of truth for ML features, and automatically orchestrates, manages, and maintains the data pipelines that generate features.

The integration allows data teams to define features as code using Python and SQL, and it enables ML teams to track and share those features through a version-control repository.

Tecton then automates and orchestrates production-grade ML data pipelines that materialize feature values in a centralized repository.
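
For illustration, the snippet below is a minimal sketch of what such a feature-as-code definition might look like. It assumes a Tecton-style Python SDK with a batch feature view decorator; the data source, entity, and project module names are hypothetical and are not taken from the announcement.

```python
# Illustrative sketch only: the imports from my_project and the exact decorator
# parameters are assumptions about a Tecton-style Python SDK, not confirmed API.
from datetime import datetime, timedelta

from tecton import batch_feature_view  # assumed decorator name

from my_project.data_sources import transactions_source  # hypothetical batch source
from my_project.entities import user                     # hypothetical entity keyed on user_id


@batch_feature_view(
    sources=[transactions_source],      # raw data registered in the lakehouse
    entities=[user],                    # join key(s) for the feature
    mode="spark_sql",                   # transformation expressed as SQL run on Spark
    online=True,                        # materialize to the online store for serving
    offline=True,                       # materialize to the offline store for training
    batch_schedule=timedelta(days=1),   # Tecton orchestrates this pipeline daily
    feature_start_time=datetime(2022, 1, 1),
)
def user_transaction_amount(transactions):
    # Each row becomes a feature value keyed by user_id at the given timestamp;
    # the definition lives in a Git repository alongside the rest of the project.
    return f"""
        SELECT user_id, amount AS transaction_amount, timestamp
        FROM {transactions}
    """
```

Because the schedule and materialization targets are part of the definition, the same code artifact drives both the orchestration of the pipeline and the stores it writes to.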

From there, users can instantly explore, share, and serve features for model training and for batch and real-time predictions across use cases, without worrying about typical roadblocks such as training-serving skew or point-in-time correctness errors.
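
As a rough sketch of that consumption path, the example below retrieves the same features offline for training and online for a real-time prediction. The workspace and feature service names are hypothetical, and the method names reflect an assumed Tecton-style SDK rather than the announcement itself.

```python
# Illustrative only: get_workspace, get_feature_service, get_historical_features,
# and get_online_features are assumptions about a Tecton-style SDK.
import pandas as pd
import tecton

ws = tecton.get_workspace("prod")                       # hypothetical workspace
fs = ws.get_feature_service("fraud_detection_service")  # hypothetical feature service

# Offline: build a point-in-time-correct training set from a "spine" of
# (join key, event timestamp) rows, which avoids training-serving skew.
spine = pd.DataFrame({
    "user_id": ["u_123", "u_456"],
    "timestamp": pd.to_datetime(["2022-06-01", "2022-06-02"]),
})
training_df = fs.get_historical_features(spine).to_pandas()

# Online: fetch the same features at low latency for a real-time prediction.
online_features = fs.get_online_features(join_keys={"user_id": "u_123"}).to_dict()
```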

Acting as the interface between the Databricks Lakehouse Platform and customers' ML models, Tecton allows them to process features using real-time and streaming data from a wide range of data sources.

By automatically building the complex feature engineering pipelines needed to process streaming and real-time data, Tecton eliminates the need for extensive engineering support and enables users to drastically improve model performance, accuracy, and outcomes, according to the vendors.

For more information about this collaboration, visit www.tecton.ai.

