<< back Page 3 of 6 next >>

Big Data 50: Companies Driving Innovation in 2021

Built on a modern lakehouse architecture in the cloud, Databricks combines the best of data warehouses and data lakes to offer an open and unified platform for data and AI that is relied upon by more than 5,000 organizations worldwide.

Offering an enterprise DataOps Platform that enables organizations to implement and manage an end-to-end DataOps program using tools they already own, DataKitchen helps to simplify complex toolchains, environments, and teams so that data analytics organizations can innovate, collaborate, and deliver on-demand insight.

Helping organizations decrease total cost of ownership and accelerate their innovation speed, DataStax delivers DataStax Astra, an open, multi-cloud serverless database that provides Cassandra-as-a-service with pay-as-you-go data, simplified operations, and the freedom of multi-cloud and open source.

Big Data Trailblazer by Chet Kapoor, Chairman & CEO

DATASTAX HAS A MISSION to deliver products that developers love and change the trajectory of enterprises.

The world runs on Apache Cassandra, and DataStax was created to make the world’s most scalable database easier to run and manage, to deploy modern cloud applications cost-effectively, and to free enterprises from cloud vendor lock-in. We do that by shattering the traditional methods of managing real-time data and solving pain points for developers, while delivering always-on business continuity and bringing the power of Cassandra to every developer and enterprise for mission-critical workloads. With DataStax, any developer or enterprise can now deploy data at massive scale, with 100% uptime, at lower cost.

Through a unique open data stack for the future, DataStax empowers any enterprise to tap the power of data without limits, providing a solution that is:

  • Kubernetes-based for cloud-native agility
  • Developer-ready with APIs to reduce time to market for new apps
  • Cloud-delivered to simplify operations and reduce TCO

Uniquely positioned to deliver the modern database of the future, DataStax harnesses its power to solve real business problems by making the distribution of data easy to scale, accelerating the data-driven enterprise and streamlining developer operations.

Today, nearly 500 of the world’s most demanding enterprises and half of the Fortune 100 rely on DataStax to power modern data apps, including Netflix, The Home Depot, T-Mobile, Intuit and so many more.


A data virtualization leader providing agile, high-performance data integration, data abstraction, and real-time data services across a broad range of enterprise, cloud, big data, and unstructured data sources, Denodo helps customers achieve faster access to unified business information.

Big Data Trailblazer by Ravi Shankar, Senior VP & Chief Marketing Officer

The term “big data” has been around for many years, and in many respects it has come to mean almost everything and nothing at the same time. This is not the fault of the data, but of how people use and interpret it.

Why is this the case? There are many factors, but essentially, if you can’t see the wood for the trees, it is hard to gain value from the data. Big data systems fail to become the single repository for all enterprise data, and organizations stumble over the challenges of moving and storing data of different types, especially in multi-cloud and hybrid-cloud enterprises. It is quite common for no one person in an organization to have a single view of the data, and organizations spend more time collecting data than analyzing it. ETL processes are often the tool of choice when organizations look to integrate siloed data, but because they are scripted to move data in batches, they fail to deliver real-time insights. They also fail to accommodate new sources without extensive testing and coding, and they struggle even more with modern data formats such as streaming IoT or unstructured data, which are often the real key to success in big data projects.

Data virtualization, on the other hand, is a data integration technology that integrates data in real time, without the need for replication. It allows organizations to establish flexible, modern logical data architectures, such as a logical data fabric, so they can draw data seamlessly from across the silos of a big data implementation.
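The core idea can be sketched in a few lines: a virtual view answers queries by reaching into the live sources at request time, rather than copying their rows into a central store on a batch schedule. This is a minimal conceptual sketch; the silo names, schemas, and the `customer_360` view are hypothetical illustrations, not Denodo APIs.

```python
# Two independent "silos" standing in for, e.g., a CRM system and an
# orders database. In a real deployment these would be live connections,
# not in-memory dictionaries.
crm_silo = {101: {"name": "Acme Corp", "region": "EMEA"}}
orders_silo = [
    {"customer_id": 101, "amount": 250.0},
    {"customer_id": 101, "amount": 99.5},
]

def customer_360(customer_id):
    """Virtual view: joins both silos on demand, with no replication."""
    profile = crm_silo[customer_id]           # fetched live from source 1
    orders = [o for o in orders_silo          # fetched live from source 2
              if o["customer_id"] == customer_id]
    return {**profile, "order_total": sum(o["amount"] for o in orders)}

print(customer_360(101))
```

Because every call re-reads the sources, an update to either silo is visible in the view immediately, with no ETL batch in between; that is the contrast with batch-scripted ETL drawn above.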

The award-winning Denodo Platform offers the most advanced data virtualization capabilities available for establishing a logical data fabric to maximize big data investments. Its built-in data catalog provides seamless access to data via a searchable, contextualized interface, and in-memory parallel processing accelerates data access to unparalleled speeds.

Denodo Technologies

Reimagining the data lake service, Dremio eliminates the need to copy and move data to proprietary data warehouses or create cubes, aggregation tables, and BI extracts, enabling flexibility and control for data architects and self-service for data consumers.

An early innovator in AI and leading supplier of graph database technology, Franz provides AllegroGraph, a graph-based platform that unifies all data and siloed knowledge into an entity-event knowledge graph solution that can support big data analytics.

Big Data Trailblazer by Jans Aasman, CEO


Industry analysts recognize the power of Knowledge Graphs in delivering a modern big data architecture that provides integrated, trusted, and real-time views of enterprise data. The accelerating adoption in the enterprise of this Knowledge Graph approach, which unifies business data with knowledge bases, industry terms, and domain knowledge, is clearly the future of AI and advanced analytics.

Franz’s AllegroGraph platform further extends this modern Knowledge Graph approach with a novel Entity-Event Model, natively integrated with domain ontologies and metadata, and dynamic ways of setting the analytics focus on all entities in the system (patient, person, devices, transactions, events, operations, etc.) as prime objects that can be the focus of an analytic (AI, ML, DL) process.

The Entity-Event Data Model utilized by AllegroGraph with FedShard puts core “entities” such as customers, patients, students, or people of interest at the center and then collects several layers of knowledge related to the entity as “events.” Events represent activities that transpire in a temporal context. The rich functional and contextual integration of multimodal predictive modeling and artificial intelligence is what distinguishes AllegroGraph as a modern, scalable, enterprise knowledge platform. AllegroGraph is the first big temporal Knowledge Graph technology that encapsulates a novel entity-event model to deliver a modern data architecture to the Enterprise.
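The entity-event shape described above can be illustrated with a short sketch: a core entity (here a patient) accumulates time-stamped events, and an analytic process selects a temporal window around that entity. This is a conceptual sketch only; the class names and methods are hypothetical and are not the AllegroGraph or FedShard API.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Event:
    """An activity that transpired in a temporal context."""
    kind: str
    timestamp: datetime
    payload: dict

@dataclass
class Entity:
    """A core entity (patient, customer, device) at the center of the model."""
    entity_id: str
    events: list = field(default_factory=list)

    def add_event(self, kind, timestamp, **payload):
        self.events.append(Event(kind, timestamp, payload))

    def window(self, start, end):
        """Events for this entity within a given temporal context."""
        return [e for e in self.events if start <= e.timestamp <= end]

patient = Entity("patient-42")
patient.add_event("admission", datetime(2021, 3, 1), ward="cardiology")
patient.add_event("lab_result", datetime(2021, 3, 2), test="troponin")
patient.add_event("discharge", datetime(2021, 6, 9))

# Set the analytic focus on this entity's March events only.
march = patient.window(datetime(2021, 3, 1), datetime(2021, 3, 31))
print([e.kind for e in march])
```

The point of the structure is that the entity, not any one table or record, is the prime object: layers of temporally ordered events hang off it, so an analytic (AI, ML, DL) process can focus on any entity and any time window.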

Financial institutions, healthcare providers, contact centers, manufacturing firms, government agencies, and other data-driven enterprises that use AllegroGraph gain a holistic, future-proofed Knowledge Graph architecture for big data predictive analytics and machine learning across complex knowledge bases to discover deep connections, uncover new patterns, and attain explainable results.

CONTACT FRANZ INC. TODAY to build your Enterprise Scale Knowledge Graph solution.

Franz Inc.

An in-memory technology vendor that is driving enterprise digital transformation, GigaSpaces is relied upon by hundreds of tier-1 and Fortune-listed organizations and OEMs across financial services, retail, transportation, telecom, healthcare, and more.

Google Cloud
With distributed cloud solutions that provide consistency between public and private clouds, Google Cloud has a commitment to open source, multi-cloud, and hybrid cloud—allowing customers to use their data and run their apps in any environment.

GridGain Systems is a provider of enterprise-grade in-memory computing solutions powered by Apache Ignite, an open source in-memory computing platform that delivers speed, scalability, and real-time data access for both legacy and greenfield applications.

HPE (Hewlett Packard Enterprise)
A global edge-to-cloud company, HPE helps organizations accelerate outcomes by unlocking value from all of their data, so they can develop new business models, engage in new ways, and increase operational performance.

