Trend-Setting Products in Data and Information Management for 2017


Dell EMC
www.emc.com
VMAX 250F—the entry model in the VMAX All Flash family is designed to deliver flash performance, low latency, and high availability in a compact footprint for the modern data center

Delphix Corp.
www.delphix.com
Delphix Platform—helps organizations release applications faster by delivering secure, virtualized data across the application lifecycle for development, testing, and reporting environments, on premises or in the cloud

Denodo Technologies
www.denodo.com
Denodo Platform—provides data virtualization to enable organizations to query, browse, search, and manage corporate data without the need to move data, replicate data, or code
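
To make the idea concrete, a virtualized view can be queried through standard interfaces such as JDBC just like a physical table. The sketch below is a minimal Java illustration, assuming a hypothetical virtual database, view name, and connection URL rather than documented Denodo values; the point is that the client issues ordinary SQL while the platform federates the underlying sources without copying the data.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class VirtualViewQuery {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint of a data virtualization server exposing JDBC;
        // the URL format, database name, and credentials are placeholders.
        String url = "jdbc:vdb://denodo-host:9999/sales_vdb";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement();
             // "customer_360" is an assumed virtual view that federates several
             // underlying sources; the client never moves or replicates the data.
             ResultSet rs = stmt.executeQuery(
                 "SELECT customer_id, total_orders FROM customer_360")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + " -> " + rs.getInt(2));
            }
        }
    }
}

Swapping in a different back-end source is then a change in the virtual layer, not in client code like this.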

Paul Moxon,
Senior Director, Product Management & Solution Architecture

Denodo Technologies

The evolution of data processing technologies has left many data management professionals uncertain about adopting new architectures enabled by these technologies. Should they move to a data lake or a logical data warehouse architecture? Read on. 

Domo, Inc.
www.domo.com
Domo’s Business Cloud—offers a customizable platform that can be tailored to meet the unique needs of an individual business with a selection of more than 1,000 apps and the ability to build custom apps and use APIs to connect to proprietary data sources and systems in real time

Empolis Information Management GmbH
www.empolis.com
Empolis Smart Cloud—a solution for the comprehensive creation, management, analysis, intelligent processing, and provisioning of all information, with the database, connections to third-party systems, sophisticated search processes, or specialized application logic outsourced to the Empolis Smart Cloud and run from Empolis’ data center

Dr. Stefan Wess,
CEO

Empolis 
SMART CLOUD

Maximizing Big Data and Artificial Intelligence Technologies from Empolis

Big Data and artificial intelligence technologies are more prevalent than ever and constantly appear in our daily lives. Companies are putting these topics at the top of their agendas. International companies with established structures and processes, as well as heterogeneous data sources and systems, find it increasingly challenging to utilize the growing amount of data. More and more companies are transferring their business processes into the cloud for uniform and “smart” data to manage their Big Data issues. Read on.

 

erwin, Inc.
http://erwin.com
erwin CloudCore—bundles erwin Data Modeler and the recently acquired Corso Agile EA tools to provide a cloud-based data modeling and enterprise architecture solution

Martin Owen,
VP, Product Management

erwin
CloudCore

We are proud to introduce our latest solution, erwin CloudCore, the first integrated cloud-based data modeling and enterprise architecture tool. CloudCore brings together the powerful capabilities of erwin’s market-leading Data Modeling tool with Corso’s innovative Enterprise Architecture tool into a single web-based platform. Read on.

FairCom Corp.
www.faircom.com
FairCom c-treeACE—a multimodel NoSQL and SQL solution for enterprise data that empowers developers to work directly with a NoSQL key-value store and a SQL layer simultaneously
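
The multimodel idea, one underlying store reachable both through a key-value API and through SQL, can be pictured roughly as follows. This is a conceptual sketch only, not FairCom’s actual API: the KeyValueStore interface and its method names are hypothetical, and the relational path uses plain JDBC purely to illustrate that both access styles target the same records.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Hypothetical NoSQL-style interface standing in for a vendor key-value API.
interface KeyValueStore {
    void put(String key, byte[] value);
    byte[] get(String key);
}

public class MultiModelSketch {
    // Fast path: read a record directly by key, with no SQL parsing overhead.
    static byte[] readByKey(KeyValueStore kv, String orderId) {
        return kv.get("orders/" + orderId);
    }

    // Relational path: the same table queried through standard JDBC.
    static int countOpenOrders(Connection conn) throws Exception {
        try (PreparedStatement ps =
                 conn.prepareStatement("SELECT COUNT(*) FROM orders WHERE status = ?")) {
            ps.setString(1, "OPEN");
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                return rs.getInt(1);
            }
        }
    }
}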

GridGain Systems, Inc.
www.gridgain.com
GridGain In-Memory Data Fabric—built on Apache Ignite, it supports both transactional and analytical applications, provides ACID transactions, and is ANSI SQL-99 compliant

Abe Kleinfeld

GridGain 
IN-MEMORY COMPUTING

Web-scale applications and the Internet of Things (IoT) share two characteristics: high workloads and massive amounts of data. By moving data from disk to RAM, in-memory computing offers performance that is 1,000x that of disk-based systems. In-memory computing solutions can be massively scaled out by adding commodity nodes to the cluster, supporting petabyte-scale in-memory datasets. Using in-memory computing for web-scale applications and the IoT allows users to transition to a Fast Data world. In this emerging world, great end-user experiences and real-time big data insights can be driven by the same operational dataset with this new generation of hybrid transactional/analytical processing (HTAP) solutions. Read on.
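
Because the fabric is built on Apache Ignite, the flavor of this HTAP pattern can be sketched with the open source Ignite Java API: key-value operations serve the transactional path, while ANSI SQL runs over the same in-memory data. The cache name, value type, and fields below are illustrative assumptions; a production cluster would scale the dataset simply by starting more nodes.

import java.util.List;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.SqlFieldsQuery;
import org.apache.ignite.cache.query.annotations.QuerySqlField;
import org.apache.ignite.configuration.CacheConfiguration;

public class InMemoryHtapSketch {
    // Illustrative value type; @QuerySqlField makes fields visible to SQL.
    static class Trade {
        @QuerySqlField String symbol;
        @QuerySqlField double amount;
        Trade(String symbol, double amount) { this.symbol = symbol; this.amount = amount; }
    }

    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            CacheConfiguration<Long, Trade> cfg = new CacheConfiguration<>("trades");
            cfg.setIndexedTypes(Long.class, Trade.class);
            IgniteCache<Long, Trade> trades = ignite.getOrCreateCache(cfg);

            // Transactional/operational path: plain key-value puts, held in RAM.
            trades.put(1L, new Trade("ACME", 150.0));
            trades.put(2L, new Trade("ACME", 75.5));

            // Analytical path: ANSI SQL over the same in-memory dataset (HTAP).
            List<List<?>> rows = trades.query(new SqlFieldsQuery(
                "SELECT symbol, SUM(amount) FROM Trade GROUP BY symbol")).getAll();
            rows.forEach(r -> System.out.println(r.get(0) + " total = " + r.get(1)));
        }
    }
}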

Hitachi Data Systems
www.hds.com
Compute as a Cloud Service—a managed private cloud solution based on the enterprise-grade Hitachi Unified Compute Platform, it brings the resources and financial flexibility of a public cloud to an organization’s on-premises or off-premises data center

Hortonworks, Inc.
http://hortonworks.com
Hortonworks Data Platform—enterprise-ready open source Apache Hadoop distribution based on a centralized architecture (YARN) to address the requirements of data at rest, support real-time customer applications, and deliver analytics that enable rapid decision making
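
As a small, hedged illustration of working against such a distribution, the standard Hadoop client API can write and read data at rest in HDFS, which YARN-managed engines then process; the namenode address and path below are assumptions for the example and would normally come from the cluster’s configuration files.

import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsQuickstart {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed cluster endpoint; in practice this comes from core-site.xml.
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

        try (FileSystem fs = FileSystem.get(conf)) {
            Path path = new Path("/data/landing/events.txt");
            // Write a small file into the distributed file system...
            try (FSDataOutputStream out = fs.create(path, true)) {
                out.write("event-1\n".getBytes(StandardCharsets.UTF_8));
            }
            // ...then confirm it landed; YARN-based engines (Hive, Spark, etc.)
            // would read the same data at rest for batch or real-time analytics.
            System.out.println("exists: " + fs.exists(path));
        }
    }
}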

Hewlett Packard Enterprise (HPE)
www.hpe.com
HPE Helion—provides products, solutions, services, and expertise to help organizations create a flexible, open, and secure hybrid infrastructure with a mix of public cloud, private cloud, and traditional IT

HVR Software
www.hvr-software.com
HVR High Volume Replicator—delivers large volumes of data to an organization’s data store of choice using real-time data capture between data sources, including SQL databases, Hadoop, data warehouse and business intelligence data stores, and most commonly used file systems

Anthony Brooks-Williams,
CEO

HVR Software

In today’s world of ride sharing, mobile traffic monitoring, instant social media updates and internet-connected appliances, it is not surprising that expectations for current business information are rising. Acceptability of batch-oriented daily data warehouse reports is rapidly diminishing. Businesses and government entities are increasingly relying on real-time analytics to optimize everything, including customer service, marketing, logistics, inventory, cash flow, human capital, public safety and more. Read on.

