Big Data Quarterly Articles



1010data, Inc. is making enhancements to its Consumer Insights Platform (CIP) with its latest release. CIP 3.0 supports a wider range of business users by facilitating faster decision-making with a more user-friendly interface, additional reports, and enhanced interactivity built right into key features, according to the company.

Posted August 05, 2016

With the variety of big data comes complexity, and enterprises face the challenge of extracting and combining that data to form meaningful insights.

Posted July 21, 2016

Monte Zweben, CEO and co-founder of Splice Machine, which was founded in 2012, talks with Big Data Quarterly about why the company has rolled out an open source Community Edition - and why it is doing so now.

Posted July 18, 2016

GridGain Systems, provider of enterprise-grade In-Memory Data Fabric solutions based on Apache Ignite, is releasing a new edition of its signature platform.

Posted July 05, 2016

RedPoint Global, a provider of data management and customer engagement software, has announced integration with Microsoft Azure HDInsight to support enhanced data management capabilities via Hadoop deployments on Microsoft Azure. RedPoint is a member of the Microsoft Partner Network, and the new integration evolved from its participation in the Microsoft Enterprise Cloud Alliance Program.

Posted June 30, 2016

The next major release of MarkLogic's enterprise NoSQL database platform is expected to be generally available by the end of this year. Gary Bloom, president and CEO of the company, recently reflected on the changing database market and how new features in MarkLogic 9 address evolving requirements for data management in a big data world. "For the first time in years, the industry is going through a generational shift of database technology - and it is a pretty material shift," observed Bloom.

Posted June 30, 2016

Hortonworks, Inc. unveiled new innovations at Hadoop Summit that will improve the Hortonworks Data Platform (HDP), allowing enterprises to accumulate, analyze, and act on data.

Posted June 29, 2016

Qubole unveiled a new feature for its Qubole Data Service (QDS) called auto-caching, a next-generation disk cache for cloud storage systems that works across different data engines.

Posted June 28, 2016

Hortonworks, Inc. is partnering with AtScale to resell AtScale's technology, giving users the ability to query data from any business intelligence tool without any data movement.

Posted June 28, 2016

MapR Technologies is introducing a new initiative that will help support Hadoop deployments and increase user and administrator productivity.

Posted June 28, 2016

Pepperdata is unveiling a new tool that assesses Hadoop clusters and provides visibility into current cluster conditions.

Posted June 27, 2016

At the 2016 Hadoop Summit in San Jose, Teradata announced the certification of multiple BI and visualization solutions on the Teradata Distribution of Presto.

Posted June 27, 2016

Talend has released the newest version of its Data Fabric, an integration platform for both developers and business users, whether their applications are on-premises or in the cloud. The updated platform now features enterprise-grade support for data preparation, a solution that reduces the time required to collect and analyze data.

Posted June 27, 2016

Big data with its increasing volume, velocity, and variety is causing organizations to see their data as a valuable commodity. But central to that value, says Amit Walia, chief product officer of Informatica, is the ability to bring all data together to get a single view. Here, Walia discusses the three key challenges organizations face and what he sees as the biggest disruptor on the horizon.

Posted June 23, 2016

Trifacta, a provider of data wrangling software, is deepening its technical integration with the Hortonworks Data Platform (HDP) and announcing the industry's first certification for Apache Atlas, a data governance and metadata framework for Hadoop.

Posted June 23, 2016

Hortonworks, Inc. is enhancing its Global Professional Services (GPS) program to support and enable Hortonworks Connected Data Platforms customers.

Posted June 16, 2016

When working with data governance practitioners, I often hear comments that indicate pockets of data governance excellence (the proverbial half-full glass) or silos of data governance (half-empty) as they work toward the common goal of enterprise data governance. This is often accompanied by an observation that "if we could just get everyone to follow the rules (the same rules), then we could truly and successfully govern at the enterprise level."

Posted June 09, 2016

The popularity of open source analytical software has sparked debate about the added value of commercial tools. Commercial and open source software each have merits that should be thoroughly evaluated before any analytical software investment decision is made.

Posted June 08, 2016

While the high-speed world of futuristic IoT applications sounds exciting, it is really the mass of connected "small data" sensors that is truly going to deliver on the social and economic promise of the IoT revolution. The problem is how to effectively connect this "small data" IoT world.

Posted June 08, 2016

The General Data Protection Regulation (GDPR) is a legal construct that emanates from the EU and has already resulted in far-ranging implications for all producers, providers, and consumers of services delivered or maintained in the cloud. Though it has yet to go into effect, this system of regulations is sure to impact every provider, producer, and consumer of cloud-based infrastructure, products, services, and, most importantly, data in the years ahead.

Posted June 08, 2016

Cloud computing is gaining ground in the enterprise since it allows businesses to concentrate on their core competencies rather than on IT. As cloud becomes more popular, organizations are focusing on hybrid strategies that combine on-premises and cloud capabilities, industry research shows. However, data integration and security remain concerns.

Posted June 07, 2016

Using data visualization to support visual data storytelling is a craft, and one that takes practice, expertise, and a good bit of drafting and rewriting. Strong visual narratives that make data easier to understand, according to The Economist, "meld the skills of computer science, statistics, artistic design, and storytelling."

Posted June 07, 2016

While the focus of the IoT data discussion has understandably been on the strength of near real-time analytics to power what will soon become automated decisions based on information such as sensor data, there is also the question of what happens with data after the real-time window has closed. The time value of data is an important consideration for IoT-enabled businesses. Applying a cost-versus-benefit analysis is the first step organizations must take when deciding whether to store data, which data to store, and how securely it must be kept.
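As a rough illustration of that first step, the toy Python sketch below compares the declining analytic value of aging sensor data against a flat monthly storage cost; every figure in it (storage price, initial value, decay rate) is an assumption chosen for illustration, not a benchmark or a vendor number.

```python
# Toy sketch of a cost-versus-benefit check for retaining sensor data after
# its real-time window closes. All figures below are illustrative assumptions.
STORAGE_COST_PER_GB_MONTH = 0.023   # assumed object-storage price per GB-month
INITIAL_VALUE_PER_GB = 0.50         # assumed analytic value of fresh data
MONTHLY_VALUE_DECAY = 0.30          # assumed share of value lost each month

def months_worth_keeping(initial_value, decay, monthly_cost):
    """Return how many months the data's expected value still exceeds its storage cost."""
    value, months = initial_value, 0
    while value > monthly_cost:
        months += 1
        value *= (1 - decay)
    return months

print(months_worth_keeping(INITIAL_VALUE_PER_GB,
                           MONTHLY_VALUE_DECAY,
                           STORAGE_COST_PER_GB_MONTH))
```

Under these made-up numbers the data stops paying for its own storage after roughly nine months, at which point archiving or deletion becomes the rational choice.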

Posted June 06, 2016

In the wide world of Hadoop today, there are seven technology areas that have garnered a high level of interest. These key areas show that Hadoop is not just a big data tool but a strong ecosystem in which new projects are assured of exposure and interoperability.

Posted June 03, 2016

In a new book, titled Next Generation Databases: NoSQL, NewSQL and Big Data, Guy Harrison explores and contrasts both new and established database technologies. Harrison, who leads the team at Dell that develops the Toad, Spotlight, and SharePlex product families, wrote the book to address the gap he sees in the conversation about the latest generation of databases.

Posted June 03, 2016

The honeymoon between business and big data is over. The end was conclusively noted when Gartner placed big data in its Trough of Disillusionment. We've reached a point where companies must figure out how to use big data analytics in profitable ways.

Posted June 03, 2016

Data Summit 2016, held in May in NYC, brought together IT managers, data architects, application developers, data analysts, project managers, and business managers to hear industry-leading professionals deliver educational presentations on industry trends and technologies, network with their peers, and participate in hands-on workshops. Here are 10 key takeaways from Data Summit 2016:

Posted May 23, 2016

While further discussions and negotiations about the proposed replacement for Safe Harbor continue, the future is clear: U.S. companies have no choice but to shore up their data privacy and security measures in line with Europe's progressive stance. While this is a necessary evolution, history has proved time and time again that companies that take proactive steps today to address future needs will be better positioned than those that attempt to meet compliance-related requirements retroactively.

Posted May 20, 2016

ThoughtSpot, a provider of search-driven analytics platforms, is receiving a $50 million investment, led by General Catalyst Partners and Geodesic Capital, to propel the company's growth.

Posted May 20, 2016

Syncsort is adding new capabilities to its platform, including native integration with Apache Spark and Apache Kafka. DMX-h v9 allows organizations to access and integrate enterprise-wide data with streams from real-time sources.
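For readers unfamiliar with what such an integration looks like in practice, here is a minimal, generic sketch of reading a Kafka topic as a stream in Apache Spark (Structured Streaming); it is not DMX-h code, and the broker address and topic name are placeholders.

```python
# Generic sketch of consuming a Kafka topic as a Spark streaming DataFrame.
# Not Syncsort DMX-h code; broker and topic names are placeholders, and the
# spark-sql-kafka connector package must be on the Spark classpath.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

stream = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "sensor-events")
         .load()
)

# Kafka delivers keys and values as bytes; cast the payload to a string so it
# can be parsed and joined with batch data already loaded into Spark.
messages = stream.selectExpr("CAST(value AS STRING) AS payload")

query = messages.writeStream.format("console").start()
query.awaitTermination()
```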

Posted May 18, 2016

While the big data industry has lacked a consistent and well-understood definition of the data lake since its entry into the hype cycle, clear use cases and best practices are now emerging.

Posted May 18, 2016

Qlik is unveiling the next version of its Qlik Sense Enterprise platform, combining enterprise readiness and governance with visualization and data preparation capabilities, according to the company.

Posted May 03, 2016

Led by Pivotal, GE Ventures, and GTD Capital, SnappyData has secured $3.65 million in Series A funding that will help it grow and further its technologies.

Posted April 29, 2016

Enabled by a partnership with Pentaho, a Hitachi Group Company, and integration with Pentaho's Big Data Integration and Analytics platform, Melissa Data's data quality tools and services can now be scaled across the Hadoop cluster to cleanse and verify data center records.

Posted April 27, 2016

Cloudera, provider of a data management and analytics platform built on Apache Hadoop and open source technologies, has announced the general availability of Cloudera Enterprise 5.7. According to the vendor, the new release offers an average 3x improvement for data processing with added support for Hive-on-Spark, and an average 2x improvement for business intelligence analytics with updates to Apache Impala (incubating).

Posted April 26, 2016

The core reason for implementing in-memory technology is to improve performance. To help accelerate adoption of in-memory technologies and provide a universal standard for columnar in-memory processing and interchange, the lead developers of 13 major open source big data projects have joined forces to create Apache Arrow, a new top-level project within the Apache Software Foundation (ASF).
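To make the idea concrete, here is a minimal sketch using the pyarrow library (not named in the announcement) to build an Arrow columnar table in memory; the column names and values are invented for illustration.

```python
# Minimal illustration of Arrow's columnar in-memory format via pyarrow.
# Column names and values are invented for this example; pandas is required
# for the final conversion step.
import pyarrow as pa

# Build an immutable columnar table entirely in memory.
table = pa.Table.from_pydict({
    "user_id": [101, 102, 103],
    "clicks": [12, 7, 31],
})

print(table.schema)    # column names and types
print(table.num_rows)  # 3

# Arrow-aware tools can work against the same columnar data; here the table
# is handed to pandas for further analysis.
df = table.to_pandas()
print(df)
```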

Posted April 24, 2016

Percona, a provider of MySQL and MongoDB solutions and services, is releasing an updated version of Percona Server for MongoDB.

Posted April 21, 2016

Teradata, the big data analytics and marketing applications company, is making key investments in the Internet of Things (IoT) and the Analytics of Things (AoT), along with updating its signature platforms.

Posted April 18, 2016

First created as part of a research project at UC Berkeley's AMPLab, Spark is an open source project in the big data space, built for sophisticated analytics, speed, and ease of use. It unifies critical data analytics capabilities such as SQL, advanced analytics, and streaming in a single framework. Databricks was founded by the team that created Apache Spark and continues to lead its development and the training around it.
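As a small illustration of that unified model, the PySpark sketch below runs the same aggregation once through Spark SQL and once through the DataFrame API; the dataset and column names are invented for the example.

```python
# Minimal PySpark sketch of Spark's unified model: the same in-memory dataset
# is queried with SQL and with the DataFrame API. Data is illustrative only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("unified-example").getOrCreate()

events = spark.createDataFrame(
    [("mobile", 3), ("web", 5), ("mobile", 2)],
    ["channel", "clicks"],
)
events.createOrReplaceTempView("events")

# SQL over the registered view...
spark.sql(
    "SELECT channel, SUM(clicks) AS total FROM events GROUP BY channel"
).show()

# ...and the equivalent DataFrame transformation.
events.groupBy("channel").sum("clicks").show()

spark.stop()
```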

Posted April 14, 2016

Thanks to the digital business transformation, the world around us is changing—and quickly—to a very consumer- and data-centric economy, where companies must transform to remain competitive and survive. The upshot is that for many companies today, it is a full-on Darwinian experience of survival of the fittest.

Posted April 08, 2016

SnapLogic is releasing its hybrid execution framework Snaplex on the Microsoft Azure Marketplace as Azureplex, giving users the ability to gain business insights faster with self-service data integration from a plethora of sources.

Posted April 07, 2016

The emergence of big data, characterized in terms of its four V's—volume, variety, velocity, and veracity—has created both opportunities and challenges for credit scoring.

Posted April 04, 2016

As data visualization increasingly becomes top-of-mind for data-driven organizations, it's time to introduce the concept of data visualization competency. There is a need today to provide a framework to fingerprint data visualizations as unique digital assets in the business for maximum impact and consistent execution against strategic business practices and goals.

Posted April 01, 2016

Hershey's LLC recently deployed the Infosys Information Platform on AWS to analyze retail store data.

Posted March 31, 2016

Over the past 2 years, there have been big announcements from all of the major car manufacturers about their connected car initiatives, lots of M&A activity in the technology industry as they race to supply the revolution, and major global alliances of telecom providers being formed to provide the underlying connectivity and infrastructure. But, most of all, we are actually starting to see some of the promised transformational benefits of the Internet of Things becoming a reality.

Posted March 31, 2016

The pervasive corporate mindset to transition all levels of infrastructure to some cloud, somewhere, is accelerating the growth of the cloud industry with a rapidity so far unseen in the history of computing. This phenomenon has resulted in weighty pressure on CIOs to develop and deploy an effective and comprehensive cloud strategy or risk their organization falling behind this undeniable trend. The internet changed the information technology game, but now the cloud constitutes an entirely different league.

Posted March 31, 2016

There's a need to enable better decision making today with faster access to data. But many organizations are still weighed down by integration and management processes that are not keeping up with the increasing volume, variety, and velocity of data. A greater emphasis on cloud and self-service tools may provide an approach to remedy the situation.

Posted March 30, 2016

Trifacta, a provider of data wrangling technology, is introducing the Photon Compute Framework, providing users with an interactive platform for large in-memory datasets.

Posted March 29, 2016

NoSQL databases were born out of the need to scale transactional persistence stores more efficiently. In a world where the relational database management system (RDBMS) was king, this was easier said than done.

Posted March 29, 2016

The digital economy promises to redefine nearly every aspect of a company's operations—from raw material procurement through post-sale services. Yet, some of the most dramatic changes will be seen in how companies evolve their product portfolios and leverage digital capabilities.

Posted March 25, 2016
