Big Data Quarterly Articles



Big Data 50 - Companies Driving Innovation in 2017

Posted September 07, 2017

InfluxData provides an open source platform built for metrics, events, and other time-based data. Recently, Evan Kaplan, CEO of InfluxData, reflected on the future of databases and why time series databases represent the next wave for data from humans, sensors, and machines.

Posted August 30, 2017

8 Rules of the Road for Fast Data Management and Analytics

Posted August 16, 2017

Barracuda Networks, a provider of cloud-enabled security and data protection solutions, has added the ability to replicate data from either an on-premises physical or virtual backup appliance to AWS. The new feature provides customers, resellers, and MSPs with greater flexibility and choice to protect their data from data loss and potential disasters, including security threats like ransomware. This adds an additional option for customers, in addition to the ability to replicate to the Barracuda Cloud.

Posted August 15, 2017

How to Solve Big Data Integration Challenges

Posted August 08, 2017

Dremio has announced its launch in the data analytics market with the availability of the Dremio Self-Service Data Platform. According to Dremio, its platform allows users to be independent and self-directed in their use of data, while accessing data from a variety of sources at scale.

Posted July 19, 2017

Pricchaa has released a free solution for detecting, encrypting, and monitoring sensitive data housed in the Amazon Web Services (AWS) Cloud.

Posted July 13, 2017

Tips and Tricks for Migrating to NoSQL

Posted June 15, 2017

The Growing Power of the Internet of Things

Posted June 15, 2017

The Apache Arrow project is a standard for representing data for in-memory processing. Hardware evolves rapidly. Because Apache Arrow was designed to benefit many different types of software in a wide range of hardware environments, the project team focused on making the work "future-proof," which meant anticipating changes to hardware over the next decade.

Posted June 14, 2017

Looker, provider of a cloud data platform, has announced Instant Insight, a new feature that allows users to analyze their data without waiting for help from an analyst.

Posted June 14, 2017

Attunity Ltd., a provider of data integration and big data management software solutions, is launching a new solution, Attunity Compose for Hive, which automates the process of creation and continuous loading of operational and historical data stores in a data lake.

Posted June 13, 2017

Addressing the rise of hybrid deployments, Hortonworks has introduced a new software support subscription to provide seamless support to organizations as they transition from on-premises to the cloud. Separately, Hortonworks also announced the general availability of Hortonworks DataFlow (HDF) 3.0, a new release of its open source data-in-motion platform, which enables customers to collect, curate, analyze, and act on all data in real time, across the data center and cloud.

Posted June 12, 2017

How to Leverage the Power of Data Lakes

Posted June 09, 2017

The demand for speed and agility is among the key drivers of the growing DevOps movement, which seeks to better align software development and IT operations. Yet, challenges still exist.

Posted June 07, 2017

Data Governance and Security Tips from Data Summit 2017

Posted May 24, 2017

The Enterprise Impact of Streaming Data and IoT

Posted May 24, 2017

9 Key Takeaways About Cloud and Analytics from Data Summit 2017

Posted May 24, 2017

Tapping into the Best Strategies for Integrating to Hadoop with Josh Klahr at Data Summit 2017

Posted May 17, 2017

Tips for Database Management and Migration in the Cloud

Posted May 17, 2017

The Importance of Information Governance in our Current Data Analytics Landscape

Posted May 17, 2017

In October of 2008, Congress enacted the Emergency Economic Stabilization Act, more commonly known as the bailout of the financial system. The understanding was that the failure of these institutions would have catastrophic financial consequences, and that those aggregate failures could devastate the U.S. The recent major outages in public cloud services inevitably lead to the same issue being considered with regard to this new industry.

Posted May 15, 2017

With the furor over fake news, in which the truth is massaged for commercial or political gain, attention has shifted away from fake data, which can have far more perilous consequences.

Posted May 15, 2017

When people talk about the next generation of applications or infrastructure, what is often echoed throughout the industry is the cloud. On the application side, the concept of "serverless" is becoming less of a pipe dream and more of a reality. The infrastructure side has already proven that it is possible to deliver the ability to pay for compute on an hourly or more granular basis.

Posted May 15, 2017

You will often hear experienced practitioners and consultants suggest that there is both an art and a science to effective data governance. The art is in the details of fine-tuning a data governance program to fit your culture and address specific business needs. But the fundamental principles of data governance are best understood and executed through science.

Posted May 15, 2017

The tide is changing for analytics architectures. Traditional approaches, from the data warehouse to the data lake, implicitly assume that all relevant data can be stored in a single, centralized repository. But this approach is slow and expensive, and sometimes not even feasible, because some data sources are too big to be replicated, and data is often too distributed to make a "full centralization" strategy successful.

Posted May 15, 2017

Over the next 6 years, the Internet of Things (IoT) market is expected to reach $883.55 billion, as connected devices continue to pour into just about every aspect of our lives. For enterprises, the IoT is helping to transform products into connected services, capable of creating recurring revenue streams, reducing costs, and enhancing customers' experiences.

Posted May 15, 2017

Data—now universally understood to be the lifeblood of businesses—is at risk like never before in the form of both malicious attacks and innocent indiscretions. Recently, Steve Grobman, CTO for McAfee, discussed the range of threats to data security and what companies must do to defend themselves.

Posted May 15, 2017

Organizations are embracing data visualization not merely as a tool to "see" trends and patterns in data but as a pathway to a dynamic culture of visual data discovery. As with any cultural shift, there will be a few bumps along the road as organizations seek innovative ways to transform data into actionable insights through the power of data visualization. However, keeping a few considerations top-of-mind in the early stages of data visualization adoption can help avoid common problems.

Posted May 15, 2017

Big data and analytics are all around these days. Most companies already have their first analytical models in production and are thinking about further boosting their performance. However, far too often, these companies focus on the analytical techniques rather than on the key ingredient: data. The best way to boost the performance and ROI of an analytical model is by investing in new sources of data which can help to further unravel complex customer behavior and improve key analytical insights.

Posted May 15, 2017

Today's headlines are filled with news about artificial intelligence (AI), proclaiming variously that robots will take our jobs, cure cancer, or change industries in ways unseen since the industrial revolution. One thing is clear to those of us watching closely, however: It's not all hype. In 2016 alone, the quantity of AI startup acquisitions was remarkable, but most of these massive investments were made by an elite corps of companies, such as Amazon, Google, Apple, Facebook and a few others.

Posted May 15, 2017

What are the enabling technologies that make enterprise architecture what it is today? There are a range of new-generation technologies and approaches shaping today's data environments. The key is putting them all together to help enterprise architecture fit into the enterprise's vision of itself as a data-driven organization. Tools and technologies emerging within today's data-driven enterprise include cloud, data lakes, real-time analytics, microservices, containers, Spark, Hadoop, and open source trends.

Posted May 15, 2017

Progress, a provider of application development and deployment technologies, recently acquired DataRPM, a privately-held provider of cognitive predictive maintenance software for the industrial IoT (IIoT) market. Mark Troester, vice president of strategy at Progress, discussed how the addition of DataRPM's predictive analytics and meta learning capabilities help to round out the Progress platform, enabling the company to embrace a "cognitive first" strategy.

Posted April 24, 2017

To meet the new demands of managing cloud infrastructure proactively, a new role has emerged: the "cloud keeper." Part technologist, part accountant, and part administrator, the cloud keeper holds financial responsibility for keeping infrastructure expenses under control and preventing financial chaos. The role is also technical: it requires an understanding of how and where resources are deployed, how each resource is paid for, and which resources can be spun up or down or would be better suited to one cloud paradigm over another.

Posted April 07, 2017

While companies often view processes from their own frame of reference, "cutting" them up by department, business objective, or other internal criteria, customers obviously do not act according to the same taxonomy. From the perspective of the company, they appear to jump from process to process, from department to department, and from channel to channel, making it difficult for businesses to truly follow a customer through his or her whole journey.

Posted April 07, 2017

Governing and managing big data are not easy tasks. In fact, let's be honest—data management and governance for data of any size is no walk in the park. But, big data makes it even tougher. From integrating the disparate data sources of seemingly unending variety to curating the chaos in the heaps of unstructured data, managing the craziness we lovingly refer to as big data is not for the faint of heart. Even for those tough as nails, the challenges of big data management can be more than frustrating. So, now that you know you are not alone in dealing with this insanity, here are a few ways to make the frustrations of big data a little less intense.

Posted April 07, 2017

By now we are all in agreement: The business of data is changing. Business users are more empowered to work with data; IT is becoming less about control and more about enablement. New job titles, such as the data scientist, are springing up as companies everywhere look for the right people with the right skill sets to squeeze more value from their data. Data itself is getting bigger, hardware more economical, and analytical software more "self-service." We've embraced the paradigm shift from traditional BI to iterative data discovery. It's a new era.

Posted April 07, 2017

As the Internet of Things (IoT) revolution works its way through marketing hype and seeks its place of valuable contribution within companies and industries, you might pause to wonder how IoT can create opportunities for your company. Yet that assessment is difficult in part because the buzz does not always align with reality. In short, it's no simple task to discern the true potential of IoT today, leaving one to wonder: What is realistic, what difference could IoT make in my company, and how mature are other companies in embracing IoT potential?

Posted April 07, 2017

The concept of data lakes is a great one, but if not done correctly, this treasure trove of information can quickly turn into a black abyss for data analysts and scientists, let alone business users.

Posted April 07, 2017

Make no mistake: Big data is promising, exciting, and effective—when done right. Once considered an overhyped buzzword, it's now a potential tool that leaders in every vertical want to harness. Unfortunately, the majority of new big data projects—about 55% of them, according to Gartner—are shuttered before they even get off the ground.

Posted April 07, 2017

There has been a sea change in how enterprises are thinking about Apache Hadoop and big data. Today, a majority of enterprises are thinking about the cloud first, not on-premises, and are increasingly relying on ecosystem standards to drive their Apache Hadoop distribution selection.

Posted April 07, 2017

It is difficult to find someone not talking about or considering using containers to deploy and manage their enterprise applications. A container just looks like another process running on a system; a dedicated CPU and pre-allocated memory aren't required in order to run a container. The simplicity of building, deploying, and managing containers is among the reasons that containers are growing rapidly in popularity.

Posted April 07, 2017

Alation and Trifacta say they are extending their partnership to jointly deliver an integrated solution for self-service data discovery and preparation that enables users to access the data catalog and data wrangling features within a single interface.

Posted March 15, 2017

SAP has announced advancements in the SAP Vora solution to help customers accelerate project implementations and improve their enterprise business analytics.

Posted March 15, 2017

Dataguise, a provider of sensitive data governance, has announced that DgSecure now provides sensitive data monitoring and masking in Apache Hive.

Posted March 15, 2017

The rise of big data and the growing popularity of cloud is a combination that presents valuable new opportunities to leverage data with greater efficiency. But organizations also need to be aware of some key differences between on-premises and cloud deployments, says Charles Zedlewski, senior vice president, products, at Cloudera.

Posted March 15, 2017

Ash Munshi, Pepperdata CEO, recently discussed the need for DevOps for big data, and the role of the Dr. Elephant project, which was open sourced in 2016 by LinkedIn and is available under the Apache v2 License.

Posted March 07, 2017

Tableau Software is releasing an updated version of its namesake platform, bringing advanced mapping capabilities to the analytics solution. Tableau 10.2 will make complex geospatial analysis easier, simplify data prep with new ways to combine and clean data, and give enterprises more tools to deliver self-service analytics at scale, according to the company.

Posted March 03, 2017

Kong Yang, head geek at SolarWinds, believes the rise of the mobile workforce and the pressure to implement new technologies means that modern IT professionals must be able to quickly evolve beyond the confines of on-premises and shift into the realm of hybrid IT. Here, Yang reflects on some of the ways that IT professionals can begin that journey.

Posted February 24, 2017

MapR Technologies, Inc., which provides a converged data platform, has introduced persistent storage for containers with complete state access to files, database tables, and message streams from any location. The MapR Converged Data Platform for Docker includes the MapR Persistent Client Container (PACC) that enables stateful applications and microservices to access data for greater application agility and faster time-to-value.

Posted February 07, 2017

