Big Data Quarterly Articles



No longer the stuff of science fiction, the business uses for cognitive computing, artificial intelligence, and machine learning today include fields as diverse as medicine, marketing, defense, energy, and agriculture. Enabling these applications is the vast amount of data that companies are collecting from machine sensors, instruments, and websites and the ability to support smarter solutions with faster data processing.

Posted November 13, 2017

AtScale is releasing a universal semantic platform for business intelligence (BI) with Microsoft Azure HDInsight, providing enterprises with faster time to insight. AtScale's new offering enhances Azure HDInsight customers' ability to quickly turn their big data lake into a highly available and performant analytical database.

Posted November 02, 2017

There's a surprising trick for greatly increasing the chances of real impact and true success with many types of machine learning systems: do the logistics correctly and efficiently. That sounds like simple advice, and it is, but the impact can be enormous. If the logistics are not handled well, machine learning projects generally fail to deliver practical value; in fact, they may fail to deliver at all. Yet carrying out this advice may not seem simple at all.

Posted October 26, 2017

After years of ambiguous expectations about data governance, organizations now have a better handle on how these programs can help them manage the exponential growth of the data they generate. More importantly, there is an understanding about how the integration, organization, and alignment of that data can help meet or exceed business and technology goals.

Posted October 18, 2017

As companies grow increasingly data-centric in their decision making, product and services development, and their overall understanding of the world they work in, speed and agility are becoming critical capabilities. A common theme in big data and analytics today is "Industry 4.0," representing a new wave of technology that enables the automation necessary for scaling. There's compelling justification for this as companies seek to unlock business value from big data with two broad approaches: the democratization of data with greater access by more users, and the enablement of automation everywhere possible.

Posted September 20, 2017

The movement toward the instrumentation of everything and the democratization of data and analytics is resulting in more data flowing to more users, and is creating new challenges in data management.

Posted September 20, 2017

Over the last few years, organizations have shifted from using virtual data centers to creating private or hybrid IaaS clouds that allow authorized users to perform self-service provisioning of virtual machines. These environments have reduced administrative workloads, improved the user experience, and discouraged shadow IT, but they have also brought their own challenges. As virtualized environments increase in scale, management techniques have often become far less effective, making it difficult to keep track of virtual machines, their owners, and why the virtual machines were created in the first place.

Posted September 20, 2017

Companies today are spreading their applications across multiple clouds in a hybrid fashion. According to a recent IDC CloudView study among 6,000 IT and line-of-business executives whose organizations have adopted cloud technologies, 73% are implementing a hybrid strategy, which most defined as utilizing more than one public cloud in addition to dedicated assets.

Posted September 20, 2017

Many people are unsure of the differences between deep learning, machine learning, and artificial intelligence. Generally speaking, and with minimal debate, it is reasonably well accepted that artificial intelligence can most easily be categorized as that which we have not yet figured out how to solve, while machine learning is a practical application with the know-how to solve problems, such as anomaly detection.
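As a toy illustration of that point (not drawn from any particular product or article), a few lines of Python can flag anomalies by z-score, the kind of well-understood problem that machine learning techniques routinely address at much larger scale:

```python
import statistics

def find_anomalies(values, threshold=2.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > threshold]

# A sensor stream with one obvious outlier.
readings = [10, 11, 10, 12, 11, 50]
print(find_anomalies(readings))  # the outlier 50 is flagged
```

Real systems replace the fixed threshold with models learned from data, but the underlying task is the same: separating expected behavior from the unexpected.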

Posted September 20, 2017

When it comes to visualizing data, there is no shortage of charts and graphs to choose from. From traditional graphs to innovative hand-coded visualizations, there is a continuum of visualizations ready to translate data from numbers into meaning using shapes, colors, and other visual cues. However, each visualization type is intended to show different types of data in specific ways to best represent its insight. Let's look at five of the most common visualization types to help you choose the right chart for your data.

Posted September 20, 2017

Businesses of all sizes across all industries are rapidly adopting digital transformation models that put data at the center of driving the business forward—as they should. However, putting data at the center of everything the business does can be risky without proper planning and rigorous management. Many companies have been wise to introduce data governance programs to protect corporate data assets and establish a framework for operational excellence when it comes to data management and use. Data governance emphasizes the enforcement of defined standards or policies and provides mechanisms for consistency and repeatable processes, but it is not enough to protect businesses in today's world of data.

Posted September 20, 2017

Nowadays, many firms are already using big data and analytics to manage and optimize their customer relationships. These technologies can also be leveraged to benefit a firm's other key asset: its employees. There are many possible applications of HR analytics, also called workforce analytics.

Posted September 20, 2017

While Vic Damone and Jane Powell wanted their eggs with a kiss in the 1950s musical Rich, Young and Pretty, in the near future, your kitchen might well know exactly how you want them thanks to the Internet of Things (IoT).

Posted September 20, 2017

Tick tock, tick tock—back and forth swings the privacy pendulum. While we in the U.S. continue to regress on issues of data privacy, the European Union (EU) is proceeding with bold steps to protect the privacy of its citizens. On May 25, 2018, the General Data Protection Regulation (GDPR) becomes the law of the land in the EU. It applies to any company that processes or holds data on EU residents, regardless of where it is located in the world. Popular applications such as Facebook, Twitter, and Airbnb are among the companies that will be directly impacted by this law. If you do business with EU residents, regardless of geographic locality, this law directly applies to you.

Posted September 20, 2017

Qubole Data Service provides a single platform for ETL, reporting, ad hoc analysis, stream processing and machine learning. It runs on AWS, Microsoft Azure and Oracle Bare Metal Cloud, taking advantage of the elasticity and scale of the cloud, and also supports leading open source engines, including Apache Spark, Hadoop, Presto, and Hive.

Posted September 12, 2017

The Apache Arrow project is a standard for representing columnar data for in-memory processing, which has a different set of trade-offs compared to on-disk storage. In memory, access is much faster, and processes optimize for CPU throughput by paying attention to cache locality, pipelining, and SIMD instructions.
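The columnar principle behind that design is easy to sketch in plain Python (this is an illustration of the idea, not Arrow's actual API): storing a column as one contiguous typed buffer keeps a scan cache-friendly, whereas row-oriented records scatter each column's values across memory.

```python
from array import array

# Row-oriented layout: each record is its own object, so scanning one
# column means chasing pointers all over the heap.
rows = [{"id": i, "price": float(i)} for i in range(1000)]
row_sum = sum(r["price"] for r in rows)

# Column-oriented layout (the Arrow approach): one contiguous buffer of
# doubles, which is cache-friendly and, in native engines, amenable to
# pipelining and SIMD.
price_column = array("d", (float(i) for i in range(1000)))
column_sum = sum(price_column)

assert row_sum == column_sum == 499500.0
```

Both layouts hold the same values; the difference is purely in memory locality, which is exactly the property Arrow standardizes so that engines can exchange data without copying or converting it.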

Posted September 12, 2017

New multi-cloud capabilities scale discovery and dependency mapping of all assets, going beyond on-prem data centers to public and private clouds.

Posted September 12, 2017

As organizations increasingly move their data and applications from on-premise deployments to the cloud, the role of the DBA is also shifting. According to Penny Avril, vice president of product management, Oracle Database, the transition means that DBAs have the opportunity to move from being data custodians and keepers to taking on a more strategic role in their organizations. But, she says, the time to prepare for the new cloud reality is now.

Posted September 07, 2017

Evaluating new and disruptive technologies, as well as when and where they may prove useful, is a challenge. Against the rapidly evolving big data scene, this year, Big Data Quarterly presents the newest "Big Data 50," an annual list of forward-thinking companies that are working to expand what's possible in terms of collecting, storing, and deriving value from data.

Posted September 07, 2017

InfluxData provides an open source platform built for metrics, events, and other time-based data. Recently, Evan Kaplan, CEO of InfluxData, reflected on the future of databases and why time-series represents the next wave of databases for data—from humans, sensors, and machines.

Posted August 30, 2017

These days, end users—be they employees or consumers visiting a site—expect information delivered in seconds, if not nanoseconds. Applications tied into networks of connected devices and sensors are powering operations and making adjustments on a real-time basis.

Posted August 16, 2017

Barracuda Networks, a provider of cloud-enabled security and data protection solutions, has added the ability to replicate data from either an on-premises physical or virtual backup appliance to AWS. The new feature provides customers, resellers, and MSPs with greater flexibility and choice to protect their data from data loss and potential disasters, including security threats like ransomware. This adds an additional option for customers, in addition to the ability to replicate to the Barracuda Cloud.

Posted August 15, 2017

As data flows into businesses faster than ever before, time-to-insight and time-to-action are critical competitive differentiators, and the demand for fast access to information is growing.

Posted August 08, 2017

Dremio has announced its launch in the data analytics market with the availability of the Dremio Self-Service Data Platform. According to Dremio, its platform allows users to be independent and self-directed in their use of data, while accessing data from a variety of sources at scale.

Posted July 19, 2017

Pricchaa has released a free solution for detecting, encrypting, and monitoring sensitive data housed in the Amazon Web Services (AWS) Cloud.

Posted July 13, 2017

As data continues to impact every facet of every business, more and more Global 2000 companies are choosing NoSQL databases to power their digital economy applications.

Posted June 15, 2017

The Internet of Things continues to grow exponentially, disrupting markets and causing enterprises to grapple with more data than ever before. Without the right data management strategy, investments in IoT can yield limited results. DBTA recently held a webinar featuring John O'Brien, principal advisor and CEO of Radiant Advisors, and Vijay Raja, solutions marketing lead, IoT, at Cloudera, who discussed key drivers and patterns for IoT adoption across industries.

Posted June 15, 2017

The Apache Arrow project is a standard for representing data for in-memory processing. Hardware evolves rapidly. Because Apache Arrow was designed to benefit many different types of software in a wide range of hardware environments, the project team focused on making the work "future-proof," which meant anticipating changes to hardware over the next decade.

Posted June 14, 2017

Looker, provider of a cloud data platform, has announced Instant Insight, a new feature that allows users to analyze their data without waiting for help from an analyst.

Posted June 14, 2017

Attunity Ltd., a provider of data integration and big data management software solutions, is launching a new solution, Attunity Compose for Hive, which automates the process of creation and continuous loading of operational and historical data stores in a data lake.

Posted June 13, 2017

Addressing the rise of hybrid deployments, Hortonworks has introduced a new software support subscription to provide seamless support to organizations as they transition from on-premise to cloud. Separately, Hortonworks also announced the general availability of Hortonworks Dataflow (HDF) 3.0, a new release of its open source data-in-motion platform, which enables customers to collect, curate, analyze and act on all data in real-time, across the data center and cloud.

Posted June 12, 2017

Hadoop adoption is growing and so is the commitment to data lake strategies. Data security, governance, integration, and access have all been identified as critical success factors for data lake deployments.

Posted June 09, 2017

The demand for speed and agility are among the key drivers of the growing DevOps movement, which seeks to better align software development and IT operations. Yet, challenges still exist.

Posted June 07, 2017

With the recently unleashed WannaCry ransomware attacks that targeted computer systems globally fresh in attendees' minds, a number of Data Summit 2017 sessions looked at the need for smarter approaches to data governance and data security.

Posted May 24, 2017

The rise of streaming data and IoT and its implications for the enterprise were considered during sessions at Data Summit 2017, an annual conference presented by Big Data Quarterly and Database Trends and Applications, in NYC.

Posted May 24, 2017

Data Summit 2017 was recently held in NYC. New big data technologies, cloud, and analytics were among the key areas scrutinized in educational presentations, keynotes, and hands-on workshops.

Posted May 24, 2017

Enterprises are always looking to improve their business intelligence strategies. Hadoop is one tool that can successfully support the onboarding of business intelligence workloads. Josh Klahr, vice president of AtScale, addressed the "Do's and Don'ts for Success with BI on Big Data" during his session at Data Summit 2017.

Posted May 17, 2017

Cloud represents a new way of approaching database management, potentially relieving enterprise data shops of many administrative burdens. But in reality, for the DBA, the cloud has also become the new bottleneck: developers can open up new cloud accounts so easily that each one immediately becomes a new responsibility for the DBAs who are the gatekeepers. This trend will only increase as cloud adoption grows, said Kellyn Pot'Vin-Gorman, technical intelligence manager for the Office of the CTO at Delphix, who presented a session at Data Summit 2017 titled "Database Management & the Cloud."

Posted May 17, 2017

The second day of Data Summit 2017 began by focusing on the current state of big data analytics as it meets information governance. The keynote was presented by Linda G. Sharp, associate general counsel at ZL Technologies, and Bennett B. Borden, chief data scientist at Drinker Biddle & Reath.

Posted May 17, 2017

In October of 2008, Congress enacted the Emergency Economic Stabilization Act, more commonly known as the bailout of the financial system. The understanding was that catastrophic financial consequences would result from the failure of these entities and that those aggregate failures could devastate the U.S. The recent major outages in public cloud services inevitably lead to the same issue being considered with regard to this new industry.

Posted May 15, 2017

With the furor over fake news, where the truth is massaged for commercial or political gain, the focus has gone off fake data—which can have a lot more perilous consequences.

Posted May 15, 2017

When people talk about the next generation of applications or infrastructure, what is often echoed throughout the industry is the cloud. On the application side, the concept of "serverless" is becoming less of a pipe dream and more of a reality. The infrastructure side has already proven that it is possible to deliver the ability to pay for compute on an hourly or more granular basis.

Posted May 15, 2017

You will often hear experienced practitioners and consultants suggest that there is both an art and a science to effective data governance. The art is in the details of fine-tuning a data governance program to fit your culture and address specific business needs. But the fundamental principles of data governance are best understood and executed through science.

Posted May 15, 2017

The tide is changing for analytics architectures. Traditional approaches, from the data warehouse to the data lake, implicitly assume that all relevant data can be stored in a single, centralized repository. But this approach is slow and expensive, and sometimes not even feasible, because some data sources are too big to be replicated, and data is often too distributed to make a "full centralization" strategy successful.

Posted May 15, 2017

Over the next 6 years, the Internet of Things (IoT) market is expected to reach $883.55 billion, as connected devices continue to pour into just about every aspect of our lives. For enterprises, the IoT is helping to transform products into connected services, capable of creating recurring revenue streams, reducing costs, and enhancing customers' experiences.

Posted May 15, 2017

Data—now universally understood to be the lifeblood of businesses—is at risk like never before in the form of both malicious attacks and innocent indiscretions. Recently, Steve Grobman, CTO for McAfee, discussed the range of threats to data security and what companies must do to defend themselves.

Posted May 15, 2017

Organizations are embracing data visualization not merely as a tool to "see" trends and patterns in data but as a pathway to a dynamic culture of visual data discovery. As with any type of cultural shift, there are going to be a few bumps along the road as organizations seek innovative ways to transform data into actionable insights through the power of data visualization. However, with a few considerations kept top-of-mind in the early stages of data visualization adoption, common problems can be avoided.

Posted May 15, 2017

Big data and analytics are all around these days. Most companies already have their first analytical models in production and are thinking about further boosting their performance. However, far too often, these companies focus on the analytical techniques rather than on the key ingredient: data. The best way to boost the performance and ROI of an analytical model is by investing in new sources of data which can help to further unravel complex customer behavior and improve key analytical insights.

Posted May 15, 2017

Today's headlines are filled with news about artificial intelligence (AI), proclaiming variously that robots will take our jobs, cure cancer, or change industries in ways unseen since the industrial revolution. One thing is clear to those of us watching closely, however: It's not all hype. In 2016 alone, the quantity of AI startup acquisitions was remarkable, but most of these massive investments were made by an elite corps of companies, such as Amazon, Google, Apple, Facebook and a few others.

Posted May 15, 2017

What are the enabling technologies that make enterprise architecture what it is today? There are a range of new-generation technologies and approaches shaping today's data environments. The key is putting them all together to help enterprise architecture fit into the enterprise's vision of itself as a data-driven organization. Tools and technologies emerging within today's data-driven enterprise include cloud, data lakes, real-time analytics, microservices, containers, Spark, Hadoop, and open source trends.

Posted May 15, 2017
