Hadoop

The Apache Hadoop framework for distributed processing of large data sets on commodity hardware is at the center of the big data picture today. Key technologies include the Hadoop Distributed File System (HDFS), YARN, MapReduce, Pig, Hive, and security tooling, as well as a growing spectrum of solutions that support business intelligence (BI) and analytics.



Hadoop Articles

StackIQ Inc., a provider of server automation solutions, has released a new version of its software with key new data center automation capabilities. "Boss 5 extends our full stack server automation capabilities to Docker containers as well as enhancing all of our Hadoop and OpenStack solutions," says Tim McIntire, CEO and co-founder of StackIQ.

Posted February 17, 2015

RapidMiner is releasing self-service analytics for Hadoop, a solution that speeds the building and deployment of advanced analytic models across numerous types of data. "Our value is around accelerating time to value for our customers. We do this through pre-built models which we call accelerators," stated Michele Chambers, president and COO at RapidMiner.

Posted February 17, 2015

To help simplify Hadoop deployments, Datameer is partnering with cloud providers Altiscale and Bigstep to introduce Datameer Professional, a SaaS big data analytics platform targeted at department-specific deployments.

Posted February 17, 2015

Hadoop heavyweight Pivotal is open sourcing components of its Big Data Suite, including Pivotal HD, HAWQ, Greenplum Database, and GemFire; forming the Open Data Platform (ODP), a new industry foundation along with founding members GE, Hortonworks, IBM, Infosys, SAS, and other big data leaders; and forging a new business and technology partnership with Hortonworks.

Posted February 17, 2015

The IOUG (Independent Oracle Users Group) is joining DBTA as a partner for Data Summit 2015, a comprehensive conference focused on information management and big data for technology and database professionals. The IOUG track at Data Summit will focus on big data in the cloud and the evolution of the data warehouse.

Posted February 12, 2015

Contending that big data is meaningless to an organization if it is not correct and complete, Trillium Software has unveiled Trillium Big Data, a data quality solution that extends the capabilities of the flagship Trillium Software System to the Hadoop big data environment. "Trillium Big Data is the second in a series of planned new product launches for Trillium Software in 2015 which will help organizations increase the value of their data," said Phil Galati, CEO at Trillium Software.

Posted February 12, 2015

Unisphere Research and Radiant Advisors have announced the publication of a new report on the emerging concepts and strategies surrounding the data lake. "We wrote The Definitive Guide to the Data Lake to provide guidance to those considering the data lake by sharing the findings of companies within our research and advisory network that are actively implementing data lake strategies today," said John O'Brien, CEO of Radiant Advisors.

Posted February 11, 2015

Someone new to big data and Hadoop might be forgiven for feeling a bit confused after reading some of the recent press coverage on Hadoop. On one hand, Hadoop has received very bullish coverage in the mainstream media. On the other, there have been a number of claims that Hadoop is overhyped. What's a person to make of all these mixed messages?

Posted February 11, 2015

Attunity recently added new capabilities to its solution suite with the acquisition of BIReady's data warehouse automation technology, which eliminates the complex, manual tasks of preparing data for BI and big data analytics. Lawrence Schwartz, Attunity's vice president of marketing, spoke with DBTA about BIReady and other Attunity solutions for customers dealing with big data projects.

Posted February 11, 2015

The "Internet of Things" (IoT) is opening up a new world of data interchange between devices, sensors, and applications, enabling businesses to monitor, in real time, the health and performance of products long after they leave the production premises. At the same time, enterprises now have access to valuable data—again, in real time if desired—on how customers are adopting products and services.

Posted February 11, 2015

Teradata is introducing new big data applications that incorporate recently acquired capabilities and technologies from Revelytix and Think Big Analytics. The new offerings include solutions targeted to specific verticals such as retail and healthcare, new Teradata Loom 2.4 capabilities that expand the depth and breadth of metadata in the data lake, and a new fixed-price/fixed-timeframe data lake optimization service. The new products and services are aimed at extending big data competency to more non-data scientist users and helping companies gain additional value from their data lake projects, said Chris Twogood, vice president of product and services marketing at Teradata.

Posted February 11, 2015

MemSQL has introduced the MemSQL Spark Connector. According to the vendor, the combination of an in-memory database from MemSQL and Spark's memory optimized processing framework gives enterprises the benefit of fast access to transactions, ETL, and analytics. The MemSQL Spark Connector is also available as an open source offering, providing developers the ability to adapt it to their needs.

Posted February 10, 2015

Hadoop has continued its growth and become part of the consciousness of decision makers dealing with big data. However, Hadoop is still too advanced for the typical business user to work with. To make it easier, Oracle has created Big Data Discovery, a product that aims to simplify Hadoop for the average business user.

Posted February 10, 2015

Splice Machine has announced that it has achieved Hortonworks Data Platform (HDP) certification by completing the required integration testing. As a result, Splice Machine customers can leverage the pre-built and validated integrations between enterprise technologies and HDP, an open source Hadoop distribution, to simplify and accelerate their Splice Machine and Hadoop deployments.

Posted February 05, 2015

During a live event, Larry Ellison, Oracle's executive chairman of the board and CTO, outlined a new strategy for reducing customer costs and increasing value with the company's next generation of engineered systems. In the presentation, Ellison emphasized two key points.

Posted February 04, 2015

VoltDB, which provides an in-memory, scale-out SQL database, is releasing version 5.0 of its software. "Developers are in need of better tools with which to develop fast data streaming applications with real-time analytics and decision making across industries," said Bruce Reading, president and CEO of VoltDB.

Posted February 03, 2015

BlueData Software, Inc., which provides the EPIC software platform that allows enterprises to create a public cloud-like experience in their on-premises environments, has announced a technology preview of the Tachyon in-memory distributed storage system as a new option. The integration enables Hadoop and HBase virtual clusters, as well as other applications provisioned in the BlueData platform, to take advantage of Tachyon's high-performance in-memory data processing.

Posted February 03, 2015

In 2014, the big data drumbeat continued to pound, major DBMS vendors expanded their product offerings, Microsoft hired a new CEO, and a range of new technology offerings were introduced. In retrospect, what stands out?

Posted January 29, 2015

To help organizations leverage the full range of big data to drive better decision making, Novetta has launched data refinement, entity resolution and analysis software that it says will power large-scale analytics on all data in Hadoop. The solution is now certified to run on Cloudera CDH and Hortonworks HDP.

Posted January 23, 2015

It is no secret that we are in the data age. Data comes at us from all directions, in all shapes and sizes. Incumbent vendors and startups constantly add new features, build on top of emerging open source projects, and claim to solve the next wave of challenges. Within the Hadoop ecosystem alone, there are (at least) 11 Hadoop-related open source projects. Making sense of it all can be a time-consuming headache. To bring clarity and peace of mind, here are the top 5 big data predictions for 2015 and beyond.

Posted January 21, 2015

Registration is now open for Data Summit 2015, providing the opportunity to connect with the best minds in the industry, learn what works, and chart your course forward in an increasingly data-driven world. The event offers a comprehensive educational experience designed to guide attendees through the key issues in data management and analysis today.

Posted January 21, 2015

In 2014, we continued to watch big data expand what is "big" about data and its business analytics capabilities. We also saw the emergence (and early acceptance) of Hadoop Version 2 as a data operating platform, with YARN (Yet Another Resource Negotiator) and HDFS (Hadoop Distributed File System) as its cornerstones. In 2015, mainstream adoption within enterprise data strategies and acceptance of the data lake will continue as data management and governance practices provide further clarity. The cautionary tale of 2014, that business outcomes rather than the hype of previous years should drive big data adoption, will likewise carry forward.

Posted January 21, 2015

Xplenty, which provides a big data processing platform powered by Hadoop, is partnering with Segment, which provides a customer data hub. Segment is a universal integration layer that supports customer data collection. Through a single platform, it collects, translates, and routes data to analytics and marketing tools.

Posted January 20, 2015

The manageability track at COLLABORATE 15 - IOUG Forum is designed to provide DBAs of every experience level and industry with the knowledge to streamline IT management processes at their organizations and accelerate their transformation to the cloud.

Posted January 14, 2015

MapR, a provider of an enterprise-grade distributed data platform that includes Hadoop, has announced a relationship with SAS, a provider of business analytics software and services. "If I were to summarize the journey the two companies are on, it is about getting data results bigger and faster," explained Jack Norris, CMO, MapR Technologies. The partnership will allow for additional flexibility and control of Hadoop-based data.

Posted December 29, 2014

As analytics continues to play a larger role in the enterprise, the need to leverage and protect data looms larger. According to IDC, the big data and analytics market will reach $125 billion worldwide in 2015. Here are 10 predictions from industry experts about data and analytics in 2015.

Posted December 19, 2014

In its fourth big data-related acquisition this year, Teradata announced it has acquired RainStor, a privately held company specializing in online big data archiving on Hadoop. RainStor's technology offers three key advantages, explains Chris Twogood, vice president of products and services at Teradata: it enables extreme data compression, shrinking data by 10x to 40x; RainStor data is immutable, which is important for compliance and security regulations; and it is all accessible via SQL.

Posted December 18, 2014

2015 is going to be a big year for big data in the enterprise, according to Oracle. Neil Mendelson, Oracle vice president of big data and advanced analytics, shared Oracle's "Top 7" big data predictions for 2015. "The technology is moving very quickly and it is getting to the point where a broader set of people can get into it - not just because it is affordable - but because they no longer require specialized skills in order to take advantage of it," he said.

Posted December 17, 2014

Data is increasingly being recognized as a rich resource flowing through organizations from a continually growing range of sources. But to realize its full potential, this data must be accessed by an array of users to support both real-time decision making and historical analysis, integrated with other information, and still kept safe from hackers and others with malicious intent. Fortunately, leading vendors are developing products and services to help. Here, DBTA presents the list of Trend-Setting Products in Data and Information Management for 2015.

Posted December 17, 2014

BlueData, which provides EPIC Enterprise, software that enables enterprises to create a self-service cloud experience on premises, has announced the growth of its big data partner ecosystem. Thirteen new companies spanning infrastructure, big data distributions, ETL/BI applications, and systems integrators have joined the partner program to accelerate the adoption of on-premises big data private clouds.

Posted December 16, 2014

There is still time to submit a speaking proposal for DBTA's Data Summit 2015, which will take place at the New York Hilton Midtown, May 11-13, 2015.

Posted December 08, 2014

Real-time fraud analytics provider Argyle Data is partnering with Hortonworks, a contributor to and provider of enterprise Apache Hadoop. By joining the Hortonworks Technology Partner Program, Argyle says it can rely on Hadoop to help strengthen its ability to drive and lead efforts related to advanced analytics and emerging technologies, including petabyte-scale data storage, machine learning and deep-packet inspection (DPI).

Posted December 02, 2014

GoGrid, an infrastructure-as-a-service provider specializing in multi-cloud solutions, is partnering with Cloudera, a provider of Hadoop-based software and services. The partnership will allow companies to evaluate and run Cloudera's big data platform through Cloudera Live. Traditional methods of deploying Hadoop require on-premises work just to test potential solutions; the GoGrid-Cloudera partnership allows customers to run Cloudera on GoGrid's cloud infrastructure. What makes GoGrid unique is its 1-Button Deploy orchestration process, according to John Keagy, founder and CEO of GoGrid.

Posted December 02, 2014

To help IT and business stakeholders take action to benefit from the emerging technologies and trends in information management, Database Trends and Applications has just published the second annual Big Data Sourcebook, a free resource.

Posted November 25, 2014

Oracle has announced an updated version of Oracle GoldenGate 12c. With GoldenGate 12c, customers can implement real-time data integration and transactional data replication between on-premises and cloud environments and across a broader set of heterogeneous platforms, achieving faster time to value and a greater return from their data assets. "Inherent in the concept of integration is that we can effectively cover both like and unlike platforms, and that we offer our customers the ability to effectively capture and move their data regardless of which systems, platforms, and vendors their data originates from," said Jeff Pollock, vice president of product management at Oracle.

Posted November 19, 2014

Splice Machine today announced the general availability of its Hadoop RDBMS, a platform for building real-time, scalable applications that incorporates new features which emerged from charter customers using the beta offering. With the additional new features and the validation from beta customers, Splice Machine 1.0 can support enterprises struggling with their existing databases and seeking to scale out affordably, said Monte Zweben, co-founder and CEO, Splice Machine.

Posted November 19, 2014

The call for speakers for Data Summit 2015 at the New York Hilton Midtown, May 11-13, 2015, is now officially open. The deadline for submitting proposals is December 5, 2014.

Posted November 19, 2014

Concurrent has announced the latest version of Driven, a big data application performance-monitoring and management system. Driven is purpose-built to address the challenges of enterprise application development and deployment for business-critical data applications, delivering control and performance management for enterprises seeking to achieve operational excellence.

Posted November 18, 2014

Seagate Technology, a provider of storage solutions, has introduced ClusterStor Hadoop Workflow Accelerator. The solution is expected to be a boon to computationally intensive high performance data analytics environments, enabling them to achieve a significant reduction in data transfer time.

Posted November 17, 2014
