Hadoop

The Apache Hadoop framework for processing large data sets on commodity hardware is at the center of the Big Data picture today. Key solutions and technologies include the Hadoop Distributed File System (HDFS), YARN, MapReduce, Pig, Hive, and security, along with a growing spectrum of solutions that support Business Intelligence (BI) and Analytics.
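For readers new to the stack, the MapReduce model at the heart of Hadoop is easiest to see in a small example. The following is a minimal sketch of a word-count job written for Hadoop Streaming, which lets any program that reads stdin and writes stdout act as a mapper or reducer; the script name and invocation details are illustrative assumptions, not part of any product covered on this page.

```python
#!/usr/bin/env python
# wordcount.py -- a minimal word-count job for Hadoop Streaming.
# Hadoop Streaming runs any executable that reads stdin and writes stdout,
# so the same script can serve as the mapper ("map" argument) or the reducer.
import sys

def mapper():
    # Emit "word<TAB>1" for every word of every input line.
    for line in sys.stdin:
        for word in line.split():
            print("%s\t%d" % (word.lower(), 1))

def reducer():
    # Streaming sorts mapper output by key, so counts for a word arrive together.
    current, count = None, 0
    for line in sys.stdin:
        word, _, value = line.rstrip("\n").partition("\t")
        if word != current:
            if current is not None:
                print("%s\t%d" % (current, count))
            current, count = word, 0
        count += int(value)
    if current is not None:
        print("%s\t%d" % (current, count))

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```

Submitted through the standard hadoop-streaming jar with this script as both mapper and reducer, the job reads its input from HDFS and writes the aggregated counts back to HDFS; higher-level tools such as Pig and Hive generate comparable cluster work from scripts and SQL-like queries instead.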



Hadoop Articles

The future will flourish with machines. We've been told this in pop culture for decades, from the helpful robots of the Jetsons, to the infamous Skynet of the Terminator movies, to the omniscient "computer" of Star Trek. Smart, connected devices will be ubiquitous and it's up to us, the humans, to decide what's next. But the Internet of Things (IoT) is about more than devices and data.

Posted April 23, 2015

SUSE and Veristorm are partnering to provide certified high-performance Hadoop solutions that run directly on Linux on IBM z Systems, IBM Power Systems, and x86-64. Customers with IBM z Systems can team SUSE Linux Enterprise Server for System z with Veristorm zDoop, a commercial distribution of Hadoop supported on mainframes.

Posted April 23, 2015

Many DBAs are now tasked with managing multi-vendor environments, and handling a variety of data types. Increasingly, DBAs are turning to strategies such as database automation to be able to concentrate more on the big picture of moving their enterprises forward.

Posted April 23, 2015

While the new data stores and other software components are generally open source and incur little or no licensing costs, the architecture of the new stacks grows ever more complex, and this complexity is creating a barrier to adoption for more modestly sized organizations.

Posted April 22, 2015

To help organizations answer questions with data spread across disparate analytics systems and data repositories, Teradata has expanded its QueryGrid technologies. "With this announcement we have our foot on the gas pedal," said Imad Birouty, director of product marketing at Teradata. "We have seven updates. We are announcing new connectors that are on their way, announcing that we have delivered on the connectors that we previously announced, and we are refreshing previously released connector versions of the technologies."

Posted April 20, 2015

Unstructured data types and new database management systems are playing an increasing role in the modern data ecosystem, but structured data in relational database management systems (RDBMS) remains the foundation of the information infrastructure in most companies. In fact, structured data still makes up 75% of data under management for more than two-thirds of organizations, with nearly one-third of organizations not yet actively managing unstructured data at all, according to a new survey commissioned by Dell Software and conducted by Unisphere Research, a division of Information Today, Inc.

Posted April 15, 2015

Voting has opened for the 2015 DBTA Readers' Choice Awards. This year, there are more than 300 nominees across 29 categories. Unlike awards programs that rely on the evaluations of editorial and publishing staff, the DBTA Readers' Choice Awards are unique in that the winning information management solutions are chosen by you—the people who actually use them.

Posted April 14, 2015

A host of questions surround the implementation of data virtualization and, as the concept becomes commonplace, more businesses need answers and assistance with putting the approach into practice. To address these issues, Lindy Ryan, research director for Radiant Advisors' Data Discovery and Visualization Practice, will present Data Summit 2015 attendees with a toolkit for adopting data virtualization.

Posted April 14, 2015

AtScale, Inc. has introduced a platform that will enable interactive, multi-dimensional analysis on Hadoop, directly from standard business intelligence tools such as Microsoft Excel, Tableau Software or QlikView. Dubbed the "AtScale Intelligence Platform," the new offering provides a Hadoop-native analysis server that allows users to analyze big data at full scale and top speed, while leveraging the existing BI tools they already own.

Posted April 14, 2015

Think Big, a Teradata company, has introduced the Dashboard Engine for Hadoop, which enables organizations to access and report on big data in Hadoop-based data lakes to make agile business decisions. "There are endless streams of data from web browsers, set top boxes, and contact centers that often land in Hadoop, but sometimes don't make their way into downstream analytics," said Ron Bodkin, president, Think Big.

Posted April 13, 2015

Pivotal has proposed "Project Geode" for incubation by the Apache Software Foundation (ASF). A distributed in-memory database, Geode will be the open source core of Pivotal GemFire, and is now available for review at network.pivotal.io. Pivotal plans to contribute to, support, and help build the Project Geode community while simultaneously producing its commercial distribution of Pivotal GemFire.

Posted April 13, 2015

Hortonworks, a contributor to and provider of enterprise Apache Hadoop, has signed a definitive agreement to acquire SequenceIQ. "This acquisition complements our strategy of providing enterprise customers the broadest choice of consumption options for Hortonworks Data Platform, from on-premise deployments to cloud architectures," said Rob Bearden, chief executive officer of Hortonworks.

Posted April 13, 2015

Oracle has unveiled Oracle Data Integrator for Big Data to help make big data integration more accessible and actionable for customers. The goal of the new data integration capabilities is to unite the disparate communities that have emerged within the Oracle client base, bringing mainstream DBAs and ETL developers together with big data development teams on a single platform for collaboration, said Jeff Pollock, vice president of product management at Oracle.

Posted April 08, 2015

To fully take advantage of big data tools and architectures, businesses need to adopt a different mindset, according to Edd Dumbill, who contends that looking at the data value chain is the first step to understanding the value of data.

Posted April 08, 2015

Teradata made its fourth acquisition of 2014 in the big data space with the purchase of Rainstor, a privately held company specializing in online big data archiving on Hadoop. Here, Chris Twogood, vice president of products and services marketing at Teradata, explains why the newly added technologies and services are important to Teradata's big data portfolio.

Posted April 08, 2015

There is no one single path to the data lake within the data architecture of the organization. Likewise, each data lake is unique, with inputs and decisions from the organization contributing a variety of essential elements in organization, governance, and security.

Posted April 08, 2015

In order to truly appreciate Apache Drill, it is important to understand the history of the projects in this space, as well as the design principles and the goals of its implementation.

Posted April 08, 2015

How does an organization acknowledge that data is important? An organization does so by enabling and supporting efforts for gathering and persisting information about the organization's data resources.

Posted April 06, 2015

Using StretchDB, an enterprise can "stretch" an on-premises database into the cloud, such that "hot," heavily used data is stored in the on-premises instance of SQL Server, while "cold" and infrequently used data is transparently stored in Azure. A stretched database automatically and transparently manages synchronization and movement of aging data from on-premises to the cloud.
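For readers unfamiliar with the pattern, the hot/cold split behind this feature can be sketched independently of SQL Server. The Python below is a purely conceptual illustration, not the StretchDB feature or its API: the local_store and cloud_store dictionaries and the one-year threshold are hypothetical stand-ins for the local instance and the Azure tier.

```python
from datetime import datetime, timedelta

# Conceptual sketch of "stretching" a table: rows touched recently stay in the
# local (hot) store, rows that have gone cold are migrated to a cloud store.
# Both stores are modeled here as plain dicts keyed by row id.

COLD_AFTER = timedelta(days=365)  # hypothetical age at which a row is "cold"

def migrate_cold_rows(local_store, cloud_store, now=None):
    """Move rows whose last access is older than COLD_AFTER to the cloud tier."""
    now = now or datetime.utcnow()
    for row_id, row in list(local_store.items()):
        if now - row["last_access"] > COLD_AFTER:
            cloud_store[row_id] = local_store.pop(row_id)

def read_row(row_id, local_store, cloud_store):
    """Transparent read path: try the hot tier first, then fall back to the cloud tier."""
    if row_id in local_store:
        return local_store[row_id]
    return cloud_store.get(row_id)  # slower path, invisible to the caller
```

In the actual feature this migration and fallback logic lives inside the database engine, which is what makes the stretch transparent to queries and applications.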

Posted April 06, 2015

With the advent of big data, the scope of business intelligence and analytics is expanding. To shed light on the changes taking place within BI as a result of big data, Ian Abramson, author, principal senior consultant at SWI, and a past president of the IOUG, will present a session at Data Summit 2015 titled "The New Analytic Paradigm: The Big Evolution of BI." Abramson will present his session at 10:45 am on Wednesday, May 13, as part of the IOUG Track.

Posted April 06, 2015

The Independent Oracle Users Group (IOUG) has been serving Oracle technologists and professionals for more than 20 years, and we are very pleased with how much the community has grown as well as how much IOUG has accomplished. Having said this, we will not rest on our laurels. There are many great opportunities that lie ahead of us. While we set the bar pretty high in 2014 with the establishment of the content-rich blog #IOUGenius, an increased number of Master Classes offered across the nation and a truly inspirational COLLABORATE 14, you'll be very pleased with what IOUG has in store for 2015.

Posted April 01, 2015

Syncsort is partnering with Impetus Technologies to provide integrated solutions for building real-time, streaming analytics applications integrated with Apache Kafka, RabbitMQ, and other message brokers on Hadoop. While Syncsort has contributed to open source projects for several years, the partnership with Impetus is taking the next step in streamlining data analytics for real-time applications.

Posted April 01, 2015

Splice Machine has formed a strategic partnership with RedPoint Global, to provide a marketing solution for big data. According to the vendors, with database technology from Splice Machine and cross-channel marketing and data quality technology from RedPoint, the partnership provides a platform that can use big data to enable personalized, real-time interactions to engage customers across channels.

Posted March 31, 2015

Mtelligence Corporation (dba Mtell) and MapR Technologies have introduced a new big data platform called Mtell Reservoir that combines the MapR Distribution including Hadoop, Mtell Previse software, and OpenTSDB (time-series database) software technology. The solution is targeted at the oil and gas industry.

Posted March 31, 2015

Data science, famously described as the sexiest job of the 21st century in the Harvard Business Review, represents the ability to sift through massive amounts of data to discover hidden patterns and predict future trends and actions. The catch is that it requires an understanding of many elements of data analytics. To help provide information on how to harness the vast potential this data represents, Joe Caserta, president and founder of Caserta Concepts, will present an "Introduction to Data Science" workshop at the Data Summit 2015 conference in New York City and also participate in a panel discussion titled "The Data Lake: From Hype to Reality."

Posted March 31, 2015

The deadline for nominating products and services for the Database Trends and Applications Readers' Choice Awards is coming soon. This is a unique opportunity because the 2015 DBTA Readers' Choice Awards is a program in which the winners will be selected by the experts whose opinions count above all others - you.

Posted March 26, 2015

The age of big data is here and with it comes a unique set of problems. Anne Buff, business solutions manager for SAS best practices at the SAS Institute, hopes to help companies avoid those pitfalls by showing the importance of virtualizing data environments or taking advantage of cloud technologies.

Posted March 26, 2015

The ability to transform data into a competitive edge and financial benefit requires organizations to pay attention to the evolving trends in analytics across mobile and cloud applications, new data platforms and data discovery tools.

Posted March 26, 2015

COLLABORATE is the single event that delivers the full range of Oracle applications and technology from three independent Oracle users groups. This year, COLLABORATE 15: Technology and Applications Forum will provide the Oracle Community with more than 1,000 sessions and panels covering first-hand experiences, case studies, how-to content, news and information from Oracle executive management, opportunities for networking, SIG meetings, a Women in Technology Forum, and an exhibitor showcase highlighting products and solutions that can help solve real-world challenges.

Posted March 26, 2015

Talend has introduced a new solution called Talend Integration Cloud to provide instant, elastic and secure capacity so IT teams can more easily shift workloads between on-premise and cloud environments. Targeted at SMBs, large enterprises, and IT integration developers, and planned for availability in April, the hosted cloud integration platform will provide a single solution for bulk, batch and real-time data integration across hundreds of data sources including Hadoop, Amazon Redshift and NoSQL databases, said Ashley Stirrup, chief marketing officer for Talend.

Posted March 24, 2015

EMC has unveiled a new fully engineered solution incorporating storage and big data analytics technologies from EMC Information Infrastructure, Pivotal, and VMware. Dubbed "the Federation Business Data Lake (FBDL)," it is designed for speed, self-service, and scalability for the enterprise, enabling organizations to deploy Hadoop and real-time analytics capabilities in as little as 7 days.

Posted March 23, 2015

It looks like 2015 will be an important year for big data and many other technologies such as HTAP and in-memory computing. Many businesses have gone from investigation to experimentation to actual implementation. With installations coming online, and more to come in 2015 and beyond, big data will become more efficient and more customer-focused. Essentially, what many saw as hype will now turn into real implementations.

Posted March 12, 2015

Hadoop distribution provider MapR Technologies has announced test results based on TPCx-HS, the recently released big data benchmark from the Transaction Processing Performance Council (TPC). The benchmark is a series of tests that compares Hadoop architectures across several dimensions. Cisco is also now reselling the MapR Distribution with Cisco UCS, as part of an agreement that includes worldwide marketing, sales, and training.

Posted March 05, 2015

"This acquisition is strategic, synergistic, and will strengthen our leadership in the big data and Hadoop market," said Shimon Alon, Attunity's chairman and CEO, during a conference call this morning discussing his company's purchase of Appfluent, a provider of data usage analytics for big data environments, including data warehousing and Hadoop. "We also expect it to accelerate our revenue growth and to be accretive to earnings." The total purchase price is approximately $18 million, payable in cash and stock, with additional earn-out consideration based on performance milestones.

Posted March 05, 2015

Whether an organization is currently considering Hadoop or already using it in production, Hadoop Day on May 12 will provide the opportunity to connect with experts and advance your knowledge base. The educational event will include a wide range of presentations focused on topics such as the current state of Hadoop and how to get started, best practices for building a data warehouse on Hadoop that co-exists with other frameworks and non-Hadoop platforms, leveraging Hadoop in the cloud, the key components of the Hadoop ecosystem, as well as a spirited panel discussion on what to consider before diving into the data lake.

Posted March 03, 2015

Informatica, a data management company, is collaborating with two major big data players - Capgemini and Pivotal - on a data lake solution. As part of the Business Data Lake ecosystem developed by Capgemini and Pivotal, Informatica will deliver certified technologies for data integration, data quality and master data management (MDM).

Posted March 03, 2015
