Hadoop

The Apache Hadoop framework for the processing of data on commodity hardware is at the center of the Big Data picture today. Key solutions and technologies include the Hadoop Distributed File System (HDFS), YARN, MapReduce, Pig, Hive, and security, as well as a growing spectrum of solutions that support Business Intelligence (BI) and Analytics.



Hadoop Articles

Pure Storage, provider of an all-flash storage platform, has introduced a data hub to modernize storage architecture for unstructured, data-intensive workloads. Organizational data silos are a universal problem across every industry, but businesses need to realize value from all data, which is impossible without insight into the full picture, explained Matt Burr, GM of FlashBlade, Pure Storage. With the data hub, he noted, Pure Storage has created a central storage system that addresses current and future application requirements with a modern platform.

Posted September 12, 2018

Alation Inc., the data catalog company, and First San Francisco Partners, a business advisory and information management consultancy, are entering a strategic partnership to meet the needs of Chief Data Officers (CDOs). The partnership will focus on delivering new, field-tested methodologies for agile and modern data governance, made possible by data catalog technology.

Posted September 06, 2018

There's a new generation of technologies reshaping data management as we know it. To explore the game-changing technologies and approaches having the most profound impact on today's enterprises, DBTA asked industry experts and leaders to cite what they see as delivering the greatest benefit. The following are eight areas effecting the most change.

Posted September 05, 2018

Everyone wants to be part of a data-driven enterprise, and for good reason. Data analytics, when applied in a meaningful way, provides an enormous competitive advantage. There's a catch to this though that frequently gets overlooked amidst the glowing analyst projections and keynote speeches about a limitless future in which systems and machines do all the heavy lifting and thinking for businesses. Data—the right kind, in the right sequence, in the right context—doesn't just magically drop out of the cloud. It needs to be discovered, identified, transformed, and brought together for analysis, management, and eventual storage.

Posted September 04, 2018

Informatica, a cloud data management provider, is launching new product innovations that will enhance customer engagement with trusted, governed, and secure data management. The updates transform Informatica Master Data Management (MDM), Informatica Intelligent Cloud Services (IICS), and Data Privacy and Protection, enabling enterprises to transform the customer experience with intelligent hybrid data management.

Posted August 31, 2018

Morpheus Data is making major updates to its next-generation cloud management platform to help IT operations teams orchestrate software-defined networks, multi-cloud high availability, and more. By bringing together over 75 third-party tools under a single out-of-the-box automation engine, Morpheus lets IT quickly provide self-service provisioning to developers and accelerate application deployments by 150x or more.

Posted August 27, 2018

Alation Inc., the data catalog company, is launching the Alation Partner Program, which will be dedicated to the successful enterprise-wide deployment of data catalogs. One focus of the partner program is helping customers achieve success with enterprise-wide metadata management.

Posted August 09, 2018

We are living in the age of polyglot persistence, which really just means that it makes sense to store data using the technology that best matches the way the data will be used by applications. The age of trying to force everything into a relational DBMS is over, and we now have NoSQL, NewSQL, in-memory, and Hadoop-based offerings that are being used to store data. But you really should also be looking at the algorithmic approach offered by Ancelus Database.
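
As a rough illustration of the polyglot persistence idea, the hypothetical sketch below routes session lookups to a key-value style store and order history to a relational engine. The class names, tables, and data are invented for the example; Python's built-in sqlite3 module and an in-memory dict simply stand in for the dedicated stores a real application would choose.

```python
# A minimal, hypothetical sketch of polyglot persistence: route each kind of
# data to the store that best matches how applications will use it.
# sqlite3 stands in for a relational DBMS; a dict stands in for a key-value
# (NoSQL-style) store. Class and table names are illustrative only.
import sqlite3

class SessionStore:
    """Short-lived, lookup-by-key data fits a key-value store."""
    def __init__(self):
        self._kv = {}
    def put(self, session_id, payload):
        self._kv[session_id] = payload
    def get(self, session_id):
        return self._kv.get(session_id)

class OrderStore:
    """Relational queries (joins, aggregates) fit a SQL engine."""
    def __init__(self):
        self._db = sqlite3.connect(":memory:")
        self._db.execute(
            "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
    def add(self, customer, total):
        self._db.execute(
            "INSERT INTO orders (customer, total) VALUES (?, ?)", (customer, total))
    def total_for(self, customer):
        row = self._db.execute(
            "SELECT SUM(total) FROM orders WHERE customer = ?", (customer,)).fetchone()
        return row[0] or 0.0

if __name__ == "__main__":
    sessions, orders = SessionStore(), OrderStore()
    sessions.put("abc123", {"user": "pat", "cart": ["widget"]})
    orders.add("pat", 19.99)
    orders.add("pat", 5.00)
    print(sessions.get("abc123"), orders.total_for("pat"))
```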

Posted August 08, 2018

This year is an expansive one for the database ecosystems that have evolved around the major platforms. Artificial intelligence (AI), machine learning, the Internet of Things (IoT), and cloud computing are now mainstream offerings seen within the constellations of database vendors, partners, and integrators.

Posted August 08, 2018

This may seem contradictory at first glance: Fresh data from the database user community finds that data lakes continue to increase within the enterprise space as big data flows get even bigger. Yet, at the same time, enterprises appear to have pulled back on Hadoop implementations.

Posted August 08, 2018

After more than 10 years, there is no technology more closely aligned with the advent of big data than Hadoop. The Apache Hadoop framework allows for the distributed processing of large datasets across compute clusters, enabling scale-up from single commodity servers to thousands of machines, each providing local computation and storage. Designed to detect and handle failures at the application layer, the framework supports high availability.
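
To make the MapReduce model concrete, here is a minimal word-count sketch in Python. The map and reduce functions mirror what a Hadoop Streaming job would execute per record, while a tiny local driver stands in for the cluster's shuffle-and-sort step so the script runs on its own; cluster submission details (the hadoop-streaming jar path, HDFS input and output locations) vary by installation and are not shown.

```python
# The classic word count expressed in the MapReduce style Hadoop implements.
# A small local driver simulates the shuffle the framework performs between
# the map and reduce phases on a cluster.
from collections import defaultdict

def map_phase(line):
    # Emit (word, 1) for every word in an input line.
    for word in line.strip().split():
        yield word.lower(), 1

def reduce_phase(word, counts):
    # Sum the counts gathered for a single word.
    return word, sum(counts)

def local_word_count(lines):
    grouped = defaultdict(list)          # stand-in for the cluster shuffle/sort
    for line in lines:
        for word, one in map_phase(line):
            grouped[word].append(one)
    return dict(reduce_phase(w, c) for w, c in grouped.items())

if __name__ == "__main__":
    sample = ["big data needs big clusters", "hadoop processes big data"]
    print(local_word_count(sample))      # e.g. {'big': 3, 'data': 2, ...}
```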

Posted August 08, 2018

Database downtime can inflict a fatal wound on the life of a business, so having a trusted backup solution that ensures databases can be back up and running quickly in the event of an outage is critical. With long downtimes simply unacceptable, organizations seek solutions that can manage and monitor backups seamlessly, ensure data integrity, scale efficiently, restore quickly to any point in time, and provide security features to stay in compliance with local geographic and industry mandates.

Posted August 08, 2018

Companies are increasingly looking for the right database for the data storage need at hand. That might mean NoSQL, NewSQL, in-memory, or cloud database (also known as database as a service) approaches.

Posted August 08, 2018

Today's data visualization tools go beyond the standard charts and graphs used in Excel spreadsheets, displaying data in more sophisticated ways such as infographics, dials and gauges, geographic maps, sparklines, heat maps, and detailed bar, pie and fever charts. The images may include interactive capabilities, enabling users to manipulate them or drill into the data for querying and analysis.
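
For a sense of how such visualizations are produced programmatically, here is a small, vendor-neutral sketch using matplotlib and NumPy that draws a heat map alongside a sparkline-style trend; the random data is only a placeholder.

```python
# An illustrative sketch (not tied to any vendor tool) of two chart types
# mentioned above: a heat map and a compact sparkline. Data is synthetic.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
grid = rng.random((8, 12))          # placeholder matrix for the heat map
trend = rng.random(30).cumsum()     # placeholder series for the sparkline

fig, (ax_heat, ax_spark) = plt.subplots(1, 2, figsize=(8, 3))

# Heat map: color encodes magnitude across two dimensions.
im = ax_heat.imshow(grid, cmap="viridis", aspect="auto")
fig.colorbar(im, ax=ax_heat)
ax_heat.set_title("Heat map")

# Sparkline: a minimal, axis-free trend line.
ax_spark.plot(trend, linewidth=1.5)
ax_spark.axis("off")
ax_spark.set_title("Sparkline")

plt.tight_layout()
plt.show()
```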

Posted August 08, 2018

Sumo Logic, provider of a cloud-native machine data analytics platform, is unveiling 11 new Google Cloud Platform (GCP) applications as well as integration with Google Cloud's open source machine learning library, TensorFlow. The applications will enable real-time collection and analytics of machine data emitted by all major GCP services.

Posted July 23, 2018

Splice Machine is releasing a plugin for Apache Ranger to provide centralized security administration for its customers on Hortonworks Data Platform (HDP). The Apache Ranger plugin will provide Splice Machine customers with a framework for central administration of security policies, and the monitoring of user access and security-related administrative actions within all of the components of Splice Machine. It also provides enhanced support for various authorization methods, such as role-based access control, attribute-based access control and more.
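
For readers unfamiliar with the model, the toy sketch below illustrates role-based access control in the abstract: users map to roles, and roles map to permitted (resource, action) pairs. It is a conceptual illustration only, not the Ranger plugin's API; the roles, tables, and users are invented.

```python
# A conceptual, hypothetical illustration of role-based access control (RBAC),
# the model that centralized policy tools such as Apache Ranger administer.
# This is not the Ranger API; all names below are made up.
ROLE_PERMISSIONS = {
    "analyst": {("sales", "SELECT")},
    "etl":     {("sales", "SELECT"), ("sales", "INSERT"), ("staging", "INSERT")},
    "admin":   {("sales", "SELECT"), ("sales", "INSERT"), ("sales", "DROP")},
}

USER_ROLES = {"dana": ["analyst"], "sam": ["etl", "analyst"]}

def is_allowed(user, table, action):
    """Return True if any of the user's roles grants the (table, action) pair."""
    return any((table, action) in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, []))

print(is_allowed("dana", "sales", "SELECT"))    # True
print(is_allowed("dana", "staging", "INSERT"))  # False
```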

Posted July 17, 2018

With so many layers that may be vulnerable to attack, securing an ERP system involves much more than just one piece of cybersecurity software.

Posted July 16, 2018

Big data has changed in how it is collected, stored, processed, and analyzed, a shift that spans Hadoop MapReduce, cloud (and increasingly multi-cloud) deployments, Spark, streaming analytics, the use of AI, the growing importance of the edge, and the expanding use of containerization, said Anoop Dawar, senior vice president of product management and marketing at MapR. MapR's goal, he said, is to provide a platform that survives and thrives amidst the transitions in technology and deployment going on now and in the future.

Posted June 28, 2018

Hewlett Packard Enterprise is planning to invest $4 billion in Intelligent Edge technologies and services over the next four years. Specifically, HPE will invest in research and development to advance and innovate new products, services and consumption models across a number of technology domains such as security, AI and machine learning, automation, and edge computing.

Posted June 20, 2018

Our data capture and retention requirements continue to grow at a very fast rate, which brings new entrants in the SQL and NoSQL market all the time. However, not all data is created equal. Companies recognize that disparate data can and should be treated differently. That means the way we persist that data can be extremely varied. Now, enter applications that need to access all that data across a very heterogeneous landscape, and we get to the point where we're reinventing the data access wheel every time someone needs to spin up another application or introduce another data source.

Posted June 01, 2018

Looker, a data platform provider, is releasing new tools and integrations to optimize data science workflows. Looker is improving its governed data workflow with an SDK for R and connections for Python, as well as streamed and merged results, Google TensorFlow integrations, and clean, visual recommendations for users.

Posted May 30, 2018

Chris Reuter, North America data warehousing sales leader at IBM, presented a keynote at Data Summit 2018 on the key trends in IT today and what companies must do to advance their organizations.

Posted May 24, 2018

Percona is extending its open source database support expertise to PostgreSQL, enabling organizations to work with a single vendor to meet their support needs for MySQL, MongoDB, MariaDB, PostgreSQL, or any hybrid combination of these database technologies. By adding PostgreSQL to its portfolio of services, Percona has made it faster and easier for organizations to get the PostgreSQL support they need.

Posted May 14, 2018

Hortonworks has announced that Trimble Transportation Enterprise Solutions is leveraging its global data management solutions with machine learning models to alleviate pain points across the transportation and logistics industry. In conjunction with Trimble's new blockchain network, Hortonworks DataFlow (HDF) and Hortonworks Data Platform (HDP) are helping Trimble to increase its customers' efficiency by modernizing transportation industry systems.

Posted April 30, 2018

An astounding array of new technologies and approaches have emerged on the database scene over the past few years that promise to turn the next 12 months into a time of unprecedented transformation for the database landscape. There are new developments, along with reinforcement of tried-and-true technologies, some of which may help make the jobs of data managers just a bit easier.

Posted April 25, 2018

MicroStrategy, a provider of enterprise analytics and mobility software, is releasing an enhanced version of its flagship software, boosting mapping capabilities and more. Version 10.11 introduces new out-of-the-box visualizations, intelligent recommendations for content, prompts for dossiers, a native MicroStrategy Library app for smartphones, and more.

Posted April 10, 2018

Syncsort, a provider of "Big Iron to Big Data" software, is releasing new innovations in its Ironstream data integration software. The updates include the ability to deliver mainframe log and application data in real time directly to Elastic Logstash.
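
As a rough, generic sketch of the destination side of such a pipeline, the snippet below ships JSON records over TCP to a Logstash input assumed to use a json_lines codec. It is not Ironstream itself; the host, port, and record fields are illustrative assumptions.

```python
# A generic sketch of streaming JSON log records to a Logstash TCP input.
# Not Syncsort Ironstream; host, port, and record fields are assumptions.
import json
import socket
import time

LOGSTASH_HOST = "logstash.example.com"  # assumption: a reachable Logstash tcp input
LOGSTASH_PORT = 5044                    # assumption: port configured for that input

def ship(records):
    with socket.create_connection((LOGSTASH_HOST, LOGSTASH_PORT), timeout=10) as sock:
        for record in records:
            # One JSON document per line, matching a json_lines codec.
            sock.sendall((json.dumps(record) + "\n").encode("utf-8"))

if __name__ == "__main__":
    sample = [{"source": "mainframe-syslog", "ts": time.time(),
               "msg": "job ABC123 ended rc=0"}]
    ship(sample)
```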

Posted March 21, 2018

BMC, a provider of IT solutions for the digital enterprise, is expanding its Control-M Managed File Transfer offering to include support for all file transfers from a single automation platform. With BMC's Control-M solution, companies have instant visibility into the status of file transfers and business application workloads.

Posted March 09, 2018

With data lakes being so new to organizations, an early failure can significantly set back the opportunity to fundamentally transform analytics. However, while the potential for big data innovation is significant, organizations are mired with slow, manual, and time-consuming processes for managing the tasks that turn raw big data into relevant business assets and insights. Without addressing these challenges in a systematic way, organizations find that data lake projects turn into labor-intensive, complex endeavors.

Posted March 08, 2018

MapR Technologies has extended advanced container integration into the MapR Converged Data Platform. The company is enabling the deployment of stateful applications with its Data Fabric for Kubernetes, providing persistent storage and full Kubernetes support with volume access. Persistent storage for stateful containers is a well-known problem in the container world, said Jack Norris, SVP of Data & Applications, MapR, noting that MapR is providing an "elegantly simple way" to solve that problem that is scalable, fast, and secure.
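
For context, persistent storage for a stateful container is typically requested through a PersistentVolumeClaim. The generic sketch below uses the official Kubernetes Python client to create one; it is not MapR-specific, and the namespace, claim name, storage class, and size are illustrative assumptions.

```python
# A generic sketch of requesting persistent storage for a stateful container
# with the official Kubernetes Python client (pip install kubernetes).
# Not MapR-specific; namespace, claim name, storage class, and size are
# illustrative assumptions.
from kubernetes import client, config

def create_claim(namespace="default", name="app-data",
                 storage_class="standard", size="5Gi"):
    config.load_kube_config()  # uses the local kubeconfig
    pvc = client.V1PersistentVolumeClaim(
        metadata=client.V1ObjectMeta(name=name),
        spec=client.V1PersistentVolumeClaimSpec(
            access_modes=["ReadWriteOnce"],
            storage_class_name=storage_class,
            resources=client.V1ResourceRequirements(requests={"storage": size}),
        ),
    )
    return client.CoreV1Api().create_namespaced_persistent_volume_claim(
        namespace=namespace, body=pvc)

if __name__ == "__main__":
    created = create_claim()
    print(f"Created claim {created.metadata.name}")
```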

Posted March 06, 2018

TimeXtender, a provider of self-service BI and analytics, is now a certified technology partner with Tableau Software, enabling customers to make business decisions on the fly. TimeXtender was selected as a new partner because its Discovery Hub works seamlessly with Tableau, giving customers timely access to the data that is important to them.

Posted January 25, 2018

Quest Software, a global systems management and security software provider, is making three new updates to the Toad product family, including Toad Edge v1.2, Toad Data Point v4.3 and Toad Intelligence Central v4.3. The new release of Toad Edge simplifies the development and management of next-generation open source database platforms, with added support for MariaDB and MySQL instances running on Microsoft Azure.

Posted January 25, 2018

In a world where new technologies are often presented to the industry as rainbows and unicorns, there is always someone in a cubicle trying to figure out how to solve business problems and just make these great new technologies work together. The truth is that all of these technologies take time to learn, and it also takes time to identify the problems that can be solved by each of them.

Posted January 05, 2018

A new year brings new ideas on how to disrupt the big data industry. From tried-and-true methods to the introduction of new solutions, several experts are predicting a surge in both old and new approaches, along with the rise of new roles that will power enterprises through 2018.

Posted January 04, 2018

Data lakes are often viewed as the ultimate silo breakers, integrating mountains of data from point solutions for ERP, CRM, enterprise data warehouse, cloud, and on-premises applications. However, if the enterprise data lake is not leveraged appropriately, it often ends up being just a data dump or, worse still, a "data swamp."

Posted January 02, 2018

RedPoint Global, a provider of data management and customer engagement technology, is updating its RedPoint Data Management solution within the RedPoint Customer Data Platform. With RedPoint Data Management 8.0, organizations can harness massive amounts of data from an ever-growing number of touchpoints to create a truly unified customer profile - all in an expanded open garden environment.

Posted December 11, 2017

IBM is launching its next-generation Power Systems Servers incorporating its newly designed POWER9 processor, advancing performance improvements across popular AI frameworks. Built specifically for compute-intensive AI workloads, the new POWER9 systems are capable of significantly improving the training times of deep learning frameworks, allowing enterprises to build more accurate AI applications.

Posted December 05, 2017

There never has been a more interesting time to be involved in the data management field. Data not only has become "the new oil" but is also the catalyst that is powering organizations to new heights of success. The past year has seen the rise of powerful analytics and an embrace of new tools and platforms emerging to more effectively tap into the power that data offers. DBTA reached out to industry experts to document the most important trends shaping data management in 2018.

Posted December 01, 2017

Over the past few years, the scale, speed, and power of analytics have been dramatically transformed. The amount of data available from the internet, combined with advances in software to make use of it, has created a practice called "big data analytics." It can provide types of information that were not available in the recent past and it has the potential to do so in real-time.

Posted December 01, 2017

MapR Technologies, a pioneer in delivering one platform for all data, across every cloud, has announced the availability of the MapR Converged Data Platform 6.0, with new advancements to help organizations achieve greater value from their data through DataOps teams. The major system update from MapR includes innovations that automate platform health and security, and a database for next-generation applications.

Posted November 21, 2017

No longer the stuff of science fiction, the business uses for cognitive computing, artificial intelligence, and machine learning today include fields as diverse as medicine, marketing, defense, energy, and agriculture. Enabling these applications is the vast amount of data that companies are collecting from machine sensors, instruments, and websites and the ability to support smarter solutions with faster data processing.

Posted November 13, 2017

Hortonworks, which recently announced DataPlane Service (DPS), a product designed to address the new paradigm of data management, has announced the first generally available extensible service that DPS will support—Data Lifecycle Manager (DLM). DLM 1.0 is a hybrid cloud-focused solution that offers disaster recovery and replication with auto-tiering and backup and restore.

Posted October 31, 2017

Syncsort and CA Technologies have formed a partnership. The new integration between Syncsort DMX-h and CA Datacom and CA IDMS bridges the "big iron to big data gap," enabling enterprises to tap into valuable mainframe data and make it accessible to emerging next-generation platforms.

Posted October 26, 2017

MapR Technologies has introduced the MapR Data Science Refinery, a new solution that allows data scientists to access and analyze all data in place, collaborate, and build and deploy machine learning models on the MapR Converged Data Platform.

Posted October 25, 2017
