Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources in a data warehouse for reporting and analytics include ETL (extract, transform, load), EAI (enterprise application integration), CDC (change data capture), data replication, data deduplication, compression, big data technologies such as Hadoop and MapReduce, and data warehouse appliances.
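The consolidation pattern these technologies share can be sketched in a few lines. This is a minimal, illustrative ETL pipeline; the source names (`crm`, `billing`) and field layout are hypothetical, not drawn from any specific product above.

```python
# Minimal ETL sketch: extract rows from multiple sources, transform
# (standardize and deduplicate), then load into a consolidated store.

def extract(sources):
    """Pull raw rows from each source (here, in-memory lists)."""
    for source in sources:
        yield from source

def transform(rows):
    """Standardize names and drop duplicate ids before loading."""
    seen = set()
    for row in rows:
        key = row["id"]
        if key in seen:  # deduplication across sources
            continue
        seen.add(key)
        yield {"id": key, "name": row["name"].strip().title()}

def load(rows, warehouse):
    """Append cleaned rows to the consolidated warehouse table."""
    warehouse.extend(rows)

# Two hypothetical source systems holding overlapping customer data
crm = [{"id": 1, "name": " alice smith "}]
billing = [{"id": 1, "name": "Alice Smith"}, {"id": 2, "name": "BOB JONES"}]

warehouse = []
load(transform(extract([crm, billing])), warehouse)
# warehouse now holds one row per unique id, with standardized names
```

Real ETL tools add scheduling, error handling, and incremental capture (CDC) on top of this same extract-transform-load shape.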



Data Warehousing Articles

Deep Information Sciences is unveiling a new solution that combines a MySQL-compliant database with cloud- and resource-awareness.

Posted February 04, 2016

SolarWinds, a provider of hybrid IT infrastructure management software, is adding improved support for Oracle Database 12c Enterprise Edition in the latest release of SolarWinds Database Performance Analyzer. With the additional support for Oracle Database 12c, the SolarWinds tool now pinpoints efficiency issues and optimizes performance of Oracle pluggable databases in a multitenant environment through tuning, metric visibility, and resource correlation to help ensure the availability and speed of business-critical applications.

Posted February 04, 2016

Oracle has introduced a new Big Data Preparation Cloud Service. Despite the increasing talk about the need for companies to become "data-driven," and the perception that people who work with business data spend most of their time on analytics, Oracle contends that in reality many organizations devote much more time and effort on importing, profiling, cleansing, repairing, standardizing, and enriching their data.
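The preparation steps listed above (profiling, cleansing, standardizing) are straightforward to illustrate. The sketch below is a generic, hypothetical example of that workflow, not Oracle's service or API; the field names and alias table are invented for illustration.

```python
# Hypothetical data-preparation pass: profile for missing values,
# then cleanse and standardize a small set of records.

records = [
    {"country": "usa", "revenue": "1,200"},
    {"country": "U.S.A.", "revenue": None},
    {"country": "Germany", "revenue": "950"},
]

# Profiling: count missing values in the revenue field
missing = sum(1 for r in records if r["revenue"] is None)

# Cleansing + standardizing: normalize country names to codes,
# repair revenue strings, and fill gaps with a default of 0
ALIASES = {"usa": "US", "u.s.a.": "US", "germany": "DE"}
cleaned = []
for r in records:
    cleaned.append({
        "country": ALIASES.get(r["country"].lower(), r["country"]),
        "revenue": int(r["revenue"].replace(",", "")) if r["revenue"] else 0,
    })
```

Even this toy version shows why preparation dominates analysts' time: every field needs its own normalization and repair rules before analytics can begin.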

Posted February 04, 2016

The development of a functional and practical quantum computing system has been "pending" for some decades now, but there are real signs that the technology may soon become a reality. Its implications for cryptography are encouraging major government investment; the U.S. and China, in particular, are investing heavily in quantum computing technology. The arms race to develop functional quantum computing has begun.

Posted February 03, 2016

With Hadoop marking its 10th anniversary this year, Sean Suchter, CEO of Pepperdata, recently reflected on his experience with the platform and speculated on what the next 10 years may bring.

Posted February 03, 2016

Hewlett Packard Enterprise (HPE) has announced the availability of HPE Investigative Analytics, a new software solution to enable financial institutions and other organizations in highly regulated industries to use big data technologies to detect patterns, relationships, behaviors, and anomalies across structured and unstructured data stores. The software is aimed at helping companies reduce risk by proactively preventing fraudulent actions.

Posted February 02, 2016

Attunity Ltd. is releasing an enhanced version of its Attunity Compose platform to eliminate pitfalls in data warehousing and accelerate big data analytics.

Posted February 02, 2016

The business world continues to shortchange a critical step between storing and analyzing the explosion of new data expected over the coming years. A business can quickly move from the old world of siloed, unusable data to a new one where stakeholders around the globe can find information in a few minutes from their local access points. The "findability" of data is particularly important for Global 1000 companies pursuing Industrial Internet-related innovation.

Posted February 02, 2016

Snowflake Computing, a cloud data warehousing company, has formed a technology and go-to-market partnership with Looker, which provides a data exploration and business intelligence platform.

Posted February 01, 2016

The Winter '15 release allows administrators to deploy Paxata in heterogeneous environments including the Hortonworks Data Platform on YARN and with multiple versions of Apache Spark. The latest release also improves the way business analysts find, access, and apply data by delivering additional ease of use capabilities supported by machine learning innovations, and provides enterprise-grade security and a multi-tenant governance model.

Posted January 27, 2016

Data Summit, a comprehensive educational experience designed to guide attendees through the key issues in data management and analysis, is coming to the New York Hilton Midtown. Data Summit also features two co-located events: Hadoop Day and Virtualization Day. In addition, the IOUG will participate in Data Summit again this year presenting a track focused on big data in the cloud and the evolution of the data warehouse.

Posted January 26, 2016

Embarcadero Technologies, provider of database lifecycle management solutions, is enhancing its ER/Studio platform, focusing on better collaboration and communication within enterprises.

Posted January 25, 2016

For decades, the enterprise data warehouse (EDW) has been the aspirational analytic system for just about every organization. It has taken many forms throughout the enterprise, but all share the same core concepts: integration and consolidation of data from disparate sources, governance of that data to provide reliability and trust, and enablement of reporting and analytics. The last few years, however, have been very disruptive to the data management landscape. The "big data" era has introduced new technologies and techniques that provide alternatives to the traditional EDW approach and, in many cases, exceed its capabilities. Many now claim we are in a post-EDW era and that the concept itself is legacy.

Posted January 19, 2016

Rackspace has added three key enhancements spanning data storage, security, and DR to the ObjectRocket offering for MongoDB to address requirements in mission-critical enterprise scenarios. The new capabilities were announced in a blog post by Kyle Hunter, product marketing manager for ObjectRocket solutions at Rackspace. The database market is growing strongly and the biggest growth area is open source databases, according to Chris Lalonde, CEO and co-founder of ObjectRocket.

Posted January 19, 2016

The year 2015 started out with people recognizing that the Hadoop ecosystem is here to stay, and ended as the year in which organizations achieved real success within the Hadoop ecosystem. Today, more projects are popping up within the Hadoop ecosystem that can run both with and without Hadoop. The great thing about this trend is that it lowers the barrier to entry for people to get started with these technologies. More importantly, all of these new technologies work best at large scale within the rest of the Hadoop ecosystem, while Hadoop MapReduce has begun its ride off into the sunset.

Posted January 19, 2016

Join IT practitioners and business stakeholders alike for the third annual Data Summit conference at the New York Hilton, May 9-11, 2016. Discounted pricing is available for a limited time.

Posted January 08, 2016

Data modelers must look at the big picture of an organization's data ecosystem to ensure additions and changes fit in properly. Simultaneously, each data modeler must be focused on the minute details, adhering to naming standards, domain rules, data type practices, still remaining ever vigilant for instilling consistency across everything they do. And while focused on all of the above, their efforts must culminate in a practical model that serves the individual project's requirements while also being implementable, maintainable, and extensible.

Posted January 07, 2016

If you are a working DBA, the actual work you do these days is probably significantly different than it was when you first began work as a DBA. So is the term DBA really accurate any longer? Or has the job grown into something more?

Posted January 07, 2016

Today, the success of many startups hinges upon the ability to gain insights from rapidly growing data. Yet startups and smaller businesses often don't have the resources to hire a full-scale data science team, especially considering the painful data scientist shortage that's making it difficult for even large enterprises to find qualified candidates. Here are three approaches companies can adopt to deal with their big and complex data analytics challenges in 2016.

Posted January 07, 2016

The continual evolution of technology has allowed for more data sources than previously thought possible. The growth of SaaS tools provides many benefits, but there is a downside as well: bringing these cloud data sources into a coherent system for reporting is a perpetual challenge for IT and business intelligence teams. A recent DBTA roundtable webcast covered the issues of combining different SaaS applications into a coherent cloud-based enterprise data store and leveraging the Simple Data Pipe. Presenters included Sarah Maston, solution architect with IBM Cloud Data Services, and Erin Franz, alliances data analyst with Looker.

Posted January 04, 2016

We can expect every year of the next 5 years to be "The Year of IoT." IoT promises to unlock value and rapidly transform how organizations manage, operationalize, and monetize their assets. With IoT, physical assets become liquid, easily indexed and tracked, enabling identification of idle capacity or overutilization.
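Spotting idle capacity or overutilization from tracked assets reduces to comparing utilization readings against thresholds. The sketch below is a hypothetical illustration; the asset names, readings, and threshold values are all invented.

```python
# Sketch: flag idle and overutilized assets from hypothetical IoT
# utilization readings (fraction of capacity in use per asset).

readings = {"pump-a": 0.05, "pump-b": 0.55, "press-1": 0.97}

IDLE_BELOW, OVER_ABOVE = 0.10, 0.90  # illustrative thresholds

idle = [asset for asset, u in readings.items() if u < IDLE_BELOW]
over = [asset for asset, u in readings.items() if u > OVER_ABOVE]
# idle assets are candidates for redeployment; overutilized ones
# signal a need for added capacity or load balancing
```

Production systems would stream readings over time and alert on sustained, not instantaneous, threshold breaches, but the core classification is this simple.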

Posted December 22, 2015

In 2015, big data, mobility, IoT, expanding requirements for security and real-time analytics, and the introduction of the Cognitive Era continued to place greater pressure on IT organizations. Linux and open source technologies are at the center of many of the most innovative software and hardware solutions addressing emerging enterprise requirements. Here's a look back at some of the most significant announcements in Linux and open source technology of 2015.

Posted December 16, 2015

Looking ahead to 2016 David Jonker, senior director of big data at SAP, offered six key business and technology trends that will take the market by storm, including a continued interest in in-memory processing, enablement of real-time data and analytics, and an increased focus on advanced analytics and machine learning.

Posted December 16, 2015

What's ahead for 2016 in terms of cloud, IoT, big data, analytics, and open source technologies? IT executives gaze into their crystal balls, and weigh in on the upcoming challenges and opportunities ahead in the next year - and beyond.

Posted December 16, 2015

The modern business landscape is a fast-moving, ever-changing, highly competitive environment. To outpace the competition and build on innovation, companies must embrace a modern data architecture, one that supports today's new requirements such as mobile integration and advanced digital marketing.

Posted December 02, 2015

It's commonly asserted—and generally accepted—that the era of the "one-size-fits-all" database is over. We expect that enterprises will use a combination of database technologies to meet the distinct needs created by various application architectures.

Posted December 02, 2015

As 2015 draws to a close it's time to look back on the year's big changes in data management and reflect on some of the most insightful observations by leading data experts. Here, Big Data Quarterly presents a round-up of thought-provoking articles that explore some of the key advances of 2015, how organizations are harnessing what big data has to offer, and the challenges they face as they seek to compete on analytics.

Posted December 02, 2015

Hadoop distribution provider Cloudera has introduced Cloudera Enterprise 5.5, including Cloudera Navigator Optimizer, a new product targeted at helping organizations improve big data workload performance and efficiency. Cloudera Navigator Optimizer, now in beta, is expected to be generally available in 2016. The new release of Cloudera Enterprise has three main areas of focus, according to Anupam Singh, head of data management at Cloudera.

Posted November 19, 2015

SAP is the first company to announce IoT solutions based on Intel's new IoT Platform. SAP plans to develop its end-to-end enterprise IoT solutions utilizing the Intel platform along with its SAP HANA Cloud Platform.

Posted November 18, 2015

The concept of the data lake has become a hot topic. The data lake retains data in its original format to allow the data to be more flexible for everyone involved. While this sounds fine in theory, it is more complicated in practice due to the need for governance and security.

Posted November 09, 2015

There are many different ways to look at database administration. It can be done by task, by discipline, by DBMS, by server, and so on. But one useful way to look at database administration is in terms of the type of support being delivered to applications. You can paint a broad brush stroke across the duties of the DBA and divide them into two categories: those that support development work and those that support the production systems.

Posted November 09, 2015

To better manage the data explosion now and with scalable options for the future, existing data architecture is evolving beyond traditional databases, data stores, data warehouses, and the like into a more unfiltered repository known as the data lake.

Posted November 05, 2015

Organizations are getting squeezed, said Mark Hurd, Oracle CEO, in his Monday morning keynote at Oracle OpenWorld 2015. They have old infrastructure; there is the need for innovation but also great pressure to do things such as increase security and adhere to governance mandates which are not innovative; and also pressure to keep costs flat, without increasing IT costs. "This is why the cloud is such a big deal," said Hurd, sharing his list of top 5 predictions for 2025.

Posted November 04, 2015

Informatica has introduced a new data management platform designed to handle data at any speed across today's hybrid IT environments, both cloud and on-premise. The new release, Informatica 10, provides enhancements across three core components - Informatica PowerCenter 10, Informatica Data Quality 10 and Informatica Data Integration Hub 10, and offers specific certified optimizations for Oracle Exadata, Oracle SuperCluster, SAP HANA, and HP Vertica.

Posted October 28, 2015

Attunity Ltd., a provider of data management software solutions, has introduced the latest version of its data replication and loading solution. Designed to accelerate enterprise big data analytics initiatives, Attunity Replicate 5.0 automates big data movement to, from and between databases, data warehouses, Hadoop and the cloud, reducing the time and labor, and ultimately the cost of making big data analytics available in real time.

Posted October 26, 2015

Ever since Linux became a viable server operating system, organizations have been looking to all kinds of open source software (OSS) to save on license and maintenance costs and to enjoy the benefits of an open platform that invites innovation. If you're considering MySQL or another open source DBMS as either your primary database or to, perhaps, operate alongside your existing commercial systems, such as Oracle or Microsoft SQL Server, for one reason or another, here are seven things to keep in mind.

Posted October 21, 2015

The Agile methodology is great for getting turgid development teams to start working faster and more coherently. With Agile, which focuses on more rapid, incremental deliverables and cross-departmental collaboration, the bureaucratic plaque is flushed from the information technology groups' arteries. But there is a dark side to Agile approaches.

Posted October 21, 2015

In the 1989 movie, "Back to the Future Part II," actor Michael J. Fox's character traveled in time to October 21, 2015. And today, October 21, 2015, in his keynote at Dell World 2015 in Austin, Michael S. Dell, founder and CEO of Dell, expanded on what it means for organizations to be future-ready, and what Dell is doing to help. Referring to the recently announced plans for Dell to acquire EMC in a deal valued at $67 billion, Dell said, "I started this company 32 years ago, just a few blocks from here in my dorm room, building PCs. And as I speak to you here today with this agreement in place Dell is set to become an enterprise solutions powerhouse."

Posted October 21, 2015

MapR is including Apache Drill 1.2 in its Apache Hadoop distribution and is also now offering a new Data Exploration Quick Start Solution, leveraging Drill to help customers get started more rapidly with big data projects. Apache Drill is an open source, low-latency query engine for Hadoop that delivers secure, interactive SQL analytics at petabyte scale. With the two announcements, MapR says customers and partners will be able to more quickly leverage Drill to get fast business insights from all their data in Hadoop and other sources. MapR also released a comprehensive SQL-based test framework to the open source community.

Posted October 21, 2015

Splice Machine has released version 1.5 of its Hadoop RDBMS, which adds multiple enterprise-ready features. The new release adds functionality and performance improvements to enable companies to increase the benefits of using Splice Machine to support real-time applications, run operational data lakes, and accelerate their ETL pipelines.

Posted October 20, 2015

As many companies look beyond traditional data storage methods such as data warehouses, cloud storage has become a popular option. The cloud offers companies lower cost and more flexibility than traditional storage, and organizations are increasingly taking advantage of it. When considering cloud storage options, there are many questions a company must weigh. Sarah Maston, developer advocate with IBM Cloud Data Services, covered the move to the cloud in a recent DBTA webinar.

Posted October 13, 2015
