Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources in a Data Warehouse for Reporting and Analytics include ETL (Extract, Transform, Load), EAI (Enterprise Application Integration), CDC (Change Data Capture), Data Replication, Data Deduplication, Compression, Big Data technologies such as Hadoop and MapReduce, and Data Warehouse Appliances.
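The ETL pattern listed above can be sketched in a few lines. Here is a minimal, hypothetical example in Python that extracts rows from a CSV source, transforms them (normalizing currency codes and converting amounts to integer cents), and loads them into a SQLite staging table; the field names and schema are illustrative, not drawn from any specific product:

```python
import csv
import io
import sqlite3

# Extract: read raw order records from a CSV source (inlined here for brevity).
raw_csv = """order_id,amount,currency
1,19.99,usd
2,5.00,USD
3,42.50,usd
"""
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: normalize currency codes and convert amounts to integer cents.
transformed = [
    (int(r["order_id"]), int(round(float(r["amount"]) * 100)), r["currency"].upper())
    for r in rows
]

# Load: insert into a warehouse staging table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (order_id INTEGER, amount_cents INTEGER, currency TEXT)"
)
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", transformed)
conn.commit()

total = conn.execute("SELECT SUM(amount_cents) FROM orders").fetchone()[0]
print(total)  # total revenue in cents
```

Real ETL tools add scheduling, error handling, and incremental loads (often via CDC rather than full re-extraction), but the three-stage shape is the same.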



Data Warehousing Articles

Rackspace has added three key enhancements spanning data storage, security, and disaster recovery (DR) to the ObjectRocket offering for MongoDB to address requirements in mission-critical enterprise scenarios. The new capabilities were announced in a blog post by Kyle Hunter, product marketing manager for ObjectRocket solutions at Rackspace. The database market is growing strongly and the biggest growth area is open source databases, according to Chris Lalonde, CEO and co-founder of ObjectRocket.

Posted January 19, 2016

The year 2015 started out with people recognizing that the Hadoop ecosystem is here to stay, and ended as the year in which organizations achieved real success within the Hadoop ecosystem. Today, more projects are popping up within the Hadoop ecosystem that can run both with and without Hadoop. The great thing about this trend is that it lowers the barrier to entry for people to get started with these technologies. More importantly, all of these new technologies work best at large scale within the rest of the Hadoop ecosystem, while Hadoop MapReduce has begun its ride off into the sunset.

Posted January 19, 2016

Join IT practitioners and business stakeholders alike for the third annual Data Summit conference at the New York Hilton, May 9-11, 2016. Discounted pricing is available for a limited time.

Posted January 08, 2016

Data modelers must look at the big picture of an organization's data ecosystem to ensure additions and changes fit in properly. Simultaneously, each data modeler must focus on the minute details: adhering to naming standards, domain rules, and data type practices, while remaining ever vigilant about instilling consistency across everything they do. And while focused on all of the above, their efforts must culminate in a practical model that serves the individual project's requirements while also being implementable, maintainable, and extensible.

Posted January 07, 2016

The development of a functional and practical quantum computing system has been "pending" for some decades now, but there are real signs that this technology may soon become a reality. The implications for cryptography are encouraging major government investment - both the U.S. and China, in particular, are investing heavily in quantum computing technology. The arms race to develop functional quantum computing has begun.

Posted January 07, 2016

If you are a working DBA, the actual work you do these days is probably significantly different than it was when you first began work as a DBA. So is the term DBA really accurate any longer? Or has the job grown into something more?

Posted January 07, 2016

Today, the success of many startups hinges upon the ability to gain insights from rapidly growing data. Yet startups and smaller businesses often don't have the resources to hire a full-scale data science team, especially considering the painful data scientist shortage that's making it difficult for even large enterprises to find qualified candidates. Here are three approaches companies can adopt to deal with their big and complex data analytics challenges in 2016.

Posted January 07, 2016

The continual evolution in technology has allowed for more data sources than previously thought possible. The growth of SaaS tools provides many benefits, but there is a downside as well. Bringing these cloud data sources into a coherent system for reporting is perpetually a challenge for IT and business intelligence teams. A recent DBTA roundtable webcast covered the issues of combining different SaaS applications into a coherent cloud-based enterprise data store and leveraging the Simple Data Pipe. Presenters included Sarah Maston, solution architect with IBM Cloud Data Services, and Erin Franz, alliances data analyst with Looker.

Posted January 04, 2016

We can expect each of the next 5 years to be "The Year of IoT." IoT promises to unlock value and rapidly transform how organizations manage, operationalize, and monetize their assets. With IoT, physical assets become liquid, easily indexed and tracked, enabling identification of idle capacity or overutilization.

Posted December 22, 2015

In 2015, big data, mobility, IoT, expanding requirements for security and real-time analytics and the introduction of the Cognitive Era continued to place greater pressure on IT organizations. Linux and open source technologies are at the center of many of the most innovative software and hardware solutions that are addressing emerging enterprise requirements. Here's a look back at some of the most significant announcements in Linux and open source technology of 2015.

Posted December 16, 2015

Looking ahead to 2016 David Jonker, senior director of big data at SAP, offered six key business and technology trends that will take the market by storm, including a continued interest in in-memory processing, enablement of real-time data and analytics, and an increased focus on advanced analytics and machine learning.

Posted December 16, 2015

What's ahead for 2016 in terms of cloud, IoT, big data, analytics, and open source technologies? IT executives gaze into their crystal balls, and weigh in on the upcoming challenges and opportunities ahead in the next year - and beyond.

Posted December 16, 2015

The modern business landscape is a fast-moving, ever-changing, highly competitive environment. For companies to outpace the competition and build upon innovation, they must embrace a modern data architecture. It is necessary that this new architecture support today's new requirements such as mobile integration and advanced digital marketing.

Posted December 02, 2015

It's commonly asserted—and generally accepted—that the era of the "one-size-fits-all" database is over. We expect that enterprises will use a combination of database technologies to meet the distinct needs created by various application architectures.

Posted December 02, 2015

As 2015 draws to a close it's time to look back on the year's big changes in data management and reflect on some of the most insightful observations by leading data experts. Here, Big Data Quarterly presents a round-up of thought-provoking articles that explore some of the key advances of 2015, how organizations are harnessing what big data has to offer, and the challenges they face as they seek to compete on analytics.

Posted December 02, 2015

Hadoop distribution provider Cloudera has introduced Cloudera Enterprise 5.5, including Cloudera Navigator Optimizer, a new product targeted at helping organizations improve big data workload performance and efficiency. Cloudera Navigator Optimizer, now in beta, is expected to be generally available in 2016. The new release of Cloudera Enterprise has three main areas of focus, according to Anupam Singh, head of data management at Cloudera.

Posted November 19, 2015

SAP is the first company to announce IoT solutions based on Intel's new IoT Platform. SAP plans to develop its end-to-end IoT enterprise solutions utilizing the Intel platform along with its SAP HANA Cloud Platform.

Posted November 18, 2015

The concept of the data lake has become a hot topic. The data lake retains data in its original format, allowing the data to remain flexible for everyone involved. While this sounds fine in theory, it is more complicated in practice due to the need for governance and security.
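The "retain data in its original format" idea described above is often called schema-on-read: raw records land in the lake untouched, and each consumer applies only the structure it needs at query time. A minimal sketch in Python, with hypothetical event data standing in for a real lake:

```python
import json

# Ingest: raw events land in the lake exactly as produced, with no upfront schema.
lake = [
    '{"user": "alice", "action": "login", "ts": 1445000000}',
    '{"user": "bob", "action": "purchase", "amount": 30, "ts": 1445000060}',
    '{"user": "alice", "action": "purchase", "amount": 12}',  # note: ts missing
]

# Consume: schema-on-read, where each consumer imposes the structure it needs.
def purchases(raw_records):
    for line in raw_records:
        event = json.loads(line)
        if event.get("action") == "purchase":
            # Apply the schema at read time, tolerating absent fields.
            yield {"user": event["user"], "amount": event.get("amount", 0)}

total = sum(p["amount"] for p in purchases(lake))
print(total)  # 42
```

The flexibility is real, but so is the governance problem the article mentions: nothing in this flow enforces who may read the raw events or whether "amount" means the same thing in every record.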

Posted November 09, 2015

There are many different ways to look at database administration. It can be done by task, by discipline, by DBMS, by server, and so on. But one useful way to look at database administration is in terms of the type of support being delivered to applications. You can paint a broad brush stroke across the duties of the DBA and divide them into two categories: those that support development work and those that support the production systems.

Posted November 09, 2015

To better manage the data explosion now and with scalable options for the future, existing data architecture is evolving beyond traditional databases, data stores, data warehouses, and the like into a more unfiltered repository known as the data lake.

Posted November 05, 2015

Organizations are getting squeezed, said Mark Hurd, Oracle CEO, in his Monday morning keynote at Oracle OpenWorld 2015. They have old infrastructure; there is the need for innovation but also great pressure to do things such as increase security and adhere to governance mandates which are not innovative; and also pressure to keep costs flat, without increasing IT costs. "This is why the cloud is such a big deal," said Hurd, sharing his list of top 5 predictions for 2025.

Posted November 04, 2015

Informatica has introduced a new data management platform designed to handle data at any speed across today's hybrid IT environments, both cloud and on-premises. The new release, Informatica 10, provides enhancements across three core components - Informatica PowerCenter 10, Informatica Data Quality 10, and Informatica Data Integration Hub 10 - and offers specific certified optimizations for Oracle Exadata, Oracle SuperCluster, SAP HANA, and HP Vertica.

Posted October 28, 2015

Attunity Ltd., a provider of data management software solutions, has introduced the latest version of its data replication and loading solution. Designed to accelerate enterprise big data analytics initiatives, Attunity Replicate 5.0 automates big data movement to, from and between databases, data warehouses, Hadoop and the cloud, reducing the time and labor, and ultimately the cost of making big data analytics available in real time.

Posted October 26, 2015

Ever since Linux became a viable server operating system, organizations have been looking to all kinds of open source software (OSS) to save on license and maintenance costs and to enjoy the benefits of an open platform that invites innovation. If you're considering MySQL or another open source DBMS, whether as your primary database or to operate alongside existing commercial systems such as Oracle or Microsoft SQL Server, here are seven things to keep in mind.

Posted October 21, 2015

The Agile methodology is great for getting turgid development teams to start working faster and more coherently. With Agile, which focuses on more rapid, incremental deliverables and cross-departmental collaboration, the bureaucratic plaque is flushed from the information technology groups' arteries. But there is a dark side to Agile approaches.

Posted October 21, 2015

In the 1989 movie, "Back to the Future Part II," actor Michael J. Fox's character traveled in time to October 21, 2015. And today, October 21, 2015, in his keynote at Dell World 2015 in Austin, Michael S. Dell, founder and CEO of Dell, expanded on what it means for organizations to be future-ready, and what Dell is doing to help. Referring to the recently announced plans for Dell to acquire EMC in a deal valued at $67 billion, Dell said, "I started this company 32 years ago, just a few blocks from here in my dorm room, building PCs. And as I speak to you here today with this agreement in place Dell is set to become an enterprise solutions powerhouse."

Posted October 21, 2015

MapR is including Apache Drill 1.2 in its Apache Hadoop distribution and is also now offering a new Data Exploration Quick Start Solution, leveraging Drill to help customers get started more rapidly with big data projects. Apache Drill is an open source, low-latency query engine for Hadoop that delivers secure, interactive SQL analytics at petabyte scale. With the two announcements, MapR says customers and partners will be able to more quickly leverage Drill to get fast business insights from all their data in Hadoop and other sources. MapR also released a comprehensive SQL-based test framework to the open source community.

Posted October 21, 2015

Splice Machine has released version 1.5 of its Hadoop RDBMS, which adds multiple enterprise-ready features. The new release adds functionality and performance improvements to enable companies to increase the benefits of using Splice Machine to support real-time applications, run operational data lakes, and accelerate their ETL pipelines.

Posted October 20, 2015

As many companies look beyond traditional data storage methods such as data warehouses, cloud storage has become a popular alternative. The cloud offers companies lower costs and more flexibility than traditional storage methods, and organizations are beginning to take advantage of these benefits. When considering cloud storage options, there are many questions a company must weigh. Sarah Maston, developer advocate with IBM Cloud Data Services, covered the move to the cloud in a recent DBTA webinar.

Posted October 13, 2015

In what is being hailed as the biggest tech merger ever, Dell Inc. and EMC Corp. today formally announced they have signed a definitive agreement under which Dell will acquire EMC. The total transaction is valued at $67 billion. The deal is expected to close in the second or third quarter of Dell's fiscal year which ends February 3, 2017 (within the months of May to October 2016). The industry is going through a "tremendous transformation," with the old style of IT being "pretty quickly disrupted" yet this rapid change is also presenting "incredibly rich" opportunities, said Joe Tucci, chairman and chief executive officer of EMC, during a conference call with media and industry analysts.

Posted October 12, 2015

Too little emphasis overall is placed on the integrity and recoverability of the data—and too much is placed on performance. Yes, performance is probably the most visible aspect of database systems, at least from the perspective of the end user. But the underlying assumption of the end user is always that they want to access accurate and, usually, up-to-date data. But what good does it do to quickly access the wrong data? Anybody can provide rapid access to the wrong data!

Posted October 07, 2015

IT suppliers and data managers are experiencing a major pain point: efficient data logging management. The availability of NoSQL open source software has enabled enterprises to collect large volumes of data from different sources, and software companies have implemented "call back home" features that allow their software to send information to data collection centers within various parameters, creating additional runtime configurations and data traffic. And as the Internet of Things and a "connected everything" approach to business become increasingly popular, more and more data will flow in and out of data management systems, leaving IT managers muddled with millions of pieces of data they must properly manage and store.

Posted October 07, 2015

At Strata + Hadoop World 2015, SAP showcased its portfolio of big data solutions, including the HANA platform that offers real-time integration of big data and information held in Hadoop with business processes and operational systems, Lumira and SAP BI tools that enable data discovery on Hadoop along with data wrangling capabilities, SAP Data Services, and the newest SAP product for the Hadoop world, HANA Vora, which takes advantage of an in-memory query engine for Apache Spark and Hadoop to speed queries. SAP HANA Vora can be used as a stand-alone, or in concert with SAP HANA platform to extend enterprise-grade analytics to Hadoop clusters and provide enriched, interactive analytics on Hadoop and HANA data.

Posted October 01, 2015

MarkLogic, which bills itself as the only enterprise NoSQL database provider, completed a $102 million financing round earlier this year that it will use to accelerate the pace of growth in the $36 billion operational database market. Recently, Big Data Quarterly spoke with Joe Pasqua, executive vice president of products at MarkLogic, about the changing database management market, and what MarkLogic is doing to meet emerging enterprise customer requirements.

Posted September 24, 2015

There are various terms being bandied about that describe the new world data centers are entering—from the "third platform" to the "digital enterprise" to the "always-on" organization. Whatever the terminology, it's clear there is a monumental shift underway. Business and IT leaders alike are rethinking their approaches to technology, rethinking their roles in managing this technology, and, ultimately, rethinking their businesses. The underlying technologies supporting this movement—social, mobile, data analytics, and cloud—are also causing IT leaders to rethink the way in which database systems are being developed and deployed.

Posted September 24, 2015

Shortly after the explosion of non-relational databases, around 2009, it became apparent that rather than being part of the problem, SQL would instead continue to be part of the solution. If the new wave of database systems excluded the vast population of SQL-literate professionals, then their uptake in the business world would be impeded. Furthermore, a whole generation of business intelligence tools use SQL as the common way of translating user information requests into database queries. Nowhere was the drive toward SQL adoption more clear than in the case of Hadoop.

Posted September 23, 2015

MapR Technologies, Inc., a provider of a distribution for Apache Hadoop, has extended its support for SAS, a provider of business analytics software and services. According to the vendors, the collaboration between SAS and MapR provides advanced analytics with ease of data preparation and integration with legacy systems, assurance of SLAs, and security and data governance compliance. Additionally, joint customers can cost-effectively grow their big data storage infrastructure without relying on storage area network (SAN) or network-attached storage (NAS).

Posted September 22, 2015

Data has continued to grow at an exponential pace, and along with that trend, more businesses are beginning to take advantage of it. Businesses have begun to rely more and more on their IT departments to leverage their data more quickly than the competition. However, it is difficult to extract value from data as fast as organizations would like. Recently, DBTA held a special roundtable webcast to provide education on new data management technologies and techniques for meeting the increasing requirements posed by modern applications for speed, scale, and flexibility.

Posted September 15, 2015

Whenever I get into a discussion about database standards I invariably bring up one of my favorite quotes on the topic: "The best thing about standards is that there are so many to choose from." It shouldn't be true, but it is.

Posted September 15, 2015

Anyone who thought Hadoop was a fly-by-night technology was wrong. Hadoop has rapidly evolved—improving and gaining mainstream adoption as a technology and framework for enabling data applications previously out of reach for all but the savviest of companies. The open source Apache Hadoop developer community (and distribution vendors) continuously contributes advances to meet the demands of companies seeking more powerful—and useful—data applications, while also focusing on requirements for improved data management, security, metadata, and governance. Hadoop is not only stable but worthy of consideration for core IT strategies.

Posted September 14, 2015
