Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources into a data warehouse for reporting and analytics include ETL (extract, transform, load), EAI (enterprise application integration), CDC (change data capture), data replication, data deduplication, compression, big data technologies such as Hadoop and MapReduce, and data warehouse appliances.
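To make the ETL piece of that list concrete, here is a minimal sketch in Python. It is illustrative only; the CSV source file, the column names, and the SQLite target table are assumptions for the example, not products or schemas referenced above.

```python
# Minimal ETL sketch (illustrative; file name, columns, and SQLite target are assumed).
import csv
import sqlite3

def extract(path):
    """Extract: read raw rows from a source CSV export."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: clean and conform rows for the warehouse fact table."""
    return [
        {
            "customer_id": int(r["id"]),
            "region": r["region"].strip().upper(),
            "amount": round(float(r["amount"]), 2),
        }
        for r in rows
        if r.get("amount")  # drop rows with no amount recorded
    ]

def load(rows, db_path="warehouse.db"):
    """Load: append conformed rows to the warehouse."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS sales_fact "
        "(customer_id INTEGER, region TEXT, amount REAL)"
    )
    con.executemany(
        "INSERT INTO sales_fact VALUES (:customer_id, :region, :amount)", rows
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("sales_export.csv")))
```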



Data Warehousing Articles

In 2015, big data, mobility, IoT, expanding requirements for security and real-time analytics, and the introduction of the Cognitive Era continued to place greater pressure on IT organizations. Linux and open source technologies are at the center of many of the most innovative software and hardware solutions addressing emerging enterprise requirements. Here's a look back at some of the most significant announcements in Linux and open source technology of 2015.

Posted December 16, 2015

Looking ahead to 2016, David Jonker, senior director of big data at SAP, offered six key business and technology trends that will take the market by storm, including continued interest in in-memory processing, enablement of real-time data and analytics, and an increased focus on advanced analytics and machine learning.

Posted December 16, 2015

What's ahead for 2016 in terms of cloud, IoT, big data, analytics, and open source technologies? IT executives gaze into their crystal balls and weigh in on the challenges and opportunities ahead in the next year - and beyond.

Posted December 16, 2015

The modern business landscape is a fast-moving, ever-changing, highly competitive environment. To outpace the competition and build upon innovation, companies must embrace a modern data architecture, one that supports today's new requirements such as mobile integration and advanced digital marketing.

Posted December 02, 2015

It's commonly asserted—and generally accepted—that the era of the "one-size-fits-all" database is over. We expect that enterprises will use a combination of database technologies to meet the distinct needs created by various application architectures.

Posted December 02, 2015

As 2015 draws to a close it's time to look back on the year's big changes in data management and reflect on some of the most insightful observations by leading data experts. Here, Big Data Quarterly presents a round-up of thought-provoking articles that explore some of the key advances of 2015, how organizations are harnessing what big data has to offer, and the challenges they face as they seek to compete on analytics.

Posted December 02, 2015

Hadoop distribution provider Cloudera has introduced Cloudera Enterprise 5.5, including Cloudera Navigator Optimizer, a new product targeted at helping organizations improve big data workload performance and efficiency. Cloudera Navigator Optimizer, now in beta, is expected to be generally available in 2016. The new release of Cloudera Enterprise has three main areas of focus, according to Anupam Singh, head of data management at Cloudera.

Posted November 19, 2015

SAP is the first company to announce IoT solutions based on Intel's new IoT Platform. SAP plans to develop its end-to-end enterprise IoT solutions using the Intel platform along with its own SAP HANA Cloud Platform.

Posted November 18, 2015

The concept of the data lake has become a hot topic. A data lake retains data in its original format, giving everyone involved more flexibility in how the data is used. While this sounds fine in theory, it is more complicated in practice due to the need for governance and security.

Posted November 09, 2015

There are many different ways to look at database administration. It can be done by task, by discipline, by DBMS, by server, and so on. But one useful way to look at database administration is in terms of the type of support being delivered to applications. You can paint a broad brush stroke across the duties of the DBA and divide them into two categories: those that support development work and those that support the production systems.

Posted November 09, 2015

To better manage the data explosion now and with scalable options for the future, existing data architecture is evolving beyond traditional databases, data stores, data warehouses, and the like into a more unfiltered repository known as the data lake.

Posted November 05, 2015

Organizations are getting squeezed, said Mark Hurd, Oracle CEO, in his Monday morning keynote at Oracle OpenWorld 2015. They have old infrastructure; they face a need to innovate but also great pressure to handle tasks that are not innovative, such as increasing security and adhering to governance mandates; and they must do all of this while keeping IT costs flat. "This is why the cloud is such a big deal," said Hurd, sharing his list of top 5 predictions for 2025.

Posted November 04, 2015

Informatica has introduced a new data management platform designed to handle data at any speed across today's hybrid IT environments, both cloud and on-premises. The new release, Informatica 10, provides enhancements across three core components - Informatica PowerCenter 10, Informatica Data Quality 10, and Informatica Data Integration Hub 10 - and offers specific certified optimizations for Oracle Exadata, Oracle SuperCluster, SAP HANA, and HP Vertica.

Posted October 28, 2015

Attunity Ltd., a provider of data management software solutions, has introduced the latest version of its data replication and loading solution. Designed to accelerate enterprise big data analytics initiatives, Attunity Replicate 5.0 automates big data movement to, from, and between databases, data warehouses, Hadoop, and the cloud, reducing the time, labor, and ultimately the cost of making big data analytics available in real time.

Posted October 26, 2015

Ever since Linux became a viable server operating system, organizations have been looking to all kinds of open source software (OSS) to save on license and maintenance costs and to enjoy the benefits of an open platform that invites innovation. If you're considering MySQL or another open source DBMS, whether as your primary database or to run alongside existing commercial systems such as Oracle or Microsoft SQL Server, here are seven things to keep in mind.

Posted October 21, 2015

The Agile methodology is great for getting turgid development teams to start working faster and more coherently. With Agile, which focuses on more rapid, incremental deliverables and cross-departmental collaboration, the bureaucratic plaque is flushed from the information technology groups' arteries. But there is a dark side to Agile approaches.

Posted October 21, 2015

In the 1989 movie, "Back to the Future Part II," actor Michael J. Fox's character traveled in time to October 21, 2015. And today, October 21, 2015, in his keynote at Dell World 2015 in Austin, Michael S. Dell, founder and CEO of Dell, expanded on what it means for organizations to be future-ready, and what Dell is doing to help. Referring to the recently announced plans for Dell to acquire EMC in a deal valued at $67 billion, Dell said, "I started this company 32 years ago, just a few blocks from here in my dorm room, building PCs. And as I speak to you here today with this agreement in place Dell is set to become an enterprise solutions powerhouse."

Posted October 21, 2015

MapR is including Apache Drill 1.2 in its Apache Hadoop distribution and is also now offering a new Data Exploration Quick Start Solution, leveraging Drill to help customers get started more rapidly with big data projects. Apache Drill is an open source, low-latency query engine for Hadoop that delivers secure, interactive SQL analytics at petabyte scale. With the two announcements, MapR says customers and partners will be able to more quickly leverage Drill to get fast business insights from all their data in Hadoop and other sources. MapR also released a comprehensive SQL-based test framework to the open source community.

Posted October 21, 2015

Splice Machine has released version 1.5 of its Hadoop RDBMS, which adds multiple enterprise-ready features. The new release adds functionality and performance improvements to enable companies to increase the benefits of using Splice Machine to support real-time applications, run operational data lakes, and accelerate their ETL pipelines.

Posted October 20, 2015

As companies look beyond traditional data storage methods such as data warehouses, cloud storage has become a popular option. The cloud offers better cost and more flexibility than traditional storage, and organizations are increasingly taking advantage of those benefits. Still, there are many questions a company must weigh when considering cloud storage options. Sarah Maston, developer advocate with IBM Cloud Data Services, covered the move to the cloud in a recent DBTA webinar.

Posted October 13, 2015

In what is being hailed as the biggest tech merger ever, Dell Inc. and EMC Corp. today formally announced they have signed a definitive agreement under which Dell will acquire EMC. The total transaction is valued at $67 billion. The deal is expected to close in the second or third quarter of Dell's fiscal year, which ends February 3, 2017 (within the months of May to October 2016). The industry is going through a "tremendous transformation," with the old style of IT being "pretty quickly disrupted," yet this rapid change is also presenting "incredibly rich" opportunities, said Joe Tucci, chairman and chief executive officer of EMC, during a conference call with media and industry analysts.

Posted October 12, 2015

Too little emphasis overall is placed on the integrity and recoverability of the data—and too much is placed on performance. Yes, performance is probably the most visible aspect of database systems, at least from the perspective of the end user. But the underlying assumption of the end user is always that they want to access accurate and, usually, up-to-date data. But what good does it do to quickly access the wrong data? Anybody can provide rapid access to the wrong data!

Posted October 07, 2015

IT suppliers and data managers are experiencing a major pain point with efficient data logging management. The availability of NoSQL open source software has enabled enterprises to collect large volumes of data from different sources, and software companies have implemented "call back home" features that allow their software to send information to data collection centers within various parameters, creating additional runtime configurations and data traffic. And as the Internet of Things and a "connected everything" approach to business become increasingly popular, more and more data will flow in and out of data management systems, leaving IT managers muddled with millions of pieces of data they must properly manage and store.

Posted October 07, 2015

At Strata + Hadoop World 2015, SAP showcased its portfolio of big data solutions, including the HANA platform, which offers real-time integration of big data and information held in Hadoop with business processes and operational systems; Lumira and SAP BI tools, which enable data discovery on Hadoop along with data wrangling capabilities; SAP Data Services; and the newest SAP product for the Hadoop world, HANA Vora, which takes advantage of an in-memory query engine for Apache Spark and Hadoop to speed queries. SAP HANA Vora can be used standalone or in concert with the SAP HANA platform to extend enterprise-grade analytics to Hadoop clusters and provide enriched, interactive analytics on Hadoop and HANA data.

Posted October 01, 2015

MarkLogic, which bills itself as the only enterprise NoSQL database provider, completed a $102 million financing round earlier this year that it will use to accelerate the pace of growth in the $36 billion operational database market. Recently, Big Data Quarterly spoke with Joe Pasqua, executive vice president of products at MarkLogic, about the changing database management market, and what MarkLogic is doing to meet emerging enterprise customer requirements.

Posted September 24, 2015

There are various terms being bandied about that describe the new world data centers are entering—from the "third platform" to the "digital enterprise" to the "always-on" organization. Whatever the terminology, it's clear there is a monumental shift underway. Business and IT leaders alike are rethinking their approaches to technology, rethinking their roles in managing this technology, and, ultimately, rethinking their businesses. The underlying technologies supporting this movement—social, mobile, data analytics, and cloud—are also causing IT leaders to rethink the way in which database systems are being developed and deployed.

Posted September 24, 2015

Shortly after the explosion of non-relational databases, around 2009, it became apparent that rather than being part of the problem, SQL would instead continue to be part of the solution. If the new wave of database systems excluded the vast population of SQL-literate professionals, then their uptake in the business world would be impeded. Furthermore, a whole generation of business intelligence tools use SQL as the common way of translating user information requests into database queries. Nowhere was the drive toward SQL adoption more clear than in the case of Hadoop.

Posted September 23, 2015

MapR Technologies, Inc., a provider of a distribution for Apache Hadoop, has extended its support for SAS, a provider of business analytics software and services. According to the vendors, the collaboration between SAS and MapR provides advanced analytics with ease of data preparation and integration with legacy systems, assurance of SLAs, and security and data governance compliance. Additionally, joint customers can cost-effectively grow their big data storage infrastructure without relying on storage area network (SAN) or network-attached storage (NAS).

Posted September 22, 2015

Data has continued to grow at an exponential pace, and along with that trend, more businesses are beginning to take advantage of it. Businesses increasingly rely on their IT departments to leverage data faster than the competition, yet getting the most out of data as quickly as organizations would like remains difficult. Recently, DBTA held a special roundtable webcast to provide education on new data management technologies and techniques for meeting the increasing requirements posed by modern applications for speed, scale, and flexibility.

Posted September 15, 2015

Whenever I get into a discussion about database standards I invariably bring up one of my favorite quotes on the topic: "The best thing about standards is that there are so many to choose from." It shouldn't be true, but it is.

Posted September 15, 2015

Anyone who thought Hadoop was a fly-by-night technology was wrong. Hadoop has rapidly evolved—improving and gaining mainstream adoption as a technology and framework for enabling data applications previously out of reach for all but the savviest of companies. The open source Apache Hadoop developer community (and distribution vendors) continuously contributes advances to meet the demands of companies seeking more powerful—and useful—data applications, while also focusing on requirements for improved data management, security, metadata, and governance. Hadoop is not only stable but worthy of consideration for core IT strategies.

Posted September 14, 2015

There are a lot of moving parts that data managers and professionals need to attend to in today's enterprises. Here are the eight things that matter the most in today's market.

Posted September 09, 2015

Data modelers face a choice when encountering multiple variations of a data item. Designers must focus on the longer-term appropriateness of their decisions when choosing how their designs will play out, and whether a design goes vertical or horizontal does have an impact over time, as the sketch below illustrates.
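As a purely hypothetical illustration of that choice (the entity, attribute names, and values below are invented, not drawn from the article), consider how multiple variations of a "phone number" item could be modeled horizontally or vertically:

```python
# Hypothetical sketch of horizontal vs. vertical modeling of item variations.

# Horizontal: each variation becomes its own column on the parent record.
customer_horizontal = {
    "customer_id": 42,
    "phone_home": "555-0100",
    "phone_work": "555-0101",
    "phone_mobile": None,  # a new variation later forces a schema change
}

# Vertical: variations become rows in a child structure keyed by type.
customer_phones_vertical = [
    {"customer_id": 42, "phone_type": "home", "number": "555-0100"},
    {"customer_id": 42, "phone_type": "work", "number": "555-0101"},
    # a new "mobile" variation is just another row; no schema change needed
]
```

The horizontal design is simpler to query but rigid; the vertical design absorbs new variations without structural change, at the cost of more joins and more generic column names.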

Posted September 09, 2015

It's no secret that the world of information technology is changing fast. Data is being created in ways not possible a few years ago. It is now feasible to collect and analyze data from a wide range of sources, including mobile devices, machines, social media, documents, and emails. To help organizations navigate the rapidly changing big data landscape, Big Data Quarterly presents the "Big Data 50," a list of companies driving innovation.

Posted September 09, 2015

Birst, a provider of cloud BI and analytics software, has introduced a new technology it calls "Networked BI" to enable global governance with local execution. According to Birst, the new approach furthers its vision of trusted and agile collaboration between centralized and decentralized teams.

Posted September 08, 2015

With the release of Oracle Cloud Platform 2015 and new Oracle Cloud Platform Services, things are clouding up across the Oracle landscape—but in a positive way. Larry Ellison, chairman and chief technology officer for Oracle, has made it clear in pronouncements that Oracle is in the cloud to stay.

Posted September 02, 2015

Magnitude Software, a provider of enterprise information management solutions, is introducing new additions to its technology suite designed to help users make informed business decisions faster and more easily. The release of its Kalido Information Engine 9.1 SP2 delivers a variety of advancements in data warehouse automation, including increased automation that accelerates the migration process and reduces overhead tasks when moving from development to production.

Posted September 01, 2015

Unless you've been living under a rock, you've heard that Microsoft has introduced the initial release of Windows 10. Windows 10 has been a widely anticipated release on the part of consumers and businesses alike, with Microsoft aiming to deploy Windows 10 on more than one billion devices within the next few years. Here are 10 technical and market reasons why Windows 10 will become Microsoft's biggest release ever.

Posted August 26, 2015

The Independent Oracle Users Group is headed to San Francisco this October to join the Oracle technology community at Oracle OpenWorld 2015. Representing the voice of Oracle technology professionals, IOUG members will present more than 35 sessions on critical Oracle technology topics, including big data, cloud, Oracle Enterprise Manager 12c, and more. There is a lot to discuss with the Oracle database and technology community, and I am looking forward to connecting with Oracle professionals with similar interests to address challenges and solutions face-to-face.

Posted August 19, 2015
