Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources into a data warehouse for reporting and analytics include ETL (extract, transform, load), EAI (enterprise application integration), CDC (change data capture), data replication, data deduplication, compression, big data technologies such as Hadoop and MapReduce, and data warehouse appliances.
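As a minimal sketch of the ETL pattern listed above (not any particular vendor's tool), the following Python example extracts rows from a hypothetical operational table, applies a simple transformation, and loads the result into a warehouse staging table. The database files, table, and column names are assumptions for illustration only.

    import sqlite3  # used here as a stand-in for any source and warehouse connection

    def extract(source_conn):
        # Pull raw order rows from a hypothetical operational table.
        return source_conn.execute(
            "SELECT order_id, customer_id, amount, order_date FROM orders"
        ).fetchall()

    def transform(rows):
        # Normalize amounts to two decimals and derive an order_year column.
        return [
            (order_id, customer_id, round(amount, 2), order_date[:4])
            for order_id, customer_id, amount, order_date in rows
        ]

    def load(warehouse_conn, rows):
        # Bulk-insert the transformed rows into a warehouse staging table.
        warehouse_conn.executemany(
            "INSERT INTO stg_orders (order_id, customer_id, amount, order_year) "
            "VALUES (?, ?, ?, ?)",
            rows,
        )
        warehouse_conn.commit()

    source = sqlite3.connect("operational.db")    # hypothetical source system
    warehouse = sqlite3.connect("warehouse.db")   # hypothetical warehouse
    load(warehouse, transform(extract(source)))

Commercial ETL, replication, and CDC products wrap these same steps with scheduling, monitoring, and incremental-change handling at far larger scale.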



Data Warehousing Articles

If you look at what is really going on in the big data space, it's all about inexpensive open source solutions that are facilitating the modernization of data centers and data warehouses, and at the center of this universe is Hadoop. In the evolution of the big data market, open source is playing a seminal role as the "disruptive technology" challenging the status quo. Additionally, organizations large and small are leveraging these solutions, often based on inexpensive hardware and memory platforms, in the cloud or on premises.

Posted September 11, 2013

Attunity Ltd., a provider of information availability software solutions, has formed a new partnership with HP and announced the availability of its enhanced Attunity Click-2-Load solution for HP Vertica. The solution provides automation and optimized technologies that accelerate data loading to HP Vertica from disparate sources, and then maintains changed data continuously and efficiently.

Posted August 13, 2013

A new Oracle In-Memory Application - Oracle In-Memory Logistics Command Center - has been launched that enables customers to improve scenario management in order to increase supply chain resiliency and agility, decrease costs and enhance service levels. With supply chains and their associated logistics networks becoming increasingly complex, strategic and operational, Oracle says scenario management is now central to creating an effective logistics network. Oracle In-Memory Logistics Command Center leverages the performance capabilities of Oracle Engineered Systems, including Oracle Exadata Database Machine and Oracle Exalogic Elastic Cloud, which can manage the large and complex data sets central to in-memory applications.

Posted August 07, 2013

Noetix Corp., a provider of business intelligence (BI) software and services for enterprise applications, has introduced Noetix Analytics 5.3, with improvements including new data marts, performance enhancements, and a streamlined upgrade process. Noetix Analytics is a packaged data warehouse solution designed to provide business users with strategic reporting for trending and analysis based on information from multiple data sources.

Posted August 07, 2013

The Oracle database provides intriguing possibilities for storing, manipulating, and streaming multimedia data in enterprise-class environments. However, knowledge of why and how the Oracle database can be used for multimedia applications is essential if one is to justify and maximize the ROI.

Posted July 17, 2013

These are heady times for data products vendors and their enterprise customers. When business leaders talk about success these days, they often are alluding to a new-found appreciation for their data environments. It can even be said that the tech vendors that are making the biggest difference in today's business world are no longer software companies at all; rather, they are "data" companies, with all that implies. Enterprises are reaching out to vendors for help in navigating through the fast-moving, and often unforgiving, digital realm. The data vendors that are leading their respective markets are those that know how to provide the tools, techniques, and hand-holding needed to manage and sift through gigabytes', terabytes', and petabytes' worth of data to extract tiny but valuable nuggets of information to guide business leaders as to what they should do next.

Posted June 19, 2013

There is an emerging field of companies looking to take on the challenges presented by the roiling tide of big data. While their visions vary, each has identified a market need that it believes its technology uniquely addresses. Here, DBTA highlights the approaches of 10 companies we think are worth watching.

Posted June 13, 2013

Embarcadero Technologies gives 97% of the world's top 2000 companies the tools needed to address the biggest challenges in data management. Facing significant growth in the complexity, diversity, and volume of enterprise data, companies worldwide are increasingly turning to data governance as a strategic solution. Helping our customers manage this complexity and close the "governance gap" has been a major driver of innovation in our products.

Posted June 03, 2013

SAP AG has announced the SAP HANA Enterprise Cloud service. With the new offering, running mission-critical SAP ERP, SAP CRM, SAP NetWeaver Business Warehouse, and new applications powered by the SAP HANA in-memory platform will be possible as a managed cloud service at elastic petabyte scale.

Posted May 29, 2013

It seems that juggling is the most useful of all skills when embarking on a data warehousing project. During the discovery and analysis phase, the workload grows insanely large, like some mutant science fiction monster. Pressures to deliver can encourage rampant corner-cutting to move quickly, while the need to provide value urges caution, lest the proverbial baby be thrown out with the bath water as the project speeds along. Change data capture is one glaring example of this necessary juggling and balancing.

Posted May 22, 2013
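As a rough sketch of the change data capture balancing act described in the item above, the following Python snippet diffs a source snapshot against the warehouse copy to classify inserts, updates, and deletes. Production CDC tools read database transaction logs rather than comparing snapshots, and all names here are hypothetical.

    def diff_changes(source_rows, warehouse_rows, key="id"):
        # Classify differences between a source snapshot and the warehouse copy.
        # A log-based CDC tool reads the transaction log instead of diffing
        # snapshots, but the categories of change are the same.
        src = {row[key]: row for row in source_rows}
        dwh = {row[key]: row for row in warehouse_rows}
        inserts = [src[k] for k in src.keys() - dwh.keys()]
        deletes = [dwh[k] for k in dwh.keys() - src.keys()]
        updates = [src[k] for k in src.keys() & dwh.keys() if src[k] != dwh[k]]
        return inserts, updates, deletes

    # Example: one new order (id 2) and one changed amount (id 1).
    source = [{"id": 1, "amount": 12.5}, {"id": 2, "amount": 25.0}]
    warehouse = [{"id": 1, "amount": 10.0}]
    print(diff_changes(source, warehouse))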

Key findings from a new study, "Big Data Opportunities," will be presented at Big Data Boot Camp at the Hilton New York. Big Data Boot Camp will kick off at 9 am on Tuesday, May 21, with a keynote from John O'Brien, founder and principal of Radiant Advisors, on the dynamics and current issues being faced in today's big data analytic implementations. Directly after the opening address, David Jonker, senior director of Big Data Marketing, SAP, will showcase the results of the new big data survey, which revealed a variety of practical approaches that organizations are adopting to manage and capitalize on big data. The study was conducted by Unisphere Research, a division of Information Today, Inc., and sponsored by SAP.

Posted May 16, 2013

The conference agenda as well as the list of speakers is now available for DBTA's Big Data Boot Camp, a deep dive designed to bring together thought leaders and practitioners who will provide insight on how to collect, manage, and act on big data. The conference will be held May 21-22 at the Hilton New York. SAP is the diamond sponsor, and Objectivity and MarkLogic are platinum sponsors of the two-day event.

Posted April 25, 2013

Data keeps growing, systems and servers keep sprawling, and users keep clamoring for more real-time access. The result of all this frenzy of activity is pressure for faster, more effective data integration that can deliver more expansive views of information, while still maintaining quality and integrity. Enterprise data and IT managers are responding in a variety of ways, looking to initiatives such as enterprise mashups, automation, virtualization, and cloud to pursue new paths to data integration. In the process, they are moving beyond the traditional means of integration they have relied on for years to pull data together.

Posted April 03, 2013

Attunity Ltd., a provider of information availability software solutions, released Attunity Replicate 2.1, a high-performance data delivery solution that adds improvements for data warehousing. Attunity Replicate's new performance enhancements support many data warehouses, including Amazon Redshift, EMC Greenplum and Teradata.

Posted March 13, 2013

SAP AG has introduced a new version of its Sybase IQ disk-based column store analytics server. The overriding theme of this new release, which will be generally available later in the first quarter, "is positioning IQ 16 to go from terabytes to petabytes," Dan Lahl, senior director of product marketing at SAP, tells 5 Minute Briefing. To accomplish this, IQ 16 provides enhancements in three critical areas.

Posted February 27, 2013

Hortonworks, a leading contributor to Apache Hadoop, has released Hortonworks Sandbox, a learning environment and on-ramp for anyone interested in learning, evaluating or using Apache Hadoop in the enterprise. The tool seeks to bridge the gap between wanting to learn Hadoop and the complexity of setting up a cluster by providing an integrated environment with demos, videos, and tutorials.

Posted February 27, 2013

MarkLogic said it plans to deliver an enterprise-grade application and analytics software solution based on the new Intel Distribution for Apache Hadoop software. The Intel Distribution will be combined with the MarkLogic Enterprise NoSQL database to support real-time transactional and analytic applications.

Posted February 26, 2013

Attunity is developing a solution for fast data loading into Amazon Redshift, AWS's new data warehouse in the cloud. The Attunity solution is expected to be available for customer preview in March 2013.

Posted February 21, 2013

Hortonworks, a contributor to Apache Hadoop, has submitted two new incubation projects to the Apache Software Foundation and also announced the launch of the new "Stinger Initiative." These three projects seek to address key enterprise requirements regarding Hadoop application security and performance.

Posted February 21, 2013

IBM reports a surge in mainframe sales in the most recent quarter, surpassing all previous quarters. This announcement was part of the company's release of quarterly and annual results. Overall, total quarterly revenue was down 1% from last year, and down 2% for the year.

Posted February 04, 2013

Actian Corp. and Pervasive Software Inc. have entered into a definitive merger agreement through which Actian will acquire all of Pervasive's outstanding shares for $9.20 per share. Actian products include Action Apps; Vectorwise, the analytical database; and Ingres, an independent mission-critical OLTP database; in addition to the Versant Object Database, which Actian added to its portfolio through another recent merger in which it acquired all the outstanding shares of Versant Corporation. According to the company, the deal values Pervasive at $161.9 million and will accelerate Actian's ability to deliver its vision of providing organizations with the capability to take action in real time as their business environment changes.

Posted January 31, 2013

Today's data warehouse environments are not keeping up with the explosive growth of data volume (or "big data") and the demand for real-time analytics. Fewer than one out of 10 respondents to a new survey say their data warehouse sites can deliver analysis in what they would consider a real-time timeframe. Nearly 75% of respondents believe that in-memory technology is important to enabling their organization to remain competitive in the future. Yet, almost as many also indicate they lack the in-memory skills to deliver even current business requirements. These are among the findings of a new survey of 323 data managers and professionals who are part of the Independent Oracle Users Group (IOUG). The survey was underwritten by SAP Corporation and conducted by Unisphere Research, a division of Information Today, Inc.

Posted January 29, 2013

Databases are hampered by a reliance on disk-based storage, a technology that has been in place for more than two decades. Even with the addition of memory caches and solid state drives, the model of relying on repeated access to the permanent information storage devices is still a bottleneck in capitalizing on today's "big data," according to a new survey of 323 data managers and professionals who are part of the IOUG. Nearly 75% of respondents believe that in-memory technology is important to enabling their organization to remain competitive in the future. Yet, almost as many also indicate they lack the in-memory skills to deliver even current business requirements. The research results are detailed in a new report, titled "Accelerating Enterprise Insights: 2013 IOUG In-Memory Strategies Survey."

Posted January 24, 2013

Despite the rise of big data, data warehousing is far from dead. While traditional, static data warehouses may have indeed seen their day, an agile data warehouse — one that can map to the needs of the business and change as the business changes — is quickly on the rise. Many of the conversations today around big data revolve around volume, and while that is certainly valid, the issue is also about understanding data in context to make valuable business decisions. Do you really understand why a consumer takes action to buy? How do their purchases relate? When will they do it again? Big data is limited when it comes to answering these questions. An agile approach — one that gives even big data a life beyond its initial purpose — is the value data warehousing can bring to bear and is critical to long-term business success.

Posted December 19, 2012

The University of Minnesota, a top research institution comprising five campuses, 65,000 students, and 25,000 employees, has made systematic changes and improved database administration efficiency with Oracle Exadata Database Machine. By hosting its IT environment on two Oracle Exadata Database Machine half racks, the university consolidated more than 200 Oracle database instances into fewer than 20, enabling it to reduce data center floor space and total cost of ownership.

Posted December 12, 2012

At OpenWorld, Oracle's annual conference for customers and partners, John Matelski, president of the IOUG, and CIO for Dekalb County, Georgia, gave his perspective on the key takeaways from this year's event. Matelski also described the user group's efforts to help the community understand the value of Oracle's engineered systems and deal with the broad implications of big data, and how the IOUG is supporting Oracle DBAs in their evolving roles.

Posted December 12, 2012

Tervela Turbo is now certified on CDH4 (Cloudera's Distribution Including Apache Hadoop Version 4). Introduced in October, Tervela Turbo, a high-performance data movement engine, helps Cloudera customers implement mission-critical Hadoop systems with reliable data capture, high-speed data loading into HDFS, disaster recovery for Hadoop, and ETLT data warehousing. Tervela has also joined the Cloudera Connect Partner Program.

Posted December 12, 2012

Amazon Web Services Inc. has announced the limited preview of Amazon Redshift, a managed, petabyte-scale data warehouse service in the cloud, which aims to give customers faster query performance when analyzing data sets using the same SQL-based BI tools they use today. "Over the past 2 years, one of the most frequent requests we've heard from customers is for AWS to build a data warehouse service," says Raju Gulabani, vice president of Database Services, AWS.

Posted November 28, 2012
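Because Amazon Redshift speaks the PostgreSQL wire protocol, existing SQL-based tools and drivers can generally connect to it directly, which is the point of the item above. The snippet below is a minimal sketch using the standard psycopg2 driver; the cluster endpoint, credentials, and table are assumptions for illustration, not part of the announcement.

    import psycopg2  # standard PostgreSQL driver; Redshift listens on port 5439 by default

    conn = psycopg2.connect(
        host="examplecluster.abc123.us-east-1.redshift.amazonaws.com",  # hypothetical endpoint
        port=5439,
        dbname="dev",
        user="analyst",
        password="replace-me",
    )
    with conn.cursor() as cur:
        # An ordinary aggregate query, the same SQL a BI tool would issue.
        cur.execute(
            "SELECT order_year, SUM(amount) FROM stg_orders GROUP BY order_year ORDER BY 1"
        )
        for year, total in cur.fetchall():
            print(year, total)
    conn.close()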

Jeff West, president of Quest International Users Group, joined by Jonathan Vaughn, Quest's executive director, talked with DBTA at Oracle OpenWorld about what's ahead for 2013. The group has launched smaller, product-concentrated events to support JD Edwards and PeopleSoft users' specific areas of interest, and expanded its range of online offerings for users who may not be able to take advantage of in-person conferences. Plans are underway to help members learn about PeopleSoft 9.2 coming in March and to prepare for the looming end of support for JD Edwards World. As always, says West, Quest continues to help get information to members from Oracle and their peers. "It is always about return on investment and aligning IT with the business. That is always on the top of people's minds."

Posted November 27, 2012

MapR Technologies, Inc., provider of the MapR Distribution for Hadoop, has formed a partnership with Hadapt, which offers a data analytics platform for natively integrating SQL with Apache Hadoop. The partnership enables customers to leverage MapR's Hadoop distribution in conjunction with Hadapt's Interactive Query capabilities to analyze all types of data (structured, semi-structured, and unstructured) in a single enterprise platform. Partnerships such as the one with Hadapt enable a broad community of users to have access to Hadoop data while also leveraging the existing skill sets of those users, Jack Norris, vice president of MapR, tells 5 Minute Briefing.

Posted November 15, 2012

Cloudera, provider of Apache Hadoop-based software and services, announced what it calls the first big data management solution that allows batch and real-time operations on any type of data within one scalable system. Cloudera Enterprise Real-Time Query (RTQ), powered by Cloudera Impala, improves the economics and performance of large-scale enterprise data management, allowing organizations to process data at petabyte scale and interact with that data in real time, all on the same system.

Posted November 06, 2012

Open source software vendor Talend announced that it has added big data profiling for Apache Hadoop and support for NoSQL databases in the upcoming release of its integration platform, Talend v5.2. Data profiling, the process of evaluating the character and condition of data stored across the enterprise, is a critical step toward gaining control over organizational data, and is emerging as a big data best practice. "Profiling allows you to understand what you have in your Hadoop cluster and how this data can be used for your big data integration and management project," Yves de Montcheuil, Talend's vice president of marketing, tells 5 Minute Briefing.

Posted November 06, 2012
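Data profiling, as described in the Talend item above, amounts to summarizing the character and condition of each column in a data set. The sketch below computes a few basic profile metrics over a flat-file extract with pandas, as a stand-in for the kind of statistics a profiling tool gathers across a Hadoop cluster; the file and column names are assumptions.

    import pandas as pd

    def profile(df: pd.DataFrame) -> pd.DataFrame:
        # Basic per-column profile: type, completeness, distinctness, and a sample value.
        return pd.DataFrame({
            "dtype": df.dtypes.astype(str),
            "null_pct": (df.isna().mean() * 100).round(1),
            "distinct": df.nunique(),
            "example": df.apply(
                lambda col: col.dropna().iloc[0] if col.notna().any() else None
            ),
        })

    # Example: profile a (hypothetical) flat-file extract pulled from the cluster.
    orders = pd.read_csv("orders_extract.csv")
    print(profile(orders))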

Attunity Ltd., a provider of information availability software solutions, is partnering with Teradata to offer Attunity Replicate for Teradata, a big data replication solution designed to enable loading of heterogeneous data to Teradata with high performance, efficiency and ease-of-use.

Posted October 25, 2012

Kognitio is allowing companies to download a fully functional copy of its software at no charge, and with no time restrictions. The company, which made the announcement at the O'Reilly Strata and Hadoop World conference, said it is offering a full-featured, perpetual-use license for up to 128 gigabytes, without the expiration period or limited functionality normally found in "trialware." This capability gives companies the ability to do in-memory analytics on, for example, more than 500 million customer records at once.

Posted October 25, 2012

At SAP TechEd 2012 in Las Vegas, SAP unveiled its plans for SAP HANA Cloud, a next-generation cloud platform based on in-memory technology. As part of SAP HANA Cloud, the company also announced the general availability of SAP NetWeaver Cloud, an open standards-based application service, and SAP HANA One, a deployment of SAP HANA certified for production use on the Amazon Web Services (AWS) Cloud, as the first offerings based on SAP HANA Cloud.

Posted October 24, 2012

The opportunities and challenges presented by big data are addressed in a new report summarizing the results of a survey of data managers and professionals who are part of the Independent Oracle Users Group. The survey was underwritten by Oracle Corporation and conducted by Unisphere Research, a division of Information Today, Inc. Key highlights from the survey include the finding that more than one out of 10 data managers now have in excess of a petabyte of data within their organizations, and a majority of respondents report their levels of unstructured data are growing.

Posted October 24, 2012

Respondents to the IOUG Big Data survey were entered into a drawing to win an iPad by providing their email addresses. The winner of the recent IOUG Big Data study sweepstakes drawing was Thomas F. Lewandowski, an independent Oracle DBA.

Posted October 24, 2012
