Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources in a Data Warehouse for Reporting and Analytics include ETL (Extract, Transform, Load), EAI (Enterprise Application Integration), CDC (Change Data Capture), Data Replication, Data Deduplication, Compression, Big Data technologies such as Hadoop and MapReduce, and Data Warehouse Appliances.
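The ETL pattern named above can be illustrated with a minimal in-memory sketch. All names here (the `crm_export` source, the `dim_customer` table) are hypothetical, chosen only to show the extract/transform/load stages, not any particular product's API.

```python
# Minimal ETL sketch: extract rows from a source system, transform them
# to fit the warehouse schema, and load them into a consolidated table
# (everything in-memory here for illustration).

def extract(source):
    """Extract: read raw records from a source system."""
    return list(source)

def transform(rows):
    """Transform: normalize field names and types for the warehouse schema."""
    return [
        {"customer_id": int(r["id"]), "region": r["region"].strip().upper()}
        for r in rows
    ]

def load(warehouse, table, rows):
    """Load: append the cleaned rows into the target warehouse table."""
    warehouse.setdefault(table, []).extend(rows)

# A hypothetical export from a CRM system, with untidy raw values.
crm_export = [{"id": "101", "region": " east "}, {"id": "102", "region": "West"}]
warehouse = {}
load(warehouse, "dim_customer", transform(extract(crm_export)))
```

In a real pipeline each stage would read from and write to databases or files; the separation of the three stages is the point of the pattern.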



Data Warehousing Articles

Attunity Ltd., a provider of information availability software solutions, has released a new version of its data replication software intended to address requirements for big data analytics, business intelligence, business continuity, and disaster recovery initiatives. Addressing expanding use cases for the solution, Attunity Replicate 3.0 is engineered to provide secure data transfer over long distances such as wide area networks (WANs), the cloud, and satellite connections, said Lawrence Schwartz, vice president of marketing at Attunity, in an interview.

Posted September 25, 2013

Oracle CEO Larry Ellison made three key announcements in his opening keynote at Oracle OpenWorld, the company's annual conference for customers and partners in San Francisco. Ellison unveiled the Oracle Database In-Memory Option for Oracle Database 12c, which he said speeds up query processing by "orders of magnitude," the M6 Big Memory Machine, and the new Oracle Database Backup Logging Recovery Appliance. Explaining Oracle's goals with the new in-memory option, Ellison noted that in the past there have been row-format databases, and column-format databases that are intended to speed up query processing. "We had a better idea. What if we store data in both formats simultaneously?"
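The dual-format idea Ellison describes can be sketched in a few lines. This is not Oracle's implementation, just an illustration of why the same data is kept in both layouts: row format serves point lookups of whole records, while column format lets an aggregate scan one contiguous array.

```python
# Sketch of dual-format storage: the same table held as rows
# (fast for transactional point lookups) and as columns
# (fast for analytic scans over a single attribute).

rows = [
    {"id": 1, "city": "SF", "sales": 100},
    {"id": 2, "city": "NY", "sales": 250},
    {"id": 3, "city": "SF", "sales": 175},
]

# Column format: one array per attribute, built from the same rows.
columns = {key: [r[key] for r in rows] for key in rows[0]}

# Row format answers a point query with one whole record...
point_lookup = rows[1]

# ...while the column format answers an aggregate by scanning
# only the one column it needs.
total_sales = sum(columns["sales"])
```

Keeping both formats in sync on every write is the cost that an in-memory implementation pays for fast queries in both directions.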

Posted September 24, 2013

On Thursday, September 19, at 11 am PT / 2 pm ET, DBTA will present a special webcast to provide a deeper understanding of the key technologies that are changing the database world - from NoSQL and NewSQL databases, to in-memory processing and virtualization. Sponsored by Progress Software and TransLattice, the webcast, titled "New Technologies Revolutionizing the Database World," will provide in-depth information about game-changing database technologies.

Posted September 18, 2013

RainStor, a provider of an enterprise database for managing and analyzing all historical data, has introduced RainStor FastForward, a new product that enables customers to reinstate data from Teradata tape archives (also known as BAR, for Backup, Archive and Restore) and move it to RainStor for query. The new RainStor FastForward product resolves a pressing challenge for Teradata customers that need to archive their Teradata warehouse data to offline tape, which can make it difficult to access and query that data when business and regulatory users require it, Deirdre Mahon, vice president of marketing, RainStor, explained in an interview.

Posted September 17, 2013

If you look at what is really going on in the big data space, it's all about inexpensive open source solutions that are facilitating the modernization of data centers and data warehouses, and at the center of this universe is Hadoop. In the evolution of the big data market, open source is playing a seminal role as the "disruptive technology" challenging the status quo. Additionally, organizations large and small are leveraging these solutions, often based on inexpensive hardware and memory platforms, in the cloud or on premises.

Posted September 11, 2013

Data analytics, long the obscure pursuit of analysts and quants toiling in the depths of enterprises, has emerged as the must-have strategy of organizations across the globe. Competitive edge comes not only from deciphering the whims of customers and markets but also from being able to predict shifts before they happen. Fueling the move of data analytics out of back offices and into the forefront of corporate strategy sessions is big data, now made enterprise-ready through technology platforms such as Hadoop and MapReduce. The Hadoop framework is seen as the most efficient file system and solution set to store and package big datasets for consumption by the enterprise, and MapReduce is the construct used to perform analysis over Hadoop files.
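The MapReduce construct mentioned above can be sketched without a cluster: a map phase emits key/value pairs from each input record, a shuffle groups them by key, and a reduce phase aggregates each group. The word-count example below runs the same three phases in plain Python; in Hadoop the input records would come from files in HDFS and the phases would run in parallel across machines.

```python
# Minimal MapReduce sketch: map -> shuffle -> reduce, run locally.
from collections import defaultdict

def map_phase(records, mapper):
    """Map: apply the mapper to every record, collecting (key, value) pairs."""
    pairs = []
    for record in records:
        pairs.extend(mapper(record))
    return pairs

def shuffle(pairs):
    """Shuffle: group all emitted values by their key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    """Reduce: aggregate each key's list of values to a single result."""
    return {key: reducer(values) for key, values in groups.items()}

# Word count: each line maps to (word, 1) pairs; reduce sums the ones.
lines = ["big data big", "data warehouse"]
mapper = lambda line: [(word, 1) for word in line.split()]
counts = reduce_phase(shuffle(map_phase(lines, mapper)), sum)
```

The appeal of the model is that the mapper and reducer are the only problem-specific pieces; the framework owns partitioning, shuffling, and fault tolerance.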

Posted August 21, 2013

Attunity Ltd., a provider of information availability software solutions, has formed a new partnership with HP and announced the availability of its enhanced Attunity Click-2-Load solution for HP Vertica. The solution provides automation and optimized technologies that accelerate data loading to HP Vertica from disparate sources, and then maintains the changed data continuously and efficiently.

Posted August 13, 2013

Database Trends and Applications has launched a special "Who to See at Oracle OpenWorld" section online where you can find information on what to expect at this year's conference and premium vendors that offer products and services to serve your needs as an Oracle technology professional.

Posted August 09, 2013

A new Oracle In-Memory Application - Oracle In-Memory Logistics Command Center - has been launched that enables customers to improve scenario management in order to increase supply chain resiliency and agility, decrease costs and enhance service levels. With supply chains and their associated logistics networks becoming increasingly complex, strategic and operational, Oracle says scenario management is now central to creating an effective logistics network. Oracle In-Memory Logistics Command Center leverages the performance capabilities of Oracle Engineered Systems, including Oracle Exadata Database Machine and Oracle Exalogic Elastic Cloud, which can manage the large and complex data sets central to in-memory applications.

Posted August 07, 2013

Noetix Corp., a provider of business intelligence (BI) software and services for enterprise applications, has introduced Noetix Analytics 5.3, with improvements including new data marts, performance enhancements, and a streamlined upgrade process. Noetix Analytics is a packaged data warehouse solution designed to provide business users with strategic reporting for trending and analysis based on information from multiple data sources.

Posted August 07, 2013

In many ways, Hadoop is the most concrete technology underlying today's big data revolution, but it certainly does not satisfy those who want quick answers from their big data. Hadoop - at least Hadoop 1.0 - is a batch-oriented framework that allows for the economical execution of massively parallel workloads, but provides no capabilities for interactive or real-time execution.

Posted August 07, 2013

The Oracle database provides intriguing possibilities for the storing, manipulating and streaming of multimedia data in enterprise class environments. However, knowledge of why and how the Oracle database can be used for multimedia applications is essential if one is to justify and maximize the ROI.

Posted July 17, 2013

Database Trends and Applications introduces the inaugural "DBTA 100," a list of the companies that matter most in data. The past several years have transformed enterprise information management, creating challenges and opportunities for companies seeking to extract value from a sea of data assets. In response to this, established IT vendors as well as legions of newer solution providers have rushed to create the tools to do just that.

Posted June 27, 2013

Database Trends and Applications (DBTA) magazine has announced the inaugural "DBTA 100: The Companies That Matter Most in Data," a list saluting this year's companies in data and enterprise information management—from long-standing industry veterans to fast-growing startups tackling big data. "Beyond the explosion of interest surrounding big data, the past several years have transformed enterprise information management, creating both challenges and opportunities for companies seeking to protect, optimize, integrate, and extract actionable insight from a sea of data assets," remarked Thomas Hogan, group publisher of Database Trends and Applications.

Posted June 26, 2013

These are heady times for data products vendors and their enterprise customers. When business leaders talk about success these days, they often are alluding to a new-found appreciation for their data environments. It can even be said that the tech vendors that are making the biggest difference in today's business world are no longer software companies at all; rather, they are "data" companies, with all that implies. Enterprises are reaching out to vendors for help in navigating through the fast-moving, and often unforgiving, digital realm. The data vendors that are leading their respective markets are those that know how to provide the tools, techniques, and hand-holding needed to manage and sift through gigabytes', terabytes', and petabytes' worth of data to extract tiny but valuable nuggets of information to guide business leaders as to what they should do next.

Posted June 19, 2013

There is an emerging field of companies looking to take on the challenges presented by the roiling tide of big data. While their visions vary, each has identified a market need that it believes its technology uniquely addresses. Here, DBTA highlights the approaches of 10 companies we think are worth watching.

Posted June 13, 2013

Embarcadero Technologies gives 97% of the world's top 2000 companies the tools needed to address the biggest challenges in data management. Facing significant growth in complexity, diversity and volume of enterprise data, companies worldwide are increasingly turning to data governance as a strategic solution. Helping our customers manage this complexity, and close the "governance gap" has been a major driver of innovation in our products.

Posted June 03, 2013

David Jonker, senior director, Big Data Marketing, SAP, recently highlighted the results of a new big data survey, the "2013 Big Data Opportunities Survey." According to the research, contrary to the perception that big data only provides value when it is crunched on the type of large-scale clustering technologies used by web companies, many big data issues are being successfully addressed now by conventional technologies such as relational databases. But while many respondents believe their current technology is capable of helping them manage and capitalize on big data, they also are concerned about gaining faster access to their large datasets.

Posted May 29, 2013

SAP AG has announced the SAP HANA Enterprise Cloud service. The new offering makes it possible to run mission-critical SAP ERP, SAP CRM, SAP NetWeaver Business Warehouse, and new applications powered by the SAP HANA in-memory platform as a managed cloud service at elastic petabyte scale.

Posted May 29, 2013

Three things people need to think about in a big data implementation are persistence, context and access, John O'Brien, founder and principal, Radiant Advisors, told attendees during his keynote, "The Big Data Paradigm," at DBTA's Big Data Boot Camp. O'Brien's talk provided an overview of the technologies and issues that attendees would learn about during the conference which took place this week in New York City. Following the opening address, David Jonker, senior director, Big Data Marketing, SAP, highlighted the results of a new big data survey, the "2013 Big Data Opportunities Survey," which revealed a variety of practical approaches that organizations are adopting to manage and capitalize on big data.

Posted May 23, 2013

It seems that juggling is the most useful of all skills when embarking on a data warehousing project. During the discovery and analysis phase, the workload grows insanely large, like some mutant science fiction monster. Pressures to deliver can encourage rampant corner-cutting to move quickly, while the need to provide value urges caution in order not to throw out the proverbial baby with the bath water as the project speeds along. Change data capture is a glaring example of this necessary juggling and balancing.
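At its simplest, change data capture means detecting which source rows were inserted, updated, or deleted since the last load. The sketch below diffs two snapshots of a table keyed by primary key; this snapshot-comparison approach is only one (and the heaviest) CDC technique, since production tools typically read the database transaction log instead.

```python
# Minimal change data capture sketch: diff two snapshots of a source
# table (dicts keyed by primary key) and emit change events.
# Real CDC tools usually read the transaction log rather than diffing.

def capture_changes(previous, current):
    changes = []
    for key, row in current.items():
        if key not in previous:
            changes.append(("insert", key, row))
        elif previous[key] != row:
            changes.append(("update", key, row))
    for key in previous:
        if key not in current:
            changes.append(("delete", key, previous[key]))
    return changes

# Hypothetical snapshots of an orders table, before and after a load cycle.
before = {1: {"status": "open"}, 2: {"status": "open"}}
after = {1: {"status": "closed"}, 3: {"status": "open"}}
events = capture_changes(before, after)
```

Only the emitted events need to travel to the warehouse, which is the whole point: moving changes instead of reloading entire tables.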

Posted May 22, 2013

Key findings from a new study, "Big Data Opportunities," will be presented at Big Data Boot Camp at the Hilton New York. Big Data Boot Camp will kick off at 9 am on Tuesday, May 21, with a keynote from John O'Brien, founder and principal of Radiant Advisors, on the dynamics and current issues being faced in today's big data analytic implementations. Directly after the opening address, David Jonker, senior director of Big Data Marketing, SAP, will showcase the results of the new big data survey, which revealed a variety of practical approaches that organizations are adopting to manage and capitalize on big data. The study was conducted by Unisphere Research, a division of Information Today, Inc., and sponsored by SAP.

Posted May 16, 2013

The conference agenda as well as the list of speakers is now available for DBTA's Big Data Boot Camp, a deep dive designed to bring together thought leaders and practitioners who will provide insight on how to collect, manage, and act on big data. The conference will be held May 21-22 at the Hilton New York. SAP is the diamond sponsor, and Objectivity and MarkLogic are platinum sponsors of the two-day event.

Posted April 25, 2013

Data keeps growing, systems and servers keep sprawling, and users keep clamoring for more real-time access. The result of all this frenzy of activity is pressure for faster, more effective data integration that can deliver more expansive views of information, while still maintaining quality and integrity. Enterprise data and IT managers are responding in a variety of ways, looking to initiatives such as enterprise mashups, automation, virtualization, and cloud to pursue new paths to data integration. In the process, they are moving beyond the traditional means of integration they have relied on for years to pull data together.

Posted April 03, 2013

Attunity Ltd., a provider of information availability software solutions, released Attunity Replicate 2.1, a high-performance data delivery solution that adds improvements for data warehousing. Attunity Replicate's new performance enhancements support many data warehouses, including Amazon Redshift, EMC Greenplum and Teradata.

Posted March 13, 2013

SAP AG has introduced a new version of its Sybase IQ disk-based column store analytics server. The overriding theme of this new release, which will be generally available later in the first quarter, "is positioning IQ 16 to go from terabytes to petabytes," Dan Lahl, senior director of product marketing at SAP, tells 5 Minute Briefing. To accomplish this, IQ 16 provides enhancements in three critical areas.

Posted February 27, 2013

Hortonworks, a leading contributor to Apache Hadoop, has released Hortonworks Sandbox, a learning environment and on-ramp for anyone interested in learning, evaluating or using Apache Hadoop in the enterprise. The tool seeks to bridge the gap between people who want to learn Hadoop and the complexity of setting up a cluster, with an integrated environment that provides demos, videos, and tutorials.

Posted February 27, 2013

MarkLogic said it plans to deliver an enterprise-grade application and analytics software solution based on the new Intel Distribution for Apache Hadoop software. The Intel Distribution will be combined with the MarkLogic Enterprise NoSQL database to support real-time transactional and analytic applications.

Posted February 26, 2013

Hortonworks has announced that the Hortonworks Data Platform is now available for Windows in addition to Linux, enabling organizations to run Hadoop-based solutions natively on Windows. According to Hortonworks, making the Hortonworks Data Platform available for Windows is a necessary step in its strategy to broaden the reach of Apache Hadoop across the enterprise.

Posted February 25, 2013

Attunity is developing a solution for fast data loading into Amazon Redshift, AWS's new data warehouse in the cloud. The Attunity solution is expected to be available for customer preview in March 2013.

Posted February 21, 2013

Hortonworks, a contributor to Apache Hadoop, has submitted two new incubation projects to the Apache Software Foundation and also announced the launch of the new "Stinger Initiative." These three projects seek to address key enterprise requirements regarding Hadoop application security and performance.

Posted February 21, 2013

IBM reports a surge in mainframe sales in the most recent quarter, surpassing all previous quarters. This announcement was part of the company's release of quarterly and annual results. Overall, total quarterly revenue was down 1% from last year, and down 2% for the year.

Posted February 04, 2013

Actian Corp. and Pervasive Software Inc. have entered into a definitive merger agreement through which Actian will acquire all of Pervasive's outstanding shares for $9.20 per share. Actian products include Action Apps; Vectorwise, an analytical database; Ingres, an independent mission-critical OLTP database; and the Versant Object Database, which Actian added to its portfolio through a recent merger in which it acquired all the outstanding shares of Versant Corporation. According to the company, the deal values Pervasive at $161.9 million and will accelerate Actian's ability to deliver its vision of providing organizations with the capability to take action in real time as their business environment changes.

Posted January 31, 2013

Today's data warehouse environments are not keeping up with the explosive growth of data volume (or "big data") and the demand for real-time analytics. Fewer than one out of 10 respondents to a new survey say their data warehouse sites can deliver analysis in what they would consider a real-time timeframe. Nearly 75% of respondents believe that in-memory technology is important to enabling their organization to remain competitive in the future. Yet, almost as many also indicate they lack the in-memory skills to deliver even current business requirements. These are among the findings of a new survey of 323 data managers and professionals who are part of the Independent Oracle Users Group (IOUG). The survey was underwritten by SAP and conducted by Unisphere Research, a division of Information Today, Inc.

Posted January 29, 2013

Databases are hampered by a reliance on disk-based storage, a technology that has been in place for more than two decades. Even with the addition of memory caches and solid state drives, the model of relying on repeated access to the permanent information storage devices is still a bottleneck in capitalizing on today's "big data," according to a new survey of 323 data managers and professionals who are part of the IOUG. Nearly 75% of respondents believe that in-memory technology is important to enabling their organization to remain competitive in the future. Yet, almost as many also indicate they lack the in-memory skills to deliver even current business requirements. The research results are detailed in a new report, titled "Accelerating Enterprise Insights: 2013 IOUG In-Memory Strategies Survey."

Posted January 24, 2013
