Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources in a data warehouse for reporting and analytics include ETL (extract, transform, load), EAI (enterprise application integration), CDC (change data capture), data replication, data deduplication, compression, big data technologies such as Hadoop and MapReduce, and data warehouse appliances.
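The consolidation pipeline these technologies support can be sketched in a few lines. This is a purely illustrative sketch: the CSV layout and the in-memory list standing in for a warehouse table are hypothetical, not any vendor's product.

```python
# Minimal ETL (Extract, Transform, Load) sketch; all names are hypothetical.
import csv
import io

# Extract: read raw rows from a source system (here, an in-memory CSV).
raw = io.StringIO("id,amount\n1,10.5\n2,20.0\n")
rows = list(csv.DictReader(raw))

# Transform: cast types and derive a field suitable for the target schema.
transformed = [
    {"id": int(r["id"]),
     "amount": float(r["amount"]),
     "amount_cents": int(float(r["amount"]) * 100)}
    for r in rows
]

# Load: append into the target store (a list standing in for a warehouse table).
warehouse_table = []
warehouse_table.extend(transformed)

print(warehouse_table[0]["amount_cents"])
```

CDC and replication tools differ mainly in the extract step, pulling only changed rows from transaction logs rather than full snapshots.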



Data Warehousing Articles

Oracle announced that fiscal 2014 Q3 total revenues were up 4% to $9.3 billion. In constant currency, Oracle's Cloud Software subscriptions revenues grew 25% and its Engineered Systems revenue grew more than 30% in the quarter, said Oracle president and CFO Safra Catz in a statement released by the company.

Posted March 18, 2014

Cloudera has closed on a new round of funding for $160 million which will be used to further drive the enterprise adoption of and innovation in Hadoop and promote the enterprise data hub (EDH) market; support geographic expansion into Europe and Asia; expand its services and support capabilities; and scale the field and engineering organizations. The funding round was led by T. Rowe Price, and included an investment by Google Ventures and an affiliate of MSD Capital, L.P., the private investment firm for Michael S. Dell and his family.

Posted March 18, 2014

You have until April 11 to take advantage of special pricing for DBTA's Data Summit, which will take place at the New York Hilton Midtown, from May 12 to May 14. Data Summit provides an intensive 2-day immersion into critical technologies for becoming a data-driven enterprise, and IT practitioners and business stakeholders alike will benefit.

Posted March 18, 2014

Pivotal has introduced Pivotal HD 2.0 and Pivotal GemFire XD, which along with the HAWQ query engine, form the foundation for the Business Data Lake architecture, a big data application framework for the enterprise.

Posted March 17, 2014

Today, businesses depend more and more critically on their data infrastructure. If the underlying database systems are not available, manufacturing floors cannot operate, stock exchanges cannot trade, retail stores cannot sell, banks cannot serve customers, mobile phone users cannot place calls, stadiums cannot host sports games, and gyms cannot verify their subscribers' identities. Here is a look at some of the trends and how they will impact data management professionals.

Posted March 17, 2014

Big data brings some fundamental challenges. The biggest is that big data cannot be analyzed using standard analytical software. A new technology called "textual disambiguation" allows the context of raw unstructured text to be specifically determined.

Posted March 14, 2014

As the deployment of big data analytics becomes part of a business' "must-do" list, cloud platforms offer the scalability, flexibility, and on-demand ability to handle workload spikes that are hard to come by with on-premises systems. But, as is the case with every system and data environment, it takes careful planning to achieve the desired business value—which is to enable unprecedented opportunities to better understand and engage with key customer segments and markets.

Posted March 12, 2014

Data Summit will take place at the New York Hilton Midtown, from May 12 to May 14. The advance program is now available and registration is open with a special early bird registration rate when you register before April 11, 2014.

Posted March 12, 2014

To recognize the best information management solutions in the marketplace, Database Trends and Applications has launched the DBTA Readers' Choice Awards, a program in which the winners will be selected by the experts whose opinions count above all others - you. The nominations period will conclude on March 28, 2014, and voting will begin on April 11.

Posted March 12, 2014

To simplify and strengthen enterprise application integration (EAI) capabilities, the latest release of Kourier Integrator, Kore Technologies' flagship data management product providing both extract, transform, and load (ETL) and EAI capabilities, offers simpler inbound integration of third-party systems and databases to U2 applications.

Posted March 12, 2014

SAP underscored its strategy, which focuses squarely on HANA and the cloud, in a webcast presented by Jonathan Becher, chief marketing officer of SAP, and Vishal Sikka, member of the Executive Board of SAP AG, Products & Innovation. The company rolled out new and enhanced offerings for the SAP HANA Cloud Platform, the SAP HANA Marketplace, HANA's new pricing, innovations on top of HANA, and also announced that HANA had broken the Guinness World Record for the largest data warehouse ever built - 12.1PB.

Posted March 05, 2014

Enterprise data warehouses aren't going away anytime soon. Despite claims that Hadoop will usurp the role of data warehousing, Hadoop needs data warehouses, just as data warehouses need Hadoop. However, making the leap from established data warehouse environments—the kind most companies still have, based on extract, transform and load (ETL) inputs with a relational data store and query and analysis tools—to the big data realm isn't a quick hop.

Posted February 26, 2014

The latest release of Embarcadero's portfolio of database tools adds first-class support for Teradata in addition to updating support for the latest releases of the major RDBMSs. Overall, a key theme for the XE5 releases is an emphasis on scale, as big data, with big models and big applications, requires close collaboration across big teams, said Henry Olson, Embarcadero director of product management.

Posted February 26, 2014

In order to be effective, big data analytics must present a clear and consistent picture of what's happening in and around the enterprise. Does a new generation of databases and platforms offer the scalability and velocity required for cloud-based, big data-based applications—or will more traditional relational databases come roaring back for all levels of big data challenges?

Posted February 26, 2014

In the last several years, there has been an explosion in the array of choices for not only managing relational and unstructured data but also protecting it and extracting value from it. To shine a spotlight on the best data management offerings, DBTA will soon open nominations for our first-ever Readers' Choice Awards.

Posted February 26, 2014

To say that big data is the sum of its volume, variety, and velocity is a lot like saying that nuclear power is simply and irreducibly a function of fission, decay, and fusion. It's to ignore the societal and economic factors that—for good or ill—ultimately determine how big data gets used. In other words, if we want to understand how big data has changed data integration, we need to consider the ways in which we're using—or in which we want to use—big data.

Posted February 21, 2014

DBTA is seeking speakers who possess unique insight into leading technologies, and experience with successful IT and business strategies for the Data Summit conference in New York City, May 12-14, 2014. The deadline to submit your proposal is January 31, 2014.

Posted January 20, 2014

Changes and enhancements to solutions are hard, even under the best of circumstances. It is not unusual that, as operational changes roll out into production, the business intelligence area is left uninformed, suggesting that data warehouses and business intelligence should be categorized according to the old comedian Rodney Dangerfield's line, because they both "get no respect."

Posted January 07, 2014

The data-driven demands on organizations have never been greater. Two of the most pressing concerns that organizations face today are the need to provide analytic access to newer data types such as machine-generated data, documents and graphics, and the need to control the cost of information management for growing data stores. DBTA's new list of Trend-Setting Products in Data for 2014 highlights the products, platforms, and services that seek to provide organizations with the tools necessary to address rapidly changing market requirements.

Posted December 20, 2013

A new rapid-deployment solution from SAP aims to address the issue of big data storage access and analysis which companies are grappling with as they attempt to balance what information needs to be accessible in real time and what can be stored for historical analysis. The SAP NetWeaver Business Warehouse (BW) Near-Line Storage rapid-deployment solution facilitates seamless data transfer between the business warehouse and the near-line storage that holds historical data. This, the company says, helps limit a business' burden around housing volumes of big data while also creating an online and accelerated retrieval system with the near-line storage.

Posted December 18, 2013

As unstructured data overtakes structured data within enterprises, the coming year will see the start of a reassessment of how data is architected, stored, and queried in enterprises. To meet this challenge, new technologies and solutions have already begun to transform data management within enterprises.

Posted December 17, 2013

Cloudera and Informatica have partnered to create a new Data Warehouse Optimization (DWO) reference architecture specifically for Enterprise Data Hub deployments with the goal of helping reduce data warehouse costs and increasing productivity. Cloudera also announced the public beta offering of Cloudera Enterprise 5, which delivers the new Enterprise Data Hub and enhancements to related products.

Posted October 31, 2013

If you look at what is really going on in the big data space, it is all about inexpensive open source solutions that are facilitating the modernization of data centers and data warehouses, and at the center of this universe is Hadoop. In the evolution of the big data market, open source is playing a seminal role as the "disruptive technology" challenging the status quo. Additionally, organizations large and small are leveraging these solutions, often based on inexpensive hardware and memory platforms, in the cloud or on premises.

Posted October 24, 2013

At its Partners User Group Conference in Dallas, Teradata made a range of product and partner announcements, including the introduction of the Teradata Data Warehouse Appliance 2750, the availability of the Teradata Cloud, and new support for JavaScript Object Notation (JSON) data.

Posted October 22, 2013

Oracle CEO Larry Ellison made three key announcements in his opening keynote at Oracle OpenWorld, the company's annual conference for customers and partners in San Francisco. Ellison unveiled the Oracle Database In-Memory Option to Oracle Database 12c which he said speeds up query processing by "orders of magnitude," the M6 Big Memory Machine, and the new Oracle Database Backup Logging Recovery Appliance. Explaining Oracle's goals with the new in-memory option, Ellison noted that in the past there have been row-format databases, and column-format databases that are intended to speed up query processing. "We had a better idea. What if we store data in both formats simultaneously?"
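The dual-format idea Ellison described can be illustrated conceptually. The sketch below is not Oracle's implementation; it only shows the general trade-off of keeping the same table both row-wise (fast for transactional lookups) and column-wise (fast for analytic scans), using hypothetical data.

```python
# Conceptual sketch of dual-format storage: the same table kept in a
# row format and a column format simultaneously.
rows = [
    {"id": 1, "region": "east", "sales": 100},
    {"id": 2, "region": "west", "sales": 250},
    {"id": 3, "region": "east", "sales": 175},
]

# Column format: one array per column, derived from the same data.
columns = {key: [r[key] for r in rows] for key in rows[0]}

# An OLTP-style lookup touches a single row...
row = rows[1]

# ...while an analytic aggregate scans just one column array,
# which is where columnar layouts speed up query processing.
total_sales = sum(columns["sales"])
print(total_sales)
```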

Posted October 02, 2013

RainStor, a provider of an enterprise database for managing and analyzing all historical data, has introduced RainStor FastForward, a new product that enables customers to reinstate data from Teradata tape archives (also known as BAR, for Backup, Archive and Restore) and move it to RainStor for query. The new RainStor FastForward product resolves a pressing challenge for Teradata customers that need to archive their Teradata warehouse data to offline tape, which can make it difficult to access and query that data when business and regulatory users require it, Deirdre Mahon, vice president of marketing, RainStor, explained in an interview.

Posted September 26, 2013

Attunity Ltd., a provider of information availability software solutions, has released a new version of its data replication software intended to address requirements for big data analytics, business intelligence, business continuity and disaster recovery initiatives. Addressing expanding use cases for the solution, Attunity Replicate 3.0 is engineered to provide secure data transfer over long distances such as wide area networks (WANs), the cloud and satellite connections, said Lawrence Schwartz, vice president of marketing at Attunity, in an interview.

Posted September 25, 2013

On Thursday, September 19, at 11 am PT / 2 pm ET, DBTA will present a special webcast to provide a deeper understanding of the key technologies that are changing the database world - from NoSQL and NewSQL databases, to in-memory processing and virtualization. Sponsored by Progress Software and TransLattice, the webcast titled, "New Technologies Revolutionizing the Database World," will provide in-depth information about game-changing database technologies.

Posted September 18, 2013

Data analytics, long the obscure pursuit of analysts and quants toiling in the depths of enterprises, has emerged as the must-have strategy of organizations across the globe. Competitive edge not only comes from deciphering the whims of customers and markets but also being able to predict shifts before they happen. Fueling the move of data analytics out of back offices and into the forefront of corporate strategy sessions is big data, now made enterprise-ready through technology platforms such as Hadoop and MapReduce. The Hadoop framework is seen as the most efficient file system and solution set to store and package big datasets for consumption by the enterprise, and MapReduce is the construct used to perform analysis over Hadoop files.
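The MapReduce construct mentioned above can be sketched in plain Python, with no Hadoop cluster involved. The map, shuffle, and reduce phases below mirror the programming model conceptually, using a hypothetical word-count example.

```python
# Conceptual MapReduce word count in plain Python.
from collections import defaultdict

documents = ["big data big insight", "data drives insight"]

# Map: emit (key, value) pairs - here, (word, 1) - from each input record.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group the emitted values by key.
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce: combine the values for each key - here, summing the counts.
word_counts = {word: sum(counts) for word, counts in grouped.items()}
print(word_counts["data"])
```

In a real Hadoop deployment the map and reduce phases run in parallel across the cluster, with the framework handling the shuffle over the distributed file system.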

Posted August 21, 2013

Attunity Ltd., a provider of information availability software solutions, has formed a new partnership with HP and announced the availability of its enhanced Attunity Click-2-Load solution for HP Vertica. The solution provides automation and optimized technologies that accelerate data loading to HP Vertica from disparate sources, and then maintains the changed data continuously and efficiently.

Posted August 13, 2013

Database Trends and Applications has launched a special "Who to See at Oracle OpenWorld" section online where you can find information on what to expect at this year's conference and premium vendors that offer products and services to serve your needs as an Oracle technology professional.

Posted August 09, 2013

A new Oracle In-Memory Application - Oracle In-Memory Logistics Command Center - has been launched that enables customers to improve scenario management in order to increase supply chain resiliency and agility, decrease costs and enhance service levels. With supply chains and their associated logistics networks becoming increasingly complex, strategic and operational, Oracle says scenario management is now central to creating an effective logistics network. Oracle In-Memory Logistics Command Center leverages the performance capabilities of Oracle Engineered Systems, including Oracle Exadata Database Machine and Oracle Exalogic Elastic Cloud, which can manage the large and complex data sets central to in-memory applications.

Posted August 07, 2013

Noetix Corp., a provider of business intelligence (BI) software and services for enterprise applications, has introduced Noetix Analytics 5.3, with improvements including new data marts, performance enhancements, and a streamlined upgrade process. Noetix Analytics is a packaged data warehouse solution designed to provide business users with strategic reporting for trending and analysis based on information from multiple data sources.

Posted August 07, 2013

In many ways, Hadoop is the most concrete technology underlying today's big data revolution, but it certainly does not satisfy those who want quick answers from their big data. Hadoop - at least Hadoop 1.0 - is a batch-oriented framework that allows for the economical execution of massively parallel workloads, but provides no capabilities for interactive or real-time execution.

Posted August 07, 2013

The Oracle database provides intriguing possibilities for the storing, manipulating and streaming of multimedia data in enterprise class environments. However, knowledge of why and how the Oracle database can be used for multimedia applications is essential if one is to justify and maximize the ROI.

Posted July 17, 2013

Database Trends and Applications introduces the inaugural "DBTA 100," a list of the companies that matter most in data. The past several years have transformed enterprise information management, creating challenges and opportunities for companies seeking to extract value from a sea of data assets. In response to this, established IT vendors as well as legions of newer solution providers have rushed to create the tools to do just that.

Posted June 27, 2013

Database Trends and Applications (DBTA) magazine has announced the inaugural "DBTA 100: The Companies That Matter Most in Data," a list saluting this year's companies in data and enterprise information management—from long-standing industry veterans to fast-growing startups tackling big data. "Beyond the explosion of interest surrounding big data, the past several years have transformed enterprise information management, creating both challenges and opportunities for companies seeking to protect, optimize, integrate, and extract actionable insight from a sea of data assets," remarked Thomas Hogan, group publisher of Database Trends and Applications.

Posted June 26, 2013
