Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources in a Data Warehouse for Reporting and Analytics include ETL (Extract, Transform, Load), EAI (Enterprise Application Integration), CDC (Change Data Capture), Data Replication, Data Deduplication, Compression, Big Data technologies such as Hadoop and MapReduce, and Data Warehouse Appliances.
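
As a rough illustration of the ETL pattern named above, here is a minimal Python sketch that extracts rows from a source file, applies a simple transformation, and loads them into a warehouse-style table. The file name, column names, and SQLite target are hypothetical stand-ins, not a reference to any particular product.

# Minimal ETL sketch (hypothetical source file and schema), standard library only.
import csv
import sqlite3

def extract(path):
    # Extract: read raw rows from a source CSV file
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Transform: normalize fields before loading
    for row in rows:
        yield (row["order_id"], row["customer"].strip().upper(), float(row["amount"]))

def load(rows, conn):
    # Load: write the cleaned rows into the warehouse table
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect("warehouse.db")  # stand-in for a real data warehouse
load(transform(extract("orders.csv")), conn)

A production pipeline would load incrementally (the role CDC plays) rather than reloading everything on each run.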



Data Warehousing Articles

As analytics continues to play a larger role in the enterprise, the need to leverage and protect data looms larger. According to IDC, the big data and analytics market will reach $125 billion worldwide in 2015. Here are 10 predictions from industry experts about data and analytics in 2015.

Posted December 19, 2014

In its fourth big data-related acquisition this year, Teradata announced it has acquired RainStor, a privately held company specializing in online big data archiving on Hadoop. RainStor's technology offers three key advantages, explains Chris Twogood, vice president of products and services at Teradata: it enables extreme data compression, compressing data from 10x to 40x; RainStor data is immutable, which is important for compliance and security regulations; and it is all accessible by SQL.

Posted December 18, 2014

SAP Business One, version for SAP HANA, running on AWS combines the flexibility of the cloud with powerful in-memory technology from SAP, helping companies of all sizes streamline their overall business execution through accessible, easy-to-use, real-time data that provides clear insights. According to SAP, this AWS pay-as-you-go, subscription-based model will help reduce total cost of ownership for the solution.

Posted December 17, 2014

2015 is going to be a big year for big data in the enterprise, according to Oracle. Neil Mendelson, Oracle vice president of big data and advanced analytics, shared Oracle's "Top 7" big data predictions for 2015. "The technology is moving very quickly and it is getting to the point where a broader set of people can get into it - not just because it is affordable, but because they no longer require specialized skills in order to take advantage of it," he said.

Posted December 17, 2014

Data is increasingly being recognized as a rich resource flowing through organizations from a continually growing range of sources. But to realize its full potential, this data must be accessed by an array of users to support both real-time decision making and historical analysis, integrated with other information, and still kept safe from hackers and others with malicious intent. Fortunately, leading vendors are developing products and services to help. Here, DBTA presents the list of Trend-Setting Products in Data and Information Management for 2015.

Posted December 17, 2014

With the latest update to the Oracle Database Appliance, Oracle has built in the notion of a "dev/test appliance," according to Sohan DeMel, vice president, product strategy and business development, Oracle. According to DeMel, Oracle has found that customers use five, six, or as many as seven times as many systems for test and development because there are so many different aspects to testing, such as integration testing, user acceptance testing, and stress testing. While dev/test capability is not new to the market, DeMel points out that what is different about the Database Appliance approach is that Oracle has essentially commoditized the capability, providing it as a core platform feature rather than something it up-sells and charges customers more for.

Posted December 10, 2014

There is still time to submit a speaking proposal for DBTA's Data Summit 2015, which will take place at the New York Hilton Midtown, May 11-13, 2015.

Posted December 08, 2014

Attunity Ltd. has acquired BIReady's data warehouse automation technology, a move that Attunity says will enable it to support enterprises that need an automated, end-to-end data warehousing solution for mission-critical data initiatives. The technology was acquired for approximately $1 million in a combination of cash and Attunity shares, with an earn-out potential of up to $375,000 in cash over the next two years.

Posted December 02, 2014

To help IT and business stakeholders take action to benefit from the emerging technologies and trends in information management, Database Trends and Applications has just published the second annual Big Data Sourcebook, a free resource.

Posted November 25, 2014

The call for speakers for Data Summit 2015 at the New York Hilton Midtown, May 11-13, 2015, is now officially open. The deadline for submitting proposals is December 5, 2014.

Posted November 19, 2014

SAP and BI provider Birst have formed a partnership to provide analytics in the cloud on the SAP HANA Cloud Platform. This collaboration intends to bring together the next-generation cloud platform from SAP with Birst's two-tier data architecture to provide instant access to an organization's data and help eliminate BI wait time.

Posted October 22, 2014

Today, many companies still have most of their transactional data in relational database management systems which support various business-critical applications, from order entry to financials. But in order to maintain processing performance, most companies limit the amount of data stored there, making it less useful for in-depth analysis. One alternative, according to a recent DBTA webcast presented by Bill Brunt, product manager, SharePlex, at Dell, and Unisphere Research analyst Elliot King, is moving the data to Hadoop to allow it to be inexpensively stored and analyzed for new business insight.
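
A minimal sketch of that kind of offload, assuming a purely illustrative source table and HDFS path: rows are exported from a relational table to a delimited file and copied into HDFS with the standard hdfs dfs -put command (a replication tool such as SharePlex would handle this continuously in the scenario described above).

# Sketch: export a relational table to a flat file and copy it into HDFS.
# Table name, file paths, and the sqlite3 connection are placeholders.
import csv
import sqlite3          # stand-in for the source RDBMS connection
import subprocess

def export_table(conn, table, out_path):
    # Dump every row of the source table to a CSV file
    cur = conn.execute("SELECT * FROM " + table)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([d[0] for d in cur.description])  # header row
        writer.writerows(cur)

def put_into_hdfs(local_path, hdfs_dir):
    # Copy the export into HDFS so Hadoop jobs can analyze it inexpensively
    subprocess.run(["hdfs", "dfs", "-put", "-f", local_path, hdfs_dir], check=True)

conn = sqlite3.connect("orders.db")
export_table(conn, "orders", "orders_export.csv")
put_into_hdfs("orders_export.csv", "/data/archive/orders/")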

Posted October 22, 2014

EMC and Pivotal have announced the Data Lake Hadoop Bundle 2.0, generally available today, which includes EMC's Data Computing Appliance (DCA), a high-performance big data computing appliance for deployment and scaling of Hadoop and advanced analytics; Isilon scale-out NAS (network attached storage); the Pivotal HD Hadoop distribution; and the Pivotal HAWQ parallel SQL query engine. The idea is to provide a turn-key offering that combines compute, analytics, and storage for customers building scale-out data lakes for enterprise predictive analytics.

Posted October 14, 2014

Teradata is introducing Teradata Loom 2.3, a platform that provides integrated metadata management, data lineage, and data wrangling for enterprise Hadoop. Teradata has also launched Teradata Cloud for Hadoop, a turnkey, full-service cloud environment, and a broad technology and marketing partnership with Cloudera. "Increasingly, customers want a one-stop shop for their data analytics needs," said Chris Twogood, vice president of products and services at Teradata.

Posted October 09, 2014

It is fair to say that relational theory is the only solid framework for establishing a rational expression of data that falls anywhere inside the boundaries of formal logic. As people continue to laud the "death of relational" by coming up with one or another "new" physical implementation of coding or data engines, whether object-oriented, XML, columnar, or anything else one might name, the primary shortcoming is that these are physical implementations that avoid having any formalized logic underpinning them.

Posted October 08, 2014

The managers and professionals who dedicate their careers to fundamental database administration are a shrinking pool. Many are retiring, while others have career aspirations outside of the IT department. The data profession itself is splintering into an array of new specialties and tasks—away from database administration and programming and toward higher-level data science and business consulting tasks.

Posted October 08, 2014

Cisco brought the fifth annual Data Virtualization Day to the Waldorf Astoria in New York City to share details about advancements coming in Cisco Information Server 7.0, the advantages of data virtualization, and the importance of the network. A key component of the 7.0 release of Cisco Information Server, which will ship next month, is the Business Directory, which will support greater access to data among more users on a self-service basis.

Posted October 01, 2014

Ron Bodkin founded Think Big Analytics to help organizations gain value from big data. Before that, he was vice president of engineering at Quantcast, where he led the data science and engineering teams deploying Hadoop and NoSQL for batch and real-time decision making. In this interview, Bodkin, who is CEO of Think Big, now a Teradata company, discusses the challenges organizations face in getting big data projects off the ground and what they need to consider when they embark on projects to leverage data from the Internet of Things and social media.

Posted September 25, 2014

Many in the industry have begun to look to data lakes and Hadoop as the future for data storage. To help shed light on the data lake approach, the pros and cons of this data repository were considered in a recent Unisphere webcast presented by Peter Evans, BI and analytics product evangelist and product technologist consultant, Dell Software; and Elliot King, Unisphere Research analyst.

Posted September 25, 2014

Just after Oracle announced that Larry Ellison would take on the role of executive chairman and CTO, and be succeeded as CEO by Safra Catz and Mark Hurd, the three Oracle executives hosted the company's fiscal 2015 Q1 earnings call. Ellison noted that at Oracle OpenWorld, which starts September 27, 2014, the company will be rolling out its new database cloud service with its new multitenant DBaaS offerings. "Our customers and ISVs can move any of their existing applications and databases to the Oracle Cloud with the push of a button," said Ellison.

Posted September 24, 2014

One of the major issues for companies trying to leverage big data is the length of time it takes for data to be analyzed. While being able to gather and store the data is essential, big data is useless if it cannot be analyzed. As data continues to grow, the processes for moving and analyzing it only become slower and more tedious.

Posted September 23, 2014

Broadening the range of tools at its disposal to help customers who are grappling with emerging technologies for leveraging unstructured data, Teradata this week announced its acquisition of Think Big Analytics, a consulting and solutions company that is focused exclusively on Hadoop and open source big data solutions.

Posted September 04, 2014

Why do tens of thousands of Oracle customers, partners, and consultants descend on Moscone Center in San Francisco for the annual Oracle OpenWorld conference? Despite the proliferation of conferences and online events competing for attention, OpenWorld remains the single, central place for all things Oracle. Here, DBTA presents the annual Who to See @ Oracle OpenWorld special section.

Posted August 27, 2014

With the opportunities and obstacles presented by developments such as big data, cloud, and mobility, the challenges of managing and extracting value from data have never been greater. At the same time, the array of technology options for storing, protecting, integrating, enhancing and analyzing data has exploded. To help add perspective and acknowledge the products and services that have been deemed to provide unique value to customers, DBTA created this new competition. Here are the winners of the 2014 DBTA Readers' Choice Awards.

Posted August 27, 2014

The Internet of Things (IoT) has the potential to transform how and when decisions are made throughout business and our daily lives, but only if that data can be processed and analyzed effectively. Join DBTA for a special webcast featuring Robert Geiger, vice president, Products, at TransLattice; Michael Hummel, CTO of ParStream; and Hannah Smalltree, director of Treasure Data, to learn about the key technologies and practices shaping the future. This webcast takes place August 26 at 2 pm ET/ 11 am PT.

Posted August 21, 2014

Violin Memory, a provider of all-flash storage arrays and appliances delivering application solutions for the enterprise, has introduced data deduplication and compression capabilities in its Concerto 2200 solution. According to Violin Memory, the new Concerto 2200 array update with inline deduplication and compression gives customers maximum storage efficiency, with deduplication rates commonly between 6:1 and 10:1.
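
As a toy illustration of the general block-level technique (not Violin Memory's proprietary inline implementation), the Python sketch below stores each fixed-size block only once, keyed by its content hash, compresses the unique blocks, and reports the resulting reduction ratio.

# Toy content-hash deduplication plus compression; illustration only.
import hashlib
import zlib

BLOCK_SIZE = 4096
store = {}   # content hash -> compressed unique block
index = []   # ordered hashes, enough to reconstruct the original stream

def ingest(data):
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:              # store each unique block once
            store[digest] = zlib.compress(block)
        index.append(digest)

data = b"highly repetitive sample data " * 100000
ingest(data)
physical = sum(len(b) for b in store.values())
print("reduction ratio: %.1f:1" % (len(data) / physical))

Real-world ratios depend heavily on how redundant the workload actually is, which is why vendors quote ranges such as 6:1 to 10:1.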

Posted August 19, 2014

2014 is turning out to be a banner year for Hadoop. The big data giant continues to expand and evolve as big data technology and big data analytics become more mainstream. Here are six key points that demonstrate the advances being made.

Posted August 19, 2014

A healthy data warehouse environment can self-correct to accomplish the right things. And that self-correction is crucial, for no process is likely to be perfect from the start.

Posted August 05, 2014

The pioneers of big data, such as Google, Amazon, and eBay, generated a "data exhaust" from their core operations that was more than sufficient to allow them to create data-driven process automation. But, for smaller enterprises, data might be the scarcest commodity. Hence, the emergence of data marketplaces.

Posted August 05, 2014

Programs that read database data can access numerous rows and are therefore susceptible to concurrency problems. To get around this issue, most major RDBMS products support reading through locks, also known as "dirty read" or "uncommitted read" access. When might you want to consider using dirty reads in your applications?
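
As a hedged illustration (the exact syntax varies by product, and the connection and table below are assumed to exist), a report routine might request uncommitted-read behavior on an existing DB-API connection like this.

# Sketch: requesting "dirty read" behavior before a long-running report query.
# The ANSI isolation statement shown is accepted by SQL Server and MySQL;
# DB2 instead appends WITH UR to the query, and SQL Server also offers the
# WITH (NOLOCK) table hint. The conn object and sales table are assumed.
def run_report_with_dirty_reads(conn):
    cur = conn.cursor()
    # At this level no read locks are taken, so the report does not block
    # writers, but it may see rows that are later rolled back.
    cur.execute("SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED")
    cur.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
    return cur.fetchall()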

Posted August 05, 2014

As demand for IT services and data volumes grow, so do the challenges with managing databases. For Oracle's vast and growing network of independent software vendors, value-added resellers, systems integrators, and consultants, these are fast-changing times. With Oracle OpenWorld rapidly approaching, DBTA examines Oracle's current product line-up and evolving ecosystem.

Posted August 05, 2014

Quest International Users Group (Quest) supports JD Edwards, PeopleSoft, Fusion, and many other Oracle edge application customers, and comprises 55,000 members in 95 countries. Throughout Oracle OpenWorld 2014, you'll find members of the Quest team speaking at breakout sessions, hosting SIG meetings, and networking with members, Oracle personnel, and other customers. We will be answering questions and sharing information on how our members use and work with their Oracle products.

Posted August 04, 2014

Members of the Independent Oracle Users Group (IOUG), which represents the independent voice of Oracle technology and database professionals, will be out in force at OpenWorld 2014—presenting more than 40 sessions on the topics you want to learn about most.

Posted August 04, 2014

The converging forces of open source, with its rapid crowd-sourced innovation, cloud, with its unlimited capacity and on-demand deployment options, and NoSQL database technologies, with their ability to handle unstructured data, are helping companies address the new challenges and opportunities presented by big data. Here are the winners of the 2014 DBTA Readers' Choice Awards for Best Big Data Solution.

Posted August 04, 2014

Data replication is used as part of many different enterprise scenarios. Whether companies need to share information as part of their BI and reporting processes, support high availability and disaster recovery, or arrange a no-downtime migration, data replication can help them achieve their goals. Here are the winners of the 2014 DBTA Readers' Choice Awards for Best Data Replication Solution.

Posted August 04, 2014

Data integration is critical to many organizational initiatives such as business intelligence, sales and marketing, customer service, R&D, and engineering, to name a few. Whether seeking to achieve a comprehensive view of data across disparate relational database management systems, or across unstructured data repositories and the cloud, the goal for many companies is the same—to help reduce costs and drive better decision making to fuel business success. Here are the winners of the 2014 DBTA Readers' Choice Awards for Best Data Integration Solution (Overall).

Posted August 04, 2014

The staggering variety of data—with much of it unstructured, including business documents, presentations, emails, log files and social media data—means that this data does not fit neatly into the rows and columns of relational database management systems. But Hadoop, which is an open source technology for storing and processing data that runs on industry-standard hardware, embraces this mixed bag of data types and enables companies to store and analyze data sets with no limits in size. Here are the winners of the 2014 DBTA Readers' Choice Awards for Best Hadoop Solution.

Posted August 04, 2014
