Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources into a data warehouse for reporting and analytics include ETL (extract, transform, load), EAI (enterprise application integration), CDC (change data capture), data replication, data deduplication, compression, big data technologies such as Hadoop and MapReduce, and data warehouse appliances.
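The ETL pattern named above can be illustrated with a minimal sketch: extract rows from a source feed, transform them into the warehouse schema, and load them into a target table. All names here (the CSV feed, the `fact_orders` table) are hypothetical, for illustration only.

```python
# Minimal ETL sketch using only the standard library.
import csv
import io
import sqlite3

# Extract: a hypothetical CSV feed standing in for a source system.
source = io.StringIO("order_id,amount\n1,19.99\n2,5.50\n3,42.00\n")
rows = list(csv.DictReader(source))

# Transform: normalize types and derive a flag the warehouse schema needs.
transformed = [
    (int(r["order_id"]), float(r["amount"]), float(r["amount"]) >= 20.0)
    for r in rows
]

# Load: insert into the warehouse fact table (SQLite stands in for the target).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE fact_orders (order_id INTEGER, amount REAL, is_large INTEGER)")
db.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", transformed)

total = db.execute("SELECT SUM(amount) FROM fact_orders").fetchone()[0]
print(round(total, 2))  # 67.49
```

Real ETL tools add scheduling, error handling, and incremental (CDC-driven) loads on top of this same extract/transform/load skeleton.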



Data Warehousing Articles

InetSoft Technology, a provider of data mashup driven dashboard and reporting solutions, and Management Systems International (MSI), experts in financial planning and reporting, have announced a joint solution for interactive dashboard reporting on top of a financial consolidation and information management platform. The joint solution is intended to enable multinational firms using diverse ERP and financial systems to be able to visually understand and explore their financial data in order to manage their financial operations more efficiently.

Posted August 09, 2011

Oracle's S. Ramakrishnan, group vice president and general manager for Oracle Financial Services Analytical Applications, was in New York last week to provide an update on how financial services institutions are leveraging tailored technology from Oracle - including the recently announced Oracle Financial Services Data Warehouse - to manage the complex information needed to compete profitably and effectively address stringent regulatory requirements.

Posted August 04, 2011

Expanding its existing product portfolio, Informatica Corporation now offers Universal Data Replication, giving customers more options to meet their business continuity, big data and operational data integration needs. A part of the Informatica Platform, Informatica's new data replication technology includes Informatica Fast Clone which automates the cloning of application data and Informatica Data Replication which manages the capture, routing and delivery of high-volume transaction data across diverse systems in real time with minimal source system impact.

Posted August 02, 2011

Oracle has introduced the Oracle Exadata Storage Expansion Rack to offer customers a cost-effective way to add storage to an Oracle Exadata Database Machine. "There are customers, earlier Exadata Database Machine customers, that have now started to fill up the disks that they have on the Database Machine and they are starting to look for ways to expand their storage capacity, and so this is going to be really welcome for them," says Tim Shetler, vice president of Product Management, Oracle.

Posted July 27, 2011

The Oracle Applications Users Group (OAUG), the world's largest knowledge base for Oracle Applications users, is launching the OAUG Educational Series 2011, a virtual learning series offered to OAUG members from Aug. 8-19, featuring the most popular presentations from the COLLABORATE 11 - OAUG Forum.

Posted July 25, 2011

Dataguise, a provider of enterprise security intelligence solutions, has announced a high performing database cloning privacy solution to support test, development and analytic uses in data warehousing environments. According to Dataguise, its sensitive data discovery and masking solutions complement NetApp's solution for rapid cloning of large Oracle data sets to enable efficient and secure distribution when running on the Cisco Unified Computing System (UCS) platform.

Posted July 19, 2011

Kognitio today launched a new family of data warehouse appliances designed to let companies choose the model best suited to their specific data analysis speed and volume needs. "We have always offered a software-only database prepackaged on industry-standard hardware as an appliance for a turnkey solution. What we are doing today is basically giving customers more choice," Sean Jackson, vice president of marketing, Kognitio, tells 5 Minute Briefing. Kognitio has named the three new appliance varieties Rapids, Rivers and Lakes - which the company says are metaphors for the variety of performance and capacity issues that customers must consider.

Posted June 29, 2011

Vertica, an HP company, has announced the availability of Vertica 5.0, the latest version of the MPP columnar Vertica Analytics Platform. Vertica 5.0 offers a new software development kit (SDK) that provides the ability to customize and insert customer- and use case-specific query logic into the Vertica database for fully parallel execution. "The three tenets of Vertica have been its speed, scalability and simplicity and we have continued to further the industry-leading aspects of all three of those tenets of the system," Scott Howser, vice president of product marketing, HP Vertica, tells 5 Minute Briefing.

Posted June 29, 2011

Representing a continued expansion of its big data analytics portfolio, IBM has introduced a new addition to the Netezza product family of analytics appliances that is designed to help organizations uncover patterns and trends from extremely large data sets. The appliance is the first to be delivered by IBM since it acquired Netezza in November 2010. According to IBM, using the new appliance, businesses can now more easily sift through petabytes of data, including banking and mobile phone transactions, insurance claims, electronic medical records, and sales information, and they can also analyze this information to reveal new trends on consumer sentiment, product safety, and sales and marketing effectiveness. "This new appliance takes the scalability to a completely new dimension," says Razi Raziuddin, senior director of product management at IBM Netezza.

Posted June 24, 2011

EMC Corporation, a provider of storage and infrastructure solutions, announced it will be shipping a data warehouse appliance that leverages the Apache Hadoop open-source software used for data-intensive distributed applications. The company's high-performance, data co-processing Hadoop appliance - the Greenplum HD Data Computing Appliance - integrates Hadoop with the EMC Greenplum Database, allowing the co-processing of both structured and unstructured data within a single solution. EMC also says the solution will run either Hadoop-based EMC Greenplum HD Community Edition or EMC Greenplum HD Enterprise Edition software.

Posted June 24, 2011

Composite Software has introduced Composite 6, a new version of its flagship data virtualization software that provides "big data" integration support for the Cloudera Distribution including Apache Hadoop (CDH), IBM Netezza and HP Vertica data sources. In addition, Composite 6, which is now completing beta test and will be commercially available in July, includes performance optimizations, cache enhancements, new data governance capabilities and ease-of-use features. "Data virtualization is emerging as an ideal solution for managing today's complex data integration challenges," says Jim Green, CEO for Composite Software.

Posted June 24, 2011

Is the day of reckoning for big data upon us? To many observers, the growth in data is nothing short of incomprehensible. Data is streaming into, out of, and through enterprises from a dizzying array of sources: transactions, remote devices, partner sites, websites, and nonstop user-generated content. Not only is this information driving databases to scale into the terabyte and petabyte range, but it arrives in an unfathomable range of formats as well, from traditional structured, relational data to message documents, graphics, videos, and audio files.

Posted June 22, 2011

Dell says it will resell RainStor's specialized data retention database to support solutions such as application retirement and retention of machine-generated data. RainStor, a data storage infrastructure software company, designs and sells technology that enables data to be deduplicated and compressed while remaining accessible online through standard SQL and BI tools. "The RainStor-Dell solution combines the object storage capabilities of the Dell DX with RainStor's online data retention (OLDR) repository," Ramon Chen, vice president of product management at RainStor, tells 5 Minute Briefing.
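The combination RainStor describes, deduplication plus compression while the data stays queryable, can be sketched in miniature: store each distinct value once in a dictionary, reference it by index, compress the compact encoding, and decode on read. This is a generic dictionary-encoding illustration, not RainStor's actual format.

```python
# Dictionary-encode repeated values, compress, then round-trip to show
# the data remains fully recoverable (and thus queryable) after storage.
import json
import zlib

records = [
    {"host": "web-01", "status": "OK"},
    {"host": "web-01", "status": "OK"},
    {"host": "web-02", "status": "OK"},
]

# Deduplicate: each distinct value is stored once; rows hold small indexes.
dictionary, encoded, index = [], [], {}
for rec in records:
    row = []
    for value in (rec["host"], rec["status"]):
        if value not in index:
            index[value] = len(dictionary)
            dictionary.append(value)
        row.append(index[value])
    encoded.append(row)

# Compress the already-compact encoded form.
payload = zlib.compress(json.dumps({"dict": dictionary, "rows": encoded}).encode())

# Decode path: reconstruct the original records from the stored payload.
decoded = json.loads(zlib.decompress(payload))
restored = [{"host": decoded["dict"][h], "status": decoded["dict"][s]}
            for h, s in decoded["rows"]]
print(restored == records)  # True
```

On highly repetitive machine-generated data, the dictionary step alone removes most of the redundancy before compression is even applied, which is why the two techniques pair well.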

Posted June 13, 2011

The Oracle Exadata Database Machine X2-2 and X2-8 with the Solaris option began shipping just this month. Now in its third generation, the Database Machine combines all the components to create what the company describes as the best platform for running the Oracle Database. Here, Tim Shetler, vice president of Product Management, Oracle, talks about the performance innovations that differentiate Oracle's offering, how customers are using the system today for business advantage, and also - what's ahead. "Performance is one of the really big advantages of Exadata. It is single-purpose around running the Oracle Database with the best performance and the best availability possible," says Shetler.

Posted May 31, 2011


Data quality, MDM, and data governance software vendor Ataccama Corporation announced that it has entered into a cooperative partnership with Teradata Corporation, a leader in data warehousing and enterprise analytics. The partnership is aimed at enabling joint customers to improve data quality within their data warehouses. Ataccama is a global software company with headquarters in Prague, and offices in Toronto, Stamford, London, and Munich, and the new partnership with Teradata represents a worldwide geographical relationship, according to Michal Klaus, CEO, Ataccama Corp.

Posted May 24, 2011

Alpine Data Labs, developer of Alpine Miner, a solution for big data predictive analytics, has received $7.5 million in Series A funding. In addition, after 15 months of product development, the company also announced its 10th production customer and its formal launch in the U.S. market. According to Anderson Wong, Alpine Labs CEO and co-founder, organizations cannot extract all the possible value from their data because it is growing faster than they can analyze it, they don't have enough resources with analytics expertise, and the tools they're using are too complex to get to the answers they need quickly.

Posted May 13, 2011

Big data provides new opportunities to improve customer care, unearth business insights, control operational costs, and in some cases, enable entirely new business models. By having access to larger and broader data sets, you can improve forecasts and projections for the business. A healthcare organization can conduct longitudinal analysis against years of data for patients treated for coronary attacks in order to improve care and speed time to recovery. A retailer can conduct deeper analysis on buying behavior during recessionary times if it has access to large data sets collected during the last economic downturn. Additionally, organizations across many sectors, such as communications, financial services and utilities, face significant regulatory and legal requirements for retaining and providing fast access to historical data for inquiries, audits and reporting.

Posted May 12, 2011

Full 360 Inc., a New York-based systems integrator, has introduced a new release of its elasticBI platform-as-a-service (PaaS). elasticBI pairs Jaspersoft 4, the open source BI application, with the Vertica Analytic Database in a PaaS offering that is tightly integrated via the Opscode Chef framework. The platform is offered by Full 360 on Amazon Web Services, the Amazon cloud. According to Full 360, elasticBI provides a complete BI-data warehouse platform that is accessible from a cost and technical perspective for small and mid-market companies, as well as enterprise departments.

Posted April 26, 2011

HP has updated its Information Management portfolio to enable organizations to reduce risk, increase efficiency and simplify the way they manage their business information. By bringing a holistic approach to information management, HP says the solutions and services will help executives harness the power of their information to make better decisions, manage for cost and compliance, and deliver the right information to the right users at the right time.

Posted April 12, 2011

EMC has announced three new additions to its EMC Greenplum Data Computing Appliance (DCA) line of products - the High Capacity DCA, the High Performance DCA, and the Data Integration Accelerator - as well as version 4.1 of Greenplum Database.

Posted April 12, 2011

With the annual Oracle users conference COLLABORATE about to begin, Andy Flower, president of the IOUG, spoke with 5 Minute Briefing about the IOUG's strong areas of focus in terms of overall conference content, and how the addition of the MySQL user base into the Oracle community is evolving. Citing a MySQL keynote, 75 sessions at COLLABORATE focused on MySQL, and a new MySQL Council headed by Sarah Novotny, Flower says the IOUG is making strides in giving voice to the MySQL community within the IOUG and setting a stage for positive interaction with Oracle.

Posted April 06, 2011

Revolution Analytics, a commercial provider of software and services based on the open source R project for statistical computing, and IBM Netezza announced they are teaming up to integrate Revolution R Enterprise and the IBM Netezza TwinFin Data Warehouse Appliance. According to the vendors, this will enable customers to directly leverage the capabilities of the open source R statistics language as they run high-performance predictive analytics from within data warehouse platforms.

Posted March 21, 2011

Revolution Analytics, a commercial provider of software and services based on the open source R project for statistical computing, and IBM Netezza have announced a partnership to integrate Revolution R Enterprise and the IBM Netezza TwinFin Data Warehouse Appliance. According to the vendors, this will enable customers seeking to run high performance and full-scale predictive analytics from within a data warehouse platform to directly leverage the capabilities of the open source R statistics language. Under the terms of the agreement, the companies will work together to create a version of Revolution's software that takes advantage of IBM Netezza's i-class technology so that Revolution R Enterprise can run in-database in an optimal fashion. General availability of this integrated solution is currently planned for later this year.

Posted March 15, 2011

A member of the Oracle Applications Users Group (OAUG) since 1992, Mark C. Clark recently took over as president of the organization. Recently, 5 Minute Briefing chatted with Clark about what's in store for members at the annual Oracle users conference COLLABORATE as well as for the year ahead. Helping members prepare for an upgrade to Oracle Applications Release 12, providing additional smaller, more targeted regional events, and a continued emphasis on a return to the basics with networking and education are at the top of his to-do list for 2011.

Posted March 08, 2011

Teradata Corporation, which already held an 11% ownership interest in Aster Data Systems, Inc., has signed a definitive agreement to pay $263 million for the remaining ownership interest in the company. Aster Data is a provider of software for advanced analytics and the management of diverse, unstructured data. According to the companies, the combination of Teradata and Aster Data technologies will enable businesses to unlock the intelligence within burgeoning volumes of big data. "Teradata is recognized as an innovator and leader in data warehousing, and the addition of Aster Data will enable us to leapfrog into a leadership position in the emerging big data market. This is a great opportunity and something our customers have been asking for," said Scott Gnau, chief development officer, Teradata Corporation.

Posted March 08, 2011

Oracle has announced that performance testing with Oracle Financial Services Profitability Management running on the Oracle Exadata Database Machine has delivered outstanding results for calculating profitability measures at extreme speed. Oracle Financial Services Profitability Management running on the Oracle Exadata Database Machine computed profitability for 172 allocation rules across 250 million accounts and more than 1 billion transactions in only 4 hours and 45 minutes.

Posted March 02, 2011

The market for data warehouse appliances - solutions consisting of integrated software and hardware - is heating up, with new twists emerging from both established and new appliance vendors. Netezza, an early proponent of the appliance approach, was acquired in November 2010 by IBM. Here, Phil Francisco, vice president, product management and product marketing for IBM Netezza, shares his views on what's changing and what's ahead for appliances. Going forward, he anticipates that there will be very specific, vertically-oriented solutions that are built on appliances, which will take into account the kinds of data models and the kind of functionality that is required for industries such as telco, retail, and financial services.

Posted February 23, 2011

HP announced it has signed a definitive agreement to acquire Vertica, a privately held, real-time analytics platform company based in Billerica, Mass. According to HP, the acquisition will enhance its capabilities for information optimization, adding sophisticated, real-time business analytics for large and complex sets of data in physical, virtual and cloud environments. Vertica's platform aims to help customers analyze massive amounts of data quickly, resulting in "just-in-time" business intelligence.

Posted February 15, 2011

Over the past 3 years, the IOUG ResearchWire studies conducted by Unisphere Research have focused on Oracle technology as well as trends affecting data professionals, allowing IT professionals to benchmark where their organizations stand within their own technology environment. Executive Summaries of all IOUG ResearchWire reports are publicly available for free download and full study reports are also available to IOUG members at no charge when they sign in with their user name and password.

Posted February 02, 2011

EMC has introduced a free Community Edition of the EMC Greenplum Database, the high-performance massively parallel processing (MPP) database product, along with free analytic algorithms and data mining tools. The new offering is intended to remove the cost barrier to entry for big data power tools for developers, data scientists, and other data professionals, according to EMC. This free set of tools will enable the community to better understand their data, gain deeper insights and better visualize insights, as well as contribute and participate in the development of new tools and solutions. With the Community Edition stack, developers can build complex applications to collect, analyze and operationalize big data leveraging big data tools including the Greenplum Database with its in-database analytic processing capabilities.

Posted February 01, 2011

The market for data warehouse appliances - solutions consisting of integrated software and hardware - is heating up, with new twists emerging from both established and new appliance vendors. Netezza, an early proponent of the appliance approach, was acquired in November 2010 by IBM. Here, Phil Francisco, vice president, product management and marketing for IBM Netezza, shares his views on what's changing and what's ahead. Going forward, Francisco sees very specific, vertically-oriented solutions that are built on appliances, whether it be for Telco, retailers, or financial services. "These appliances will take into account the kinds of data models that are required, but also the kind of functionality that is required for those industries and relieve even more of the day-to-day administration requirements," he notes.

Posted February 01, 2011

HP and Microsoft have announced four new converged application appliances that combine applications, infrastructure and productivity tools into a single system. In addition, there is the previously announced HP Enterprise Data Warehouse Appliance, an enterprise-level, mission-critical system. Now available with a starting price of $2 million, the HP Enterprise Data Warehouse Appliance is said to deliver up to 200 times faster queries and 10 times the scalability of traditional SQL Server deployments. "Our appliances have been engineered from the beginning with time to solution in mind," Fausto Ibarra, senior director of Business Intelligence at Microsoft, tells 5 Minute Briefing. "We are really offering a unique value proposition."

Posted January 25, 2011

The Data Warehousing Institute (TDWI) will present the first event in its 2011 conference series February 13-18 at Caesars Palace in Las Vegas, with the theme "Building An Enterprise Data Strategy: A Framework For Consistent Information Across The Enterprise." The aim of the 6-day event is to provide attendees with the education they need to inventory, govern, integrate, and exploit their data assets. Organizations have "massive amounts of data" as well as a variety of different databases and duplicates of databases, and, as a result, people are not aware of all the data they have or how they can utilize it, Paul Kautza, director of education at TDWI, tells 5 Minute Briefing. The conference will provide classes on how to gain critical executive sponsorship, connect point projects into a unified data management approach, and define and implement a true corporate data strategy that will deliver consistent information across an enterprise.

Posted January 18, 2011

The exponentially increasing amounts of data being generated each year make getting useful information from that data more and more critical. The information frequently is stored in a data warehouse, a repository of data gathered from various sources, including corporate databases, summarized information from internal systems, and data from external sources. Analysis of the data includes simple query and reporting, statistical analysis, more complex multidimensional analysis, and data mining.
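The analysis styles listed above range from simple to sophisticated; the first two can be sketched against a tiny in-memory warehouse table. The `sales` schema and its rows are purely illustrative.

```python
# Two of the analysis styles mentioned above, against a toy warehouse table:
# simple query/reporting, then a basic statistical summary.
import sqlite3
import statistics

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (region TEXT, amount REAL)")
db.executemany("INSERT INTO sales VALUES (?, ?)",
               [("East", 100.0), ("East", 150.0), ("West", 80.0), ("West", 120.0)])

# Simple query and reporting: totals per region.
report = db.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(report)  # [('East', 250.0), ('West', 200.0)]

# Statistical analysis: mean over all sales amounts.
amounts = [row[0] for row in db.execute("SELECT amount FROM sales")]
print(statistics.mean(amounts))  # 112.5
```

Multidimensional analysis and data mining extend the same idea: the former slices aggregates across many dimensions at once (region, time, product), and the latter searches the data for patterns rather than answering a fixed query.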

Posted January 07, 2011

Database Trends and Applications recently hosted an educational webcast to explain how organizations can extract business intelligence and business value from large and complex data, with Apache Hadoop. Moderated by Tom Wilson, president, DBTA and Unisphere Research, an on-demand replay of this live web event featuring Joe Nicholson, vice president of product marketing, Pentaho Corporation, and Omer Trajman, vice president, customer solutions, Cloudera, is now available.

Posted December 14, 2010

At the recent Supercomputing 2010 conference, IBM unveiled details of a new storage architecture design created by IBM scientists that will convert terabytes of pure information into actionable insights twice as fast as previously possible. Ideally suited for cloud computing applications and data-intensive workloads such as digital media, data mining and financial analytics, this new architecture will improve the speed of complex computations without requiring heavy infrastructure investment. IBM won the Storage Challenge competition for presenting the most innovative and effective design in high performance computing with the best measurements of performance, scalability and storage subsystem utilization.

Posted December 08, 2010

expressor software has announced an update to its enterprise data integration platform, which features enhancements in performance and reusable data mappings. expressor 3.0 is a design, development and deployment platform for supporting various data integration applications, from tactical data migrations to large enterprise data warehouses and strategic, predictive analytics.

Posted December 01, 2010

The IOUG has completed a number of ground-breaking studies in 2010 through the IOUG ResearchWire program. Conducted among IOUG members by Unisphere Research, 2010 IOUG ResearchWire Executive Summaries are available to all on the IOUG website.

Posted December 01, 2010

Panorama Software, a provider of proactive business intelligence (BI) solutions, has announced support for the Microsoft SQL Server 2008 R2 Parallel Data Warehouse through its flagship business intelligence solution, Panorama NovaView 6.2.

Posted November 30, 2010

The year 2010 brought many new challenges and opportunities to data managers' jobs everywhere. Companies, still recovering from a savage recession, increasingly turned to the power of analytics to turn data stores into actionable insights, and hopefully gain an edge over less data-savvy competitors. At the same time, data managers and administrators alike found themselves tasked with managing and maintaining the integrity of rapidly multiplying volumes of data, often presented in a dizzying array of formats and structures. New tools and approaches were sought, and the market churned with promising new offerings embracing virtualization, consolidation and information lifecycle management. Where will this lead in the year ahead? Can we expect an acceleration of these initiatives and more? DBTA looked at new industry research, and spoke with leading experts in the data management space, to identify the top trends for 2011.

Posted November 30, 2010

Did you miss a webcast covering important information about technologies you need to learn more about? IBM offers replays on-demand of webcasts presented by IBM experts and partners.

Posted November 23, 2010

ParAccel, Inc., provider of a high-speed analytic database, announced a new release of its analytical database, designed to enable organizations to more rapidly generate custom analytics, accelerate time to analysis, and use existing infrastructure standards. The new database release, ParAccel Analytic Database (PADB) 3.0, is intended to help organizations leverage in-database analytics to eliminate errors and delays from unnecessary data movement.

Posted November 23, 2010

