
Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources into a data warehouse for reporting and analytics include ETL (extract, transform, load), EAI (enterprise application integration), CDC (change data capture), data replication, data deduplication, compression, big data technologies such as Hadoop and MapReduce, and data warehouse appliances.
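The ETL pattern mentioned above can be illustrated with a minimal sketch in Python, using the standard-library sqlite3 module to stand in for a warehouse. All source systems, table names, and functions here are illustrative assumptions, not any vendor's actual product.

```python
# Minimal ETL sketch: pull rows from two hypothetical source systems,
# normalize and aggregate them, and load the result into a warehouse table.
import sqlite3

def extract():
    # In practice these would be queries against separate source systems
    # (CRM, ERP, etc.); here they are hard-coded sample rows.
    crm_rows = [("Alice ", 120.0), ("bob", 75.5)]
    erp_rows = [("alice", 30.0), ("Carol", 210.0)]
    return crm_rows + erp_rows

def transform(rows):
    # Normalize customer names and aggregate revenue per customer.
    totals = {}
    for name, amount in rows:
        key = name.strip().lower()
        totals[key] = totals.get(key, 0.0) + amount
    return sorted(totals.items())

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS fact_revenue (customer TEXT, total REAL)")
    conn.executemany("INSERT INTO fact_revenue VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT customer, total FROM fact_revenue ORDER BY customer").fetchall())
```

Real ETL tools add incremental (CDC-driven) extraction, error handling, and scheduling on top of this basic extract/transform/load shape.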



Data Warehousing Articles

The winner of the Apple iPad 2 drawn from the pool of 421 data managers and professionals who responded to the latest IOUG Data Warehousing survey has been announced. The winner is Steven Pierce, principal of Think Huddle in Annandale, Virginia (www.thinkhuddle.com).

Posted February 01, 2012

SAP AG has announced that since introducing SAP HANA a year ago, customer and partner demand for the technology has surged. According to Dr. Vishal Sikka, member of the SAP executive board, Technology & Innovation, leading independent software vendors are adopting the open SAP HANA platform for their existing products and are also building completely new applications. The company also announced at the recent SAP Influencer Summit 2011 in Boston that SAP HANA is at the core of its platform roadmap, powering both renewed existing applications, without disruption, and entirely new ones.

Posted December 21, 2011

The first calendar year following SAP's acquisition of Sybase is coming to a close. David Jonker, director, product marketing - Data Management & Analytics, Sybase, discusses key product integrations, IT trends that loom large in Sybase's data management strategies, and the emergence of what Sybase describes as DW 2.0. 2011 has been "a foundational year," with effort focused on making Sybase technologies work with SAP and setting the stage for 2012, says Jonker. "We believe 2012 is going to be a big year for us on the database side."

Posted December 21, 2011

In this, our last E-Edition of Database Trends and Applications for 2011, we're taking a look back at some of the most widely read articles of the past year. These articles cover a range of topics. Some provide an examination of just-emerging or quickly evolving technologies, others highlight best practices in a specific discipline, while others comment on trends observed by industry experts. Click on the "December 2011 E-Edition UPDATE" headline above to access the articles. If you missed one earlier in the year, here's your second chance. All DBTA E-Editions are archived by month on the DBTA website.

Posted December 16, 2011

EMC Corporation has introduced the EMC Greenplum Unified Analytics Platform (UAP), a platform to support big data analytics that combines the co-processing of structured and unstructured data with a productivity engine that enables collaboration among data scientists. The new EMC Greenplum UAP brings together the EMC Greenplum database for structured data, the enterprise Hadoop offering EMC Greenplum HD for the analysis and processing of unstructured data, and EMC Greenplum Chorus, its new productivity engine for data science teams. Greenplum UAP will be available in the first quarter of calendar 2012.

Posted December 08, 2011

Appfluent Technology Inc., a provider of business activity and data usage software for big data and analytics, announced the launch of Visibility 90X, a program intended to offer IT departments a cost-effective way to manage exploding data volumes more intelligently with existing resources.

Posted December 06, 2011

Attivio announced the launch of an advanced unified information access platform that condenses information from existing BI and big data technologies into a single environment accessible to business end users. Attivio's Active Intelligence Engine (AIE) 3.0 is designed to support information access methods used across the enterprise and provide search queries and role-based dashboards for mainstream business users.

Posted December 06, 2011

A new blog post on the SHARE website places the focus on the mainframe as a big data workhorse and a reigning alternative to internal (or external) cloud provisioning. Pedro Pereira, writing in SHARE's "President's Corner," makes several astute observations, including identifying security and availability as unknowns in a cloud environment.

Posted December 06, 2011

A new survey of 421 data managers and professionals affiliated with the Independent Oracle Users Group (IOUG) finds that while most companies have well-established data warehouse systems, adoption is still limited within their organizations. Many respondents report a significant surge of data within their data warehouses in recent times, fueled not only by growing volumes of transaction data but by unstructured data as well. Now, the challenge is to find ways to extend data analysis capabilities to additional business areas.

Posted December 01, 2011

The age of big data is upon us, and it is here to stay, but few organizations today are capable of accessing the full scope of their big data. This webinar will cover a number of bulk load use cases, including data warehousing, data migration, data replication, disaster recovery, and cloud data publication. Once enterprise organizations effectively satisfy the bulk data access requirements for these use cases, they can simplify their data access architecture, free up resources for other tasks, and improve operational performance. Discover how to deal effectively with big data in this webinar sponsored by Progress DataDirect on Wednesday, November 30, at 11 am PT / 2 pm ET.

Posted November 22, 2011

Kove, a high performance storage vendor, and ParAccel, provider of a leading analytic platform, have announced a new, joint solution available on Dell's PowerEdge servers and Kove XPD2 Storage intended to enable near-instantaneous analytic database duplication. The Kove-ParAccel database duplication solution enables the rapid delivery of an analytic sandbox clone that can be used by business analysts, or a quick provisioning clone for development and test in traditional on-premise environments.

Posted November 15, 2011

Few things in the world are changing as dramatically as data. Data has tapped out a powerful rhythm to keep time with technology's bleeding edge, leaving many technologies struggling to keep up. It should come as no surprise, then, that many of the data strategies that IT departments developed, and still widely rely upon, are no longer sufficient for today's needs. You can put data marts near the top of that list.

Posted November 10, 2011

DBTA had the opportunity to host a webcast this week featuring the Tableau application for big data running on Aster Data, delivered in part by Marc Parrish, vice president for Retention and Loyalty Marketing at Barnes & Noble. It was the most heavily attended webcast in the DBTA Webcast Series and included one of the most cogent descriptions of the big data phenomenon you are likely to hear, courtesy of Stephanie McReynolds, director of Product Marketing at Aster Data, during the question-and-answer session at the end of the event.

Posted November 04, 2011

Teradata, a provider of data analytics solutions, announced the latest update to its flagship data warehouse product, as well as new features in its data warehouse appliance. Teradata Database 14 is designed as the analytical engine powering all of the vendor's "purpose-built" platform family members, from enterprise data warehouses to appliances. "We're including big data application support," said Scott Gnau, president of Teradata Labs, at a briefing at the vendor's recent user group event.

Posted October 26, 2011

The results of a new study among Independent Oracle Users Group (IOUG) members show that while most companies have well-established data warehouse systems, adoption is still limited within their organizations. While companies are conservative in their plans for data warehouse upgrades, there is movement toward implementation of ready-to-run data solutions, and respondents expect to see benefits from these implementation efforts, including better performance and greater insights for decision makers. The findings of the survey, which was underwritten by Oracle and conducted by Unisphere Research, are detailed in a new report, "A New Dimension to Data Warehousing: 2011 IOUG Data Warehousing Survey."

Posted September 27, 2011

expressor software, a provider of data integration software, says it is shipping a new version of its flagship platform that expands its data warehousing capabilities, as well as a new licensing offer for business end users who wish to build their own queries. The latest version, expressor 3.4, features three major areas of enhancement for the company's data integration platform: a new rules editor, enhanced data warehousing and ETL (extract, transform and load) capabilities, and new end-user functionality.

Posted September 20, 2011

For many organizations, data is not only crossing into the hundreds of terabytes, but into the near-petabyte (PB) and multi-petabyte range. In a new survey sponsored by Oracle and conducted by Unisphere Research among members of the Independent Oracle Users Group (IOUG), "The Petabyte Challenge: 2011 IOUG Database Growth Survey," close to one out of 10 respondents report that the total amount of online (disk-resident) data they manage today - taking into account all clones, snapshots, replicas and backups - tops a petabyte.

Posted September 07, 2011

Rampant data growth has been the stimulus for over-spending on data storage. Technology advances have enabled us to gather more data, faster, than at any time in our history. This has been beneficial in many ways, providing businesses with more data they can use to optimize their sales, marketing, customer relations and product offerings. Unfortunately, in order to keep pace with data growth, businesses have had to provision more and more storage capacity, costing them millions of dollars.

Posted August 29, 2011

Endeca Technologies, Inc., an information management software company, has unveiled native integration of Endeca Latitude with Apache Hadoop. "This native integration between Endeca Latitude and Hadoop brings together big data processing from Hadoop and interactive search, exploration and analysis from Latitude," says Paul Sonderegger, chief strategist of Endeca Technologies.

Posted August 29, 2011

HP and Autonomy Corporation plc have announced plans for HP (through an indirect wholly-owned subsidiary, HP SPV) to acquire all of the outstanding shares of Autonomy for £25.50 ($42.11) per share in cash. "Together with Autonomy, we plan to reinvent how both unstructured and structured data is processed, analyzed, optimized, automated and protected. Autonomy has an attractive business model, including a strong cloud-based solution set, which is aligned with HP's efforts to improve our portfolio mix," said Léo Apotheker, HP president and chief executive officer, in a statement issued by the company.

Posted August 23, 2011

Oracle Solaris 11 Express is available on Oracle Exadata Database Machines X2-2 and X2-8, enabling customers to take advantage of the reliability, scalability, and security of Oracle Solaris to run their online transaction processing (OLTP), data warehousing and consolidated workloads on the x86-based Oracle Exadata systems, Oracle announced.

Posted August 18, 2011

InetSoft Technology, a provider of data mashup driven dashboard and reporting solutions, and Management Systems International (MSI), experts in financial planning and reporting, have announced a joint solution for interactive dashboard reporting on top of a financial consolidation and information management platform. The joint solution is intended to enable multinational firms using diverse ERP and financial systems to visually understand and explore their financial data in order to manage their financial operations more efficiently.

Posted August 09, 2011

Oracle's S. Ramakrishnan, group vice president and general manager for Oracle Financial Services Analytical Applications, was in New York last week to provide an update on how financial services institutions are leveraging tailored technology from Oracle - including the recently announced Oracle Financial Services Data Warehouse - to manage the complex information needed to compete profitably and effectively address stringent regulatory requirements.

Posted August 04, 2011

Expanding its existing product portfolio, Informatica Corporation now offers Universal Data Replication, giving customers more options to meet their business continuity, big data and operational data integration needs. A part of the Informatica Platform, Informatica's new data replication technology includes Informatica Fast Clone, which automates the cloning of application data, and Informatica Data Replication, which manages the capture, routing and delivery of high-volume transaction data across diverse systems in real time with minimal source system impact.

Posted August 02, 2011

Oracle has introduced the Oracle Exadata Storage Expansion Rack to offer customers a cost-effective way to add storage to an Oracle Exadata Database Machine. "There are customers, earlier Exadata Database Machine customers, that have now started to fill up the disks that they have on the Database Machine and they are starting to look for ways to expand their storage capacity, and so this is going to be really welcome for them," says Tim Shetler, vice president of Product Management, Oracle.

Posted July 27, 2011

The Oracle Applications Users Group (OAUG), the world's largest user knowledge base for Oracle Applications users, is launching the OAUG Educational Series 2011, a virtual learning series offered to OAUG members from Aug. 8-19, featuring the most popular presentations from the COLLABORATE 11 - OAUG Forum.

Posted July 25, 2011

Dataguise, a provider of enterprise security intelligence solutions, has announced a high-performance database cloning privacy solution to support test, development and analytic uses in data warehousing environments. According to Dataguise, its sensitive data discovery and masking solutions complement NetApp's solution for rapid cloning of large Oracle data sets to enable efficient and secure distribution when running on the Cisco Unified Computing System (UCS) platform.

Posted July 19, 2011

Kognitio today launched a new family of data warehouse appliances designed to let companies choose the model best suited to their specific data analysis speed and volume needs. "We have always offered a software-only database prepackaged on industry-standard hardware as an appliance for a turnkey solution. What we are doing today is basically giving customers more choice," Sean Jackson, vice president of marketing, Kognitio, tells 5 Minute Briefing. Kognitio has named the three new appliance varieties Rapids, Rivers and Lakes - which the company says are metaphors for the variety of performance and capacity issues that customers must consider.

Posted June 29, 2011

Vertica, an HP company, has announced the availability of Vertica 5.0, the latest version of the MPP columnar Vertica Analytics Platform. Vertica 5.0 offers a new software development kit (SDK) that provides the ability to customize and insert customer- and use case-specific query logic into the Vertica database for fully parallel execution. "The three tenets of Vertica have been its speed, scalability and simplicity and we have continued to further the industry-leading aspects of all three of those tenets of the system," Scott Howser, vice president of product marketing, HP Vertica, tells 5 Minute Briefing.

Posted June 29, 2011

Representing a continued expansion of its big data analytics portfolio, IBM has introduced a new addition to the Netezza product family of analytics appliances that is designed to help organizations uncover patterns and trends from extremely large data sets. The appliance is the first to be delivered by IBM since it acquired Netezza in November 2010. According to IBM, using the new appliance, businesses can now more easily sift through petabytes of data, including banking and mobile phone transactions, insurance claims, electronic medical records, and sales information, and they can also analyze this information to reveal new trends on consumer sentiment, product safety, and sales and marketing effectiveness. "This new appliance takes the scalability to a completely new dimension," says Razi Raziuddin, senior director of product management at IBM Netezza.

Posted June 24, 2011

EMC Corporation, a provider of storage and infrastructure solutions, announced it will be shipping a data warehouse appliance that leverages the Apache Hadoop open-source software used for data-intensive distributed applications. The company's high-performance, data co-processing Hadoop appliance - the Greenplum HD Data Computing Appliance - integrates Hadoop with the EMC Greenplum Database, allowing the co-processing of both structured and unstructured data within a single solution. EMC also says the solution will run either Hadoop-based EMC Greenplum HD Community Edition or EMC Greenplum HD Enterprise Edition software.

Posted June 24, 2011

Composite Software has introduced Composite 6, a new version of its flagship data virtualization software that provides "big data" integration support for the Cloudera Distribution including Apache Hadoop (CDH), IBM Netezza and HP Vertica data sources. In addition, Composite 6, which is now completing beta test and will be commercially available in July, includes performance optimizations, cache enhancements, new data governance capabilities and ease-of-use features. "Data virtualization is emerging as an ideal solution for managing today's complex data integration challenges," says Jim Green, CEO for Composite Software.

Posted June 24, 2011

Is the day of reckoning for big data upon us? To many observers, the growth in data is nothing short of incomprehensible. Data is streaming into, out of, and through enterprises from a dizzying array of sources: transactions, remote devices, partner sites, websites, and nonstop user-generated content. Not only is this information driving databases to scale into the terabyte and petabyte range, but it arrives in an unfathomable range of formats as well, from traditional structured, relational data to message documents, graphics, videos, and audio files.

Posted June 22, 2011

Dell says it will resell RainStor's specialized data retention database to support solutions such as application retirement and retention of machine-generated data. RainStor, a data storage infrastructure software company, designs and sells technology that enables data to be de-duplicated and compressed, while still accessible online through standard SQL language and BI tools. "The RainStor-Dell solution combines the object storage capabilities of the Dell DX with RainStor's online data retention (OLDR) repository," Ramon Chen, vice president of product management at RainStor, tells 5 Minute Briefing.
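The general idea of deduplicating and compressing data while keeping it reconstructable, as described above, can be illustrated with a content-addressed store. This is a generic, very simplified sketch of the technique, not RainStor's actual implementation; all names are illustrative.

```python
# Content-addressed deduplication sketch: each unique chunk is stored
# once, keyed by its SHA-256 digest and compressed; an ordered manifest
# of digests allows the original stream to be reconstructed.
import hashlib
import zlib

def dedupe_store(chunks):
    store = {}     # digest -> compressed chunk, stored only once
    manifest = []  # ordered digests needed to rebuild the stream
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = zlib.compress(chunk)
        manifest.append(digest)
    return store, manifest

def restore(store, manifest):
    # Decompress each referenced chunk in order to rebuild the data.
    return b"".join(zlib.decompress(store[d]) for d in manifest)

data = [b"block-A", b"block-B", b"block-A", b"block-A"]
store, manifest = dedupe_store(data)
assert restore(store, manifest) == b"".join(data)
print(len(store))  # number of unique compressed chunks actually stored
```

A production system layers query access (e.g., SQL over the stored representation) and far more sophisticated chunking on top of this basic idea.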

Posted June 13, 2011

The Oracle Exadata Database Machine X2-2 and X2-8 with the Solaris option began shipping just this month. Now in its third generation, the Database Machine combines all the components to create what the company describes as the best platform for running the Oracle Database. Here, Tim Shetler, vice president of Product Management, Oracle, talks about the performance innovations that differentiate Oracle's offering, how customers are using the system today for business advantage, and also - what's ahead. "Performance is one of the really big advantages of Exadata. It is single-purpose around running the Oracle Database with the best performance and the best availability possible," says Shetler.

Posted May 31, 2011


Data quality, MDM, and data governance software vendor Ataccama Corporation announced that it has entered into a cooperative partnership with Teradata Corporation, a leader in data warehousing and enterprise analytics. The partnership is aimed at enabling joint customers to improve data quality within their data warehouses. Ataccama is a global software company with headquarters in Prague, and offices in Toronto, Stamford, London, and Munich, and the new partnership with Teradata represents a worldwide geographical relationship, according to Michal Klaus, CEO, Ataccama Corp.

Posted May 24, 2011

Alpine Data Labs, developer of Alpine Miner, a solution for big data predictive analytics, has received $7.5 million in Series A funding. In addition, after 15 months of product development, the company also announced its 10th production customer and its formal launch in the U.S. market. According to Anderson Wong, Alpine Labs CEO and co-founder, organizations cannot extract all the possible value from their data because it is growing faster than they can analyze it, they don't have enough resources with analytics expertise, and the tools they're using are too complex to get to the answers they need quickly.

Posted May 13, 2011

Big data provides new opportunities to improve customer care, unearth business insights, control operational costs, and in some cases, enable entirely new business models. By having access to larger and broader data sets, you can improve forecasts and projections for the business. A healthcare organization can conduct longitudinal analysis against years of data for patients treated with coronary attacks in order to improve care and speed time to recovery. A retailer can conduct deeper analysis on buying behavior during recessionary times if they have access to large data sets collected during the last economic downturn. Additionally, organizations across many sectors, such as communications, financial services and utilities, face significant regulatory and legal requirements for retaining and providing fast access to historical data for inquiries, audits and reporting.

Posted May 12, 2011

Full 360 Inc., a New York-based systems integrator, has introduced a new release of its elasticBI platform-as-a-service (PaaS). elasticBI pairs Jaspersoft4, the open source BI application, with the Vertica Analytic Database in a PaaS offering that is tightly integrated via the Opscode Chef framework. The platform is offered by Full 360 on Amazon Web Services, the Amazon cloud. According to Full 360, elasticBI provides a complete BI-data warehouse platform that is accessible from a cost and technical perspective for small and mid-market companies, as well as enterprise departments.

Posted April 26, 2011
