Data Warehousing Articles
Organizations are struggling with big data, which they define as any data store so large that it becomes unmanageable with standard technologies or methods, according to a new survey of 264 data managers and professionals who subscribe to Database Trends and Applications. The survey was conducted in January 2012 by Unisphere Research, a division of Information Today, Inc., in partnership with MarkLogic. Among the key findings is that unstructured data is on the rise and ready to engulf current data management systems. Compounding that concern, respondents say, is their belief that management does not understand the looming challenge and fails to recognize the significance of unstructured data assets to the business.
Posted May 01, 2012
Oracle addressed the need to make IT infrastructure and business analytics technologies simpler and more efficient in a presentation to OpenWorld Tokyo 2012 attendees that was also made available via live webcast. In addition to presenting its strategy and plans for business analytics, the company also unveiled new additions to its product portfolio. In his keynote address, Oracle president Mark Hurd explained how the business users of tomorrow will require faster and more comprehensive information access. "The true question with analytics is how to get the right information to the right person at the right time to make the right decision," he said.
Posted April 26, 2012
IBM has introduced DB2 10 and InfoSphere Warehouse 10, software that integrates with big data systems, automatically compresses data into tighter spaces to prevent storage sprawl, and slices information as of past, present, and future points in time to eliminate expensive application code. Over the past four years, more than 100 clients, 200 business partners, and hundreds of experts from IBM Research and Software Development Labs around the world collaborated to develop the new software.
Posted April 26, 2012
MapR Technologies, Inc., provider of the MapR distribution for Apache Hadoop, has introduced new data connection options for Hadoop to enable a range of data ingress and egress alternatives for customers. These include direct file-based access using standard tools and file-based applications; direct database connectivity; Hadoop-specific connectors via Sqoop, Flume, and Hive; and direct access to popular data warehouses and applications using custom connectors. Additionally, technology providers Pentaho and Talend are partnering with MapR to provide direct integration with MapR's distribution, and MapR has also entered into a partnership with business intelligence platform vendor Tableau Software.
Posted April 26, 2012
Attivio and TIBCO Software Inc. have announced that, as part of a new partnership agreement, the TIBCO Spotfire analytics platform has achieved the highest level of certified information access available within Attivio's Active Intelligence Engine (AIE). According to Attivio, AIE ingests all types of structured data and unstructured content and, unlike a traditional data warehouse, does not require relationships between data or content to be defined prior to ingestion. In achieving platinum-level certification, Attivio says, TIBCO Spotfire has been verified to access and leverage AIE's AI-SQL capabilities to provide data visualization, enhanced analytics of unstructured content, and intuitive search and discovery, all in the same dashboard.
Posted April 10, 2012
1010data, Inc., provider of an internet-based big data warehouse, has announced the launch of a new software tool that enables 1010data's customers to automatically segment and analyze huge consumer transaction databases and produce statistical models with specificity, even to the level of social groups, families and individuals. For the first stage of the launch, 1010data is making the tool available in an invitational beta release for retail, consumer goods, and mobile telecom companies. "In all consumer-driven industries, customers are demanding to be treated as individuals, not boomers, tweeners, or dinks - dual income, no kids," said Tim Negris, vice president of marketing at 1010data.
Posted March 22, 2012
For enterprises grappling with the onslaught of big data, a new platform has emerged from the open source world that promises a cost-effective way to store and process petabytes' worth of information. Hadoop, an Apache project, is already being eagerly embraced by data managers and technologists as a way to manage and analyze mountains of data streaming in from websites and devices. Running data such as weblogs through traditional platforms such as data warehouses or standard analytical toolsets often cannot be cost-justified, as those solutions tend to carry high overhead costs. However, organizations are beginning to recognize that such information can ultimately be of tremendous value to the business. Hadoop packages up such data and makes it digestible.
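The way Hadoop "digests" weblog data is through its MapReduce model: a map phase emits key-value pairs from raw records, and a reduce phase aggregates them by key. The snippet below is a minimal single-process sketch of that flow, counting HTTP status codes in Apache-style access-log lines; the log format and field positions are assumptions for illustration, and in a real deployment Hadoop would run the two phases in parallel across a cluster.

```python
from collections import Counter

def map_phase(log_lines):
    """Map: emit a (status_code, 1) pair for each Apache-style access-log line."""
    for line in log_lines:
        fields = line.split()
        if len(fields) > 8:       # skip malformed lines
            yield fields[8], 1    # in this assumed format, field 9 is the HTTP status

def reduce_phase(pairs):
    """Reduce: sum the counts emitted by the map phase, grouped by status code."""
    totals = Counter()
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

logs = [
    '127.0.0.1 - - [10/Oct/2011:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326',
    '127.0.0.1 - - [10/Oct/2011:13:55:37 -0700] "GET /missing HTTP/1.0" 404 209',
    '127.0.0.1 - - [10/Oct/2011:13:55:38 -0700] "GET /index.html HTTP/1.0" 200 2326',
]
print(reduce_phase(map_phase(logs)))  # prints {'200': 2, '404': 1}
```

The same mapper and reducer logic, written as scripts reading stdin and writing stdout, is how Hadoop Streaming lets Python code run over weblogs at cluster scale.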
Posted March 19, 2012
Oracle has unveiled Oracle Airline Data Model, a standards-based, pre-built database schema to help airlines optimize the collection, storage, and analysis of passenger data from reservations, sales, operations, loyalty, customer service, and finance in their data warehouse. Available as an option for Oracle Database 11g Enterprise Edition, the new Oracle Airline Data Model delivers a comprehensive database schema for passenger data, along with sophisticated analytics, trending, and data mining capabilities.
Posted March 14, 2012
Quest Software said it has entered into definitive agreements with affiliates of Insight Venture Partners to become a private company. In the deal, stockholders would receive $23 per share in cash, valuing the company at approximately $2.0 billion. Upon closing, Quest will be a privately held company and will continue to be led by chairman and CEO Vinny Smith and the existing senior management team. "As a private company, we will have increased flexibility to drive innovation across our product lines and execute our long-term strategy," said Smith in a statement released by the company.
Posted March 09, 2012
EMC Corporation has announced version 4.2 of EMC Greenplum Database, which includes a high-performance gNet for Hadoop; simpler, scalable backup with EMC Data Domain Boost; an extension framework and turnkey in-database analytics; language and compatibility enhancements for faster migrations to Greenplum; and targeted performance optimization.
Posted March 01, 2012
The Teradata Data Warehouse Appliance 2690 is now generally available. The new release is designed to deliver double the performance with up to triple the data capacity of its predecessor. "The Teradata Data Warehouse Appliance 2690 is our fifth generation appliance that provides a faster, easier, and greener analytic engine for a wide variety of demanding business intelligence tasks, which has contributed to its rapid customer adoption," says Ed White, general manager, Teradata Appliances, Teradata Corporation.
Posted February 28, 2012
Tableau Software, a provider of business intelligence software, and Cloudera Inc., a provider of Apache Hadoop-based data management software and services, have announced an integration between their products that enables enterprises to more easily extract business insights from their big data without the specialized technical skills typically required to operate Hadoop. Tableau has developed a Certified Cloudera Connector that is licensed to work with Cloudera's Distribution Including Apache Hadoop (CDH). The new connector is part of the Tableau 7.0 release.
Posted February 24, 2012
Oracle has announced the availability of Oracle Advanced Analytics, a new option for Oracle Database 11g that combines Oracle R Enterprise with Oracle Data Mining. According to Oracle, Oracle R Enterprise delivers enterprise-class performance for users of the R statistical programming language, increasing the scale of data that can be analyzed by orders of magnitude using Oracle Database 11g.
Posted February 23, 2012
Kalido, a provider of agile information management software, unveiled the latest release of the Kalido Information Engine, which helps organizations decrease the time for data mart migrations and consolidations. With this new release, customers will be able to import existing logical and physical models and taxonomies to build a more agile data warehouse. Enabling customers to take advantage of existing assets and investments "is going to dramatically reduce the time and the cost that it takes to bring together data marts into more of a data warehouse scenario," says John Evans, director of product marketing at Kalido.
Posted February 23, 2012
SAP AG has announced two new offerings that bring the benefits of the SAP HANA platform to small businesses and midsize enterprises (SMEs). With analytics powered by SAP HANA for the SAP Business One application and SAP HANA, Edge edition, SMEs will be able to leverage in-memory technology from SAP. The new offerings from SAP aim to provide SMEs with real-time access to information tailored to their individual requirements.
Posted February 22, 2012
The winner of the Apple iPad 2 drawn from the pool of 421 data managers and professionals who responded to the latest IOUG Data Warehousing survey has been announced. The winner is Steven Pierce, principal of Think Huddle in Annandale, Virginia (www.thinkhuddle.com).
Posted February 01, 2012
SAP AG has announced that, since introducing SAP HANA a year ago, customer and partner demand for the technology has surged. According to Dr. Vishal Sikka, member of the SAP executive board, Technology & Innovation, leading independent software vendors are adopting the open SAP HANA platform for their existing products and are also building completely new applications. The company also announced at the recent SAP Influencer Summit 2011 in Boston that SAP HANA is at the core of its platform roadmap, powering both renewed applications without disruption and new ones.
Posted December 21, 2011
The first calendar year following SAP's acquisition of Sybase is coming to a close. David Jonker, director, product marketing - Data Management & Analytics, Sybase, discusses key product integrations, IT trends that loom large in Sybase's data management strategies, and the emergence of what Sybase describes as DW 2.0. 2011 has been "a foundational year," with effort focused on making Sybase technologies work with SAP and setting the stage for 2012, says Jonker. "We believe 2012 is going to be a big year for us on the database side."
Posted December 21, 2011
EMC Corporation has introduced the EMC Greenplum Unified Analytics Platform (UAP), a platform for big data analytics that combines the co-processing of structured and unstructured data with a productivity engine enabling collaboration among data scientists. The new EMC Greenplum UAP brings together the EMC Greenplum database for structured data, the enterprise Hadoop offering EMC Greenplum HD for the analysis and processing of unstructured data, and EMC Greenplum Chorus, its new productivity engine for data science teams. Greenplum UAP will be available in the first quarter of calendar 2012.
Posted December 08, 2011
Appfluent Technology Inc., a provider of business activity and data usage software for big data and analytics, announced the launch of Visibility 90X, a program intended to offer IT departments a cost-effective way to manage exploding data volumes more intelligently with existing resources.
Posted December 06, 2011
Attivio announced the launch of an advanced unified information access platform that condenses information from existing BI and big data technologies into a single environment accessible to business end users. Attivio's Active Intelligence Engine (AIE) 3.0 is designed to support information access methods used across the enterprise and provide search queries and role-based dashboards for mainstream business users.
Posted December 06, 2011
A new blog on the SHARE website places the focus on the mainframe as a big data workhorse and the reigning alternative to internal (or external) cloud provision. Pedro Pereira, writing in SHARE's "President's Corner," makes several astute observations, including identifying security and availability as unknowns in a cloud environment.
Posted December 06, 2011
A new survey of 421 data managers and professionals affiliated with the Independent Oracle Users Group (IOUG) finds that while most companies have well-established data warehouse systems, adoption is still limited within their organizations. Many respondents report a significant recent surge of data within their data warehouses, fueled not only by growing volumes of transaction data but by unstructured data as well. Now, the challenge is to find ways to extend data analysis capabilities to additional business areas.
Posted December 01, 2011
Kove, a high-performance storage vendor, and ParAccel, provider of a leading analytic platform, have announced a new joint solution, available on Dell PowerEdge servers and Kove XPD2 Storage, intended to enable near-instantaneous analytic database duplication. The Kove-ParAccel database duplication solution enables the rapid delivery of an analytic sandbox clone for use by business analysts, or a quick provisioning clone for development and test in traditional on-premises environments.
Posted November 15, 2011
Few things in the world are changing as dramatically as data. Data has tapped out a powerful rhythm to keep time with technology's bleeding edge, leaving many technologies struggling to keep up. It should come as no surprise, then, that many of the data strategies that IT departments developed, and still widely rely upon, are no longer sufficient for today's needs. You can put data marts near the top of that list.
Posted November 10, 2011
Teradata, a provider of data analytic solutions, announced the latest update to its flagship data warehouse product, as well as new features in its data warehouse appliance. Teradata Database 14 is designed as the analytical engine powering all of the vendor's "purpose-built" platform family members, from enterprise data warehouses to appliances. "We're including big data application support," said Scott Gnau, president of Teradata Labs, at a briefing at the vendor's recent user group event.
Posted October 26, 2011
Results of a new study among Independent Oracle Users Group (IOUG) members show that while most companies have well-established data warehouse systems, adoption is still limited within their organizations. While companies are conservative in their plans for data warehouse upgrades, there is movement toward implementation of ready-to-run data solutions, and respondents expect to see benefits from these implementation efforts, including better performance and greater insights for decision makers. The findings of the survey, which was underwritten by Oracle and conducted by Unisphere Research, are detailed in a new report, "A New Dimension to Data Warehousing: 2011 IOUG Data Warehousing Survey."
Posted September 27, 2011
expressor software, a provider of data integration software, says it is shipping a new version of its flagship platform that expands its data warehousing capabilities, along with a new licensing offer for business end users who wish to build their own queries. The latest version, expressor 3.4, features three major areas of enhancement for the company's data integration platform: a new rules editor, enhanced data warehousing and ETL (extract, transform, and load) capabilities, and new end user functionality.
Posted September 20, 2011
For many organizations, data is not only crossing into the hundreds of terabytes, but into the near-petabyte (PB) and multi-petabyte range. In a new survey sponsored by Oracle and conducted by Unisphere Research among members of the Independent Oracle Users Group (IOUG), "The Petabyte Challenge: 2011 IOUG Database Growth Survey," close to one out of 10 respondents report that the total amount of online (disk-resident) data they manage today - taking into account all clones, snapshots, replicas and backups - tops a petabyte.
Posted September 07, 2011
Endeca Technologies, Inc., an information management software company, has unveiled native integration of Endeca Latitude with Apache Hadoop. "This native integration between Endeca Latitude and Hadoop brings together big data processing from Hadoop and interactive search, exploration and analysis from Latitude," says Paul Sonderegger, chief strategist of Endeca Technologies.
Posted August 29, 2011
HP and Autonomy Corporation plc have announced plans for HP (through an indirect wholly-owned subsidiary, HP SPV) to acquire all of the outstanding shares of Autonomy for £25.50 ($42.11) per share in cash. "Together with Autonomy, we plan to reinvent how both unstructured and structured data is processed, analyzed, optimized, automated and protected. Autonomy has an attractive business model, including a strong cloud-based solution set, which is aligned with HP's efforts to improve our portfolio mix," said Léo Apotheker, HP president and chief executive officer, in a statement issued by the company.
Posted August 23, 2011
Oracle Solaris 11 Express is available on Oracle Exadata Database Machines X2-2 and X2-8, enabling customers to take advantage of the reliability, scalability, and security of Oracle Solaris to run their online transaction processing (OLTP), data warehousing and consolidated workloads on the x86-based Oracle Exadata systems, Oracle announced.
Posted August 18, 2011
InetSoft Technology, a provider of data mashup-driven dashboard and reporting solutions, and Management Systems International (MSI), experts in financial planning and reporting, have announced a joint solution for interactive dashboard reporting on top of a financial consolidation and information management platform. The joint solution is intended to enable multinational firms using diverse ERP and financial systems to visually understand and explore their financial data in order to manage their financial operations more efficiently.
Posted August 09, 2011
Oracle's S. Ramakrishnan, group vice president and general manager for Oracle Financial Services Analytical Applications, was in New York last week to provide an update on how financial services institutions are leveraging tailored technology from Oracle - including the recently announced Oracle Financial Services Data Warehouse - to manage the complex information needed to compete profitably and effectively address stringent regulatory requirements.
Posted August 04, 2011
Expanding its existing product portfolio, Informatica Corporation now offers Universal Data Replication, giving customers more options to meet their business continuity, big data, and operational data integration needs. A part of the Informatica Platform, Informatica's new data replication technology includes Informatica Fast Clone, which automates the cloning of application data, and Informatica Data Replication, which manages the capture, routing, and delivery of high-volume transaction data across diverse systems in real time with minimal source system impact.
Posted August 02, 2011
Oracle has introduced the Oracle Exadata Storage Expansion Rack to offer customers a cost-effective way to add storage to an Oracle Exadata Database Machine. "There are customers, earlier Exadata Database Machine customers, that have now started to fill up the disks that they have on the Database Machine and they are starting to look for ways to expand their storage capacity, and so this is going to be really welcome for them," says Tim Shetler, vice president of Product Management, Oracle.
Posted July 27, 2011
The Oracle Applications Users Group (OAUG), the world's largest user knowledgebase for Oracle Applications users, is launching the OAUG Educational Series 2011, a virtual learning series offered to OAUG members from Aug. 8-19, featuring the most popular presentations from the COLLABORATE 11 - OAUG Forum.
Posted July 25, 2011
Dataguise, a provider of enterprise security intelligence solutions, has announced a high-performance database cloning privacy solution to support test, development, and analytic uses in data warehousing environments. According to Dataguise, its sensitive data discovery and masking solutions complement NetApp's solution for rapid cloning of large Oracle data sets to enable efficient and secure distribution when running on the Cisco Unified Computing System (UCS) platform.
Posted July 19, 2011
Kognitio today launched a new family of data warehouse appliances designed to let companies choose the model best suited to their specific data analysis speed and volume needs. "We have always offered a software-only database prepackaged on industry-standard hardware as an appliance for a turnkey solution. What we are doing today is basically giving customers more choice," Sean Jackson, vice president of marketing, Kognitio, tells 5 Minute Briefing. Kognitio has named the three new appliance varieties Rapids, Rivers and Lakes - which the company says are metaphors for the variety of performance and capacity issues that customers must consider.
Posted June 29, 2011