Data Integration

Traditional approaches to combining disparate data types into a cohesive, unified view rely on manual coding and scripting. The need for Real-Time Business Intelligence and the ability to leverage a wider variety of data sources are driving companies to embrace new ways to achieve Data Integration, including Data Virtualization, Master Data Management, and Integration Automation.
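To make the "manual coding and scripting" approach concrete, here is a toy sketch of a hand-written integration script that merges two disparate sources (a CSV feed and a JSON feed) into one unified view. All file contents and field names are hypothetical examples, not from any product mentioned below.

```python
# Hand-coded data integration: join a CSV source and a JSON source on "id".
# The sources and fields below are invented for illustration.
import csv
import io
import json

csv_src = io.StringIO("id,name\n1,Acme\n2,Globex")
json_src = '[{"id": 1, "revenue": 100}, {"id": 2, "revenue": 250}]'

# Load the CSV rows into a dict keyed by integer id.
customers = {int(row["id"]): dict(row) for row in csv.DictReader(csv_src)}

# Enrich each record with fields from the JSON source.
for rec in json.loads(json_src):
    customers[rec["id"]]["revenue"] = rec["revenue"]

# The "unified view": one merged record per customer.
unified = sorted(customers.values(), key=lambda r: int(r["id"]))
print(unified[0])  # {'id': '1', 'name': 'Acme', 'revenue': 100}
```

Every new source or schema change means more code like this, which is exactly the maintenance burden that data virtualization and integration automation aim to remove.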



Data Integration Articles

At the most fundamental level, NoSQL and SQL databases perform the same core task: storing data to a storage medium and providing a safe, efficient way to retrieve it later. Sounds pretty simple, right? Well, it really is, with a little planning and research. Here's a simple checklist of five steps to consider as you embark on the world of NoSQL databases.

Posted October 22, 2014

We are in the midst of a business performance revolution, one where companies and customers alike expect instant access to the tools of commerce from anywhere at any time. Mobility is integral to this revolution, as the enterprise mobility phenomenon is quickly becoming a key driver of business innovation. Central to mobile success and improved business performance, however, is the business/IT collaboration that originates in the data center from easy access to information within the enterprise mainframe.

Posted October 22, 2014

Apache Hadoop has been a great technology for storing large amounts of unstructured data, but to do analysis, users still need to reference data from existing RDBMS based systems. This topic was addressed in "From Oracle to Hadoop: Unlocking Hadoop for Your RDBMS with Apache Sqoop and Other Tools," a session at the Strata + Hadoop World conference, presented by Guy Harrison, executive director of Research and Development at Dell Software, David Robson, principal technologist at Dell Software, and Kathleen Ting, a technical account manager at Cloudera and a co-author of O'Reilly's Apache Sqoop Cookbook.

Posted October 22, 2014

In his presentation at the Strata + Hadoop World conference, titled "Unseating the Giants: How Big Data is Causing Big Problems for Traditional RDBMSs," Monte Zweben, CEO and co-founder of Splice Machine, addressed the topic of scale-up architectures as exemplified by traditional RDBMS technologies versus scale-out architectures, exemplified by SQL on Hadoop, NoSQL and NewSQL solutions.

Posted October 22, 2014

Today, many companies still have most of their transactional data in relational database management systems which support various business-critical applications, from order entry to financials. But in order to maintain processing performance, most companies limit the amount of data stored there, making it less useful for in-depth analysis. One alternative, according to a recent DBTA webcast presented by Bill Brunt, product manager, SharePlex, at Dell, and Unisphere Research analyst Elliot King, is moving the data to Hadoop to allow it to be inexpensively stored and analyzed for new business insight.

Posted October 22, 2014

Kourier Integrator Release 4.2, the latest version of its enterprise integration and data management suite, includes improvements to the enterprise application integration (EAI) and extract-transform-load (ETL) capabilities of the product, as well as new support for SQL Server 2014, the latest version of Microsoft's RDBMS.

Posted October 22, 2014

Microsoft SQL Server 2014 finally went RTM (Released to Manufacturing) at the beginning of this month. Here's a look at the key new features within three major areas of enhancement: Mission-Critical Performance, Business Intelligence, and Hybrid Cloud.

Posted October 22, 2014

To help simplify the process for the user with self-service BI tools, Logi Analytics has announced the latest version of its business intelligence platform Logi Info. "Self-service has been around for a while, but it never seems to deliver on its promise. Largely, that is because we are mismatching people and their capabilities with the tool sets and information they need," explained Brian Brinkmann, VP of Product for Logi Analytics.

Posted October 21, 2014

MapR Technologies, one of the top ranked distributors for Hadoop, has announced that MapR-DB is now available for unlimited production use in the freely-downloadable MapR Community Edition. "From a developer standpoint, they can combine the best of Hadoop, which is deep predictive analytics across the data, as well as a NoSQL database for real-time operations," explained Jack Norris, chief marketing officer for MapR Technologies.

Posted October 21, 2014

Datameer has introduced Datameer 5.0 with Smart Execution, a technology that examines dataset characteristics, analytics tasks and available system resources to determine the most appropriate execution framework for each workload.

Posted October 21, 2014

At Strata + Hadoop World in New York, Microsoft announced an update to Microsoft Azure HDInsight, its cloud-based distribution of Hadoop. Customers can now process millions of Hadoop events in near real time, with Microsoft's preview of support for Apache Storm clusters in Azure HDInsight. In addition, as part of its integration with the Azure platform, Hortonworks announced that the Hortonworks Data Platform (HDP) has achieved Azure Certification.

Posted October 20, 2014

Rocket Software's DBMS and Application Servers division is now being led by Gary Gregory, vice president and general manager. Looking ahead, the two key words for Rocket MultiValue are "modernization" and "acceleration," said Gregory. "That is what we want to do - and continuous quality improvement is something we must have to enable those two objectives."

Posted October 20, 2014

Oracle has expanded its data integration portfolio with the addition of Oracle Enterprise Metadata Management, a platform to help organizations govern data across the enterprise including structured and unstructured data, and across Oracle and third-party data integration, database, and business analytics platforms. "This is the first time that we have made a comprehensive offering in the area of metadata management," said Jeff Pollock, vice president of product management for Oracle Data Integration.

Posted October 20, 2014

Revolution Analytics, a commercial provider of open source R software, has released Revolution R Open and Revolution R Plus.

Posted October 15, 2014

With Cloudera 5.2 the focus is on building products to deliver on the promise of the enterprise data hub that Cloudera introduced last year, said Clarke Patterson, senior director of product marketing at Cloudera. In particular, new capabilities make the technology more accessible to users who are not data scientists and also increase the level of security, two hurdles which can stand in the way of Hadoop adoption.

Posted October 15, 2014

Informatica PowerCenter v. 9.6.1 and Data Quality v. 9.6.1 have achieved Oracle Exadata Optimized and Oracle SuperCluster Optimized status through the Oracle PartnerNetwork (OPN). Customers can utilize Informatica PowerCenter and Data Quality to ingest, cleanse and transform various types of data into Oracle Exadata and Oracle SuperCluster to maximize the value of their engineered systems investment.

Posted October 15, 2014

The newest release of Oracle Exalytics In-Memory Machine, an engineered system for business analytics, includes Intel Xeon processors customized for Oracle business analytics workloads, delivering 50% more speed, 50% more processing cores, and 50% more memory compared to the previous generation. Oracle Database In-Memory has also been certified with Oracle Exalytics In-Memory Machine, expanding the scope of in-memory analytics to include the full capabilities of the Oracle Database.

Posted October 15, 2014

SAP SE has announced the SAP Cloud for Planning solution, an enterprise performance management (EPM) solution designed around user experience and built for the cloud. The SAP Cloud for Planning solution will be built natively on SAP HANA Cloud Platform, the in-memory platform-as-a-service (PaaS) from SAP.

Posted October 14, 2014

EMC and Pivotal have announced the Data Lake Hadoop Bundle 2.0, generally available today, which includes EMC's Data Computing Appliance (DCA), a high-performance big data computing appliance for deployment and scaling of Hadoop and advanced analytics; Isilon scale-out NAS (network attached storage); as well as the Pivotal HD Hadoop distribution and the Pivotal HAWQ parallel SQL query engine. The idea is to provide a turnkey offering that combines compute, analytics, and storage for customers building scale-out data lakes for enterprise predictive analytics.

Posted October 14, 2014

ParStream has introduced an analytics platform purpose-built for the speed and scale of the Internet of Things (IoT). The ParStream Analytics Platform is designed to scale to handle the massive volumes and high velocity of IoT data and is expected to help companies generate actionable insights by enabling analysis with greater flexibility and closer to the source.

Posted October 14, 2014

Building on its data lake approach, Pivotal today announced the next step in this vision with the implementation of an architecture that builds upon disk-based storage with memory-centric processing frameworks.

Posted October 14, 2014

In its first server announcement since completing the IBM System x server acquisition, Lenovo has announced plans to collaborate with VMware. This alliance extends the 16-year development relationship between System x and VMware and broadens the partnership to include the full range of Lenovo's expanded server business.

Posted October 14, 2014

MongoDB has introduced enhancements to MongoDB Management Service (MMS), a cloud service to simplify operations for MongoDB deployments and reduce operational overhead.

Posted October 14, 2014

Attunity has introduced Replicate 4.0 which provides high-performance data loading and extraction for Apache Hadoop. The solution has been certified with the Hortonworks and Cloudera Hadoop distributions.

Posted October 14, 2014

Two former Facebook engineers, Bobby Johnson and Lior Abraham, and former Intel engineer Ann Johnson have formed Interana to address what they say is an analytics void in event data. Espousing the philosophy that event data holds the key business metrics that companies care about most, Interana's solution is a database that has been specifically designed for event time data. In the past, many approaches relied on general-purpose systems that were not designed to answer the types of questions posed by event data and could take days to process it, according to the company.

Posted October 13, 2014

Splunk, which provides software for machine-generated big data analysis, has announced Splunk Enterprise 6.2, Splunk Mint, and Splunk Hunk 6.2. "What we are doing with this release is fundamentally broadening the number of users that can do advanced analytics," stated Shay Mowlem, VP, product marketing at Splunk.

Posted October 13, 2014

IBM is adding new analytics capabilities to the mainframe platform, helping enable better data security and providing clients with the ability to integrate Hadoop big data. By applying analytic tools to business transactions as they are occurring, mainframe systems can enable clients to have true real-time insights. With the analytics on the System z platform, clients can also incorporate social media into their real-time analytics.

Posted October 13, 2014

GT Software has added enhancements to its flagship Ivory Service Suite line, incorporating greater support for big data elements and messaging formats.

Posted October 13, 2014

IBM, which has made a billion-dollar investment to broaden the use of cognitive computing, is announcing the launch of Watson World HQ today at 51 Astor Place. IBM said it chose NYC's Silicon Alley for Watson World to tap into the ecosystem of talent and capital centered around New York University, Columbia University, CUNY and Cooper Union, as well as venture capital firms and an expanding tech startup and developer community. Starting now, Watson's cognitive services and tools will be available to all users of Bluemix, IBM's open, cloud-based platform for mobile and web app development.

Posted October 13, 2014

Teradata is introducing Teradata Loom 2.3, a platform that provides integrated metadata management, data lineage, and data wrangling for enterprise Hadoop. Teradata has also launched Teradata Cloud for Hadoop, a turnkey, full-service cloud environment, and a broad technology and marketing partnership with Cloudera. "Increasingly, customers want a one-stop shop for their data analytics needs," said Chris Twogood, vice president of products and services at Teradata.

Posted October 09, 2014

SAP and BI provider Birst have formed a partnership to provide analytics in the cloud on the SAP HANA Cloud Platform. This collaboration intends to bring together the next-generation cloud platform from SAP with Birst's two-tier data architecture to provide instant access to an organization's data and help eliminate BI wait time.

Posted October 09, 2014

One feature of the big data revolution is the acknowledgement that a single database management system architecture cannot meet all needs. However, the Lambda Architecture provides a useful pattern for combining multiple big data technologies to achieve multiple enterprise objectives. First proposed by Nathan Marz, it attempts to provide a combination of technologies that together can provide the characteristics of a web-scale system that can satisfy requirements for availability, maintainability, and fault-tolerance.
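The Lambda Architecture's core idea can be sketched in a few lines: a batch layer recomputes views from the immutable master dataset, a speed layer incrementally absorbs events that arrived since the last batch run, and the serving layer merges the two at query time. The sketch below uses plain Python dicts in place of Hadoop and a stream processor; all names are illustrative, not from Marz's implementation.

```python
# Minimal Lambda Architecture sketch: batch layer + speed layer + merged query.
# Here the "view" is just an event count per key; real systems would use
# Hadoop for the batch layer and a stream processor for the speed layer.

batch_view = {}      # precomputed from the immutable master dataset
realtime_view = {}   # incremental counts for events since the last batch run

def batch_recompute(master_dataset):
    """Batch layer: recompute the entire view from the master dataset."""
    view = {}
    for event in master_dataset:
        view[event] = view.get(event, 0) + 1
    return view

def speed_update(event):
    """Speed layer: absorb one new event into the real-time view."""
    realtime_view[event] = realtime_view.get(event, 0) + 1

def query(key):
    """Serving layer: merge batch and real-time views at query time."""
    return batch_view.get(key, 0) + realtime_view.get(key, 0)

# Events already absorbed by the last batch run...
batch_view = batch_recompute(["click", "click", "view"])
# ...plus a new event handled by the speed layer.
speed_update("click")
print(query("click"))  # 3
```

The fault-tolerance property comes from the batch layer: because views are always recomputable from the immutable master dataset, a bug or corruption in the speed layer is repaired automatically at the next batch recomputation.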

Posted October 08, 2014

The PASS Summit, put on each fall by the Professional Association for SQL Server, is the biggest SQL Server-specific event in the world, drawing many thousands of attendees. Its secret ingredient is the remarkable sense of camaraderie and overall friendliness of this professional association.

Posted October 08, 2014

Today, as never before, public sector agencies must become more proactive and operate more like private sector businesses, maintaining an immense amount of data in order to make "just in time" decisions as well as to forecast for the long term. This is where business intelligence (BI) tools come in handy: not only providing systems that facilitate the collection of data but, more importantly, providing a means to sift through the vast amounts of information for which the public sector is custodian.

Posted October 08, 2014

Organizations have been collecting data for years, but never before has there been such urgency to have it immediately available. The business need is pressing—decision makers need up-to-the-minute situational awareness in a volatile global economy.

Posted October 08, 2014

In his keynote address, Thomas Kurian, executive vice president, Product Development, Oracle, showcased a brand new product, Oracle Big Data Discovery - a visual face to Hadoop - "that allows you to go to a browser, profile the data, explore the data, analyze the data, and do prediction and correlation," he said.

Posted October 08, 2014

A goal at many organizations is to make the data that matters more broadly accessible to more users in a timely fashion. However, at the same time, enterprise data environments are becoming more complex and costly, running on many platforms, supporting many applications, and handling growing volumes of data. IT is expected to respond quickly to new initiatives that can support the business but as IT environments become more unwieldy, agility is an elusive concept.

Posted October 08, 2014

Oracle CTO Larry Ellison took the stage at the Moscone Center to kick off the Oracle OpenWorld conference on Sunday evening. As he does each year, Ellison outlined major announcements for the week and explained the company's technology vision to set the tone for the conference. Confirming analysts' predictions that 2014 would be the year of the cloud for Oracle, Ellison said 2014 is an inflection point for the company. Ellison's key announcement was Oracle's upgraded platform-as-a-service capability: a fourth generation upward-compatible database.

Posted October 08, 2014

Hadoop RDBMS provider Splice Machine is partnering with LucidWorks to enable Splice Machine customers to access and analyze their unstructured data via LucidWorks Search.

Posted October 08, 2014

Join DBTA for a webcast on Thursday, October 9, to learn about the key use cases, data replication strategies and methods for exploring data more efficiently through Hadoop.

Posted October 06, 2014

Approaching the fourth year of its planned five-year turnaround, HP today announced its intention to separate into two new publicly traded companies. One will include HP's enterprise technology infrastructure, software and services businesses, which will do business as Hewlett-Packard Enterprise, and the other will include HP's personal systems and printing businesses, which will operate as HP Inc. and retain the current logo.

Posted October 06, 2014

Aerospike Inc., provider of an in-memory NoSQL database, has announced a new startup special and trade-in program. The new program gives free access to the Enterprise Edition of Aerospike with no limits on nodes, TPS, or volume of data managed. To qualify, startups must have revenue of under $2 million and funding of under $20 million.

Posted October 03, 2014

Denodo Technologies has introduced Denodo Express, a free version of its data virtualization solution that provides the same technical features as the enterprise version of the Denodo platform. The only difference is that certain data restrictions apply when using the Express version.

Posted October 02, 2014

Mainframes represent some of the most important log data available, since they host the most mission-critical applications. However, according to the companies, the terabytes of data in more than 200 different log types produced by a typical mainframe system were previously inaccessible to Splunk software without significant work. Delivering on the technology alliance recently announced by Splunk and Syncsort to pull mainframe data into Splunk Enterprise and Splunk Cloud, Syncsort has introduced Ironstream.

Posted October 02, 2014

Cisco brought the fifth annual Data Virtualization Day to the Waldorf Astoria in New York City to share details about advancements coming in Cisco Information Server 7.0, the advantages of data virtualization, and the importance of the network. A key component of the 7.0 release of Cisco Information Server, which will ship next month, is the Business Directory, which will support greater access to data among more users on a self-service basis.

Posted October 01, 2014
