Data Integration

Traditional approaches to Data Integration, the process of combining disparate data types into a cohesive, unified view, involve manual coding and scripting. The need for Real-Time Business Intelligence and the ability to leverage a wider variety of data sources are driving companies to embrace new ways to achieve Data Integration, including Data Virtualization, Master Data Management, and Integration Automation.



Data Integration Articles

RapidMiner is releasing self-service analytics for Hadoop, offering a solution that accelerates time to value for building and deploying advanced analytic models with numerous types of data. "Our value is around accelerating time to value for our customers. We do this through pre-built models which we call accelerators," stated Michele Chambers, president and COO at RapidMiner.

Posted February 17, 2015

Tamr, Inc., has introduced a new version of its scalable data unification platform that lets enterprises continuously deliver better data to decision-makers. In a related announcement, Tamr also unveiled two data-unification solutions for specific, high-value data variety challenges: Tamr for CDISC and Tamr for Procurement. Founded in 2013 by big data entrepreneurs Andy Palmer and Michael Stonebraker, Tamr is based in Cambridge, Mass.

Posted February 17, 2015

N1QL is a distributed query engine with powerful secondary index options and incremental map/reduce views. Join DBTA for a special webcast sponsored by NoSQL vendor Couchbase to look at the query engine details and indexing capabilities. The webcast will take place Thursday, February 19, 2015, at 11 am PT/2 pm ET, and one attendee will win a $100 American Express gift card during the live event!

Posted February 17, 2015

Hadoop heavyweight Pivotal is open sourcing components of its Big Data Suite, including Pivotal HD, HAWQ, Greenplum Database, and GemFire; forming the Open Data Platform (ODP), a new industry foundation along with founding members GE, Hortonworks, IBM, Infosys, SAS, and other big data leaders; and forging a new business and technology partnership with Hortonworks.

Posted February 17, 2015

The IOUG (Independent Oracle Users Group) is joining DBTA as a partner at Data Summit 2015, a comprehensive conference focused on information management and big data for technology and database professionals. The IOUG track at the Data Summit will focus on big data in the cloud and the evolution of the data warehouse.

Posted February 12, 2015

Contending that big data is meaningless to an organization if it is not correct and complete, Trillium Software has unveiled Trillium Big Data, a data quality solution that extends the capabilities of the flagship Trillium Software System to the Hadoop big data environment. "Trillium Big Data is the second in a series of planned new product launches for Trillium Software in 2015 which will help organizations increase the value of their data," said Phil Galati, CEO at Trillium Software.

Posted February 12, 2015

Unisphere Research and Radiant Advisors have announced the publication of a brand new report on the emerging concepts and strategies surrounding the data lake. "We wrote The Definitive Guide to the Data Lake to provide guidance to those considering the data lake by sharing the findings of companies within our research and advisory network that are actively implementing data lake strategies today," said John O'Brien, CEO of Radiant Advisors.

Posted February 11, 2015

In the early days of data warehousing, the lines were clearly drawn. The data warehouse reflected the business. In providing this reflection, data was summarized, data was cleansed, data was standardized, and data was even drastically reformatted for legibility and reporting usage. But the big rule of thumb, the never-to-be-crossed line, was that the data warehouse did not create new data.

Posted February 11, 2015

It's already a great time to be a DBA or to consider becoming one. U.S. News & World Report ranks the DBA profession as the No. 5 best IT job and the No. 12 best professional job overall for 2014. The job outlook is very strong at 15% growth year-over-year, far greater than the U.S. economy in general and the IT industry in particular. For added context, consider that the business world is now enamored with data, analytics, and data visualization.

Posted February 11, 2015

I woke up early to get to the airport to ensure a timely check-in. When I got to the desk, the airline representative asked for my name and destination, keyed it in and we waited for the system to respond. The next words out of her mouth were, "Sorry, the system is slow today, you know how it is."

Posted February 11, 2015

There are always new buzzwords coming along. But whether you call it "SMAC" or "CAMS," there is no doubt that today the confluence of trends—analytics, cloud, social, and mobile—is proving to be a disruptive force that is causing many to reassess their approaches to data management. Over the years, MultiValue technologies have evolved and adapted, pushing boundaries in order to integrate with new data sources and targets, address new analytics needs, and keep pace with emerging requirements. This has enabled customers to continue to rely on their trusted, and often highly specialized, MultiValue applications and data management systems.

Posted February 11, 2015

Attunity recently added new capabilities to its solution suite with the acquisition of BIReady's data warehouse automation technology, which eliminates the complex, manual tasks of preparing data for BI and big data analytics. Lawrence Schwartz, Attunity's vice president of marketing, spoke with DBTA about BIReady and other Attunity solutions for customers dealing with big data projects.

Posted February 11, 2015

The "Internet of Things" (IoT) is opening up a new world of data interchange between devices, sensors, and applications, enabling businesses to monitor, in real time, the health and performance of products long after they leave the production premises. At the same time, enterprises now have access to valuable data—again, in real time if desired—on how customers are adopting products and services.

Posted February 11, 2015

There has been no shortage of disturbing accounts of data breaches and system hacks across many of the world's organizations. A recent survey of 353 data managers and professionals, members of the Independent Oracle Users Group, finds that enterprises are well aware of the risks they may encounter and are bracing for the next potential onslaught of data security threats. More than one-third say their organizations are vulnerable and likely to be hit by an incident, up from 20% in a similar survey conducted in 2008.

Posted February 11, 2015

Oracle unveiled the sixth-generation Oracle Exadata Database Machine during a launch event led by Oracle executive chairman of the board and CTO Larry Ellison. In a recent interview, Tim Shetler, vice president of Exadata Product Management at Oracle, and Juan R. Loaiza, senior vice president, Systems Technologies, discussed the significance of key innovations, including elastic configurations, the new Non-Volatile Memory Express flash protocol, and new capacity on demand licensing, that make it one of the most important releases in the history of Exadata.

Posted February 11, 2015

Zumasys, a provider of cloud computing solutions for business-critical software applications and ERP systems, has acquired the jBASE database technology from Temenos, a Geneva, Switzerland-based provider of banking software systems. "jBASE was the industry's first database-independent solution. Its more contemporary architecture allows Pick-based applications to natively interact with the underlying Windows or UNIX operating system, and store data in SQL Server, Oracle, and the cloud, which fits perfectly with our vision for the future," noted Paul Giobbi, president of Zumasys.

Posted February 11, 2015

Teradata is introducing new big data applications that incorporate recently acquired capabilities and technologies from Revelytix and Think Big Analytics. The new offerings include solutions targeted to specific verticals such as retail and healthcare, new Teradata Loom 2.4 capabilities to expand the depth and breadth of metadata in the data lake, and a new fixed price/fixed time-frame data lake optimization service offering. The new products and services are targeted at extending big data competency to more non-data scientist users and helping companies gain additional value from their data lake projects, said Chris Twogood, vice president of product and services marketing at Teradata.

Posted February 11, 2015

Hitachi Data Systems Corporation (HDS), a subsidiary of Hitachi, Ltd., has announced plans to acquire Pentaho Corporation, a big data integration and business analytics company with an open source-based platform for diverse big data deployments. According to Hitachi, the acquisition helps to fulfill its strategy of delivering business innovations that integrate machine data, information technology, and analytics to distill value from big data and the Internet of Things. The acquisition of Pentaho is expected to be complete by June 2015.

Posted February 10, 2015

Attivio has introduced Attivio 4.3, a single platform for search and discovery initiatives. The result, the company says, is that organizations with structured and unstructured data in disparate silos will be able to gain faster access to all information for practical business uses.

Posted February 10, 2015

Lavastorm Analytics, a data management and analytics software company, has added new functionality within the Lavastorm Analytics Engine platform that enables business analysts who have a limited knowledge of complex data science to deliver insights using predictive analytics.

Posted February 10, 2015

MemSQL has introduced the MemSQL Spark Connector. According to the vendor, the combination of an in-memory database from MemSQL and Spark's memory optimized processing framework gives enterprises the benefit of fast access to transactions, ETL, and analytics. The MemSQL Spark Connector is also available as an open source offering, providing developers the ability to adapt it to their needs.

Posted February 10, 2015

Hadoop has continued its growth and become part of the consciousness of decision makers dealing with big data. However, Hadoop is still too advanced for the typical business user to work with. To help make it easier, Oracle has created Big Data Discovery, a product that aims to simplify Hadoop for the average business user.

Posted February 10, 2015

Cloud computing has continued to become more and more prevalent among businesses. Addressing the increased demand for the cloud approach, Trillium Software, a Harte Hanks company and global provider of enterprise data quality solutions, has announced Trillium Cloud, a new service platform that provides organizations with an enterprise data quality solution, consumable via a managed public cloud environment. "We have made our entire Trillium stack software system available via the cloud," stated Will Schanz, senior vice president of Cloud Solutions at Trillium Software.

Posted February 10, 2015

In a recent DBTA webcast, the topic of moving from relational to NoSQL was explored with Shane Johnson, senior product marketing manager at Couchbase. Change can be hard because people typically like familiarity, but change usually becomes necessary because of need, and in this case, it is the need for flexibility that NoSQL provides.

Posted February 10, 2015

Splice Machine has announced that it has achieved Hortonworks Data Platform (HDP) certification by completing the required integration testing. As a result of the certification, Splice Machine customers can leverage the pre-built and validated integrations between enterprise technologies and HDP, an open source Hadoop distribution, to simplify and accelerate their Splice Machine and Hadoop deployments.

Posted February 05, 2015

ERP change management company Panaya has announced a cloud-based electronic software update (ESU) automation service for Oracle's JD Edwards. The new service is intended to reduce the risk, cost, and time required in implementing JD Edwards ESUs by identifying what will be impacted in existing environments and what needs to be tested.

Posted February 04, 2015

In the second quarter of fiscal year 2015, quarterly sales for Oracle Enterprise Resource Planning Cloud (Oracle ERP Cloud) and Oracle Enterprise Performance Management Cloud (Oracle EPM Cloud) increased by 80%, according to Oracle.

Posted February 04, 2015

For decades, data management was part of a clear and well-defined mission in organizations. Data was generated from transaction systems, then managed, stored, and secured within relational database management systems, with reports built and delivered to business decision makers' specs. In 2015, we will see the acceleration of 7 dramatic shifts in data management.

Posted February 04, 2015

During a live event, Larry Ellison, Oracle's executive chairman of the board and CTO, outlined a new strategy for reducing customer costs and increasing value with the company's next generation of engineered systems. In the presentation today, Ellison emphasized two key points.

Posted February 04, 2015

DataStax, provider of an enterprise distribution of Apache Cassandra, has acquired Aurelius LLC, provider of the open source graph database Titan.

Posted February 03, 2015

VoltDB, which provides an in-memory, scale-out SQL database, is releasing version 5.0 of its software. "Developers are in need of better tools with which to develop fast data streaming applications with real-time analytics and decision making across industries," said Bruce Reading, president and CEO of VoltDB.

Posted February 03, 2015

Wizeline, a provider of data-driven product intelligence solutions that help companies build successful products, has announced the availability of Wizeline Starter, a free version of its software platform. The new release eliminates the need for multiple tools, allowing product managers to streamline a product's entire development cycle on a single platform and cut development time.

Posted February 03, 2015

Rackspace has announced support for Microsoft SQL Server 2014 In-Memory Online Transaction Processing (OLTP) and AlwaysOn Availability. With Rackspace DBA Services, customers can now use Rackspace database administrators (DBAs) to migrate or upgrade existing workloads to SQL Server 2014 In-Memory OLTP on the Rackspace Managed Cloud.

Posted February 03, 2015

SnapLogic, which provides iPaaS (integration platform as a service), has released a new service to speed the process by which time-sensitive data is delivered to cloud and big data applications. "Ultra Pipelines are a breakthrough for enterprise companies with real-time data-driven insights needs such as sales and marketing programs, data science for predictive analysis, and customer data distributed across multiple systems," says Niraj Nagrani, vice president of engineering at SnapLogic.

Posted February 03, 2015

SAP has announced SAP Business Suite 4 SAP HANA (SAP S/4HANA). The new product is built on the in-memory platform SAP HANA and is designed leveraging the SAP Fiori user experience (UX) for mobile devices. The announcement was made at a launch event at the New York Stock Exchange.

Posted February 03, 2015

MongoDB has introduced MongoDB 3.0. According to Kelly Stirman, director of products at MongoDB, the new NoSQL database release introduces two key innovations that make the system well suited for the demanding requirements of large enterprises.

Posted February 03, 2015

Quantum Corp., a provider of storage solutions, announced three new solutions that integrate the cloud into multi-tier, hybrid storage architectures for demanding data workloads. "Effective data management is about putting data in the right place at the right time with the right technology, guided by a customer's workflow or application," said Geoff Stedman, senior vice president of StorNext Solutions for Quantum.

Posted February 02, 2015

Cisco Systems, Inc. is launching a new software licensing strategy aimed at consolidating enterprise cloud initiatives into a single solution set. The new licensing strategy, Cisco ONE Enterprise Cloud Suite, is an engineered software offering intended to deliver a hybrid-ready private cloud.

Posted February 02, 2015

IBM reported its fourth-quarter and full-year 2014 results, which reflect a tough environment for the computing giant, buffeted by uncertain global markets and shifting technology paradigms. IBM reported revenues of $92.8 billion for the year, down 6% from the previous year. Net income for 2014 was $15.8 billion, compared with $16.9 billion the prior year, a decrease of 7%.

Posted February 02, 2015

ISUG-TECH has announced that the content and schedule for the 2015 ISUG-TECH Conference are now available online. The conference will be held March 29-April 2 at the Hilton Atlanta in Atlanta, Georgia.

Posted January 29, 2015

Cloud Foundry Foundation, a platform-as-a-service (PaaS) open source project, has been launched as an independent non-profit foundation. The Cloud Foundry Foundation will be managed as a Linux Foundation Collaborative Project and operate under a system of open governance created by a team of open source experts from founding Platinum Members EMC, HP, IBM, Intel, Pivotal, SAP, and VMware. "SAP is committed to contributing to open-source PaaS technologies, including integration of technologies such as the SAP HANA® platform and Internet of Things as part of SAP HANA Cloud Platform," said Björn Goerke, executive vice president and corporate officer, Product and Innovation Technology, SAP.

Posted January 29, 2015

SAP SE has collaborated with Raab Associates, Inc., a marketing technology consultant, to offer a free online tool that provides marketers with personalized recommendations on focus areas of investment for their organization. The marketing gap analysis tool asks a series of questions about the organization's profile, marketing programs, current systems, and processes. The platform then evaluates the information to provide an analysis of the organization's current marketing technology foundation.

Posted January 29, 2015

CA Technologies announced that its customer experience monitoring technology has been added as part of the "SAP Extended Diagnostics by CA Technologies" application. The SAP Extended Diagnostics application is intended to enable users to monitor the performance of applications from the business process down to the transaction-component level in real time.

Posted January 29, 2015

In 2014, the big data drumbeat continued to pound, major DBMS vendors expanded their product offerings, Microsoft hired a new CEO, and a range of new technology offerings were introduced. In retrospect, what stands out?

Posted January 29, 2015

Espresso Logic is adding data virtualization for mobile and web app back-end developers to its reactive programming-based back-end-as-a-service (BaaS). "We have a very unique technology that we have developed called reactive programming. As a user, whenever the spreadsheet is changed it will always recalculate itself and give you a result. Up until now, no one has really used this method for back-end development," explained R. Paul Singh, CEO of Espresso Logic.

Posted January 29, 2015

The data integration status quo is predicated on a model of data-at-rest. The designated final destination for data-at-rest is (and, at least for the foreseeable future, will remain) the data warehouse (DW). Traditionally, data of a certain type was vectored to the DW from more or less predictable directions—viz., OLTP systems, or flat files— and at the more or less predictable velocities circumscribed by the limitations of the batch model. Thanks to big data, this is no longer the case.

Posted January 28, 2015

Rackspace has announced new support for Microsoft SQL Server 2014 In-Memory Online Transaction Processing (OLTP) and AlwaysOn Availability. Rackspace now powers more than 10,000 compute instances running Microsoft SQL Server on its private and public clouds. By combining In-Memory OLTP with Managed High Availability, Rackspace says it can help customers speed up OLTP workloads by up to 30x while enabling greater uptime.

Posted January 28, 2015
