Data Integration

Traditionally, combining disparate data types into a cohesive, unified, organized view has involved manual coding and scripting. The need for Real-Time Business Intelligence and the ability to leverage a wider variety of data sources is driving companies to embrace new ways to achieve Data Integration, including Data Virtualization, Master Data Management, and Integration Automation.



Data Integration Articles

For those whose brackets took a blow after just the first several games of the 2016 NCAA men's basketball championship, this year's tournament may provide a case in point on the value of analytics over sentiment.

Posted March 18, 2016

Oracle has released a free and open API and developer kit for its Data Analytics Accelerator (DAX) in SPARC processors through its Software in Silicon Developer Program. "Through our Software in Silicon Developer Program, developers can now apply our DAX technology to a broad spectrum of previously unsolvable challenges in the analytics space because we have integrated data analytics acceleration into processors, enabling unprecedented data scan rates of up to 170 billion rows per second," said John Fowler, executive vice president of Systems, Oracle.

Posted March 16, 2016

As more businesses leverage applications that are hosted in the cloud, the lines between corporate networks and the internet become blurred. Accordingly, enterprises need to develop an effective strategy for ensuring security. The problem is, many of today's most common approaches simply don't work in this new cloud-based environment.

Posted March 16, 2016

Tesora has upgraded its DBaaS platform. The new release, Tesora DBaaS Platform Enterprise Edition 1.7, gives IT and database managers the ability to update OpenStack database guest images in real time, automates patch management, and provides more control over database replica placement.

Posted March 16, 2016

Melissa Data, a provider of contact data quality and integration solutions, has been awarded patent #9,262,475 by the U.S. Patent and Trademark Office (USPTO) for proximity matching technology used in its global data quality tools and services. The patent features an algorithm enabling distance criteria to be used in matching customer records, capitalizing on latitude, longitude, and proximity thresholds to help data managers eliminate duplicate customer contact data.

Posted March 16, 2016
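The patented algorithm itself is not detailed in the announcement above; as an illustration only, distance-criteria matching of the kind described can be sketched with a haversine great-circle check. The record fields and the 0.5 km threshold here are assumptions, not details from Melissa Data's patent.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def proximity_duplicates(records, threshold_km=0.5):
    """Flag pairs of geocoded contact records that fall within threshold_km
    of each other and are therefore candidate duplicates."""
    dupes = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            a, b = records[i], records[j]
            if haversine_km(a["lat"], a["lon"], b["lat"], b["lon"]) <= threshold_km:
                dupes.append((a["name"], b["name"]))
    return dupes
```

In practice a production matcher would combine a proximity test like this with name and address similarity scores rather than rely on distance alone.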

Talend says its Integration Cloud Spring '16 release, available now, adds enhancements to help IT organizations execute big data and data integration projects running on AWS Redshift or AWS Elastic MapReduce (EMR) with greater ease, using fewer resources and at reduced cost.

Posted March 16, 2016

Percona, a provider of MySQL and MongoDB support, consulting, remote DBA, training, and software development, will host its annual Live Data Performance Conference, formerly known as the Percona Live MySQL Conference & Expo, in Santa Clara, California April 18-21.

Posted March 15, 2016

Tableau Software has acquired HyPer, a high performance database system that started as a research project in 2010 at the Technical University of Munich (TUM). As part of the technology acquisition, Tableau says it will add key technical personnel and also plans to establish a research and development center in Munich and expand its research into high performance computing.

Posted March 15, 2016

Attivio is receiving $31 million in investment financing that will help expand the company as it accelerates its offerings into the big data market.

Posted March 09, 2016

In a new book titled "Next Generation Databases," Guy Harrison, an executive director of R&D at Dell, shares what every data professional needs to know about the future of databases in a world of NoSQL and big data.

Posted March 08, 2016

MapR Technologies has added new features in the MapR Converged Data Platform, including the ability to run stateful, containerized applications. A big issue today is that many organizations limit Docker to applications that don't require state and don't require storage access, said Jack Norris, SVP data and applications, MapR.

Posted March 08, 2016

Splunk Inc. is enhancing its security analytics portfolio by combining machine learning, anomaly detection, context-enhanced correlation, and rapid investigation capabilities in new versions of Splunk User Behavior Analytics (UBA) and Splunk Enterprise Security (ES).

Posted March 08, 2016

Syncsort is introducing new capabilities to its data integration software, DMX-h, that allow organizations to work with mainframe data in Hadoop or Spark in its native format, which is necessary for preserving data lineage and maintaining compliance.

Posted March 07, 2016

Oracle has introduced the StorageTek Virtual Storage Manager (VSM) 7 System, which provides mainframe and heterogeneous storage combined with the additional capability for automated tiering to the public cloud.

Posted March 07, 2016

The world of IT operations has always had a big data problem. Instrumentation of end users, servers, application components, logs, clickstreams, generated events and incidents, executed notifications and runbooks, CMDBs, Gantt charts—you name it, people in the IT operations area have had to cope with mountains of data. And yet the scope of the problem has been enlarged once again, thanks to industry-wide trends such as bring-your-own-device, the Internet of Things, microservices, cloud-native applications, and social/mobile interaction.

Posted March 03, 2016

Microsoft first truly disrupted the ETL marketplace with the introduction of SQL Server Integration Services (SSIS) in the release of SQL Server 2005. Microsoft has upped the ante yet again by bringing powerful ETL features to the cloud via the Azure Data Factory, which enables IT shops to integrate a multitude of data sources, both on-premises and in the cloud, via a workflow (called a "pipeline") that utilizes Hive, Pig, and customized C# programs.

Posted March 03, 2016

Few of us working in the software industry would dispute that agile methodologies represent a superior approach to older waterfall-style development methods. However, many software developers would agree that older enterprise-level processes often interact poorly with the agile methodology, and long for agility at the enterprise level. The Scaled Agile Framework (SAFe) provides a recipe for adopting agile principles at the enterprise level.

Posted March 03, 2016

Attunity Ltd. is releasing an enhanced version of its Attunity Compose platform to eliminate pitfalls in data warehousing and accelerate big data analytics.

Posted March 02, 2016

Our friends at Dell recently shared new research findings that shed light on how healthcare organizations perceive, plan for, and utilize key technologies across big data, cloud, mobility and security. According to the research, nearly half of healthcare organizations believe big data is relevant, but don't know how to approach it.

Posted March 01, 2016

Infobright, the columnar database analytics platform, has unveiled its new Infobright Approximate Query (IAQ) solution for large-scale data environments, allowing users to gain insights faster and more efficiently. "This technology is being delivered on the basis of rethinking the business problem and using technology in a very meaningful way to solve problems that would otherwise be unsolvable using a traditional approach," said Don DeLoach, CEO.

Posted February 26, 2016

Cloud-borne data is becoming commonplace—at least at the edges of the enterprise. Organizations are relying, both formally and informally, on cloud-based services for supplemental storage, file sharing, and content management. The challenge now is to bring core enterprise data into the cloud, to render data ranging from financials to sales to performance analytics as services.

Posted February 24, 2016

There is no denying that cloud-based software and computing services are now accepted as the norm. This change has profound implications for how software applications are architected, delivered, and consumed, and it has ushered in a new generation of technology and an entirely new category in data integration. Today, there are 10 new requirements for an enterprise data integration technology.

Posted February 24, 2016

SnapLogic is unveiling the Winter 2016 release of its flagship platform, further connecting users to flexible Spark capabilities and reliable big data integration solutions.

Posted February 23, 2016

The promise of the data lake is an enduring repository of raw data that can be accessed now and in the future for different purposes. To help companies on their journey to the data lake, Information Builders has unveiled the iWay Hadoop Data Manager, a new solution that provides an interface to generate portable, reusable code for data integration tasks in Hadoop.

Posted February 23, 2016

Information Builders, a provider of business intelligence (BI) and analytics, information integrity, and data integration solutions, is releasing three editions of its platform, enabling reporting and analytics insight for a range of users and environments. "The reason we are bringing out these three tiers of product is to make it easier for people to buy the right level of software to accomplish the right analytics and BI missions that they have," said Jake Freivald, vice president of product marketing for Information Builders.

Posted February 22, 2016

It is hard to think of a technology that is more identified with the rise of big data than Hadoop. Since its creation, the framework for distributed processing of massive datasets on commodity hardware has had a transformative effect on the way data is collected, managed, and analyzed, and has also grown well beyond its initial scope through a related ecosystem of open source projects. With 2016 recognized as the 10-year anniversary for Hadoop, Big Data Quarterly chose this time to ask technologists, consultants, and researchers to reflect on what has been achieved in the last decade, and what's ahead on the horizon.

Posted February 18, 2016

TIBCO Software, a provider of solutions for integration, analytics, and event processing, has added new features to its TIBCO Jaspersoft embedded analytics and reporting software. Jaspersoft 6.2 provides new collaborative functionality that allows casual users to build and customize their own reports. If desired, the same reports can also be routed to IT for refinement, where more complex edits can be made at the API level.

Posted February 18, 2016

GridGain Systems, a provider of an in-memory data fabric based on Apache Ignite, has raised $15 million in Series B financing.

Posted February 18, 2016

Every company is undoubtedly concerned about keeping outside attackers away from its sensitive data, but understanding who has access to that data from within the organization can be an equally challenging task. The goal of every attacker is to gain privileged access. An excessively privileged user account can be used as a weapon of destruction in the enterprise, and if a powerful user account is compromised by a malicious attacker, all bets are off.

Posted February 17, 2016

Currently, the IT industry is in the midst of a major transition as it moves from the last generation - the internet generation - to the new generation of cloud and big data, said Andy Mendelsohn, Oracle's EVP of Database Server Technologies, who recently talked with DBTA about database products that Oracle is bringing to market to support customers' cloud initiatives. "Oracle has been around a long time. This is not the first big transition we have gone through," said Mendelsohn.

Posted February 17, 2016

Oracle has expanded its cloud offerings in the UK with the introduction of new PaaS and IaaS technologies to be hosted in its Slough data center. The new services include Oracle Database Cloud Service, Oracle Dedicated Compute Cloud Service, Oracle Big Data Cloud Service and Oracle Exadata Cloud Service. The facility currently serves more than 500 UK and global customers with SaaS and IaaS offerings tailored for private corporations and public sector customers.

Posted February 17, 2016

Databricks, the company behind Apache Spark, has announced the beta release of Databricks Community Edition, a free version of the cloud-based big data platform. First available as an invite-only beta rollout, the Databricks Community Edition will be opened to the broader community over the coming months, with general availability planned for late Q2 2016.

Posted February 17, 2016

Hewlett Packard Enterprise has added AppPulse Trace, a new module in its application performance monitoring software (APM) suite, to help developers identify and fix issues at their source, down to the exact line of code and server.

Posted February 17, 2016

Hewlett Packard Enterprise (HPE) has selected the RedPoint Data Management platform as the underlying platform for a new HPE Risk Data Aggregation and Reporting (RDAR) integrated solution to support financial institutions' compliance with BCBS 239.

Posted February 16, 2016

Addressing the shift toward business-user-oriented visual interactive data preparation, Trillium Software has launched a new solution that integrates self-service data preparation with data quality capabilities to improve big data analytics.

Posted February 16, 2016

Glassbeam, a machine data analytics company, has introduced two product enhancements for the IoT analytics market. The new capabilities are aimed at automating the transformation of unstructured machine data into business insights and also providing a lighter footprint for Glassbeam at the edge.

Posted February 16, 2016

ClearStory Data has announced an update to its data inference and data harmonization capabilities called Infinite Data Overlap Detection (IDOD). With this R&D innovation, ClearStory's Spark-based analytics solution now detects and infers data patterns and customer-specific data types across all values of the data a user connects to as part of an analysis.

Posted February 12, 2016

The Internet of Things holds great promise for everything from better healthcare to decreased traffic accidents and more efficient manufacturing processes. Michael Morton is currently CTO at Dell Boomi, which he joined in 2013 after a career with IBM where he became an IBM Master Inventor and worked directly with Fortune 100 companies. Recently, Morton talked with BDQ about some of the opportunities and challenges and the role that Boomi plays in the emerging IoT market.

Posted February 11, 2016

Say what you will about Oracle, it certainly can't be accused of failing to move with the times. Typically, Oracle comes late to a technology party but arrives dressed to kill.

Posted February 10, 2016

The ability to stand up and voice your opinion about a solution or technology that will not solve a business problem is critical. At times, choices are made for budgetary reasons, and other times, it may be organizational pressure. But regardless of the reason, as IT professionals, we need to be courageous and do the right thing, whatever that is.

Posted February 10, 2016

Oracle has a new PartnerNetwork (OPN) Cloud Program that is aimed at helping organizations accelerate the growth of their Oracle Cloud business. Providing technical and go-to-market support, Oracle says the new four-tier program will provide Oracle partners with the tools they need to help Oracle customers navigate their transition to the cloud.

Posted February 10, 2016

SolarWinds, a provider of hybrid IT infrastructure management software, is adding improved support for Oracle Database 12c Enterprise Edition in the latest release of SolarWinds Database Performance Analyzer. With the additional support for Oracle Database 12c, the SolarWinds tool now pinpoints efficiency issues and optimizes performance of Oracle pluggable databases in a multitenant environment through tuning, metric visibility and resource correlation to help ensure the availability and speed of business-critical applications.

Posted February 10, 2016

Oracle has introduced a new Big Data Preparation Cloud Service. Despite the increasing talk about the need for companies to become "data-driven," and the perception that people who work with business data spend most of their time on analytics, Oracle contends that in reality many organizations devote much more time and effort to importing, profiling, cleansing, repairing, standardizing, and enriching their data.

Posted February 10, 2016
