Data Integration

Traditional approaches to combining disparate data types into a cohesive, unified, organized view rely on manual coding and scripting. The need for Real-Time Business Intelligence and the ability to leverage a wider variety of data sources are driving companies to embrace new approaches to Data Integration, including Data Virtualization, Master Data Management, and Integration Automation.
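As a minimal illustration of what such integration involves, the sketch below joins two hypothetical sources, a CSV export and a JSON feed, into a single unified view keyed on a shared id. All source names and fields here are invented for the example:

```python
import csv
import io
import json

# Two disparate sources: CRM records as CSV, web-signup records as JSON.
crm_csv = "id,name,email\n1,Ada,ada@example.com\n2,Grace,grace@example.com"
web_json = '[{"id": 2, "plan": "pro"}, {"id": 3, "plan": "free"}]'

# Normalize each source into a dict keyed on a common integer id.
crm = {int(r["id"]): r for r in csv.DictReader(io.StringIO(crm_csv))}
web = {r["id"]: r for r in json.loads(web_json)}

# Unified view: a full outer join on id, combining fields from both sources.
unified = {}
for key in sorted(crm.keys() | web.keys()):
    row = {"id": key}
    row.update({k: v for k, v in crm.get(key, {}).items() if k != "id"})
    row.update({k: v for k, v in web.get(key, {}).items() if k != "id"})
    unified[key] = row

print(unified[2])  # {'id': 2, 'name': 'Grace', 'email': 'grace@example.com', 'plan': 'pro'}
```

Tools in this space automate exactly this kind of schema alignment and key matching, which quickly becomes unmanageable when hand-coded across many sources.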



Data Integration Articles

Datawatch Corporation is partnering with IBM to deliver enhanced data access and self-service data preparation to IBM Watson Analytics and IBM Cognos Analytics users.

Posted March 21, 2016

For those whose brackets took a blow after just the first several games of the 2016 NCAA men's basketball championships, this year's tournament may provide a case in point on the value of analytics over sentiment.

Posted March 18, 2016

With well over a hundred open source projects now part of the Hadoop ecosystem, it can be hard to know which technologies are best for which requirements. To help users get started with Hadoop and understand their technology choices, James Casaletto will present "Harnessing the Hadoop Ecosystem" at Data Summit 2016 in NYC. Casaletto is a solutions architect for MapR, where he develops and deploys big data solutions with Apache Hadoop.

Posted March 17, 2016

Oracle has released a free and open API and developer kit for its Data Analytics Accelerator (DAX) in SPARC processors through its Software in Silicon Developer Program. "Through our Software in Silicon Developer Program, developers can now apply our DAX technology to a broad spectrum of previously unsolvable challenges in the analytics space because we have integrated data analytics acceleration into processors, enabling unprecedented data scan rates of up to 170 billion rows per second," said John Fowler, executive vice president of Systems, Oracle.

Posted March 16, 2016

As more businesses leverage applications that are hosted in the cloud, the lines between corporate networks and the internet become blurred. Accordingly, enterprises need to develop an effective strategy for ensuring security. The problem is, many of today's most common approaches simply don't work in this new cloud-based environment.

Posted March 16, 2016

Tesora has upgraded its DBaaS platform. The new release, Tesora DBaaS Platform Enterprise Edition 1.7, gives IT and database managers the ability to update OpenStack database guest images in real time, automates patch management, and provides more control over database replica placement.

Posted March 16, 2016

Melissa Data, a provider of contact data quality and integration solutions, has been awarded patent #9,262,475 by the U.S. Patent and Trademark Office (USPTO) for proximity matching technology used in its global data quality tools and services. The patent features an algorithm enabling distance criteria to be used in matching customer records, capitalizing on latitude, longitude, and proximity thresholds to help data managers eliminate duplicate customer contact data.

Posted March 16, 2016
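The proximity-matching idea described above can be sketched with the standard haversine great-circle formula. The record fields, threshold, and keep-first dedupe strategy below are illustrative assumptions, not Melissa Data's patented algorithm:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in km."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def dedupe_by_proximity(records, threshold_km=0.1):
    """Keep the first record of each cluster of records within threshold_km."""
    kept = []
    for rec in records:
        if all(haversine_km(rec["lat"], rec["lon"], k["lat"], k["lon"]) > threshold_km
               for k in kept):
            kept.append(rec)
    return kept

contacts = [
    {"name": "A. Smith", "lat": 40.7128, "lon": -74.0060},
    {"name": "A Smith",  "lat": 40.7129, "lon": -74.0061},  # ~15 m away: likely duplicate
    {"name": "B. Jones", "lat": 34.0522, "lon": -118.2437},
]
print(len(dedupe_by_proximity(contacts)))  # 2
```

Matching on physical distance catches duplicates that string comparison misses, such as the same address entered with different spellings or abbreviations.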

Available now, Talend says its Integration Cloud Spring '16 release adds enhancements to help IT organizations execute big data and data integration projects running on AWS Redshift or AWS Elastic MapReduce (EMR) with greater ease, using fewer resources and at a reduced cost.

Posted March 16, 2016

Percona, a provider of MySQL and MongoDB support, consulting, remote DBA, training, and software development, will host its annual Live Data Performance Conference, formerly known as the Percona Live MySQL Conference & Expo, in Santa Clara, California April 18-21.

Posted March 15, 2016

Tableau Software has acquired HyPer, a high performance database system that started as a research project in 2010 at the Technical University of Munich (TUM). As part of the technology acquisition, Tableau says it will add key technical personnel and also plans to establish a research and development center in Munich and expand its research into high performance computing.

Posted March 15, 2016

Analytics and the platforms that support big data are constantly evolving, being shaped by the need to deliver data faster to users and gain effective insights throughout the organization. Hadoop, Spark, Kafka, and the cloud are some of the technologies that can handle the demand the future will bring, according to Kevin Petrie, senior director at Attunity.

Posted March 14, 2016

Attivio is receiving $31 million in investment financing that will help expand the company as it accelerates its offerings into the big data market.

Posted March 09, 2016

In a new book titled "Next Generation Databases," Guy Harrison, an executive director of R&D at Dell, shares what every data professional needs to know about the future of databases in a world of NoSQL and big data.

Posted March 08, 2016

Data discovery is changing as a result of new data usage patterns and requirements, and is increasingly breaking out on its own as a separate and distinct discipline, according to John O'Brien, principal advisor and CEO, Radiant Advisors, who will present a talk at Data Summit 2016 titled "Enabling Governed Data Discovery in Modern Data Architectures."

Posted March 08, 2016

MapR Technologies has added new features to the MapR Converged Data Platform, including the ability to run stateful, containerized applications. A big issue today is that many organizations limit Docker to applications that don't require state or storage access, said Jack Norris, SVP of data and applications, MapR.

Posted March 08, 2016

Splunk Inc. is enhancing its security analytics portfolio by combining machine learning, anomaly detection, context-enhanced correlation, and rapid investigation capabilities in new versions of Splunk User Behavior Analytics (UBA) and Splunk Enterprise Security (ES).

Posted March 08, 2016

As more and more data comes into the enterprise, companies are looking to build real-time big data architectures to keep up with an increased amount of information.

Posted March 07, 2016

Syncsort is introducing new capabilities to its data integration software, DMX-h, that allow organizations to work with mainframe data in Hadoop or Spark in its native format, which is necessary for preserving data lineage and maintaining compliance.

Posted March 07, 2016

Oracle has introduced the StorageTek Virtual Storage Manager (VSM) 7 System, which provides mainframe and heterogeneous storage combined with the additional capability for automated tiering to the public cloud.

Posted March 07, 2016

The world of IT operations has always had a big data problem. Instrumentation of end users, servers, application components, logs, clickstreams, generated events and incidents, executed notifications and runbooks, CMDBs, Gantt charts—you name it, people in the IT operations area have had to cope with mountains of data. And yet the scope of the problem has been enlarged once again, thanks to industry-wide trends such as bring-your-own-device, the Internet of Things, microservices, cloud-native applications, and social/mobile interaction.

Posted March 03, 2016

Microsoft first truly disrupted the ETL marketplace with the introduction of SQL Server Integration Services (SSIS) in SQL Server 2005. Microsoft has upped the ante yet again by bringing powerful ETL features to the cloud via the Azure Data Factory, which enables IT shops to integrate a multitude of data sources, both on-premises and in the cloud, via a workflow (called a "pipeline") that utilizes Hive, Pig, and customized C# programs.

Posted March 03, 2016
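Conceptually, a pipeline of this kind is an ordered chain of activities, each consuming the previous activity's output. The Python sketch below is a generic illustration of that shape, not the Azure Data Factory API; all names are hypothetical:

```python
from typing import Callable, Iterable, List

# An activity transforms a stream of rows; a pipeline is an ordered list of them.
Activity = Callable[[Iterable[dict]], Iterable[dict]]

def run_pipeline(rows: Iterable[dict], activities: List[Activity]) -> List[dict]:
    """Feed each activity the previous activity's output, in order."""
    for activity in activities:
        rows = activity(rows)
    return list(rows)

# Example activities: filter to active rows, then project to two columns.
def only_active(rows):
    return (r for r in rows if r["active"])

def project(rows):
    return ({"id": r["id"], "region": r["region"]} for r in rows)

data = [
    {"id": 1, "region": "us", "active": True},
    {"id": 2, "region": "eu", "active": False},
]
result = run_pipeline(data, [only_active, project])
print(result)  # [{'id': 1, 'region': 'us'}]
```

A service like Data Factory expresses the same chain declaratively and schedules it across cloud and on-premises stores, with activities backed by engines such as Hive or Pig rather than in-process functions.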

Few of us working in the software industry would dispute that agile methodologies represent a superior approach to older waterfall-style development methods. However, many software developers would agree that older enterprise-level processes often interact poorly with the agile methodology, and long for agility at the enterprise level. The Scaled Agile Framework (SAFe) provides a recipe for adopting agile principles at the enterprise level.

Posted March 03, 2016

To spread the word on the best information management solutions in the marketplace, Database Trends and Applications has launched the 2016 DBTA Readers' Choice Awards, a program in which the winners will be selected by the experts whose opinions count above all others - you. Time is running out, so make your nomination now.

Posted March 02, 2016

Attunity Ltd. is releasing an enhanced version of its Attunity Compose platform to eliminate pitfalls in data warehousing and accelerate big data analytics.

Posted March 02, 2016

Our friends at Dell recently shared new research findings that shed light on how healthcare organizations perceive, plan for, and utilize key technologies across big data, cloud, mobility and security. According to the research, nearly half of healthcare organizations believe big data is relevant, but don't know how to approach it.

Posted March 01, 2016

Infobright, the columnar database analytics platform, has unveiled its new Infobright Approximate Query (IAQ) solution for large-scale data environments, allowing users to gain insights faster and more efficiently. "This technology is being delivered on the basis of rethinking the business problem and using technology in a very meaningful way to solve problems that would otherwise be unsolvable using a traditional approach," said Don DeLoach, CEO.

Posted February 26, 2016

Cloud-borne data is becoming commonplace—at least at the edges of the enterprise. Organizations are relying, both formally and informally, on cloud-based services for supplemental storage, file sharing, and content management. The challenge now is to bring core enterprise data into the cloud, to render data ranging from financials to sales to performance analytics as services.

Posted February 24, 2016

The days of the one-size-fits-all, all-purpose database are over, and today there is a growing realization that different data management systems offer different benefits and that some are better suited for certain requirements than others. In a special report, DBTA asks MultiValue vendors: What are the current pressures your customers are facing, and how are you helping to extend and leverage their critical MultiValue systems to meet those new requirements?

Posted February 24, 2016

There is no denying that cloud-based software and computing services are now accepted as the norm. This shift has profound implications for how software applications are architected, delivered, and consumed, and has ushered in a new generation of technology and an entirely new category of data integration. Today, there are 10 new requirements for an enterprise data integration technology:

Posted February 24, 2016

SnapLogic is unveiling its Winter 2016 release of its flagship platform, further connecting users to flexible Spark capabilities and reliable big data integration solutions.

Posted February 23, 2016

The promise of the data lake is an enduring repository of raw data that can be accessed now and in the future for different purposes. To help companies on their journey to the data lake, Information Builders has unveiled the iWay Hadoop Data Manager, a new solution that provides an interface to generate portable, reusable code for data integration tasks in Hadoop.

Posted February 23, 2016

Information Builders, a provider of business intelligence (BI) and analytics, information integrity, and data integration solutions, is releasing three editions of its platform, allowing users to gain reporting and analytics insight from a range of users and environments. "The reason we are bringing out these three tiers of product is to make it easier for people to buy the right level of software to accomplish the right analytics and BI missions that they have," said Jake Freivald, vice president of product marketing for Information Builders.

Posted February 22, 2016

It is hard to think of a technology that is more identified with the rise of big data than Hadoop. Since its creation, the framework for distributed processing of massive datasets on commodity hardware has had a transformative effect on the way data is collected, managed, and analyzed, and has also grown well beyond its initial scope through a related ecosystem of open source projects. With 2016 recognized as the 10-year anniversary for Hadoop, Big Data Quarterly chose this time to ask technologists, consultants, and researchers to reflect on what has been achieved in the last decade, and what's ahead on the horizon.

Posted February 18, 2016

TIBCO Software, a provider of solutions for integration, analytics and event processing, has added new features to its TIBCO Jaspersoft embedded analytics and reporting software. Jaspersoft 6.2 provides new collaborative functionality that allows casual users to build and customize their own reports. If desired, the same reports can also be routed to IT for refinement, where greater complexity can be edited at the API-level.

Posted February 18, 2016

GridGain Systems, a provider of an in-memory data fabric based on Apache Ignite, has raised $15 million in Series B financing.

Posted February 18, 2016

Every company is undoubtedly concerned about keeping outside attackers away from its sensitive data, but understanding who has access to that data from within the organization can be an equally challenging task. The goal of every attacker is to gain privileged access. An excessively privileged user account can be used as a weapon of destruction in the enterprise, and if a powerful user account is compromised by a malicious attacker, all bets are off.

Posted February 17, 2016

Currently, the IT industry is in the midst of a major transition as it moves from the last generation, the internet generation, to the new generation of cloud and big data, said Andy Mendelsohn, Oracle's EVP of Database Server Technologies, who recently talked with DBTA about database products that Oracle is bringing to market to support customers' cloud initiatives. "Oracle has been around a long time. This is not the first big transition we have gone through," said Mendelsohn.

Posted February 17, 2016

Oracle has expanded its cloud offerings in the UK with the introduction of new PaaS and IaaS technologies to be hosted in its Slough data center. The new services include Oracle Database Cloud Service, Oracle Dedicated Compute Cloud Service, Oracle Big Data Cloud Service and Oracle Exadata Cloud Service. The facility currently serves more than 500 UK and global customers with SaaS and IaaS offerings tailored for private corporations and public sector customers.

Posted February 17, 2016

Databricks, the company behind Apache Spark, has announced the beta release of Databricks Community Edition, a free version of the cloud-based big data platform. First available as an invite-only beta, the Databricks Community Edition will be opened to the broader community over the coming months, with general availability planned for late Q2 2016.

Posted February 17, 2016

Hewlett Packard Enterprise has added AppPulse Trace, a new module in its application performance monitoring software (APM) suite, to help developers identify and fix issues at their source, down to the exact line of code and server.

Posted February 17, 2016

Over the last half decade, we've watched SQL purists butt heads with NoSQL upstarts, Hadoop triumphalists clash with Hadump pessimists, database geeks war with application developers, and so on. In the midst of all this warring, we've tried to fit—and, in many cases, to cram—the new into the old, the old into the new, with the result that at one time or another, we've asked the impossible of all of the components in our ever-expanding technology portfolios.

Posted February 16, 2016

Enterprises can't seem to pack enough big data and analytics solutions into their data centers, executive suites, and everywhere else across their organizations. Just about every venture-capital-cash-fueled startup from Silicon Valley to Boston has an analytics component to it. As these firms rapidly gain traction, they are being scarfed up by larger vendors looking to solidify their leadership of the analytics space.

Posted February 16, 2016

Hewlett Packard Enterprise (HPE) has selected the RedPoint Data Management platform as the underlying platform for a new HPE Risk Data Aggregation and Reporting (RDAR) integrated solution to support financial institutions' compliance with BCBS 239.

Posted February 16, 2016

Addressing the shift toward business-user-oriented visual interactive data preparation, Trillium Software has launched a new solution that integrates self-service data preparation with data quality capabilities to improve big data analytics.

Posted February 16, 2016
