Data Integration

Traditional approaches to combining disparate data types into a cohesive, unified view rely on manual coding and scripting. The need for real-time business intelligence and the ability to leverage a wider variety of data sources are driving companies to embrace new ways to achieve data integration, including data virtualization, master data management, and integration automation.



Data Integration Articles

Seagate Technology, a provider of storage solutions, has introduced ClusterStor Hadoop Workflow Accelerator. The solution is expected to be a boon to computationally intensive high performance data analytics environments, enabling them to achieve a significant reduction in data transfer time.

Posted November 17, 2014

The U.S. Department of Energy has awarded IBM contracts valued at $325 million to develop and deliver advanced "data-centric" supercomputing systems at Lawrence Livermore and Oak Ridge National Laboratories. "This architecture is really part of a paradigm that addresses the big data challenge, one we hear about here at IBM all the time - which we call data-centric computing. We believe the value of a supercomputer is not only tied to petaflops but also to the speed of insights. We solve this particular challenge working with the labs through an open ecosystem leveraging technologies with our partners at the OpenPOWER Foundation, NVIDIA and Mellanox," said Tom Rosamilia, senior vice president, IBM Systems and Technology Group, during a webcast to announce the new supercomputing systems.

Posted November 14, 2014

Big data tool vendors try to downplay the notion that data warehouses and data marts still need to exist, even in a big data world. Relational DBMSs are painted as "old-fashioned," "yesterday," and "inadequate." They beckon potential customers to take a dip in the refreshing data lake. The fact that big data, in all of its glory, is only part of a larger business intelligence solution is getting lost in the dialog.

Posted November 12, 2014

We are now seeing a seismic increase in the significance of social network data for marketing and brand analysis. The next wave of social network exploitation promises to allow companies to narrowly target consumers and leads, to predict market trends, and to more actively influence consumer behavior.

Posted November 12, 2014

Rocket Software's DBMS and Application Servers division is now being led by Gary Gregory, vice president and general manager. Looking ahead, the two key words for Rocket MultiValue are "modernization" and "acceleration," said Gregory. "That is what we want to do - and continuous quality improvement is something we must have to enable those two objectives."

Posted November 12, 2014

Kourier Integrator Release 4.2, the latest version of the enterprise integration and data management suite, includes improvements to the enterprise application integration (EAI) and extract-transform-load (ETL) capabilities of the product, as well as new support for SQL Server 2014, the latest version of Microsoft's RDBMS.

Posted November 12, 2014

Melissa Data, a provider of contact data quality and data integration solutions, is partnering with Semarchy, a developer of master data management software and solutions. Through the partnership, Semarchy offers its users a fast and easy way to perform worldwide address enrichment, standardization, geocoding and verification using Melissa Data's data quality tools and services, while Melissa Data enables its customers to upgrade data quality projects, moving beyond tactical processes to engage in a more comprehensive strategy.

Posted November 12, 2014

Talend has introduced a new release of its integration platform. The 5.6 release sets new benchmarks for big data productivity and profiling, innovates in MDM with efficiency controls, and broadens Internet of Things (IoT) device connectivity.

Posted November 12, 2014

The increasing size and complexity of database environments is straining IT resources at many organizations. As data continues to proliferate within organizations, many companies are looking for more efficient methods to enable storage and analysis. Virtualization and the use of cloud technology, particularly private clouds, are two approaches coming to the fore.

Posted November 12, 2014

DBmaestro, a provider of DevOps for database solutions, has introduced TeamWork 4.5, which adds impact analysis for database deployment with SQL Server, in addition to its existing support for Oracle.

Posted November 11, 2014

Services provider Infosys has formed a strategic partnership with BI and analytics expert Tableau. Infosys will integrate Tableau's software into the solutions it deploys to help clients gain more value from big data. Infosys says it will use its global training facilities to increase the number of Tableau analytics experts in the company, ensuring that the benefits of business analytics are included within a wide range of client solutions across multiple industries.

Posted November 11, 2014

Attunity Ltd. has expanded its Attunity CloudBeam solution to support Amazon Web Services customers in moving data from Amazon Relational Database Service (Amazon RDS) to Amazon Redshift. Attunity has also been awarded Big Data Competency status in the AWS Partner Network (APN) Competency Program.

Posted November 11, 2014

Attachmate Corp. has announced the release of a mobile version of its terminal emulation software. "More and more corporate workers are shifting business productivity to their personal devices, as evidenced by the BYOD/BYOA trend. Reflection for UNIX enables IT administrators to leverage modern devices to accomplish real work," said Kris Lall, product manager for Reflection for UNIX at Attachmate.

Posted November 10, 2014

Big data and analytics workloads are placing greater demands on the enterprise, and creating the need for software defined infrastructure, says Bernie Spang, VP, Strategy, software-defined environments in IBM Systems & Technology Group. With the volume of data organizations are dealing with today, there is a need to optimize the compute and storage resources. "We can't do it with the traditional, manual, rigid IT environment of the past," said Spang.

Posted November 10, 2014

CA Technologies today announced an expanded alliance with Microsoft that will help broaden enterprise cloud utility across a multi-platform IT environment. The first solution, which provides dynamic storage for mainframe data on Microsoft Azure, is being previewed at CA World '14.

Posted November 10, 2014

Rocket Software has announced Rocket Data Virtualization version 2.1, a mainframe data virtualization solution for universal access to data, regardless of location, interface or format.

Posted November 10, 2014

At the most fundamental level, consider that at the end of the day NoSQL and SQL are essentially performing the same core task - storing data to a storage medium and providing a safe and efficient way to later retrieve said data. Sounds pretty simple - right? Well, it really is with a little planning and research. Here's a simple checklist of five steps to consider as you embark on your journey into the world of NoSQL databases.
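
The "same core task" point can be made concrete with a small sketch. This is illustrative only: the key-value "store" below is a plain Python dict standing in for a real NoSQL database, and sqlite3 stands in for a full relational server.

```python
import sqlite3

# SQL side: store and retrieve a record via a relational table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id TEXT PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users VALUES (?, ?)", ("u1", "Ada"))
sql_result = conn.execute(
    "SELECT name FROM users WHERE id = ?", ("u1",)
).fetchone()[0]

# NoSQL side: the same store/retrieve task with a key-value model,
# mimicked here by a dict (real stores add persistence, replication, etc.).
kv_store = {}
kv_store["user:u1"] = {"name": "Ada"}
nosql_result = kv_store["user:u1"]["name"]

print(sql_result, nosql_result)  # both paths retrieve the same value
```

The differences that matter in practice - schema flexibility, query expressiveness, consistency guarantees, scale-out behavior - sit on top of this shared store-and-retrieve core, which is exactly what the checklist below helps you weigh.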

Posted November 05, 2014

Attunity recently introduced the next generation of Attunity Maestro, its global-scale information flow management platform, to enable management and monitoring of its updated Attunity Replicate for Oracle.

Posted November 05, 2014

The call for speakers for Data Summit 2015 at the New York Hilton Midtown, May 11-13, 2015, is now officially open. The deadline for submitting proposals is December 5, 2014.

Posted November 05, 2014

Dell Software is collaborating with Microsoft to provide predictive analytics in a hybrid cloud setting and also upgrading its Statistica (formerly StatSoft) advanced analytics platform with enhanced big data capabilities through integration with Kitenga.

Posted November 05, 2014

Hadoop is one of the best-known technologies within the big data realm. However, deploying a Hadoop environment is not a simple task. To help address the challenges for prospective Hadoop customers, Cloudera, which offers analytic data management based on Apache Hadoop, and CenturyLink, which provides managed services in the cloud, have formed a partnership.

Posted November 04, 2014

NuoDB has introduced Swifts Release 2.1, which includes the first phase of its HTAP (Hybrid Transaction/Analytical Processing) capabilities. HTAP aims to provide real-time operational intelligence with the goal of allowing businesses to acquire immediate insights that they can then use to optimize their business processes.

Posted November 04, 2014

The Oracle Applications Users Group (OAUG), in partnership with the OAUG Hyperion SIG, is hosting its Connection Point - Hyperion Online educational series Nov. 17-19. The OAUG Connection Point - Hyperion Online features 18 training sessions on topics including budgeting and forecasting; financial management; data management; Oracle Hyperion Tax Provision; product improvements in Oracle Hyperion 11.1.2.3 and 11.1.2.3.500; as well as cloud, SaaS, on-premises and other deployment options.

Posted November 03, 2014

Azul Systems, a provider of Java runtime solutions, and DataStax, which provides DataStax Enterprise (DSE) built on Apache Cassandra, have formed a partnership to allow DSE customers to leverage Azul Zing. According to the companies, Zing is the best JVM for real-time Cassandra deployments, allowing Cassandra to operate more consistently by eliminating JVM-caused response time delays.

Posted November 03, 2014

The GridGain In-Memory Data Fabric has been accepted into the Apache Incubator program under the name "Apache Ignite." GridGain will continue to be a contributor to the Ignite code base while also adding enterprise-grade features to its commercial product. The platform's core code will be managed by the non-profit Apache Software Foundation (ASF).

Posted November 03, 2014

Melissa Data, a provider of global contact data quality and data enrichment solutions, has added global email, global phone verification, and U.S. property data enrichment services to its flagship Data Quality Components for SQL Server Integration Services (SSIS), a suite of custom data cleansing transformation components for Microsoft SSIS.

Posted October 30, 2014

Twitter and IBM have formed a new partnership to help improve organizations' understanding of their customers, markets and trends. The alliance brings together Twitter data with IBM's cloud-based analytics, customer engagement platforms, and consulting services. IBM says the collaboration will focus on three key areas.

Posted October 29, 2014

VMware has acquired the assets of Continuent. The Continuent team is joining VMware's Hybrid Cloud Business Unit. The acquisition offers "concrete benefits" to Continuent customers, said Robert Hodges, CEO of Continuent.

Posted October 29, 2014

While data warehouses have been the main data storage repository for companies since the 1970s, companies have begun to look on the horizon for what is next. To provide information about the key technologies, features, best practices, and pitfalls to consider when evaluating a data lake approach, Database Trends and Applications recently hosted a special roundtable webcast presented by Rich Reimer, VP of marketing and product management, Splice Machine; Rodan Zadeh, director of product marketing, Attunity; and George Corugedo, CTO and co-founder, RedPoint Global Inc.

Posted October 28, 2014

Within many companies' marketing departments there is a greater emphasis than ever before on using big data to make their products more appealing to customers. A major use for the data is to not only provide the best possible experience for the consumer, but to be able to provide it efficiently. Teradata's enhancements to the Teradata Integrated Marketing Cloud are aimed at improving digital asset management and performance, real-time interaction management, and use of data in real time.

Posted October 28, 2014

Platfora, which provides a big data analytics platform built natively on Hadoop and Spark, has introduced Platfora 4.0 with advanced visualizations, geo-analytics capabilities, and collaboration features to enable users with a range of skill levels to work iteratively with data at scale.

Posted October 28, 2014

Protegrity, a provider of data security solutions, has announced an expanded partnership with Hadoop platform provider Hortonworks. Protegrity Avatar for Hortonworks extends the capabilities of HDP native security with Protegrity Vaultless Tokenization (PVT) for Apache Hadoop, Extended HDFS Encryption, and the Protegrity Enterprise Security Administrator, for advanced data protection policy, key management and auditing.

Posted October 28, 2014

At SAP TechEd & d-code, SAP announced new innovations for the latest release of SAP HANA, the fall update of SAP HANA Cloud Platform, and a new SAP API Management technology.

Posted October 22, 2014

SAP SE has announced the SAP Cloud for Planning solution, an enterprise performance management (EPM) solution designed around user experience and built for the cloud. The SAP Cloud for Planning solution will be built natively on SAP HANA Cloud Platform, the in-memory platform-as-a-service (PaaS) from SAP.

Posted October 22, 2014

Attunity has introduced Replicate 4.0 which provides high-performance data loading and extraction for Apache Hadoop. The solution has been certified with the Hortonworks and Cloudera Hadoop distributions.

Posted October 22, 2014

SAP and BI provider Birst have formed a partnership to provide analytics in the cloud on the SAP HANA Cloud Platform. This collaboration intends to bring together the next-generation cloud platform from SAP with Birst's two-tier data architecture to provide instant access to an organization's data and help eliminate BI wait time.

Posted October 22, 2014

Oracle has expanded its data integration portfolio with the addition of Oracle Enterprise Metadata Management, a platform to help organizations govern data across the enterprise including structured and unstructured data, and across Oracle and third-party data integration, database, and business analytics platforms. "This is the first time that we have made a comprehensive offering in the area of metadata management," said Jeff Pollock, vice president of product management for Oracle Data Integration.

Posted October 22, 2014

We are in the midst of a business performance revolution, one where companies and customers alike expect instant access to the tools of commerce from anywhere at any time. Mobility is integral to this revolution, as the enterprise mobility phenomenon is quickly becoming a key driver of business innovation.

Posted October 22, 2014

Apache Hadoop has been a great technology for storing large amounts of unstructured data, but to do analysis, users still need to reference data from existing RDBMS based systems. This topic was addressed in "From Oracle to Hadoop: Unlocking Hadoop for Your RDBMS with Apache Sqoop and Other Tools," a session at the Strata + Hadoop World conference, presented by Guy Harrison, executive director of Research and Development at Dell Software, David Robson, principal technologist at Dell Software, and Kathleen Ting, a technical account manager at Cloudera and a co-author of O'Reilly's Apache Sqoop Cookbook.
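
The pattern a tool like Apache Sqoop automates - pulling rows out of an RDBMS and emitting delimited "part" files for Hadoop across parallel workers - can be sketched as follows. This is an illustration only: Sqoop itself is a Java CLI that runs the transfer as parallel MapReduce tasks, and here sqlite3 and in-memory buffers stand in for a real source database and HDFS.

```python
import csv
import io
import sqlite3

# Stand-in source RDBMS with a small table to export.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 9.5), (2, 3.25), (3, 7.0)])

num_mappers = 2  # Sqoop-style: split the table across N parallel mappers
parts = []
for m in range(num_mappers):
    # Each "mapper" takes its slice of the table, split on the id column.
    rows = conn.execute(
        "SELECT id, amount FROM orders WHERE id % ? = ?", (num_mappers, m)
    ).fetchall()
    buf = io.StringIO()
    csv.writer(buf).writerows(rows)
    parts.append(buf.getvalue())  # in HDFS this would be a part-m-0000N file

print(parts)
```

Splitting on a column (Sqoop's split-by column) is what lets each worker read a disjoint slice of the table concurrently, which is where the tool's bulk-transfer speed comes from.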

Posted October 22, 2014

In his presentation at the Strata + Hadoop World conference, titled "Unseating the Giants: How Big Data is Causing Big Problems for Traditional RDBMSs," Monte Zweben, CEO and co-founder of Splice Machine, addressed the topic of scale-up architectures as exemplified by traditional RDBMS technologies versus scale-out architectures, exemplified by SQL on Hadoop, NoSQL and NewSQL solutions.

Posted October 22, 2014

Today, many companies still have most of their transactional data in relational database management systems which support various business-critical applications, from order entry to financials. But in order to maintain processing performance, most companies limit the amount of data stored there, making it less useful for in-depth analysis. One alternative, according to a recent DBTA webcast presented by Bill Brunt, product manager, SharePlex, at Dell, and Unisphere Research analyst Elliot King, is moving the data to Hadoop to allow it to be inexpensively stored and analyzed for new business insight.

Posted October 22, 2014

To help simplify the process for the user with self-service BI tools, Logi Analytics has announced the latest version of its business intelligence platform Logi Info. "Self-service has been around for a while, but it never seems to deliver on its promise. Largely, that is because we are mismatching people and their capabilities with the tool sets and information they need," explained Brian Brinkmann, VP of Product for Logi Analytics.

Posted October 21, 2014

MapR Technologies, one of the top-ranked providers of Hadoop distributions, has announced that MapR-DB is now available for unlimited production use in the freely downloadable MapR Community Edition. "From a developer standpoint, they can combine the best of Hadoop, which is deep predictive analytics across the data, as well as a NoSQL database for real-time operations," explained Jack Norris, chief marketing officer for MapR Technologies.

Posted October 21, 2014
