Data Integration

Traditionally, combining disparate data types into a cohesive, unified, organized view has relied on manual coding and scripting. The need for Real-Time Business Intelligence and the ability to leverage a wider variety of data sources are driving companies to embrace new ways to achieve Data Integration, including Data Virtualization, Master Data Management, and Integration Automation.
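As an illustration of the hand-coded approach described above, a typical integration script extracts records from several sources, normalizes them to a common schema, and loads the unified view. This is a minimal sketch; the sources, field names, and sample data are all hypothetical.

```python
# Minimal hand-coded ETL sketch: unify customer records from two
# hypothetical sources (a CRM export and a billing export) into one view.

def extract():
    crm = [{"id": 1, "name": "Acme Corp", "email": "ops@acme.example"}]
    billing = [{"cust_id": 1, "balance": "1,200.50"}]
    return crm, billing

def transform(crm, billing):
    # Normalize field names and types to a common schema.
    balances = {row["cust_id"]: float(row["balance"].replace(",", ""))
                for row in billing}
    return [{"customer_id": c["id"],
             "name": c["name"],
             "email": c["email"],
             "balance": balances.get(c["id"], 0.0)}
            for c in crm]

def load(records, target):
    # Stand-in for an insert into a warehouse or unified store.
    target.extend(records)

unified_view = []
crm, billing = extract()
load(transform(crm, billing), unified_view)
```

Every new source means more extraction and mapping code of this kind, which is why tools that automate or virtualize these steps are gaining ground.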



Data Integration Articles

Melissa, a provider of global data quality and identity verification solutions, has launched Contact Zone, a comprehensive customer data management platform. Optimized for organizations that want to make trusted data available across the enterprise, Contact Zone combines Pentaho Data Integration (PDI) with Melissa's global data verification and enrichment tools. It offers extraction, transformation, and loading (ETL) capabilities along with data cleansing and enrichment functionality to help data stewards achieve a single customer view, break down information silos, improve data quality, and develop CRM and marketing strategies that boost revenue.

Posted April 11, 2017

IBM is bringing the cognitive capabilities of Watson to the service desk for a more intelligent workplace that anticipates, predicts, and acts to meet current and future tech support requirements. IBM's Workplace Support Service with Watson will add analytics and Watson's cognitive capabilities, learning from user behavior and improving over time. It can support individuals on any device, at any time and any location, and will offer faster resolution of IT issues, handling the majority of support tickets when integrated with other helpdesk automation functions.

Posted April 10, 2017

Software AG has updated its Zementis Predictive Analytics product to support IBM z Systems and Adabas and Natural applications and databases. Zementis supports artificial intelligence (AI) and machine learning models in batch or real-time transactions, which in turn delivers operational AI for fast-moving, big data applications.

Posted April 10, 2017

IBM has announced it is the first provider to make the NVIDIA Tesla P100 GPU accelerator available on the cloud. The combination of NVIDIA's acceleration technology with IBM's Cloud platform is intended to help organizations more efficiently run compute-heavy workloads, such as artificial intelligence, deep learning and high performance data analytics.

Posted April 10, 2017

OVH is acquiring VMware's vCloud Air business. Financial details of the transaction were not disclosed. The transaction is expected to close in calendar Q2 2017.

Posted April 10, 2017

MapR Technologies has released an updated version of the MapR Ecosystem Pack (MEP) program, a set of open source ecosystem projects that support applications running on the MapR Converged Data Platform with inter-project compatibility.

Posted April 10, 2017

To meet the demands of proactively managing infrastructure in the cloud, a new role has emerged: the "cloud keeper," part technologist, part accountant, and part administrator. The role carries financial responsibility for keeping infrastructure expenses under control and preventing financial chaos, and it is also technical, requiring an understanding of how and where resources are deployed. The cloud keeper must know how a resource is paid for and have enough expertise to judge which resources can be spun up or down, or which would be better suited to one cloud paradigm over another.

Posted April 07, 2017

Companies often view processes from their own frame of reference, "cutting" them up by department, business objective, or other internal lines. Customers, of course, do not act according to the same taxonomy. From the company's perspective, they appear to jump from process to process, department to department, and channel to channel, making it difficult for businesses to truly follow a customer through his or her whole journey.

Posted April 07, 2017

Make no mistake: Big data is promising, exciting, and effective—when done right. Once considered an overhyped buzzword, it's now a potential tool that leaders in every vertical want to harness. Unfortunately, the majority of new big data projects—about 55% of them, according to Gartner—are shuttered before they even get off the ground.

Posted April 07, 2017

There has been a sea change in how enterprises are thinking about Apache Hadoop and big data. Today, a majority of enterprises are thinking about the cloud first, not on-premises, and are increasingly relying on ecosystem standards to drive their Apache Hadoop distribution selection.

Posted April 07, 2017

Elastic and Google have formed a partnership to bring managed support of Elastic's open source search and analytics platform to Google Cloud Platform (GCP). The partnership will provide customers a managed open source search and analytics solution that leverages GCP's global network and scale.

Posted April 07, 2017

Voting has opened for the 2017 Database Trends and Applications Readers' Choice Awards. Unlike other awards programs that rely on our editorial staff's evaluations, the DBTA Readers' Choice Awards are unique in that the winning information management solutions are chosen by you - the people who actually use them.

Posted April 07, 2017

I always look forward to new research from Unisphere Research, a division of Information Today, Inc., publisher of this magazine and other great products for data professionals. The latest report, which you should read, is "SQL Server Transformation: Toward Agility & Resiliency 2017; PASS Database Management Survey."

Posted April 07, 2017

It often seems that working around things is a full-time task in every area of information technology. When workarounds are conceived and deployed, people are not always in agreement.

Posted April 07, 2017

There has been plenty of dialogue within the IT community about when to migrate to the cloud, how to migrate to the cloud, which provider offers customers the best cloud environments, and the due diligence or governance that is necessary before taking that big step. There are larger waves of change nipping at our heels, yet we seem content to continue discussing a technology that is a means to an end.

Posted April 07, 2017

Every good DBA understands that backing up their database data is a non-optional part of assuring data availability and integrity. As a DBA, you need to know the difference between a full image copy backup and an incremental image copy backup and implement the proper image copy backup strategy based on application needs and database activity.
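The distinction between the two backup types above can be sketched in a few lines. This is an illustrative model, not any particular database's utility: a full image copy captures every page of an object, while an incremental image copy captures only pages changed since the last copy, trading a smaller, faster backup for a longer restore chain. The page representation here is hypothetical.

```python
# Sketch of full vs. incremental image copy backups. A "page" is just
# a (page_id, contents) entry; changed_since_last_copy is the set of
# pages modified since the most recent image copy was taken.

def full_copy(pages):
    # A full image copy captures every page in the database object.
    return dict(pages)

def incremental_copy(pages, changed_since_last_copy):
    # An incremental copy captures only the changed pages, so it is
    # smaller and faster to take than a full copy ...
    return {pid: pages[pid] for pid in changed_since_last_copy}

def restore(full, incrementals):
    # ... but recovery must apply the full copy plus every subsequent
    # incremental copy, in order, to rebuild the current state.
    state = dict(full)
    for inc in incrementals:
        state.update(inc)
    return state

pages = {1: "a", 2: "b", 3: "c"}
base = full_copy(pages)
pages[2] = "b2"                                   # activity after the full copy
inc = incremental_copy(pages, changed_since_last_copy={2})
recovered = restore(base, [inc])
```

The trade-off this sketch exposes is exactly the one the strategy decision turns on: frequent incrementals suit high-activity objects, while periodic full copies keep the restore chain short.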

Posted April 07, 2017

There's a wide and growing acceptance that containers are replacing operating systems as the deployment target for application components. While application modules were previously designed to be installed upon a specific version of an operating system on a particular hardware platform, they are now increasingly being designed to run within a virtualized representation of an operating system—most frequently within a Docker container.

Posted April 07, 2017

While not the most media-hyped technology, databases are certainly one of the most crucial when it comes to our always-online, always-connected society. Databases power not just the applications and websites we use every day, but the businesses that generate revenue and fuel the economy. The internet relies on functioning and well-performing databases to operate.

Posted April 07, 2017

"Temporality" is a term that database managers know well, but it may be a new one for business managers. That has to change, as the temporality your database supports­—or, how it handles time—could be the difference between whether or not the business will increase revenue, pay a fine, or identify new opportunities. Especially important in this regard is "bitemporality," which is the ability to examine data across different points in time.

Posted April 07, 2017

It is difficult to find someone not talking about or considering using containers to deploy and manage their enterprise applications. A container just looks like another process running on a system; a dedicated CPU and pre-allocated memory aren't required in order to run a container. The simplicity of building, deploying, and managing containers is among the reasons that containers are growing rapidly in popularity.

Posted April 07, 2017

Hadoop continues to gain meaningful traction, and organizations are now preparing to bring their analytics and business intelligence workloads to the platform. At Data Summit 2017, Josh Klahr, vice president of products at AtScale, will discuss what enterprise users need to know to be successful with business intelligence on big data.

Posted April 06, 2017

Progress has announced the latest release of Progress OpenEdge, an application development platform that helps simplify the delivery of mission-critical business applications. The 11.7 release includes capabilities to improve availability, fortify applications through enhanced security, and keep accurate data flowing through the organization.

Posted April 06, 2017

Fujitsu and Oracle jointly announced the launch of Fujitsu SPARC M12, a new lineup of enterprise servers available worldwide. Featuring Fujitsu's new SPARC64 XII processor, the companies say, the Fujitsu SPARC M12 servers achieve the world's highest per CPU core performance in arithmetic processing, offering improvements for a range of database workloads, from mission-critical systems on premises to big data processing in the cloud.

Posted April 04, 2017

Cohesity, a provider of hyperconverged secondary storage, is receiving its largest funding to date, raising over $90 million in a Series C round co-led by investors GV (formerly Google Ventures) and Sequoia Capital, to expand sales and marketing to meet explosive customer demand. The investment will accelerate Cohesity's research and development of additional secondary storage use cases beyond data protection, with a special focus on analytics, test/dev, file services and object services.

Posted April 04, 2017

Teradata is launching an innovative database license model across hybrid cloud deployments, giving users more portability for deployment flexibility, subscription-based licenses, and simplified tiers with bundled features. With portable database licenses, Teradata customers can now have the flexibility to choose, shift, expand, and restructure their hybrid cloud environment by moving licenses between deployment options as their business needs change.

Posted April 04, 2017

Today's products and services connect to the world around their users. In an effort to help people support more devices and understand the Internet of Things space, Verizon launched ThingSpace, an IoT development platform that allows its enterprise customers to build and connect their own IoT applications.

Posted April 03, 2017

EnterpriseDB (EDB) has released the EDB Postgres Ark Database-as-a-Service (DBaaS) framework to the Amazon Web Services (AWS) Marketplace. EDB Postgres Ark enables customizable deployments of Postgres clusters to private and public clouds, such as AWS as well as Red Hat OpenStack.

Posted April 03, 2017

The Oracle Public Cloud Offering has achieved a series of compliance certifications and attestations for a number of core services.

Posted April 03, 2017

The top reasons for implementing Internet of Things projects include increasing new business revenue sources, increasing customer and product knowledge, and reducing operating expenses. DBTA recently held a roundtable webinar on how best to derive business value from IoT, featuring Kevin Petrie, senior director and technology evangelist at Attunity; Jamie Morgan, senior solutions architect at HPE Security - Data Security; and Becky Hanenkrat, consulting sales specialist, North America database and data warehousing, at IBM.

Posted March 31, 2017

GridGain Systems, provider of enterprise-grade in-memory computing platform solutions based on Apache Ignite, has obtained certifications from Hortonworks and Tableau and joined their technology partnership programs. GridGain says these relationships will make it easier for its customers to launch high performance big data systems built on Hortonworks that leverage in-memory computing and to visualize in-memory data held in GridGain using Tableau.

Posted March 31, 2017

In an effort to alleviate an impending critical shortage of developers, Cloud Foundry Foundation, an open source project whose stated purpose is to make Cloud Foundry the leading application platform for cloud computing, is launching a cloud-native developer certification initiative. The "Cloud Foundry Certified Developer" program will be delivered in partnership with The Linux Foundation. Training will be offered by companies, including SAP, IBM, and Pivotal, to help meet end user demand.

Posted March 30, 2017

The International Association of Cloud & Managed Service Providers (MSPAlliance) has formed a partnership with Ingram Micro Cloud to launch the MSPAlliance's MSP/Cloud Verify Program, an initiative to promote best practices and improve service delivery among the managed service provider (MSP) and cloud computing community.

Posted March 30, 2017

Thales, a provider of cybersecurity and data security solutions, is integrating its nShield hardware security module (HSM) with Chain, a provider of enterprise-grade blockchain infrastructure. The shared ledger model will offer the financial industry a new model for transacting between organizations that is more efficient and secure than legacy systems, according to Thales. Cost savings, faster transactions, and improved data quality add to the many benefits of the technology.

Posted March 30, 2017

Data analytics platform provider Looker closed an $81.5 million Series D funding round led by CapitalG, Alphabet's growth equity investment fund.

Posted March 30, 2017

Oracle has announced that Hearst has selected Oracle Cloud to provide its businesses with a common platform to accelerate business growth and global expansion.

Posted March 29, 2017

Qubole, which provides big data-as-a-service, has successfully completed the Service Organization Control (SOC) 2 Type II examination, confirming that Qubole meets stringent guidelines for data security.

Posted March 29, 2017

The Independent Oracle Users Group (IOUG) has represented the voice of data technologists and professionals for more than 20 years, and we are excited about how our community continues to grow and focus on peer-to-peer education and know-how. With that focus, we look forward to our premier yearly event: COLLABORATE 17 - IOUG Forum.

Posted March 29, 2017

There are many points in life where you may ask yourself whether it is better to build or buy. Think of a new house, a business, or an application. Regardless of the object of discussion, answering certain upfront questions can act as a guide to help you along the path to the right solution. Given the increasing importance, complexity, and breadth of database systems, the question of whether to build or to buy database monitoring is an important one to consider.

Posted March 29, 2017

Oracle has unveiled its Cloud Converged Storage, which, it says, represents the first time a public cloud provider at scale has integrated its cloud services with its on-premises, high performance NAS storage systems.

Posted March 29, 2017

Talend, a provider of cloud and big data integration software, has announced that the newest version of its Talend Data Fabric integration solution has been certified on the MapR Converged Data Platform, which includes MapR-FS, MapR-DB and MapR Streams.

Posted March 29, 2017

Kinetica, provider of an in-memory analytics database accelerated by GPUs, is partnering with Safe Software and has created FME connectors that read and write data between Kinetica and FME workspaces.

Posted March 29, 2017

Hazelcast, provider of an open source in-memory data grid (IMDG), has joined the Confluent Partner Program as a Technology Partner. Confluent provides a streaming platform based on Apache Kafka and designed its partner program to foster an ecosystem around Apache Kafka and Confluent.

Posted March 29, 2017

TIBCO Software Inc., a provider of software for integration, API management, and analytics, has announced availability of a new API management offering, TIBCO Mashery Professional, which is targeted at helping to speed up digital transformation activities.

Posted March 29, 2017

Jethro, provider of an index-based SQL enterprise platform, is launching Jethro 3.0, combining the power of indexing architecture with "auto-cubes" to accelerate all possible business intelligence use cases using big data.

Posted March 28, 2017

Splunk and New Relic are forming a strategic alliance and introducing a new integration to help enterprises improve customer experiences. The Splunk App for New Relic gives developers and IT operations teams a comprehensive view into both application performance and infrastructure health by sharing data across both Splunk and New Relic platforms.

Posted March 27, 2017

MicroStrategy is releasing an enhanced version of its signature platform, delivering a new set of APIs that will allow users to connect to almost any data source. MicroStrategy 10.7 also adds integrations with Natural Language Generation (NLG) providers Automated Insights and Narrative Science, letting users add Intelligent Narratives to their dashboards alongside their reports, graphs, and visualizations.

Posted March 27, 2017

Continuing the strong showing this year for tech IPOs, Alteryx, a provider of self-service data analytics software, rang the opening bell on Friday, March 24, in advance of its initial public offering on the NYSE under "AYX."

Posted March 24, 2017

Huawei, a global information and communications technology solution provider, has formed a new partnership with Software AG, a data center middleware vendor, to support the increasing demand of companies for Internet of Things (IoT) solutions. Huawei's Open IoT Platform and Network Infrastructure capabilities are being combined with Software AG's streaming analytics, hybrid enterprise integration and predictive analytics.

Posted March 23, 2017

It all used to be so simple. There were relational databases and they were all on-premises. But today it is an increasingly hybrid world, with virtualization and Hadoop arguably the source of the most dramatic technology shifts of the past 10 years. At Data Summit 2017, Unisphere Research lead analyst and DBTA contributor Joe McKendrick will moderate tracks on these two critical technology trends during Hadoop Day on Tuesday, May 16, and Virtualization Day on Wednesday, May 17.

Posted March 23, 2017
