Data Integration

Traditional approaches to Data Integration, the process of combining disparate data types into a cohesive, unified view, rely on manual coding and scripting. The need for Real-Time Business Intelligence and the ability to leverage a wider variety of data sources are driving companies to embrace new ways to achieve Data Integration, including Data Virtualization, Master Data Management, and Integration Automation.



Data Integration Articles

Pivotal Software, Inc., a cloud-native platform provider, is releasing Pivotal Spring Runtime, a comprehensive support package for Java environments. Pivotal Spring Runtime supports Java workloads running on Linux and Windows server environments including server-side apps running on bare metal, VMs, containers, or Kubernetes. Pivotal Application Service includes Pivotal Spring Runtime, so Java workloads running on the app platform are already covered.

Posted May 28, 2019

JetStream Software Inc., a provider of cloud data protection, is receiving Series A funding of $7.7 million to advance the company's unique technologies. The company's platform enables managed service providers (MSPs) and cloud service providers (CSPs) to deliver disaster recovery as a service (DRaaS) and continuous data protection to enterprise customers, government agencies, and research and educational organizations.

Posted May 28, 2019

Broadcom has introduced a value-based software licensing model designed to provide increased clarity regarding software consumption on z/OS mainframe systems. The new Mainframe Consumption Licensing (MCL) model is intended to give mainframe customers better visibility into and predictability of their software spending while enabling them to maximize the value created for their end customers.

Posted May 28, 2019

Following up on IBM's announcement of Tailored Fit Pricing for IBM Z, John McKenny, VP of Strategy for ZSolutions Optimization at BMC, commented on the significance of the new offerings in a BMC blog post. BMC supports the new Tailored Fit Pricing model because it will help an increasing number of enterprises continue to grow and build new services on top of their mainframes as well as re-ignite interest in using the platform as an integral part of today's hybrid cloud strategies, McKenny noted.

Posted May 28, 2019

IBM has unveiled new services and capabilities for IBM Z, which, the company says, will further position it as a center point of a secured hybrid cloud strategy. According to Ross Mauri, general manager for IBM Z, who made the announcement in an IBM Infrastructure blog post, the new offerings include three major elements. IBM sees "secured hybrid and multicloud as the future of enterprise IT, and IBM Z is at the center," said Mauri.

Posted May 28, 2019

GigaSpaces, the provider of InsightEdge, has achieved Red Hat OpenShift Operator Certification for the InsightEdge In-Memory Computing Platform. Red Hat OpenShift's cloud-agnostic support, combined with GigaSpaces' multi-region and cloud replication module, can improve the efficiency of enterprises' cloud and multi-cloud migration initiatives by helping to optimize bandwidth and reduce data transfer costs between regions and cloud providers.

Posted May 28, 2019

NetApp, a provider of hybrid cloud data services, is releasing NetApp ONTAP 9.6, the new midrange, end-to-end NVMe AFF A320 storage system, and an expanded portfolio of services.

Posted May 28, 2019

Dynatrace, a software intelligence company, is providing support for Red Hat OpenShift 4, the next generation of Red Hat's enterprise Kubernetes platform. Dynatrace's Software Intelligence platform automatically monitors and analyzes containers and the microservices running inside of them across the entire Red Hat OpenShift 4 Kubernetes environment and the underlying multi-cloud infrastructure, with no blind spots.

Posted May 28, 2019

Red Hat is contributing to Microsoft KEDA, a new open source project aimed at providing an event-driven scale capability for any container workload. Using KEDA, Red Hat is putting Azure Functions on top of its OpenShift Container Platform in Developer Preview. Azure Functions is designed to behave the same way it does when running on Azure as a managed service, but can now run anywhere OpenShift runs, including on the hybrid cloud and on-premises.

Posted May 28, 2019

Due to exponentially growing data stores, organizations today are facing slowdowns and bottlenecks at peak processing times, with queries taking hours or days. Some complex queries simply cannot be executed. Data often requires tedious and time-consuming preparation before queries can be run. David Leichner, CMO, SQream, demonstrated how the power of GPUs can help conquer these challenges with his presentation, "Accelerating Analytics in a New Era of Data," during Data Summit 2019.

Posted May 24, 2019

DataOps is still a relatively new concept; it combines people, processes, and technology to build and enhance data analytics. At Data Summit 2019, Kevin Petrie, Senior Director of Marketing, Attunity, Inc., explored the five key steps necessary to be successful with DataOps, including the process and cultural shift required, during his session, "Five Key Requirements for DataOps Success."

Posted May 24, 2019

As Data Summit 2019 comes to a close, John O'Brien, principal advisor and CEO, Radiant Advisors, looked at lessons learned from this year's conference during his closing keynote. Companies are not lacking in technology options; in most cases, more advanced technologies exist than can be absorbed into the organization all at once.

Posted May 23, 2019

Syncsort, the provider of Big Iron to Big Data software, is introducing Connect CDC, a new real-time change data capture and data replication product that enables organizations to stream application data. By making all mission-critical application data accessible in real time, organizations can empower business users across the enterprise to make decisions based on information when it's most relevant and valuable.

Posted May 23, 2019

Accenture and SAP are co-developing and co-innovating to accelerate the development of the SAP C/4HANA platform. The initiative, called Project Elevate, includes the formation of industry consortia with key market leaders in automotive original equipment manufacturers (OEM), business-to-business component manufacturing and utilities to help define and design these industry-specific experiences.

Posted May 22, 2019

SAP is introducing new SAP HANA Cloud services, aiming to bring the power and performance of the SAP HANA database to the cloud. SAP HANA Cloud Services and SAP Cloud Platform aim to provide customers access to all SAP and third-party application data, reduce data duplication and offer a single point for security and governance.

Posted May 22, 2019

OpenText, a provider of Enterprise Information Management (EIM) solutions, is offering new OpenText content services that will be delivered through SAP Cloud Platform and other SAP solutions. The content services are planned to become available for applications such as SAP S/4HANA Cloud, SAP C/4HANA Cloud, and SAP SuccessFactors.

Posted May 22, 2019

Mindtree, a global technology services and digital transformation company, is launching QuikDeploy, an IP-driven approach to helping customers maximize their use of SAP Solution Manager. This flexible industry accelerator is tailor-made to rapidly deploy SAP S/4HANA into the Microsoft Azure cloud.

Posted May 22, 2019

As we move into a new computing era, technology has outstripped society in its ability to up-end labor markets and industries, as well as personal lives. In a panel discussion at Data Summit 2019, Sue Feldman, president, Synthexis, David Bayer, Executive Director, Cognitive Computing Consortium, Tom Wilde, CEO, Indico, and Steven Cohen, COO, co-founder, Basis Technology, explored the ethical and legal issues that computing advances have raised.

Posted May 22, 2019

AI is already having a significant impact for the U.S. government, including defense and intelligence community use cases, and is also providing game-changing capabilities for global enterprises in a range of industries, including financial services, life sciences, and technology. In a presentation at Data Summit 2019 titled "Solving Business Problems in Government, Financial Services," Amy Guarino, COO, Kyndi, offered real-world examples of how AI is driving measurable benefits in a range of industry sectors, and discussed the importance of explainable AI to regulated industries like financial services and healthcare, where being able to justify the reasoning behind algorithmic decisions is essential.

Posted May 22, 2019

There are new technologies that contribute to the speed and scale of a modern data platform. But as data size and complexity increase with Big Data, data quality and data integration issues must still be addressed. At Data Summit 2019, Prakriteswar Santikary, VP & Global Chief Data Officer, ERT, discussed how to create the architecture of a modern, cloud-based, real-time data integration and analytics platform that ingests any type of clinical data (structured, unstructured, binary, lab values, etc.) at scale from any data source, during his presentation, "Designing a Fast, Scalable Data Platform."

Posted May 22, 2019

Container usage is now being adopted by organizations of all sizes, from small startups to companies with huge, established microservices platforms. At Data Summit 2019, Jeff Fried, director of product management, InterSystems and BA Insight, MIT, and Joe Carroll, product specialist, InterSystems, presented their session, "Understanding Database Containerization," focusing on helping practitioners navigate the minefield of database containerization and avoid some of the major pitfalls that can occur.

Posted May 22, 2019

At Data Summit 2019, Jay Benedetti, global solutions director of CloverDX, explained the importance of achieving a 360-degree view of the customer and how that begins with a successful integration strategy. Benedetti shared two case studies that CloverDX worked on, one in B2B and another in B2C, helping organizations achieve a panoramic view of the customer to meet their business goals and enabling them to bring in more data as they grow in the future.

Posted May 22, 2019

DataOps is an emerging set of practices, processes, and technologies for building and enhancing data and analytics pipelines to better meet the needs of the business. The list of failed big data projects is long. They leave end users, data analysts, and data scientists frustrated with long lead times for changes.

Posted May 22, 2019

In a keynote at Data Summit 2019 titled "Digital Transformation Is Business Transformation: How to Incorporate AI Technology Into a 130-Year-Old Company," Helena Deus, technology research director, Elsevier, showcased how the company combines content and data with analytics and technology to help researchers to make new discoveries and have more impact on society, and clinicians to treat patients better and save more lives.

Posted May 22, 2019

The second day of Data Summit 2019 opened up with "Data and Donuts," a presentation featuring Paul Wolmering, VP worldwide sales engineering, Actian Corporation. His discussion focused on next-generation cloud data warehousing and what it takes to deliver insights from real-time data economically and at scale with hybrid data regardless of location, in the cloud, on-premises or both.

Posted May 22, 2019

Enterprise agility isn't a single initiative but rather a collection of activities and technologies that lead toward that goal. This includes adoption of microservices, containers, and Kubernetes to increase the flexibility of systems, applications, and data by releasing them from underlying hardware. In addition, practices such as DevOps are helping to increase the level of collaboration possible for fast-moving enterprises.

Posted May 22, 2019

At Data Summit 2019, Matthew Deyette, chief customer officer, presented a keynote titled "The Evolution of Big Data Analytics." Data availability for AI, regardless of what the goal is, is a critical problem, said Deyette.

Posted May 21, 2019

In 2017, The Economist declared that the world's most valuable resource is no longer oil but is instead data, said Lee Levitt, business strategist at Oracle, during a keynote at Data Summit 2019. Information is driving huge advantage, said Levitt, noting that extensive use of customer analytics has a large impact on corporate performance, and successful companies outperform their competitors across the full customer lifecycle.

Posted May 21, 2019

A.M. Turing Award Laureate and database technology pioneer Michael Stonebraker delivered the welcome keynote at Data Summit 2019, titled "Big Data, Technological Disruption, and the 800-Pound Gorilla in the Corner."

Posted May 21, 2019

Data is flowing into organizations from a previously unimaginable array of sources and at unprecedented speed and volume. This means that the challenges of cleaning, deduplicating, and integrating data are increasing.

Posted May 21, 2019

With edge computing becoming the next big thing, AI on-the-edge is quickly following suit. It unlocks a whole new world of possibilities, including predicting customer needs before they even know them. To tap into this opportunity, organizations don't need to choose a risky "all in" approach; a small, iterative approach reduces the risk while ensuring edge AI projects align with the overall business strategy.

Posted May 21, 2019

The customer experience (CX) is prime territory for employing AI technologies. Built upon mobile use and location data, bot services help customers with a variety of services, ranging from in-room amenities and dinner reservations to event tickets.

Posted May 21, 2019

The world of data management and administration is rapidly changing as organizations digitally transform. In a presentation at Data Summit 2019, Craig S. Mullins, president & principal consultant, Mullins Consulting, Inc., looked at how database management systems are changing and adapting to modern IT needs.

Posted May 21, 2019

Depending on who you talk to, AI will either enable massive productivity gains from your employees or replace them entirely. All the debating aside, AI is coming, and companies need to understand how to harness it.

Posted May 21, 2019

Traditional architecture and technologies and newer big data approaches each offer advantages. In a session at Data Summit 2019, titled "Designing a Data Architecture for Modern Business Intelligence & Analytics," Richard Sherman, managing partner, Athena IT Solutions, looked at the current state of analytics and what needs to change.

Posted May 20, 2019

It was hard enough to manage IT infrastructures when everything was on-premise only. But today, with combined on-premise and cloud deployments, the challenge is even greater, say Michael Corey, co-founder, LicenseFortress, and Don Sullivan, system engineer database specialist, VMware. In their Data Summit 2019 presentation, "Straight Talk on the Cloud License Landscape," Corey and Sullivan walked attendees through the steps to take to stay compliant with software licensing rules, how to avoid an audit, and what to do if one happens.

Posted May 20, 2019

At Data Summit 2019, Susan E. Feldman, president, Synthexis, and David Bayer, executive director, Cognitive Computing Consortium, offered an overview of key cognitive computing considerations in a workshop titled "Cognitive Computing 101."

Posted May 20, 2019

Data science, the ability to sift through massive amounts of data to discover hidden patterns and predict future trends, may be in demand, but it requires an understanding of many different elements of data analysis. Extracting actionable knowledge from all your data to make decisions and predictions requires a number of skills, from statistics and programming to data visualization and business domain expertise.

Posted May 20, 2019

DataOps is a modern engineering practice that can improve the speed and accuracy of analytics. In a pre-conference workshop at Data Summit 2019, Mark Marinelli, head of product at Tamr, identified the processes, technologies, and roles involved in DataOps to improve the speed and availability of data for analytics—as well as the common pitfalls to avoid. 

Posted May 20, 2019

Machine learning (ML) is on the rise at businesses hungry for greater automation and intelligence with use cases spreading across industries. At the same time, most projects are still in the early phases. From selecting data sets and data platforms to architecting and optimizing data pipelines, there are many success factors to keep in mind.

Posted May 20, 2019

Apache Airflow is turning heads these days. It integrates with many different systems and it is quickly becoming as full-featured as anything that has been around for workflow management over the last 30 years. This is predominantly attributable to the hundreds of operators for tasks such as executing Bash scripts, executing Hadoop jobs, and querying data sources with SQL.
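
As a rough illustration of the operator model described above, here is a minimal sketch of an Airflow DAG that chains two Bash tasks; the DAG ID, schedule, and commands are hypothetical, and the import path shown matches Airflow 1.10-era releases (Airflow 2.x moves BashOperator to airflow.operators.bash).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # Airflow 1.10; 2.x uses airflow.operators.bash

# Minimal illustrative DAG: two Bash tasks chained into a daily workflow.
with DAG(
    dag_id="example_daily_etl",        # hypothetical DAG name
    start_date=datetime(2019, 5, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract",
        bash_command="echo 'pulling source data'",          # placeholder command
    )
    load = BashOperator(
        task_id="load",
        bash_command="echo 'loading into the warehouse'",   # placeholder command
    )

    extract >> load  # run extract before load
```

The same pattern extends to the other operators mentioned above, such as those for Hadoop jobs or SQL queries, by swapping BashOperator for the appropriate operator class.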

Posted May 16, 2019

It is still early for the use of cognitive technologies and AI, but many organizations are exploring the potential they hold across a range of industries. Retail, banking, healthcare, and manufacturing are industries leading the charge to leverage AI today. At the same time, concerns persist.

Posted May 16, 2019

Data architectures are becoming more complex and changing more frequently, requiring that a new operational mindset be applied to data management. In particular, automating the building and maintenance of data pipelines is needed, as is instrumenting and continuously monitoring pipeline performance to ensure reliability and quality for data consumers. We call this practice "DataOps."

Posted May 16, 2019

SnapLogic is releasing the May 2019 version of the SnapLogic Intelligent Integration Platform, extending its leadership in AI-enabled integration. This latest release adds several new core platform, API management, and data science capabilities, driving greater productivity and insights for users while accelerating enterprise-wide innovation and automation initiatives.

Posted May 15, 2019

Datrium is releasing the Automatrix platform, a secure multicloud data platform for the resilient enterprise designed to deliver best-in-class compute, primary storage, backup, disaster recovery, encryption, and data mobility capabilities. Automatrix will offer a suite of autonomous data management applications, built on its new SaaS application framework, which leverages machine learning to simplify and automate complicated IT tasks.

Posted May 15, 2019

Toad for Oracle SDP enables DBAs to detect and receive notification of PII data residing in databases, select encryption and redaction options immediately after detection, and perform ongoing monitoring to proactively manage databases. With greater visibility and insight into where sensitive data resides, Quest says, DBAs can maintain stronger security and have greater confidence in meeting compliance regulations and standards.

Posted May 15, 2019

AI on-the-edge is an emerging development, unlocking a whole new world of possibilities, including predicting customer needs before they even know them. But edge AI seems like it's only a game for the most cutting-edge companies like Apple, Amazon, or Tesla, to name a few. Traditional enterprises aren't really embracing it out of fear it may cost too much or due to uncertainty about the potential ROI.

Posted May 15, 2019

Oracle has launched a new Oracle Cloud Solution Hub in Kuala Lumpur, and has been recruiting to fill close to 150 new positions for the new facility with experienced and highly trained cloud consultants.

Posted May 15, 2019
