Data Integration

Traditionally, combining disparate data types into a cohesive, unified, organized view has involved manual coding and scripting. The need for real-time business intelligence and the ability to leverage a wider variety of data sources is driving companies to embrace new ways to achieve Data Integration, including Data Virtualization, Master Data Management, and Integration Automation.



Data Integration Articles

Aruba, a Hewlett Packard Enterprise company, is unveiling Aruba 360 Secure Fabric, a new security framework that provides 360 degrees of analytics-driven attack detection and response. Aruba is also innovating in User and Entity Behavioral Analytics (UEBA) by expanding the Aruba IntroSpect product family, enabling businesses to easily and rapidly scale machine-learned behavior detection from small projects to full enterprise deployments.

Posted September 22, 2017

Alation has added new capabilities for cataloging data lakes deployed both on-premises and in the cloud, including data lakes built with Amazon Simple Storage Service (S3) and the Hadoop Distributed File System (HDFS).

Posted September 21, 2017

WaveMaker, Inc., a provider of Rapid Application Development (RAD) Platform software, is upgrading its platform, completely revamping its user interface. WaveMaker 9 focuses on providing a simplified and friendly first-time user experience, now offering video tutorials and tool run-throughs that onboard fellow and partner developers and give them a clearer picture of how all the components of the platform work.

Posted September 21, 2017

Looker plans to release a new version of its flagship data platform that simplifies daily workflows with new integrations, provides built-in applications, improves the user experience with new visualizations, and adds valuable public data to any analysis.

Posted September 21, 2017

Regardless of industry, the ability to collect, manage, and intelligently leverage data will clearly be a differentiator for the foreseeable future. Executives in healthcare are acutely aware of the disruption being driven by this new paradigm and understand that this trend is impacting every sector, from banking to farming to manufacturing.

Posted September 20, 2017

Linear scaling with legacy storage appliances is no longer an option. The burden that traditional architecture feels from today's tsunami of data is surpassed only by the cost required to meet current and future demands. Aside from the huge expense, this method of increasing storage capacity would take too long. Even adding multiple servers could not accommodate storage demands. Vertical storage architecture contains bottlenecks that slow performance to an unacceptable level.

Posted September 20, 2017

It's still months away, but it is never too early to start thinking about the holiday shopping season, especially since most Americans are already anticipating outages and system failures from their favorite online retailers. According to a survey, 52% of shoppers expect to experience an outage on days like Black Friday and Cyber Monday. To prepare for the busiest online shopping season, companies need to ensure their systems are ready for extreme scalability and continuous IT operations, year-round.

Posted September 20, 2017

As companies grow increasingly data-centric in their decision making, product and services development, and their overall understanding of the world they work in, speed and agility are becoming critical capabilities. A common theme in big data and analytics today is "Industry 4.0," representing a new wave of technology that enables the automation necessary for scaling. There's compelling justification for this as companies seek to unlock business value from big data with two broad approaches: the democratization of data with greater access by more users, and the enablement of automation everywhere possible.

Posted September 20, 2017

The movement toward the instrumentation of everything and the democratization of data and analytics is resulting in more data flowing to more users, and is creating new challenges in data management.

Posted September 20, 2017

Over the last few years, organizations have shifted from using virtual data centers to creating private or hybrid IaaS clouds that allow authorized users to perform self-service provisioning of virtual machines. These environments have reduced administrative workloads, improved the user experience, and discouraged shadow IT, but they have also brought their own challenges. As virtualized environments increase in scale, management techniques have often become far less effective, making it difficult to keep track of virtual machines, their owners, and why the virtual machines were created in the first place.

Posted September 20, 2017

Companies today are spreading their applications across multiple clouds in a hybrid fashion. According to a recent IDC CloudView study among 6,000 IT and line-of-business executives whose organizations have adopted cloud technologies, 73% are implementing a hybrid strategy, which most defined as utilizing more than one public cloud in addition to dedicated assets.

Posted September 20, 2017

Many people are unsure of the differences between deep learning, machine learning, and artificial intelligence. Generally speaking, and with minimal debate, it is reasonably well-accepted that artificial intelligence can most easily be categorized as that which we have not yet figured out how to solve, while machine learning is a practical application with the know-how to solve problems, such as anomaly detection.
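As a concrete illustration of machine learning as a practical problem-solver, a minimal anomaly-detection sketch might look like the following. This is a generic z-score rule written for illustration, not the approach of any product mentioned here; the sample readings and the threshold of 2.0 are assumptions chosen for the example.

```python
from statistics import mean, stdev

def detect_anomalies(values, threshold=2.0):
    """Flag values whose z-score (distance from the mean, in
    standard deviations) exceeds the threshold."""
    mu = mean(values)
    sigma = stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Six ordinary sensor readings and one obvious outlier.
readings = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 42.0]
print(detect_anomalies(readings))  # only 42.0 exceeds the threshold
```

Real deployments replace the z-score rule with learned models, but the shape of the task is the same: characterize "normal" from data, then flag what deviates from it.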

Posted September 20, 2017

When it comes to visualizing data, there is no shortage of charts and graphs to choose from. From traditional graphs to innovative hand-coded visualizations, there is a continuum of visualizations ready to translate data from numbers into meaning using shapes, colors, and other visual cues. However, each visualization type is intended to show different types of data in specific ways to best represent its insight. Let's look at five of the most common visualization types to help you choose the right chart for your data.

Posted September 20, 2017

Nowadays, many firms are already using big data and analytics to manage and optimize their customer relationships. These technologies can also prove beneficial in leveraging a firm's other key asset: its employees. There are many possible applications of HR analytics (also called workforce analytics).

Posted September 20, 2017

TigerGraph is emerging from stealth, securing $31 million in Series A funding and launching TigerGraph, a native parallel graph database platform for enterprise applications, along with the availability of both its Cloud Service and GraphStudio, TigerGraph's visual software development kit (SDK).

Posted September 20, 2017

The "world's first autonomous self-driving database" was announced by Oracle executive chairman of the board and CTO Larry Ellison during a live event at the company's Redwood City headquarters. In addition to the database, which he said he plans to showcase in an Oracle OpenWorld presentation in October, Ellison also discussed Oracle's emphasis on automation in platform services, and the decision to be much more aggressive in simplifying and lowering the pricing of automation around those platform services with two programs—Bring Your Own License to PaaS and Universal Credits.

Posted September 20, 2017

SolarWinds, a provider of IT management software, has announced beta availability of AppOptics, a next-generation application performance management (APM) solution designed to provide comprehensive monitoring for distributed application performance and critical infrastructure metrics.

Posted September 19, 2017

Syncsort is releasing a new platform that will deliver an agile, efficient, and powerful solution to improve the quality of data stored and processed in data lakes. Trillium Quality for Big Data integrates Trillium data quality capabilities with the Intelligent Execution (IX) technology from its DMX-h Big Data integration solution.

Posted September 19, 2017

Software AG is continuing its Internet of Things innovation drive with the launch of an extended Cumulocity IoT technology portfolio for cloud-based IoT platform services enablement.

Posted September 19, 2017

Actian, a provider of software for data management, analytics and integration, has announced support for Apache Spark in the latest release of Actian Vector in Hadoop (VectorH). Actian Vector technology exploits vectorized processing and multi-level in-memory acceleration to improve performance on Hadoop data stores. It supports single node, clustered, and hybrid computing environments that span on-premises and the cloud.

Posted September 19, 2017

Veriflow has introduced updates to its Continuous Network Verification platform that help enterprises, government agencies, and service providers to reduce downtime and improve protection by eliminating outages and vulnerabilities in the network. Emerging from stealth in April 2016, and with $8.2 million in Series A funding raised in June 2016, Veriflow has pioneered the application of "mathematical network verification."

Posted September 14, 2017

BDNA, a provider of comprehensive technology asset information, is entering an agreement with Amazon Web Services (AWS) to develop BDNA Cloud Asset Insights, a new solution that provides discovery and normalization for commercial software assets deployed on Amazon Elastic Compute Cloud (Amazon EC2).

Posted September 13, 2017

Syncsort and ASG Technologies are combining their solutions and expertise in data quality and data governance to help companies with both data governance and regulatory compliance.

Posted September 13, 2017

As organizations increasingly move their data and applications from on-premise deployments to the cloud, the role of the DBA is also shifting. According to Penny Avril, vice president of product management, Oracle Database, the transition means that DBAs have the opportunity to move from being data custodians and keepers to taking on a more strategic role in their organizations. But, she says, the time to prepare for the new cloud reality is now.

Posted September 13, 2017

Oracle has made enhancements to the Oracle Internet of Things (IoT) Cloud. The offering now features built-in artificial intelligence (AI) and machine learning that powers Digital Twin and Digital Thread capabilities.

Posted September 13, 2017

Incorta, provider of a real-time analytics platform, is partnering with global cloud advisory and implementation services company Alcor to provide advanced analytics solutions to enterprises using ServiceNow and other leading software solutions and platforms.

Posted September 13, 2017

Qubole Data Service provides a single platform for ETL, reporting, ad hoc analysis, stream processing and machine learning. It runs on AWS, Microsoft Azure and Oracle Bare Metal Cloud, taking advantage of the elasticity and scale of the cloud, and also supports leading open source engines, including Apache Spark, Hadoop, Presto, and Hive.

Posted September 12, 2017

The Apache Arrow project is a standard for representing columnar data for in-memory processing, which has a different set of trade-offs compared to on-disk storage. In memory, access is much faster, and processes optimize for CPU throughput by paying attention to cache locality, pipelining, and SIMD instructions.
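The row-versus-column trade-off that Arrow exploits can be sketched in plain Python (a generic illustration, not Arrow's actual API): scanning one field in a columnar layout is a sequential pass over one contiguous array, whereas the row layout forces a scan that touches every record.

```python
from array import array

# Row-oriented layout: each record is one object; aggregating a
# single field still touches every record.
rows = [
    {"id": 1, "price": 9.99},
    {"id": 2, "price": 4.50},
    {"id": 3, "price": 12.00},
]
row_total = sum(r["price"] for r in rows)

# Column-oriented layout (Arrow-style): each field is one contiguous
# array, so a column aggregate is a cache-friendly sequential pass
# that vectorizing hardware can exploit.
columns = {
    "id": array("q", [1, 2, 3]),
    "price": array("d", [9.99, 4.50, 12.00]),
}
col_total = sum(columns["price"])

assert row_total == col_total  # same answer; only the access pattern differs
```

The contiguous `array` buffers stand in for Arrow's fixed-width column vectors; the point is that the columnar layout keeps the values a query cares about adjacent in memory.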

Posted September 12, 2017

New multi-cloud capabilities scale discovery and dependency mapping of all assets, extending beyond on-prem data centers to public and private clouds.

Posted September 12, 2017

Information Builders, a provider of business intelligence (BI) and analytics, data integrity, and integration solutions, has announced that its iWay Universal Adapter Suite has achieved SAP-certified integration with the SAP NetWeaver technology platform. The certification is a reflection of the iWay Adapter Suite's interoperability and ability to integrate data residing within SAP solutions.

Posted September 11, 2017

IBM plans to make a 10-year, $240 million investment to create the MIT-IBM Watson AI Lab in partnership with the Massachusetts Institute of Technology. The lab will carry out fundamental artificial intelligence research and seek to propel scientific breakthroughs that unlock the potential of AI. The collaboration aims to advance AI hardware, software and algorithms related to deep learning and other areas, increase AI's impact on industries, such as health care and cybersecurity, and explore the economic and ethical implications of AI on society. IBM's investment in the lab will support research by IBM and MIT scientists.

Posted September 11, 2017

In what is described as the largest acquisition in its history, Rackspace is buying Datapipe, a provider of managed services across public and private clouds, managed hosting, and colocation.

Posted September 11, 2017

data Artisans, founded by the creators of Apache Flink, has introduced dA Platform 2, a new release of its enterprise stream processing platform. Featuring the Application Manager, dA Platform 2 aims to productionize stream processing and enable companies to provide real-time data applications as a centralized enterprise service.

Posted September 11, 2017

Cloudera has acquired Fast Forward Labs, an applied research and advisory services company that specializes in machine learning and applied AI.

Posted September 07, 2017

Cybersecurity is a top-of-mind concern for CIOs, as the number of companies that fall victim to cyber attacks and data breaches increases every year. Attacks are becoming more common, sophisticated, and costly. Executives and IT professionals alike know the importance of enterprise security, yet companies' security policies frequently fail to address one key area of vulnerability to organizations—employee mobile devices.

Posted September 07, 2017

Rocket Software is updating its UniData MultiValue Database, part of the MultiValue Application Platform, with the addition of Python programming support and audit logging capabilities. These enhancements expand the potential user base for UniData and improve the ability to recruit new development talent, allowing users to easily establish configurable histories of interactions, events, and activities.

Posted September 07, 2017

Each year, tens of thousands of data professionals from well over 100 countries gather at Oracle OpenWorld in San Francisco. Leaders of two major Oracle users' groups—David Start, president of the Independent Oracle Users Group, and Alyssa Johnson, president of the Oracle Applications Users Group—share what they have planned for their members at Oracle OpenWorld 2017, taking place Oct. 1-5.

Posted September 07, 2017

Evaluating new and disruptive technologies, as well as when and where they may prove useful, is a challenge. Against the rapidly evolving big data scene, this year, Big Data Quarterly presents the newest "Big Data 50," an annual list of forward-thinking companies that are working to expand what's possible in terms of collecting, storing, and deriving value from data.

Posted September 07, 2017

Dell EMC has announced data protection capabilities designed to help enterprises streamline IT management and data governance for VMware environments. The new release adds support for Oracle and SQL Server databases. According to Dell EMC, application and database owners can now have self-service data protection capabilities for applications and databases from within their native GUI tools while remaining fully compliant with governance, service-level objectives (SLOs), and oversight of the IT data protection regime for Oracle and SQL Server databases. The support for Oracle will be available in Q4 2017.

Posted September 06, 2017

On Sunday at Oracle OpenWorld, several Oracle user groups, including the IOUG, will bring the experiences of our users and experts to San Francisco and share them with thousands of our peers. If you're coming to OpenWorld, I can't say enough about how important it is to participate in the Sunday Program.

Posted September 06, 2017

The emergence of cryptocurrencies and blockchain technology may prove to be almost as significant an innovation as the internet itself. Blockchain offers a mechanism for the mediation of any transactions that previously would have required trusted third parties, while cryptocurrencies such as Bitcoin may eventually become a significant alternative to traditional "fiat" (i.e., government-backed) currencies. These technologies could eventually revolutionize the global banking infrastructure that has underpinned global commerce for centuries.

Posted September 06, 2017

Dataiku Inc., a maker of Dataiku Data Science Studio (DSS) analytics and collaborative data science tools, has announced a $28 million Series B funding round led by Battery Ventures and supported by FirstMark, Serena Capital and Alven. The company plans to double its team and accelerate marketing efforts globally while strengthening its platform with new integrations and technologies.

Posted September 06, 2017

NuoDB, which provides a scale-out SQL database for cloud- and container-based environments, has announced a new release which provides customers with flexible hybrid cloud deployment options.

Posted September 01, 2017

Micro Focus has completed its merger with Hewlett Packard Enterprise's (HPE) software business, creating what it says is the seventh largest pure-play enterprise software company in the world. Chris Hsu, formerly COO of HPE and executive vice president and general manager of HPE Software, was appointed CEO of Micro Focus.

Posted September 01, 2017

In an expansion of its Enterprise Performance Management (EPM) Cloud, Oracle has introduced new offerings for tax reporting, and profitability and cost management as well as new strategic modeling and disclosure management capabilities.

Posted August 31, 2017

There are plenty of pronouncements about artificial intelligence—both in terms of the miracles it can produce and the threat it poses to humanity. But according to Ali Ghodsi, co-founder and CEO of Databricks, there is actually a "1% problem" in that there are a handful of companies such as Google, Amazon, and a few others that are actually accomplishing their goals with it. AI has vast potential but some of the claims, as well as the fears, are overstated and a little premature right now, he contends.

Posted August 31, 2017

The future value of hybrid cloud computing is to empower customers to embrace a cloud strategy of their own, rather than one dictated by a vendor. A hybrid cloud environment is defined by the customer—a hybrid cloud solution should not dictate where or which cloud the customer must use with their on-premise installation. Although this may seem obvious, large vendors often ignore this critical point, as they dictate choices based on their (lack of) capabilities.

Posted August 31, 2017

Arcadia Data, provider of native visual analytics software for big data, will support KSQL, a new technology for continuous, interactive queries on Kafka topics via SQL. The integration with Confluent's KSQL gives all users advanced visualizations for streaming data use cases, specifically around alerts and time-based data exploration (and drill downs).

Posted August 31, 2017

Expanding on the application modernization initiative they launched in December, MongoDB and Infosys are introducing a new solution targeted at helping companies move workloads off the mainframe and onto the MongoDB database platform.

Posted August 31, 2017
