Data Integration

Traditional approaches to combining disparate data types into a cohesive, unified, organized view rely on manual coding and scripting. The need for Real-Time Business Intelligence and the ability to leverage a wider variety of data sources are driving companies to embrace new ways to achieve Data Integration, including Data Virtualization, Master Data Management, and Integration Automation.



Data Integration Articles

As more and more applications, websites, and basic smart home devices ask users for permission to access their data, people are starting to wonder where their information is going. The question of privacy, the Internet of Things, and what companies do with an individual's personal information was the subject of two different sessions at Strata + Hadoop World in NYC.

Posted October 13, 2015

As companies look beyond traditional data storage methods such as data warehouses, cloud storage has become a popular alternative. The cloud offers better cost and more flexibility than traditional storage methods, and organizations are beginning to take advantage of it. Still, there are many questions a company must weigh when considering cloud storage options. Sarah Maston, developer advocate with IBM Cloud Data Services, covered the move to the cloud in a recent DBTA webinar.

Posted October 13, 2015

In what is being hailed as the biggest tech merger ever, Dell Inc. and EMC Corp. today formally announced they have signed a definitive agreement under which Dell will acquire EMC. The total transaction is valued at $67 billion. The deal is expected to close in the second or third quarter of Dell's fiscal year, which ends February 3, 2017 (within the months of May to October 2016). The industry is going through a "tremendous transformation," with the old style of IT being "pretty quickly disrupted," yet this rapid change is also presenting "incredibly rich" opportunities, said Joe Tucci, chairman and chief executive officer of EMC, during a conference call with media and industry analysts.

Posted October 12, 2015

IBM has entered into a definitive agreement to acquire Cleversafe, Inc., a developer and manufacturer of object-based storage software and appliances. The acquisition will enhance IBM's positions in storage and hybrid cloud.

Posted October 12, 2015

Actiance is teaming up with IBM to integrate the Actiance social communications compliance, archiving and analytics platform into IBM's Information Lifecycle Governance product portfolio. The combined platform enables customers to more effectively address issues related to migrating current technology solutions to the cloud.

Posted October 12, 2015

Yellowfin has launched DashXML, a browser-based Java application designed to make it easier for customers and software partners to create customized analytical functionality and applications. According to the vendor, DashXML is a flexible framework that communicates with Yellowfin via a web services API to expose the functionality of Yellowfin's BI platform - such as reports, filters, and security capabilities - while simultaneously providing complete freedom regarding application design, layout, and user interaction.

Posted October 12, 2015

ClearStory Data, a company specializing in bringing business-oriented data intelligence to everyone, introduced new advancements to its Spark-native Intelligent Data Harmonization and blending capabilities that will help make users more self-reliant.

Posted October 12, 2015

MapR Technologies has added native JSON support to the MapR-DB NoSQL database. The in-Hadoop document database will allow developers to quickly deliver scalable applications that also leverage continuous analytics on real-time data. A developer preview of MapR-DB with sample code is available for download, and the new capabilities are expected to become generally available in Q4 2015.

Posted October 07, 2015

Ever since Linux became a viable server operating system, organizations have been looking to all kinds of open source software (OSS) to save on license and maintenance costs and to enjoy the benefits of an open platform that invites innovation. If you're considering MySQL or another open source DBMS, either as your primary database or to operate alongside existing commercial systems such as Oracle or Microsoft SQL Server, here are seven things to keep in mind.

Posted October 07, 2015

The Agile methodology is great for getting sluggish development teams to start working faster and more coherently. With Agile, which focuses on rapid, incremental deliverables and cross-departmental collaboration, the bureaucratic plaque is flushed from the information technology groups' arteries. But there is a dark side to Agile approaches.

Posted October 07, 2015

Prior to SQL Server 2016, currently in CTP, the main method for encrypting a SQL Server application was a feature called Transparent Data Encryption (TDE). TDE provides strong encryption, but with some shortcomings. First, you have to encrypt an entire database; no granularity is offered at a lower level, such as encrypting specific tables or certain data within a table. Second, TDE encrypts only data at rest, in files. Data in memory or in flight between the application and server is unencrypted. Enter Always Encrypted.
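The core idea behind Always Encrypted, encrypting column values on the client so the server only ever handles ciphertext, can be sketched with the JDK's built-in crypto classes. This is an illustration of the concept only, not Microsoft's implementation; the class name, key handling, and cipher mode here are all simplified assumptions:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class ClientSideEncryptionSketch {
    public static void main(String[] args) throws Exception {
        // The column encryption key is generated and held client-side;
        // the database server never sees it.
        KeyGenerator gen = KeyGenerator.getInstance("AES");
        gen.init(128);
        SecretKey columnKey = gen.generateKey();

        // Deterministic encryption (same plaintext -> same ciphertext) is
        // what lets a server do equality lookups on an encrypted column.
        // ECB is used here only because it is deterministic and built into
        // the JDK; it is not suitable for production use.
        Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, columnKey);
        byte[] stored = cipher.doFinal("123-45-6789".getBytes(StandardCharsets.UTF_8));

        // This opaque value is all the server would ever store or see in flight:
        System.out.println(Base64.getEncoder().encodeToString(stored));

        // On read, the client-side driver decrypts transparently for the app.
        cipher.init(Cipher.DECRYPT_MODE, columnKey);
        String plaintext = new String(cipher.doFinal(stored), StandardCharsets.UTF_8);
        System.out.println(plaintext); // prints 123-45-6789
    }
}
```

Determinism is a real trade-off in such schemes: it permits equality comparisons on encrypted columns, at the cost of revealing which rows share the same value.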

Posted October 07, 2015

Too little emphasis overall is placed on the integrity and recoverability of the data—and too much is placed on performance. Yes, performance is probably the most visible aspect of database systems, at least from the perspective of the end user. But the underlying assumption of the end user is always that they want to access accurate and, usually, up-to-date data. But what good does it do to quickly access the wrong data? Anybody can provide rapid access to the wrong data!

Posted October 07, 2015

There's unrelenting pressure on businesses to compete on analytics and to be able to anticipate customer needs and trends ahead of the curve. Enterprises are looking to expand BI and analytics capabilities as far and wide as technologies and budgets will allow them to go. As a result, the continuing advance of analytic capabilities across the enterprise has reached a "tipping point."

Posted October 07, 2015

IT suppliers and data management leaders are experiencing a major pain point: efficient management of logged data. The availability of NoSQL open source software has enabled enterprises to collect large volumes of data from different sources, and software companies have implemented "call back home" features that allow their software to send information to data collection centers within various parameters, creating additional runtime configurations and data traffic. And as the Internet of Things and a "connected everything" approach to business become increasingly popular, more and more data will flow in and out of data management systems, leaving IT managers muddled with millions of pieces of data they must properly manage and store.

Posted October 07, 2015

Magnitude Software, a provider of enterprise information management (EIM) software, has released new product versions designed to improve every component of the Noetix operational reporting solution for Oracle E-Business Suite. Now available, NoetixViews 6.5 for Oracle E-Business Suite features incremental regeneration for global views, as well as additional enhancements. Before incremental regeneration, the only option for implementing NoetixViews Workbench customizations was a full regeneration of the views. Incremental regeneration processes only the NoetixViews Workbench changes made since the last full or incremental regeneration, reducing the wait time to minutes so that end users benefit from quicker access to data and faster time to business decisions, while requiring fewer resources and less planning from IT.

Posted October 07, 2015

Oracle has announced Oracle SOA Cloud Service and Oracle API Manager Cloud Service, new additions to the Oracle Cloud Platform for Integration. The two cloud services join Oracle's other iPaaS services, including Oracle Integration Cloud, which was announced in June. The two new releases are part of Oracle's ongoing process of augmenting and covering integration use cases to address the variety of different user requirements, according to Amit Zavery, senior vice president of Oracle Cloud Platform. "These are two different offerings for two different use cases."

Posted October 07, 2015

Embarcadero Technologies, a provider of software solutions for application and database development, recently unveiled DB PowerStudio 2016, a suite of database tools that provides database managers and data professionals with comprehensive administration, development, performance and monitoring capabilities across multiple platforms.

Posted October 07, 2015

Basho Technologies today announced Basho Riak TS, a distributed NoSQL database that is designed to enable analysis of massive amounts of sequenced, unstructured data generated from the Internet of Things (IoT) and other time series data sources.

Posted October 06, 2015

Couchbase is today announcing the general availability of Couchbase Server 4.0, a major new release of its NoSQL database management system. The company is calling Couchbase Server 4.0 a "transformational release" that dramatically increases the types of applications and use cases that Couchbase can now support. The announcement is being made at Couchbase Live New York, where customers such as Marriott, GE, Gannett, Cox Automotive, DIRECTV, Nielsen, and others are speaking about their use of Couchbase in a variety of deployments.

Posted October 06, 2015

SnapLogic made three announcements during Strata + Hadoop World in NYC, including a collaboration with Microsoft, new product updates, and the development of new connectors.

Posted October 06, 2015

At Strata + Hadoop World 2015, Attunity announced the release of Attunity Replicate Express, a downloadable edition of its data replication and loading software. The solution, which answers a growing demand for more accessible real-time big data analytics, is freely available to download online. The new solution supports ingesting data to and from Oracle, SQL Server, and Hadoop Data Lakes for test and development environments.

Posted October 06, 2015

Built on Hadoop, Kyvos gives business users and analysts the ability to query billions of rows of data within seconds. Kyvos' technology allows users to pre-process data and build cubes on Hadoop for faster performance and instant responses. With this partnership, Kyvos can connect Tableau users to their Hadoop data within minutes, the companies say. "It's a benefit to Tableau because it opens up the data that's available to the business user through Tableau and improves the response time," said Ajay Anand, vice president of product management and marketing at Kyvos. "They've been very supportive with what we are trying to do."

Posted October 06, 2015

One of the noticeable changes this year at Strata + Hadoop World 2015 was the rise of Apache Spark, an engine for large-scale data processing. In recent months, many companies have extended support to Spark, which can be complementary to Hadoop, but can also be deployed without it.

Posted October 05, 2015

Syncsort is continuing to grow its platform's capabilities by announcing new integrations with two active open source projects, Apache Kafka and Apache Spark, enabling users to better handle real-time, large-scale data processing, analytics, and feeds.

Posted October 01, 2015

At Strata + Hadoop World 2015, SAP showcased its portfolio of big data solutions, including the HANA platform that offers real-time integration of big data and information held in Hadoop with business processes and operational systems, Lumira and SAP BI tools that enable data discovery on Hadoop along with data wrangling capabilities, SAP Data Services, and the newest SAP product for the Hadoop world, HANA Vora, which takes advantage of an in-memory query engine for Apache Spark and Hadoop to speed queries. SAP HANA Vora can be used as a standalone product, or in concert with the SAP HANA platform to extend enterprise-grade analytics to Hadoop clusters and provide enriched, interactive analytics on Hadoop and HANA data.

Posted October 01, 2015

Teradata Corp. has accelerated its roadmap for the open source Presto by delivering ODBC (Open Database Connectivity)/JDBC (Java Database Connectivity) drivers for free. Presto is an open source SQL query engine which supports big data analytics.

Posted October 01, 2015

At Strata + Hadoop World, TIBCO announced the availability of the Spotfire Cloud's data discovery and advanced analytics connector to Apache Spark SQL, along with a commercial integration with SparkR. The Spark SQL direct connector is now available in TIBCO Spotfire Cloud, and will also be incorporated in the next TIBCO Spotfire on-premises release.

Posted October 01, 2015

Objectivity, which recently introduced ThingSpan, a purpose-built information fusion platform intended to simplify and accelerate companies' ability to deploy and derive value from industrial Internet of Things (IoT) applications, has announced plans to support Intel's TAP (Trusted Analytics Platform) at Strata + Hadoop World, in NYC. ThingSpan is aimed at helping companies "that are drowning in data but thirsty for answers in time," said Jay Jarrell, CEO and president of Objectivity, during an interview at the conference.

Posted September 30, 2015

At Strata + Hadoop World in New York City, Talend, a provider of data integration software for the cloud and big data, is announcing a new version of its platform, now offering support for Apache Spark and Spark Streaming. Talend 6 will leverage over 100 Spark components to deliver rapid data processing speed and enable any company to convert streaming big data or IoT sensor information into immediate actionable insights.

Posted September 30, 2015

DataTorrent is teaming up with two major companies to provide better security and make Hadoop adoption easier. DataTorrent is partnering with Cisco to enable integration between its DataTorrent RTS platform and Cisco's Application Centric Infrastructure (ACI) through the Application Policy Infrastructure Controller (APIC), offering a unified management architecture for enterprises to manage their big data applications along with network and security. DataTorrent is also integrating its platform with Microsoft Azure HDInsight via the Microsoft Azure Marketplace.

Posted September 29, 2015

Paxata, provider of an adaptive data preparation platform, is partnering with Cisco, creating a jointly developed solution dubbed Cisco Data Preparation (CDP). "We are delighted to partner with a world-class organization like Cisco as we continue to fulfill our vision of bringing Adaptive Data Preparation to every analyst in the enterprise," said Prakash Nanduri, co-founder and CEO of Paxata.

Posted September 29, 2015

Pentaho is updating its platform to help users blend data more efficiently and manage the analytic data pipeline. "We've learned so much over the last couple of years from our big data customers and customers that have scaled and seen the value of big data and their environments," said Donna Prlich, vice president of products solutions and marketing at Pentaho. "We're really looking at our product line and saying, ‘Where do we take this and where does it need to go?' In 6.0 it's really all about putting big data to work."

Posted September 29, 2015

Arcadia Data, a provider of a unified visual analytics and business intelligence (BI) platform for big data, is releasing Arcadia Enterprise, a solution that runs natively in Hadoop. The company says the platform bypasses the restrictions of legacy BI and visualization tools by allowing users to work directly with their data on Hadoop. "We give the analyst the ability to do free-form exploration of the highest granularity of data in the Hadoop system," said Priyank Patel, co-founder and chief product officer at Arcadia.

Posted September 29, 2015

The Hortonworks DataFlow (HDF) support subscription is now available. HDF, powered by Apache NiFi, a top-level open source project, is intended to help organizations take advantage of data related to the Internet of Anything (IoAT) and helps make it easier to automate and secure data flows and collect, conduct and curate real-time business insights and actions derived from any data, from anything, anywhere. "By flowing that data into HDP, our customers are able to rapidly bring these new data elements under management in a completely secure and purely open way," said Tim Hall, vice president of product management at Hortonworks.

Posted September 29, 2015

At Strata + Hadoop World in New York City, Cloudera announced a public beta of a new storage engine to enable faster analytics in Hadoop. Kudu, a new columnar store for Hadoop, enables fast analytics on fast data. Complementing the existing Hadoop storage options, HDFS and Apache HBase, Kudu is a native Hadoop storage engine that supports both low-latency random access and high-throughput analytics, dramatically simplifying Hadoop architectures for increasingly common real-time use cases.

Posted September 28, 2015

IBM introduced a new cloud security technology that helps safeguard the increasing use of "bring-your-own" cloud-based apps at work. Cloud Security Enforcer combines cloud identity management (Identity-as-a-Service) with the ability for companies to discover outside apps being accessed by employees, including those they are using on their mobile devices. These combined capabilities enable companies to equip their workforce with a secure way to access and use the apps that they want.

Posted September 28, 2015

IBM expanded its array of APIs, technologies, and tools for developers who are creating products, services and applications embedded with Watson. Over the past 2 years, the Watson platform has evolved from one API and a limited set of application-specific deep Q&A capabilities to more than 25 APIs powered by over 50 technologies.

Posted September 28, 2015

In advance of an event at the American Museum of Natural History in New York City, at which MongoDB showcased leading use cases for the company's NoSQL database technology, and introduced a new MongoDB University app for iOS, Kelly Stirman, vice president of strategy and product marketing at MongoDB, discussed how companies are putting MongoDB to use now, and upcoming features in MongoDB 3.2 which will be rolled out before the end of the year.

Posted September 24, 2015

Pentaho, a Hitachi Data Systems company, and Melissa Data, a provider of global contact data quality solutions, have formed a partnership to create new data quality plug-ins for Pentaho's big data integration and analytics platform.

Posted September 24, 2015

Cambridge Semantics, a provider of data solutions driven by semantic web technology, has formed an alliance with MarkLogic, which provides enterprise NoSQL database technology. According to the vendors, the partnership will help organizations rapidly store, access, visualize, and act upon diverse data to create scalable, semantic-driven data management and investigative analytics applications at a fraction of the time and cost of traditional approaches.

Posted September 24, 2015

MemSQL, a provider of real-time databases for transactions and analytics, has announced Spark Streamliner, an integrated Spark solution to give enterprises immediate access to real-time analytics.

Posted September 24, 2015

MarkLogic, which bills itself as the only enterprise NoSQL database provider, completed a $102 million financing round earlier this year that it will use to accelerate the pace of growth in the $36 billion operational database market. Recently, Big Data Quarterly spoke with Joe Pasqua, executive vice president of products at MarkLogic, about the changing database management market, and what MarkLogic is doing to meet emerging enterprise customer requirements.

Posted September 24, 2015

Traditional data warehousing models and open source alternatives such as Apache Hadoop and Storm have been touted as solutions to a variety of "big data" challenges. However, utilities have found that these approaches cannot handle the scale and complexity of data generated in industrial environments. Additionally, they fail to provide the real-time analysis and situational awareness that utilities need to improve decision making or address critical events in real-time, such as optimizing crews during outages and severe weather events.

Posted September 24, 2015

In-memory databases and grids have entered the enterprise mainstream. New offerings from pure-play in-memory database providers as well as the large relational database management systems vendors are helping organizations that are scrambling to keep pace with the demands of an always-on, real-time economy. These in-memory databases are emerging in many forms—from extensions of relational database management systems to NoSQL databases to cloud hosted NoSQL databases. These new technologies couldn't come a moment too soon.

Posted September 24, 2015

There are various terms being bandied about that describe the new world data centers are entering—from the "third platform" to the "digital enterprise" to the "always-on" organization. Whatever the terminology, it's clear there is a monumental shift underway. Business and IT leaders alike are rethinking their approaches to technology, rethinking their roles in managing this technology, and, ultimately, rethinking their businesses. The underlying technologies supporting this movement—social, mobile, data analytics, and cloud—are also causing IT leaders to rethink the way in which database systems are being developed and deployed.

Posted September 24, 2015

Ricoh Company, Ltd, the printing and document management company, is using Oracle's SPARC T5 servers with Oracle ZFS Storage Appliances and Oracle Database to analyze data from RICOH @Remote, its real-time support service for customers in more than 100 countries and regions. To unify the data across the company for accurate views and analysis, Ricoh built a private cloud accessible by the entire remote service group that allows high speed access from internal networks.

Posted September 24, 2015

StreamSets Inc., a company that aims to speed access to enterprise big data, has closed a $12.5 million round of Series A funding. The single biggest barrier to a successful enterprise analytics platform is the effective and efficient ingest of data, the company says.

Posted September 24, 2015
