
Business Intelligence and Analytics

The world of Business Intelligence and Analytics is evolving quickly. Increasingly, the emphasis is on real-time Business Intelligence to enable faster decision making, and on Data Visualization, which enables data patterns to be seen more clearly. Key technologies involved in preparing raw data to be used for Business Intelligence, Reporting and Analytics – including ETL (Extract, Transform, and Load), CDC (Change Data Capture), and Data Deduplication – support a wide range of goals within organizations.
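As a minimal illustration of the data-preparation pipeline described above, the sketch below shows an extract-transform-load pass with deduplication folded into the transform step. All field names and data are hypothetical, not drawn from any specific product.

```python
# Minimal ETL sketch: extract raw rows, transform (normalize fields and
# deduplicate by primary key), then load into a target store.
# All data and field names are illustrative only.

def extract():
    # Simulated raw source rows, e.g., from a CSV export or staging table.
    return [
        {"id": 1, "name": " Alice ", "amount": "10.50"},
        {"id": 2, "name": "Bob", "amount": "7.25"},
        {"id": 1, "name": "Alice", "amount": "10.50"},  # duplicate record
    ]

def transform(rows):
    # Normalize fields and drop duplicate keys (data deduplication).
    seen, cleaned = set(), []
    for row in rows:
        if row["id"] in seen:
            continue
        seen.add(row["id"])
        cleaned.append({
            "id": row["id"],
            "name": row["name"].strip(),
            "amount": float(row["amount"]),
        })
    return cleaned

def load(rows, target):
    # Append cleaned rows to the target store (here, a plain list
    # standing in for a warehouse table).
    target.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(len(warehouse))  # 2 rows remain after deduplication
```

Change Data Capture differs from this batch pattern in that only rows changed since the last run would be extracted, rather than the full source.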



Business Intelligence and Analytics Articles

Dell is partnering with Datawatch Corporation to continue growing its analytics business by integrating Datawatch's interactive visualization and dashboarding capabilities directly into its Statistica advanced analytics platform.

Posted April 30, 2015

BackOffice Associates' HiT Software division, a provider of data replication and change data capture solutions for heterogeneous database environments, has announced the release of version 8.5 of its flagship product DBMoto.

Posted April 29, 2015

Qlik, a provider of visual analytics solutions, is previewing Qlik Sense Enterprise 2.0, the latest release of its business intelligence platform, which provides self-service data visualization, reporting and dashboards, guided analytics, and embedded analytics.

Posted April 28, 2015

In a recent DBTA webcast, Shane Johnson, senior product marketing manager, Couchbase, discussed the relationship between NoSQL and Hadoop, detailing the multiple ways to integrate NoSQL databases with Hadoop. "It's not Hadoop or Couchbase Server. It's Hadoop and Couchbase Server," said Johnson.

Posted April 28, 2015

Splice Machine, a provider of a Hadoop RDBMS, announced that it is partnering with mrc (michaels, ross & cole ltd) to allow Splice Machine's Hadoop RDBMS to be certified and integrated with mrc's m-Power platform. "Our partnership with mrc gives businesses a solution that can speed real-time application deployment on Hadoop with the staff and tools they currently have, while also offering affordable scale-out on commodity hardware for future growth," said Monte Zweben, co-founder and CEO, Splice Machine.

Posted April 28, 2015

Nimble Storage grew its business by providing its customers with insights and recommendations for optimizing their storage infrastructure and simplifying its day-to-day operations. Join this DBTA webcast, on Thursday, April 30 at 2 pm ET / 11 am PT, to find out how, using HP Vertica, Nimble Storage achieved an average of more than $2.3 MM/year in benefits, an ROI of 447%, and a payback period of only 6.8 months.

Posted April 28, 2015

Embarcadero Technologies, a provider of software solutions for application and database development, has unveiled the new XE7 version of ER/Studio, its flagship data architecture suite.

Posted April 28, 2015

IBM continues to experience tough quarters, but its mainframe business stands out as a bright beacon. For a platform that is repeatedly pronounced to be on the verge of obsolescence, the mainframe has proved again and again that it provides more value to businesses than farms of commodity servers. Able to run clouds, and with the best security of any platform, mainframes continue to prove their worth.

Posted April 27, 2015

Cloud technology was a dominant focus at COLLABORATE 15, which took place earlier this month, according to Melissa English, president of the Oracle Applications Users Group (OAUG). "What's on top of everybody's mind is cloud strategy," English noted.

Posted April 27, 2015

Predixion Software, a developer of cloud-based predictive analytics (PA) software, announced that Software AG will lead the company's series D funding round. The company says that this fourth round of funding, which includes participation from existing financial and strategic investors, including GE Software Ventures, will support Predixion's move into the Internet of Things (IoT) analytics market.

Posted April 27, 2015

Pivotal HAWQ is now available on the Hortonworks Data Platform (HDP), enabling the benefits of SQL on Hadoop to be leveraged by enterprises that are investing in HDP. This marks the first time that the features and capabilities of Pivotal HAWQ have been made available outside of Pivotal. The availability aligns with a common Open Data Platform (ODP) Core that allows users to leverage the best-of-breed technology across providers.

Posted April 27, 2015

The future will flourish with machines. We've been told this in pop culture for decades, from the helpful robots of the Jetsons, to the infamous Skynet of the Terminator movies, to the omniscient "computer" of Star Trek. Smart, connected devices will be ubiquitous and it's up to us, the humans, to decide what's next. But the Internet of Things (IoT) is about more than devices and data.

Posted April 23, 2015

There are many factors contributing to data environment changes, including users, technology, economics, and data itself. These four sources of change are creating opportunities to deliver competitive advantage but also new management, administration and optimization challenges.

Posted April 23, 2015

SUSE and Veristorm are partnering to provide certified high-performance Hadoop solutions that run directly on Linux on IBM z Systems, IBM Power Systems, and x86-64. Customers with IBM z Systems can team SUSE Linux Enterprise Server for System z with Veristorm zDoop, a commercial distribution of Hadoop supported on mainframes.

Posted April 23, 2015

Many DBAs are now tasked with managing multi-vendor environments, and handling a variety of data types. Increasingly, DBAs are turning to strategies such as database automation to be able to concentrate more on the big picture of moving their enterprises forward.

Posted April 23, 2015

Oracle has introduced new Retail Cloud Services to help ensure uptime and efficiency of key retail operations while providing consistent, immediately available upgrades, enabling internal IT groups to focus on differentiating the customer experience and driving growth. The six new Oracle Retail cloud services are aimed at providing retailers with fast access to enterprise-grade applications for managing critical e-commerce, customer engagement, order management, order fulfillment, loss prevention, and brand compliance operations.

Posted April 22, 2015

Oracle has released Application Express 5, a new version of the popular tool for development and deployment of professional web-based applications for desktops and mobile devices using only a web browser. With the new release, the first since October 2012, Oracle has improved not only what can be produced but also the development environment itself, according to Michael Hichwa, vice president of software development at Oracle, and Joel Kallman, director, software development at Oracle.

Posted April 22, 2015

MySQL has announced MySQL Server 5.7, release candidate 1 (5.7.7), the first major release since the 5.6 GA release 2 years ago. MySQL 5.7 RC 1 adds enhancements in several important areas, targeted both at growing with existing customers and their current installed base and demands, and at entering new market segments, according to Tomas Ulin, vice president for the MySQL engineering team at Oracle. In particular, to broaden MySQL's use within enterprise environments, this release includes work in areas such as security, GIS, and query optimization.

Posted April 22, 2015

SUSE has improved high availability capabilities for deployments of the SAP HANA platform via SUSE Linux Enterprise Server for SAP Applications, the recommended and supported operating system for use with SAP HANA. The new version of the resource agents enhances the high availability component of SUSE Linux Enterprise for SAP Applications with two additional scenarios for customers to automate their system replication of SAP HANA. In addition to the original performance-based scenario, SUSE now supports cost-based and multi-tier SAP HANA system replication.

Posted April 22, 2015

Progress Software has introduced a preview program for a standards-based connectivity solution to deliver fast transactions and analytics for SAP HANA. Called "Progress DataDirect ODBC" for SAP HANA, the connectivity solution will support both high-volume transactional workloads and massive analytics, provide connectivity to virtually any application including all major BI and analytics tools, and meet the demands of low latency, real-time query and analysis with superior throughput and CPU efficiency.

Posted April 22, 2015

SAP SE has announced an Industry 4.0 implementation project with GEA to address condition monitoring and predictive maintenance. GEA, a supplier for the food processing industry and a wide range of process industries, will work with SAP to optimize the performance of its separator and decanter machinery with the SAP Predictive Maintenance and Service solution, cloud edition. Based on SAP HANA Cloud Platform, the solution aims to bring together technology, sensors, and machine data with business processes, applications, and practices.

Posted April 22, 2015

A new release of the HP Haven Big Data Enterprise and OnDemand Platform incorporates advanced analytics and predictive capabilities for enterprises working with large volumes and varieties of information.

Posted April 22, 2015

SAP has announced it has certified its products on Oracle Database 12c. "More than two-thirds of all SAP customers run their SAP applications on the Oracle Database today. There is a strong overlap," said Sohan DeMel, vice president, Product Strategy and Business Development, Oracle. "These joint customers are looking to leverage Oracle's best database technology."

Posted April 22, 2015

Voting has opened for the 2015 DBTA Readers' Choice Awards. This year, there are more than 300 nominees across 29 categories. Unlike other awards programs which rely on our editorial and publishing staff's evaluations, the DBTA Readers' Choice Awards are unique in that the winning information management solutions are chosen by you—the people who actually use them.

Posted April 22, 2015

To help organizations answer questions with data spread across disparate analytics systems and data repositories, Teradata has expanded its QueryGrid technologies. "With this announcement we have our foot on the gas pedal," said Imad Birouty, director of product marketing, Teradata. "We have seven updates. We are announcing new connectors that are on their way, announcing that we have delivered on the connectors that we previously announced, and we are refreshing previously released connector versions of the technologies."

Posted April 20, 2015

Perhaps no other technology is more intertwined with the promise of big data than Apache Hadoop, the open source framework that emerged 10 years ago. Hadoop was first leveraged at big web companies for its ability to process large quantities of varied-format data using affordable commodity servers. Today, the Hadoop ecosystem is expanding swiftly, and its value in the enterprise is being recognized.

Posted April 15, 2015

Unstructured data types and new database management systems are playing an increasing role in the modern data ecosystem, but structured data in relational database management systems (RDBMS) remains the foundation of the information infrastructure in most companies. In fact, structured data still makes up 75% of data under management for more than two-thirds of organizations, with nearly one-third of organizations not yet actively managing unstructured data at all, according to a new survey commissioned by Dell Software and conducted by Unisphere Research, a division of Information Today, Inc.

Posted April 15, 2015

Percona, a company that makes MySQL and OpenStack faster and more reliable for customers, is acquiring Tokutek, allowing Percona to design, service, and support remote management for both MySQL and the ACID-compliant NoSQL database. Tokutek is known for delivering big data processing power across two open source data management platforms, MySQL and MongoDB.

Posted April 14, 2015

A host of questions surround the implementation of data virtualization and, as the concept becomes commonplace, more businesses need answers and assistance with adapting this method. To address these issues, Lindy Ryan, research director for Radiant Advisors' Data Discovery and Visualization Practice, will present Data Summit 2015 attendees with a toolkit for adopting data virtualization.

Posted April 14, 2015

AtScale, Inc. has introduced a platform that will enable interactive, multi-dimensional analysis on Hadoop, directly from standard business intelligence tools such as Microsoft Excel, Tableau Software or QlikView. Dubbed the "AtScale Intelligence Platform," the new offering provides a Hadoop-native analysis server that allows users to analyze big data at full scale and top speed, while leveraging the existing BI tools they already own.

Posted April 14, 2015

Informatica, an independent provider of data integration software, has released a new tool aimed at taking a data-centric approach to information security by empowering organizations to identify and visualize sensitive data wherever it resides, inside or outside the corporate perimeter.

Posted April 13, 2015

Think Big, a Teradata company, has introduced the Dashboard Engine for Hadoop, which enables organizations to access and report on big data in Hadoop-based data lakes to make agile business decisions. "There are endless streams of data from web browsers, set top boxes, and contact centers that often land in Hadoop, but sometimes don't make their way into downstream analytics," said Ron Bodkin, president, Think Big.

Posted April 13, 2015

Delphix, a data as a service (DaaS) provider, has updated its DaaS Platform to support Amazon GovCloud, OpenStack, and KVM hypervisor environments for simple migration to both private and public clouds. "Getting data from the data center to private and public clouds is a major obstacle for cloud initiatives, both for migrations and continuing operations," said Dan Graves, VP of product management for Delphix.

Posted April 13, 2015

Pivotal has proposed "Project Geode" for incubation by the Apache Software Foundation (ASF). A distributed in-memory database, Geode will be the open source core of Pivotal GemFire, and is now available for review at network.pivotal.io. Pivotal plans to contribute to, support, and help build the Project Geode community while simultaneously producing its commercial distribution of Pivotal GemFire.

Posted April 13, 2015

Oracle has unveiled Oracle Data Integrator for Big Data to help make big data integration more accessible and actionable for customers. The goal with the new data integration capabilities is to bring together disparate communities that have emerged within the Oracle client base and allow the mainstream DBAs and ETL developers as well as the big data development organization to be brought together on a single platform for collaboration, said Jeff Pollock, vice president of product management at Oracle.

Posted April 08, 2015

To fully take advantage of big data tools and architectures, businesses need to adopt a different mindset, according to Edd Dumbill, who contends that looking at the data value chain is the first step to understanding the value of data.

Posted April 08, 2015

Data is being collected everywhere - and from everything. The idea is that it can provide the power of insights never before possible into everything from patient care to the health of machinery to customer sentiment about products and services. But to reveal these valuable insights, this data also has to be captured and analyzed in ways never before possible.

Posted April 08, 2015

Teradata made its fourth acquisition of 2014 in the big data space with the purchase of Rainstor, a privately held company specializing in online big data archiving on Hadoop. Here, Chris Twogood, vice president of products and services marketing at Teradata, explains why the newly added technologies and services are important to Teradata's big data portfolio.

Posted April 08, 2015

There is no one single path to the data lake within the data architecture of the organization. Likewise, each data lake is unique, with inputs and decisions from the organization contributing a variety of essential elements in organization, governance, and security.

Posted April 08, 2015

In order to truly appreciate Apache Drill, it is important to understand the history of the projects in this space, as well as the design principles and the goals of its implementation.

Posted April 08, 2015

Translated to an analytical setting, Ockham's principle, also known as Ockham's razor, basically states that analytical models should be as simple as possible, free of any unnecessary complexities and/or assumptions.
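A toy sketch can make the principle concrete: a simple model with one assumption often generalizes better than a complex model that merely memorizes the training data. The data and both models below are hypothetical, purely for illustration.

```python
# Ockham's razor sketch: a simpler model that generalizes can beat a
# complex model that memorizes. All data and models are illustrative.

train = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 8.0)]
test = [(5, 10.1), (6, 11.8)]

def simple_model(x):
    # One assumption: y is roughly proportional to x.
    slope = sum(y for _, y in train) / sum(x for x, _ in train)
    return slope * x

lookup = dict(train)

def complex_model(x):
    # Memorizes the training points exactly; guesses zero elsewhere.
    return lookup.get(x, 0.0)

def mse(model, data):
    # Mean squared error of a model over a dataset.
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

print(mse(complex_model, train))  # 0.0: perfect on training data
print(mse(simple_model, test) < mse(complex_model, test))  # True: simpler model generalizes better
```

The complex model's unnecessary "complexity" buys nothing on unseen data, which is exactly the trade-off Ockham's razor warns about.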

Posted April 08, 2015

The role of friction in data discovery is much akin to that minimalist design mantra: Less is more.

Posted April 08, 2015
