Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources in a Data Warehouse for Reporting and Analytics include ETL (Extract, Transform, Load), EAI (Enterprise Application Integration), CDC (Change Data Capture), Data Replication, Data Deduplication, Compression, Big Data technologies such as Hadoop and MapReduce, and Data Warehouse Appliances.
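Of the technologies listed above, ETL is the foundational pattern: data is extracted from source systems, transformed into a consistent shape, and loaded into the warehouse. A minimal sketch of that flow, using hypothetical order records and field names and Python's built-in sqlite3 as a stand-in warehouse:

```python
import sqlite3

# Hypothetical source: raw order records from an operational system,
# with inconsistent field formats (a common reason ETL is needed).
source_rows = [
    {"order_id": "A-001", "amount": "19.99", "region": " east "},
    {"order_id": "A-002", "amount": "5.00",  "region": "WEST"},
]

def transform(row):
    # Normalize types and formats before loading into the warehouse.
    return (row["order_id"], float(row["amount"]), row["region"].strip().lower())

# Load into a consolidated reporting table (in-memory stand-in here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 (transform(r) for r in source_rows))

# The consolidated table can now serve reporting and analytics queries.
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))  # 24.99
```

Real pipelines add incremental extraction (where CDC comes in), error handling, and scheduling, but the extract-transform-load shape stays the same.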



Data Warehousing Articles

The many compromises demanded by our current plethora of database technologies make selecting a database system far harder than it ought to be. Whichever DBMS you choose, you are probably going to experience a "not-quite-right" solution.

Posted December 08, 2016

IDERA, a provider of database lifecycle management solutions, has released Workload Analysis for SAP HANA, which is aimed at enabling a smooth transition to the in-memory computing platform as an alternative to traditional relational databases.

Posted December 07, 2016

In what has become a data-driven world, your organization's data is valuable. It has become the "keys to the kingdom," so to speak. Very few companies today could function without data, especially good data. However, I would suggest that more important than data is information. Data provides the building blocks, but information is really the consumable outcome that can be used as a competitive edge.

Posted December 01, 2016

If you are a SQL Server professional, but you don't know about the PASS Summit, then you are missing out. The annual conference is convened every fall in downtown Seattle, the backyard of Microsoft, and attracted over 6,000 attendees this year. And, since it's so close to the Microsoft Redmond campus, hundreds of the SQL Server developers and program managers get to attend—answering user questions, delivering sessions, and presenting chalk talks and panel discussions.

Posted December 01, 2016

To shed light on the enterprise and technology issues IT professionals will be facing in 2017 as business or organizational leadership seeks strategies to leverage the "big data" phenomenon, the fourth annual edition of the Big Data Sourcebook is now available for download.

Posted December 01, 2016

What's ahead for 2017 in terms of big data and IoT? IT executives reflect on the impact that Spark, blockchain, data lakes, cognitive computing, AI and machine learning, and other cutting-edge approaches may have on data management and analytics over the year ahead.

Posted November 30, 2016

SUSE is acquiring OpenStack IaaS and Cloud Foundry PaaS talent and technology assets from HPE. The agreement aims to accelerate SUSE's entry into the growing Cloud Foundry Platform-as-a-Service (PaaS) market.

Posted November 30, 2016

AtScale, which provides a self-service BI platform for big data, has announced an expansion of its services. With this announcement, the company says it is introducing a BI platform that enables businesses to work seamlessly across all of big data, on-premises and in the cloud. In addition to Hadoop, AtScale has announced preview availability of support for data stored in Teradata, Google Dataproc and BigQuery, expanding on the company's existing support for Microsoft Azure and HDInsight.

Posted November 21, 2016

Snowflake Computing, a cloud-based data warehousing company, has launched a partner program. According to the vendor, the program helps Snowflake customers find partners with expertise in strategies, architectures, design principles, and best practices in big data and data analytics that can work with them to accelerate and maximize the benefits of Snowflake.

Posted November 18, 2016

Databricks has announced that, in collaboration with industry partners, it has broken the world record in the CloudSort Benchmark, a third-party industry benchmarking competition for processing large datasets. Databricks was founded by the team that created the Apache Spark project.

Posted November 16, 2016

SAP SE is releasing an update to its SAP S/4HANA platform aimed at improving user productivity. By utilizing a simplified data model and SAP Fiori 2.0 user experience, the SAP S/4HANA 1610 release can reduce complexity and allow applications to yield new business capabilities, according to SAP. "All companies and all line-of-business and industries need to start with 1610 if they want to capitalize on the digital transformation for their company before their competitor does," said Sven Denecken, senior vice president of product management, co-innovation and packaging S/4HANA at SAP SE.

Posted November 16, 2016

IDERA, a provider of database lifecycle management solutions, has released DB PowerStudio 2016+. The portfolio of database management tools was designed and built from the ground up by IDERA following the company's acquisition of Embarcadero in October 2015. With this update, in particular, IDERA adds new features for Oracle databases.

Posted November 16, 2016

New data sources such as sensors, social media, and telematics, along with new forms of analytics such as text and graph analysis, have necessitated a new data lake design pattern to augment traditional design patterns such as the data warehouse. Unlike the data warehouse - an approach based on structuring and packaging data for the sake of quality, consistency, reuse, ease of use, and performance - the data lake goes in the other direction, storing raw data that lowers data acquisition costs and provides a new form of analytical agility.

Posted November 03, 2016
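The "analytical agility" of the data lake comes from schema-on-read: raw records are landed as-is, and structure is imposed only when an analysis needs it. A minimal sketch of the idea, with hypothetical sensor events:

```python
import json

# Raw events are landed untouched (cheap acquisition: no upfront
# modeling). Records may even carry different fields, as "t2" does.
raw_lake = [
    '{"sensor": "t1", "temp_c": 21.5, "ts": 1}',
    '{"sensor": "t2", "temp_c": 19.0, "ts": 2, "battery": 0.9}',
]

def read_with_schema(lines, fields):
    # Schema-on-read: each analysis chooses the fields it cares
    # about; absent fields simply come back as None.
    for line in lines:
        record = json.loads(line)
        yield tuple(record.get(f) for f in fields)

rows = list(read_with_schema(raw_lake, ["sensor", "temp_c"]))
print(rows)  # [('t1', 21.5), ('t2', 19.0)]
```

This is the inverse of the warehouse's schema-on-write, where the modeling work is paid once at load time in exchange for consistency and query performance.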

Driven by the demands of an always-on global economy, the widespread proliferation of data is combining with an expectation that seamlessly integrated data can be leveraged in near-real time. However, data integration methods aren't keeping up.

Posted November 02, 2016

Data has become a disruptive force for global businesses and a catalyst for digital transformation. But data can only be leveraged for BI initiatives to the extent it can be accessed and trusted. And, while today's self-service BI and analytics tools satisfy a user's craving for more "consumerized" technology, they often leave an analyst stuck in neutral because the users, first and foremost, cannot find the data they need to perform any analysis.

Posted November 02, 2016

Redis Labs, the home of Redis, is introducing an open source project called Redis-ML, the Redis Module for Machine Learning. The new project will accelerate the delivery of real-time recommendations and predictions for interactive apps in combination with Spark Machine Learning (Spark ML).

Posted November 01, 2016

Trifacta, a provider of data wrangling solutions, is launching Wrangler Edge, a platform designed for analyst teams wrangling diverse data outside of big data environments. "We are packaging the Trifacta product and adding enterprise features such as the ability to schedule jobs, handle larger data volumes, and connect to diverse sources," said Will Davis, director of product marketing. "We also added collaboration and sharing features as well, all without requiring organizations to manage a large Hadoop infrastructure."

Posted November 01, 2016

MongoDB has introduced a new release of its NoSQL document database platform with key features that support additional data models for a "multimodel" approach, the combination of operational and analytical processing, elastic cross-region scaling, and tooling to simplify data management for customers.

Posted November 01, 2016

Rosslyn Data Technologies, formerly known as Rosslyn Analytics, has announced the immediate availability of RAPid One-Click Data Analytics, a new suite of self-service automated analytic (SaaS) solutions targeted at helping to reduce the time to visibility and insight.

Posted October 31, 2016

With data flowing into enterprises from so many different sources, and at varying speeds and times, effective solutions are needed to enable insights to be uncovered for faster decision making. To delve into the issues involved in making big data usable more quickly within organizations, DBTA recently presented a webinar featuring executives from the Federal Home Loan Mortgage Corp., known as Freddie Mac.

Posted October 27, 2016

Oracle has introduced new versions of the Oracle Database Appliance, which the company says is designed to save time and money by simplifying deployment, maintenance, and support for database solutions. "Oracle Database Appliance offers a path to cloud, future-proofing your investment," said Karen Sigman, vice president, Platform Business Group, Oracle.

Posted October 26, 2016

Business intelligence (BI) and analytics are at the top of corporate agendas this year, and with good reason. The competitive environment is intense, and business leaders are demanding they have access to greater insights about their customers, markets, and internal operations to make better and faster decisions—often in real time. There have also been dramatic changes with BI and analytics tools and platforms. The three Cs—cloud, consolidation, and collaboration—are elevating BI and analytics to new heights within enterprises and gaining newfound respect at the highest levels.

Posted October 24, 2016

Snowflake Computing is making its platform more accessible to users with Snowflake On Demand—a sign-up process for data users to get immediate insight from Snowflake's data warehouse. According to the vendor, with a virtual swipe of a credit card on Snowflake's website, data users can access the only data warehouse built for the cloud. They can store and analyze their data without relying on their own IT group to get up and running quickly.

Posted October 20, 2016

Managed application service provider TriCore Solutions has acquired Database Specialists, a database managed service company focused on support for Oracle database systems.

Posted October 19, 2016

A new world of self-service BI brings with it its own issue of data chaos. When everyone is looking at the data their own way, people find different answers to the same questions.

Posted October 10, 2016

Organizational issues such as governance and skills—not technology requirements—are the greatest challenges that IT and corporate managers are facing in the emerging world of big data. To get a better handle on the complex new world big data is catalyzing, executives and professionals recognize they must reimagine and re-architect the concept of the "data center"—and what ultimately comes out of it may be a surprise to everyone. These are some key takeaways from a recent survey of 319 corporate and IT managers, conducted by Unisphere Research, a division of Information Today, Inc., in partnership with Cloudera and Intel.

Posted October 07, 2016

For many years now, Cassandra has been renowned for its ability to handle massive scaling and global availability. Based on Amazon's Dynamo, Cassandra implements a masterless architecture which allows database transactions to continue even when the database is subjected to massive network or data center disruption. Even in the circumstance in which two geographically separate data centers are completely isolated through a network outage, a Cassandra database may continue to operate in both geographies, reconciling conflicting transactions—albeit possibly imperfectly—when the outage is resolved.

Posted October 07, 2016
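The "possibly imperfect" reconciliation mentioned above can be sketched with the timestamp-based last-write-wins rule Cassandra applies per column. This is a simplified illustration, with hypothetical keys and replica contents:

```python
# Each replica holds the latest value it accepted per key, tagged
# with a write timestamp. After a partition heals, replicas converge
# by keeping the value with the highest timestamp (last-write-wins).
# A concurrent write with the lower timestamp is silently discarded,
# which is why reconciliation can be imperfect.

def reconcile(replica_a, replica_b):
    merged = dict(replica_a)
    for key, (value, ts) in replica_b.items():
        if key not in merged or ts > merged[key][1]:
            merged[key] = (value, ts)
    return merged

# Two data centers accepted conflicting writes during an outage.
dc_east = {"user:42:email": ("old@example.com", 100)}
dc_west = {"user:42:email": ("new@example.com", 175)}

merged = reconcile(dc_east, dc_west)  # the later write (ts=175) wins
```

Note that both replicas end up consistent, but the write made in the east data center is lost without error; applications needing stronger guarantees must use Cassandra's tunable consistency levels or lightweight transactions instead.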

Typically, most applications consist of both batch and online workloads. This is true even today, when most of our attention has turned to online and web-based interaction. Sure, online activities are the most obvious, in-your-face component of countless applications, but batch processing still drives many actions behind the scenes. This can include applying updates, processing reports, integrating input from multiple sources and locations, data extraction, database utility processing, and more.

Posted October 07, 2016

One of the great promises of AI is contextual processing that comes naturally to humans, such as understanding visual input and natural language. You can now get in on the new previews from Microsoft to accomplish these goals. Microsoft Cognitive Services is a collection of APIs that enable developers to tap into high-quality vision, speech, language, knowledge, and search technologies—developed through decades of Microsoft research—to build intelligent apps.

Posted October 07, 2016

One symptom of an organization in the middle of a knowledge vacuum is SQL that often includes what appears to be extravagant usage of the GROUP BY clause. Writing GROUP BYs here, there, and everywhere becomes a little SQL development dance step, a jitterbug to bypass the issue—moving but not really getting anywhere. Why do these kinds of circumstances exist? Well, maybe the only expert on the involved system has retired and no one else has picked up the torch, so no one is willing to touch the code.

Posted October 07, 2016

Splice Machine, provider of an SQL RDBMS powered by Hadoop and Spark, now supports native PL/SQL on Splice Machine. Announced at Strata + Hadoop World in NYC, the new capabilities are available through the Splice Machine Enterprise Edition.

Posted October 05, 2016

In his second keynote at Oracle OpenWorld 2016, Oracle executive chairman and CTO Larry Ellison highlighted key features of Oracle's recent product announcements. While in his first keynote, Ellison identified Amazon for infrastructure and Workday for applications as Oracle's chief competitors, throughout his second presentation, he centered Amazon in Oracle's competitive cross-hairs. In addition, in the 1-hour address, Ellison continued to expand on hybrid on-premise-and-cloud computing, security, and infrastructure as a service capabilities as areas of focus and differentiation in Oracle's ongoing strategy.

Posted October 05, 2016

SQL Sentry, a provider of tools for monitoring, diagnosing, and optimizing SQL Server environments, has announced it is combining its tools into one platform, and changing its name to SentryOne.

Posted October 05, 2016

NoSQL and Hadoop—two foundations of the emerging agile data architecture—have been on the scene for several years now, and, industry observers say, adoption continues to accelerate—especially within mainstream enterprises that weren't necessarily at the cutting edge of technology in the past.

Posted October 04, 2016

Zaloni, the data lake company, unveiled new platform updates at Strata + Hadoop World 2016 including new enhancements to Bedrock Data Lake Management Platform and its Mica self-service data preparation solution. Bedrock helps businesses govern and manage data across the enterprise, and Bedrock 4.2 adds new capabilities around data privacy, security, and data lifecycle management.

Posted October 03, 2016

At Strata + Hadoop World, Hortonworks showcased its technology solutions for streaming analytics, security, governance, and Apache Spark at scale.

Posted September 30, 2016

Cloudera has added new technology enhancements to its data management and analytics platform to make it easier for companies to take advantage of elastic, on-demand cloud infrastructure to derive business value from all their data. The move to the cloud has become a top priority for CIOs, said Charles Zedlewski, vice president of products at Cloudera, at Strata + Hadoop World 2016 in NYC.

Posted September 29, 2016

Capgemini, a provider of consulting, technology, and outsourcing services, and SAP SE are deepening their strategic partnership with the launch of a new joint initiative to help clients in the discrete manufacturing industries.

Posted September 28, 2016

Nimble Storage's Predictive AF-Series All Flash arrays are now certified by SAP as an enterprise storage solution for the SAP HANA platform. As a result, Nimble customers can leverage their existing hardware and infrastructure components for their SAP HANA-based environments, providing an additional choice for organizations working in heterogeneous environments. This certification adds to the SAP HANA certification Nimble previously obtained for its Adaptive Flash CS-Series arrays for use as enterprise storage solutions for the SAP HANA platform.

Posted September 28, 2016

SAP is releasing a next generation data warehouse solution for running a real-time digital enterprise on-premise and in the cloud. The new solution, SAP BW/4HANA, will be available on Amazon Web Services (AWS) and SAP HANA Enterprise Cloud (HEC).

Posted September 28, 2016

Data lakes are quickly transitioning from interesting idea to priority project. A recent study, "Data Lake Adoption and Maturity," from Unisphere Research showed that nearly half of respondents have an approved budget or have requested budget to launch a data lake project. What's driving this rapid rush to the lake?

Posted September 27, 2016

Database Brothers, Inc. (DBI) has released V6.3 of its flagship product, pureFeat Performance Management Suite for IBM DB2 LUW, which adds the Predictive Index Impact Analysis capability.

Posted September 27, 2016

At Strata + Hadoop World, Pentaho announced five new improvements, including SQL on Spark, to help enterprises overcome big data complexity, skills shortages, and integration challenges in complex enterprise environments. According to Donna Prlich, senior vice president, product management, Product Marketing & Solutions, at Pentaho, the enhancements are part of Pentaho's mission to help make big data projects operational and deliver value by strengthening and supporting analytic data pipelines.

Posted September 26, 2016

SnapLogic is extending its pre-built intelligent connectors - called Snaps - to the Microsoft Azure Data Lake Store, providing fast, self-service data ingestion and transformation from virtually any source to Microsoft's cloud-based repository for big data analytics workloads. This latest integration between SnapLogic and Microsoft Azure helps enterprise customers gain new insights and unlock business value from their cloud-based big data initiatives, according to SnapLogic.

Posted September 21, 2016

