Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources in a Data Warehouse for Reporting and Analytics include ETL (Extract, Transform, Load), EAI (Enterprise Application Integration), CDC (Change Data Capture), Data Replication, Data Deduplication, Compression, Big Data technologies such as Hadoop and MapReduce, and Data Warehouse Appliances.
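Of the technologies listed, ETL is the pattern most practitioners touch directly: data is extracted from a source system, transformed into a consistent shape, and loaded into the warehouse. A minimal Python sketch of the three steps, using an in-memory SQLite database as a stand-in warehouse (the table, columns, and sample rows are invented for illustration):

```python
import sqlite3

# Extract: rows as they might arrive from a source system (stand-in data).
source_rows = [
    {"order_id": 1, "amount": "19.99", "region": " east "},
    {"order_id": 2, "amount": "5.00",  "region": "WEST"},
]

# Transform: normalize types and values before loading.
def transform(row):
    return (row["order_id"], float(row["amount"]), row["region"].strip().lower())

# Load: write the cleaned rows into the warehouse table.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT)")
warehouse.executemany("INSERT INTO orders VALUES (?, ?, ?)", map(transform, source_rows))

print(sorted(warehouse.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region").fetchall()))
# → [('east', 19.99), ('west', 5.0)]
```

Real ETL pipelines add incremental loading (often via CDC), error handling, and scheduling, but the extract/transform/load division of labor is the same.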

Data Warehousing Articles

Business intelligence (BI) and analytics are at the top of corporate agendas this year, and with good reason. The competitive environment is intense, and business leaders are demanding access to greater insights about their customers, markets, and internal operations so they can make better and faster decisions—often in real time. There have also been dramatic changes in BI and analytics tools and platforms. The three Cs—cloud, consolidation, and collaboration—are elevating BI and analytics to new heights within enterprises and winning them newfound respect at the highest levels.

Posted October 24, 2016

Snowflake Computing is making its platform more accessible to users with Snowflake On Demand—a sign-up process for data users to get immediate insight from Snowflake's data warehouse. According to the vendor, with a virtual swipe of a credit card on Snowflake's website, data users can access the only data warehouse built for the cloud. They can store and analyze their data without relying on their own IT group to get up and running quickly.

Posted October 20, 2016

Managed application service provider TriCore Solutions has acquired Database Specialists, a database managed service company focused on support for Oracle database systems.

Posted October 19, 2016

A new world of self-service BI brings with it its own issue of data chaos. When everyone is looking at the data their own way, people find different answers to the same questions.

Posted October 10, 2016

Organizational issues such as governance and skills—not technology requirements—are the greatest challenges that IT and corporate managers are facing in the emerging world of big data. To get a better handle on the complex new world big data is catalyzing, executives and professionals recognize they must reimagine and re-architect the concept of the "data center"—and what ultimately comes out of that effort may be a surprise to everyone. These are some key takeaways from a recent survey of 319 corporate and IT managers, conducted by Unisphere Research, a division of Information Today, Inc., in partnership with Cloudera and Intel.

Posted October 07, 2016

For many years now, Cassandra has been renowned for its ability to handle massive scaling and global availability. Modeled on Amazon's Dynamo, Cassandra implements a masterless architecture that allows database transactions to continue even when the database is subjected to massive network or data center disruption. Even when two geographically separate data centers are completely isolated by a network outage, a Cassandra database may continue to operate in both geographies, reconciling conflicting transactions—albeit possibly imperfectly—once the outage is resolved.
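The reconciliation step can be illustrated with a toy model of last-write-wins merging—the timestamp-based rule Cassandra applies when replicas disagree. This is an illustrative sketch, not Cassandra's actual code; the data and function names are invented:

```python
# Toy model of last-write-wins reconciliation: each replica maps a key to a
# (value, timestamp) pair; after a partition heals, the newer write wins.
def reconcile(replica_a, replica_b):
    """Merge two replicas, keeping the value with the newer timestamp per key."""
    merged = dict(replica_a)
    for key, (value, ts) in replica_b.items():
        if key not in merged or ts > merged[key][1]:
            merged[key] = (value, ts)
    return merged

# Two data centers accepted conflicting writes while isolated:
east = {"user:42:email": ("old@example.com", 100)}
west = {"user:42:email": ("new@example.com", 250)}

print(reconcile(east, west))
# → {'user:42:email': ('new@example.com', 250)}
```

Note that the older conflicting write is silently discarded, which is one reason the reconciliation can look imperfect from the application's point of view.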

Posted October 07, 2016

Most applications consist of both batch and online workloads. This is true even today, when most of our attention has turned to online and web-based interaction. Sure, online activities are the most obvious, in-your-face component of countless applications, but batch processing still drives many actions behind the scenes, including applying updates, processing reports, integrating input from multiple sources and locations, data extraction, database utility processing, and more.

Posted October 07, 2016

One of the great promises of AI is contextual processing of tasks that come naturally to humans, such as understanding visual input and natural language. You can now get in on new previews from Microsoft aimed at these goals. Microsoft Cognitive Services is a collection of APIs that enable developers to tap into high-quality vision, speech, language, knowledge, and search technologies—developed through decades of Microsoft research—to build intelligent apps.

Posted October 07, 2016

One symptom of an organization in a knowledge vacuum is SQL that often includes what appears to be extravagant usage of the GROUP BY clause. Writing GROUP BYs here, there, and everywhere becomes a little SQL development dance step—a jitterbug to bypass the issue, moving but not really getting anywhere. Why do these circumstances exist? Well, maybe the only expert on the system involved has retired and no one else has picked up the torch, so no one is willing to touch the code.
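The dance step can be made concrete. In the sketch below (Python with an in-memory SQLite database; the schema and data are invented), a join fans out duplicate rows, and a GROUP BY with no aggregate is used merely to squash them back down—where a query that states the actual intent would be clearer:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE customers (id INTEGER, name TEXT);
    CREATE TABLE orders (customer_id INTEGER);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (1), (1), (2);  -- Acme has two orders
""")

# The "dance": a join that duplicates customer rows, then a GROUP BY
# with no aggregate function, used only to collapse the duplicates.
grouped = db.execute("""
    SELECT c.id, c.name FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.id, c.name
""").fetchall()

# Saying what is meant: customers that have at least one order.
intended = db.execute("""
    SELECT id, name FROM customers c
    WHERE EXISTS (SELECT 1 FROM orders o WHERE o.customer_id = c.id)
""").fetchall()

print(sorted(grouped) == sorted(intended))  # → True: same rows, clearer intent
```

Both queries return the same customers, but the second one documents the question being asked—exactly the kind of knowledge that gets lost when the last expert on a system walks out the door.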

Posted October 07, 2016

Splice Machine, provider of an SQL RDBMS powered by Hadoop and Spark, now supports PL/SQL natively. Announced at Strata + Hadoop World in NYC, the new capabilities are available through the Splice Machine Enterprise Edition.

Posted October 05, 2016

In his second keynote at Oracle OpenWorld 2016, Oracle executive chairman and CTO Larry Ellison highlighted key features of Oracle's recent product announcements. While in his first keynote Ellison identified Amazon for infrastructure and Workday for applications as Oracle's chief competitors, throughout his second presentation he placed Amazon squarely in Oracle's competitive crosshairs. In addition, in the 1-hour address, Ellison continued to expand on hybrid on-premise-and-cloud computing, security, and infrastructure-as-a-service capabilities as areas of focus and differentiation in Oracle's ongoing strategy.

Posted October 05, 2016

SQL Sentry, a provider of tools for monitoring, diagnosing, and optimizing SQL Server environments, has announced it is combining its tools into one platform, and changing its name to SentryOne.

Posted October 05, 2016

NoSQL and Hadoop—two foundations of the emerging agile data architecture—have been on the scene for several years now, and, industry observers say, adoption continues to accelerate—especially within mainstream enterprises that weren't necessarily at the cutting edge of technology in the past.

Posted October 04, 2016

Zaloni, the data lake company, unveiled new platform updates at Strata + Hadoop World 2016 including new enhancements to Bedrock Data Lake Management Platform and its Mica self-service data preparation solution. Bedrock helps businesses govern and manage data across the enterprise, and Bedrock 4.2 adds new capabilities around data privacy, security, and data lifecycle management.

Posted October 03, 2016

At Strata + Hadoop World, Hortonworks showcased its technology solutions for streaming analytics, security, governance, and Apache Spark at scale.

Posted September 30, 2016

Cloudera has added new technology enhancements to its data management and analytics platform to make it easier for companies to take advantage of elastic, on-demand cloud infrastructure for business value from all their data. The move to the cloud has become a top priority for CIOs, said Charles Zedlewski, vice president of products at Cloudera, at Strata + Hadoop World 2016 in NYC.

Posted September 29, 2016

Capgemini, a provider of consulting, technology, and outsourcing services, and SAP SE are deepening their strategic partnership with the launch of a new joint initiative to help clients in the discrete manufacturing industries.

Posted September 28, 2016

Nimble Storage's Predictive AF-Series All Flash arrays are now certified by SAP as an enterprise storage solution for the SAP HANA platform. As a result, Nimble customers can leverage their existing hardware and infrastructure components for their SAP HANA-based environments, providing an additional choice for organizations working in heterogeneous environments. This certification adds to the SAP HANA certification Nimble previously obtained for its Adaptive Flash CS-Series arrays for use as enterprise storage solutions for the SAP HANA platform.

Posted September 28, 2016

SAP is releasing a next generation data warehouse solution for running a real-time digital enterprise on-premise and in the cloud. The new solution, SAP BW/4HANA, will be available on Amazon Web Services (AWS) and SAP HANA Enterprise Cloud (HEC).

Posted September 28, 2016

Data lakes are quickly transitioning from interesting idea to priority project. A recent study, "Data Lake Adoption and Maturity," from Unisphere Research showed that nearly half of respondents have an approved budget or have requested budget to launch a data lake project. What's driving this rapid rush to the lake?

Posted September 27, 2016

Database Brothers, Inc. (DBI) has released V6.3 of its flagship product, pureFeat Performance Management Suite for IBM DB2 LUW, which adds the Predictive Index Impact Analysis capability.

Posted September 27, 2016

At Strata + Hadoop World, Pentaho announced five new improvements, including SQL on Spark, to help enterprises overcome big data complexity, skills shortages, and integration challenges in complex enterprise environments. According to Donna Prlich, senior vice president of product management, product marketing, and solutions at Pentaho, the enhancements are part of Pentaho's mission to help make big data projects operational and deliver value by strengthening and supporting analytic data pipelines.

Posted September 26, 2016

SnapLogic is extending its pre-built intelligent connectors—called Snaps—to the Microsoft Azure Data Lake Store, providing fast, self-service data ingestion and transformation from virtually any source to Microsoft's cloud-based repository for big data analytics workloads. This latest integration between SnapLogic and Microsoft Azure helps enterprise customers gain new insights and unlock business value from their cloud-based big data initiatives, according to SnapLogic.

Posted September 21, 2016

In his keynote on Monday at Oracle OpenWorld 2016, Oracle CEO Mark Hurd showcased customer success stories and offered three new predictions for the future of cloud, which, he said, represents a generational shift in the IT market.

Posted September 21, 2016

"In this coming year, you'll see us aggressively moving into infrastructure-as-a-service," Larry Ellison, Oracle's chief technology officer and executive chairman of the board, said to kick off the company's OpenWorld conference Sunday night at the Moscone Center. In the first of his two scheduled keynote addresses, Ellison went on to outline a number of strategic announcements that aim to strengthen the company's offerings, as well as to help it compete with one of its top challengers.

Posted September 19, 2016

To help organizations continue to get the most from their data, Big Data Quarterly has published the second annual "Big Data 50," a list of companies driving big data innovation. The Big Data 50 includes forward-thinking companies that are expanding what is possible in terms of managing and deriving value from data.

Posted September 14, 2016

With the growth of data variety, volume, and velocity, innovative data management approaches are needed. To help organizations get the most from their data, Big Data Quarterly presents the second annual "Big Data 50," our list of companies driving innovation.

Posted September 14, 2016

erwin Inc. has acquired UK-based Corso Ltd, a provider of enterprise architecture solutions. erwin has also announced general availability of erwin CloudCore, an integrated cloud bundle consisting of erwin Data Modeler and Corso Agile EA.

Posted September 12, 2016

At its partner conference this week, Teradata is announcing three key new offerings to support customers choosing hybrid environments spanning cloud and on-premise, and relational and big data technologies.

Posted September 12, 2016

Conventional wisdom insists that IT will migrate to the cloud entirely at some point. But practical experience shows that enterprises that have invested in legacy architecture that still has many years of life left in it are not likely to rip and replace, at potentially astronomical costs. Instead, implementing a Bimodal IT approach supported by SDDC on integrated systems will allow companies to address scalability needs with agility, while also ensuring the mission-critical functions of their legacy systems are not compromised.

Posted September 12, 2016

Can Oracle and its partners keep up with the increasing demands of customers for real-time digital capabilities? Is the Oracle constellation of solutions—from data analytics to enterprise applications—ready for the burgeoning requirements of the Internet of Things (IoT) and data-driven businesses? For Oracle—along with its far-flung network of software vendors, integrators, and partners—times have never been so challenging.

Posted September 07, 2016

The Independent Oracle Users Group (IOUG) is excited to join the Oracle technology community in San Francisco once again at Oracle OpenWorld 2016, September 18-22. IOUG's 30,000+ member community comprises the top Oracle technology experts from around the globe, several of whom will be presenting sessions on hot topics such as data intelligence, IoT, data security, and cloud migrations.

Posted September 07, 2016

Dell has completed the acquisition of EMC, creating a $74 billion company with a technology portfolio spanning hybrid cloud, software-defined data center, converged infrastructure, platform-as-a-service, data analytics, mobility, and cybersecurity. Describing itself as the world's largest privately controlled technology company, the combined entity will be known as Dell Technologies.

Posted September 07, 2016

Programming is a literal sport: code does exactly what it is written to do, no compromises. When the definition of a task is fuzzy, it is up to the developer to do what they believe is correct. Does the code reflect what is desired? That answer is left open to interpretation. Sadly, developers may not have a clear understanding, and even the users requesting the solution may not be sure. The results can be very painful for an organization: expectations may not align with the delivered solutions, users will blame IT, and IT will blame users.

Posted September 02, 2016

Perhaps the biggest and most overlooked challenge is how to create accurate test data. You're implementing a new system in order to deal with a massive amount of data—perhaps your relational database can't handle the volume—so it's vitally important to properly test this new system and ensure that it doesn't fall over as soon as the data floods in.

Posted August 23, 2016

SHARE recently wrapped up its summer conference in Atlanta. James Vincent, immediate past president of SHARE, reflected on the changes that have taken place in the IT industry during his tenure and the key takeaways from the event, which took place July 31-August 5. "One takeaway is that SHARE is on the right track when it comes to its focus on the new IT generation, what we call zNextGen," said Vincent.

Posted August 22, 2016

Many tools promise that users will gain rapid insights and greater flexibility by capturing and storing data, but introducing NoSQL database technologies into the data warehouse and analytics infrastructure is one approach that can yield faster and more flexible results.

Posted August 20, 2016

Magnitude Software, a provider of enterprise information management software, has acquired Simba Technologies Inc., a provider of cross-platform standards-based data access solutions for relational, multi-dimensional, and non-relational data sources. Terms of the purchase were not disclosed.

Posted August 16, 2016

Paxata is introducing new native push/pull connectivity options to and from Amazon Web Services (AWS), including the Amazon Redshift data warehouse and Amazon Simple Storage Service (Amazon S3).

Posted August 11, 2016

Thousands of members of the Oracle Applications Users Group (OAUG) get the answers they need by sharing best practices, case studies and lessons learned. As the world's largest education, networking and advocacy forum for users of Oracle Applications, the OAUG helps members connect to find the solutions they need to do their jobs better and to improve their organizations' return on investment in Oracle Applications.

Posted August 04, 2016