Data Warehousing

Hardware and software that support the efficient consolidation of data from multiple sources into a data warehouse for reporting and analytics include ETL (extract, transform, load), EAI (enterprise application integration), CDC (change data capture), data replication, data deduplication, compression, big data technologies such as Hadoop and MapReduce, and data warehouse appliances.
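
To make the consolidation step concrete, here is a minimal ETL sketch in Python. The orders.csv source file, its column names, and the SQLite database standing in for the warehouse are hypothetical placeholders; a production pipeline would use a dedicated ETL tool or warehouse loader.

```python
# A minimal ETL sketch, not tied to any product named above.
# The orders.csv file, its columns, and the SQLite target are hypothetical.
import csv
import sqlite3

def extract(path):
    """Extract: read rows from a source system export."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Transform: clean and conform values to the warehouse schema."""
    for row in rows:
        yield (row["order_id"], row["region"].strip().upper(), float(row["amount"]))

def load(rows, conn):
    """Load: upsert conformed rows into a warehouse fact table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders "
        "(order_id TEXT PRIMARY KEY, region TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO fact_orders VALUES (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    load(transform(extract("orders.csv")), conn)
```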



Data Warehousing Articles

At Oracle CloudWorld in New York City, Oracle unveiled new enhancements to the Oracle Cloud Platform to help customers move business-critical applications to the cloud. Describing the industry's move to the cloud as an "irresistible force," Oracle CEO Mark Hurd said, "This is not a what-if; this is the way things are going to go. The sooner you get on board with that the better."

Posted January 17, 2017

IBM broke the U.S. patent record with 8,088 patents granted to its inventors in 2016. IBM's 2016 patent output covers inventions in artificial intelligence and cognitive computing, cognitive health, cloud, cybersecurity and other strategic growth areas for the company.

Posted January 16, 2017

The Oracle Applications Users Group (OAUG) has announced Alyssa Johnson as its 2017 president. An active member of the OAUG since 2003, Johnson has served as a member of the organization's board of directors since 2011 and previously as the OAUG president in 2014.

Posted January 12, 2017

Clustrix, a provider of a distributed SQL database, and Zettaset, a provider of big data security, are partnering on data protection and privacy for companies that rely on large-scale OLTP databases.

Posted January 11, 2017

In what has become a data-driven world, your organization's data is valuable. It has become the "keys to the kingdom," so to speak. Very few companies today could function without data, especially good data. However, I would suggest that more important than data is information. Data provides the building blocks, but information is really the consumable outcome that can be used as a competitive edge.

Posted January 04, 2017

The "big data" era is still very much upon us, ushering in an age of constantly evolving technologies and techniques. Many wonder whether the enterprise data warehouse(EDW) still has relevance in the industry, particularly since many new alternatives exceed the technical capabilities of the traditional EDW at a drastically reduced cost.

Posted January 04, 2017

Our friends at SpiceWorks recently shared some of their data from 900 IT buyers with us. For the year ahead, mobile computing, security, and IT automation are topping IT spending agendas. However, expect IT budgets to remain relatively flat—with the exception of cloud services, storage and IT consulting services. Overall, new IT purchasing won't be automatic—it will only happen as systems reach their end of life, or a significant new business requirement emerges.

Posted January 03, 2017

There was a time when what you saw was what you got. Building up the components of a business intelligence area was very straightforward. A staging area was a staging area; an operational data store was an operational data store. But like buying a pitcher of beer for $2, or gas for less than a dollar per gallon, those days are gone. The dynamics have changed, things are more federated, and IT must accept more than one standard tool.

Posted January 03, 2017

The past year was a blockbuster one for those working in the data space. Businesses have wrapped their fates around data analytics in an even tighter embrace as competition intensifies and the drive for greater innovation becomes a top priority. The year ahead promises to get even more interesting, especially for data managers and professionals. Leading experts in the field have witnessed a number of data trends emerge in 2016, and now see new developments coming into view for 2017.

Posted January 03, 2017

With the rise of smartphones, laptops, sensors on machines, vehicles, and appliances, massive amounts of data are being generated, according to Balaji Thiagarajan, group vice president of big data at Oracle. For companies that can transform and manage it, he notes, data represents a huge opportunity as a source of competitive advantage and should be leveraged as such. Big data and cloud are two technologies driving dramatic transformations, and, says Thiagarajan, organizations must be ready to react and take advantage of important new trends and technologies to make sure that they come out ahead next year. Here, Thiagarajan shares 10 key predictions for big data in 2017.

Posted December 21, 2016

Oracle's total quarterly cloud revenue was $1.1 billion, the first time it has topped the $1-billion mark.

Posted December 21, 2016

IT management is at the front lines of ensuring its enterprise infrastructure is optimized and cost-effective in order to deliver critical services. In a recent DBTA roundtable webinar, Wally Waltner, director of global IT asset management (ITAM) at First Data, and Cathy Won, product marketing director from BDNA, discussed tips and tricks for how to manage all types of data, and offered insight on how to leverage an ITAM project for other cost-cutting and risk management initiatives.

Posted December 20, 2016

Informatica, a provider of data management solutions, is now offering hourly pricing for Informatica Cloud Services for Microsoft Azure in the Azure Marketplace. Available as a pay-as-you-go hourly pricing model, the solution is designed to help users of the Azure cloud platform and Microsoft Cortana Intelligence Suite get started faster on cloud data integration and management projects. Additionally, users can run enterprise-class integration jobs with no upfront costs, and simplify integration of disparate data sources - on-premises, in the cloud or in a hybrid environment.

Posted December 14, 2016

IDERA, a provider of database lifecycle management solutions, has released Workload Analysis for SAP HANA, which is aimed at enabling a smooth transition to the in-memory computing platform as an alternative to traditional relational databases.

Posted December 14, 2016

With all the talk about "big data" in the last few years, the conversation is now turning to: What can be built on this platform? It isn't just about the analytics—many people talk about data lakes, but in reality, organizations are looking beyond the data lake. They're looking for a solution that has a flexible infrastructure that quickly enables finding and linking the right information; gives end users self-service access to data without needing to become experts in SQL and complex database schemas; and universally and consistently enforces fine-grained privacy and security.

Posted December 13, 2016

The past year has seen big data get bigger, complex systems get more complex, and cloudy systems get cloudier. The coming year guarantees more of the same. We wish all our readers a happy holiday season, and look forward to serving you in 2017!

Posted December 12, 2016

When software providers consider transitioning to (or at the very least adding) a SaaS offering, they think about the impact to their business of moving from a perpetual license model to a recurring revenue stream. And while it's easy to remember and consider such migration costs as application-level rearchitecture, other upfront and ongoing costs - such as infrastructure and service-related costs - are often severely underestimated.

Posted December 12, 2016

The many compromises demanded by our current plethora of database technologies make selecting a database system far harder than it ought to be. Whichever DBMS you choose, you are probably going to experience a "not-quite-right" solution.

Posted December 08, 2016

If you are a SQL Server professional, but you don't know about the PASS Summit, then you are missing out. The annual conference is convened every fall in downtown Seattle, the backyard of Microsoft, and attracted over 6,000 attendees this year. And, since it's so close to the Microsoft Redmond campus, hundreds of the SQL Server developers and program managers get to attend—answering user questions, delivering sessions, and presenting chalk talks and panel discussions.

Posted December 01, 2016

To shed light on the enterprise and technology issues IT professionals will be facing in 2017 as business or organizational leadership seeks strategies to leverage the "big data" phenomenon, the fourth annual edition of the Big Data Sourcebook is now available for download.

Posted December 01, 2016

What's ahead for 2017 in terms of big data and IoT? IT executives reflect on the impact that Spark, blockchain, data lakes, cognitive computing, AI and machine learning, and other cutting-edge approaches may have on data management and analytics over the year ahead.

Posted November 30, 2016

SUSE is acquiring OpenStack IaaS and Cloud Foundry PaaS talent and technology assets from HPE. The agreement aims to accelerate SUSE's entry into the growing Cloud Foundry Platform-as-a-Service (PaaS) market.

Posted November 30, 2016

AtScale, which provides a self-service BI platform for big data, has announced an expansion of its services. With this announcement, the company says it is introducing a BI platform that enables businesses to work seamlessly across all of big data, on-premises and in the cloud. In addition to Hadoop, AtScale has announced preview availability of support for data stored in Teradata, Google Dataproc, and BigQuery, expanding on the company's existing support for Microsoft Azure and HDInsight.

Posted November 21, 2016

Snowflake Computing, a cloud-based data warehousing company, has launched a partner program. According to the vendor, the program helps Snowflake customers find partners with expertise in strategies, architectures, design principles, and best practices in big data and data analytics that can work with them to accelerate and maximize the benefits of Snowflake.

Posted November 18, 2016

Databricks has announced that, in collaboration with industry partners, it has broken the world record in the CloudSort Benchmark, a third-party industry benchmarking competition for processing large datasets. Databricks was founded by the team that created the Apache Spark project.

Posted November 16, 2016

SAP SE is releasing an update to its SAP S/4HANA platform aimed at improving user productivity. By utilizing a simplified data model and SAP Fiori 2.0 user experience, the SAP S/4HANA 1610 release can reduce complexity and allow applications to yield new business capabilities, according to SAP. "All companies and all line-of-business and industries need to start with 1610 if they want to capitalize on the digital transformation for their company before their competitor does," said Sven Denecken, senior vice president of product management, co-innovation and packaging S/4HANA at SAP SE.

Posted November 16, 2016

IDERA, a provider of database lifecycle management solutions, has released DB PowerStudio 2016+. The portfolio of database management tools was designed and built from the ground up by IDERA following the company's acquisition of Embarcadero in October 2015. With this update, in particular, IDERA adds new features for Oracle databases.

Posted November 16, 2016

New data sources such as sensors, social media, and telematics along with new forms of analytics such as text and graph analysis have necessitated a new data lake design pattern to augment traditional design patterns such as the data warehouse. Unlike the data warehouse - an approach based on structuring and packaging data for the sake of quality, consistency, reuse, ease of use, and performance - the data lake goes in the other direction by storing raw data that lowers data acquisition costs and provides a new form of analytical agility.
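As a rough illustration of that contrast, the sketch below lands a raw event unchanged in a date-partitioned lake path (schema applied later, at read time) and, separately, conforms the same event into a structured row the way a warehouse load would. The directory layout, event fields, and source are hypothetical.

```python
# Sketch contrasting "store raw, structure later" (data lake) with
# "structure before loading" (data warehouse). Paths and event fields
# are hypothetical.
import json
import pathlib
from datetime import date, datetime, timezone

def land_raw(event, lake_root="datalake/events"):
    """Data lake: persist the event exactly as received, partitioned by arrival date."""
    partition = pathlib.Path(lake_root) / f"dt={date.today().isoformat()}"
    partition.mkdir(parents=True, exist_ok=True)
    with open(partition / "part-0000.jsonl", "a") as f:
        f.write(json.dumps(event) + "\n")  # no schema enforced at write time

def curate(event):
    """Warehouse-style: validate and conform before the row is accepted."""
    return (
        str(event["device_id"]),
        datetime.fromtimestamp(event["ts"], tz=timezone.utc).isoformat(),
        float(event["reading"]),
    )

event = {"device_id": "sensor-42", "ts": 1480550400, "reading": 21.7, "fw": "1.2"}
land_raw(event)      # cheap acquisition; unexpected fields like "fw" are kept as-is
row = curate(event)  # structured, quality-checked row ready for a warehouse table
```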

Posted November 03, 2016

Driven by the demands of an always-on global economy, the widespread proliferation of data is combining with an expectation that data will be seamlessly integrated and available in near-real time. However, data integration methods aren't keeping up.

Posted November 02, 2016

Data has become a disruptive force for global businesses and a catalyst for digital transformation. But data can only be leveraged for BI initiatives to the extent it can be accessed and trusted. And, while today's self-service BI and analytics tools satisfy a user's craving for more "consumerized" technology, they often leave an analyst stuck in neutral because the users, first and foremost, cannot find the data they need to perform any analysis.

Posted November 02, 2016

Redis Labs, the home of Redis, is introducing an open source project called Redis-ML, the Redis Module for Machine Learning. The new project will accelerate the delivery of real-time recommendations and predictions for interactive apps in combination with Spark Machine Learning (Spark ML).

Posted November 01, 2016

Trifacta, a provider of data wrangling solutions, is launching Wrangler Edge, a platform designed for analyst teams wrangling diverse data outside of big data environments. "We are packaging the Trifacta product and adding enterprise features such as the ability to schedule jobs to handle larger data volumes and connect to diverse sources," said Will Davis, director of product marketing. "We also added collaboration and sharing features as well, all without requiring organizations to manage a large Hadoop infrastructure."

Posted November 01, 2016

MongoDB has introduced a new release of its NoSQL document database platform with key features that support additional data models for a "multimodel" approach, the combination of operational and analytical processing, elastic cross-region scaling, and tooling to simplify data management for customers.

Posted November 01, 2016

Rosslyn Data Technologies, formerly known as Rosslyn Analytics, has announced the immediate availability of RAPid One-Click Data Analytics, a new suite of self-service automated analytics (SaaS) solutions targeted at helping to reduce the time to visibility and insight.

Posted October 31, 2016

With data flowing into enterprises from so many different sources, and at varying speeds and times, effective solutions are needed to enable insights to be uncovered for faster decision making. To delve into the issues involved in making big data usable more quickly within organizations, DBTA recently presented a webinar featuring executives from the Federal Home Loan Mortgage Corp., known as Freddie Mac.

Posted October 27, 2016

Oracle has introduced new versions of the Oracle Database Appliance, which the company says is designed to save time and money by simplifying deployment, maintenance, and support for database solutions. "Oracle Database Appliance offers a path to cloud, future-proofing your investment," said Karen Sigman, vice president, Platform Business Group, Oracle.

Posted October 26, 2016

Business intelligence (BI) and analytics are at the top of corporate agendas this year, and with good reason. The competitive environment is intense, and business leaders are demanding they have access to greater insights about their customers, markets, and internal operations to make better and faster decisions—often in real time. There have also been dramatic changes with BI and analytics tools and platforms. The three Cs—cloud, consolidation, and collaboration—are elevating BI and analytics to new heights within enterprises and gaining newfound respect at the highest levels.

Posted October 24, 2016

Snowflake Computing is making its platform more accessible to users with Snowflake On Demand—a sign-up process for data users to get immediate insight from Snowflake's data warehouse. According to the vendor, with a virtual swipe of a credit card on Snowflake's website, data users can access the only data warehouse built for the cloud. They can store and analyze their data without relying on their own IT group to get up and running quickly.

Posted October 20, 2016

Managed application service provider TriCore Solutions has acquired Database Specialists, a database managed service company focused on support for Oracle database systems.

Posted October 19, 2016

A new world of self-service BI brings its own issue: data chaos. When everyone is looking at the data their own way, people find different answers to the same questions.

Posted October 10, 2016

Organizational issues such as governance and skills—not technology requirements—are the greatest challenges that IT and corporate managers are facing in the emerging world of big data. To get a better handle on the complex new world big data is catalyzing, executives and professionals recognize they must reimagine and re-architect the concept of the "data center"—and what ultimately comes out of that effort may be a surprise to everyone. These are some key takeaways from a recent survey of 319 corporate and IT managers, conducted by Unisphere Research, a division of Information Today, Inc., in partnership with Cloudera and Intel.

Posted October 07, 2016

For many years now, Cassandra has been renowned for its ability to handle massive scaling and global availability. Based on Amazon's Dynamo, Cassandra implements a masterless architecture which allows database transactions to continue even when the database is subjected to massive network or data center disruption. Even in the circumstance in which two geographically separate data centers are completely isolated through a network outage, a Cassandra database may continue to operate in both geographies, reconciling conflicting transactions—albeit possibly imperfectly—when the outage is resolved.
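
As a small illustration of how an application expresses those availability trade-offs, the sketch below uses the DataStax Python driver (cassandra-driver) to write at LOCAL_QUORUM, which requires acknowledgment only within the local data center, and to fall back to consistency level ONE when not enough replicas respond, as could happen during the kind of outage described above. The contact point, keyspace, and table are hypothetical.

```python
# Minimal sketch with the DataStax Python driver (cassandra-driver).
# Contact point, keyspace, and table are hypothetical.
from datetime import datetime, timezone

from cassandra import ConsistencyLevel, Unavailable, WriteTimeout
from cassandra.cluster import Cluster
from cassandra.query import SimpleStatement

cluster = Cluster(["10.0.0.1"])    # node in the local data center
session = cluster.connect("demo")  # hypothetical keyspace

INSERT = "INSERT INTO user_events (user_id, event_time, action) VALUES (%s, %s, %s)"

def record_event(user_id, action):
    params = (user_id, datetime.now(timezone.utc), action)
    # Prefer LOCAL_QUORUM: a majority of replicas in this data center must
    # acknowledge, so the write does not wait on the remote data center.
    stmt = SimpleStatement(INSERT, consistency_level=ConsistencyLevel.LOCAL_QUORUM)
    try:
        session.execute(stmt, params)
    except (Unavailable, WriteTimeout):
        # During a severe outage, accept a single-replica acknowledgment;
        # Cassandra later reconciles divergent replicas (last write wins by timestamp).
        stmt = SimpleStatement(INSERT, consistency_level=ConsistencyLevel.ONE)
        session.execute(stmt, params)

record_event("u-123", "login")
```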

Posted October 07, 2016
