



Data Modeling

Data modeling and database design and development – including popular approaches such as Agile and Waterfall design – provide the basis for the visualization and management of business data in support of initiatives such as big data analytics, business intelligence, data governance, data security, and other enterprise-wide data-driven objectives.



Data Modeling Articles

The latest release of Embarcadero's portfolio of database tools adds first-class support for Teradata in addition to updating support for the latest releases of the major RDBMSs. Overall, a key theme for the XE5 releases is an emphasis on scale, as big data, with big models and big applications, requires close collaboration across big teams, said Henry Olson, Embarcadero director of product management.

Posted February 26, 2014

In the last several years, there has been an explosion in the array of choices for not only managing relational and unstructured data but also protecting it and extracting value from it. To shine a spotlight on the best data management offerings, DBTA will soon open nominations for our first-ever Readers' Choice Awards.

Posted February 26, 2014

In many organizations, users find it hard to trust their own internal information technology (IT) group, leading them to try any possible option to solve problems on their own. The resulting stealth IT projects can lead to confusion or even complete chaos.

Posted February 10, 2014

DBTA is seeking speakers who possess unique insight into leading technologies, and experience with successful IT and business strategies for the Data Summit conference in New York City, May 12-14, 2014. The deadline to submit your proposal is January 31, 2014.

Posted January 20, 2014

Changes and enhancements to solutions are hard, even under the best of circumstances. It is not unusual that, as operational changes roll out into production, the business intelligence area is left uninformed, suggesting that data warehouses and business intelligence be categorized according to the view of the old comedian Rodney Dangerfield because they both "get no respect."

Posted January 07, 2014

The newest release of Oracle SQL Developer, Oracle's integrated development environment, optimizes development and database administration for Oracle Database 12c and expands automation of third-party migrations to Oracle. Addressing the need for user-friendly tools to speed and simplify development and data management activities, Oracle is seeking to increase productivity for database development tasks so organizations can fully capitalize on the power of enterprise data.

Posted December 23, 2013

The data-driven demands on organizations have never been greater. Two of the most pressing concerns that organizations face today are the need to provide analytic access to newer data types such as machine-generated data, documents and graphics, and the need to control the cost of information management for growing data stores. DBTA's new list of Trend-Setting Products in Data for 2014 highlights the products, platforms, and services that seek to provide organizations with the tools necessary to address rapidly changing market requirements.

Posted December 20, 2013

SAP is strengthening its commitment to the developer community with key open source contributions, a real-time development experience for SAP HANA, and the publication of a new unified developer license.

Posted December 18, 2013

The latest release of CA ERwin Data Modeler, a solution for collaboratively visualizing and managing business data, addresses two major objectives - the need for organizations to manage more data across more platforms, and to easily share that data with an expanding number of users with a range of roles and skill sets.

Posted December 17, 2013

OpenText, a provider of Enterprise Information Management (EIM) software, has announced Project Red Oxygen, which the company describes as a "harmonized" release of new EIM software advancements designed to give CIOs the ability to extract value from their enterprise information and accelerate competitive advantage.

Posted December 02, 2013

Two new approaches to application quality have emerged: "risk-based testing" - pioneered in particular by Rex Black - and "exploratory testing" - as evangelized by James Bach and others. Neither claims to eradicate issues of application quality, which most likely will persist as long as software coding involves human beings. However, along with automation of the more routine tests, these techniques form the basis for higher-quality application software.

Posted November 20, 2013

Changes to database structures should be performed in a coordinated fashion as the application processes that support the new functionality are rolled out into production. While the "work" involved in adding a column or a table to a relational database is actually minimal, often there are circumstances where developers and DBAs create additional columns and additional tables in anticipation of future needs. Sadly, this "proactive" effort results in databases littered with half-formed ideas, fits-and-starts, and scattered-about columns and tables that provide no meaningful content.

Posted November 13, 2013

Constantly changing tax rules can make payroll deductions and tax payments a time-consuming and costly endeavor for businesses. To get this onerous job done efficiently and cost-effectively, many utilize payroll software specialists that provide tools to support their in-house staff. Read how Revelation Software's OpenInsight and OpenInsight for Web are giving Ardbrook, a Dublin, Ireland-based software provider of payroll software, the agility it needs.

Posted October 23, 2013

Oracle has released the latest version of VM VirtualBox which provides a virtual multi-touch user interface, supports additional devices and platforms, and offers enhanced networking capabilities that allow developers to virtualize modern operating system features while maintaining compatibility with legacy operating systems. Designed for IT professionals, Oracle VM VirtualBox is cross-platform virtualization software that enables users to run multiple operating systems at the same time.

Posted October 16, 2013

How does one avoid the semantically wishy-washy use of NULL surrogates and instead design structures wherein NULLs are not necessary?
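One common answer is to decompose optional attributes into their own tables, so that absence is represented by a missing row rather than a NULL or a magic surrogate value. A minimal sketch, using a hypothetical employee/termination schema of my own invention (the article's actual designs may differ):

```python
import sqlite3

# Rather than a nullable termination_date column on employee, the optional
# fact lives in its own table. Absence of a row means "still employed" --
# no NULL, and no surrogate date such as '9999-12-31'.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employee (
    emp_id    INTEGER PRIMARY KEY,
    emp_name  TEXT NOT NULL
);
CREATE TABLE employee_termination (
    emp_id           INTEGER PRIMARY KEY REFERENCES employee(emp_id),
    termination_date TEXT NOT NULL
);
""")
conn.execute("INSERT INTO employee VALUES (1, 'Ada'), (2, 'Grace')")
conn.execute("INSERT INTO employee_termination VALUES (2, '2013-09-30')")

# Active employees are found by the absence of a termination row,
# not by testing a column against NULL or a placeholder date.
active = [name for (name,) in conn.execute("""
    SELECT e.emp_name
    FROM employee e
    WHERE NOT EXISTS (SELECT 1 FROM employee_termination t
                      WHERE t.emp_id = e.emp_id)
""")]
```

The NOT EXISTS anti-join keeps even the query free of NULL tests, which is the point of the decomposition.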

Posted October 09, 2013

A successful DBA must understand application development and the issues involved in programming and design. Here are some things that every DBA must know about application development and the design projects of their organization.

Posted October 09, 2013

Database management systems support numerous unique date and time functions - and while the date-related functions are many, they do not go far enough. One date-driven circumstance often encountered involves objects that need a date range associated with them. While there are some exceptions, this need generally ends up implemented via two distinct date columns: one signaling the "start" and the other designating the "end." Should the creative juices of DBMS builders flow, perhaps numeric-range data types could be created in addition to a date-range data type. Who knows where things could end up?
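Absent a native range type, the start/end-column pattern pushes containment and overlap logic onto the application. A small sketch of both tests, using an invented price-history example (the column names and data are illustrative assumptions, not from the article):

```python
from datetime import date

# Hypothetical price-history rows: with no native date-range type, the
# range becomes two columns -- an inclusive start date and end date.
price_history = [
    {"sku": "A1", "start": date(2013, 1, 1), "end": date(2013, 6, 30), "price": 9.99},
    {"sku": "A1", "start": date(2013, 7, 1), "end": date(2013, 12, 31), "price": 10.99},
]

def price_on(rows, sku, as_of):
    """Containment test a DBMS could handle natively with a range type."""
    for r in rows:
        if r["sku"] == sku and r["start"] <= as_of <= r["end"]:
            return r["price"]
    return None

def ranges_overlap(a, b):
    """Two inclusive ranges overlap iff each starts on or before the other ends."""
    return a["start"] <= b["end"] and b["start"] <= a["end"]
```

The overlap predicate is the classic two-comparison form; keeping history rows non-overlapping is exactly the kind of constraint a range data type could enforce declaratively.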

Posted September 11, 2013

Data models attempt to express the business rules of an organization. A good data model reflects the semantics used within an organization to such an extent that business people within that organization can relate to and easily agree with what is being expressed. In this regard the data modeler's goal is to properly mirror back the organization's concepts onto those people within the organization. The goal is not to force an organization into a "standard" data model, nor is the goal to abstract everything in the creation of a master model that will never need to change even if the business rules were drastically shifted.

Posted September 03, 2013

One of the principles within relational theory is that each entity's row or tuple be uniquely identifiable. This means the defined structure includes some combination of attributes whose populated values serve to identify an individual row within the table/relation. This attribute, or combination of attributes, is a candidate key for the structure. If a structure has a single candidate key, it serves as the primary key; if a structure has multiple candidate keys, one of them is designated as the primary key. When building up a logical design, primary keys should be identified by the actual data points in play.
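The defining property of a candidate key is uniqueness over the rows. A minimal sketch of that check, against an invented Employee sample (the attributes and data are my own illustration):

```python
def is_candidate_key(rows, attrs):
    """True when the attribute combination uniquely identifies every row."""
    values = [tuple(row[a] for a in attrs) for row in rows]
    return len(values) == len(set(values))

# Hypothetical sample of an Employee entity.
employees = [
    {"emp_no": 101, "email": "ada@example.com",    "dept": "ENG"},
    {"emp_no": 102, "email": "grace@example.com",  "dept": "ENG"},
    {"emp_no": 103, "email": "edsger@example.com", "dept": "OPS"},
]

# Here both emp_no and email are candidate keys; dept is not, because two
# rows share the value "ENG". One candidate key would be designated the
# primary key in the logical design.
```

Note that a uniqueness check over sample data can only falsify a candidate key; confirming one is a statement about the business rules, not the current rows.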

Posted August 06, 2013

Database Trends and Applications introduces the inaugural "DBTA 100," a list of the companies that matter most in data. The past several years have transformed enterprise information management, creating challenges and opportunities for companies seeking to extract value from a sea of data assets. In response to this, established IT vendors as well as legions of newer solution providers have rushed to create the tools to do just that.

Posted June 27, 2013

Database Trends and Applications (DBTA) magazine has announced the inaugural "DBTA 100: The Companies That Matter Most in Data," a list saluting this year's companies in data and enterprise information management—from long-standing industry veterans to fast-growing startups tackling big data. "Beyond the explosion of interest surrounding big data, the past several years have transformed enterprise information management, creating both challenges and opportunities for companies seeking to protect, optimize, integrate, and extract actionable insight from a sea of data assets," remarked Thomas Hogan, group publisher of Database Trends and Applications.

Posted June 26, 2013

Dell has released Toad for Oracle 12.0 which provides developers and DBAs with a key new capability - a seamless connection to the Toad World user community so they will no longer have to exit the tool and open a browser to gain access to the community. "The actual strength of the product has always been the input of users," John Whittaker, senior director of marketing for the Information Management Group at Dell Software, tells 5 Minute Briefing. The new ability to access the Toad World community from within Toad enables database professionals to browse, search, ask questions and start discussions directly in the Toad forums, all while using Toad.

Posted June 19, 2013

The grain of a fact table is derived from the dimensions with which the fact is associated. For example, should a fact have associations with a Day dimension, a Location dimension, a Customer dimension, and a Product dimension, then the usual assumption would be for the fact to be described as being at a "by Day," "by Location," "by Customer," "by Product" metrics level. Evidence of this specific level of granularity is seen in the primary key of the fact table being the composite of the Day, Location, Customer, and Product dimension keys. However, this granularity and these relationships are easily disrupted.
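The composite primary key is what makes the declared grain enforceable: a second row at the same Day/Location/Customer/Product coordinates is rejected. A sketch using invented table and column names:

```python
import sqlite3

# A fact at the "by Day, by Location, by Customer, by Product" grain:
# the primary key is the composite of the four dimension keys.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE sales_fact (
    day_key      INTEGER NOT NULL,
    location_key INTEGER NOT NULL,
    customer_key INTEGER NOT NULL,
    product_key  INTEGER NOT NULL,
    sales_amount REAL    NOT NULL,
    PRIMARY KEY (day_key, location_key, customer_key, product_key)
)
""")
conn.execute("INSERT INTO sales_fact VALUES (20130613, 7, 42, 9, 19.95)")

# A second row at the exact same grain violates the composite key --
# the key constraint is the evidence of the declared granularity.
try:
    conn.execute("INSERT INTO sales_fact VALUES (20130613, 7, 42, 9, 5.00)")
    duplicate_rejected = False
except sqlite3.IntegrityError:
    duplicate_rejected = True
```

Dropping one key from the composite, or adding a degenerate dimension, silently changes the grain, which is one way the relationships get disrupted.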

Posted June 13, 2013

There is an emerging field of companies looking to take on the challenges presented by the roiling tide of big data. While their visions vary, each has identified a market need that it believes its technology uniquely addresses. Here, DBTA highlights the approaches of 10 companies we think are worth watching.

Posted June 13, 2013

It seems that juggling is the most useful of all skills when embarking on a data warehousing project. During the discovery and analysis phase, the workload grows insanely large, like some mutant science fiction monster. Pressures to deliver can encourage rampant corner-cutting to move quickly, while the need to provide value urges caution in order not to throw out the proverbial baby with the bath water as the project speeds along. Change data capture is one area that is a glaring example of the necessary juggling and balancing.

Posted May 22, 2013

Datawatch Corporation, provider of information optimization solutions, has announced a strategic partnership with Lavastorm Analytics, an analytics software vendor, to provide customers the ability to expand their use of unstructured and semi-structured data sources when developing analytic applications.

Posted May 07, 2013

Dimensions are the workhorses of a multidimensional design. They are used to manage the numeric content being analyzed. It is through the use of dimensions that the metrics can be sliced, diced, drilled-down, filtered and sorted. Many people relate to dimensions by thinking of them as reference tables. Such thoughts aren't exactly accurate. A dimension groups together the textual/descriptor columns within a rationalized business category. Therefore, much of the content coming from relational tables may be sourced from reference tables, but the relationship between each source reference table and the targeted dimension is unlikely to be one-for-one. These grouped-format dimensions often contain one or more hierarchies of related data items used within the OLAP queries supported by the structures.
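The "slicing" a dimension enables is, at bottom, grouping fact metrics by one of the dimension's descriptor columns. A toy in-memory sketch (the product/category/brand data is an invented illustration of the pattern, not from the article):

```python
# A dimension groups descriptor columns -- here category and brand, a
# small hierarchy -- apart from the numeric fact rows that reference it.
product_dim = {
    1: {"category": "Coffee", "brand": "Acme"},
    2: {"category": "Coffee", "brand": "Best"},
    3: {"category": "Tea",    "brand": "Acme"},
}
sales_fact = [  # (product_key, sales_amount)
    (1, 10.0), (2, 5.0), (3, 2.5), (1, 7.5),
]

def slice_by(facts, dim, attr):
    """Slice the metric by one dimension attribute (i.e., a GROUP BY)."""
    totals = {}
    for key, amount in facts:
        group = dim[key][attr]
        totals[group] = totals.get(group, 0.0) + amount
    return totals
```

Slicing by "category" rolls the same facts up one level of the hierarchy that slicing by "brand" would cut differently, which is why the grouped-format dimension, rather than a one-for-one copy of source reference tables, is the useful shape for OLAP queries.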

Posted April 10, 2013

Do not allow well-meaning but confused proponents to obscure concepts related to normalization and dimensional design. Under a normalized approach, one usually would not expect numeric data items and textual data items to fall into different logical relations when connected to the same entity object. Yet within a multidimensional approach, that is exactly what happens. Multidimensional design and normalized design are not the same, and one should not claim that both approaches were used and that they resulted in the same data model.

Posted March 14, 2013

Establishing a data warehousing or business intelligence environment initiates a process that works its way through the operational applications and data sources across an enterprise. This process focuses not only on identifying the important data elements the business lives and breathes, but the process also tries very hard to provide rationality in explaining these elements to business intelligence users.

Posted February 27, 2013

Sonata Software, an IT consulting and software services provider headquartered in Bangalore, India, has announced its center of excellence (CoE) for Exalytics, Oracle's engineered system designed for high performance data analysis, modeling and planning.

Posted February 20, 2013

Multi-dimensional design involves dividing the world into dimensions and facts. However, like many aspects of language, the term "fact" is used in multiple ways. Initially, the term referred to the table structure housing the numeric values for the metrics to be analyzed. But "fact" also is used to refer to the metric values themselves. Therefore, when the unique circumstances arise wherein a fact table is defined that does not contain specific numeric measures, such a structure is referred to by the superficially oxymoronic characterization of a "factless fact."
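In a factless fact table, the row's existence is the fact: only dimension keys appear, and the natural metric is a COUNT of rows rather than a SUM of a measure. A sketch using a hypothetical class-attendance example (a common textbook illustration; the names are my own):

```python
import sqlite3

# A factless fact: class attendance has no numeric measure to store.
# The table holds only the dimension keys of each attendance event.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE attendance_fact (
    day_key     INTEGER NOT NULL,
    student_key INTEGER NOT NULL,
    class_key   INTEGER NOT NULL,
    PRIMARY KEY (day_key, student_key, class_key)
)
""")
conn.executemany(
    "INSERT INTO attendance_fact VALUES (?, ?, ?)",
    [(20130212, 1, 10), (20130212, 2, 10), (20130213, 1, 10)],
)

# With no measure to SUM, analysis counts rows at the chosen grain.
(attendance_on_20130212,) = conn.execute(
    "SELECT COUNT(*) FROM attendance_fact WHERE day_key = 20130212"
).fetchone()
```

So the structure is "factless" only in the first sense of the word: it houses no metric columns, yet each row still records a fact.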

Posted February 12, 2013
