When Einstein made his glorious splash in the world of physics, everything changed. Well, if you were a theoretical physicist, it did.
One impact of Einstein's theories was that space-time came to be understood as curved, making the shortest path between two points an arc rather than a true straight line. Yet however true it is that the shortest distance between two points is an arc, students across the world are still regularly made to sit through lessons in Euclidean geometry and Newtonian physics. The reason for this anomaly is that, even though they are “old,” Newton's conceptions of reality are useful, and far easier to work with than the complexities Einstein introduced.
For most circumstances, straight-line thinking provides numeric answers that are as good as any other, even though the calculations rest on a picture of reality that isn't exactly true. In other words, the Newtonian formulations provide instrumentality.
Similarly, good database design need not worry so much about describing an objective reality. Arcs, straight lines, or even dashed lines are fine. If an absolutely objective reality were required, data models could never be completed, because reality isn't nearly as objective and absolute as we might wish it were. If you have any doubts about that, look into some of the ramifications of quantum physics, spooky action at a distance, string theory, et cetera.
Putting aside metaphysics for a moment, the objects and relationships identified within any database design need to match the very subjective realities considered true within a given organization. An optimal data model must incorporate the semantics of the business and provide a structural arrangement that supports the necessary corporate instrumentality. And the model should do so even if, in its details, it contradicts what other organizations consider the nature of their reality. However, data modelers cannot simply accept everything they hear and incorporate it without critical assessment.
In architecting a specific model, the data modeler must evaluate the worth of the ideas uncovered during the discovery process. When something unusual or unique turns up, it deserves deeper evaluation. Organizations can be much like individuals: sometimes they have unique perspectives that truly work for them, and other times they are simply misleading themselves. This is where experience helps the database designer create a data model that is useful today and remains useful well into the future. Designers must try to determine which unusual or unique organizational perspectives are good (or at least harmless) and which are keeping the organization from growth or deeper insights.
The distinction between harmless and limiting can be subtle. Often, choices were made to ease expanding an existing implementation, and limitations were accepted as a way to get things done today at the price of corrupting future options. Those future options may still be desired, but they are lost in the you-can't-get-there-from-here trap created by that first decision. Recognizing such circumstances can be hard. A short-sighted perspective may be exposed when discovery uncovers a general chain of states or objects, an A-leads-to-B-leads-to-C pattern, yet A, B, and C do not appear as distinct objects in the existing databases. For example, there could be a Customer Order to Purchase Order to Supplier Invoice to Customer Invoice sequence that is missing some of these components, or in which several of the objects have been collapsed into one even though they do not have one-to-one relationships.
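The problem with collapsing that chain can be sketched in a few lines. The following is a minimal, hypothetical illustration (all entity and field names are invented for this example, not taken from any particular schema): each step in the sequence is modeled as its own object, and because one customer order can fan out into several purchase orders, the relationship is one-to-many, so folding the two into a single record would force duplication or lose information.

```python
from dataclasses import dataclass

# Hypothetical entities for the Customer Order -> Purchase Order chain.
# Each step in the business sequence is its own object with its own key.

@dataclass
class CustomerOrder:
    order_id: str

@dataclass
class PurchaseOrder:
    po_id: str
    customer_order_id: str  # foreign reference back to the customer order

@dataclass
class SupplierInvoice:
    invoice_id: str
    po_id: str              # one PO may be billed across several invoices

# One customer order fans out to two purchase orders: a one-to-many
# relationship, which a single collapsed "order" record could not hold
# without duplicating the customer-order fields on every row.
co = CustomerOrder("CO-1")
pos = [
    PurchaseOrder("PO-1", co.order_id),
    PurchaseOrder("PO-2", co.order_id),
]
assert all(po.customer_order_id == co.order_id for po in pos)
```

Keeping the entities distinct means the model can still answer questions such as "which purchase orders came from this customer order," which is exactly the kind of future option that a collapsed design forecloses.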
Recognition is key; next comes uncovering what else is done to work around any shortcomings, for it is the work-arounds that flag the things that ultimately need to change. Database design is still more art than science because it is built on semantics and perspectives; an ideal design is always established while shooting at a moving target.