The Escalating Necessity for Good Database Design

While good database design is always necessary, its value is reinforced by endeavors such as service-oriented architecture.  The semantics-laden engineering of service-oriented tactics used in creating solutions melds seamlessly into the semantics-laden world of data modeling.  Although one aspect does not necessarily build on top of the other, each works as part of a team, braiding solutions that grow in usefulness as they emerge.  The processes surface as a method to further define the meaning of an object, and the object serves as a harbinger of the processes that must exist.  Business rules much more advanced than today's simple constraints may one day reside within the database itself as a built-in function of some future DBMS.  Therefore, data architects should never wall themselves off by thinking only in terms of data, without a thought for the processes that touch that data.  For a database designer, working within projects that take a service-oriented approach can be exhilarating.

Everything starts with the base semantic layer of business objects that exist within the universe of discourse.  Even a partial listing of required objects helps move the overall design forward rapidly.  These business objects become active as the initial base CRUD processes are defined (the processes that manage the Create, Read, Update, and Delete functions that must exist for every primary object).  The next leap comes as the lifecycle of these business objects is addressed.  How do these business objects change over time?  Is there a state-transition progression that accurately describes how each object must advance from creation through to its eventual deletion?  As one works through the lifecycle questions, doors open on an array of database design issues.  What reference tables defining states and sub-states should be incorporated?  What elements are required for auditing structures that track the history of changes made to individual instances?  Complexity grows as what was once a set of simple business objects and processes must eventually begin interacting with aggregated processes running larger business tasks.  One is no longer simply updating one object; instead, one must coordinate a grouping of related objects, with changes orchestrated among them all.
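The lifecycle questions above can be sketched concretely.  The following is a minimal, hypothetical illustration (the table and state names are invented for this example, not taken from any particular system) of three of the structures mentioned: a reference table of states, a table of allowed state transitions, and an audit table that tracks the history of changes made to individual instances.  SQLite is used here purely because it ships with Python; the same shapes apply to any relational DBMS.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Reference table enumerating the allowed states for a hypothetical "order" object.
cur.execute("""
    CREATE TABLE order_state (
        state_code  TEXT PRIMARY KEY,
        description TEXT NOT NULL
    )""")
cur.executemany("INSERT INTO order_state VALUES (?, ?)",
    [("NEW", "Order created"), ("PAID", "Payment received"),
     ("SHIPPED", "Order shipped"), ("CLOSED", "Order closed")])

# The business object itself, carrying a reference to its current state.
cur.execute("""
    CREATE TABLE customer_order (
        order_id   INTEGER PRIMARY KEY,
        state_code TEXT NOT NULL REFERENCES order_state(state_code)
    )""")

# The state-transition progression, expressed as data rather than code.
cur.execute("""
    CREATE TABLE order_state_transition (
        from_state TEXT NOT NULL,
        to_state   TEXT NOT NULL,
        PRIMARY KEY (from_state, to_state)
    )""")
cur.executemany("INSERT INTO order_state_transition VALUES (?, ?)",
    [("NEW", "PAID"), ("PAID", "SHIPPED"), ("SHIPPED", "CLOSED")])

# Audit structure tracking the history of changes to individual instances.
cur.execute("""
    CREATE TABLE customer_order_audit (
        audit_id   INTEGER PRIMARY KEY AUTOINCREMENT,
        order_id   INTEGER NOT NULL,
        old_state  TEXT,
        new_state  TEXT NOT NULL,
        changed_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
    )""")

def change_state(order_id, new_state):
    """Advance an order to new_state only when the transition table
    permits it, recording the change in the audit table."""
    (old_state,) = cur.execute(
        "SELECT state_code FROM customer_order WHERE order_id = ?",
        (order_id,)).fetchone()
    allowed = cur.execute(
        "SELECT 1 FROM order_state_transition "
        "WHERE from_state = ? AND to_state = ?",
        (old_state, new_state)).fetchone()
    if not allowed:
        raise ValueError(f"Illegal transition {old_state} -> {new_state}")
    cur.execute("UPDATE customer_order SET state_code = ? WHERE order_id = ?",
                (new_state, order_id))
    cur.execute("INSERT INTO customer_order_audit (order_id, old_state, new_state) "
                "VALUES (?, ?, ?)", (order_id, old_state, new_state))

cur.execute("INSERT INTO customer_order VALUES (1, 'NEW')")
change_state(1, "PAID")
```

Because the transitions live in a table rather than in application logic, the progression can be extended (a "RETURNED" state, say) without touching code, which is exactly the kind of flexibility the lifecycle analysis is meant to surface.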

Eventually, somewhere along the development path, a contextual leap is made.  Meta-functions must be considered that amount to a form of application-level self-awareness.  These higher-level tasks include watching, monitoring, and controlling the changes across the direct business components.  Workflows of varying complexity drive the functionality that will be exposed to application users.  Database structures may be needed to help control these workflows in a flexible manner.  Beyond the steps that comprise the workflow, the user interface may consist of more customizable elements, such as "time zone displayed," "currency used," or "background color."  Some of these tasks are managed by external tools that may be purchased; others may be custom-built.
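As a rough sketch of what such structures might look like, the fragment below (again using Python's built-in SQLite, with invented table and column names) stores a workflow's steps and a user's presentation preferences as data, so that both the flow and the interface can be reconfigured without code changes:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Workflow definition: an ordered list of steps stored as rows,
# so the flow can be rearranged or extended via simple inserts.
cur.execute("""
    CREATE TABLE workflow_step (
        workflow_name TEXT NOT NULL,
        step_number   INTEGER NOT NULL,
        step_name     TEXT NOT NULL,
        PRIMARY KEY (workflow_name, step_number)
    )""")
cur.executemany("INSERT INTO workflow_step VALUES (?, ?, ?)",
    [("order_fulfillment", 1, "validate"),
     ("order_fulfillment", 2, "pick"),
     ("order_fulfillment", 3, "ship")])

# Customizable interface elements (time zone, currency, background color)
# kept as generic key/value pairs per user.
cur.execute("""
    CREATE TABLE user_preference (
        user_id    INTEGER NOT NULL,
        pref_key   TEXT NOT NULL,
        pref_value TEXT NOT NULL,
        PRIMARY KEY (user_id, pref_key)
    )""")
cur.executemany("INSERT INTO user_preference VALUES (?, ?, ?)",
    [(42, "time_zone", "UTC"), (42, "currency", "EUR"),
     (42, "background_color", "#FFFFFF")])

def next_step(workflow_name, current_step):
    """Return (step_number, step_name) of the step after current_step,
    or None when the workflow is complete."""
    return cur.execute(
        """SELECT step_number, step_name FROM workflow_step
           WHERE workflow_name = ? AND step_number > ?
           ORDER BY step_number LIMIT 1""",
        (workflow_name, current_step)).fetchone()
```

The key/value shape for preferences trades some type safety for flexibility; a purchased workflow engine would supply richer structures, but the principle of driving behavior from tables rather than code is the same.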

Defining the components that make up a solution becomes akin to composing a clockwork symphony wherein all the pieces must operate coherently for the proper sounds to materialize.  When considered, thoughtful effort has gone into the design of both database and methods, a healthy synergy emerges between the data and the processes, so that distinct events and sequences have a natural independence, each flowing smoothly from start to finish.  A good semantic design, for data and for process alike, leaves the door open to future improvements.  Such attentive design serves as an intuitive guide for enhancements and makes the proper place for envisioned changes more obvious.  Those changes, in turn, should remain independent of previously defined elements, so that each new change does not cause unnecessary rework.