The Meaning of Done


In the analytics world, when building data marts or other data areas meant to support end users, the goal is to present the “needed data” so that users can perform their queries and analyses. This means that if an “XYZ Value” is required, then the “XYZ Value” is presented. If instead we present the “LMNO Value” along with the “PQR Code,” and require users to determine that a “PQR Code” value of 33 means the associated “LMNO Value” is actually the “XYZ Value” they want, then we have done only some of the work and left more for the user to do before they can get to their real work. Beyond writing this extra code to obtain the necessary data content, users first need to learn the intricacies of exactly how to find this data. Yes, users may learn how to perform these tasks, and their tools may even help them; but we in IT have still fallen short in providing the “needed data.” What we have given the user may indeed be better than nothing, and may even be better than what the user had before our new solution. But the solution still has not been properly delivered to our users.
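To make the difference concrete, here is a minimal sketch in pandas using the article’s own placeholder names (“LMNO Value,” “PQR Code,” “XYZ Value”) and an assumed code value of 33; the column names and data are purely illustrative. The first lookup is the extra work every user must repeat on their own; the last line is what a fit-for-purpose mart would already have delivered.

    import pandas as pd

    # Hypothetical mart extract using the article's placeholder column names.
    mart = pd.DataFrame({
        "LMNO Value": [120.0, 45.5, 88.0],
        "PQR Code":   [33, 17, 33],
    })

    # Less-than-done: each user must rediscover that PQR Code 33 marks the
    # rows whose LMNO Value is really the XYZ Value they are after.
    xyz_built_by_user = mart.loc[mart["PQR Code"] == 33, "LMNO Value"]

    # Fit-for-purpose: the mart ships the needed column, and users just grab it.
    mart["XYZ Value"] = mart["LMNO Value"].where(mart["PQR Code"] == 33)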

Why do we fall short in our solution delivery? Often it comes down to a desire to make things easy for our engineers. We size our tasks so that the engineer can deliver quickly; we have only limited resources, and we need to spend them wisely. And while that reasoning may have been the start of it, things tend to escalate quickly.

Scenario 1: Development is cut short under the umbrella of, “this is the Minimum Viable Product (MVP), so we have reached done and we will go back and improve it later.” Only later never comes, and new priorities rule.

Scenario 2: We have done Scenario 1 so many times that we do not even think about doing anything more, because “it is how we always do it.” And these days we can also hide many of these sins beneath an AI layer: the user doesn’t need to know anything about the data issue, because the AI will do the work instead of the end user. While that may seem like a magic answer, we have still done a less-than-complete job initially. Tools are ever evolving, and the development-expense situation may have changed; the “added work” that was initially avoided may now be only a minor two-minute task. But we haven’t even thought about doing it, because we do not consider doing our tasks any differently than they have always been done, even when we have new tools.

While we may choose to feel good about delivering a less-than-complete resolution to our users, perhaps we should reflect a little further on how well we are actually doing our job.

Can users answer their questions as easily as “falling off a log”? Are all users virtually guaranteed to get the same correct answer? Do users need to write extra code to get what they really want? Do they need to do more than simply find the desired data item in a catalog? Are they truly gathering answers, or building them? While it is fine for a data scientist to go exploring, finding new relationships and building brand-new measures, the average dashboard and report user needs the “grab and go” option. Providing the right solutions to these “data farmers” should be approached in a fashion that minimizes the need for the user to be creative and learned. We shouldn’t consider a solution fully done until it is fit-for-purpose done, not minimally viably done.

