Governance Gods, Guardians, and Guides


Debates about when and where to apply AI systems often boil down to a singular point. Namely, what is the nature of the work in question? It is a simple query, devilishly so, one that data and AI governance programs are well served to consider. It’s a larger conversation that got me thinking about the work that is unsung, if not entirely overlooked, in traditional governance frameworks.

Advisor, arbitrator, or auditor? Pick one. Each of these functions requires different expertise with divergent levels of decision-making authority. Beyond this, as an associate recently reiterated, you can't grade your own homework. Clear demarcation avoids self-reinforcing perverse incentives. This requires that each role be represented discretely within any governance organization.

Unfortunately, it is quite commonplace to anoint data stewards, in particular, with all three tasks: defining rules, arbitrating disputes, and evaluating compliance. This is an exercise in futility as well as frustration. Traditional governance frameworks also tend to focus primarily on what I will figuratively refer to as the gods and the guardians: those who make the rules and those who enforce them, with far less investment, if any, in the guides (aka the advisors).

Before expounding on advisors, a word of caution about arbitration. Since it's impossible to please everyone all the time, success often means leaving everyone a little unhappy. In more politic terms, it's asking everyone to compromise in support of a shared goal. Not everyone, every time, but most, most of the time. It's common and altogether too easy to put data stewards in charge of settling data disputes without acknowledging that nonperformative adjudication requires authority. This authority is rarely accorded to those outside executive or senior leadership teams; it sits more naturally at the level of a senior governance council or oversight committee. So, what might a better assignment be for our would-be data stewards or their ilk?

Well-earned tropes about consultants aside, the value of internal advisors is underestimated, if not flat-out misunderstood. As we continue to carve out roles for making rules and tracking conformance, companies must also make room for guidance.

Properly advising teams requires understanding both why the applicable process/product exists and the mindset of the organization itself. In the realm of ethics, the latter would be called the organization’s shared moral framework. It is easy to go wrong by assuming this is a simple checklist exercise.

Indeed, it would be straightforward if the work were a single, simple comparative exercise. Is proposed use A in conformance with rule 1? There is room and a role for this: that of the auditor. An advisor is different.

Yes, advisors require a working knowledge of the applicable rules. Well beyond this, an advisor must be in tune with the nature of the underlying work.

Asking whether a proposed data use conforms with established practices is step one. A yes or a no orients the advisor toward the next line of inquiry. If no: What are the potential implications of using the information in this way? What legal or contractual red lines does this skirt or flirt with today? If yes: Will conformance with established practice, acceptable today, nonetheless produce untoward effects in time? Stated more directly: Are our mindset and practices still fit for purpose? What legal or contractual red lines might this cause us to skirt or flirt with tomorrow? Yes or no: Are there alternate approaches to consider? What tradeoffs does each choice require?
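The line of inquiry above is procedural enough to sketch in code. What follows is a minimal, hypothetical illustration; the function name and question wording are assumptions for this sketch, not an established framework:

```python
# Hypothetical sketch of the advisor's branching line of inquiry.
# The questions paraphrase the text above; no real framework is implied.

def advisory_questions(conforms: bool) -> list[str]:
    """Return follow-up questions, given whether a proposed data use
    conforms with established practice."""
    if conforms:
        questions = [
            "Will conformance, acceptable today, produce untoward effects in time?",
            "Are our mindset and practices still fit for purpose?",
            "What red lines might this cause us to skirt or flirt with tomorrow?",
        ]
    else:
        questions = [
            "What are the implications of using the information this way?",
            "What legal or contractual red lines does this skirt or flirt with today?",
        ]
    # Asked regardless of the yes/no answer:
    questions += [
        "Are there alternate approaches to consider?",
        "What tradeoffs does each choice require?",
    ]
    return questions
```

The point of the branching is that a conformance check is only the entry point; either answer opens further questions, and some questions apply no matter what.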

This level of awareness, and the practice of foresight, is increasingly important in an age dominated by data-fueled automation and augmentation.

Blind conformance to existing standards begets precarious data practices in which no data can go unmined. In which no insight, no matter how contrived, is underived. No connection, no matter how tenuous, unplumbed. In this environment, the tendency is to default to finding reasons to say yes without exploring countervailing reasons to say no. Worse yet, teams are forced to justify jettisoning data rather than critically evaluating whether it should be captured, created, or kept at all. Over and above this, regulatory, legal, and operational policies and rules are rarely strictly hierarchical. First A, then B, unless C. Different operational contexts may necessarily assign different weights to the same rules. Addressing and, when necessary, reconciling these tensions is core to this work.
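The "first A, then B, unless C" pattern can be made concrete. A minimal sketch, with rule labels, the exception trigger, and the context weights all invented purely for illustration:

```python
# Hypothetical illustration of non-hierarchical rule precedence.
# Rule labels, the "legal_hold" exception, and the weights are invented
# for this sketch; no real policy framework is implied.

def rule_order(context: dict) -> list[str]:
    """Order rules A and B by context-specific weight, unless
    exception C is triggered, in which case C preempts both."""
    if context.get("legal_hold"):  # the "unless C" branch
        return ["C", "A", "B"]
    # Different operational contexts weight the same rules differently.
    weights = context.get("weights", {"A": 1.0, "B": 0.5})
    return sorted(weights, key=weights.get, reverse=True)
```

The design choice worth noticing is that precedence is computed from the operating context rather than fixed in advance, which is exactly why reconciling these tensions is judgment work rather than a lookup.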

Great guidance explores the junctions between intent and practice. It is the realm of impact and implication, in which the job is not only about a singular path (i.e., "best route now due to traffic") but the alternates as well. As we've all likely learned, sometimes the vaunted route lands you, cursing, in a foreseeable jam.

The current competitive, regulatory, and legal landscapes around data and analytics feel more turbulent than ever. In this environment, it is no longer enough to focus AI and data governance only on those who make and enforce the rules. We must deliberately invest in our guides as well.


