Artificial Intelligence (AI) has been a hot topic of conversation for some years now. However, since the release of ChatGPT in late 2022, the topic has taken on new energy. In discussions with organizations, two themes have emerged.
First is a desire, or in many instances a directive, on the part of leadership to do “something with AI.” In many cases, organizations have a notion of the strategic value of AI but have yet to translate this into a deeper understanding that supports the development of an implementation approach and an operational strategy.
Second, the data and application teams charged with figuring out what doing “something with AI” means have no clear mandate. How do data practitioners move forward within their organizations? Many organizations simply do not have the internal operational, budgeting, and decision-making structures to easily execute the “do something” directive.
This article presents a practical, perhaps even tactical, framework around which organizations can start the journey down the AI path.
Executive leadership and the investment community are jumping on the AI bandwagon, and comments from leadership are at times truly aspirational. According to the hype, AI is going to solve organizational problems and improve operations across the board, from product development and marketing to supply chain optimization, analytics, and decision-making. This may indeed be the case in the long run. In the short run, however, the challenges are not dissimilar to the typical data management challenges of the pre-AI era.
At its heart, AI is data-driven automation; in one form or another, it automates decision-making that was previously human-led. A central premise of AI is that it can “see” and consume data from a wide variety of sources, both internal and external to the organization. This requires a wide scope of integration across business processes, data, and applications, which is often constrained by underlying platform and resource capabilities.
The challenge is in determining what “good” automation looks like.
From a tooling perspective, the AI world is moving fast. Before the introduction of the readily available large language models (LLMs) that underpin the newer notion of AI, the biggest hurdle for many organizations was acquiring or creating the underlying domain knowledge model and related logic needed to operationalize artificial intelligence. It is now possible to acquire an LLM and rapidly deploy it in a scalable environment supported by a name-brand platform and toolset.
The challenge is to ensure that AI is driving the right business outcomes.
The adoption of AI requires organizations to address the underlying capabilities that ensure the AI machine is being fed the “right” data and is producing accurate output. Broadly speaking, organizations with more advanced management and governance maturity will have an easier time implementing, sustaining, and governing AI than those with less mature practices.
The challenge is to ensure that what AI is doing is governed.
Two broad categories of activities start the process of implementing AI and addressing the above challenges. The first is implementing the mechanics and related plumbing of AI; and the second is addressing the organizational “enablers” that ensure AI operates effectively, can scale, and is organizationally supported in a “steady state” mode. These two activities need to evolve concurrently, and are presented in Figure 1.
Core AI capabilities: Core capabilities will vary by organization and use case. As a baseline, however, this refers to the capabilities required to execute large language models. Simplistically, one needs a subscription to ChatGPT or another LLM service, and the programming expertise to code against the LLM’s application programming interface (API). The goal is to rapidly establish initial AI capabilities using readily available LLMs before tackling the more complex and considerable work of building custom, company-specific LLMs.
Domain level control: Control is created by organizing the information relevant to the domain into a knowledge model against which rules can be authored. Rules ensure alignment to business objectives, mitigate risks, govern compliance, and maintain the quality and integrity of inputs, outputs, and algorithms.
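Authoring rules against a knowledge model can be sketched very simply. The example below is a hypothetical illustration, not a production rule engine: the rule names, checks, and thresholds are assumptions standing in for an organization’s real domain knowledge.

```python
# Sketch of domain-level control: rules authored against the domain
# are applied to every AI output before it is released downstream.
# Rule names and checks here are hypothetical placeholders.

RULES = [
    # Outputs must not leak sensitive identifiers.
    ("no_pii", lambda out: "ssn" not in out.lower()),
    # Outputs must cite a source, supporting quality and integrity.
    ("cites_source", lambda out: "source:" in out.lower()),
    # Outputs must stay within a manageable length.
    ("length_ok", lambda out: len(out) <= 2000),
]

def evaluate(output):
    """Return the names of rules the output violates (empty = pass)."""
    return [name for name, check in RULES if not check(output)]

print(evaluate("Revenue grew 4%. Source: Q3 filing."))  # -> []
```

The same pattern extends to checks on inputs and on algorithm behavior; the point is that each rule is explicit, reviewable, and tied back to a business objective or compliance requirement.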
Operational approach: The operating approach, also referred to as an operating model, identifies how AI is executed within the organization: who is involved, roles and responsibilities, and how problems are identified and solved. The operating model also identifies the organizational components and structures that support these activities.
Management Framework: The management framework represents a view of the information managers need to effectively answer questions related to the AI initiative. Figure 2 provides a view of the major elements of the Framework concept.
At the initial stage, the critical activity is linking business processes with business objectives and aligning AI use cases so that progress can be measured against business metrics.
Given the rapidly evolving state of AI, it is important to start the journey with an approach guided by some core principles:
This is a journey – We do not know enough now to understand how the journey will go, only that each step must take us closer to our goals.
Value-focused – Activities are grounded in use cases that have an established business value. Collectively these can represent an overarching “North Star” view of the future state.
Iterative and gated – The build-out is iterative with each iteration providing an opportunity to assess, redirect, realign, and re-budget progress towards the North Star end state.
Employ a “stepping-stone” approach – Each iteration must build capabilities that, at least in part, support the North Star concept and can be reused in subsequent iterations.
Aligned to current state capabilities – Iterations are “right-sized” to reflect where the organization is with respect to maturity.
The general approach is shown below.
In our principles, we talk about “right-sizing” the elements of Figure 3 to current state capabilities. Many companies find that creating an initial AI roadmap integrated into the overall business strategy and aligned with the IT and data strategies is a bridge too far. While the breadth of the initiative can feel overwhelming, alignment with IT and data must remain a top priority: infrastructure and data quality will ultimately determine the quality of outputs and the business adoption of the North Star initiative.
One way to build support is to create an AI/governance council, led by business or operations leaders, that defines proofs of concept (POCs) and provides thought leadership to the initiative. The council also oversees the integration of AI into the existing process workflows closest to revenue generators. Scope the initial roadmap to something that tells a compelling North Star business story, building support through simplified communication.
The “North Star” approach does not have to be complicated. It can be as simple as a set of objectives, and a “placemat” visualization that details the desired operational and often the architectural concepts.
For the data leader, the Phase 0 activity provides a way to communicate how AI capabilities will be built out. Most importantly, it provides a clear connection between strategic objectives and measurable business outcomes. An output of Phase 0 is the North Star roadmap that identifies capability initiatives and how those can be implemented iteratively.
The same “right size” thinking can be applied to the Readiness Assessment. There are many best practice frameworks for assessing capabilities and organizational readiness. A commonly used foundational methodology is the Enterprise Data Management Council’s Data Capability Assessment Model (DCAM).
Phase 1 starts the journey of implementing one or more of those initiatives. A notional process is shown below in Figure 4. A critical element of the approach is the notion of the gates.
Gate 1 in Figure 4 provides an opportunity for leadership review. This provides the team with the opportunity to socialize the progress, the challenges, and any changes required to the roadmap. Additionally, this gate ensures activities remain aligned with business goals.
Gate 2 is an opportunity for the delivery team to review the activities within that iteration: What worked well? What needs changing? Have our priorities changed?
The focus of this framework is twofold:
1. Building knowledge and getting started in AI using readily available products.
2. Ensuring that activities are aligned to address business needs, are governed, and can be measured against business outcomes and metrics.
This approach is anchored in the core principles presented above. Most importantly, the adoption of game-changing technologies is a journey and requires an iterative, flexible approach that keeps activities focused on driving business value. This is captured in the management framework and operating model, both of which evolve over time.
To get the ball rolling, the North Star concept is the most important initial activity. Building and communicating a good North Star story drives consensus amongst leadership on the overarching objectives, impact, and operational concepts.
The North Star tells the story that addresses the C-suite’s strategic “wants” and ties them to the less visible, workstream-level operational enablers. This controlled approach ensures smooth scaling and interoperability across all pending adoption initiatives.
About the Author:
Jonathan Adams is the Founding Partner of Data Reach, LLC, and has over 25 years of experience in the data and analytics world supporting organizations in establishing data management and governance operations based on industry best practices. He is a contributing author and certified in both the ISACA/CMMI and EDMC best practice frameworks.
Adams is an adjunct professor at the University of Maryland School of Information Sciences, where he has taught graduate courses in Data Governance and Data Quality, and in Data Preparation for Analytics.