(US & Canada) Kirsten Dalboe, Chief Data Officer at the U.S. Federal Energy Regulatory Commission (FERC), speaks with Adita Karkera, Ph.D., Chief Data Officer for Government and Public Services at Deloitte, in a video interview about cultivating a data culture, building AI policies and governance, and how FERC prioritizes data management.
FERC works to ensure reliable, safe, secure, and economically efficient energy services for consumers at a reasonable cost.
As CDO, Dalboe says that cultivating a data culture starts with building trust. Because her work spans many different offices at FERC, she highlights the importance of a data governance board in that process.
The board helps her connect with the right person in each office so she can build the solid relationships that trust requires. Her first step toward a data-driven culture, and toward proving that trustworthiness, was a reporting modernization project. Dalboe recounts discovering an aging legacy platform that generated PDF reports. The platform held 500 to 700 static reports, many of them near-duplicates built at different times, and they were difficult to sort and filter.
Because this raised serious concerns, the reporting modernization project became key to promoting a data-driven culture, says Dalboe. It showed people that modern tools can filter data quickly and deliver answers faster.
The project also created an opportunity to teach what data governance is and why it matters, Dalboe says. As teams reported on core terms and data elements, consistent definitions of those terms became a must.
Data governance thus entered the picture: common definitions were established, and people learned why governance matters. Along the way, they also got excited about having an engaging new tool to work with.
As a result, Dalboe and her team were able to move to the cloud platform sooner, because people were motivated to learn tools like Power BI, build consistent data models, and adopt self-service capabilities.
Next, Dalboe emphasizes the role of data stewardship in promoting a data-driven culture. The goal is to enable people by establishing a formal data stewardship program.
Elaborating further, she says that there are subject matter experts in specific areas; for instance, at FERC, there is a particular office that specializes in grid reliability. A data stewardship program would enable the office to catalog the necessary data and help in effectively carrying out grid reliability assessments.
Data stewardship is important because it helps people understand that this is a team sport, says Dalboe. She reiterates that her intention is not to take away anyone’s data but to ensure it is used effectively.
Delving further, Dalboe says that her role is to establish broader data governance processes and consistent methodologies around data governance and data management. Apart from those, she also mentions creating a data science training program with monthly lunch-and-learns.
With topics ranging from business intelligence and geospatial analysis to data science, stewardship, and data governance, participation in these sessions also promotes a data-driven culture, says Dalboe. More recently, she has been working on data product governance, believing that the future lies in self-service analytics.
Through data product governance, Dalboe wants to empower people across the program offices to perform their own analytics while following a consistent governance process. That process would confirm that data comes from an authoritative, trusted source and that terms carry organizationally governed definitions, which in turn promotes a data-driven culture.
When asked whether the topic of AI falls under her role, Dalboe affirms that the CIO, CDO, and CISO work in close collaboration to build AI policies and AI governance.
Moving forward, Dalboe discusses how she prioritizes data management by focusing on data operations maturity and migrating analytic workloads while reducing on-premises use. She describes shifting gears from data governance maturity to getting the platform operational.
This includes making sure there are robust processes in place for tracking all data projects while meeting data governance requirements, and ensuring that this work is built into DevOps and product backlog planning.
Furthermore, Dalboe stresses ingesting clean, correct data that is well documented, has metadata, and is well stewarded, which is crucial for maintaining consistency in the platform architecture.
For analytic workload migration and the reduction of on-premises use, FERC is inventorying all on-premises analytic workloads and working with the program offices to understand how each is used before migrating it. The commission is also working with the offices on the change management involved, as some workloads need refactoring to run in the cloud.
In conclusion, Dalboe notes that code has to run more efficiently in the cloud because of the usage charges involved. The payoff, she says, is that code which once took eight to twelve hours now runs in four minutes.
CDO Magazine appreciates Kirsten Dalboe for sharing her insights with our global community.