Sangeeta Edwin, VP Data and Analytics at U.S. Venture, speaks with Mike Woods, VP Sales at Denodo, in a video interview about having the right data foundation before embarking on the AI journey, building infrastructure, the collaborative role of a data and analytics team, running PoCs, and prioritizing foundations, processes, and people.
U.S. Venture is a diversified company that offers products and services for the transportation, renewable energy, and industrial markets.
Edwin sets the tone of the discussion by stating that organizations must have the right data foundation before starting the AI/ML journey. At U.S. Venture, it was the company CEO who instilled the vision to enable the data and analytics function and embed a data- and insights-driven culture, she adds.
As the organizational journey commenced, Edwin recalls that she, the CEO, and senior leadership built the infrastructure, including a cloud-based platform on Microsoft Azure. The organization drew use cases from finance, HR, and AutoForce to understand the need.
Elaborating further, Edwin states that it is not just about bringing data into the cloud but also about knowing how to profile, cleanse, and sustain data quality. She mentions how the CFO helped her get better at building the platform by stating his need for a single version of certified data.
Consequently, Edwin baked in data quality from the beginning, along with governance, certifications, and accountability among people. Further, she mentions that her advanced analytics team comprises four data scientists, data engineers, and analysts who collaborate on building out the data.
Edwin also mentions having a platform engineering and architecture team to build the platform, bring in the data, and deliver the use cases. The advanced analytics team collaborates with each function to understand its strategic roadmap, she notes.
Additionally, the AI and ML team is focused on having a variety of data, internal and external, to make a good AI/ML use case, says Edwin. Figuring out the problem to be solved is the focal point of building a use case, and it must be a difficult problem that requires a variety of data. Then the quality of historical data available to the organization comes into the equation.
Moving forward, Edwin discusses how the CEO holds the data and analytics team accountable for what it delivers. Therefore, the team needs to define use cases to the point of defining their value, and then validate that value with the other functions to measure and show impact.
Taking an instance from the AutoForce function, Edwin shares how the sales team raised a business concern: salespeople were not meeting multiple potential leads on the same route, leads that could bring in more customers. To resolve the issue, the data and analytics team started looking at maps, densifying the routes taken, and the likelihood of finding buyers.
Thereafter, Edwin highlights the creation of an automated lead-generation tool, which brought together sales, geospatial, and external market data, among other sources. By matching those up, the tool surfaced potential leads, and within a year it matured to the point that those leads are now loaded into Salesforce and assigned to salespeople.
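To illustrate the kind of matching such a tool performs, below is a minimal, hypothetical Python sketch. It is not U.S. Venture's implementation; the record names, the 5-mile proximity threshold, and the print-instead-of-Salesforce output are illustrative assumptions. It simply flags external market prospects that sit close to existing route stops and tags them for downstream attribution.

```python
# Hypothetical sketch of a route-based lead matcher (illustrative only;
# not U.S. Venture's actual tool). Assumes simple lat/lon records for
# existing route stops and external market prospects.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Location:
    name: str
    lat: float
    lon: float

def haversine_miles(a: Location, b: Location) -> float:
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (a.lat, a.lon, b.lat, b.lon))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 3958.8 * 2 * asin(sqrt(h))

def leads_near_routes(route_stops, prospects, max_miles=5.0):
    """Return prospects within max_miles of any existing route stop."""
    leads = []
    for p in prospects:
        nearest = min(haversine_miles(p, s) for s in route_stops)
        if nearest <= max_miles:
            leads.append({
                "prospect": p.name,
                "distance_miles": round(nearest, 1),
                # Tag mirrors the attribution described in the interview.
                "source": "D&A Lead Directory Project",
            })
    return leads

if __name__ == "__main__":
    stops = [Location("Existing customer A", 44.51, -88.01),
             Location("Existing customer B", 44.26, -88.41)]
    market = [Location("Prospect dealership X", 44.50, -88.05),
              Location("Prospect dealership Y", 43.04, -87.91)]
    for lead in leads_near_routes(stops, market):
        print(lead)  # in practice, qualified leads would be pushed to Salesforce
```

In a production pipeline, the proximity rule would presumably be one of several signals blended with sales history and external market data before leads are loaded into Salesforce.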
Delving further, Edwin states that the generated leads carry a tag confirming they were generated by the D&A Lead Directory Project. She continues that whenever revenue gets tracked, the team gets credited with generating millions in revenue for the organization.
Moving forward, Edwin explains how the D&A function has a federated setup wherein her enterprise team sits in the center. The enterprise team comprises platform engineering, architecture, data management, governance, advanced analytics, and project management.
While the enterprise team is mature, Edwin conveys that she has also enabled data and analytics for the finance team, which includes a business analyst, a product owner, and other analysts. The finance D&A team also has data owners and stewards, both key roles.
Furthermore, smaller D&A teams are growing on the business and HR sides, says Edwin. Speaking of the AutoForce division, she observes that it is also staffing up.
While doing program increment planning in HR, all the teams are brought together to understand what they are working on and where the dependencies lie, says Edwin. This helps in building structure while maintaining data quality. However, when it comes to use cases involving new data and its feasibility, Edwin mentions setting up a proof of concept first.
The team brings the data in and commits resources to a proof of concept to try it out, says Edwin. She explains that her stack includes Azure, Databricks, Power BI, and Data IQ. A proof of concept runs from two weeks to three months, as one program increment is a three-month cycle, and it becomes a project only if it meets the required outcomes.
As a takeaway, Edwin emphasizes implementing data management and governance before implementing the next-gen application. To get its feet wet with generative AI, an organization must have the metadata defined and good-quality data connected to the system, enabling it to deliver the right answers.
While it is necessary to be agile in delivering solutions, there have been instances, says Edwin, where such solutions could not scale. Therefore, it is imperative to have the right foundation, processes, and people.
In conclusion, Edwin opines that while accountability for processes and people lies with the central team, the organization needs business experts to define and implement rules and make things happen. In the world of generative AI, sustaining data quality and metadata lineage will add massive value, she says.
CDO Magazine appreciates Sangeeta Edwin for sharing her data and business insights with our global community.