

Achieving Scalable, Agile, and Comprehensive Data Management and Governance (Part 2 of 3)

Danny Sandwell, senior solutions manager with Quest, and Conor Jensen, field CDO with Dataiku, discuss the new TDWI Best Practices Report on achieving scalable, agile, and comprehensive data management and data governance.

In this recent “Speaking of Data” podcast, Danny Sandwell, senior solutions manager with Quest, and Conor Jensen, field CDO with Dataiku, discussed the latest developments in data management and governance. This is the second part of a discussion of TDWI’s Best Practices Report on data management and governance. [Editor’s note: Speaker quotations have been edited for length and clarity.]

For Further Reading:

Achieving Scalable, Agile, and Comprehensive Data Management and Governance (Part 1 of 3)

Achieving Scalable, Agile, and Comprehensive Data Management and Governance (Part 3 of 3)

3 Data Management Rules to Live By

Sandwell and Jensen began by discussing the biggest challenges they see facing their customers. “One of the big problems we see,” Jensen said, “is the underlying belief that all data is of equal value and that before anyone can use it, it has to be perfect.” Not everything requires that level of rigor, though, he added. “If I’m off by 5 T-shirts in my inventory, no one is coming for my head, as opposed to if I’m off by $5 million in my accounting.

“There’s also the issue of legacy systems,” he continued. “Some of these systems have been in place for 40 years or more and have so much process tied into them you can’t just unhook them. However, replacing them can run into the tens of millions of dollars, so organizations really have to look at the ROI of such an investment.”

One consequence of not modernizing that organizations often underestimate is the personnel risk. “The average COBOL programmer is more than 70 years old and can charge as much as $1,000 an hour because they’re so scarce,” Jensen said.

“Another issue we see,” Sandwell added, “is companies’ lack of visibility into their actual data landscape. Solving this begins with getting people to understand exactly what they have and setting up governance to mitigate risks and deliver value from their data faster. From there, the next step is operationalizing this to make it real in people’s working lives.” This involves increasing the data literacy of everyone involved, Sandwell said.

“We’re still seeing many customers whose first task is to establish a data-first culture within their organizations,” Sandwell explained. “Many people have a vested interest in how they currently do business, so bringing them on board can take time and effort.” However, once you build that culture, you must continue to sell it to your organization because that’s how you sustain success.

“One of the biggest difficulties with traditional data warehousing was that once data was loaded into the warehouse, it would then have to be contorted in all kinds of ways to try to answer the questions,” Sandwell said. “Fortunately, we now have the capacity to collect and store lots of raw data and transform it for purpose.” Many customers, he said, are interested in building an internal data marketplace that’s less about monetizing data than it is about directing people to the right data so they’re not asking for things that already exist.

“When dealing with big data early on,” Jensen added, “companies were interested in collecting data from as many external sources as possible. It turns out, though, that the most important data was their own proprietary data -- the data that couldn’t be obtained anywhere else.” Jensen went on to say that none of the exciting new things being talked about -- such as machine learning and generative AI -- are possible without investing in the data to drive them. That means ensuring trust, transparency, and explainability in your models, as well as modernizing your governance to cover the new requirements that technologies such as LLMs bring with them.

“One exciting new opportunity is looking at how to start using AI in your governance process,” Jensen offered, “or finding ways it can be used to streamline time-consuming processes such as data prep for analytics.” A company-wide chatbot that can answer any question against the company’s whole data collection is still a way off, he cautioned, but many applications that improve data governance and other tasks have yet to be uncovered.

“With these innovations in AI/ML,” Sandwell said, “it’s more important than ever to make sure your data is of the highest quality, because the consequences of bad data can quickly spiral out of control before you even know it.” He expects to see some advances in making AI models more transparent but thinks that these innovations will ultimately push the limits of democratization as more people in the organization begin to take advantage of new tools.

“If you look at all of the roads to modernization right now,” he continued, “at the center of every one of them is data governance and data intelligence, whether it's a data fabric, a data mesh, or the lakehouse.”

[Editor’s note: This is the second part of a discussion about data management and governance. In part 1, David Stodder, senior research director for business intelligence with TDWI, discusses his newest Best Practices Report on achieving the best in data management and governance. You can also download a free copy of the TDWI Best Practices Report.]
