Change Management Sustains Data Quality Improvement Initiatives
Making data quality improvement a “top of mind” action issue throughout an organization doesn’t happen by accident. It doesn’t happen because you buy a tool suite. It doesn’t happen because the IT organization gets tired of explaining that they don’t own the organization’s data and don’t have the budget and staff to keep cleaning it up.
Organization-wide initiatives get attention and action when top management takes ownership and everyone in the organization takes note. How do the many take their cue from the few? Data quality improvement is process improvement, and process improvement depends on change management. Recently, Cal Braunstein of the Robert Frances Group and I interviewed two Poor Data Quality – Negative Business Outcomes survey participants who are having huge and sustainable success in changing the hearts and minds of business-side colleagues about data quality. What follows is a quick synopsis of what we learned.
Our interviewees rated their organization as about average in data quality, when compared to their industry peers. At the outset of their data quality improvement initiative, some of their business-functional units understood they had data quality issues, while others were in denial. Considering the IBM InfoGov Community Data Governance Maturity Model, these folks told us they started their data quality initiative in the first stage of maturity, “Initial”, and are now driving forward to be fully compliant with the second stage, “Defined”, in 2014. They believe they will achieve some elements of the third maturity stage, “Repeatable”.
Synopsis: Our interviewees are industry veterans experienced in data governance, data quality, and project management. Their initial-state self-report seems entirely credible.
Top management became interested in data quality improvement because the success of several organization-wide projects would depend on having nearly perfect data. This interest catalyzed an aggressive drive toward data quality improvement. This is an organization with operations in a dozen countries around the world. That said, they were also motivated because they had great difficulty rolling up data from subsidiaries to the corporate level; there were simply too many inconsistencies to combine information across unit and country lines. The final issue was comprehensive multi-national privacy compliance, visible on the horizon. No one here wants to run afoul of an army of regulators.
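To make the roll-up problem concrete, here is a minimal sketch of the kind of inconsistency that defeats corporate consolidation. The subsidiary names, customer names, and figures below are invented for illustration, not the interviewees' actual data.

```python
# Hypothetical subsidiary reports: the same customer recorded under three
# different names, with revenue in three different currencies.
subsidiary_reports = [
    {"unit": "DE", "customer": "ACME GmbH", "revenue": 120_000, "currency": "EUR"},
    {"unit": "US", "customer": "Acme Inc.", "revenue": 95_000,  "currency": "USD"},
    {"unit": "UK", "customer": "ACME Ltd",  "revenue": 40_000,  "currency": "GBP"},
]

# A naive roll-up keyed on the raw customer name treats these as three
# unrelated customers -- and any grand total would mix EUR, USD, and GBP.
naive_rollup = {}
for row in subsidiary_reports:
    naive_rollup[row["customer"]] = naive_rollup.get(row["customer"], 0) + row["revenue"]

print(len(naive_rollup))  # → 3 "customers" instead of one
```

Fixing this is exactly the stewardship work described below: agreeing on master identifiers and reference data before the numbers are combined.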
IT put together a formal business case with business-related benefits. They were conservative about projected benefits, eliminating soft benefits and difficult-to-prove elements like "greater agility." The business case did include all projected expenses. Payback period, using a discounted cash flow model, met the corporate threshold for project approval. With this nuts-and-bolts approach, the result was "Go", and IT received the funding it requested. In earlier posts I have underscored the tremendous payback potential for improving data quality. Why more organizations don't undertake this kind of analysis remains a mystery to me.
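For readers who want to run this kind of analysis themselves, here is a minimal sketch of a discounted-payback calculation. The upfront cost, benefit stream, and 10% discount rate are illustrative assumptions, not figures from the interviewees' business case.

```python
def discounted_payback_years(initial_cost, annual_benefits, discount_rate):
    """Return the first year in which cumulative discounted benefits
    recover the initial cost, or None if they never do."""
    cumulative = -initial_cost
    for year, benefit in enumerate(annual_benefits, start=1):
        # Discount each year's benefit back to present value.
        cumulative += benefit / (1 + discount_rate) ** year
        if cumulative >= 0:
            return year
    return None

# Example: $500k upfront, $200k/year in conservative hard benefits, 10% rate.
print(discounted_payback_years(500_000, [200_000] * 5, 0.10))  # → 4
```

The discipline here mirrors the interviewees' approach: count only hard benefits, count all expenses, and let the payback period speak for itself against the corporate approval threshold.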
Data quality should be critical for everyone in the organization. It should be an intrinsic part of the business side's DNA. The business knows it must formally "own" its data. While the stewardship model is developing, there is discussion about advanced domain-level stewardship. Domain-level stewardship means having a steward for all customer data or all financial data, wherever sourced and used. Domain-level stewardship means comprehensive silo-breaking.
As one top executive put it, "Who would have thought master data management was such fun?" But things didn't start out that way. An initiative of this scope is a true cultural shift, requiring participant education and top-down support. Change management remains a large part of this project. Things began by getting business and IT people in a room together, sometimes for the first time. Key challenges included avoiding the "blame game" and keeping the atmosphere positive.
Why did this initiative succeed? "Having the right people at the right time", as one of our interviewees put it. The technical leads are seasoned data management and project management experts, capable of driving the project forward while anticipating the challenges that have arisen and will continue to arise. The level of executive sponsorship was and remains superb, and the steering committee that represents the business remains engaged and involved.
One thing this project didn't have was an incentive plan in place. People participated because executive leaders showed the way, experts managed the project hands-on, and business and IT got the results that were promised. Don't worry, this year there will be an incentive plan to make sure that everyone participates in the rewards that success brings.
The data quality improvement initiative has already spread beyond its initial scope in terms of functional area improvement. Two support organizations have asked to be part of the initiative. In my experience, the request for inclusion is uncommon, unless things are going swimmingly and other units don’t want to be left behind. In a world of silos, getting business units to sign on to a silo-breaking quality improvement effort is dicey at best. However, nothing succeeds like success.
The Bottom Line
I hope I’ve given you a taste of what it takes for a data quality improvement effort to succeed. Have you been involved in a data quality improvement initiative? Did it run like this one, or was there another outcome? Either way, Cal Braunstein and I would appreciate the opportunity to talk to you and learn about what worked and what didn’t. Please do contact me to arrange an interview. Your participation would be much appreciated.