"An extraordinary thinker and strategist" "Great knowledge and a wealth of experience" "Informative and entertaining as always" "Captivating!" "Very relevant information" "10 out of 7 actually!" "In my over 20 years in the Analytics and Information Management space I believe Alan is the best and most complete practitioner I have worked with" "Surprisingly entertaining..." "Extremely eloquent, knowledgeable and great at joining the topics and themes between presentations" "Informative, dynamic and engaging" "I'd work with Alan even if I didn't enjoy it so much." "The quintessential information and data management practitioner – passionate, evangelistic, experienced, intelligent, and knowledgeable" "The best knowledgeable, enthusiastic and committed problem solver I have ever worked with" "His passion and depth of knowledge in Information Management Strategy and Governance is infectious" "Feed him your most critical strategic challenges. They are his breakfast." "A rare gem - a pleasure to work with."

Wednesday 17 July 2013

What's cooking? Data quality recipes are few and far between


In his recent post "The Data Quality Cookbook", Gary Alleman offers a useful kitchen analogy, likening data quality to the ingredients of a meal. The better the ingredients, the better you eat.

However, I'm not sure there are many "recipes" or "menus" for good data quality, at least not yet. 

To go with Gary's analogy, there are certainly raw ingredients (the data), gadgets (the software tools), cooking techniques (methodologies, processes and guidelines) and cooks/chefs (the people performing the data governance or data quality function).

But by and large, the recipes - the specific list of ingredients, the techniques to be applied and the step-by-step instructions you need to follow to create your meal, all of which vary depending on what you're choosing to cook - don't exist. You could make an omelette, fried eggs or scrambled eggs from the same three ingredients: an egg, a splash of water and a little oil. (And even with a nice fresh egg, I could still botch all three meals...) Likewise, you could produce a very good or a very bad set of forecasts for future business product sales based on the same underlying set of sales history data.

And so it is with data quality - the "quality" of the data is entirely dependent upon the specific context of its use. So what do you want to cook today?
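
To make that concrete, here's a minimal sketch (in Python, with made-up field names and fitness rules) of the same idea: identical sales records pass the quality check for one purpose and fail it for another.

    # The same data, tested against two different "fitness for purpose" rules.
    # Field names and rules below are illustrative only.
    sales_history = [
        {"product": "widget", "month": "2013-05", "units": 120, "region": None},
        {"product": "widget", "month": "2013-06", "units": 95, "region": None},
    ]

    def fit_for_total_forecast(rows):
        # A company-wide forecast only needs product, month and unit counts.
        return all(r["product"] and r["month"] and r["units"] is not None
                   for r in rows)

    def fit_for_regional_forecast(rows):
        # A region-level forecast also needs a populated region on every row.
        return all(r["region"] is not None for r in rows)

    print(fit_for_total_forecast(sales_history))     # True: good ingredients for this dish
    print(fit_for_regional_forecast(sales_history))  # False: same ingredients, wrong dish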

I personally don't know of many step-by-step approaches that set out to deliver a specific, successful business outcome. (Rob Mattison's "Telco Revenue Assurance Handbook" is one such in the Telco space: http://www.amazon.com/The-Telco-Revenue-Assurance-Handbook/dp/1411628012 , while Sunil Soares offers some high-level expectations by industry problem in "Selling Information Governance to the Business": http://www.amazon.com/Selling-Information-Governance-Business-Practices/dp/1583473688/ref=sr_1_2?s=books&ie=UTF8&qid=1374116844&sr=1-2 )

On the whole, though, each data cook/chef is currently cooking a bespoke meal each time, based on their knowledge and experience of how best to use the techniques and tools at their disposal to treat the ingredients (and with the palate of the diner in mind, if you're really lucky). Good luck if you're cooking your data over an open fire with a rusty frying pan...

Without some real and detailed recipes, the best we can expect is that the data ingredients on offer aren't too stale or mouldy, that the tools are reasonably new and functional, and that the chef really does know his way around the kitchen!

2 comments:

  1. Hi Alan

    Nice post. The good news is that every enterprise can actually have a unique set of ingredients and a complete set of instructions for its data quality, right here, right now, by following these seven simple steps:

    1. Sit down and interview the C-level executives. Recorded, structured interviews are essential.
    2. Convert all structured interviews to transcripts.
    3. Extract all verb phrases from transcripts and convert to business functions.
    4. Build Business Function Model (BFM) from these extracted functions.
    5. Extract all noun phrases from transcripts and convert to data entities, attributes and associations.
    6. Build the Logical Data Model (LDM) from the extracted entities and associations.
    7. Build a CRUD matrix that connects each entity and attribute to those business functions that create, read, update or delete them.

    These seven simple steps will work for any enterprise of any size in any sector and will, if properly followed, immediately provide a future-state function and data architecture for the enterprise (a rough sketch of how steps 3 to 7 might be mechanised follows below).
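
    For illustration only, here is a rough sketch of steps 3 to 7 in Python with spaCy; the transcript text, the CRUD assignments and all names are placeholders, not a real enterprise model:

        # Rough sketch: verbs -> candidate business functions, noun phrases ->
        # candidate data entities, then a CRUD matrix tying them together.
        # Assumes spaCy and its small English model (en_core_web_sm) are installed.
        import spacy

        nlp = spacy.load("en_core_web_sm")

        transcript = ("We approve customer orders, ship products to customers, "
                      "and invoice the customer for each shipped order.")
        doc = nlp(transcript)

        # Step 3: verb phrases -> candidate business functions
        functions = sorted({t.lemma_ for t in doc if t.pos_ == "VERB"})

        # Step 5: noun phrases -> candidate data entities
        entities = sorted({chunk.root.lemma_ for chunk in doc.noun_chunks})

        # Step 7: CRUD matrix - which of C/R/U/D applies is a judgement call
        # made with the business; these assignments are placeholders.
        crud = {
            ("approve", "order"): "U",
            ("ship", "product"): "R",
            ("invoice", "customer"): "C",
        }

        print("Functions:", functions)
        print("Entities:", entities)
        for (function, entity), action in crud.items():
            print(function, action, entity)

    (Building the full BFM and LDM from these candidate lists is, of course, where the real modelling work lies.)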

    It works every time.

    Kind regards
    John

  2. Thanks John - I fully agree with everything you say, and the overall approach and method you suggest is bang on the money, in my view.

    I also proposed some specific facilitation methods for the interview/workshop stage back in May: http://www.informationaction.blogspot.com.au/2014/05/the-one-question-that-you-must-never-ask.html
