
Data Denial and Business Intelligence – How to Achieve Data Quality

The greatest battle you may face inside the organization will be to get management to the point where they agree that data quality is a goal even worth considering.

Everybody talks about data, but many confuse it with information and knowledge. Data is a core corporate asset that must be synthesized into information before it can serve as the basis for knowledge within the organization. It is also ubiquitous – it supports every aspect of the business and is an integral component of every key business process. However, incorrect data cannot generate useful information, and knowledge built on invalid information can lead organizations into catastrophic situations. The resulting information and knowledge, in other words, are only as good as the underlying data – and this is where many organizations run into trouble.

Many organizations neither recognize nor accept the poor quality of their data, and instead try to divert attention to supposed faults within their systems or processes. For these organizations, data denial has practically become an art form, in which particularly daunting corporate barriers have been built – typically over long periods of time – to avoid the call to embark on any “real” Data Quality improvement initiatives.

However, we have found that the best way to measure the extent to which your organization may be dealing with data denial is to ask the following key questions:

  • Are you aware of any Data Quality issues within your company?
  • Are there existing processes that are not working as originally designed?
  • Are people circumventing the system in order to get their work completed?
  • Have you ever been forced to deny a business request for information due to an issue of Data Quality?
  • If the system were functioning properly, would this information have been readily available?
  • Has a business case been made outlining the economic impact of this issue? And, if so, has it ever been addressed with the organization’s leadership?
  • What was the response to these issues? And if there was no response, what is stifling this process?
  • What causes these “gaps” in Data Quality?
  • How are these issues affecting the responsiveness of your organization (i.e., to customers, stockholders, employees, etc.)?
  • If these issues were to be addressed and corrected, what strategic value would be added or enhanced?
  • Who bears the responsibility for addressing these issues within your organization?
  • What can be done to address these issues in the future?
  • What support is needed to implement a Data Quality strategy?


Depending on the answers to these questions, your organization may already be facing significant barriers to attaining Data Quality, each of which will need to be identified, assessed, prioritized and corrected. According to William K. Pollock, president of the Westtown, PA-based services consulting firm Strategies For Growth℠, “Most companies already know what data they do not have – and for them, this is a significant problem. However, the same companies are probably not aware that some of the data they do have may be faulty, incomplete or inaccurate – and if they use this faulty data to make important business decisions, that becomes an even bigger problem”.

Common Problems with Corporate Data

Research has shown that the amount of data and information acquired by companies has nearly tripled in the past four years, while an estimated 10 to 30 percent of it can be categorized as being of “poor quality” (i.e., inaccurate, inconsistent, poorly formatted, entered incorrectly, etc.). The common problems with corporate data are many, but they typically fall into the following five major areas:

  • Data Definition – typically manifesting itself through inconsistent definitions within a company’s corporate infrastructure.
  • Initial Data Entry – caused by incorrect values entered by employees (or vendors) into the corporate database; typos and/or intentional errors; poor training and/or monitoring of data input; poor data input templates; poor (or nonexistent) edits/proofs of data values; etc. (see the validation sketch after this list).
  • Decay – causing the data to become inaccurate over time (e.g., customer address, telephone, contact info; asset values; sales/purchase volumes; etc.).
  • Data Movement – caused by poor extract, transform and load (ETL) processes that produce data warehouses containing even more inaccurate information than the original legacy sources, or that exclude data mistakenly identified as inaccurate; by an inability to mine data in the source structure; or by poor transformation of the data.
  • Data Use – or the incorrect application of data to specific information objects, such as spreadsheets, queries, reports, portals, etc.
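To make the Initial Data Entry and Decay problems above more concrete, the sketch below shows the kind of automated check that can surface them in a customer table. It is only an illustration – the column names (customer_id, email, last_updated), the e-mail pattern and the one-year staleness threshold are assumptions for this example, not anything prescribed by the article or a particular product.

```python
# Minimal data-quality profiling sketch (assumed schema: customer_id, email, last_updated).
# Flags the entry-error, duplicate and decay problems described in the list above.
import pandas as pd

EMAIL_PATTERN = r"^[^@\s]+@[^@\s]+\.[^@\s]+$"   # simplistic, illustrative e-mail check
STALE_AFTER = pd.Timedelta(days=365)            # assumed decay threshold: one year without update


def profile_customers(df: pd.DataFrame) -> dict:
    """Return simple counts of common data-quality problems in a customer table."""
    emails = df["email"]
    well_formed = emails.str.match(EMAIL_PATTERN, na=False)
    age = pd.Timestamp.now(tz="UTC") - pd.to_datetime(df["last_updated"], utc=True)
    return {
        # Initial Data Entry: missing or malformed values
        "missing_email": int(emails.isna().sum()),
        "malformed_email": int((emails.notna() & ~well_formed).sum()),
        # Duplicate keys often point to poor input edits/proofs
        "duplicate_customer_id": int(df["customer_id"].duplicated().sum()),
        # Decay: records not touched within the assumed threshold
        "stale_records": int((age > STALE_AFTER).sum()),
    }


if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "customer_id": [1, 2, 2, 3],
            "email": ["a@example.com", "not-an-email", None, "b@example.com"],
            "last_updated": ["2024-01-05", "2019-06-30", "2023-11-12", "2018-02-01"],
        }
    )
    print(profile_customers(sample))
```

Running a report like this on a regular schedule – and routing the counts to the people who own the data – is often enough to move the conversation past denial and toward specific, fixable defects.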


Each of these areas represents a potential problem for any business – both in its existence within the organization and in the organization’s ability to even recognize that the problem exists. In any case, these are classic symptoms of “data denial” – one of the most costly economic drains on the well-being of businesses today.

Data Quality Maturity Levels

There are five key status indicators that can be used to measure an organization’s existing level of Data Quality maturity, each with its own set of distinct corporate – and human – attributes. Ultimately, it is the Mature level at which you will want your organization to be positioned.

  1. Embryonic – this level is the least beneficial place to be, as Data Quality does not even appear on the organization’s radar screen; there is extensive finger-pointing with respect to data-associated blame, generally leading to cover-ups and CYAs; and there is no formal Data Quality organization in place. As far as the humans involved in the process are concerned – they are totally “clueless”.
  2. Infancy – this level is not much better, although the organization has begun to look into Data Quality; various ad hoc groups may have been established to search for “answers”; and Data Quality has been positioned as a subset of corporate IT. This typically occurs as the human element begins to show an emerging interest.
  3. Adolescence – this level is one of mixed Data Quality accomplishments where most of the pain points have already been identified and the strategy team has shifted into a crisis-driven “full court press” managed by formal Data Quality teams that are populated and coordinated by both IT and the Business. However, this is also the point where alternating periods of panic and frenzy typically set in.
  4. Young Adult – by the time the organization reaches this level, there begins to be some semblance of an evolving Data Quality structure, where the entire organization is involved; one where both IT and the Business have begun to work as partners toward a common goal. Accordingly, the human attribute has also become much more “stabilizing”.
  5. Mature – once the organization has attained this level, it has finally reached the point where it has implemented an effective Data Quality structure, characterized by collaborative efforts and a Data Quality Center of Excellence (DQCE), as well as the ability to measure and track customer value over time. As such, the organization has been able to attain a “controlled” environment, where all of the personnel involved – on both the supply and demand sides – are comfortable that the desired levels of Data Quality have been achieved (a simple metric-tracking sketch follows this list).
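As a small illustration of the measure-and-track capability described at the Mature level, the hedged sketch below records a simple completeness score over time. The metric, the column names and the table layout are assumptions chosen for the example; a real DQCE would define its own measures (accuracy, timeliness, customer value, etc.).

```python
# Minimal sketch of tracking a data-quality metric over time, as a Mature-level
# organization with a DQCE might. The completeness metric and layout are
# illustrative assumptions, not a prescribed standard.
import pandas as pd


def completeness(df: pd.DataFrame, required_columns: list[str]) -> float:
    """Share of required cells that are populated (1.0 = fully complete)."""
    cells = df[required_columns]
    return float(cells.notna().sum().sum()) / cells.size


def record_snapshot(history: pd.DataFrame, snapshot_date: str, score: float) -> pd.DataFrame:
    """Append one dated score so the trend can be reviewed over time."""
    row = pd.DataFrame({"date": [pd.Timestamp(snapshot_date)], "completeness": [score]})
    return pd.concat([history, row], ignore_index=True)


if __name__ == "__main__":
    customers = pd.DataFrame(
        {"customer_id": [1, 2, 3], "email": ["a@example.com", None, "c@example.com"]}
    )
    history = pd.DataFrame({"date": pd.Series(dtype="datetime64[ns]"),
                            "completeness": pd.Series(dtype="float")})
    history = record_snapshot(history, "2024-07-01",
                              completeness(customers, ["customer_id", "email"]))
    print(history)
```

Reviewing such a trend period over period gives the supply and demand sides a shared, objective view of whether the desired levels of Data Quality are actually being sustained.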

