Provides practices that enable an organization to develop a meaningful measurement capability to support milestone tracking and monitoring of the effectiveness of data management activities.
An organization instituting a data management program or enhancing existing data management processes needs to determine how progress and improvements will be measured. Because developing and implementing new processes changes how work is currently done, the organization can expect that business representatives, information technology staff, managers, and executives will want to know what is being achieved in return for the effort and expense incurred.
Meaningful measures and metrics, once defined and approved by relevant stakeholders through governance, are an effective communication and monitoring mechanism. Capabilities for developing measures and metrics include the ability to:
For the purposes of data management, the terms "measure" and "metric" are easily confused because they are often not used precisely. In the context of this document, they are defined as follows:
Over the past few decades, data management as a set of organized disciplines has lagged behind software development and engineering in overall capabilities. Even organizations that have sensible measures and metrics in place for IT, such as number of defects, defect rates, delivery times for lifecycle phases, and error classifications (e.g., severe, moderate, minor), do not typically build measures and metrics into data management products and activities as they are established. This is unfortunate, because metrics serve a number of important purposes:
Establishing metrics for data management processes (e.g. number of data stores profiled, number of defects, number of unmatched patient records, matching rates over time, etc.) and products (e.g. number of data elements for which metadata is complete, number of development projects following data standards, number of noncompliance occurrences, etc.) reveals issues to be resolved, provides objective feedback to stakeholders, and maintains alignment of data management to business objectives. It is recommended to start with one key data management process area or one well-defined subset of data, to allow stakeholders to learn the measurement definition process. Involved staff will quickly gain in skill at developing useful measures and metrics once they have engaged in the steps of the process together.
Measurement and analysis provide valuable information about the performance of data management efforts, through a sequenced approach comprising the following steps.
The integration of measurement and analysis activities into data management processes enables the following capabilities:
It is difficult to overstate the benefits that the organization can realize by intentionally evolving its business processes to embed the definition and use of metrics. Benefits include:
Objectives establish the context for understanding what a metric is measuring, and why it matters. The organization should agree on the factors that will drive patient demographic data quality, set objectives for them, measure key activities, and then monitor them for progress.
The factors that impact patient demographic data quality span all data management processes. However, the relative importance of each factor will depend on the organization's complexity, culture, and current capabilities.
An important first step is to establish a baseline measure of the quality of critical patient demographic data attributes across the lifecycle (See Data Quality Assessment). Then, as decisions and investments are made to support the data management infrastructure, positive impacts on the goal can be monitored, and efforts can be adjusted with confidence.
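A baseline of this kind typically reduces to simple per-attribute rates. The sketch below is a minimal, hypothetical illustration of a completeness baseline; the record set, field names, and the choice of completeness as the measured dimension are illustrative assumptions, not part of the framework.

```python
# Hypothetical sketch: baseline completeness of critical patient
# demographic attributes. Records and field names are illustrative.
records = [
    {"first_name": "Ana", "last_name": "Lopez", "dob": "1984-03-02", "phone": ""},
    {"first_name": "Ben", "last_name": "", "dob": "1990-11-17", "phone": "555-0101"},
    {"first_name": "Cara", "last_name": "Ng", "dob": "", "phone": "555-0102"},
]

critical_attributes = ["first_name", "last_name", "dob", "phone"]

def completeness_baseline(records, attributes):
    """Return the share of non-empty values for each attribute."""
    total = len(records)
    return {
        attr: sum(1 for r in records if r.get(attr, "").strip()) / total
        for attr in attributes
    }

baseline = completeness_baseline(records, critical_attributes)
for attr, rate in baseline.items():
    print(f"{attr}: {rate:.0%}")
```

Once such a baseline is recorded, the same computation can be re-run after each improvement effort so that changes in the rates can be attributed and monitored over time.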
| Process Area | Goal | Core Question | Objective | Metric |
|---|---|---|---|---|
| Business Glossary | A compliance and enforcement process ensures consistent application of business terms as new data requirements and projects arise. | Are business terms referenced as the first step in the design of application data stores and repositories? | Business data requirements definition uses terms in the business glossary. | Business Glossary Utilization Rate: Entries Mapped to Data Requirements / Total # of Data Requirements |
| Data Quality Assessment | Establish and sustain a business-driven function to evaluate and improve the quality of data assets. | Are the business, technical, and cost impacts of data quality issues analyzed and used as input to data quality improvement priorities? | Ensure top-priority data quality improvements are consistently addressed. | Resolution Rate of High Priority DQ Improvements: # High Priority Improvements Resolved / # High Priority Improvements Addressed |
| Governance Management | Data governance ensures that all relevant stakeholders are included, and that roles, responsibilities, and authorities are clearly defined and established. | How does the organization define roles and responsibilities and ensure that all relevant stakeholders are involved? | All relevant stakeholders have clear data governance roles and responsibilities. | Assignment Rate of DG Roles and Responsibilities: # of Assigned Roles / Total # of Roles Defined |
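Each metric in the table has the same simple form: a count of conforming items divided by a total. The sketch below computes the three example rates; the counts are hypothetical placeholders, as real values would come from governance reporting.

```python
def ratio_metric(numerator, denominator):
    """Compute a rate metric, guarding against an empty denominator."""
    return numerator / denominator if denominator else 0.0

# Illustrative counts only; actual values come from governance reporting.
glossary_utilization = ratio_metric(42, 60)  # entries mapped / total data requirements
dq_resolution_rate = ratio_metric(9, 12)     # high-priority improvements resolved / addressed
dg_assignment_rate = ratio_metric(18, 20)    # roles assigned / total roles defined

print(f"Business Glossary Utilization Rate: {glossary_utilization:.0%}")
print(f"Resolution Rate of High Priority DQ Improvements: {dq_resolution_rate:.0%}")
print(f"Assignment Rate of DG Roles and Responsibilities: {dg_assignment_rate:.0%}")
```

Keeping every metric in this numerator/denominator form makes the definitions easy to review through governance and easy to compare across reporting periods.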
Example Work Products
Metrics and objectives should reflect the actual changes in patient demographic data quality that impact key stakeholders and business goals. Developing a reliable set of metrics and objectives is an iterative process. Once a baseline of key performance indicators has been established, standards for data collection, storage, and analysis should be clearly specified.
At different points along the patient demographic data lifecycle, duplicate rates on patient records may be measured using different inputs and methods. While this may be appropriate for a specific purpose, it may also reveal that multiple inconsistent, yet individually valid, definitions of “duplicate” exist. Inconsistently defined metrics can be compared in certain types of analysis where the differences in meaning are known; however, inconsistent metrics should never be aggregated. Data quality considerations should be established, used, and maintained for the measurement repository, and should be monitored by data governance.
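Differing definitions of “duplicate” usually come down to differing matching keys. The sketch below shows one way a single, governance-approved definition might be applied consistently; the normalized name-plus-date-of-birth key and the sample records are illustrative assumptions, not a recommended matching rule.

```python
from collections import Counter

# Illustrative records; all values are hypothetical.
records = [
    {"first_name": "Ana", "last_name": "Lopez", "dob": "1984-03-02"},
    {"first_name": "ANA", "last_name": "lopez", "dob": "1984-03-02"},  # same patient after normalization
    {"first_name": "Ben", "last_name": "Okafor", "dob": "1990-11-17"},
]

def matching_key(record):
    """Assumed governance-approved key: normalized name plus date of birth."""
    return (
        record["first_name"].strip().lower(),
        record["last_name"].strip().lower(),
        record["dob"],
    )

def duplicate_rate(records):
    """Share of records beyond the first occurrence of each matching key."""
    counts = Counter(matching_key(r) for r in records)
    extras = sum(count - 1 for count in counts.values())
    return extras / len(records)

print(f"Duplicate rate: {duplicate_rate(records):.0%}")
```

Because the key is explicit in code, two teams running this computation at different lifecycle points will measure the same thing, which is what makes their results safe to compare, and eventually to aggregate.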
Example Work Products
Metrics can serve as a powerful form of communication. They can convey in simple terms complex behavior and serve as the basis for making rational decisions that not only lead to, but quantitatively define, success. While it is important to invest in the development of key metrics for patient demographic data quality, it is equally important to invest in analyzing and interpreting results.
Metrics rarely speak for themselves. Stakeholders need full disclosure on what is being presented. Metrics and objectives need to be precisely defined, so that the rationale for measuring and analyzing is understood. Quantitative thresholds help stakeholders to understand how good is good enough (See Data Quality Assessment). If charts are used, axes should be labeled. Legends should be employed so that color coding, shapes, and other discriminating visual characteristics can be clearly understood.
Analysts should state conclusions succinctly where possible and pose considerations or questions that should be contemplated. If key patterns support the conclusions, they should be visually identified using arrows, captions, boxes, or other methods of drawing attention to what is important.
Example Work Products
1.1 Has the organization developed measurement objectives for quality improvements to patient demographic data?
2.1 Has the organization established a baseline of patient demographic data measurements?
3.1 Does the organization communicate results of measurement, analysis, and interpretation to all relevant stakeholders?