Patient Demographic Data Quality Framework

Measurement and Analysis

Purpose

Provides practices that enable an organization to develop a meaningful measurement capability to support milestone tracking and monitoring of the effectiveness of data management activities.

Introductory Notes

An organization instituting a data management program or enhancing existing data management processes needs to determine how progress and improvements will be measured. The organization can expect that business representatives, information technology staff, managers, and executives will want to know what is being achieved in return for the effort and expense incurred, since developing and implementing new processes also brings changes to business as usual.

Meaningful measures and metrics, once defined and approved by relevant stakeholders through governance, are an effective communication and monitoring mechanism. Capabilities for developing measures and metrics include abilities to:

  • Define objectives – what should the organization measure and why;
  • Obtain and store – how will measures be captured, where will they be stored, how will they be made available;
  • Analyze and interpret – how the measures and metrics will be used and what the consensus is on their meaning;
  • Evaluate progress and the effectiveness of measured processes – monitoring against objectives, using results to improve the process;
  • Report to relevant audiences – how, and how often, will results be reported to relevant stakeholders and management; and
  • Interpret results and evaluate impacts to identify remediation actions.

In data management, the terms "measure" and "metric" are often confused because they are not used precisely. In the context of this document, they are defined as follows:

  • Measure – a count that is collected and presented as a quantified result used for reporting and monitoring. Measures may initially be applied to create a baseline and then tracked to gauge improvements. Examples of measures include: number of defects, transaction counts, and number of duplicates.
  • Metric – the quantification of one or more measures to derive business meaning. Specifically, a metric defines the calculation method (+, -, *, /, %, etc.) and unit(s) of measure, as well as the meaning and context of the specified measurements. Examples of metrics include: percentage of records with one or more missing data elements, percentage of unmatched records, and rate of matching percentage improvement by month.
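A minimal sketch in Python of this distinction, assuming patient records are represented as dictionaries with hypothetical demographic field names:

    # Hypothetical required demographic elements for illustration only.
    REQUIRED_FIELDS = ["first_name", "last_name", "date_of_birth", "address"]

    def count_incomplete_records(records):
        """Measure: a raw count of records missing one or more data elements."""
        return sum(
            1 for r in records
            if any(not r.get(field) for field in REQUIRED_FIELDS)
        )

    def incomplete_record_rate(records):
        """Metric: the measure quantified against a total to derive meaning,
        here the percentage of records with one or more missing elements."""
        if not records:
            return 0.0
        return 100.0 * count_incomplete_records(records) / len(records)

    records = [
        {"first_name": "Ana", "last_name": "Ruiz", "date_of_birth": "1980-02-14", "address": "12 Oak St"},
        {"first_name": "Ben", "last_name": "", "date_of_birth": "1975-07-01", "address": "9 Elm Ave"},
    ]
    print(count_incomplete_records(records))  # measure: 1
    print(incomplete_record_rate(records))    # metric: 50.0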

Over the past few decades, data management as a set of organized disciplines has lagged in overall capabilities compared to software development and engineering. Even organizations that have a set of sensible measures and metrics in place for IT, such as number of defects, defect rates, delivery times for lifecycle phases, and error classifications (e.g., severe, moderate, minor), do not typically build in measures and metrics for data management products and activities as they are established. This is unfortunate, because metrics serve a number of important purposes:

  • Progress tracking – those involved with performing the process can track achievements against milestones and monitor results to demonstrate how effective their efforts have been relative to the baseline. This information by itself has been proven to improve outcomes and also serves to motivate staff.
  • Management visibility – managers and executives can make better decisions about resources, costs, and priorities based on meaningful metrics.
  • Funding justification – a set of measures and metrics, established according to objectives and maintained and published, increases the likelihood of continuing to receive funding for important work.

Establishing metrics for data management processes (e.g., number of data stores profiled, number of defects, number of unmatched patient records, matching rates over time) and products (e.g., number of data elements for which metadata is complete, number of development projects following data standards, number of noncompliance occurrences) reveals issues to be resolved, provides objective feedback to stakeholders, and maintains alignment of data management with business objectives. It is recommended to start with one key data management process area or one well-defined subset of data, to allow stakeholders to learn the measurement definition process. Involved staff will quickly gain skill in developing useful measures and metrics once they have engaged in the steps of the process together.
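As an illustration of tracking matching rates over time, the following is a minimal sketch assuming monthly matched/total record counts are available from the matching process (all figures hypothetical):

    # month -> (matched records, total records); hypothetical counts.
    monthly_counts = {
        "2024-01": (8_900, 10_000),
        "2024-02": (9_150, 10_200),
        "2024-03": (9_480, 10_100),
    }

    # Metric: matching percentage per month.
    rates = {m: 100.0 * matched / total for m, (matched, total) in monthly_counts.items()}

    # Metric: month-over-month change in the matching percentage.
    months = sorted(rates)
    for prev, curr in zip(months, months[1:]):
        change = rates[curr] - rates[prev]
        print(f"{curr}: {rates[curr]:.1f}% matched ({change:+.1f} points vs {prev})")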

Measurement and analysis provides valuable information about the performance of data management efforts through a sequenced approach that includes the following steps; a specification sketch follows the list.

  • Specifying the objectives for measures and metrics to ensure that they are aligned with identified information needs and objectives.
    • Gain agreement on a ‘vision of the good’ – objectives for the process area, e.g., lower the number of patient records with missing demographic data; and
    • Determine what can be quantitatively demonstrated to gain insight into progress toward the objectives.
  • Specifying measures, analysis techniques, and mechanisms for data collection, storage of measurement data, reporting, and feedback.
    • Determine how the quantitative information can be captured;
    • Decide when the baseline will be conducted;
    • Decide where the measurement data will be stored; and
    • Determine the stakeholders who should be informed.
  • Implementing the analysis techniques and mechanisms for data collection, data reporting, and feedback.
    • Precisely define the measures (i.e., counts) and metrics (i.e., calculations) and decide how they will be interpreted; and
    • Define a reporting standard, channel, and frequency – e.g., governance will receive detailed and summary metrics monthly, executives will receive quarterly summary reports, a dashboard will be available to all interested parties, etc.
  • Providing objective results that can be used in making informed decisions and taking appropriate remediation action.
    • Develop decision criteria for management that assist in determining whether changes to priorities or resources are necessary to achieve the objectives.
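One way to capture the decisions made in these steps is a simple specification record per metric. The following is a minimal sketch, with hypothetical field names and values, not a prescribed format:

    from dataclasses import dataclass

    @dataclass
    class MetricSpecification:
        name: str
        objective: str             # the 'vision of the good' the metric supports
        measure: str               # the underlying count being collected
        calculation: str           # how the measure is quantified into a metric
        data_store: str            # where measurement data will be kept
        reporting: str             # channel and frequency for each audience
        decision_threshold: float  # value that triggers management review

    # Hypothetical example specification.
    missing_demographics = MetricSpecification(
        name="Incomplete Demographic Record Rate",
        objective="Lower the number of patient records with missing demographic data",
        measure="Count of records missing one or more required demographic elements",
        calculation="100 * incomplete records / total records",
        data_store="measurement_repository.metrics_monthly",
        reporting="Detailed monthly report to governance; quarterly summary to executives",
        decision_threshold=5.0,  # escalate if the rate exceeds 5 percent
    )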

The integration of measurement and analysis activities into data management processes enables the following capabilities:

  • Objective validation of facts about improvements against the measures;
  • Alignment of measures to one or more business decisions;
  • Engagement of relevant stakeholders in developing and employing measures and metrics;
  • Increased transparency by making key measures and metrics visible and accessible; and
  • Periodic evaluation of measurement data to ensure that it remains useful.

It is difficult to overstate the benefits that the organization can realize by intentionally evolving its business processes to embed the definition and use of metrics. Benefits include:

  • Focuses and motivates improvement efforts ("what gets measured gets accomplished");
  • Enables prediction of progress toward objectives and reduces estimation risk;
  • Increases the accuracy of estimates of time, effort, and resources;
  • Improves resource productivity through objective analysis of work and a stronger focus on objectives;
  • Reveals actual vs. planned progress;
  • Identifies and resolves process-related issues;
  • Informs assessment of whether current progress will lead to achievement of goals; and
  • Provides evidence of progress to support funding for data improvements.

Additional Information

Objectives establish the context for understanding what a metric is measuring, and why it matters. The organization should agree on the factors that will drive patient demographic data quality, set objectives for them, measure key activities, and then monitor them for progress.

The factors that impact patient demographic data quality span across all data management processes. However, the relative importance of each factor will depend on the organization’s complexity, culture, and current capabilities.

An important first step is to establish a baseline measure of the quality of critical patient demographic data attributes across the lifecycle (See Data Quality Assessment). Then, as decisions and investments are made to support the data management infrastructure, positive impacts on the goal can be monitored, and efforts can be adjusted with confidence.
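A minimal sketch of such a baseline, assuming records are dictionaries keyed by hypothetical demographic attribute names; it reports the fill rate of each critical attribute so that later measurements can be compared against it:

    # Hypothetical critical attributes for illustration only.
    CRITICAL_ATTRIBUTES = ["first_name", "last_name", "date_of_birth", "sex", "address", "phone"]

    def baseline_fill_rates(records):
        """Return {attribute: percent of records with a non-empty value}."""
        total = len(records)
        if not total:
            return {}
        return {
            attr: 100.0 * sum(1 for r in records if r.get(attr)) / total
            for attr in CRITICAL_ATTRIBUTES
        }

    # Capture once as the baseline, then re-run after improvement efforts
    # and compare attribute by attribute.
    baseline = baseline_fill_rates([
        {"first_name": "Ana", "last_name": "Ruiz", "date_of_birth": "1980-02-14"},
        {"first_name": "Ben", "last_name": "Cole", "phone": "555-0100"},
    ])
    print(baseline["date_of_birth"])  # 50.0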

Process Area: Business Glossary
  Goal: A compliance and enforcement process ensures consistent application of business terms as new data requirements and projects arise.
  Core Question: Are business terms referenced as the first step in the design of application data stores and repositories?
  Objective: Business data requirements definition uses terms in the business glossary.
  Metric: Business Glossary Utilization Rate = # of Entries Mapped to Data Requirements / Total # of Data Requirements

Process Area: Data Quality Assessment
  Goal: Establish and sustain a business-driven function to evaluate and improve the quality of data assets.
  Core Question: Are the business, technical, and cost impacts of data quality issues analyzed and used as input to data quality improvement priorities?
  Objective: Ensure top-priority data quality improvements are consistently addressed.
  Metric: Resolution Rate of High Priority DQ Improvements = # of High Priority Improvements Resolved / # of High Priority Improvements Addressed

Process Area: Governance Management
  Goal: Data governance ensures that all relevant stakeholders are included, and that roles, responsibilities, and authorities are clearly defined and established.
  Core Question: How does the organization define roles and responsibilities and ensure that all relevant stakeholders are involved?
  Objective: All relevant stakeholders have clear data governance roles and responsibilities.
  Metric: Assignment Rate of DG Roles and Responsibilities = # of Assigned Roles / Total # of Roles Defined
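All three metrics above share a single calculation shape: a numerator count divided by a denominator count, expressed as a percentage. A minimal sketch of that shared pattern, with hypothetical figures:

    def rate(numerator, denominator):
        """Percentage rate; returns 0.0 when the denominator is zero."""
        return 100.0 * numerator / denominator if denominator else 0.0

    glossary_utilization = rate(42, 60)  # entries mapped to requirements / total requirements
    dq_resolution = rate(9, 12)          # high-priority improvements resolved / addressed
    dg_assignment = rate(18, 20)         # governance roles assigned / roles defined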

Example Work Products

  • Measurement goals and objectives
  • Initial set of measures and metrics

Additional Information

Metrics and objectives should reflect the actual changes in patient demographic data quality impacting key stakeholders and business goals. Developing a reliable set of metrics and objectives is an iterative process. Once a baseline of key performance indicators has been established, standards for data collection, storage, and analysis should be clearly specified.

At different points along the patient demographic data lifecycle, duplicate rates on patient records may be measured using different inputs and methods. While this may be appropriate for a specific purpose, it may also be discovered that inconsistent but individually valid definitions of “duplicate” exist. Inconsistently defined metrics can be compared in certain types of analysis where the differences in meaning are known. However, inconsistent metrics should never be aggregated. Data quality considerations should be established, used, and maintained for the measurement repository, and should be monitored by data governance.
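A minimal sketch of how two hypothetical but individually valid definitions of “duplicate” yield different rates over the same records, which is why such metrics must not be aggregated:

    def exact_key(r):
        # Stricter definition: exact match on name and birth date.
        return (r["first_name"].lower(), r["last_name"].lower(), r["date_of_birth"])

    def loose_key(r):
        # Looser definition: last name and birth date only.
        return (r["last_name"].lower(), r["date_of_birth"])

    def duplicate_rate(records, key):
        """Percent of records whose key has already been seen."""
        seen, duplicates = set(), 0
        for r in records:
            k = key(r)
            if k in seen:
                duplicates += 1
            seen.add(k)
        return 100.0 * duplicates / len(records) if records else 0.0

    records = [
        {"first_name": "Ana", "last_name": "Ruiz", "date_of_birth": "1980-02-14"},
        {"first_name": "Anna", "last_name": "Ruiz", "date_of_birth": "1980-02-14"},
        {"first_name": "Ana", "last_name": "Ruiz", "date_of_birth": "1980-02-14"},
    ]
    print(duplicate_rate(records, exact_key))  # 33.3...: one exact duplicate
    print(duplicate_rate(records, loose_key))  # 66.6...: two records share the loose key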

Example Work Products

  • Defined data management metrics and measurements procedures
  • Metrics analysis reports
  • Defined responsibilities for metrics-based governance
  • Specifications of base and derived measures
  • Documented data collection and storage tools and procedures
  • Analysis specifications and procedures
  • Metrics reports and related analysis results
  • Repository or repositories for measurement data

Additional Information

Metrics can serve as a powerful form of communication. They can convey complex behavior in simple terms and serve as the basis for making rational decisions that not only lead to, but quantitatively define, success. While it is important to invest in the development of key metrics for patient demographic data quality, it is equally important to invest in analyzing and interpreting results.

Metrics rarely speak for themselves. Stakeholders need full disclosure on what is being presented. Metrics and objectives need to be precisely defined, so that the rationale for measuring and analyzing is understood. Quantitative thresholds help stakeholders to understand how good is good enough (See Data Quality Assessment). If charts are used, axes should be labeled. Legends should be employed so that color coding, shapes, and other discriminating visual characteristics can be clearly understood.

Analysts should state conclusions succinctly where possible, pose considerations or questions that merit attention, and visually identify key patterns that support conclusions, using arrows, captions, boxes, or other methods of drawing attention to what is important.
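A minimal charting sketch following this guidance (labeled axes, a legend, and an annotation highlighting the key pattern), using matplotlib and hypothetical data:

    import matplotlib.pyplot as plt

    months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
    duplicate_rate = [6.2, 5.8, 5.1, 4.4, 4.5, 3.9]  # percent of records; hypothetical

    fig, ax = plt.subplots()
    ax.plot(months, duplicate_rate, marker="o", label="Duplicate record rate")
    ax.axhline(5.0, linestyle="--", color="gray", label="Governance threshold (5%)")
    # Draw attention to the key pattern the analysis supports.
    ax.annotate("Fell below threshold in Apr", xy=(3, 4.4), xytext=(0.5, 3.6),
                arrowprops={"arrowstyle": "->"})
    ax.set_xlabel("Month")
    ax.set_ylabel("Duplicate rate (% of records)")
    ax.set_title("Patient Record Duplicate Rate vs. Governance Threshold")
    ax.legend()
    plt.show()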

Example Work Products

  • Contextual information or guidance to help interpret analysis results
  • Measurement and metrics standards
  • Analysis results and regular metrics analysis reports
  • Conclusions and recommendations resulting from analysis

Practice Evaluation Questions

Tier 1: Foundational

1.1 Has the organization developed measurement objectives for quality improvements to patient demographic data?

Tier 2: Building

2.1 Has the organization established a baseline of patient demographic data measurements?

Tier 3: Advanced

3.1 Does the organization communicate results of measurement, analysis, and interpretation to all relevant stakeholders?