
Evaluation of the Use of AHRQ and Other Quality Indicators

Chapter 3. The Market for Quality Indicators

Our environmental scan revealed strong demand for indicators of hospital care quality. Demand for indicators used in research and quality monitoring remains strong and has a relatively long history. Demand is growing even more rapidly for indicators that serve newer purposes: public reporting to inform consumers' choice of providers and otherwise drive provider improvement, pay-for-performance programs that reward high-quality providers, the development of tiered insurance products, and the selection of provider networks.

This demand has led to a proliferation of quality indicators. In addition to AHRQ, the market leaders in developing hospital quality indicators are the Centers for Medicare and Medicaid Services (CMS), the Joint Commission on Accreditation of Healthcare Organizations (JCAHO), the Hospital Quality Alliance (HQA—a collaboration between CMS, JCAHO, and several other organizations), and the Leapfrog Group. In this chapter, we discuss these and other developers and vendors of quality indicators, and how the quality indicators developed by each of these agencies/organizations compare to the AHRQ QIs. Our environmental scan identified two main categories of players in the market for quality indicators. The first type, "developers," includes organizations that develop, support, and distribute quality indicators. The second type, "vendors," includes organizations that develop and/or sell quality measurement products to providers, insurers, and others. Vendors' products often include the AHRQ QIs (or variants thereof), indicators from other developers, and/or indicators developed by the vendors themselves.

3.1. Developers

The environmental scan identified 12 organizations that have developed indicators similar in some way to the AHRQ QIs. Those whose indicators are widely used and hospital-focused are summarized in Table 3.1 and described below.

Although there are similarities between these indicators and those developed by AHRQ, none of the indicators developed by organizations other than AHRQ were comparable to the AHRQ QIs on all of their major characteristics: based on administrative data, outcome-focused, hospital-focused, based on transparent methodology, and available for public use.

 

Table 3.1. Developers of Quality Indicators and Comparison with AHRQ QIs

JCAHO/CMS/HQA
  Indicators: Core Measures; Hospital Quality Indicators
  Similarities to AHRQ QIs: national standard
  Differences from AHRQ QIs: process measures; clinical data; implemented through licensed vendors

The Leapfrog Group
  Indicators: Leapfrog Leaps
  Similarities to AHRQ QIs: national standard; some outcomes indicators
  Differences from AHRQ QIs: collected through survey; mostly structure and process measures

Institute for Healthcare Improvement
  Indicators: Hospital Standardized Mortality Ratios
  Similarities to AHRQ QIs: outcomes indicator (risk-adjusted mortality)
  Differences from AHRQ QIs: mortality not condition-specific; used in conjunction with specific quality improvement program

States (e.g., PA and CA)
  Indicators: PA Health Care Cost Containment Council Hospital Performance Report Indicators; CA Healthcare Quality and Analysis Division Indicators
  Similarities to AHRQ QIs: outcomes indicators; administrative data
  Differences from AHRQ QIs: use data elements not available in administrative data in most states

Vendors
  Indicators: various
  Similarities to AHRQ QIs: administrative data; some outcomes indicators
  Differences from AHRQ QIs: methodology often not transparent

Source: RAND analysis of environmental scan results.
Note: Indicators were judged to be a "national standard" if they were described that way by any of the study's interviewees.

JCAHO/CMS/HQA. Both JCAHO and CMS have developed quality indicators of hospital care for common conditions. CMS's measures were originally used for quality improvement initiatives conducted by Medicare Quality Improvement Organizations (QIOs). JCAHO's Core Measures have been used as part of the JCAHO hospital accreditation process since 2002. The Core Measures cover five clinical areas: (1) acute myocardial infarction, (2) heart failure, (3) pneumonia, (4) surgical infection prevention, and (5) pregnancy and related conditions. JCAHO-accredited hospitals choose three of these five areas for reporting, depending on the services they provide. JCAHO publishes the results of the measures publicly on the Web.21

Because the two measure sets overlapped significantly, CMS and JCAHO agreed in 2004 to align the specifications for overlapping measures and to maintain them as a shared set. A subset of the joint CMS-JCAHO measures was later selected by the HQA, a public-private partnership for measuring and reporting hospital quality. The resulting Hospital Quality Measures are now publicly reported on the Web for both accredited and non-accredited hospitals.22 They are also used in other CMS activities, such as the Premier pay-for-performance demonstration project.23

Like the AHRQ QIs, the CMS/JCAHO/HQA measures are widely used and viewed as a national standard.d A key difference between those measures and the AHRQ QIs is that they are largely based on clinical data abstracted from medical records rather than on administrative data. JCAHO has estimated that collecting the clinical data for the Core Measures takes an average of 22-27 minutes per case for acute myocardial infarction, heart failure, and pneumonia.24

A second key difference is that the CMS/JCAHO/HQA measures are process indicators, while the AHRQ QIs are outcome indicators. Another difference is that, while the AHRQ QIs reflect a broad range of conditions, the CMS/JCAHO/HQA measures currently reflect only five conditions; JCAHO and CMS are, however, developing indicators in additional clinical areas.

The method used by JCAHO to implement its Core Measures is also different from that used for the AHRQ QIs. Hospitals pay vendors to measure the JCAHO Core Measures on their behalf using standardized specifications. Hospitals have made a wide variety of arrangements with vendors for Core Measure collection and reporting, according to their specific needs and characteristics. All vendors of the JCAHO Core Measures must undergo a certification process through which JCAHO ensures that they have appropriately implemented the measures.

Due to these differences, the CMS/JCAHO/HQA measures and the AHRQ QIs can be considered complementary in some respects. A number of the users of the AHRQ QIs interviewed (11 of 36) also use the JCAHO/CMS/HQA measures.

The CMS/JCAHO/HQA measures and the AHRQ QIs compete only in the sense that hospitals have limited resources for quality measurement. Hospitals are required to report the JCAHO Core Measures for accreditation and may have few resources left for other quality measurement activities, including the AHRQ QIs. One interviewee told us:

AHRQ could do a lot of terrific things with the AHRQ QIs, but facilities are trying to meet requirements right now and don't have time and resources to work with other quality indicators to the exclusion of what they might like to do. Hospitals are doing only what they have to do—either by mandate or by the market.e

Leapfrog. The Leapfrog Group has developed a set of quality indicators that are widely used and considered a national standard. The indicators are intended to improve value in health care purchasing. Provider performance on the indicators is presented in a public report on Leapfrog's Web site. In addition to developing and marketing its own quality indicators, Leapfrog operates a pay-for-performance program, the Leapfrog Hospital Rewards Program, which uses the JCAHO Core Measures and an efficiency measure in addition to the Leapfrog indicators. The program is implemented through vendors, who pay Leapfrog a fee for each participating hospital and then charge the hospitals accordingly.

Unlike the AHRQ QIs, most of the Leapfrog indicators are not outcome-focused, and they require primary data collection. The indicators are organized into four content areas called "Leaps": (1) computerized physician order entry, (2) intensive care unit staffing, (3) high-risk treatments, and (4) safe practices. Data are collected through a survey of hospitals. Leaps 1, 2, and 4 are structure and process indicators, such as use of a computerized physician order entry system or staffing of hospital intensive care units with intensivists (physicians who specialize in critical care medicine). Leap 3 (high-risk treatments) overlaps considerably with the AHRQ IQIs: it measures procedure volume and risk-adjusted mortality for selected conditions. Leapfrog is currently aligning its specifications with those used in the AHRQ IQIs in order to minimize the reporting burden for hospitals.

Institute for Healthcare Improvement (IHI). The IHI measures overall hospital mortality as part of its activities to improve hospital quality. This measurement activity is conducted in conjunction with the implementation of a specific set of interventions intended to improve quality in participating hospitals. The indicator resembles the AHRQ IQIs in that it measures risk-adjusted mortality associated with hospital stays and is computed from administrative data. Unlike the AHRQ IQIs, however, the IHI measures the mortality rate for all conditions. Hospital- and area-level characteristics are used in regression models to control for patient risk. This measurement approach originated in the United Kingdom and has also been applied to hospitals in many countries other than the United States.25
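As a concrete illustration of the general logic behind such a standardized mortality ratio, the following minimal sketch uses indirect standardization: expected deaths are obtained by applying benchmark mortality rates to the hospital's own case mix, and the ratio of observed to expected deaths (multiplied by 100) is the reported figure. All strata, rates, and counts below are invented for illustration; the actual IHI/UK methodology uses regression-based risk adjustment over many patient characteristics, as described above.

```python
# A minimal sketch of indirect standardization, the general logic behind a
# hospital standardized mortality ratio (HSMR). All strata, rates, and
# discharge counts are invented for illustration; the actual IHI/UK method
# derives expected deaths from regression models on patient risk.

from collections import Counter

# Benchmark in-hospital mortality rates by risk stratum (hypothetical).
reference_rates = {"low": 0.005, "medium": 0.03, "high": 0.15}

# One hospital's discharges: (risk stratum, died in hospital?)
discharges = ([("low", False)] * 400 + [("low", True)] * 3
              + [("medium", False)] * 250 + [("medium", True)] * 9
              + [("high", False)] * 80 + [("high", True)] * 16)

observed = sum(died for _, died in discharges)
cases_by_stratum = Counter(stratum for stratum, _ in discharges)

# Expected deaths: apply the benchmark rates to this hospital's own case mix.
expected = sum(n * reference_rates[s] for s, n in cases_by_stratum.items())

hsmr = 100 * observed / expected  # 100 means mortality equal to the benchmark
print(f"Observed: {observed}, Expected: {expected:.1f}, HSMR: {hsmr:.0f}")
```

Running the sketch yields an HSMR of about 116, i.e., roughly 16 percent more deaths than the benchmark would predict for this case mix.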

States. We also interviewed representatives from California and Pennsylvania, two states that have developed their own methodologies for measuring quality using administrative data. These states developed their own measurement approaches largely because their public reporting efforts predate the AHRQ QIs. Both states also use data elements that are unavailable in the hospital administrative data collected in most other states: a flag indicating conditions that were present on hospital admission (California) and detailed data on severity of illness (Pennsylvania). Other states, such as New York, have also developed their own measurement approaches, which may predate the AHRQ QIs or use data elements not available elsewhere.

Vendors. We interviewed several vendors who, in addition to implementing existing measures from other developers in their measurement tools, have developed proprietary indicators. Some of these indicators are similar to the AHRQ QIs in that they are outcomes indicators based on administrative data. The key difference is that the definitions and specifications of most vendors' indicators are proprietary. The vendors' indicators have also not always been subjected to validation as rigorous as that applied to the AHRQ QIs. In the next subsection, we discuss the vendors identified by the environmental scan in more detail.


3.2. Vendors

The environmental scan identified 12 vendors of quality measurement products that were determined to include the AHRQ QIs.f These vendors are listed in Table 3.2.

Typically, the AHRQ QIs are included in software tools marketed to hospitals for quality improvement or to insurers and business groups for hospital profiling. The vendors' products offer functionality beyond that of the basic AHRQ QI software. For example, the vendors' measurement tools often include indicators other than quality indicators, such as measures of financial performance, and are often designed to offer users a variety of reporting options. These tools may be particularly useful for hospitals that lack the in-house expertise or staff time to construct indicators of quality and other aspects of care from raw data. Similar tools are used by insurance companies and other organizations.

 

Table 3.2. Vendors of Quality Measurement Products That Include the AHRQ QIs

Vendors
CareScience
Consumers' Checkbook
Health Benchmarks
HealthGrades
Innovative Health Solutions
Mediqual (Cardinal Health)
Medisolv
Midas+
Solucient
Subimo
WebMD Quality Services

Source: RAND analysis of environmental scan results.

As mentioned above, many of these tools include proprietary quality indicators developed by the vendors themselves. In addition, many of the vendors are licensed to implement the JCAHO Core Measures, and many also produce indicators from other developers, such as Leapfrog.

Some users of the AHRQ QIs whom we interviewed rely on vendors for their measurement activities and expressed a high degree of satisfaction with the vendors' services. Others expressed concern that the AHRQ QIs as implemented by some vendors may differ in key respects from the official AHRQ QI specifications, and that the proprietary nature of the tools makes these differences non-transparent. One hospital association captured this sentiment:

The AHRQ QIs are standardized measures, risk-adjusted, and not in a "black box" so we can get the numerator and denominator and make them accessible to hospitals. The industry is sick and tired of vendors and consulting firms creating black boxes.

Another interviewee sounded similar themes:

The problem is that if there's any "black box" methodology, [users] won't touch it—it's politically dead, even if there is an underlying valid scientific process. Hospitals want to check their own numbers. [The vendors'] offers sound nice. The problem is, a hospital can't replicate the findings or understand differences in methodology/calculations. [Users] like transparency, a tool that is open, where everyone can see what is happening, hospitals can replicate the results, then everyone can talk about the differences. It democratizes quality reporting.


3.3. AHRQ's Position in the Market for Quality Indicators

While the quality indicators developed by organizations other than AHRQ share certain characteristics with the AHRQ QIs, no other source of indicators is viewed as a national standard while also being publicly available, transparent, hospital-focused, outcome-focused, and based on administrative data. Many of our interviewees stressed that the AHRQ QIs fill an important void in this respect. A representative of an integrated delivery system described searching for quality indicators that could readily be used to monitor quality and guide quality improvement activities:

When we started looking for indicators, we really struggled to find valid quality measures based on readily available data and with benchmark potential. Without manually auditing patient charts, and coming up with numerator and denominator definitions on our own, there was no way we could do it by ourselves. AHRQ offered the set of measures prescribed for our needs.

A representative of a state doing public reporting told us:

If we didn't have the AHRQ QIs, it would be difficult as a state to come up with our own indicators and there are not many other choices that are based on administrative data. Until electronic medical records are commonplace (5-10 years at least), we need to deal with using administrative data.

An insurance company representative highlighted the importance of AHRQ's role in the quality indicator market, stating that more marketing of the QIs is needed:

AHRQ is doing something that no one else is doing. We have to have a national standard, something used across the country for comparison. [Does AHRQ] realize they're one of the only good options out there? They should really pick up the outreach so that others will pick up using the QIs.

3.3.1. Overview of users and uses of the AHRQ QIs

AHRQ's unique position in the market for quality indicators has led to a proliferation of uses for the AHRQ QIs. Our environmental scan identified 114 users of the indicators, spanning a range of purposes that includes public reporting, quality improvement/benchmarking, pay-for-performance, and research. Table 3.3 summarizes the number of users of the AHRQ QIs by type of organization and purpose of use.

 

Table 3.3. Users of the AHRQ QIs by Type of Organization and Type of Use

Type of Organization | Public Reporting | Quality Improvement/Benchmarking | Pay-for-Performance | Research | Other/Unknown | Total
Business Group | 2 | 0 | 0 | 0 | 0 | 2
Consulting Firm | 0 | 0 | 0 | 2 | 0 | 2
Employer | 0 | 1 | 0 | 0 | 0 | 1
Federal Government | 0 | 1 | 1 | 19 | 0 | 21
Health Plan | 1 | 1 | 3 | 0 | 4 | 9
Hospital Association | 1 | 8 | 0 | 2 | 0 | 11
Hospital or Hospital Network | 2 | 3 | 0 | 1 | 9 | 15
Integrated Delivery System | 0 | 2 | 0 | 0 | 7 | 9
Other | 2 | 4 | 0 | 0 | 1 | 7
Research Organization | 0 | 1 | 0 | 14 | 1 | 16
State or Local Government | 12 | 2 | 0 | 5 | 2 | 21
Total | 20 | 23 | 4 | 43 | 24 | 114

Source: RAND analysis of environmental scan results.

The most common uses of the AHRQ QIs include:

  • Research. We identified 43 organizations that use AHRQ QIs for research. For example, Leslie Greenwald and colleagues used the AHRQ QIs to compare the quality of care provided in physician-owned specialty hospitals and competitor hospitals.26
  • Quality improvement. We identified 23 organizations that use the AHRQ QIs as part of a quality improvement activity, including reports benchmarking performance against peers; however, these organizations do not release the quality information into the public domain.g
  • Public reporting. We identified 20 organizations using the AHRQ QIs for public reporting. We classified an activity as "public reporting" if a publicly available report was published that compares AHRQ QI results between hospitals or geographic areas such as counties. The organizations using the AHRQ QIs for public reporting, with Web links to the reports, are listed in Table 3.4.
  • Pay-for-Performance. We identified 4 organizations that are using the AHRQ QIs in pay-for-performance programs. Three were health plans and one was a CMS demonstration project.

 

Table 3.4. Users of the AHRQ QIs for Public Reporting

Organization Name | Type of Report | Description | QIs Used | Citation
AFSCME Council 31 | One-time report | The union published a report on quality at Resurrection Health Care hospitals after complaints about quality from workers. | IQIs 15-20 | AFSCME Council 31. The High Price of Growth at Resurrection Health Care: Corporatization and the Decline of Quality of Care. November 2005. Available at: http://www.afscme31.org/cmaextras/qualityofcare.pdf, last accessed January 2006.
California Office of Statewide Health Planning & Development | Interactive tool and periodic written reports | A Web site includes an interactive tool for hospital comparison on selected IQIs and other risk-adjusted mortality indicators. | IQIs 1, 2, 4-7, 21-23, 33, 34; PDI 7 | California Office of Statewide Health Planning and Development. "Consumer Information on Quality of Care." Available at: http://www.oshpd.ca.gov/oshpdKEY/qualityofcare.htm, last accessed September 2006.
Chicago Department of Public Health | Periodic report | Chicago runs a Web site providing a health profile of city community areas, including PQIs. | PQIs (all except 2, 9) | City of Chicago. "Community Health Profiles." Available at: http://www.cchsd.org/cahealthprof.html, last accessed September 2006.
Colorado Health and Hospital Association | Periodic report | Hospital reports are shared among hospitals and published on a Web site. | IQIs 4-7, 12-20, 30, 31 | http://www.hospitalquality.org/index.php, last accessed November 2005.
Connecticut Office of Health Care Access | One-time report | Databook on preventable hospitalizations. | PQIs (all) | Office of Health Care Access Databook. Preventable Hospitalizations in Connecticut: Assessing Access to Community Health Services, FY2000-2004. Available at: http://www.ct.gov/ohca/lib/ohca/publications/acsc_databook00-04.pdf, last accessed November 2005.
Excellus Blue Cross/Blue Shield | Interactive tool | Online hospital comparison tool for health plan members only. | Unspecified (members only) | https://www.excellusbcbs.com/guests/find_a_doctor_or_hospital/click_and_compare.shtml#, last accessed September 2006.
Exempla Healthcare | Periodic report | Exempla publishes quality information on its hospitals on its Web site. (The same results are also reported by the Colorado Health and Hospital Association.) | IQIs 12-20, 30, 31 | http://www.exempla.org/about/quality/MortalityReportExemplaELMCandESJH51105.htm, last accessed September 2006.
Florida State Center for Health Statistics | Interactive tool | Online hospital comparison tool. | PSIs 3, 6-8, 12, 13; IQIs 8-20, 32 | http://www.floridacomparecare.gov/, last accessed September 2006.
Georgia Partnership for Health and Accountability | Periodic report | A periodic report on health in Georgia includes a chapter on avoidable hospitalizations. | PQIs 3, 5, 8, 10, 11, 15 | Georgia Partnership for Health & Accountability. The State of the Health of Georgia, 2004: Ambulatory Care Sensitive Conditions. Available at: http://www.gha.org/pha/publications/stateofthehealth/2004/ACS112704.pdf, last accessed November 2005.
Massachusetts Dept. of Health and Human Services | Interactive tool | Online hospital comparison tool. | IQIs 14, 16-21, 32-34 | www.mass.gov/healthcareqc/, last accessed September 2006.
Missouri Department of Health and Senior Services | Periodic report | Comparison of hospital surgery volume to help consumers choose a hospital. | IQIs 1, 2, 4-7; PDI 7 | http://www.dhss.mo.gov/HospitalSurgeryVolume/index.html
Niagara Health Quality Coalition and Alliance for Quality Health Care | Interactive tool | Online hospital comparison tool. | IQIs 1-25 | http://www.myhealthfinder.com/, last accessed September 2006.
Norton Healthcare | Interactive tool | Health system publishes quality data for its hospitals on its Web site. | PSIs 1-6, 8-16, 18-20; IQIs 1, 2, 4-9, 11-20, 22-24, 30, 31, 34; PDIs 2-9, 11, 13 | http://www.nortonhealthcare.com/about/qualityreport/index.aspx, last accessed September 2006.
Ohio Department of Health | Periodic report | Online comparison of avoidable hospitalizations by county. | PQIs 1, 4, 5, 7, 8, 11, 14, 15 | http://www2.odh.ohio.gov/Data/CntyPfls/primcare1.htm, last accessed September 2006.
Oregon | Interactive tool | Online hospital comparison tool and a report on Oregon's safety net by the Safety Net Advisory Council. | IQIs 11, 12, 15-17, 19, 20, 30; PQIs 3, 5, 8, 10-12, 15 | http://egov.oregon.gov/DAS/OHPPR/HQ/HospReports.shtml (IQIs); http://www.oregon.gov/DAS/OHPPR/SNAC/SNACWelcomePage.shtml#Power_Point_Presentations (PQIs)
Rhode Island | One-time report | Report on hospital procedure volumes. Future reports on IQIs and PSIs in preparation. | IQIs 1-7 | Williams KA, Buechner JS. Health by Numbers, Vol. 6, No. 2. February 2004. Available at: http://www.health.ri.gov/chic/statistics/hbn_feb2004.pdf, last accessed December 2005.
Texas Health Care Information Collection | Interactive tool | Online hospital comparison tool. | IQIs 1-14, 16-20, 22-25, 30-33; PQIs (all) | http://www.dshs.state.tx.us/THCIC, last accessed September 2006.
The Alliance (Wisconsin) | Periodic report | QualityCounts report on hospital safety performance. The report is based on AHRQ PSIs but modifies them for reporting. | PSIs 3, 6-8, 12, 17; IQI 33 | http://www.qualitycounts.org/, last accessed September 2006.
Utah Department of Public Health | Periodic report | Web site providing health information for geographic areas. Three PQIs are included with numerous health status and other measures. State-level IQI results are presented on a one-page poster, available online. | PQIs 11, 4, 1+3+14 combined; IQIs (all) | PQIs: http://ibis.health.utah.gov/indicator/index/alphabetical.html; IQIs: http://health.utah.gov/hda/AHRQ2005.pdf
Vermont Department of Banking, Insurance, Securities & Health Care Administration | Periodic report | Online hospital comparison report. | IQIs 1, 2, 4-9, 11, 12, 30, 31; PDIs 6, 7 | http://www.bishca.state.vt.us/HcaDiv/HRAP_Act53/HRC_BISHCAcomparison_2006/BISHCA_HRC_compar_menu_2006.htm, last accessed September 2006.

Source: RAND analysis of environmental scan results.
Note: "Public reporting" was defined as a publicly available report that compares AHRQ QI results between hospitals or geographic areas such as counties. Not all of the public reports identified in this table are intended to influence consumers' choice of provider.
"One-time reports" are published comparisons that are not labeled as an ongoing activity.
"Periodic reports" are published comparisons, updated periodically, that are in static format (e.g., documents available as .pdf files online).
"Interactive tools" are online comparisons that allow users to create customized reports (e.g., selection of providers or indicators of interest).

3.3.2. Uses of Specific AHRQ QIs

We asked the users of the AHRQ QIs, and the vendors of quality measurement packages that include the AHRQ QIs, which specific QIs they were using. Among the 42 organizations we interviewed, the PSIs and IQIs were used more frequently than the PQIs: 33 were using the PSIs, 30 the IQIs, and 17 the PQIs.

Within the PSI and IQI sets, some indicators were used more frequently than others. Users of the PQIs, by contrast, tended to use the full set; there were no meaningful differences in the frequency of use of particular PQIs (data not shown).

3.3.2.1. Use of IQIs

Figure 3.1 shows the frequency of use of each IQI by the users and vendors we interviewed. The IQIs that reflect mortality rates for medical conditions were used most frequently, particularly:

  • IQI 16—congestive heart failure mortality (23 users).
  • IQI 17—stroke mortality (23 users).
  • IQI 20—pneumonia mortality (22 users).

The rates of mortality for certain medical procedures were also commonly used, particularly:

  • IQI 12—coronary artery bypass graft mortality (23 users).
  • IQI 13—craniotomy mortality (19 users).
  • IQI 11—abdominal aortic aneurysm repair mortality (18 users).
  • IQI 14—hip replacement mortality (18 users), and
  • IQI 30—percutaneous transluminal coronary angioplasty mortality (18 users).

The procedure volume indicators were used less frequently, and the procedure utilization rates, both hospital- and area-level, were used least frequently.

 

Figure 3.1. Frequency of Use of Specific AHRQ IQIs among 42 Users of the AHRQ QIs

Figure 3.1 depicts five categories of quality indicators measured against frequency of use in a column chart. The data represented by the graph are listed below, with the number of users of each IQI in parentheses.

Volume of Procedures: IQI 1 (15), IQI 2 (15), IQI 3 (12), IQI 4 (15), IQI 5 (14), IQI 6 (15), IQI 7 (15)

Mortality Rates for Surgical Procedures: IQI 8 (15), IQI 9 (15), IQI 10 (10), IQI 11 (18), IQI 12 (23), IQI 13 (19), IQI 14 (18), IQI 30 (18), IQI 31 (17)

Mortality Rates for Medical Conditions: IQI 15 (21), IQI 16 (23), IQI 17 (23), IQI 18 (19), IQI 19 (19), IQI 20 (22), IQI 32 (16)

Area-Level Procedure Utilization Rates: IQI 21 (14), IQI 22 (13), IQI 23 (13), IQI 24 (13), IQI 25 (12), IQI 33 (15), IQI 34 (12)

Hospital-Level Procedure Utilization Rates: IQI 26 (12), IQI 27 (11), IQI 28 (11), IQI 29 (11)

Source: RAND analysis of environmental scan results.

3.3.2.2. Use of PSIs

Figure 3.2 shows similar counts of the frequency of use of each PSI. The area-level PSIs were used less frequently than the hospital-level PSIs. Among the hospital-level indicators, there was considerable variation in frequency of use between the indicators. The most frequently used PSIs were PSI 12—postoperative pulmonary embolism (PE) or deep vein thrombosis (DVT) (28 users), PSI 8—postoperative hip fracture (26 users), and PSI 13—postoperative sepsis (26 users). The least frequently used hospital-level PSIs were PSIs 18, 19, and 20—obstetric trauma with instrument, without instrument, and during cesarean delivery (15, 16, and 15 users, respectively).

 

Figure 3.2. Frequency of Use of Specific AHRQ PSIs among 42 Users of the AHRQ QIs

Figure 3.2 depicts two categories of Patient Safety Indicators (PSIs) measured against frequency of use in a column chart. The data represented by the graph are contained in the following table.

Category AHRQ PSI Frequency (number of users)
Hospital-Level Indicators 1. Complications of anesthesia 20
2. Death in low mortality DRGs 19
3. Decubitus ulcer 25
4. Failure to rescue 20
5. Foreign body left in during procedure 20
6. Iatrogenic pneumothorax 25
7. Selected infections due to medical care 25
8. Postoperative hip fracture 26
9. Postoperative hemorrhage or hematoma 22
10. Postoperative physiologic & metabolic derangements 23
11. Postoperative respiratory failure 23
12. Postoperative PE or DVT 28
13. Postoperative sepsis 26
14. Postoperative wound dehiscence 23
15. Accidental puncture and laceration 22
16. Transfusion reaction 18
17. Birth trauma—injury to neonate 18
18. OB trauma—vag. delivery with instrument 15
19. OB trauma—vag. delivery without instrument 16
20. OB trauma—cesarean delivery 15
Area-Level Indicators 21. Foreign body left in during procedure 13
22. Iatrogenic pneumothorax 14
23. Selected infections due to medical care 14
24. Postoperative wound dehiscence 13
25. Accidental puncture and laceration 13
26. Transfusion reaction 13
27. Postoperative hemorrhage or hematoma 12

Source: RAND analysis of environmental scan results.

3.3.3. International uses of QIs

Measuring quality of care has become a policy priority in many countries outside of the United States, and numerous countries and international organizations are in the process of instituting requirements for data collection and reporting of quality indicators.27 The AHRQ QIs are an attractive option for international users, since many countries already require hospitals to report the required administrative data.

Perhaps the most visible international endeavor is the Organization for Economic Cooperation and Development's (OECD) Health Care Quality Indicators (HCQI) Project. The OECD is an intergovernmental economic research institution headquartered in Paris, France, with a membership of 30 countries that share a commitment to democratic government and the market economy. One of its widely used products is OECD Health Data, which provides internationally comparable information on infrastructure, cost and utilization at the health system level,28 but so far no information on quality of care. In an attempt to bridge this gap, in 2003 the OECD brought 23 of its member countries together with international organizations such as the World Health Organization (WHO) and the European Commission (EC), expert organizations such as the International Society of Quality in Healthcare (ISQua) and the European Society for Quality in Healthcare (ESQH), and several universities.29 The goal of the meeting was to work on the development and implementation of quality indicators at the international level.

The project initiated its work with two major activities. The first was an effort to introduce a pilot set of quality indicators that can be reported by a substantial portion of the OECD countries. This activity recently led to the 2006 release of an initial list of indicators and corresponding data.30 The second activity was to identify additional quality indicators for five priority areas: cardiac care, diabetes mellitus, mental health, patient safety, and primary care/prevention/health promotion. Through an expert panel process, 86 indicators were selected for the five areas, and the OECD is currently investigating the availability and validity of the required data.31 Several AHRQ PSIs were selected for the patient safety area32 and an indicator similar to the PQIs was selected for the primary care area.33

Researchers from several countries have tried to run the PSIs against national discharge data, both as part of their participation in the HCQI Project and for other projects. This has been attempted in Canada, Germany, Italy, and Sweden. In addition, a group in Belgium successfully constructed some of the HCUP indicators, the predecessors of the AHRQ QIs, from national data.34

Results from those projects remain largely unpublished in English-language journals. But at a recent OECD meeting in Dublin, Ireland, experts from 15 countries discussed issues around the adoption of the AHRQ PSIs in countries other than the United States. Researchers from several countries had cross-walked the AHRQ PSI specifications, which are based on the U.S.-specific ICD-9-CM diagnosis codes, to the ICD-10 diagnosis codes that most countries now use.

This diagnosis-code conversion proved unproblematic, in particular because only a limited number of diagnosis codes had to be cross-walked to construct the indicators. The conversion of procedure codes turned out to be a greater issue: the AHRQ definitions are based on the ICD-9 procedure classification, whereas other countries use national procedure classification systems. Similarly, other countries use different DRG groupings than those used in the United States. Substantial work is needed to map the coding systems used in the U.S. to those used in other countries.
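As a minimal illustration of the diagnosis-code side of this work, the sketch below translates a small set of ICD-9-CM codes to ICD-10 using a lookup table and flags codes that lack a mapping for manual review. All codes and mappings shown are hypothetical placeholders, not the official AHRQ specifications or any published crosswalk.

```python
# Hypothetical sketch: translating the ICD-9-CM diagnosis codes in a PSI
# numerator definition to ICD-10 equivalents. All codes and mappings below
# are illustrative placeholders, not an official crosswalk.

icd9_to_icd10 = {
    "9971": "I97.0",    # placeholder mapping
    "99702": "I97.88",  # placeholder mapping
}

def crosswalk(icd9_codes):
    """Map a set of ICD-9-CM codes; collect codes that have no mapping."""
    mapped, unmapped = set(), set()
    for code in icd9_codes:
        if code in icd9_to_icd10:
            mapped.add(icd9_to_icd10[code])
        else:
            unmapped.add(code)
    return mapped, unmapped

mapped, unmapped = crosswalk({"9971", "99702", "9973"})
print("ICD-10 numerator codes:", mapped)
print("Needs manual review:", unmapped)  # {'9973'} has no mapping above
```

The same pattern applies to procedure codes and DRG groupings, but, as noted above, those mappings are far larger and less direct, which is why they remain the harder part of the conversion.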

In countries that have tested the AHRQ PSIs, the average rates were reported to be similar to those observed in the United States. Countries that do not yet have DRG-based prospective payment systems saw much lower rates, possibly resulting from less thorough coding of secondary diagnoses in the absence of implications for payment.

Our interviews show that there is international interest in using the AHRQ QIs, as well as sufficient data and technical capability to implement them. It is therefore likely that some AHRQ QIs will be adopted by the OECD HCQI Project for international comparisons of quality of care and patient safety. Furthermore, as several international organizations are striving to align their measurement and reporting activities,h selected AHRQ QIs could become part of an international measurement standard.

3.3.4. "Non-users" of QIs

We identified and interviewed representatives of five organizations that currently use quality indicators similar to the AHRQ QIs but do not use the QIs themselves. Our goal was to understand why these organizations were not using the AHRQ QIs, and specifically whether this decision was based on an evaluation of the merits of the QIs. Three of the organizations were using quality indicators that they had developed themselves and that predated the AHRQ QIs. They voiced no strong objections to the AHRQ QIs but preferred their own indicators for various methodological reasons and because their indicators were better tailored to their specific needs. The other two organizations had elected not to use the AHRQ QIs because the hospitals participating in their quality measurement activities were not already using them; the JCAHO Core Measures were chosen instead because hospitals were already collecting them.

3.3.5. Examples of uses of QIs

To illustrate how the AHRQ QIs are being used, we have chosen specific examples of use for quality improvement, public reporting, and pay-for-performance.

3.3.5.1. Example of AHRQ QI use for quality improvement

Figure 3.3 was drawn from a report provided to hospitals by a hospital association we interviewed. Reports such as the one we reviewed are sent to hospital CEOs quarterly. The reports include all AHRQ IQIs (shown in the example) and all AHRQ PSIs, as well as indicators from JCAHO and Leapfrog (not shown). Hospitals are presented with targets based on benchmarks calculated by the hospital association, which works with hospitals to investigate why targets are not met in areas of poor performance.

 

Figure 3.3. Sample AHRQ QI Report Used by a Hospital Association for Quality Improvement

Figure 3.3 depicts a sample AHRQ QI report in tabular format. The data it represents are contained in the following tables; "Q2" denotes Apr-Jun 2005 and "Yr" denotes Jul 2004-Jun 2005 (the most recent year).

Post-Procedure Mortality (%)
Indicator | Q2 Rel. Perf. | Q2 N | Q2 Observed | Q2 Target | Q2 Median | Yr Rel. Perf. | Yr N | Yr Observed | Yr Target | Yr Median
CABG | E | 74 | 4.3 | 3.2 | 3.5 | E | 299 | 3.9 | 3.2 | 3.8
PTCA | E | 181 | 1.3 | 1.3 | 1.1 | E | 722 | 2.1 | 1.4 | 1.4
Carotid endarterectomy | E* | 16 | 0.8 | 0.6 | 0.0 | E | 65 | 0.6 | 0.7 | 0.0
AAA repair | E* | 6 | 10.9 | 12.6 | 0.0 | E* | 20 | 13.4 | 11.9 | 6.2
Esophageal cancer resection | E* | 2 | 2.5 | 4.9 | 0.0 | E* | 8 | 7.4 | 6.3 | 0.0
Pancreatic cancer resection | E* | 5 | 4.1 | 4.8 | 0.0 | E* | 18 | 3.7 | 5.1 | 0.0
Pediatric heart surgery | E | 28 | 6.0 | 5.0 | 2.2 | E | 90 | 4.6 | 7.1 | 3.1
Craniotomy | E | 84 | 6.1 | 5.8 | 4.9 | E | 313 | 6.6 | 5.9 | 5.5
Hip replacement | E | 36 | 0.1 | 1.2 | 0.0 | E | 133 | 0.1 | 1.2 | 0.0

In-Hospital Mortality (%)
Indicator | Q2 Rel. Perf. | Q2 N | Q2 Observed | Q2 Target | Q2 Median | Yr Rel. Perf. | Yr N | Yr Observed | Yr Target | Yr Median
AMI | E | 88 | 8.3 | 7.6 | 5.8 | E | 360 | 8.0 | 7.9 | 7.1
AMI, without transfers | E | 62 | 7.6 | 7.9 | 5.8 | E | 248 | 7.7 | 8.4 | 6.9
HF | E | 137 | 3.0 | 4.3 | 2.9 | E | 527 | 3.3 | 4.3 | 3.1
Pneumonia | E | 125 | 6.0 | 6.2 | 5.5 | E | 492 | 6.0 | 6.0 | 5.7
Acute stroke | E | 83 | 12.6 | 13.5 | 12.8 | E | 316 | 12.8 | 13.2 | 13.5
GI hemorrhage | E | 55 | 2.7 | 6.1 | 2.6 | D | 221 | 2.9 | 6.1 | 2.7
Hip fracture | E | 26 | 3.1 | 3.1 | 1.7 | E | 98 | 2.8 | 3.0 | 2.6

Utilization Rates (%) (no relative performance or targets reported)
Indicator | Q2 N | Q2 Observed | Q2 Median | Yr N | Yr Observed | Yr Median
C-section, all | 527 | 25.8 | 25.7 | 2072 | 26.0 | 25.6
C-section, primary | 452 | 16.6 | 16.3 | 1779 | 17.1 | 16.6
VBAC, all | 93 | 17.5 | 16.9 | 361 | 19.0 | 17.9
VBAC, uncomplicated | 76 | 18.4 | 17.6 | 299 | 19.7 | 18.9

Volumes
Indicator | Q2 Observed N | Q2 Median N | Yr Rel. Perf. | Yr Observed N | Yr Target | Yr Median N
CABG | 75 | 55.5 | D | 302 | 100.0 | 230.0
PTCA | 182 | 124.0 | D | 725 | 200.0 | 448.0
Carotid endarterectomy | 16 | 12.0 | E | 65 | 50.0 | 41.0
AAA repair | 6 | 4.0 | E | 20 | 20.0 | 13.0
Esophageal cancer resection | 3 | 2.0 | D | 8 | 6.0 | 5.5
Pancreatic cancer resection | 6 | 5.0 | D | 18 | 10.0 | 13.5
Pediatric heart surgery | 26 | 16.0 | E | 90 | 90.0 | 48.0

Notes:
A = No Data Available;
B = Substantially Worse than Target;
C = Worse than Target;
D = Better than Target;
E = Meets Target;
* Interpret with caution (less than 25 cases)

3.3.5.2. Example of AHRQ QI use for public reporting

The State of Florida uses the AHRQ QIs in a public reporting tool intended to help consumers choose a hospital. Figure 3.4 captures a segment of a Web page comparing hospitals in Broward County on one of the AHRQ PSIs, postoperative hip fracture (PSI 8). Users can click on a hospital to get more detailed information on quality as well as the hospital's characteristics (teaching status, nonprofit status, etc.) and location.

 

Figure 3.4. Sample AHRQ QI Report Used by the State of Florida for Public Reporting

Figure 3.4 depicts a table from the sample AHRQ QI report used by the State of Florida. The data represented are contained in the following table.

Facility (ID) | City | Comparison to Expected | Risk-Adjusted Rate
Statewide | | | 0.03%
Memorial Regional Hospital (100038) | Hollywood | Higher than Expected | 0.1%
North Broward Medical Center (100086) | Pompano Beach | As Expected | 0.07%
North Ridge Medical Center (100237) | Fort Lauderdale | As Expected | 0.04%
Northwest Medical Center (100189) | Margate | As Expected | 0.0%
Plantation General Hospital (100167) | Plantation | As Expected | 0.0%
University Hospital and Medical Center (100224) | Tamarac | As Expected | 0.0%
Westside Regional Medical Center (100228) | Plantation | As Expected | 0.09%
All copyrights in and to 3M™ APR™ DRGs Classification System and 3M™ APR™ Software are owned by 3M. All rights reserved.

What the Complication/infection rate means:

The percentage rates reported on this page reflect each hospital's unique population and should not be compared between hospitals. Instead, it is strongly recommended that consumers compare hospitals on the basis of whether their rates are "as expected," "lower than expected," or "higher than expected."

The percentage rate is:

  • Lower than Expected—Fewer complications/infections than expected given how sick patients were
  • As Expected—Expected number of complications/infections given how sick patients were
  • Higher than Expected—More complications/infections than expected given how sick patients were

"X" is less than 30 cases; "Too few cases" is less than 5 cases.

3.3.5.3. Example of AHRQ QI use for pay-for-performance

Figure 3.5 is drawn from a report provided to hospitals by an insurer we interviewed. The example extracts one AHRQ PSI (PSI 12), postoperative pulmonary embolism (PE) or deep vein thrombosis (DVT). The report allows hospitals to compare their performance to that of their peers. Good performers earn an incentive payment.

 

Figure 3.5. Sample AHRQ QI Report Used by an Insurance Company for Pay-for-Performance

Figure 3.5 depicts a sample table from an AHRQ QI report used by an insurance company for pay-for-performance. The data represented are contained in the following table.

Indicator 110: Postoperative PE or DVT
Hospital NR | Sig | Cases | At Risk | Rate | APO
2135 | C | 25 | 5294 | 0.47 | 0.68
2020 | C | 8 | 1835 | 0.44 | 0.70
10th Percentile | | | | | 0.79
2101 | A | 18 | 2007 | 0.90 | 0.83
2014 | A | 31 | 3772 | 0.82 | 0.88
2007 | D | 19 | 2263 | 0.84 | 0.92
2094 | D | 21 | 2226 | 0.94 | 0.92
2108 | D | 8 | 1393 | 0.57 | 0.96
2010 | D | 29 | 3048 | 0.95 | 1.01
2107 | D | 36 | 3541 | 1.02 | 1.06
Cohort Average | | | | | 1.12
2099 | D | 18 | 1494 | 1.20 | 1.19
2114 | D | 29 | 1837 | 1.58 | 1.21
2040 | D | 17 | 1860 | 0.91 | 1.22
2058 | D | 22 | 1797 | 1.22 | 1.32
2337 | D | 61 | 3928 | 1.55 | 1.33
2118 | D | 24 | 1636 | 1.47 | 1.37
2075 | B | 29 | 2870 | 1.01 | 1.55
90th Percentile | | | | | 1.55
2149 | B | 60 | 2955 | 2.03 | 1.56
2225 | C | 51 | 2126 | 2.40 | 1.97

Notes: Confidence Interval Designations
A = Significant Difference at 80%
B = Significant Difference at 90%
C = Significant Difference at 95%
D = No Significant Difference vs. Cohort Average


d. Indicators were judged to be a "national standard" if they were described that way by any of the study's interviewees.

e. This and all quotes appearing in this report are reconstructions based on interview notes or recordings.

f. We attempted to determine whether vendors' proprietary products included the AHRQ QIs, but since limited information is available from some vendors, some mistaken attribution is possible. There are also other vendors with similar quality measurement products that do not include the AHRQ QIs, but they were not included in our study.

g. Due to the methods used to identify users, the scan is likely to have significantly undercounted the number of organizations (especially hospitals and hospital associations) using the AHRQ QIs for internal quality improvement activities, since this type of use rarely results in publicly available information that could be used to identify the user in an environmental scan.

h. For example, the European Commission has recently ceded its activities in quality indicator development to the OECD to avoid duplication and is funding part of the HCQI Project.

