This is the accessible text file for GAO report number GAO-07-320 entitled 'Hospital Quality Data: HHS Should Specify Steps and Time Frame for Using Information Technology to Collect and Submit Data' which was released on May 7, 2007. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. Report to the Committee on Finance, U.S. Senate: United States Government Accountability Office: GAO: April 2007: Hospital Quality Data: HHS Should Specify Steps and Time Frame for Using Information Technology to Collect and Submit Data: GAO-07-320: GAO Highlights: Highlights of GAO-07-320, a report to the Committee on Finance, U.S. Senate Why GAO Did This Study: Hospitals submit data in electronic form on a series of quality measures to the Centers for Medicare & Medicaid Services (CMS) and receive scores on their performance. 
Increasingly, the clinical information from which hospitals derive the quality data for CMS is stored in information technology (IT) systems. GAO was asked to examine (1) hospital processes to collect and submit quality data, (2) the extent to which IT facilitates hospitals’ collection and submission of quality data, and (3) whether CMS has taken steps to promote the use of IT systems to facilitate the collection and submission of hospital quality data. GAO addressed these issues by conducting case studies of eight hospitals with varying levels of IT development and interviewing relevant officials at CMS and the Department of Health and Human Services (HHS). What GAO Found: The eight case study hospitals used six steps to collect and submit quality data: (1) identify the patients, (2) locate information in their medical records, (3) determine appropriate values for the data elements, (4) transmit the quality data to CMS, (5) ensure that the quality data have been accepted by CMS, and (6) supply copies of selected medical records to CMS to validate the data. Several factors account for the complexity of abstracting all relevant information in a patient’s medical record, including the content and organization of the medical record, the scope of information and the clinical judgment required for the data elements, and frequent changes by CMS in its data specifications. Due in part to these complexities, most of the case study hospitals relied on clinical staff to abstract the quality data. Increases in the number of quality measures required by CMS led to increased demands on clinical staff resources. Offsetting the demands placed on clinical staff were the benefits that case study hospitals reported finding in the quality data, such as providing feedback to clinicians and reports to hospital administrators. 
GAO’s case studies showed that existing IT systems can help hospitals gather some quality data but are far from enabling hospitals to automate the abstraction process. IT systems helped hospital staff to abstract information from patients’ medical records, in particular by improving accessibility to and legibility of the medical record. The limitations reported by officials in the case study hospitals included having a mix of paper and electronic records, which required staff to check multiple places to get the needed information; the prevalence of data recorded as unstructured narrative or text, which made locating the information time-consuming because it was not in a prescribed place in the record; and the inability of some IT systems to access related data stored in another IT system in the same hospital, which required staff to access each IT system separately to obtain related pieces of information. Hospital officials expected the scope and functionality of their IT systems to increase over time, but this process will occur over a period of years. CMS has sponsored studies and joined HHS initiatives to examine and promote the current and potential use of hospital IT systems to facilitate the collection and submission of quality data, but HHS lacks detailed plans, including milestones and a time frame against which to track its progress. CMS has joined efforts by HHS to promote the use of IT in health care, including a Quality Workgroup charged with specifying how IT could capture, aggregate, and report inpatient and outpatient quality data. HHS plans to expand the use of health IT for quality data collection and submission through contracts with nongovernmental entities that currently address the use of health IT for a range of other purposes. 
However, HHS has identified no detailed plans, milestones, or time frames for either its broad effort to encourage IT in health care nationwide or its specific objective to promote the use of health IT for quality data collection.

What GAO Recommends:

GAO recommends that the Secretary of HHS identify the specific steps the department plans to take to promote the use of health IT for the collection and submission of data for CMS's hospital quality measures and inform interested parties about those steps, the expected time frame, and associated milestones. In commenting on a draft of this report on behalf of HHS, CMS concurred with these recommendations.

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-320].

To view the full product, including the scope and methodology, click on the link above. For more information, contact Cynthia A. Bascetta, (202) 512-7101 or BascettaC@gao.gov.

[End of section]

Contents:

Letter:

Results in Brief:
Background:
Hospitals Use Six Basic Steps to Collect and Submit Quality Data, Two of Which Involve Complex Abstraction by Hospital Staff:
Existing IT Systems Can Help Hospitals Gather Some Quality Data but Are Far from Enabling Automated Abstraction:
CMS Sponsored Studies and Joined Broader HHS Initiatives to Promote Use of IT for Quality Data Collection and Submission, but HHS Lacks Detailed Plans, Milestones, and Time Frame:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:

Appendix I: Medicare Quality Measures Required for Full Annual Payment Update:
Appendix II: Data Elements Used to Calculate Hospital Performance on a Heart Attack Quality Measure:
Appendix III: Tables on Eight Case Study Hospitals:
Appendix IV: Scope and Methodology:
Appendix V: Comments from the Centers for Medicare & Medicaid Services:
Appendix VI: GAO Contact and Staff Acknowledgments:

Tables:

Table 1: Case Study Hospital Characteristics:
Table 2: How Case Study Hospital Officials Described the Steps Taken to Complete
Quality Data Collection and Submission:
Table 3: Resources Used for Abstraction and Data Submission at Eight Case Study Hospitals:
Table 4: Electronic and Paper Records at Eight Case Study Hospitals:

Figures:

Figure 1: Six Basic Steps for Hospitals Collecting and Submitting Quality Data:
Figure 2: Example of the Process for Locating and Assessing Clinical Information to Determine the Appropriate Value for One Data Element:
Figure 3: Data Elements Used to Calculate Hospital Performance on the Heart Attack Quality Measure That Asks Whether a Beta Blocker Was Given When the Patient Arrived at the Hospital:

Abbreviations:

ACEI: angiotensin-converting enzyme inhibitor:
AHIC: American Health Information Community:
AHIMA: American Health Information Management Association:
AHRQ: Agency for Healthcare Research and Quality:
Alliance: National Alliance for Health Information Technology:
AMI: acute myocardial infarction:
APU: Annual Payment Update:
ARB: angiotensin receptor blocker:
CART: CMS Abstraction & Reporting Tool:
CCHIT: Certification Commission for Health Information Technology:
CHI: Consolidated Health Informatics:
CMS: Centers for Medicare & Medicaid Services:
CPOE: computerized physician order entry:
DICOM: Digital Imaging and Communications in Medicine:
DRA: Deficit Reduction Act of 2005:
FTE: full-time equivalent:
H&P: history and physical:
HCAHPS: Hospital Consumer Assessment of Healthcare Providers and Systems:
HHS: Department of Health and Human Services:
HIMSS: Healthcare Information and Management Systems Society:
HITSP: Healthcare Information Technology Standards Panel:
ICD-9: International Classification of Diseases, Ninth Revision:
IFMC: Iowa Foundation for Medical Care:
IT: information technology:
JCAHO: Joint Commission on Accreditation of Healthcare Organizations:
LOINC: Logical Observation Identifiers Names and Codes:
LPN: licensed practical nurse:
LVSD: left ventricular systolic dysfunction:
MAR: medication administration record:
MMA:
Medicare Prescription Drug, Improvement, and Modernization Act:
MSA: metropolitan statistical area:
NCPDP: National Council for Prescription Drug Programs:
ONC: Office of the National Coordinator for Health Information Technology:
POS: provider of services:
QIO: quality improvement organization:
RN: registered nurse:
SNOMED-CT: Systematized Nomenclature of Medicine Clinical Terms:

United States Government Accountability Office:
Washington, DC 20548:

April 25, 2007:

The Honorable Max Baucus:
Chairman:
The Honorable Charles E. Grassley:
Ranking Minority Member:
Committee on Finance:
United States Senate:

The Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA) created a financial incentive for hospitals to submit to the Centers for Medicare & Medicaid Services (CMS) data that are used to calculate hospital performance on measures of the quality of care provided.[Footnote 1] CMS established the Annual Payment Update (APU) program[Footnote 2] to implement that incentive. The APU program requires that participating hospitals submit these quality data[Footnote 3] on a quarterly basis in order to avoid a reduction in their full Medicare payment update each fiscal year.[Footnote 4] Although the APU program was originally set to expire in 2007, the Deficit Reduction Act of 2005[Footnote 5] (DRA) made the APU program permanent. The act also raised the reduction[Footnote 6] and required the Secretary of Health and Human Services (HHS) to increase the number of measures for which hospitals participating in the APU program would have to provide data in order to receive their full Medicare payment update.[Footnote 7] CMS plans to continue expanding the number of required measures in future years.[Footnote 8] Furthermore, DRA directed the Secretary to develop a plan to implement a value-based purchasing program for Medicare that, beginning in fiscal year 2009, would adjust payments to hospitals based on factors related to the quality of care they provide.
Such pay-for-performance programs are intended to strengthen the financial incentives for hospitals to invest in quality improvement efforts. Each quality measure consists of a set of standardized data elements, which define the specific data that hospitals need to submit to CMS. Hospitals determine a value for each data element of a measure for patients--Medicare and non-Medicare--who have a medical condition covered by the APU program, that is, heart attack, heart failure, pneumonia, or surgery. The values for the data elements consist of numerical data and other administrative and clinical information that are obtained from the medical records of the patients.[Footnote 9] For example, there are 8 required quality measures for the heart attack condition, one of which is whether a beta blocker was given to the patient upon arrival at the hospital.[Footnote 10] This single measure, in turn, consists of 11 data elements, including administrative data elements, such as the patient's date of arrival at the hospital, and clinical data elements, such as whether the patient received a beta blocker within 24 hours after hospital arrival (see app. II). The values entered for data elements are used to calculate hospital performance on the 21 quality measures that are in effect as of fiscal year 2007. For a hospital submitting data on all 21 measures, CMS receives values for a total of 73 unique data elements. For heart attack measures alone, the 8 measures utilize 35 of the 73 data elements. (Some data elements are used in more than 1 measure. See app. I for the number of data elements required for each measure.) Hospitals submit their quality data electronically, over the Internet, to a clinical data warehouse operated by a CMS contractor. Increasingly, the information in patients' medical records that provides the basis for hospital quality data submissions may be stored and accessed in electronic form in information technology (IT) systems. 
Currently, many hospitals record and store such clinical information on patients in a combination of paper and electronic systems. Over time, hospitals have added new health IT systems to expand the amount of information that is stored electronically. In 2005, the Secretary of HHS established the American Health Information Community (AHIC) to advance the adoption of electronic health records, after the President called in 2004 for the widespread adoption of interoperable electronic health records within 10 years and appointed a National Coordinator for Health Information Technology to promote that goal. On August 12, 2005, CMS issued a regulation for the APU program that stated a goal of facilitating the use of health IT by hospitals to make it easier for them to collect the quality data from the medical record and submit them to CMS.[Footnote 11] In the preamble to the regulation, CMS said that it intended to begin working toward modifying its requirements and mechanisms for accepting quality data to allow hospitals to transfer their data directly from hospital IT systems without having to first transfer the data into specially formatted files as is currently required. Because the vast majority of acute care hospitals treating Medicare patients choose to submit quality data each quarter to CMS, rather than accept a reduced annual payment update, you asked us to examine (1) how hospitals collect and submit quality data for the Medicare hospital quality measures, (2) the extent to which IT facilitates hospitals' collection and submission of quality data for the Medicare hospital quality measures, and (3) whether CMS has taken steps to promote the development and use of IT systems that could facilitate the collection and submission of hospital quality data. 
To assess how hospitals collect and submit quality data, we conducted case studies of eight individual acute care hospitals to obtain information about the processes they used to collect and submit the data.[Footnote 12] The hospitals varied on a number of standard hospital characteristics, including size, urban/rural location, and teaching status (see app. III, table 1). We visited each case study hospital, and we interviewed the individuals responsible for collecting and submitting the quality data to CMS, managers of the hospital's quality department, and hospital administrators. To assess the extent to which IT facilitates hospitals' collection and submission of quality data, we selected the case study hospitals to include both hospitals with relatively well-developed IT systems that supported electronic patient records and hospitals with less-developed levels of IT, based on screening interviews done at the time we selected the case study hospitals.[Footnote 13] During our site visits, we also interviewed IT staff involved in the process of collecting and submitting the quality data. To assess whether CMS has taken steps to promote the development and use of IT systems that could facilitate the collection and submission of hospital quality data, we reviewed relevant federal regulations, reports, and related documents and interviewed CMS officials and CMS contractors, as well as officials in HHS's Office of the National Coordinator for Health Information Technology (ONC). Because our evidence is limited to the eight case studies, it does not offer a basis for relating any differences we observed among these particular hospitals to their differences on specific dimensions, such as size or teaching status. Nor can we generalize from the group of eight as a whole to acute care hospitals across the country. 
Where appropriate, we obtained relevant information about these hospitals from CMS documents and databases; however, most of our information for these case studies was reported by hospital officials.[Footnote 14] Furthermore, although we examined the processes hospitals used to collect and submit quality data and the role that IT plays in that process, we did not examine general IT adoption in the hospital industry. We conducted our work from February 2006 to April 2007 in accordance with generally accepted government auditing standards. For a complete description of our methodology, see appendix IV. Results in Brief: The case study hospitals we visited used six steps to collect and submit quality data, two of which (steps 2 and 3) involved complex abstraction--the process of reviewing and assessing all relevant pieces of information in a patient's medical record to determine the appropriate value for each data element. Whether that patient information was recorded electronically, on paper, or as a mix of both, the six steps were (1) identify the patients, (2) locate information in their medical records, (3) determine appropriate values for the data elements, (4) transmit the quality data to CMS, (5) ensure that the quality data have been accepted by CMS, and (6) supply copies of selected medical records to CMS to validate the data. Several factors account for the complexity of the abstraction process (steps 2 and 3), including the content and organization of the medical record, the scope of information and clinical judgment required for the data elements, and frequent changes by CMS in its data specifications. Due in part to these complexities, most of our case study hospitals relied on clinical staff to abstract the quality data. Increases in the number of quality measures required by CMS led to increased demands on clinical staff resources. 
Offsetting the demands placed on clinical staff were the benefits that case study hospitals reported finding in the quality data. For example, all the hospitals reported having a process in place to track changes in their performance over time and provide feedback to clinicians and reports to hospital administrators and trustees. Our case studies showed that existing IT systems can help hospitals gather some quality data but are far from enabling hospitals to automate the abstraction process. IT systems helped hospital staff abstract information from patients' medical records, in particular by improving accessibility to and legibility of the medical record and by enabling hospitals to incorporate CMS's required data elements into the medical record. The limitations reported by officials in the case study hospitals included having a mix of paper and electronic records, which required staff to check multiple places to get the needed information; the prevalence of data recorded as unstructured narrative or text, which made locating the information time-consuming because it was not in a prescribed place in the record; and the inability of some IT systems to access related data stored in another IT system in the same hospital, which required hospital staff to access each IT system separately to obtain related pieces of information. While hospital officials expected the scope and functionality of their IT systems to increase over time, they projected that this process would occur incrementally over a period of years. CMS has sponsored studies and joined HHS initiatives to examine and promote the current and potential use of hospital IT systems to facilitate the collection and submission of quality data, but HHS lacks detailed plans, including milestones and a time frame against which to track its progress. CMS sponsored two studies that examined the use of hospital IT systems for quality data collection and submission. 
Promoting the use of health IT for quality data collection is also 1 of 14 objectives that HHS has identified in its broader effort to encourage the development and nationwide implementation of interoperable IT in health care. CMS has joined this broader effort by HHS, as well as the Quality Workgroup that AHIC created in August 2006 to specify how IT could capture, aggregate, and report inpatient and outpatient quality data. Through its representation in AHIC and the Quality Workgroup, CMS has participated in decisions about the specific focus areas to be examined through contracts with nongovernmental entities. These contracts currently address the use of health IT for a range of purposes, which may also include quality data collection and submission in the near future. However, HHS has identified no detailed plans, milestones, or time frames for either its broad effort to encourage IT in health care nationwide or its specific objective to promote the use of health IT for quality data collection. To support the expansion of quality measures for the APU program, we recommend that the Secretary of HHS identify the specific steps that the department plans to take to promote the use of health IT for the collection and submission of data for CMS's hospital quality measures and inform interested parties on those steps and the expected time frame, including milestones for completing them. In commenting on a draft of this report on behalf of HHS, CMS expressed its appreciation of our thorough analysis of the processes that hospitals use to report quality data and the role that IT systems can play in that reporting, and it concurred with our two recommendations. Background: The quality data submitted by hospitals are collected from the medical records of patients admitted to the hospital. Hospital patient medical records contain many different types of information, which are organized into different sections. 
Frequently found examples of these sections include:

* the face sheet, which summarizes basic demographic and billing data, including diagnostic codes;

* history and physicals (H&P), which record both patient medical history and physician assessments;

* physician orders, which show what medications, tests, and procedures were ordered by a physician;

* medication administration records (MAR), which show that a specific medication was given to a patient, when it was given, and the dosage;

* laboratory reports, radiology reports, and test results, such as an echocardiogram reading;

* progress notes, in which physicians, nurses, and other clinicians record information chronologically on patient status and response to treatments during the patient's hospital stay;

* operative reports for surgery patients;

* physician and nursing notes for patients treated in the emergency department; and

* discharge summaries, in which a physician summarizes the patient's hospital stay and records prescriptions and instructions to be given to the patient at discharge.

Hospitals have discretion to determine the structure of their patient medical records, as well as to set general policies stating what, where, and how specific information should be recorded by clinicians. To guide the hospital staff in the abstraction process--that is, in finding and properly assessing the information in the patient's medical record needed to fill in the values for the data elements--CMS and the Joint Commission[Footnote 15] have jointly issued a Specifications Manual.[Footnote 16] It contains detailed specifications that define the data elements for which hospital staff need to collect information and determine values, as well as the correct interpretation of those data elements. The Joint Commission also requires hospitals to submit the same data that they submit to CMS for the APU program (and some additional data) to receive Joint Commission accreditation.
In many hospitals, information in a patient's medical record is recorded and stored in a combination of paper and electronic systems. Patient medical records that clinicians record on paper may be stored in a folder in the hospital's medical record department and contain all the different forms, reports, and notes prepared by different individuals or by different departments during the patient's stay. Depending on the length of the patient's hospital stay and the complexity of the care, an individual patient medical record can amount to hundreds of pages.[Footnote 17] For information stored electronically, clinicians may enter information directly into the electronic record themselves, as they do for paper records, or they may dictate their notes to be transcribed and added to the electronic record later. Information may also be recorded on paper and then scanned into the patient's electronic record. For example, if a patient is transferred from another hospital, the paper documents from the transferring hospital may be scanned into the patient's electronic record. The patient medical information that hospitals store electronically, rather than on paper, typically resides in multiple health IT systems. One set of IT systems usually handles administrative tasks such as patient registration and billing. Hospitals acquire other IT systems to record laboratory test results, to store digital radiological images, to process physician orders for medications, and to record notes written by physicians and nurses. Hospitals frequently build their health IT capabilities incrementally by adding new health IT systems over time.[Footnote 18] If the systems that hospitals purchase come from different companies, they are likely to be based on varying standards for how the information is stored and exchanged electronically. As a result, even in a single hospital, it can be difficult to access from one IT system clinical data stored in a different health IT system. 
One of the main objectives of ONC is to overcome the problem of multiple health IT systems, within and across health care providers, that store and exchange information according to varying standards. The mission of ONC is to promote the development and nationwide implementation of interoperable health IT in both the public and the private sectors in order to reduce medical errors, improve quality of care, and enhance the efficiency of health care.[Footnote 19] Health IT is interoperable when systems are able to exchange data accurately, effectively, securely, and consistently with different IT systems, software applications, and networks in such a way that the clinical or operational purposes and meaning of the data are preserved and unaltered. Hospitals Use Six Basic Steps to Collect and Submit Quality Data, Two of Which Involve Complex Abstraction by Hospital Staff: The case study hospitals we visited used six steps to collect and submit quality data, two of which involved complex abstraction--the process of reviewing and assessing all relevant pieces of information in a patient's medical record to determine the appropriate value for each data element. Factors accounting for the complexity of the abstraction process included the content and organization of the medical record, the scope of information required for the data elements, and frequent changes by CMS in its data specifications. Due in part to these complexities, most of our case study hospitals relied on clinical staff to abstract the quality data. Increases in the number of required quality measures led to increased demands on clinical staff resources. However, all case study hospitals reported finding benefits in the quality data that helped to offset the demands placed on clinical staff. 
Hospitals Collect and Submit Quality Data by Completing Six Basic Steps:

We found that whether patient information was recorded electronically, on paper, or as a mix of both, all the case study hospitals collected and submitted their quality data by carrying out six sequential steps (see fig. 1). These steps started with identifying the patients for whom the hospitals needed to provide quality data to CMS and continued through the process of examining each patient's medical record, one after the other, to find the information needed to determine the appropriate values for each of the required data elements for that patient. Then, for each patient, those values were entered by computer into an electronic form or template listing each of the data elements for that condition. These forms were provided by the data vendor with which the hospital had contracted to transmit its quality data to CMS. The vendors also assisted the hospitals in checking that the data were successfully received by CMS. Finally, the hospitals sent copies of the medical records of a selected sample of patients to a CMS contractor that used those records to validate the accuracy of the quality data submitted by the hospital.

Figure 1: Six Basic Steps for Hospitals Collecting and Submitting Quality Data:

[See PDF for image]

Source: GAO.

Note: Patient information may be obtained from either electronic or paper records.

[End of figure]

Specifically, the six steps, which are summarized for each case study hospital in appendix III, table 2, were as follows:

Step 1: Identify patients--The first step was to identify the patients for whom the hospitals needed to submit quality data to CMS.
Staff at three case study hospitals identified these patients using information on the patient's principal diagnosis, or principal procedure in the case of surgery patients, obtained from the hospital's billing data.[Footnote 20] Five case study hospitals had their data vendor use the hospital's billing data to identify the eligible patients for them. Every month, all eight hospitals that we visited identified patients discharged in the prior month for whom quality data should be collected. The hospitals identified all patients retrospectively for quality data collection because hospitals have to wait until a patient is discharged to determine the principal diagnosis.[Footnote 21] CMS permits hospitals to reduce their data collection effort by providing quality data for a representative sample of patients when the total number of patients treated for a particular condition exceeds a certain threshold.[Footnote 22] Five case study hospitals drew samples for at least one condition. The data vendor performed this task for four of those case study hospitals, and assisted the hospital in performing this task for the fifth hospital. Only one of the case study hospitals reported using nonbilling data sources to check the accuracy of the lists of patients selected for quality data collection that the hospitals drew from their billing data (see app. III, table 3). Several stated that they occasionally noted discrepancies, such as patients selected for heart attack measures who, upon review of their medical record, should not have had that as their principal diagnosis. However, the hospital officials we interviewed told us that discrepancies of this sort were likely to be minor. Officials at three hospitals noted that hospitals generally have periodic routine audits conducted of the coding practices of their medical records departments, which would include the accuracy of the principal diagnoses and procedures. 
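The sampling decision described above can be sketched in a few lines. The threshold and sample size used here are illustrative placeholders only; the actual per-condition sampling rules are set by CMS (see footnote 22) and are not reproduced in this sketch.

```python
import random

def select_patients(discharges, threshold, sample_size, seed=None):
    """Select patients for quality data abstraction for one condition.

    If the number of discharges is at or below the threshold, all
    patients are selected; otherwise a simple random sample is drawn.
    The threshold and sample_size are hypothetical values, not CMS's
    actual sampling rules.
    """
    if len(discharges) <= threshold:
        return list(discharges)
    rng = random.Random(seed)
    return rng.sample(list(discharges), sample_size)

# Hypothetical month: 120 heart failure discharges, placeholder threshold of 75.
discharges = ["patient-%03d" % i for i in range(120)]
selected = select_patients(discharges, threshold=75, sample_size=75, seed=1)
print(len(selected))  # 75
```

In practice this step was often delegated to the hospital's data vendor, which applied the sampling rules to the patient lists drawn from billing data.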
Step 2: Locate information in the medical record--Steps 2 and 3 were in practice closely linked in our case study hospitals. Abstractors[Footnote 23] at the eight case study hospitals examined each selected patient's medical record, looking for all of the discrete pieces of information that, taken together, would determine what they would decide--in step 3--was the correct value for each of the data elements. For some data elements, there was a one-to-one correspondence between the piece of information in the medical record and the value to be entered. Typical examples included a patient's date of birth and the name of a medication administered to the patient. For other data elements, the abstractors had to check for the presence or absence of multiple pieces of information in different parts of the medical record to determine the correct value for that data element. For example, to determine if the patient did, or did not, have a contraindication for aspirin, abstractors looked in different parts of the medical record for potential contraindications, such as the presence of internal bleeding, allergies, or prescriptions for certain other medications such as Coumadin.[Footnote 24] In order for abstractors to find information in the patient's medical record, it had to be recorded properly by the clinicians providing the patient's care. Officials at all eight case study hospitals described efforts designed to educate physicians and nurses about the specific data elements for which they needed to provide information in each patient's medical record. The hospital officials were particularly concerned that the clinicians not undermine the hospital's performance on the quality measures by inadequately documenting what they had done and the reasons why. 
For example, one heart failure measure tracks whether a patient received each of six specific instructions at the time of discharge, but unless information was explicitly recorded in a heart failure patient's medical record for each of the six data elements, that patient was counted by CMS as one who had not received all pertinent discharge instructions and therefore did not meet that quality measure.[Footnote 25] This particular measure was cited by officials at several hospitals as one that required a higher level of documentation than had previously been the norm at their hospital. Step 3: Determine appropriate data element values--Once abstractors had located all the relevant pieces of information pertaining to a given data element, they had to put those pieces together to arrive at the appropriate value for the data element. The relevance of that information was defined by the detailed instructions provided by the hospitals' vendors, as well as the Specifications Manual jointly issued by CMS and the Joint Commission that serves as the basis for the vendor instructions. The Specifications Manual sets out the decision rules for choosing among the allowable values for each data element. It also identifies which parts of the patient's medical record may or may not provide the required information, and often lists specific terms or descriptions that, if recorded in the patient's medical record, would indicate the appropriate value for a given data element. In addition, the Specifications Manual provides abstractors with guidance on how to interpret conflicting information in the medical record, such as a note from one clinician that the patient is not a smoker and a note elsewhere in the record from another clinician that the patient does smoke. To help keep track of multiple pieces of information, many abstractors reported that they first filled in the data element values on a paper copy of the abstraction form provided by the data vendor. 
In this way, they could write notes in the margin to document how they came to their conclusions. Step 4: Transmit data to CMS--In order for the quality data to be accepted by the clinical data warehouse, they must pass a battery of edit checks that look for missing, invalid, or improperly formatted data element entries.[Footnote 26] All the case study hospitals contracted with data vendors to submit their quality data to CMS. They did so, in part, because all of the hospitals submitted the same data to the Joint Commission, and it requires hospitals to submit their quality data through data vendors that meet the Joint Commission's requirements. The additional cost to the hospitals to have the data vendors also submit their quality data to CMS was generally minimal (see app. III, table 3). All of the case study hospitals submitted their data to the data vendor by filling in values for the required data elements on an electronic version of the vendor's abstraction form.[Footnote 27] Many abstractors did this for a batch of patient records at a time, working from paper copies of the form that they had filled in previously. Some abstractors entered the data online at the same time that they reviewed the patient's medical records. In other cases, someone other than the abstractor who filled in the paper form used the completed form to enter the data on a computer. Step 5: Ensure data have been accepted by CMS--The case study hospitals varied in the extent to which they actively monitored the acceptance of their quality data into CMS's clinical data warehouse. After the data vendors submitted the quality data electronically, they and the hospitals could download reports from the clinical data warehouse indicating whether the submitted data had passed the screening edits for proper formatting and valid entries. The hospitals could use these reports to detect data entry errors and make corrections prior to CMS's data submission deadline. 
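The screening edits that submitted data must pass can be sketched as simple per-element validation. In this hedged illustration, the element names, allowable values, and date format are assumptions for the sketch, not CMS's actual edit specifications.

```python
from datetime import datetime

def _is_date(value):
    """True if the value parses as an MM-DD-YYYY date (assumed format)."""
    try:
        datetime.strptime(value, "%m-%d-%Y")
        return True
    except (TypeError, ValueError):
        return False

# Illustrative subset of required data elements and their validity rules.
REQUIRED_ELEMENTS = {
    "birth_date": _is_date,
    "smoker": lambda v: v in ("Y", "N"),
    "aspirin_contraindication": lambda v: v in ("Y", "N"),
}

def edit_check(submission):
    """Return error messages for missing, invalid, or improperly
    formatted entries; an empty list means the record passes."""
    errors = []
    for element, is_valid in REQUIRED_ELEMENTS.items():
        if element not in submission:
            errors.append("missing: " + element)
        elif not is_valid(submission[element]):
            errors.append("invalid: " + element)
    return errors
```

A report of the kind the hospitals and vendors downloaded from the clinical data warehouse would flag any record for which such a check returned errors, allowing corrections before the submission deadline.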
Three case study hospitals shared this task with their data vendors, three hospitals left it for their data vendors to handle, and two hospitals received and responded to reports on data edit checks produced by their data vendors, rather than reviewing the CMS reports. Approximately 2 months after hospitals submitted their quality data, CMS released reports to the hospitals showing their performance scores on the quality measures before posting the results on its public Web site. Step 6: Supply copies of selected medical records--CMS has put in place a data validation process to ensure the accuracy of hospital quality data submissions. It requires hospitals to supply a CMS contractor with paper copies of the complete medical record for five patients selected by CMS each quarter.[Footnote 28] Officials at five hospitals noted that they check to make sure that all parts of the medical records that they used to abstract the data originally are included in the package shipped to the CMS contractor. Most of the case study hospitals relied on CMS's data validation to ensure the accuracy of their abstractions. However, two hospitals reported that they also routinely draw their own sample of cases, which are abstracted a second time by a different abstractor in the hospital, followed by a comparison of the two sets of results (see app. III, table 3). Two Most Complex Steps Were Locating Relevant Clinical Information and Determining Appropriate Values for Data Elements: The description by hospital officials of the processes they used to collect and submit quality data indicated that locating the relevant clinical information and determining appropriate values for the data elements (steps 2 and 3) were the most complex steps of the six identified, due to several factors. These included the content and organization of the medical record, the scope of the information encompassed by the data elements, and frequent changes in data specifications. 
The first complicating factor related to the medical record was that the information abstractors needed to determine the correct data element values for a given patient was generally located in many different sections of the patient's medical record. These included documents completed for admission to the hospital, emergency department documents, laboratory and test results, operating room notes, medication administration records, nursing notes, and physician- generated documents such as history and physicals, progress notes and consults, orders for medications and tests, and discharge summaries. In addition, the abstractors may have had to look at documents that came from other providers if the patient was transferred to the hospital. Much of the clinical information needed was found in the sections of the medical record prepared by clinicians. Often the information in question, such as contraindications for aspirin or beta blockers, could be found in any of a number of places in the medical record where clinicians made entries. As a result, abstractors frequently had to read through multiple parts of the record to find the information needed to determine the correct value for just one data element. At two case study hospitals, abstractors said that they routinely read each patient's entire medical record. Experienced abstractors often knew where they were most likely to find particular pieces of information. They nevertheless also had to check for potentially contradictory information in different parts of the medical record. For example, as noted, patients may have provided varying responses about their smoking history to different clinicians. If any of these responses indicated that the patient had smoked cigarettes in the last 12 months, the patient was considered to be a smoker according to CMS's data specifications. 
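The smoking-history rule just described reduces to an aggregation over every place in the record where a clinician noted smoking status. In this sketch the record structure is an assumption made for illustration:

```python
def is_smoker(smoking_entries):
    """smoking_entries: (section, smoked_in_last_12_months) pairs
    gathered from each part of the medical record where a clinician
    recorded smoking history; None means no entry was made there.
    Per the rule described above, any affirmative entry makes the
    patient a smoker, even if other entries conflict."""
    return any(smoked is True for _section, smoked in smoking_entries)
```

This is why abstractors had to check every part of the record rather than stop at the first entry: a single affirmative response anywhere determines the value.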
Another example concerns the possibility that a heart attack or heart failure patient may have had multiple echocardiogram results recorded in different parts of the medical record. Abstractors needed to find all such results in order to apply the rules stated in the Specifications Manual for identifying which result to use in deciding whether the patient had left ventricular systolic dysfunction (LVSD). This data element is used for the quality measure assessing whether an angiotensin-converting enzyme inhibitor (ACEI) or angiotensin receptor blocker (ARB) was prescribed for LVSD at discharge.[Footnote 29] The second factor was related to the scope of the information required for certain data elements. Some of the data elements that the abstractors had to fill in represented a composite of related data and clinical judgment applied by the abstractor, not just a single discrete piece of information. Such composite data elements typically were governed by complicated rules for determining the clinical appropriateness of a specific treatment for a given patient. For example, the data element for contraindications for both ACEIs and ARBs at discharge requires abstractors to check for the presence and assess the severity of any of a range of clinical conditions that would make the use of either ACEIs or ARBs inappropriate for that patient.[Footnote 30] (See fig. 2.) These conditions may appear at any time during the patient's hospital stay and so could appear at any of several places in the medical record. Abstractors must also look for evidence in the record from a physician[Footnote 31] linking a decision not to prescribe these drugs to one or more of those conditions. Figure 2: Example of the Process for Locating and Assessing Clinical Information to Determine the Appropriate Value for One Data Element: [See PDF for image] Source: GAO, CMS. 
Note: In this illustrative case, adapted from CMS training materials, an abstractor would find that the patient was given an ACEI, Zestril, in the emergency department (see MAR, 01/30), but because of its apparent effect on the patient's pulse and blood pressure (see Progress Notes, 01/31), it was not continued during the hospital stay (see Progress Notes, 02/03) and no ACEI was prescribed at discharge (see Discharge Summary). However, there is no mention in the patient's record of ARBs or aortic stenosis. The arrows point to some of the key pieces of information an abstractor would take note of in determining that the appropriate value for this data element was "N" for "no." [End of figure] The third factor was the need for abstractors at the case study hospitals to adjust to frequent changes in the data specifications set by CMS. Since CMS first released its detailed data specifications jointly with the Joint Commission in September 2004, it has issued seven new versions of the Specifications Manual.[Footnote 32] Therefore, from fall 2004 through summer 2006, roughly every 3 months hospital abstractors had to stop and take note of what had changed in the data specifications and revamp their quality data collection procedures accordingly. Some of these changes reflected modifications in the quality measures themselves, such as the addition of ARBs for treatment of LVSD. Other changes revised or expanded the guidance provided to abstractors, often in response to questions submitted by hospitals to CMS. CMS recently changed its schedule for issuing revisions to its data specifications from every 3 months to every 6 months, but that change had not yet affected the interval between new revisions issued to hospitals at the time of our case study site visits. Clinical Staff Abstract Quality Data at Most Hospitals: Case study hospitals typically used registered nurses (RN), often exclusively, to abstract quality data for the CMS quality measures (see app.
III, table 3). One hospital relied on a highly experienced licensed practical nurse, and two case study hospitals used a mix of RNs and nonclinical staff. Officials at one hospital noted that RNs were familiar with both the nomenclature and the structure of the hospital's medical records and could more readily discuss documentation issues with the physicians and nurses providing the care. Even when using RNs, all but three of the case study hospitals had each abstractor focus on one or two medical conditions in which they had expertise. Four hospitals had tried using nonclinical staff, most often trained as medical record coders, to abstract the quality data. Officials at one of these hospitals reported that this approach posed challenges. They said that it was difficult for nonclinical staff to learn all that they needed to know to abstract quality data effectively, especially with the constant changes being made to the data specifications. At the second hospital, officials reported that using nonclinical staff for abstraction did not work at all and they switched to using clinically trained staff. At the third hospital, the chief clinician leading the quality team stated that the hospital's nonclinical abstractors worked well enough when clinically trained colleagues were available to answer their questions. Officials at the fourth hospital cited no concerns about using staff who were not RNs to abstract quality data, but they subsequently hired an RN to abstract patient records for two of the four conditions. Case study hospitals drew on a mix of existing and new staff resources to handle the collection and submission of quality data to CMS. In two hospitals, new staff had been hired specifically to collect quality data for the Joint Commission and CMS. In other hospitals, quality data collection was assigned to staff already employed in the hospital's quality management department or performing other functions.
Adding Quality Measures Required a Proportionate Increase in Staff Resources: All the case study hospitals found that, over time, they had to increase the amount of staff resources devoted to abstracting quality data for the CMS quality measures, most notably as the number of measures on which they were submitting data expanded. Officials at the case study hospitals generally reported that the amount of staff time required for abstraction increased proportionately with the number of conditions for which they reported quality data. The surgical quality measures were the set on which all the hospitals had most recently begun to report. They found that the staff hours needed for this new set of quality measures were directly related to the number of patient records to be abstracted and the number of data elements collected. In other words, they found no "economies of scale" as they expanded the scope of quality data abstraction. At the time of our site visits, four hospitals continued to draw on existing staff resources, while others had hired additional staff. Hospital officials estimated that the amount of staff resources devoted to abstracting data for the CMS quality measures ranged from 0.7 to 2.5 full-time equivalents (FTE) (app. III, table 3).[Footnote 33] Hospitals Value and Use Quality Data: Hospital officials reported that the demands that quality data collection and submission placed on their clinical staff resources were offset by the benefits that they derived from the resulting information on their clinical performance. Each hospital had a process for tracking changes in its performance over time. Based on those results, they provided feedback to individual clinicians and reports to hospital administrators and trustees.
Because they perceived feedback to clinicians to be much more effective when provided as soon as possible, several of the case study hospitals found ways to calculate their performance on the quality measures themselves, often on a monthly basis, rather than wait for CMS to report their results for the quarter. Officials at all eight case study hospitals pointed to specific changes they had made in their internal procedures designed to improve their performance on one or more quality measures. Most of the case study hospitals developed "standing order sets" for particular diagnoses. Such order sets provide a mechanism for standardizing both the care provided and the documentation of that care, in such areas as prescribing beta blockers and aspirin on arrival and at discharge for heart attack patients. Another common example involved prompting physicians to administer pneumococcal vaccinations to pneumonia patients. However, at most of the case study hospitals, use of many standing order sets was optional for physicians, and hospital officials reported widely varying rates of physician use, from close to 100 percent of physicians at one hospital using its order set for heart attack patients to just a few physicians using any order sets in another hospital. Case study hospitals also responded to the information generated from their quality data by adjusting their treatment protocols, especially for patients treated in their emergency departments. For example, five hospitals developed or elaborated on procedural checklists for emergency department nurses treating pneumonia patients. The objective of these changes was to more quickly identify pneumonia patients when they arrived at the emergency department and then expeditiously perform required blood tests so that the patients would score positively for the quality measure on receiving antibiotics within 4 hours of arrival at the hospital. 
Three hospitals strengthened their procedures to identify smokers and make sure that they received appropriate counseling. Hospital officials noted that they provided quality of care data to entities other than CMS and the Joint Commission, such as state governments and private insurers, but they generally reported that the CMS quality measures had two advantages over those other data requirements. First, the CMS quality measures enabled hospitals to benchmark their performance against the performance of virtually every other hospital in the country. Second, officials at two hospitals noted that the CMS measures were based on clinical information obtained from patient medical records and therefore had greater validity as measures of quality of care than measures based solely on administrative data.[Footnote 34] Many hospital officials said that they wished that state governments and other entities collecting quality data would accept the CMS quality measures instead of requiring related quality data based on different definitions and patient populations. Hospital officials in two states reported some movement in that direction. Existing IT Systems Can Help Hospitals Gather Some Quality Data but Are Far from Enabling Automated Abstraction: In the case studies, existing IT systems helped hospital abstractors to complete their work more quickly, but the limitations of those IT systems meant that trained staff still had to examine the entire patient medical record and manually abstract the quality data submitted to CMS. IT systems helped abstractors obtain information from patients' medical records, in particular by improving their accessibility and legibility, and by enabling hospitals to incorporate CMS's required data elements into those medical records.
The challenges reported by hospital officials included having a mix of paper and electronic records, which required abstractors to check multiple places to get the needed information; the prevalence of unstructured data, which made locating the information time-consuming because it was not in a prescribed place in the record; and the presence of multiple IT systems that did not share data, which required abstractors to separately access each IT system for related pieces of information that were in different parts of the medical record. While hospital officials expected the scope and functionality of their IT systems to increase over time, they projected that this would occur incrementally over a period of years.[Footnote 35] Existing IT Systems Help Abstractors Obtain Information from Medical Records but Have Notable Limitations: Hospitals found that their existing IT systems could facilitate the collection of quality data, but that there were limits on the advantages that the systems could provide. IT systems, and the electronic records they support, offered hospitals two key benefits: (1) improving accessibility to and legibility of the medical record, and (2) facilitating the incorporation of CMS's required data elements into the medical record. Many hospital abstractors noted that existing electronic records helped quality data collection by improving accessibility and legibility of patient records. In general, paper records were less accessible than electronic records because it took time to find them or to have them transported if hospitals had stored them in a remote location after the patients were discharged. Also, paper records were more likely to be missing or in use by someone else. However, in one case study hospital, an abstractor noted difficulties in gaining access to a computer terminal to view electronic medical records. Many abstractors noted improvements in legibility as a fundamental benefit of electronic records. 
This advantage applied in particular to the many sections of the medical record that consisted of handwritten text, including history and physicals, progress notes, medication administration records, and discharge summaries. Some hospitals had used their existing IT systems to facilitate the abstraction of information by designing a number of discrete data fields that match CMS's data elements. For example, two hospitals incorporated prompts for pneumococcal vaccination in their electronic medication ordering system. These prompts not only reminded physicians to order the vaccination (if the patient was not already vaccinated) but also helped to ensure documentation of the patient's vaccination status. One hospital developed a special electronic discharge program for heart attack and heart failure patients that had data elements for the quality measures built into it. Another hospital built a prompt into its electronically generated discharge instructions to instruct patients to measure their weight daily. This enabled the hospital to document more consistently one of the specific instructions that heart failure patients are supposed to receive on discharge but that physicians and nurses tended to overlook in their documentation. The limitations that hospital officials reported in using existing IT systems to collect quality data stemmed from having a mix of paper and electronic systems; the prevalence of data recorded in IT systems as unstructured paragraphs of narrative or text, as opposed to discrete data fields reserved for specific pieces of information; and the inability of some IT systems to access related data stored on another IT system in the same hospital. Because all but one of the case study hospitals stored clinical records in a mix of paper and electronic systems, abstractors generally had to consult both paper and electronic records to obtain all needed information.
What was recorded on paper and what was recorded electronically varied from hospital to hospital (see app. III, table 4). However, admissions and billing data were electronic at all the case study hospitals. Billing data include principal diagnosis and birth date, which are among the CMS-required data elements. With regard to clinical data, all case study hospitals had test results, such as echocardiogram readings, in an electronic form. In contrast, nurse progress notes were least likely to be in electronic form at the case study hospitals. Moreover, it was not uncommon for a hospital to have the same type of clinical documentation stored partly in electronic form and partly on paper. For example, five of the eight case study hospitals had a mix of paper and electronic physician notes, reflecting the differing personal preferences of the physicians. Discharge summaries and medication administration records, on the other hand, tended to be either paper or electronic at a given hospital. Many of the data in existing IT systems were recorded in unstructured formats--that is, as paragraphs of narrative or other text, rather than in data fields designated to contain specific pieces of information-- which created problems in locating the needed information. For example, physician notes and discharge summaries were often dictated and transcribed. Abstractors typically read through the entire electronic document to make sure that they had found all potentially relevant references, such as for possible contraindications for a beta blocker or an ACEI. By contrast, some of the data in existing IT systems were in structured data fields so that specific information could be found in a prescribed place in the record. One common example was a list of medication allergies, which abstractors used to quickly check for certain drug contraindications. 
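The contrast between structured and unstructured storage can be made concrete. In this sketch the field names and record layout are assumptions for illustration, not any hospital's actual system:

```python
def allergy_check_structured(record, drug):
    """With a discrete data field, the answer is in a prescribed place:
    a direct lookup in the allergy list."""
    return drug.lower() in (a.lower() for a in record["allergy_list"])

def allergy_check_unstructured(record, drug):
    """With free-text narrative, every document must be scanned, and a
    naive keyword search can miss synonyms, negations, and misspellings;
    this is why abstractors read the full text."""
    return any(drug.lower() in note.lower()
               for note in record["narrative_notes"])
```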
However, officials at several hospitals said that developing and implementing structured data fields were labor intensive, both in terms of programming and in terms of educating clinical staff in their use. That is why many of the data stored in electronic records at the case study hospitals remained in unstructured formats. Another limitation with existing IT systems was the inability of some systems to access related data stored on another IT system in the same hospital. This situation affected six of the eight case study hospitals to some degree. For example, one hospital had an IT system in the emergency department and an IT system on the inpatient floors, but the two systems were independent and the information in one was not linked to the information in the other. Abstractors had to access each IT system separately to obtain related pieces of information, which made abstraction more complicated and time-consuming. Existing IT systems helped hospital abstractors to complete their work more quickly, but the limitations of those IT systems meant that, for the most part, the nature of their work remained the same. Existing IT systems enabled abstractors at several hospitals to more quickly locate the clinical information needed to determine the appropriate values for at least some of the data elements that the hospitals submitted to CMS. Where hospitals designed a discrete data field in their IT systems to match a specific CMS data element, abstractors could simply transcribe that value into the data vendor's abstraction form. However, in all the case study hospitals there remained a large number of data elements for which there was no discrete data field in a patient's electronic record that could provide the required value for that data element. 
As a result, trained staff still had to examine the medical record as a whole and manually abstract the quality data submitted to CMS, whether the information in the medical record was recorded electronically or on paper.[Footnote 36] Full Automation of Quality Data Collection Is Not Imminent: All the case study hospitals were working to expand the scope and functionality of their IT systems, but this expansion was generally projected to occur incrementally over a period of years. Hospital officials noted that with wider use of IT systems, the advantages of these systems--including accessibility, legibility, and the use of discrete data fields--would apply to a larger proportion of the clinical records that abstractors have to search. As the case study hospitals continue to bring more of their clinical documentation into IT systems, and to link separate systems within their hospitals so that data in one system can be accessed from another, the time required to collect quality data should decline. However, most officials at the case study hospitals viewed full-scale automation of quality data collection and submission through implementation of IT systems as, at best, a long-term prospect. They pointed to a number of challenges that hospitals would have to overcome before they could use IT systems to achieve full-scale automation of quality data collection and submission. Primary among these were overcoming physician reluctance to use IT systems to record clinical information and the intrinsic complexity of the quality data required by CMS. One hospital with unusually extensive IT systems had initiated a pilot project to see how close it could get to fully automating quality data collection for patients with heart failure.
Drawing to the maximum extent on the data that were amenable to programming, which excluded unstructured physician notes, the hospital found that it could complete data collection for approximately 10 percent of cases without additional manual abstraction. Reflecting on this effort, the hospital official leading this project noted that at least some of the data elements required for heart failure patients represented "clinical judgment calls." An official at another hospital observed that someone had to apply CMS's complex decision rules to determine the appropriate value for the data elements. If a hospital wanted to eliminate the need for an abstractor, who currently makes those decisions retrospectively after weighing multiple pieces of information in the patient's medical record, the same complex decisions would have to be made by the patient's physician at the time of treatment. The official suggested that it was preferable not to ask physicians to take on that additional task when they should be focused on making appropriate treatment decisions. Another barrier to automated quality data collection mentioned by several hospital officials was the frequency of change in the data specifications. As noted above, hospitals had to invest considerable staff resources for programming and staff education to develop structured data fields for the clinical information required for the data elements. Officials at one hospital stated that it would be difficult to justify that investment without knowing how long the data specifications underlying that structured data field would remain valid. 
CMS Sponsored Studies and Joined Broader HHS Initiatives to Promote Use of IT for Quality Data Collection and Submission, but HHS Lacks Detailed Plans, Milestones, and Time Frame: CMS has sponsored studies and joined HHS initiatives to examine and promote the current and potential use of hospital IT systems to facilitate the collection and submission of quality data, but HHS lacks detailed plans, including milestones and a time frame against which to track its progress. CMS sponsored two studies that examined the use of hospital IT systems for quality data collection and submission. Promoting the use of health IT for quality data collection is also 1 of 14 objectives that HHS has identified in its broader effort to encourage the development and nationwide implementation of interoperable IT in health care. CMS has joined this broader effort by HHS, as well as the Quality Workgroup that AHIC created in August 2006 to specify how IT could capture, aggregate, and report inpatient and outpatient quality data. Through its representation in AHIC and the Quality Workgroup, CMS has participated in decisions about the specific focus areas to be examined through contracts with nongovernmental entities. These contracts currently address the use of health IT for a range of purposes, which may also include quality data collection and submission in the near future. However, HHS has identified no detailed plans, milestones, or time frames for either its broad effort to encourage IT in health care nationwide or its specific objective to promote the use of health IT for quality data collection. CMS Sponsored Studies Examining Use of IT Systems for Collection and Submission of Quality Data: Over the past several years, CMS sponsored two studies to examine the current and potential capacity of hospital IT systems to facilitate quality data collection and submission. 
These studies identified challenges to using existing hospital IT systems for quality data collection and submission, including gaps and inconsistencies in applicable data standards, as well as in the content of clinical information recorded in existing IT systems. Data standards create a uniform vocabulary for electronically recorded information by providing common definitions and coding conventions for a specified set of medical terms. Currently, an array of different standards apply to different aspects of patient care, including drug ordering, digital imaging, clinical laboratory results, and overall clinical terminology relating to anatomy, problems, and procedures.[Footnote 37] The studies also found that existing IT systems did not record much of the specific clinical information needed to determine the appropriate data element values that hospitals submit to CMS. To achieve CMS's goal of enabling hospitals to transmit quality data directly from their own IT systems to CMS's nationwide clinical database, the sets of data in the two systems should conform to a common set of data standards and capture all the data necessary for quality measures.[Footnote 38] A key element in the effort to create this congruence is the further development and implementation of data standards. In the first study, completed in March 2005, CMS contracted with the Colorado Foundation for Medical Care to test the potential for directly downloading values for data elements for CMS's hospital quality measures using patient data from electronic medical records in three hospitals and one hospital system.[Footnote 39] The study found that numerous factors impeded this process under current conditions, including the lack of certain key types of information in the hospitals' IT systems, such as emergency department data, prearrival data, transfer information, and information on medication contraindications. 
The study also noted that hospitals differed in how they coded their data, and that even when they had implemented data standards, the hospitals had used different versions of the standards or applied them in different ways.[Footnote 40] For example, the study found wide variation in the way that the hospitals recorded drug names and laboratory results in their IT systems, as none of the hospitals had implemented the existing data standards in those areas. In the second study, which was conducted by the Iowa Foundation for Medical Care and completed in February 2006, CMS examined the potential to expand its current data specifications for heart attack, heart failure, pneumonia, and surgical measures to incorporate the standards adopted by the federal Consolidated Healthcare Informatics (CHI) initiative.[Footnote 41] Unlike the first study, which focused on actual patient data in existing IT systems, this study focused on the relationship of current data standards to the data specifications for CMS's quality data. It found that there were inconsistencies in the way that corresponding data elements were defined in the CMS/Joint Commission Specifications Manual and in the CHI standards that precluded applying those standards to all of CMS's data elements. Moreover, it found that some of the data elements are not addressed in the CHI standards. These results suggested to CMS officials that the data standards needed to undergo further development before they could support greater use of health IT to facilitate quality data collection and submission. CMS Has Joined HHS's Efforts to Promote Greater Use of Health IT for Quality Data Collection and Submission, but HHS Lacks Detailed Plans, Milestones, and a Time Frame to Track Progress: CMS has joined efforts by HHS to promote greater use of health IT in general and, more recently, in facilitating the use of health IT for quality data collection and submission. 
The overall goal of HHS's efforts in this area, working through AHIC and ONC, is to encourage the development and nationwide implementation of interoperable health IT in both the public and the private sectors. To guide those efforts, ONC has developed a strategic framework that outlines its goals, objectives, and high-level strategies. One of the 14 objectives involves the collection of quality information.[Footnote 42] CMS, through its participation in AHIC, has taken part in the selection of specific focus areas for ONC to pursue in its initial activities to promote health IT. Those activities have largely taken place through a series of contracts with a number of nongovernmental entities. ONC has sought through these contracts to address issues affecting wider use of health IT, including standards harmonization, the certification of IT systems, and the development of a Nationwide Health Information Network. For example, the initial work on standards harmonization, conducted under contract to ONC by the Healthcare Information Technology Standards Panel (HITSP), focused on three targeted areas: biosurveillance,[Footnote 43] sharing laboratory results across institutions, and patient registration and medication history. Meanwhile, the Certification Commission for Health Information Technology (CCHIT) has worked under a separate contract with ONC to develop and apply certification criteria for electronic health record products used in physician offices, with some initial work on certification of electronic health record products for inpatient care as well.[Footnote 44] CMS is also represented on the Quality Workgroup that AHIC created in August 2006 as a first step in promoting the use of health IT for quality data collection and submission. One of seven workgroups appointed by AHIC, the Quality Workgroup received a specific charge to specify how health IT should capture, aggregate, and report inpatient as well as outpatient quality data. 
It plans to address this charge by adding activities related to using IT for quality data collection to the work performed by HITSP and CCHIT addressing other objectives under their ongoing ONC contracts. Members of the Quality Workgroup, along with AHIC itself, have recently begun to consider the specific focus areas to include in the directions given to HITSP and CCHIT for their activities during the coming year.[Footnote 45] Early discussions among AHIC members indicated that they would try to select focus areas that built on the work already completed by ONC's contractors and that targeted specific improvements in quality data collection that could also support other priorities for IT development that AHIC had identified.[Footnote 46] The focus areas that AHIC selects will, over time, influence the decisions that HHS makes regarding the resources it will allocate and the specific steps it will take to overcome the limitations of existing IT systems for quality data collection and submission. In a previous report and subsequent testimony, we noted that ONC's overall approach lacked detailed plans and milestones to ensure that the goals articulated in its strategic framework were met. We pointed out that without setting milestones and tracking progress toward completing them, HHS cannot tell if the necessary steps are in place to provide the building blocks for achieving its overall objectives.[Footnote 47] HHS concurred with our recommendation that it establish detailed plans and milestones for each phase of its health IT strategic framework, but it has not yet released any such plans, milestones, or a time frame for completion. Moreover, HHS has not announced any detailed plans or milestones or a time frame relating to the efforts of the Quality Workgroup to promote the use of health IT to capture, aggregate, and report inpatient and outpatient quality data. 
Without such plans, it will be difficult to assess how much the focus areas AHIC selects in the near term for its contracted activities will contribute to enabling the Quality Workgroup to fulfill its charge in a timely way. Conclusions: There is widespread agreement on the importance of hospital quality data. The Congress made the APU program permanent to provide a financial incentive for hospitals to submit quality data to CMS and directed the Secretary of HHS to increase the number of measures for which hospitals would have to provide data. In addition, the hospitals we visited reported finding value in the quality data they collected and submitted to CMS to improve care. Collecting quality data is a complex and labor-intensive process. Hospital officials told us that as the number of quality measures required by CMS increased, the number of clinically trained staff required to collect and submit quality data increased proportionately. They also told us that increased use of IT facilitates the collection and submission of quality data and thereby lessens the demand for greater staff resources. The degree to which existing IT systems can facilitate data collection is, however, constrained by limitations such as the prevalence of data recorded as unstructured narrative or text. Overcoming these limitations would enhance the potential of IT systems to ease the demand on hospital resources. Promoting the use of health IT for quality data collection is 1 of 14 objectives that HHS has identified in its broader effort to encourage the development and nationwide implementation of interoperable IT in health care. The extent to which HHS can overcome the limitations of existing IT systems and make progress on this objective will depend in part on where this objective falls on the list of priorities for the broader effort. 
To date, HHS has identified no detailed plans, milestones, or time frames for either the broad effort or the specific objective on promoting the use of health IT for collecting quality data. Without such plans, HHS cannot track its progress in promoting the use of health IT for collecting quality data, making it less likely that HHS will achieve that objective in a timely way. Our analysis indicates that unless activities to facilitate greater use of IT for quality data collection and submission proceed promptly, hospitals may have difficulty collecting and submitting quality data required for an expanded APU program. Recommendations for Executive Action: To support the expansion of quality measures for the APU program, we recommend that the Secretary of HHS take the following actions: * identify the specific steps that the department plans to take to promote the use of health IT for the collection and submission of data for CMS's hospital quality measures; and: * inform interested parties about those steps and the expected time frame, including milestones for completing them. Agency Comments and Our Evaluation: In commenting on a draft of this report on behalf of HHS, CMS expressed its appreciation of our thorough analysis of the processes that hospitals use to report quality data and the role that IT systems can play in that reporting, and it concurred with our two recommendations. (CMS's comments appear in app. V.) With respect to the recommendations, CMS stated that it will continue to participate in relevant HHS studies and workgroups, and, as appropriate, it will inform interested parties regarding progress in the implementation of health IT for the collection and submission of hospital quality data as specific steps, including time frames and milestones, are identified. In addition, as health IT is implemented, CMS anticipates that a formal plan will be developed that includes training for providers in the use of health IT for reporting quality data. 
CMS also provided technical comments that we incorporated where appropriate. CMS made two additional comments relating to the information provided on our case study hospitals and our discussion of patients excluded from the hospital performance assessments. CMS suggested that we describe the level of health IT adoption in the case study hospitals in table 1 of appendix III; this information was already provided in table 4 of appendix III. CMS suggested that we highlight the application of patient exclusions in adapting health IT for quality data collection and submission. We chose not to do so because our analysis showed that the degree of challenge depended on the nature of the information required for a given data element. Exclusions based on billing data, such as discharge status, pose much less difficulty than other exclusions, such as checking for contraindications to ACEIs and ARBs for LVSD, which require a wide range of clinical information. CMS noted that the AHIC Quality Workgroup had presented its initial set of recommendations at AHIC's most recent meeting on March 13, 2007, and provided a copy of those recommendations as an appendix to its comments. The agency characterized these recommendations as first steps, with initial timelines, to address the complex issues that affect implementation of health IT for quality data collection and submission. Specifically with reference to collecting quality data from hospitals as well as physicians, the Quality Workgroup recommended the appointment of an expert panel that would designate a set of quality measures to have priority for standardization of their data elements, which, in turn, would enable automation of their collection and submission using electronic health records and health information exchange. The first recommendations from the expert panel are due June 5, 2007. 
The work of the expert panel is intended to guide subsequent efforts by HITSP to fill identified gaps in related data standards and by CCHIT to develop criteria for certifying electronic health record products. In addition, the Quality Workgroup recommended that CMS and the Agency for Healthcare Research and Quality (AHRQ) both work to bring together the developers of health quality measures and health IT vendors, so that development of future health IT systems would take greater account of the data requirements of emerging quality measures. AHIC approved these recommendations from the Quality Workgroup at its March 13 meeting. We also sent to each of the eight case study hospitals sections from the appendixes pertaining to that hospital. We asked each hospital to check that the section accurately described its processes for collecting and submitting quality data as well as related information on its characteristics and resources. Officials from four of the eight hospitals responded and provided technical comments that we incorporated where appropriate. As arranged with your offices, unless you publicly announce its contents earlier, we plan no further distribution of this report until 30 days after its issue date. At that time, we will send copies of this report to the Secretary of HHS, the Administrator of CMS, and other interested parties. We will also make copies available to others on request. In addition, the report will be available at no charge on GAO's Web site at http://www.gao.gov. If you or your staffs have any questions about this report, please contact me at (202) 512-7101 or BascettaC@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix VI. Signed by: Cynthia A. 
Bascetta: Director, Health Care: [End of section] Appendix I: Medicare Quality Measures Required for Full Annual Payment Update: Condition: Heart attack; Quality measure: Aspirin at hospital arrival[A]; Number of required data elements: 11. Condition: Heart attack; Quality measure: Aspirin prescribed at discharge[A]; Number of required data elements: 7. Condition: Heart attack; Quality measure: Angiotensin-converting enzyme inhibitor or angiotensin receptor blocker for left ventricular systolic dysfunction[A]; Number of required data elements: 9. Condition: Heart attack; Quality measure: Beta blocker at hospital arrival[A]; Number of required data elements: 11. Condition: Heart attack; Quality measure: Beta blocker prescribed at discharge[A]; Number of required data elements: 7. Condition: Heart attack; Quality measure: Thrombolytic agent received within 30 minutes of hospital arrival; Number of required data elements: 13. Condition: Heart attack; Quality measure: Percutaneous coronary intervention received within 120 minutes of hospital arrival; Number of required data elements: 16. Condition: Heart attack; Quality measure: Adult smoking cessation advice/counseling; Number of required data elements: 7. Condition: Heart failure; Quality measure: Left ventricular function assessment[A]; Number of required data elements: 7. Condition: Heart failure; Quality measure: Angiotensin-converting enzyme inhibitor or angiotensin receptor blocker for left ventricular systolic dysfunction[A]; Number of required data elements: 10. Condition: Heart failure; Quality measure: Discharge instructions; Number of required data elements: 12. Condition: Heart failure; Quality measure: Adult smoking cessation advice/ counseling; Number of required data elements: 8. Condition: Pneumonia; Quality measure: Initial antibiotic received within 4 hours of hospital arrival[A]; Number of required data elements: 16. 
Condition: Pneumonia; Quality measure: Oxygenation assessment[A]; Number of required data elements: 11. Condition: Pneumonia; Quality measure: Pneumococcal vaccination status[A]; Number of required data elements: 8. Condition: Pneumonia; Quality measure: Blood culture performed before first antibiotic received in hospital; Number of required data elements: 19. Condition: Pneumonia; Quality measure: Adult smoking cessation advice/counseling; Number of required data elements: 9. Condition: Pneumonia; Quality measure: Appropriate initial antibiotic selection; Number of required data elements: 24. Condition: Pneumonia; Quality measure: Influenza vaccination status; Number of required data elements: 9. Condition: Surgery; Quality measure: Prophylactic antibiotic received within 1 hour prior to surgical incision; Number of required data elements: 14. Condition: Surgery; Quality measure: Prophylactic antibiotics discontinued within 24 hours after surgery end time; Number of required data elements: 17. Sources: Federal Register, CMS, GAO (analysis). Notes: The 21 measures are listed in 71 Fed. Reg. 47870, 48033-48034, 48045 (Aug. 18, 2006), and we analyzed the Specifications Manual for National Hospital Quality Measures, version 2.1a, to calculate the number of required data elements for each. This set of quality measures is effective for discharges from July 2006 on. The Centers for Medicare & Medicaid Services (CMS) uses 73 different data elements to calculate hospital performance on the 21 measures required for the APU program. The total number of unique data elements is less than the sum of the data elements used to calculate each measure because some data elements are included in the calculation of more than one quality measure. In addition, CMS obtains from hospitals approximately 20 other data elements on each patient, including demographic and billing data. [A] One of the 10 original quality measures. 
[End of table] [End of section] Appendix II: Data Elements Used to Calculate Hospital Performance on a Heart Attack Quality Measure: Figure 3: Data Elements Used to Calculate Hospital Performance on the Heart Attack Quality Measure That Asks Whether a Beta Blocker Was Given When the Patient Arrived at the Hospital: [See PDF for image] Source: GAO. Notes: The boxes represent data elements and the circles and rounded rectangles represent values for those elements. In addition to the seven data elements shown in the figure (including arrival date and discharge date that appear in the same box), an eighth data element, comfort measures only, is first applied for this quality measure, as well as all the other heart attack, heart failure, and pneumonia quality measures, to screen out terminal patients receiving palliative care. Three other data elements--principal diagnosis, admission date, and birthdate--are used to initially identify the patients for whom the heart attack quality measures apply in a given quarter. [A] Included codes consist of eight different values for admission source that represent patients who were admitted from any source other than those listed in footnote b, including physician referral, skilled nursing facility, and the hospital's emergency room. [B] Excluded codes consist of three different values for admission source that represent patients who were transferred to this hospital from another acute care hospital, from a critical access hospital, or within the same hospital with a separate claim. [C] Patients may be excluded from the population used to calculate a hospital's performance for a variety of reasons, including inappropriateness of beta blockers for their treatment--for example, if they have a contraindication for their use--or prior treatment in another acute care facility. 
[D] Included codes consist of 13 different values for discharge status that represent patients who were discharged to any setting other than those listed in footnote e, including home care, skilled nursing facility, and hospice. [E] Excluded codes consist of five different values for discharge status that represent patients who were discharged to another acute care hospital or federal health care facility, left against medical advice, or died. [End of figure] [End of section] Appendix III: Tables on Eight Case Study Hospitals: Table 1: Case Study Hospital Characteristics: Number of beds; Case study hospital: A: 300-349; Case study hospital: B: 500+; Case study hospital: C: 50-99; Case study hospital: D: 500+; Case study hospital: E: 100-149; Case study hospital: F: 500+; Case study hospital: G: 150-199; Case study hospital: H: 500+. Urban/rural; Case study hospital: A: Urban; Case study hospital: B: Urban; Case study hospital: C: Rural; Case study hospital: D: Urban; Case study hospital: E: Suburban; Case study hospital: F: Urban; Case study hospital: G: Suburban; Case study hospital: H: Urban. Major teaching; Case study hospital: A: Yes; Case study hospital: B: Yes; Case study hospital: C: No; Case study hospital: D: Yes; Case study hospital: E: No; Case study hospital: F: Yes; Case study hospital: G: No; Case study hospital: H: Yes. Member of multihospital system; Case study hospital: A: Yes; Case study hospital: B: Yes; Case study hospital: C: Yes; Case study hospital: D: No; Case study hospital: E: No; Case study hospital: F: No; Case study hospital: G: No; Case study hospital: H: No. Joint Commission accredited; Case study hospital: A: Yes; Case study hospital: B: Yes; Case study hospital: C: Yes; Case study hospital: D: Yes; Case study hospital: E: Yes; Case study hospital: F: Yes; Case study hospital: G: Yes; Case study hospital: H: Yes. 
Vendor submits quality data; Case study hospital: A: Yes; Case study hospital: B: Yes; Case study hospital: C: Yes; Case study hospital: D: Yes; Case study hospital: E: Yes; Case study hospital: F: Yes; Case study hospital: G: Yes; Case study hospital: H: Yes. Patients identified for data collection how often; Case study hospital: A: Monthly; Case study hospital: B: Monthly; Case study hospital: C: Weekly; Case study hospital: D: Monthly; Case study hospital: E: Monthly; Case study hospital: F: Monthly; Case study hospital: G: Monthly; Case study hospital: H: Monthly. Abstraction tool used; Case study hospital: A: Vendor's; Case study hospital: B: Vendor's; Case study hospital: C: Vendor's; Case study hospital: D: CART[A]; Case study hospital: E: Vendor's; Case study hospital: F: Vendor's; Case study hospital: G: Vendor's; Case study hospital: H: Vendor's. Conditions reported on; Case study hospital: A: Heart attack, heart failure, pneumonia, surgery; Case study hospital: B: Heart attack, heart failure, pneumonia, surgery; Case study hospital: C: Heart attack, heart failure, pneumonia, surgery; Case study hospital: D: Heart attack, heart failure, pneumonia, surgery; Case study hospital: E: Heart attack, heart failure, pneumonia, surgery; Case study hospital: F: Heart attack, heart failure, pneumonia, surgery; Case study hospital: G: Heart attack, heart failure, pneumonia, surgery; Case study hospital: H: Heart attack, heart failure, pneumonia, surgery. Entities that receive Annual Payment Update (APU) program data; Case study hospital: A: CMS, Joint Commission; Case study hospital: B: CMS, Joint Commission, vendor database, private insurers; Case study hospital: C: CMS, Joint Commission; Case study hospital: D: CMS, Joint Commission; Case study hospital: E: CMS, Joint Commission, vendor database; Case study hospital: F: CMS, Joint Commission; Case study hospital: G: CMS, Joint Commission; Case study hospital: H: CMS, Joint Commission. 
Entities that receive different quality data; Case study hospital: A: Leapfrog[B]; Case study hospital: B: Leapfrog, state health department, private insurers; Case study hospital: C: Private insurer; Case study hospital: D: Leapfrog, private insurer; Case study hospital: E: Private insurer; Case study hospital: F: Leapfrog, private insurer; Case study hospital: G: State health department, private insurers; Case study hospital: H: Private insurer. Amount of projected reduction in fiscal year 2006 Medicare payments if quality data not submitted[C]; Case study hospital: A: $139,000; Case study hospital: B: $608,000; Case study hospital: C: $33,000; Case study hospital: D: $449,000; Case study hospital: E: $57,000; Case study hospital: F: $430,000; Case study hospital: G: $93,000; Case study hospital: H: $123,000. Amount of projected reduction in fiscal year 2007 Medicare payments if quality data not submitted[C]; Case study hospital: A: $801,000; Case study hospital: B: $3,250,000; Case study hospital: C: $161,000; Case study hospital: D: $2,298,000; Case study hospital: E: $283,000; Case study hospital: F: $2,451,000; Case study hospital: G: $503,000; Case study hospital: H: $608,000. Sources: American Hospital Association, GAO, Centers for Medicare & Medicaid Services (CMS). [A] CART, which stands for the CMS Abstraction and Reporting Tool, was developed by CMS and made available to hospitals at no charge for collecting and submitting quality data. [B] The Leapfrog Group is a consortium of large private and public health care purchasers that publicly recognizes hospitals that have implemented certain specific quality and safety practices, such as computerized physician order entry. 
[C] The projected reduction in fiscal year 2006 and fiscal year 2007 Medicare payments (rounded to the nearest $1,000) represents the amount that the hospital's revenue from Medicare would have decreased for that fiscal year had the hospital not submitted quality data under the Annual Payment Update program. These estimates are based on information on the number and case mix of Medicare patients served by these hospitals during the previous period. This is the information that was available to hospital administrators from CMS at the beginning of the fiscal year. The actual reduction would ultimately depend on the number and case mix of the Medicare patients that the hospital actually treated during the course of that fiscal year. The projected reduction for fiscal year 2007 was substantially larger because that was the first year in which the higher rate of reduction mandated by the Deficit Reduction Act of 2005--from 0.4 percentage points to 2.0 percentage points--took effect. [End of table] Table 2: How Case Study Hospital Officials Described the Steps Taken to Complete Quality Data Collection and Submission: 1. 
Identify patients[A]; Case study hospital: A: Vendor prepares list of patients to abstract, sampling heart failure, pneumonia, and surgery; Case study hospital: B: Vendor prepares list of patients based on diagnosis codes, and draws samples for heart failure, pneumonia, and surgery; Case study hospital: C: Vendor prepares list of patients to abstract based on billing data, no sampling; Case study hospital: D: Hospital IT department identifies patients based on billing data, no sampling; Case study hospital: E: Hospital prepares list of patients from billing data, no sampling; Case study hospital: F: Hospital provides billing data to vendor; vendor draws samples and generates list of patients to abstract; Case study hospital: G: Hospital creates list from billing data; vendor provides instructions to draw sample of pneumonia cases; Case study hospital: H: Hospital submits billing data to vendor, which identifies eligible patients and draws samples. 2. Locate information in the medical record; Case study hospital: A: Abstractor searches through emergency room and inpatient electronic and paper records, checking multiple forms and screens where relevant information could be found; Case study hospital: B: Abstractor starts search with electronic discharge summary, then other electronic records and paper documents; Case study hospital: C: Abstractor searches through different components of paper record, including printouts from electronic records; Case study hospital: D: Abstractor clicks through various electronic screens representing different types of records, plus some scanned documents, for example, from other providers; Case study hospital: E: Abstractor works through paper records, such as face sheet, emergency room treatment forms, progress notes, and discharge summary; Case study hospital: F: Abstractor starts with electronic records (for heart attack and heart failure)--first structured records (discharge) and then free text--and then examines paper records if 
needed; paper records searched for pneumonia and surgery; Case study hospital: G: Abstractor starts searching through paper records, then looks for additional information in electronic records (e.g., for echocardiogram results); Case study hospital: H: Abstractor searches through both electronic and paper records. 3. Determine appropriate data element values; Case study hospital: A: Some demographic data prepopulated; abstractor notes ambiguous or conflicting information on paper abstraction form; Case study hospital: B: Some demographic data prepopulated; other data elements written on paper abstraction form; Case study hospital: C: Some demographic data prepopulated; other data elements entered directly into vendor's online abstraction tool; Case study hospital: D: Some demographic data prepopulated; most abstractors fill in data elements on paper abstraction form; Case study hospital: E: Data elements entered into computerized abstraction form; Case study hospital: F: Some demographic data prepopulated; abstractors fill out abstraction form, some on paper and some online; Case study hospital: G: Some demographic data prepopulated; other data elements written on paper abstraction form; Case study hospital: H: Some demographic data prepopulated; other data elements written on paper abstraction form. 4. 
Transmit data to CMS; Case study hospital: A: Data elements copied from paper abstraction form to vendor's online form; Case study hospital: B: Data elements copied from paper abstraction form to vendor's online form; Case study hospital: C: Data elements entered directly into vendor's online abstraction tool; Case study hospital: D: Data elements copied from paper abstraction form to vendor's electronic form; data manager checks data and uploads file to vendor; Case study hospital: E: Completed abstraction forms sent on disk to vendor; will change soon to completion of forms online; Case study hospital: F: For pneumonia and surgery, abstractor enters data online, for heart attack and heart failure, hospital scans paper abstraction forms and sends electronic file to vendor, which submits data to CMS; Case study hospital: G: Data elements copied from paper abstraction form to vendor's online form; Case study hospital: H: Data elements copied from paper abstraction form to vendor's online form. 5. Ensure data have been accepted by CMS; Case study hospital: A: Performed by vendor; Case study hospital: B: Hospital staff reviews error reports from clinical data warehouse and corrects errors; Case study hospital: C: Performed by vendor; Case study hospital: D: Hospital staff reviews error reports from vendor; Case study hospital: E: Hospital reviews error reports from vendor and clinical warehouse; Case study hospital: F: Performed by vendor; Case study hospital: G: Hospital receives error report from vendor and clinical data warehouse and makes corrections; Case study hospital: H: Hospital reviews error reports from vendor and makes corrections; vendor deals with clinical data warehouse. 6. 
Supply copies of selected medical records; Case study hospital: A: Hospital copies and ships requested patient records; Case study hospital: B: Hospital copies, checks completeness of, and ships requested patient records; Case study hospital: C: Hospital copies, checks completeness of, and ships requested patient records; Case study hospital: D: Hospital copies and ships requested patient records; Case study hospital: E: Hospital copies and ships requested patient records; Case study hospital: F: Hospital copies and ships requested patient records; before shipping hospital flags relevant information; Case study hospital: G: Hospital copies, checks completeness of, and ships requested patient records; Case study hospital: H: Hospital copies, checks completeness of, and ships requested patient records. Source: GAO. Note: Information summarized from hospital case study interviews. [A] The identifying patients step included both determining all the patients who met the CMS criteria for inclusion and applying the CMS sampling procedures, if applicable. CMS only permitted hospitals to sample patients for a given condition in a given quarter if the number of eligible patients met a certain threshold. Otherwise, the hospital was required to abstract quality data for all patients who met the inclusion criteria for any one of the four conditions. Hospitals could also choose not to sample, even when sampling was permitted under the CMS procedures.
[End of table] Table 3: Resources Used for Abstraction and Data Submission at Eight Case Study Hospitals: Qualifications of abstractors; Case study hospital: A: Medical record coders and a Master of Public Health; Case study hospital: B: Registered nurse (RN) and nonclinical; Case study hospital: C: All RN; Case study hospital: D: All RN; Case study hospital: E: Licensed practical nurse (LPN); Case study hospital: F: Medical records coder and RN with physician support; Case study hospital: G: RN and LPN[A]; Case study hospital: H: All RN. Number of abstractors; Case study hospital: A: 3; Case study hospital: B: 3; Case study hospital: C: 3; Case study hospital: D: 9; Case study hospital: E: 2; Case study hospital: F: 3; Case study hospital: G: 3; Case study hospital: H: 4. Estimated full time equivalents for abstraction of data elements; Case study hospital: A: 0.7; Case study hospital: B: <2.0; Case study hospital: C: <1.5; Case study hospital: D: 2.5; Case study hospital: E: 1.2; Case study hospital: F: 1.3; Case study hospital: G: 1.2; Case study hospital: H: 2.0. Estimated time to abstract one chart; Case study hospital: A: 60 minutes (average); Case study hospital: B: 10 to 15 minutes; Case study hospital: C: 20 minutes (average); Case study hospital: D: 3 to 120 minutes; Case study hospital: E: 5 to 60 minutes; Case study hospital: F: 5 to 60 minutes; Case study hospital: G: 10 to 30 minutes; Case study hospital: H: 10 to 90 minutes. Average number of heart attack, heart failure, and pneumonia charts abstracted per quarter[B]; Case study hospital: A: 222; Case study hospital: B: 399; Case study hospital: C: 86; Case study hospital: D: 686; Case study hospital: E: 118; Case study hospital: F: 252; Case study hospital: G: 190; Case study hospital: H: 202. 
Average number of surgery charts abstracted per quarter; Case study hospital: A: 94[B]; Case study hospital: B: 218[C]; Case study hospital: C: 6[C]; Case study hospital: D: 553[B]; Case study hospital: E: 105[B]; Case study hospital: F: 186[B]; Case study hospital: G: 82[C]; Case study hospital: H: 61[C]. Data vendor costs for CMS quality data services per year; Case study hospital: A: $8,200; Case study hospital: B: $5,000 to $7,500; Case study hospital: C: $3,600; Case study hospital: D: $3,500; Case study hospital: E: $560; Case study hospital: F: $1,800; Case study hospital: G: $12,450; Case study hospital: H: $13,000. Checks for accuracy of case selection against another data source; Case study hospital: A: Some discrepancies observed; Case study hospital: B: A few discrepancies observed; Case study hospital: C: Relies on vendor processes and audits of medical records coding; Case study hospital: D: Checks only that data were submitted to CMS for all patients on original list to be abstracted; Case study hospital: E: None; Case study hospital: F: Some discrepancies observed; Case study hospital: G: None; Case study hospital: H: Checks patient lists for heart attack patients against medical records. Checks for accuracy of data abstraction; Case study hospital: A: Hospital reabstracts 5 percent of cases each quarter; Case study hospital: B: None beyond reviews by CMS contractor; Case study hospital: C: Hospital redoes 5 to 10 cases per measure set every quarter; Case study hospital: D: Only for cases where quality standard not met; Case study hospital: E: Only for cases where quality standard not met; Case study hospital: F: Only for cases where quality standard not met; Case study hospital: G: Not routinely, only for startup in new condition; Case study hospital: H: None beyond reviews by CMS contractor. Sources: GAO, CMS. [A] The LPN was abstracting cases for one condition temporarily until an RN could be hired to perform the work. 
[B] Based on submissions to the clinical warehouse for four quarters of discharges from April 2005 through March 2006. [C] Based on submissions to the clinical warehouse for one quarter of discharges from January through March 2006. [End of table] Table 4: Electronic and Paper Records at Eight Case Study Hospitals:
Record type; Case study hospital: A; B; C; D; E; F; G; H:
Admissions; E; E; E; E; E; E; E; E.
Billing; E; E; E; E; E; E; E; E.
Emergency department; E&P; E&P; P; E; P; E; P; P.
Medication administration; E; E; P; E; P; P; E; P.
Physician orders including prescriptions; E&P; E&P; P; E; P; E; P; E.
Nursing notes; P; P; P; E; P; P; E; P.
Laboratory and test results; E; E; E; E; E; E; E; E.
Physician notes; P; E&P; P; E; E&P; E&P; E&P; E&P.
Discharge summaries and instructions; P; E; P; E; P; E&P; E; E.
Operating room; P; E&P; E&P; E; P; E&P; E; E.
Source: GAO. Note: E = electronic, P = paper. [End of table] [End of section] Appendix IV: Scope and Methodology: To examine how hospitals collect and submit quality data, and to determine the extent to which information technology (IT) facilitates those processes, we conducted case studies of eight individual acute care hospitals that collect and submit quality data to the Centers for Medicare & Medicaid Services (CMS). We chose this approach to obtain an in-depth understanding of these processes as they are currently experienced at the hospital level. For background information on the requirements that the hospitals had to satisfy, we reviewed CMS documents relevant to the Annual Payment Update (APU) program. In particular, we examined multiple revisions of the Specifications Manual for National Hospital Quality Measures, which is issued jointly by CMS and the Joint Commission (formerly the Joint Commission on Accreditation of Healthcare Organizations). We structured our selection of hospitals for the eight case studies to provide a contrast of hospitals with highly sophisticated IT systems and hospitals with an average level of IT capability.
We excluded critical access hospitals from this selection process because they are not included in the APU program.[Footnote 48] The selected hospitals varied on several hospital characteristics, including urban/rural location, size, teaching status, and membership in a system that linked multiple hospitals through shared ownership or other formal arrangements. (See app. III, table 1.) To select four hospitals with highly sophisticated IT systems, we relied on recommendations from interviews with a number of experts in the field of health IT, as well as on a recent review of the research literature on the costs and benefits of health IT[Footnote 49] and other published articles. Three of the four hospitals we chose were among those where much of the published research has taken place. They were all early adopters of health IT, and each had implemented internally developed IT systems. The fourth hospital had more recently acquired and adapted a commercially developed system. This hospital was distinguished by the extent to which it had replaced its paper medical records with an integrated system of electronic patient records. Each of these four case study hospitals was located in a different metropolitan area. We selected the four hospitals with less sophisticated IT systems from the geographic vicinity of the four hospitals already chosen, thus providing two case study hospitals from each of four metropolitan areas. We decided that one should be a rural hospital, using the Medicare definition of rural, which is located outside of a Metropolitan Statistical Area (MSA). To determine from which of the four metropolitan areas we should select a neighboring rural hospital, we analyzed data on Medicare-approved hospitals drawn from CMS's Provider of Services (POS) file. We identified the rural hospitals located within 150 miles of each of the first four hospitals. 
From among those four sets of rural hospitals, we chose the set containing the largest number of acute care hospitals as the pool from which to select our rural case study hospital. For each of the remaining three metropolitan areas, we used the short-term acute care hospitals that the POS file listed as located in the same MSA as the pools from which to choose our remaining three hospitals. We excluded hospitals located in a different state from the first hospital selected for that metropolitan area, so that all of the hospitals under consideration for that area would come under the jurisdiction of the same Quality Improvement Organization (QIO).[Footnote 50] To select the second case study hospital from among those available in or near each of the four metropolitan areas, we applied a procedure designed to produce a straightforward and unbiased selection. We began by recording the total number of cases for which each of these hospitals had reported results on CMS's Web site for heart attack, heart failure, and pneumonia quality measures. We obtained this information from the Web site itself, running reports for each hospital that showed, for each quality measure, the number of cases on which the hospital's quality performance score was based. Since some quality measures apply only to certain patients, we recorded the largest number of cases listed for any of the quality measures reported for a given condition. Next we summed the cases for the three conditions and rank ordered the hospitals in each of the three MSAs, and the rural hospitals in the fourth metropolitan area, from most to least total cases submitted.
We then made a preliminary selection by taking the hospital with the median value in each of those lists.[Footnote 51] By selecting the hospital with the median number of cases reported, we attempted to minimize the chances of picking a hospital that would represent an outlier compared to other hospitals in the selection pool.[Footnote 52] Before selecting the final four case study hospitals, we checked to make sure that the hospitals did not happen to have an unusually high level of IT capabilities with respect to electronic patient records. To do this, we contacted each of the selected hospitals and obtained a description of its current IT systems. We compared this description to the stages of electronic medical record implementation laid out by the Healthcare Information and Management Systems Society (HIMSS).[Footnote 53] The HIMSS model identifies eight stages based on the scope and sophistication of clinical functions implemented through a hospital's system of electronic medical records. According to HIMSS, the large majority of hospitals in the United States are at the lower three stages. Based on the descriptions of these stages, we determined that none of the prospectively selected hospitals had IT systems that exceeded the third stage. We collected information about the processes used to collect and submit quality data from each of the eight case study hospitals through on-site interviews with hospital abstractors, quality managers, IT staff, and hospital administrators. We told these officials that neither they personally nor their hospitals would be identified by name in our report. The site visits took place between mid-July and early September 2006 and ranged in duration from 3 to 8 hours. Our data collection at each hospital was guided by a protocol that specified a series of topics to cover in our interviews.
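The case-count ranking and median selection described above can be sketched in code. This is an illustrative reconstruction, not GAO's actual tooling: the hospital names and case counts below are invented, the function names are our own, and tie-handling for even-length lists is not modeled.

```python
# Hypothetical sketch of the median-based hospital selection procedure.
# All data here are invented for illustration; the actual selection used
# case counts reported on CMS's Hospital Compare Web site.

def max_cases_per_condition(measure_counts):
    """For one condition, take the largest case count across its quality
    measures, since some measures apply only to certain patients."""
    return max(measure_counts)

def select_median_hospital(hospitals):
    """Rank hospitals from most to least total cases across the three
    conditions and return the hospital with the median total."""
    totals = []
    for name, conditions in hospitals.items():
        total = sum(max_cases_per_condition(counts)
                    for counts in conditions.values())
        totals.append((total, name))
    totals.sort(reverse=True)          # most to least total cases
    return totals[len(totals) // 2]    # middle entry of the ranked list

example = {
    "Hospital 1": {"heart attack": [40, 35], "heart failure": [60, 58],
                   "pneumonia": [70]},
    "Hospital 2": {"heart attack": [90, 88], "heart failure": [120],
                   "pneumonia": [110, 95]},
    "Hospital 3": {"heart attack": [10], "heart failure": [25, 20],
                   "pneumonia": [30]},
}
print(select_median_hospital(example))  # (170, 'Hospital 1')
```

Taking the middle entry of the ranked list mirrors the report's rationale: the median hospital is the one least likely to be an outlier within its selection pool.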
These topics included a description of the processes used at each hospital and the financial and staff resources devoted to quality data collection and submission. We pretested the protocol at two hospitals not included in our set of eight case study hospitals. As part of the protocol, we asked abstractors at each hospital to explain in detail how they found the information needed to determine the appropriate values for each of the data elements required for two specific quality measures: (1) angiotensin-converting enzyme inhibitor (ACEI) or angiotensin receptor blocker (ARB) for left ventricular systolic dysfunction (LVSD) for heart failure patients and (2) initial antibiotic received within 4 hours of hospital arrival for pneumonia patients. We selected these measures because they covered a number of different types of data elements, including those involving administration of medications, determining contraindications, date and time variables, and making clinical assessments such as whether a patient had LVSD. To determine the extent to which IT facilitated these processes at the eight case study hospitals, we included several topics on IT systems in our site visit protocol. We asked about any IT systems used by the abstractors in locating relevant clinical information in patient medical records and the specific advantages and limitations they encountered in using those systems. We also asked hospital officials to assess the potential for IT systems to provide higher levels of assistance for quality data collection and submission over time. If separate IT staff were involved in the hospital's quality data collection and submission process, we included them in the interviews. Where possible, we supplemented the information provided through interviews with direct observation of the processes used by hospitals to collect and submit quality data.
We asked the case study hospitals to show us how they performed these processes, and five of the eight hospitals arranged for us to observe the collection of quality data for all or part of a patient record. We observed abstractors accessing clinical information from both paper and electronic records. We also obtained pertinent information about the case study hospitals from CMS documents and contractors. The estimated dollar amounts that the case study hospitals would have lost had they not submitted quality data to CMS, presented in appendix III, table 1, were calculated from data provided in documents made available to all hospitals at the start of each of the fiscal years.[Footnote 54] Information on the average number of patient charts abstracted quarterly by each case study hospital, shown in appendix III, table 3, was drawn from a table showing the number of patients for whom quality data were submitted to CMS's clinical data warehouse. We obtained that table from the Iowa Foundation for Medical Care (IFMC), which is the CMS contractor that operates the clinical data warehouse. The IFMC table provided this information for all hospitals submitting quality data for discharges that occurred from April 2005 through March 2006. These were the most recent data available. The evidence that we obtained from our eight case study hospitals is specific to those hospitals. In particular, it does not offer a basis for relating any differences we observed among these individual hospitals to their differences on specific dimensions, such as size or teaching status. Nor can we generalize from the group of eight as a whole to acute care hospitals across the country. Furthermore, although we examined the processes hospitals used to collect and submit quality data and the role that IT plays in that process, we did not examine general IT adoption in the hospital industry.
To obtain information on whether CMS has taken steps to promote the development of IT systems to facilitate quality data collection and submission, we interviewed CMS officials as well as CMS contractors and reviewed documents including reports on related studies funded by CMS. We also interviewed officials at the Office of the National Coordinator for Health Information Technology (ONC) regarding the plans and activities of the American Health Information Community (AHIC) quality workgroup. In addition, we downloaded relevant documents from the AHIC Web site, including meeting agendas, prepared presentations, and meeting minutes for both AHIC as a whole and its Quality Workgroup. We conducted our work from February 2006 to April 2007 in accordance with generally accepted government auditing standards. [End of section] Appendix V: Comments from the Centers for Medicare & Medicaid Services: Department of Health & Human Services: Centers for Medicare & Medicaid Services: Administrator: Washington, DC 20201: Date: Mar 29 2007: To: Cynthia A. Bascetta: Director, Health Care: Government Accountability Office: From: Leslie Norwalk, Esq.: Acting Administrator: Subject: Government Accountability Office's (GAO) Draft Report: "Hospital Quality Data: HHS Should Specify Steps and Timeframe for Using Information Technology to Collect and Submit Data" (GAO-07-320): Thank you for the opportunity to review and comment on the above referenced draft report. The Centers for Medicare & Medicaid Services (CMS) appreciates the GAO's thorough examination of hospital processes to collect and submit quality data, the extent to which information technology (IT) facilitates hospital collection and submission of quality data, and the steps the Department has taken to simplify the collection of this data. The CMS considers the reporting of hospital quality measures a positive step toward improving the quality of care that hospitals provide to patients.
With the assistance of the Medicare Quality Improvement Organizations (QIOs), hospitals began reporting quality data in 2003 through the voluntary Hospital Quality Initiative. As a result of certain provisions in the Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA) and the Deficit Reduction Act of 2005 (DRA), the Reporting Hospital Quality Data for Annual Payment Update (RHQDAPU) program has been expanded. For fiscal year (FY) 2007, approximately 95 percent of prospective payment system hospitals met all reporting requirements and received the full market basket update. Hospitals that did not successfully report quality measures received a two percentage point reduction to their market basket update for FY 2007. In both the FY 2007 Inpatient Prospective Payment System and Outpatient Prospective Payment System final rules, the Agency expanded the clinical quality measures reported by hospitals from 10 to 27. The current measurement set includes process, outcome, and patient experience of care measures. The CMS will continue to expand the measures for FY 2008 to include additional surgical care measures and 30-day mortality measures. The GAO has provided a thorough examination of the processes hospitals use to report this data and the role IT systems can play in this reporting. We appreciate both the examination and recommendation. GAO Recommendation: GAO recommends that the Secretary of Health and Human Services (HHS) identify the specific steps that the Department plans to take to promote the use of health information technology (HIT) for the collection and submission of data for CMS' hospital quality measures, and inform interested parties about those steps and the expected timeframe including milestones for completing them. CMS Response: The CMS concurs with the recommendations and looks forward to implementing recognized interoperability standards.
CMS will continue to participate in appropriate HHS studies and workgroups, as mentioned by the GAO. As appropriate, CMS will inform interested parties regarding progress in the implementation of HIT for the collection and submission of hospital quality data as specific steps, including timeframes and milestones, are identified. Current mechanisms include publication in the Federal Register as well as ongoing collaboration with external stakeholders such as the Hospital Quality Alliance, the American Hospital Association, the Federation of American Hospitals, the Association of American Medical Colleges, and The Joint Commission. We further anticipate that as HIT is implemented, a formal plan, including training, will be developed to assist providers in understanding and utilizing HIT in reporting. In addition, we will assess the effectiveness of our communications with providers and stakeholders as it relates to all information dissemination pertinent to collecting hospital quality data as part of an independent and comprehensive external evaluation of the RHQDAPU program. The GAO report, through case studies of eight hospitals, has described in detail and with accuracy the extent of the burden shouldered by these hospitals in order to report quality data elements to CMS through the Iowa Foundation for Medical Care. Current regulation requires that this reporting be tied to regular financial updates provided through CMS. The GAO report also describes many of the barriers impeding the adoption of HIT that, if overcome, could ease the burden of reporting quality data. Both the barriers and enablers to use of HIT have been discussed at the American Health Information Community (AHIC), which responded by forming the Quality Workgroup in September 2006. This workgroup has the following charges from AHIC: 1.
Broad Charge for the Workgroup: Make recommendations to the AHIC so that breakthroughs in HIT can provide the data needed for the development of quality measures, automate the measurement and reporting of a comprehensive set of quality measures, and accelerate the use of clinical decision support that can improve performance on those quality measures. Also, make recommendations for how performance measures should align with the capabilities and limitations of HIT. 2. Specific Charge for the Workgroup: Make recommendations to the AHIC that specify how certified health IT should support the capture, aggregation, and reporting of data for a core set of ambulatory and inpatient quality measures. The Quality Workgroup presented its initial set of recommendations related to its specific charge to the AHIC at the most recent meeting on March 13, 2007. These recommendations reflect the first steps to address the complex issues associated with electronic extraction, aggregation, and exchange of quality information through certified HIT, along with initial timelines. (Additional information about the AHIC Quality Workgroup's charge and its recommendations is attached in Appendix A to this response.) We recommend, therefore, that GAO incorporate the issues raised in these early recommendations into the body of its report as a starting point for further development of technical and policy enablers for electronic reporting of quality metrics. We recognize that development of these technical and policy enablers is a first, but separate, necessary component in achieving the overall goal of integrating quality and HIT. The degree to which hospitals can actually adopt the supporting HIT depends on many other factors, such as their financial status, their own business models, and the degree of and investment in pre-existing HIT systems.
Overall adoption will, therefore, follow its own timeline and trajectories, which will be influenced by a number of factors yet to be defined in the environment. In addition, CMS recognizes the importance of ensuring that the infrastructure and processes associated with the RHQDAPU program are sound. In order to assess opportunities for improvement in the program, plans are currently in place to obtain an independent and comprehensive evaluation of the RHQDAPU program. The goal of this evaluation is to ensure and demonstrate accountability to taxpayers and other stakeholders. The evaluation will provide assurance to management that operations are well-managed, efficient, and within the bounds of applicable laws, regulations, and policies. CMS anticipates the evaluation will--: 1. Identify and recommend improvements to address any weaknesses in systems, controls, and management practices; and: 2. Identify and make recommendations to address any opportunities to reduce expenditures and better protect the government's assets. Additional Comments: 1. It might be useful if GAO included, in Table I of Appendix 3, some information about the existing level of HIT adoption in systems in each of the eight hospitals. 2. The report did not mention the issue of exclusions. Exclusions refer to measure specifications that exclude specific patients from performance assessments. A key part of getting to certified electronic health records (EHRs) that include the functionality of quality reporting is figuring out how to easily locate information in a patient's record that would exclude that patient from performance assessment for a particular measure. [End of section] Appendix VI: GAO Contact and Staff Acknowledgments: GAO Contact: Cynthia A. Bascetta, (202) 512-7101 or BascettaC@gao.gov: Acknowledgments: In addition to the contact named above, Linda T. Kohn, Assistant Director; Mohammad S. Khan; Eric A. Peterson; Roseanne Price; Jessica C. Smith; and Teresa F. 
Tucker made key contributions to this report. FOOTNOTES [1] See Pub. L. No. 108-173, § 501(b), 117 Stat. 2066, 2289-90. [2] Throughout this report, we refer to CMS's Reporting Hospital Quality Data for the Annual Payment Update program as the "APU program." [3] Throughout this report, we refer to the data that hospitals submit to CMS that the agency uses to calculate their performance on its quality measures as "quality data." [4] Most acute care hospitals (i.e., those paid under the Medicare inpatient prospective payment system) receive an annual payment update that increases the standardized payment amount that Medicare pays them per patient, based on projected increases in hospital operating expenses. For fiscal year 2007, 3,319 hospitals received their full payment update, about 95 percent of those eligible to participate in the APU program, and the remaining 5 percent of eligible hospitals received a reduced annual payment update. CMS posts on a public Web site the performance scores that hospitals receive on quality measures derived from the data they submit. [5] See Pub. L. No. 109-171, § 5001(a), 120 Stat. 4, 28-29. [6] The magnitude of the reduction in the annual payment update for hospitals not submitting the quality data rose from 0.4 percentage points to 2 percentage points, starting in fiscal year 2007. [7] Initially, CMS designated 10 required quality measures under the APU program that applied to patients treated for heart attacks, heart failure, or pneumonia. In accordance with DRA, the Secretary increased the number of required quality measures to 21. Nine of the new measures related to the original three conditions, and 2 related to surgery, a new condition for the program. (See app. I for the list of measures.) [8] CMS recently announced the addition of three surgery measures for fiscal year 2008. 
In addition, to receive their full annual update payment, hospitals will have to submit to CMS the responses provided by a random sample of their discharged patients on a specified survey instrument--the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey--which is designed to obtain patient assessments of the care they received. See 71 Fed. Reg. 67960, 68201-10 (Nov. 24, 2006). [9] Although the term value is often associated with numerical data, we use it in this report to identify the information that hospitals submit to CMS for a given data element. Some data elements call for numerical values, and others call for nonnumerical values, such as Y or N for "yes" or "no." [10] Beta blockers are medications that decrease the rate and force of heart contractions, which over time improves the heart's pumping ability. [11] 70 Fed. Reg. 47278, 47420 (Aug. 12, 2005). [12] All eight hospitals participated in the APU program and had their performance scores on the quality measures posted on CMS's Hospital Compare Web site, www.hospitalcompare.hhs.gov. [13] Available research suggested that only a handful of hospitals had developed a high level of IT implementation for purposes of documenting patient care in electronic records, with the large majority having reached just the initial stages of this process. See D. Garets and M. Davis, "Electronic Medical Records vs. Electronic Health Records: Yes, There Is a Difference" (Chicago, Ill.: HIMSS Analytics, LLC, updated Jan. 26, 2006). [14] Information obtained from CMS sources included the projected amount of Medicare payments that the hospitals would lose if they had not submitted quality data under the APU program and the number of patients for whom they submitted data to CMS. See app. IV. 
[15] The Joint Commission (previously the Joint Commission on Accreditation of Healthcare Organizations or JCAHO) is a private, not-for-profit organization that accredits approximately 82 percent of hospitals that participate in Medicare. [16] Current and past versions of the Specifications Manual for National Hospital Quality Measures are available at www.qualitynet.org. [17] Generally, the records for multiple hospital admissions for the same patient are stored together, creating an even more voluminous collection of documents in a single patient record. [18] For example, many hospitals have IT systems to record laboratory test results electronically, but fewer have added IT systems to record radiology results electronically. D. Blumenthal et al., Health Information Technology in the United States: The Information Base for Progress (Princeton, N.J.: Robert Wood Johnson Foundation, 2006), 3:26. [19] Exec. Order No. 13335, 69 Fed. Reg. 24059 (Apr. 27, 2004). [20] The CMS data specifications list the International Classification of Diseases, Ninth Revision (ICD-9) diagnostic codes that make a patient eligible for quality data collection. The other factor determining basic patient eligibility is age at the time of admission, derived from the patient's admission date and date of birth, also available from billing data. For pneumonia patients, secondary diagnoses may also affect eligibility. [21] Medicare bases its payments for inpatient care on principal diagnosis, which it defines as the condition established after study to be chiefly responsible for the admission. Principal diagnosis may also affect determination of the principal procedure, if several procedures were performed. [22] CMS's specific sampling requirements vary by medical condition.
For example, hospitals that have more than 78 heart attack patients in a given quarter can submit quality data for a random sample of those patients, as long as their sample includes a minimum of 78 patients and applies a sampling rate of 20 percent up to a maximum required sample of 311.

[23] Throughout this report, we use the term abstractor to indicate hospital staff who are trained to follow a detailed protocol in order to extract specified information in a consistent fashion from the medical records of multiple patients.

[24] Coumadin is a medication that acts as an anticoagulant. It is used to prevent and treat harmful blood clots that increase the risk of heart attack and stroke.

[25] These discharge instructions are supposed to cover recommended level of activity, diet, follow-up care after discharge, medications, weight monitoring, and what to do if symptoms worsen. The abstractor fills in a value for six separate data elements, one for each of the six specific instructions. The value is either "yes" (written discharge instructions addressing the specified activity were provided) or "no" (instructions addressing the specified activity were not provided, or this could not be determined from the medical record documentation). Anything less than a yes on all six data elements leads to a negative score on this quality measure for that patient.

[26] This process is described in GAO, Hospital Quality Data: CMS Needs More Rigorous Methods to Ensure Reliability of Publicly Released Data, GAO-06-54 (Washington, D.C.: Jan. 31, 2006), 14.

[27] Abstractors at many hospitals entered the data through an online connection to the vendor. Other hospitals submitted their quality data in the form of electronic files.

[28] CMS draws a sample of five patient records from among those submitted by each hospital that provided data on six or more patients to the clinical data warehouse in a given quarter.
CMS then tells each hospital which patients need to have their records copied, or printed out in the case of electronic records, and shipped to its contractor. The contractor then abstracts values for clinical data elements from those records, following the CMS/Joint Commission Specifications Manual, and the results are compared--data element by data element--with the values originally abstracted and submitted by the hospital. If hospitals do not achieve a match of at least 80 percent of their data element values with those of the CMS contractor, following the outcome of any appeals, they will not receive a full payment update from Medicare for the subsequent fiscal year. See GAO-06-54, 14-16.

[29] ACEIs and ARBs are two classes of drugs that have been shown in clinical trials to reduce mortality and morbidity in patients with LVSD.

[30] ACEIs or ARBs may be contraindicated if the patient is known to be allergic to such drugs or suffers from certain medical conditions, such as moderate to severe aortic stenosis or renal disease.

[31] A nurse practitioner or physician assistant may also provide this documentation in the patient's medical record.

[32] This tally of revisions takes account only of new versions of the Specifications Manual. Recently, the release of new versions has been followed by multiple addendums to the revision, weeks or months later, to provide further modifications and clarifications.

[33] These represent the FTEs devoted specifically to quality data collection and submission. Hospital officials noted that additional FTEs were involved in analyzing the hospital's performance on the quality measures and achieving improvements through changes in clinical process and educational efforts with the hospital's clinicians.

[34] For example, hospital officials identified several private insurers that assess quality based on patient outcomes derived from administrative data, such as hospital billing data.
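Under one plausible reading of the heart attack sampling rule described in footnote 22, the minimum number of patient records a hospital must submit can be sketched as follows. The thresholds (78-patient minimum, 20 percent rate, 311-patient maximum) come from the footnote; the function name and the decision to round the 20 percent figure up are our assumptions, not CMS specifications:

```python
import math

# Illustrative sketch of footnote 22's sampling rule for heart attack
# patients. Rounding behavior is an assumption; CMS's actual rules may
# differ in detail and vary by medical condition.
def required_sample_size(eligible_patients: int) -> int:
    if eligible_patients <= 78:
        # At or below the threshold, data are submitted for all patients.
        return eligible_patients
    # Apply the 20 percent sampling rate, but never go below the
    # 78-patient minimum or above the 311-patient maximum.
    sampled = math.ceil(0.20 * eligible_patients)
    return min(max(sampled, 78), 311)
```

For example, a hospital with 1,000 eligible patients would need a sample of 200 (20 percent), while one with 5,000 would stop at the 311-record cap.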
[35] For example, one case study hospital began several years ago to use an IT system to record nursing notes. Hospital officials told us that they planned to initiate a pilot test of a new component of that IT system that would provide computerized physician order entry (CPOE) beginning in October 2006. The officials reported that they would assess their experience with the pilot test in one hospital unit before deciding how quickly to expand it to the rest of the hospital. They said that they were planning ultimately to store all patient medical records in electronic form but that there was no fixed timeline for that objective. The timing would depend, they said, on the success of their CPOE pilot test.

[36] Some of the data vendors captured values for certain data elements from the hospital's billing data, such as the patient's birth date and discharge status, and entered those values in the abstraction form that they provided to the hospital for that patient. The abstractors were supposed to make sure those entries were consistent with the information found in the patient's medical record.

[37] The following standards apply in these areas: the National Council for Prescription Drug Programs (NCPDP) standard for drug ordering, Digital Imaging and Communications in Medicine (DICOM) for radiological and other images, Laboratory Logical Observation Identifier Name Codes (LOINC) for clinical laboratory results, and Systematized Nomenclature of Medicine Clinical Terms (SNOMED-CT) for clinical terminology.

[38] This congruence is one component within the broader initiative announced by the President to promote the adoption of interoperable electronic health records.

[39] They were MedStar Health hospital system in Baltimore, New York Presbyterian Hospital in New York, Vanderbilt University Medical Center in Nashville, and Wishard Memorial Hospital in Indianapolis.
All had volunteered to participate in a demonstration project called Connecting for Health sponsored by the independent, nonprofit eHealth Initiative. See Colorado Foundation for Medical Care, Analysis of Data from the "Healthcare Collaborative Network" (HCN) Project, CMS Special Study SS-CO-08, Final Report (Denver, Colo., Mar. 31, 2005).

[40] Most notably, the hospitals used the messaging standard for transmitting clinical and administrative data--HL7--in different ways, including their coding for such data fields as admission source and discharge disposition.

[41] The results of the Hospital Data Collection Consolidated Healthcare Informatics Adaptation Project were summarized in an internal CMS memo dated March 9, 2006. The CHI initiative is a collaborative agreement among federal agencies to adopt a common set of health information interoperability standards encompassing a wide range of clinical domains, including the data standards referred to in footnote 37. It is a component of the Federal Health Architecture, which is a partnership of approximately 20 federal agencies that use health IT.

[42] See GAO, Health Information Technology: HHS Is Continuing Efforts to Define Its National Strategy, GAO-06-1071T (Washington, D.C.: Sept. 1, 2006), 17-18. Other objectives that are in the strategic framework, and that ONC has initiated specific activities to address, include encouraging widespread adoption of data standards, promoting consumer use of personal health information, and expanding health information support in disasters and crises.

[43] Biosurveillance generally refers to the automated monitoring of information sources of potential value in detecting an emerging epidemic, whether naturally occurring or the result of bioterrorism.
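The interoperability problem noted in footnote 40--hospitals filling the same HL7 data field with different local codes--can be illustrated with a small normalization table. The hospital names, local codes, and common vocabulary below are entirely hypothetical, invented for illustration; they are not drawn from the demonstration project or from the HL7 standard itself:

```python
# Hypothetical local codes that two hospitals might place in an HL7
# admission-source field, mapped to one shared vocabulary. Without such
# a mapping, "ER" from one hospital and "1" from another could not be
# compared, even though both travel in syntactically valid HL7 messages.
LOCAL_TO_COMMON = {
    "hospital_a": {"ER": "emergency", "PHYS": "physician_referral"},
    "hospital_b": {"1": "emergency", "2": "physician_referral"},
}

def normalize_admission_source(hospital: str, local_code: str) -> str:
    """Translate a hospital-specific code to the shared vocabulary."""
    try:
        return LOCAL_TO_COMMON[hospital][local_code]
    except KeyError:
        # Unmapped codes are flagged rather than silently passed through.
        return "unknown"
```

The sketch shows why agreeing on message syntax alone was not enough: each participating site also had to agree on the meaning of the codes carried inside the fields.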
[44] CCHIT is a voluntary, private-sector organization set up in July 2004 by three leading health IT industry associations--the American Health Information Management Association (AHIMA), the Healthcare Information and Management Systems Society (HIMSS), and the National Alliance for Health Information Technology (Alliance)--to certify health IT products.

[45] As discussed at the AHIC Meeting, Washington, D.C., October 31, 2006. These discussions resulted in a set of recommendations that the workgroup presented at AHIC's Meeting on March 13, 2007.

[46] AHIC has identified priority areas involving consumer empowerment, biosurveillance, electronic health records, and chronic care.

[47] GAO, Health Information Technology: HHS Is Taking Steps to Develop a National Strategy, GAO-05-628 (Washington, D.C.: May 27, 2005), 3; GAO-06-1071T, 18.

[48] Some critical access hospitals submit quality data to CMS voluntarily, but this does not affect their Medicare payments.

[49] P.G. Shekelle, S.C. Morton, and E.B. Keeler, Costs and Benefits of Health Information Technology, Evidence Report/Technology Assessment No. 132 (prepared by the Southern California Evidence-based Practice Center under Contract No. 290-02-0003), Agency for Healthcare Research and Quality Publication No. 06-E006 (Rockville, Md., April 2006).

[50] QIOs are independent organizations that work under contract to CMS to monitor quality of care for the Medicare program within a given state and help providers to improve their clinical practices. CMS has assigned primary responsibility to the QIOs to inform hospitals about the APU program's requirements and to provide technical assistance to hospitals in meeting those requirements.

[51] Any ties on median values, which were possible if the list had an even number of hospitals, were resolved by implementing a decision rule that alternated between taking the higher number of cases for the first instance, the lower number for the second instance, and so on.
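The tie-breaking rule in footnote 51 can be sketched as follows. The function name and the use of a caller-supplied flag are our own framing of the rule; the footnote itself only describes the alternation between the higher and lower of the two middle values:

```python
# Sketch of footnote 51's decision rule: for an even-length list the
# median falls between two case counts, and successive ties alternate
# between taking the higher and the lower of the two middle values.
def median_cases(case_counts: list, take_higher: bool) -> int:
    ordered = sorted(case_counts)
    n = len(ordered)
    if n % 2 == 1:
        return ordered[n // 2]  # odd length: a true median, no tie
    lower, higher = ordered[n // 2 - 1], ordered[n // 2]
    return higher if take_higher else lower
```

A caller would flip take_higher after each even-length list it processed, so the first tie resolves to the higher case count, the second to the lower, and so on.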
[52] For example, a few hospitals on these lists had submitted results for only one condition.

[53] D. Garets and M. Davis, "Electronic Medical Records vs. Electronic Health Records: Yes, There Is a Difference" (Chicago, Ill.: HIMSS Analytics, LLC, updated Jan. 26, 2006).

[54] These included the final rules for Inpatient Prospective Payment System updates for fiscal years 2006 and 2007, 70 Fed. Reg. 47507 (Aug. 12, 2005) and 71 Fed. Reg. 48166 (Aug. 18, 2006), and the "Impact file for IPPS FY 2006 Final Rule" and "Impact file for IPPS FY 2007 Final Rule" downloaded from http://www.cms.hhs.gov/AcuteInpatientPPS/FFD/list.asp#TopOfPage on October 12, 2006.

GAO's Mission:

The Government Accountability Office, the audit, evaluation and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's Web site (www.gao.gov). Each weekday, GAO posts newly released reports, testimony, and correspondence on its Web site. To have GAO e-mail you a list of newly posted products every afternoon, go to www.gao.gov and select "Subscribe to Updates."

Order by Mail or Phone:

The first copy of each printed report is free. Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and MasterCard. Orders for 100 or more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:

U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548

To Order by Phone:

Voice: (202) 512-6000
TDD: (202) 512-2537
Fax: (202) 512-6061

To Report Fraud, Waste, and Abuse in Federal Programs:

Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470

Congressional Relations:

Gloria Jarmon, Managing Director, JarmonG@gao.gov, (202) 512-4400
U.S. Government Accountability Office, 441 G Street NW, Room 7125, Washington, D.C. 20548

Public Affairs:

Paul Anderson, Managing Director, AndersonP1@gao.gov, (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, D.C. 20548