This is the accessible text file for GAO report number GAO-12-418 entitled 'Medical Devices: FDA Has Met Most Performance Goals but Device Reviews Are Taking Longer' which was released on March 29, 2012. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: Report to Congressional Requesters: February 2012: Medical Devices: FDA Has Met Most Performance Goals but Device Reviews Are Taking Longer: GAO-12-418: GAO Highlights: Highlights of GAO-12-418, a report to congressional requesters. Why GAO Did This Study: The Food and Drug Administration (FDA) within the Department of Health and Human Services (HHS) is responsible for overseeing the safety and effectiveness of medical devices sold in the United States. New devices are generally subject to FDA review via the 510(k) process, which determines if a device is substantially equivalent to another legally marketed device, or the more stringent premarket approval (PMA) process, which requires evidence providing reasonable assurance that the device is safe and effective. The Medical Device User Fee and Modernization Act of 2002 (MDUFMA) authorized FDA to collect user fees from the medical device industry to support the process of reviewing device submissions. FDA also committed to performance goals that include time frames within which FDA is to take action on a proportion of medical device submissions. MDUFMA was reauthorized in 2007. Questions have been raised as to whether FDA is sufficiently meeting the performance goals and whether devices are reaching the market in a timely manner. In preparation for reauthorization, GAO was asked to (1) examine trends in FDA's 510(k) review performance from fiscal years (FY) 2003-2010, (2) examine trends in FDA's PMA review performance from FYs 2003-2010, and (3) describe stakeholder issues with FDA's review processes and steps FDA is taking that may address these issues. To do this work, GAO examined FDA medical device review data, reviewed FDA user fee data, interviewed FDA staff regarding the medical device review process and FDA data, and interviewed three industry groups and four consumer advocacy groups. What GAO Found: Even though FDA met all medical device performance goals for 510(k)s, the elapsed time from submission to final decision has increased substantially in recent years.
This time to final decision includes the days FDA spends reviewing a submission as well as the days FDA spends waiting for a device sponsor to submit additional information in response to a request by the agency. FDA review time excludes this waiting time, and FDA review time alone is used to determine whether the agency met its performance goals. Each fiscal year since FY 2005 (the first year that 510(k) performance goals were in place), FDA has reviewed over 90 percent of 510(k) submissions within 90 days, thus meeting the first of two 510(k) performance goals. FDA also met the second goal for all 3 fiscal years it was in place by reviewing at least 98 percent of 510(k) submissions within 150 days. Although FDA has not yet completed reviewing all of the FY 2011 submissions, the agency was exceeding both of these performance goals for those submissions on which it had taken action. Although FDA review time decreased slightly from FY 2003 through FY 2010, the time that elapsed before FDA's final decision increased substantially. Specifically, from FY 2005 through FY 2010, the average time to final decision for 510(k)s increased 61 percent, from 100 days to 161 days. FDA was inconsistent in meeting performance goals for PMA submissions. FDA designates PMAs as either original or expedited; those that FDA considers eligible for expedited review are devices intended to (a) treat or diagnose life-threatening or irreversibly debilitating conditions and (b) address an unmet medical need. While FDA met the performance goals for original PMA submissions for 4 out of 7 years the goals were in place, it met those goals for expedited PMA submissions only twice out of 7 years. FDA review time and time to final decision for both types of PMAs were highly variable but generally increased in recent years. For example, the average time to final decision for original PMAs increased from 462 days for FY 2003 to 627 days for FY 2008 (the most recent year for which complete data are available). The three industry groups and four consumer advocacy groups GAO interviewed noted a number of issues related to FDA's review of medical device submissions. The four issues most commonly raised by stakeholders were (1) insufficient communication between FDA and stakeholders throughout the review process, (2) a lack of predictability and consistency in reviews, (3) an increase in time to final decision, and (4) inadequate assurance of the safety and effectiveness of approved or cleared devices. FDA is taking steps--including issuing new guidance documents, enhancing reviewer training, and developing an electronic system for reporting adverse events--that may address many of these issues. It is important for the agency to monitor the impact of those steps in ensuring that safe and effective medical devices are reaching the market in a timely manner. In commenting on a draft of this report, HHS generally agreed with GAO's findings and noted that FDA has identified some of the same performance trends in its annual reports to Congress. HHS also called attention to the activities FDA has undertaken to improve the medical device review process. View [hyperlink, http://www.gao.gov/products/GAO-12-418]. For more information, contact Marcia Crosse at (202) 512-7114 or crossem@gao.gov.
[End of section] Contents: Letter: Background: FDA Met All Performance Goals for 510(k)s but the Time to Final Decision Has Increased Substantially in Recent Years: FDA Was Inconsistent in Meeting Performance Goals for PMAs While FDA Review Time and Time to Final Decision Generally Increased: Stakeholders Noted Issues with the Medical Device Review Process; FDA Is Taking Steps That May Address Many of These Issues: Concluding Observations: Agency Comments: Appendix I: FDA Medical Device Review Performance for Fiscal Years (FY) 2000-2011: Appendix II: Number of Full-time Equivalent (FTE) FDA Staff Supporting Medical Device User Fee Activities, FYs 2003 through 2010: Appendix III: Comments from the Department of Health and Human Services: Appendix IV: GAO Contact and Staff Acknowledgments: Tables: Table 1: FDA's 510(k) Performance Goals, FYs 2003-2011: Table 2: GAO Definitions of FDA Review Time and Time to Final Decision: Table 3: FDA's PMA Performance Goals, FYs 2003-2011: Table 4: FDA Premarket Notification (510(k)) Review Performance, FYs 2000-2011: Table 5: FDA Premarket Approval (PMA) Review Performance for Original PMAs, FYs 2000-2011: Table 6: FDA Premarket Approval (PMA) Review Performance for Expedited PMAs, FYs 2000-2011: Figures: Figure 1: Percentage of 510(k)s FDA Reviewed within 90 Days for the Fiscal Year 2000-2010 Cohorts: Figure 2: Percentage of 510(k)s FDA Reviewed within 150 Days for the Fiscal Year 2000-2010 Cohorts: Figure 3: Average FDA Review Time and Average Time to Final Decision for 510(k)s in the Fiscal Year 2000-2010 Cohorts: Figure 4: Average Number of Review Cycles Per 510(k) for the Fiscal Year 2000-2010 Cohorts: Figure 5: Percentage of 510(k) Submissions Receiving FDA First-Cycle Substantially Equivalent Decisions and First-Cycle Additional Information Requests for the Fiscal Year 2000-2010 Cohorts: Figure 6: Percentage of FDA Final Decisions That Devices Were Substantially Equivalent or Not Substantially Equivalent for 510(k) Submissions for the Fiscal Year 2000-2010 Cohorts: Figure 7: Percentage of Original PMAs FDA Reviewed within 180 Days for the Fiscal Year 2000-2010 Cohorts: Figure 8: Percentage of Original PMAs FDA Reviewed within 320 Days and 295 Days for the Fiscal Year 2000-2010 Cohorts: Figure 9: Percentage of Expedited PMAs FDA Reviewed within 180 Days for the Fiscal Year 2000-2010 Cohorts: Figure 10: Percentage of Expedited PMAs FDA Reviewed within 300 Days and 280 Days for the Fiscal Year 2000-2010 Cohorts: Figure 11: Average FDA Review Time and Average Time to Final Decision for Original PMAs in the Fiscal Year 2000-2010 Cohorts: Figure 12: Average FDA Review Time and Average Time to Final Decision for Expedited PMAs in the Fiscal Year 2000-2010 Cohorts: Abbreviations: AI: additional information: CBER: Center for Biologics Evaluation and Research: CDRH: Center for Devices and Radiological Health: FDA: Food and Drug Administration: FDAAA: Food and Drug Administration Amendments Act of 2007: FTE: full-time equivalent: FY: fiscal year: HHS: Department of Health and Human Services: IOM: Institute of Medicine: MDUFA: Medical Device User Fee Amendments of 2007: MDUFMA: Medical Device User Fee and Modernization Act of 2002: PMA: premarket approval: SOP: standard operating procedure: [End of section] United States Government Accountability Office: Washington, DC 20548: February 29, 2012: The Honorable Richard Burr: Ranking Member: Subcommittee on Children and Families: Committee on Health, Education, Labor, and Pensions: United States Senate: The 
Honorable Tom Coburn: Ranking Member: Permanent Subcommittee on Investigations: Committee on Homeland Security and Governmental Affairs: United States Senate: The Food and Drug Administration (FDA) within the Department of Health and Human Services (HHS) is responsible for overseeing the safety and effectiveness of medical devices sold in the United States.[Footnote 1] Congress passed the Medical Device User Fee and Modernization Act of 2002 (MDUFMA) to provide additional resources for FDA to support the process of reviewing medical device applications.[Footnote 2] MDUFMA authorized FDA to collect user fees from the medical device industry to supplement its annual appropriation for salaries and expenses for fiscal years (FY) 2003 through 2007.[Footnote 3] The medical device user fee program was reauthorized in 2007 as part of the Food and Drug Administration Amendments Act (FDAAA); the reauthorization was called the Medical Device User Fee Amendments of 2007 (MDUFA) and authorizes FDA to collect user fees for FYs 2008 through 2012.[Footnote 4] FDA's authority to collect user fees for medical devices expires on October 1, 2012, and the medical device user fee program will need to be reauthorized for FDA to continue to collect user fees. Medical device user fee amounts have become a larger proportion of FDA's funding for medical device review processes, rising from 10.6 percent of costs in FY 2003--the first year FDA collected medical device user fees--to 19.5 percent of costs in FY 2010, the most recent year for which data are available. In FY 2010, MDUFA user fees collected by FDA--including application, establishment, and product fees--totaled nearly $67 million, including over $29 million in application fees.[Footnote 5] Application fees are collected for a variety of medical device submission types, including premarket notifications (510(k)s) and premarket approvals (PMAs). [Footnote 6] Under each authorization of the medical device user fee program, FDA committed to performance goals related to the review of medical device submissions.[Footnote 7] The performance goals include specific time frames within which FDA is to take action on submissions.[Footnote 8] These performance goals, as well as user fee amounts, are negotiated between FDA and industry stakeholders and submitted to congressional committees prior to each reauthorization. Questions have been raised about whether FDA is sufficiently meeting the user fee performance goals and whether medical devices are reaching the market in a timely manner. A number of congressional committees have recently held hearings during which the medical device industry questioned FDA's timeliness, while other stakeholders questioned FDA's ability to ensure safety and effectiveness. In preparation for the reauthorization of the medical device user fee program, you requested that we examine FDA's medical device review process. In this report, we (1) examine trends in FDA's 510(k) medical device review performance for FYs 2003 through 2010, (2) examine trends in FDA's PMA medical device review performance for FYs 2003 through 2010, and (3) describe the issues stakeholders have raised about the medical device review processes and steps FDA is taking that may address these issues. We provide additional details on FDA's medical device review performance in appendix I. You also asked us to provide information on the number of full-time equivalent (FTE) staff involved in the medical device review process; this information is provided in appendix II. 
To determine the trends in FDA's medical device review performance for 510(k)s and PMAs for FYs 2003 through 2010, we examined data obtained from FDA on the review process for all 510(k)s and PMAs submitted to FDA in those years.[Footnote 9] To provide context for FDA's performance prior to enactment of the user fee acts, we also analyzed review data for all 510(k)s and PMAs submitted for FYs 2000 through 2002. Additionally, we reviewed data on FY 2011 submissions in order to provide preliminary performance results for that year.[Footnote 10] Our analyses focused on the proportion of medical device submissions in each fiscal year for which FDA met or did not meet the applicable performance goal(s); the total time from the date of submission to the date a final decision was made--including both the time FDA spent reviewing a submission and any time the sponsor took to respond to questions or requests for additional information from FDA; the FDA review time (i.e., the time counted toward user fee performance goals, which does not include any time the sponsor took to respond to any questions from FDA); and the average number of review cycles prior to approval.[Footnote 11] We also reviewed publicly available FDA user fee data for FYs 2003 through 2010 and interviewed FDA staff regarding the medical device review process and the data we received from FDA. To describe the issues stakeholders have raised about the device review processes and what steps FDA is taking that may address these issues, we reviewed congressional testimony and interviewed three industry groups and four consumer advocacy groups.[Footnote 12] All of these groups have participated in at least half of the meetings held by FDA to discuss the reauthorization of the user fee program. Furthermore, the industry groups we interviewed represent a mixture of large and small medical device manufacturers and cover a significant portion of the device market. We performed content analyses of the interviews to determine the most pressing issues based on how often each issue was raised.[Footnote 13] We conducted this performance audit from October 2011 through February 2012 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Background: Medical devices are reviewed primarily by FDA's Center for Devices and Radiological Health (CDRH), with a smaller proportion reviewed by the Center for Biologics Evaluation and Research (CBER). FDA classifies each device type into one of three classes--class I, II, or III--based on the level of risk it poses and the controls necessary to reasonably ensure its safety and effectiveness.[Footnote 14] Class I includes devices with the lowest risk (e.g., tongue depressors, reading glasses, forceps), while class III includes devices with the highest risk (e.g., breast implants, coronary stents). Almost all class I devices and some class II devices (e.g., mercury thermometers, certain adjustable hospital beds) are exempt from premarket notification requirements.
Most class III device types are required to obtain FDA approval through the PMA process, the most stringent of FDA's medical device review processes.[Footnote 15] The remaining device types are required to obtain FDA clearance or approval through either the 510(k) or PMA processes.[Footnote 16] If a device is eligible, the manufacturer files a 510(k) to seek a determination that the new device is substantially equivalent to a legally marketed device known as a predicate device.[Footnote 17] In order to be deemed substantially equivalent (i.e., cleared by FDA for marketing), a new device must have the same technological characteristics and intended use as the predicate device, or have the same intended use and different technological characteristics but still be demonstrated to be as safe and effective as the predicate device without raising new questions of safety and effectiveness. Most device submissions filed each year are 510(k)s. For example, of the more than 13,600 device submissions received by FDA in FYs 2008 through 2010, 88 percent were 510(k)s.[Footnote 18] The medical device performance goals were phased in during the period covered by MDUFMA (the FYs 2003 through 2007 cohorts) and were updated for MDUFA.[Footnote 19] Under MDUFA, FDA's goal is to complete the review process for 90 percent of the 510(k)s in a cohort within 90 days of submission (known as the Tier 1 goal) and to complete the review process for 98 percent of the cohort within 150 days (the Tier 2 goal).[Footnote 20] (See table 1 for the 510(k) performance goals for the FYs 2003 through 2011 cohorts.) FDA may take any of the following actions on a 510(k) after completing its review: * issue an order declaring the device substantially equivalent; * issue an order declaring the device not substantially equivalent; or: * advise the submitter that the 510(k) is not required (i.e., the product is not regulated as a device or the device is exempt from premarket notification requirements). Each of these actions ends the review process for a submission.[Footnote 21] A sponsor's withdrawal of a submission also ends the review process. Table 1: FDA's 510(k) Performance Goals, FYs 2003-2011: Fiscal year cohort: Tier 1 goal percentage[B]; Period covered by MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005: 75%[C]; 2006: 75%[C]; 2007: 80%[C]; Period covered by MDUFA[A]: 2008: 90%; 2009: 90%; 2010: 90%; 2011: 90%. Fiscal year cohort: Tier 2 goal percentage[D]; Period covered by MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005: [Empty]; 2006: [Empty]; 2007: [Empty]; Period covered by MDUFA[A]: 2008: 98%; 2009: 98%; 2010: 98%; 2011: 98%. Source: GAO analysis of FDA data. Notes: A review cohort includes all the medical device submissions relating to a particular performance goal that were submitted in a given fiscal year. For example, all 510(k)s received by FDA from October 1, 2010, to September 30, 2011, make up the 510(k) review cohort for FY 2011. There were no 510(k) performance goals prior to MDUFMA. Fiscal years for which there was no corresponding 510(k) performance goal are denoted with [Empty]. [A] MDUFA performance goals cover the FYs 2008 through 2012 cohorts; we are showing only those cohorts we examined as part of our analysis. [B] Percentage of 510(k) submissions to be completed by FDA within 90 days of submission. [C] These were not designated as Tier 1 goals prior to FY 2008 because there were no Tier 2 goals for those cohorts.
We have aligned them with the Tier 1 goals for FYs 2008 through 2011 because they are based on the same 90-day time frame and this placement illustrates the gradual increase in the goal percentage over time. [D] Percentage of 510(k) submissions to be completed by FDA within 150 days of submission. [End of table] Alternatively, FDA may "stop the clock" on a 510(k) review by sending a letter asking the sponsor to submit additional information (known as an AI letter). This completes a review cycle but does not end the review process. The clock will resume (and a new review cycle will begin) when FDA receives a response from the sponsor. As a result, FDA may meet its 510(k) performance goals even if the time to final decision (FDA review time plus time spent waiting for the sponsor to respond to FDA's requests for additional information) is longer than the time frame allotted for the performance goal. For example, a sponsor might have submitted a 510(k) on March 1, 2009, to start the review process. If FDA sent an AI letter on April 1, 2009 (after 31 days on the clock), the sponsor provided a response on June 1, 2009 (after an additional 61 days off the clock), and FDA issued a final decision on June 11, 2009 (10 more days on the clock), then the FDA review time counted toward the MDUFA performance goals would be 41 days (FDA's on-the-clock time). FDA would have met both the Tier 1 (90-day) and Tier 2 (150-day) time frames for that device even though the total number of calendar days (on- and off-the-clock) from beginning the review to a final decision was 102 days. (See table 2 for a comparison of FDA review time and time to final decision.) FDA tracks the time to final decision and reports on it in the agency's annual reports to Congress on the medical device user fee program.[Footnote 22] Table 2: GAO Definitions of FDA Review Time and Time to Final Decision: Term: FDA review time; Definition: The time used to determine whether FDA met the medical device user fee performance goals. Determined by counting the on-the-clock time that FDA spends reviewing a submission during all review cycles; it does not include any time that FDA spends between review cycles waiting for the sponsor to submit additional information. Term: Time to final decision; Definition: The total elapsed time from the date of submission through the date of FDA's final decision. Determined by adding on-the-clock time (FDA review time) for all review cycles and any off-the-clock time that FDA spends between review cycles waiting for the sponsor to submit additional information. Source: GAO. [End of table] A PMA is filed when a device is not substantially equivalent to a predicate device or has been classified as a class III PMA device (when the risks associated with the device are considerable). The PMA review process is the most stringent type of medical device review process required by FDA, and user fees are much higher for PMAs than for 510(k)s.[Footnote 23] PMAs are designated as either original or expedited.[Footnote 24] FDA considers a device eligible for expedited review if it is intended to (a) treat or diagnose a life-threatening or irreversibly debilitating disease or condition and (b) address an unmet medical need.[Footnote 25] FDA assesses all medical device submissions to determine which are appropriate for expedited review, regardless of whether a company has identified its device as a potential candidate for this program.
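The same on- and off-the-clock accounting defined in table 2 governs PMA reviews as well. As a rough illustration, the short Python sketch below--our own, purely illustrative code, not FDA's or GAO's methodology--reproduces the 510(k) example above: 41 days of FDA review time versus 102 calendar days to final decision.

from datetime import date

def fda_review_time(cycles):
    # On-the-clock days: the sum of each review cycle's length.
    return sum((end - start).days for start, end in cycles)

def time_to_final_decision(cycles):
    # Calendar days from initial submission through the final decision,
    # including off-the-clock time spent waiting on the sponsor.
    return (cycles[-1][1] - cycles[0][0]).days

# Cycle 1: submission (March 1, 2009) to AI letter (April 1, 2009).
# Cycle 2: sponsor's response (June 1, 2009) to decision (June 11, 2009).
cycles = [(date(2009, 3, 1), date(2009, 4, 1)),
          (date(2009, 6, 1), date(2009, 6, 11))]

print(fda_review_time(cycles))         # 41 days counted toward the goals
print(time_to_final_decision(cycles))  # 102 calendar days to final decision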
To meet the MDUFA goals, FDA must complete its review of 60 percent of the original PMAs in a cohort within 180 days of submission (Tier 1) and 90 percent within 295 days (Tier 2). For expedited PMAs, 50 percent of a cohort must be completed within 180 days (Tier 1) and 90 percent within 280 days (Tier 2). (See table 3 for the PMA performance goals for the FYs 2003 through 2011 cohorts.) The various actions FDA may take during its review of a PMA are the following: * approval order; * approvable letter; * major deficiency letter; * not approvable letter; and: * denial order.[Footnote 26] The major deficiency letter is the only one of these actions that does not end the review process for purposes of determining whether FDA met the MDUFA performance goal time frame for a given submission. As with the AI letter in a 510(k) review, FDA can stop the clock during the PMA review process by sending a major deficiency letter (ending a review cycle) and resume it later upon receiving a response from the manufacturer. In contrast, taking one of the other four actions permanently stops the clock, meaning any further review that occurs is excluded from the calculation of FDA review time. In addition, the approval order and denial order are also considered final decisions and end FDA's review of a PMA completely. A sponsor's withdrawal of a submission also ends the review process. Table 3: FDA's PMA Performance Goals, FYs 2003-2011: Fiscal year cohort: Original PMA Tier 1 goal percentage[C]; Period covered by MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005[B]: [Empty]; 2006[B]: [Empty]; 2007[B]: 50%; Period covered by MDUFA[A]: 2008: 60%; 2009: 60%; 2010: 60%; 2011: 60%. Fiscal year cohort: Original PMA Tier 2 goal percentage[D]; Period covered by MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005[B]: [Empty]; 2006[B]: 80%; 2007[B]: 90%; Period covered by MDUFA[A]: 2008: 90%; 2009: 90%; 2010: 90%; 2011: 90%. Fiscal year cohort: Expedited PMA Tier 1 goal percentage[E]; Period covered by MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005[B]: [Empty]; 2006[B]: [Empty]; 2007[B]: [Empty]; Period covered by MDUFA[A]: 2008: 50%; 2009: 50%; 2010: 50%; 2011: 50%. Fiscal year cohort: Expedited PMA Tier 2 goal percentage[F]; Period covered by MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005[B]: 70%; 2006[B]: 80%; 2007[B]: 90%; Period covered by MDUFA[A]: 2008: 90%; 2009: 90%; 2010: 90%; 2011: 90%. Source: GAO analysis of FDA data. Notes: A review cohort includes all the medical device submissions relating to a particular performance goal that were submitted in a given fiscal year. For example, all PMAs received by FDA from October 1, 2010, to September 30, 2011, make up the PMA review cohort for FY 2011. There were no performance goals prior to MDUFMA. Fiscal years for which there was no corresponding PMA performance goal are denoted with [Empty]. [A] MDUFA performance goals cover the FYs 2008 through 2012 cohorts; we are showing only those cohorts we examined as part of our analysis. [B] PMA performance goals were not designated as Tier 1 or Tier 2 until FY 2008. We have aligned the performance goals in place prior to FY 2008 with the Tier 1 or Tier 2 goals for FYs 2008 through 2011 based on sharing the same or similar goal time frames. This placement illustrates the increase in the goal percentage over time. [C] Percentage of original PMAs to be completed by FDA within 180 days of submission. 
[D] Percentage of original PMAs to be completed by FDA within 320 days (for FYs 2006 through 2007) or 295 days (FYs 2008 through 2011) of submission. [E] Percentage of expedited PMAs to be completed by FDA within 180 days of submission. [F] Percentage of expedited PMAs to be completed by FDA within 300 days (for FYs 2005 through 2007) or 280 days (FYs 2008 through 2011) of submission. [End of table] FDA's review of medical device submissions has been discussed in recent congressional hearings, meetings between FDA and stakeholders about the medical device user fee program reauthorization, and published reports. In addition, in August 2010, FDA released reports which described the results of two internal assessments conducted by FDA of its medical device review programs.[Footnote 27] In January 2011, FDA released a plan of action that included 25 steps FDA intends to take to address the issues identified in these assessments.[Footnote 28] FDA Met All Performance Goals for 510(k)s but the Time to Final Decision Has Increased Substantially in Recent Years: For FYs 2003 through 2010, FDA met all Tier 1 and Tier 2 performance goals for 510(k)s. In addition, FDA review time for 510(k)s decreased slightly during this period, but time to final decision increased substantially. The average number of review cycles and FDA's requests for additional information for 510(k) submissions also increased during this period. FDA Met All Tier 1 and Tier 2 Performance Goals for 510(k)s from 2003 through 2010: FDA met all Tier 1 performance goals for the completed 510(k) cohorts that had Tier 1 goals in place.[Footnote 29] The percentage of 510(k)s reviewed within 90 days (the current Tier 1 goal time frame) exceeded 90 percent for the FYs 2005 through 2010 cohorts (see figure 1.) Although the 510(k) cohort for FY 2011 was still incomplete at the time we received FDA's data, FDA was exceeding the Tier 1 goal for those submissions on which it had taken action.[Footnote 30] FDA's performance varied for 510(k) cohorts prior to the years that the Tier 1 goals were in place but was always below the current 90 percent goal. Figure 1: Percentage of 510(k)s FDA Reviewed within 90 Days for the Fiscal Year 2000-2010 Cohorts: [Refer to PDF for image: vertical bar graph] Fiscal year 2000; Complete cohort where goal was not in place (shown for context): 79.9%. Fiscal year 2001; Complete cohort where goal was not in place (shown for context): 76.4%. Fiscal year 2002; Complete cohort where goal was not in place (shown for context): 78.1%. Fiscal year 2003; Complete cohort where goal was not in place (shown for context): 76.9%. Fiscal year 2004; Complete cohort where goal was not in place (shown for context): 84.1%. Fiscal year 2005; Complete cohort where goal was in place: 91.1%; 510(k) Tier 1 performance goal: 75%. Fiscal year 2006; Complete cohort where goal was in place: 91.2%; 510(k) Tier 1 performance goal: 75%. Fiscal year 2007; Complete cohort where goal was in place: 90.6%; 510(k) Tier 1 performance goal: 80%. Fiscal year 2008; Complete cohort where goal was in place: 93.6%; 510(k) Tier 1 performance goal: 90%. Fiscal year 2009; Complete cohort where goal was in place: 90.1%; 510(k) Tier 1 performance goal: 90%. Fiscal year 2010; Complete cohort where goal was in place: 91.6%; 510(k) Tier 1 performance goal: 90%. For FY 2005-2006, the Tier 1 performance goal was to have 75 percent of 510(k)s reviewed within 90 days.
For FY 2007, the Tier 1 performance goal was to have 80 percent of 510(k)s reviewed within 90 days. For FY 2008-2010, the Tier 1 performance goal was to have 90 percent of 510(k)s reviewed within 90 days. Source: GAO analysis of FDA data. Note: Only 510(k)s that had received a final decision from FDA were included in this analysis. Tier 1 and Tier 2 designations refer to the length of time allotted (90 days and 150 days, respectively) for FDA to complete its review of 510(k) submissions. If FDA completed its review of a 510(k) submission in 90 days or less, it met the time frames for both the Tier 1 and Tier 2 goals. If the review was completed in more than 90 days but not more than 150 days, only the time frame for the Tier 2 goal was met. If the review took longer than 150 days, FDA did not meet the time frame for either goal. FDA did not designate 510(k) performance goals as either Tier 1 or Tier 2 prior to FY 2008. We have aligned the performance goals in place prior to FY 2008 with the Tier 1 goals for FYs 2008 through 2011 based on sharing the same 90-day time frame. This placement illustrates the increase in the goal percentage over time. [End of figure] FDA met the Tier 2 goals for all three of the completed cohorts that had Tier 2 goals in place. Specifically, FDA met the goal of reviewing 98 percent of submissions within 150 days for the FYs 2008, 2009, and 2010 cohorts (see figure 2.) Additionally, although the 510(k) cohort for FY 2011 was still incomplete at the time we received FDA's data, FDA was exceeding the Tier 2 goal for those submissions on which it had taken action.[Footnote 31] FDA's performance for 510(k) cohorts prior to the years that the Tier 2 goals were in place was generally below the current 98 percent goal. Figure 2: Percentage of 510(k)s FDA Reviewed within 150 Days for the Fiscal Year 2000-2010 Cohorts: [Refer to PDF for image: vertical bar graph] Fiscal year 2000; Complete cohort where goal was not in place (shown for context): 89.5%. Fiscal year 2001; Complete cohort where goal was not in place (shown for context): 87.9%. Fiscal year 2002; Complete cohort where goal was not in place (shown for context): 88.5%. Fiscal year 2003; Complete cohort where goal was not in place (shown for context): 87.9%. Fiscal year 2004; Complete cohort where goal was not in place (shown for context): 95.3%. Fiscal year 2005; Complete cohort where goal was not in place (shown for context): 97.6%. Fiscal year 2006; Complete cohort where goal was not in place (shown for context): 97.1%. Fiscal year 2007; Complete cohort where goal was not in place (shown for context): 96.7%. Fiscal year 2008; Complete cohort where goal was in place: 98.4%; 510(k) Tier 2 performance goal: 98%. Fiscal year 2009; Complete cohort where goal was in place: 97.6%; 510(k) Tier 2 performance goal: 98%. Fiscal year 2010; Complete cohort where goal was in place: 98.5%; 510(k) Tier 2 performance goal: 98%. For FY 2008-2010, the Tier 2 performance goal was to have 98 percent of 510(k)s reviewed within 150 days. Source: GAO analysis of FDA data. Note: Only 510(k)s that had received a final decision from FDA were included in this analysis. For purposes of determining whether a goal was met, the percentage is rounded to the nearest whole number. Tier 1 and Tier 2 designations refer to the length of time allotted (90 days and 150 days, respectively) for FDA to complete its review of 510(k) submissions.
If FDA completed its review of a submission in 90 days or less, it met the time frames for both the Tier 1 and Tier 2 goals. If the review was completed in more than 90 days but not more than 150 days, only the time frame for the Tier 2 goal was met. If the review took longer than 150 days, FDA did not meet the time frame for either goal. FDA did not designate 510(k) performance goals as either Tier 1 or Tier 2 prior to FY 2008. [End of figure] FDA Review Time for 510(k)s Decreased Slightly from 2003 to 2010 but Time to Final Decision Increased Substantially: While the average FDA review time for 510(k) submissions decreased slightly from the FY 2003 cohort to the FY 2010 cohort, the time to final decision increased substantially. Specifically, the average number of days FDA spent on the clock reviewing a 510(k) varied somewhat but overall showed a small decrease from 75 days for the FY 2003 cohort to 71 days for the FY 2010 cohort (see figure 3). However, when we added off-the-clock time (where FDA waited for the sponsor to provide additional information) to FDA's on-the-clock review time, the resulting time to final decision decreased slightly from the FY 2003 cohort to the FY 2005 cohort before increasing 61 percent--from 100 days to 161 days--from the FY 2005 cohort through the FY 2010 cohort. FDA officials told us that the only alternative to requesting additional information is for FDA to reject the submission. The officials stated that as a result of affording sponsors this opportunity to respond, the time to final decision is longer but the application has the opportunity to be approved. Figure 3: Average FDA Review Time and Average Time to Final Decision for 510(k)s in the Fiscal Year 2000-2010 Cohorts: [Refer to PDF for image: multiple line graph] Fiscal year: 2000; Average time to final decision: 105.1 days; Average FDA review time: 74.1 days. Fiscal year: 2001; Average time to final decision: 109.3 days; Average FDA review time: 79 days. Fiscal year: 2002; Average time to final decision: 109 days; Average FDA review time: 76.9 days. Fiscal year: 2003; Average time to final decision: 111.2 days; Average FDA review time: 75 days. Fiscal year: 2004; Average time to final decision: 99.2 days; Average FDA review time: 62.9 days. Fiscal year: 2005; Average time to final decision: 100.4 days; Average FDA review time: 56.1 days. Fiscal year: 2006; Average time to final decision: 113.4 days; Average FDA review time: 60.5 days. Fiscal year: 2007; Average time to final decision: 131.8 days; Average FDA review time: 65.6 days. Fiscal year: 2008; Average time to final decision: 137.2 days; Average FDA review time: 65.5 days. Fiscal year: 2009; Average time to final decision: 158 days; Average FDA review time: 72.6 days. Fiscal year: 2010; Average time to final decision: 160.5 days; Average FDA review time: 70.8 days. Source: GAO analysis of FDA data. Note: Only 510(k)s that had received a final decision from FDA were included in this analysis. Average FDA review time refers to the time that FDA spends reviewing a submission and therefore excludes any time the sponsor may spend responding to FDA requests for additional information. Average time to final decision includes both the time FDA spends reviewing a submission and the time the sponsor may spend responding to any requests for additional information.
[End of figure] Additionally, although the 510(k) cohort for FY 2011 was still incomplete at the time we received FDA's data, the average FDA review time and time to final decision were lower in FY 2011 for those submissions on which it had taken action.[Footnote 32] Number of Review Cycles and Requests for Additional Information Increased for 510(k) Submissions from 2003 to 2010: The average number of review cycles per 510(k) increased substantially (39 percent) from FYs 2003 through 2010, rising from 1.47 cycles for the FY 2003 cohort to 2.04 cycles for the FY 2010 cohort (see figure 4). Figure 4: Average Number of Review Cycles Per 510(k) for the Fiscal Year 2000-2010 Cohorts: [Refer to PDF for image: line graph] Fiscal year: 2000; Average number of review cycles per submission: 1.38. Fiscal year: 2001; Average number of review cycles per submission: 1.43. Fiscal year: 2002; Average number of review cycles per submission: 1.39. Fiscal year: 2003; Average number of review cycles per submission: 1.47. Fiscal year: 2004; Average number of review cycles per submission: 1.53. Fiscal year: 2005; Average number of review cycles per submission: 1.61. Fiscal year: 2006; Average number of review cycles per submission: 1.69. Fiscal year: 2007; Average number of review cycles per submission: 1.78. Fiscal year: 2008; Average number of review cycles per submission: 1.83. Fiscal year: 2009; Average number of review cycles per submission: 1.94. Fiscal year: 2010; Average number of review cycles per submission: 2.04. Source: GAO analysis of FDA data. Note: Cycles that were currently in progress at the time we received FDA's data were included in this analysis. [End of figure] In addition, the percentage of 510(k)s receiving a first-cycle decision of substantially equivalent (i.e., cleared by FDA for marketing) decreased from 54 percent for the FY 2003 cohort to 20 percent for the FY 2010 cohort, while the percentage receiving first-cycle AI requests exhibited a corresponding increase. (See figure 5.) The average number of 510(k) submissions per year remained generally steady during this period. Although the 510(k) cohort for FY 2011 was still incomplete at the time we received FDA's data, of the first-cycle reviews that had been completed, the percentage of submissions receiving a first-cycle decision of substantially equivalent was slightly higher than for the FY 2010 cohort (21.2 percent in FY 2011 compared with 20.0 percent in FY 2010).[Footnote 33] In addition, the percentage receiving a first-cycle AI request was lower (68.2 percent for FY 2011 compared with 77.0 percent for FY 2010).[Footnote 34] Figure 5: Percentage of 510(k) Submissions Receiving FDA First-Cycle Substantially Equivalent Decisions and First-Cycle Additional Information Requests for the Fiscal Year 2000-2010 Cohorts: [Refer to PDF for image: multiple line graph] Fiscal year: 2000; Substantially equivalent: 54.1%; Additional information requests: 37%. Fiscal year: 2001; Substantially equivalent: 53.8%; Additional information requests: 38.6%. Fiscal year: 2002; Substantially equivalent: 55.7%; Additional information requests: 35.9%. Fiscal year: 2003; Substantially equivalent: 54%; Additional information requests: 40.1%. Fiscal year: 2004; Substantially equivalent: 52.1%; Additional information requests: 43.5%. Fiscal year: 2005; Substantially equivalent: 47.7%; Additional information requests: 49.7%. Fiscal year: 2006; Substantially equivalent: 42.2%; Additional information requests: 55.6%.
Fiscal year: 2007; Substantially equivalent: 36.6%; Additional information requests: 61.3%. Fiscal year: 2008; Substantially equivalent: 33.3%; Additional information requests: 64.5%. Fiscal year: 2009; Substantially equivalent: 26.2%; Additional information requests: 71.6%. Fiscal year: 2010; Substantially equivalent: 20%; Additional information requests: 77%. Source: GAO analysis of FDA data. Notes: Only 510(k)s that had received a first-cycle decision from FDA were included in this analysis. The percentages for each year do not add to 100 percent because there are other possible actions classified as first-cycle decisions (e.g., a sponsor's withdrawal of a submission). The first review cycle starts when FDA receives a submission and ends when FDA either makes a decision regarding substantial equivalence or requests additional information from the sponsor, or the sponsor withdraws the submission. More than one cycle may occur before FDA reaches its final decision. [End of figure] The percentage of 510(k)s that received a final decision of substantially equivalent also decreased in recent years--from a high of 87.9 percent for the FY 2005 cohort down to 75.1 percent for the FY 2010 cohort. The percentage of 510(k)s receiving a final decision of not substantially equivalent generally increased from FYs 2003 through 2010, rising from 2.9 percent for the FY 2003 cohort to 6.4 percent for the FY 2010 cohort. (See figure 6.) Figure 6: Percentage of FDA Final Decisions That Devices Were Substantially Equivalent or Not Substantially Equivalent for 510(k) Submissions for the Fiscal Year 2000-2010 Cohorts: [Refer to PDF for image: multiple line graph] Fiscal year: 2000; Substantially equivalent: 81.4%; Not substantially equivalent: 1.1%. Fiscal year: 2001; Substantially equivalent: 83.5%; Not substantially equivalent: 1.9%. Fiscal year: 2002; Substantially equivalent: 83%; Not substantially equivalent: 1.9%. Fiscal year: 2003; Substantially equivalent: 85.6%; Not substantially equivalent: 2.9%. Fiscal year: 2004; Substantially equivalent: 87.6%; Not substantially equivalent: 3.5%. Fiscal year: 2005; Substantially equivalent: 87.9%; Not substantially equivalent: 3.7%. Fiscal year: 2006; Substantially equivalent: 85.7%; Not substantially equivalent: 3.7%. Fiscal year: 2007; Substantially equivalent: 83.5%; Not substantially equivalent: 3.7%. Fiscal year: 2008; Substantially equivalent: 81%; Not substantially equivalent: 3.8%. Fiscal year: 2009; Substantially equivalent: 77.3%; Not substantially equivalent: 5.5%. Fiscal year: 2010; Substantially equivalent: 75.1%; Not substantially equivalent: 6.4%. Source: GAO analysis of FDA data. Notes: Only 510(k)s that had received a final decision from FDA were included in this analysis. The percentages for each year do not add to 100 percent because there are other possible actions classified as final decisions (e.g., a sponsor's withdrawal of a submission). [End of figure] FDA Was Inconsistent in Meeting Performance Goals for PMAs While FDA Review Time and Time to Final Decision Generally Increased: For FYs 2003 through 2010, FDA met most of the goals for original PMAs but fell short on most of the goals for expedited PMAs. In addition, FDA review time and time to final decision for both types of PMAs generally increased during this period. Finally, the average number of review cycles increased for certain PMAs while the percentage of PMAs approved after one review cycle generally decreased.
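Before turning to the PMA results, the tiered-goal arithmetic that recurs in the figure notes can be summarized briefly: a review completed within the Tier 1 time frame counts toward both tiers, a review completed within only the Tier 2 time frame counts toward Tier 2 alone, and a longer review counts toward neither. The Python sketch below is ours and purely illustrative--not FDA's methodology. It uses the 510(k) time frames (90 and 150 days) and MDUFA goal percentages (90 and 98 percent) from the text, applied to a hypothetical cohort of on-the-clock review times.

def cohort_performance(review_days, tier1_days=90, tier2_days=150):
    # Percentage of a cohort's reviews completed within each tier's time
    # frame; a review within the Tier 1 time frame also counts toward Tier 2.
    n = len(review_days)
    tier1_pct = 100 * sum(d <= tier1_days for d in review_days) / n
    tier2_pct = 100 * sum(d <= tier2_days for d in review_days) / n
    return tier1_pct, tier2_pct

# Hypothetical cohort of on-the-clock FDA review times, in days.
cohort = [45, 60, 75, 88, 89, 90, 91, 102, 149, 151]
tier1_pct, tier2_pct = cohort_performance(cohort)

# Percentages are rounded to the nearest whole number before being
# compared with the 90 percent (Tier 1) and 98 percent (Tier 2) goals.
print(round(tier1_pct) >= 90)  # Tier 1 goal met? False (60 percent)
print(round(tier2_pct) >= 98)  # Tier 2 goal met? False (90 percent)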
FDA Met Most Goals for Original PMAs but Fell Short of Most Goals for Expedited PMAs from 2003 through 2010: Since FY 2003, FDA met the original PMA performance goals for four of the seven completed cohorts that had goals in place, but met the goals for only two of the seven expedited PMA cohorts with goals.[Footnote 35] Specifically, FDA met its Tier 1 performance goals for original PMAs for all three of the completed original PMA cohorts that had such goals in place, with the percentage increasing from 56.8 percent of the FY 2007 cohort to 80.0 percent of the FY 2009 cohort completed on time.[Footnote 36] (See figure 7.) While the FY 2010 and 2011 cohorts are still incomplete, FDA is exceeding the goals for those submissions on which it has taken action.[Footnote 37] FDA's performance had declined steadily in the years immediately before implementation of these goals--from 67.1 percent of the FY 2000 cohort to 34.5 percent of the FY 2006 cohort completed within 180 days. Figure 7: Percentage of Original PMAs FDA Reviewed within 180 Days for the Fiscal Year 2000-2010 Cohorts: [Refer to PDF for image: vertical bar graph] Fiscal year: 2000; Complete cohort where goal was not in place (shown for context): 67.1%. Fiscal year: 2001; Complete cohort where goal was not in place (shown for context): 56.8%. Fiscal year: 2002; Complete cohort where goal was not in place (shown for context): 52.9%. Fiscal year: 2003; Complete cohort where goal was not in place (shown for context): 50%. Fiscal year: 2004; Complete cohort where goal was not in place (shown for context): 40%. Fiscal year: 2005; Complete cohort where goal was not in place (shown for context): 32.1%. Fiscal year: 2006; Complete cohort where goal was not in place (shown for context): 34.5%. Fiscal year: 2007; Complete cohort where goal was in place: 56.8%; Tier 1 performance goal: 50%. Fiscal year: 2008; Complete cohort where goal was in place: 61.8%; Tier 1 performance goal: 60%. Fiscal year: 2009; Complete cohort where goal was in place: 80%; Tier 1 performance goal: 60%. Fiscal year: 2010; Incomplete cohort where goal was in place: 86.4%; Tier 1 performance goal: 60%. For FY 2007, the Tier 1 performance goal was to have 50 percent of original PMAs reviewed within 180 days. For FY 2008-2010, the Tier 1 performance goal was to have 60 percent of original PMAs reviewed within 180 days. Source: GAO analysis of FDA data. Notes: Tier 1 and Tier 2 designations refer to the length of time allotted (for the FYs 2008 through 2011 cohorts: 180 days and 295 days, respectively) for FDA to complete its review of original PMA submissions. FDA did not designate PMA performance goals as either Tier 1 or Tier 2 prior to FY 2008. Prior to FY 2006 there were no goals for original PMA submissions. For FY 2006 there was only one goal: 320 days. For FY 2007, the goals for original PMA submissions were 180 days and 320 days. We have aligned the performance goals in place prior to FY 2008 with the Tier 1 and Tier 2 goals for FYs 2008 through 2011 based on sharing similar time frames. This placement illustrates the increase in the goal percentage over time. If FDA completed its review of a submission in 180 days or less, it met the time frames for both the Tier 1 and Tier 2 goals. If the review was completed in more than 180 days but not more than 320 or 295 days (depending on the cohort), only the time frame for the Tier 2 goal was met. If the review took longer than 320 or 295 days, FDA did not meet the time frame for either goal.
We treated PMA submissions as meeting the time frame for a given performance goal if they were reviewed within the goal time plus any extension to the goal time that may have been made. The only reason the goal time can be extended is if the sponsor submits a major amendment to the submission on its own initiative (i.e., unsolicited by FDA). [A] This analysis includes only those original PMAs for which FDA or the sponsor had made a decision that would permanently stop the review clock for purposes of determining whether FDA met its performance goals (i.e., an approval, approvable, not approvable, withdrawal, or denial); this includes reviews by CBER through September 30, 2011, and reviews by CDRH through December 1, 2011. Submissions without such a decision are not included in the results for each cohort shown above. We considered a cohort to be incomplete if more than 10 percent of submissions had not yet received such a decision. For this analysis, the FY 2010 cohort was still incomplete. Specifically, for 18.5 percent of the FY 2010 original PMA cohort, a decision that would permanently stop the review clock had not been made at the time we received FDA's data. As a result, it was too soon to tell what the final results for this cohort would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The percentage of original PMAs reviewed within 180 days for this cohort may increase or decrease as those reviews are completed. [End of figure] FDA's performance in meeting the Tier 2 performance goals for original PMAs fell short of the goal for three of the four completed cohorts during the years that these goals were in place. FDA met the MDUFMA Tier 2 performance goal (320 days) for the FY 2006 original PMA cohort but not for the FY 2007 cohort, and did not meet the MDUFA Tier 2 performance goal (295 days) for either of the completed cohorts (FYs 2008 and 2009) to which the goal applied (see figure 8). While the FYs 2010 and 2011 original PMA cohorts are still incomplete, FDA is exceeding the MDUFA Tier 2 goals for those submissions on which it has taken action.[Footnote 38] FDA's performance varied for original PMA cohorts prior to the years that the Tier 2 goals were in place but was always below the current goal to have 90 percent reviewed within 295 days. Figure 8: Percentage of Original PMAs FDA Reviewed within 320 Days and 295 Days for the Fiscal Year 2000-2010 Cohorts: [Refer to PDF for image: 2 vertical bar graphs] Original PMAs reviewed within 320 days (MDUFMA Tier 2 goal): Fiscal year: 2000; Complete cohort where goal was not in place (shown for context): 90.0%. Fiscal year: 2001; Complete cohort where goal was not in place (shown for context): 90.5%. Fiscal year: 2002; Complete cohort where goal was not in place (shown for context): 78.4%. Fiscal year: 2003; Complete cohort where goal was not in place (shown for context): 94.0%. Fiscal year: 2004; Complete cohort where goal was not in place (shown for context): 95.6%. Fiscal year: 2005; Complete cohort where goal was not in place (shown for context): 89.3%. Fiscal year: 2006; Complete cohort where goal was in place: 81.8%; Tier 2 performance goal: 80%. Fiscal year: 2007; Complete cohort where goal was in place: 89.2%; Tier 2 performance goal: 90%. Fiscal year: 2008; Complete cohort where goal was not in place (shown for context): 79.4%. Fiscal year: 2009; Complete cohort where goal was not in place (shown for context): 87.5%.
Fiscal year: 2010; Incomplete cohort where goal was not in place (shown for context): 100%[A]. For FY 2006, the Tier 2 performance goal was to have 80 percent of original PMAs reviewed within 320 days. For FY 2007, the Tier 2 performance goal was to have 90 percent of original PMAs reviewed within 320 days. [End of first graph] Original PMAs reviewed within 295 days (MDUFA Tier 2 goal): Fiscal year: 2000; Complete cohort where goal was not in place (shown for context): 82.9%. Fiscal year: 2001; Complete cohort where goal was not in place (shown for context): 85.1%. Fiscal year: 2002; Complete cohort where goal was not in place (shown for context): 76.5%. Fiscal year: 2003; Complete cohort where goal was not in place (shown for context): 86.0%. Fiscal year: 2004; Complete cohort where goal was not in place (shown for context): 88.9%. Fiscal year: 2005; Complete cohort where goal was not in place (shown for context): 73.2%. Fiscal year: 2006; Complete cohort where goal was not in place (shown for context): 61.8%. Fiscal year: 2007; Complete cohort where goal was not in place (shown for context): 78.4%. Fiscal year: 2008; Complete cohort where goal was in place: 79.4%; Tier 2 performance goal: 90%. Fiscal year: 2009; Complete cohort where goal was in place: 87.0%; Tier 2 performance goal: 90%. Fiscal year: 2010; Incomplete cohort where goal was in place: 100%[A]; Tier 2 performance goal: 90%. For FY 2008-2010, the Tier 2 performance goal was to have 90 percent of original PMAs reviewed within 295 days. Source: GAO analysis of FDA data. Notes: Tier 1 and Tier 2 designations refer to the length of time allotted (for the FYs 2008-2011 cohorts: 180 days and 295 days, respectively) for FDA to complete its review of original PMA submissions. FDA did not designate PMA performance goals as either Tier 1 or Tier 2 prior to FY 2008. Prior to FY 2006 there were no goals for original PMA submissions. For FY 2006 there was only one goal: 320 days. For FY 2007, the goals for original PMA submissions were 180 days and 320 days. We have aligned the performance goals in place prior to FY 2008 with the Tier 1 and Tier 2 goals for FYs 2008 through 2011 based on sharing similar time frames. This placement illustrates the increase in the goal percentage over time. If FDA completed its review of a submission in 180 days or less, it met the time frames for both the Tier 1 and Tier 2 goals. If the review was completed in more than 180 days but not more than 320 or 295 days (depending on the cohort), only the time frame for the Tier 2 goal was met. If the review took longer than 320 or 295 days, FDA did not meet the time frame for either goal. We treated PMA submissions as meeting the time frame for a given performance goal if they were reviewed within the goal time plus any extension to the goal time that may have been made. The only reason the goal time can be extended is if the sponsor submits a major amendment to the submission on its own initiative (i.e., unsolicited by FDA). [A] These analyses include only those original PMAs for which FDA or the sponsor had made a decision that would permanently stop the review clock for purposes of determining whether FDA met its performance goals (i.e., an approval, approvable, not approvable, withdrawal, or denial); this includes reviews by CBER through September 30, 2011, and reviews by CDRH through December 1, 2011. Submissions without such a decision are not included in the results for each cohort shown above.
We considered a cohort to be incomplete if more than 10 percent of submissions had not yet received such a decision. For this analysis, the FY 2010 original PMA cohort was still incomplete. Specifically, for 18.5 percent of the FY 2010 original PMA cohort, a decision that would permanently stop the review clock had not been made at the time we received FDA's data. As a result, it was too soon to tell what the final results for this cohort would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The percentage of original PMAs reviewed within 320 and 295 days for this cohort may decrease as those reviews are completed. [End of figure] For expedited PMAs, FDA met the Tier 1 and Tier 2 performance goals for only two of the seven completed cohorts for which the goals were in effect. FDA met the Tier 1 (180-day) goal for only one of the two completed cohorts during the years the goal has been in place, meeting the goal for the FY 2009 cohort but missing it for the FY 2008 cohort (see figure 9). FDA's performance varied for cohorts prior to the years that the Tier 1 expedited PMA goals were in place but was below the current goal of 50 percent in all but 1 year. Figure 9: Percentage of Expedited PMAs FDA Reviewed within 180 Days for the Fiscal Year 2000-2010 Cohorts: [Refer to PDF for image: vertical bar graph] Fiscal year: 2000; Complete cohort where goal was not in place (shown for context): 18.2%. Fiscal year: 2001; Complete cohort where goal was not in place (shown for context): 30%. Fiscal year: 2002; Complete cohort where goal was not in place (shown for context): 25%. Fiscal year: 2003; Complete cohort where goal was not in place (shown for context): 50%. Fiscal year: 2004; Complete cohort where goal was not in place (shown for context): 47.1%. Fiscal year: 2005; Complete cohort where goal was not in place (shown for context): 16.7%. Fiscal year: 2006; Complete cohort where goal was not in place (shown for context): 0%. Fiscal year: 2007; Complete cohort where goal was not in place (shown for context): 0%. Fiscal year: 2008; Complete cohort where goal was in place: 25%; Tier 1 performance goal: 50%. Fiscal year: 2009; Complete cohort where goal was in place: 50%; Tier 1 performance goal: 50%. Fiscal year: 2010; Incomplete cohort where goal was in place: 40%[A]; Tier 1 performance goal: 50%. For FY 2008-2010, the Tier 1 performance goal was to have 50 percent of expedited PMAs reviewed within 180 days. Source: GAO analysis of FDA data. Notes: Tier 1 and Tier 2 designations refer to the length of time allotted (for the FYs 2008 through 2011 cohorts: 180 days and 280 days, respectively) for FDA to complete its review of expedited PMA submissions. FDA did not designate PMA performance goals as either Tier 1 or Tier 2 prior to FY 2008. For FYs 2005 through 2007, there was only one goal for expedited PMAs: 300 days. Prior to FY 2005 there were no goals for expedited PMA submissions. We have aligned the performance goal in place prior to FY 2008 with the Tier 2 goal for FYs 2008 through 2011 based on sharing a similar time frame. This placement illustrates the increase in the goal percentage over time. If FDA completed its review of a submission in 180 days or less, it met the time frames for both the Tier 1 and Tier 2 goals. If the review was completed in more than 180 days but not more than 300 or 280 days (depending on the cohort), only the time frame for the Tier 2 goal was met.
If the review took longer than 300 or 280 days, FDA did not meet the time frame for either goal. We treated PMA submissions as meeting the time frame for a given performance goal if they were reviewed within the goal time plus any extension to the goal time that may have been made. The only reason the goal time can be extended is if the sponsor submits a major amendment to the submission on its own initiative (i.e., unsolicited by FDA). [A] This analysis includes only those expedited PMAs for which FDA or the sponsor had made a decision that would permanently stop the review clock for purposes of determining whether FDA met its performance goals (i.e., an approval, approvable, not approvable, withdrawal, or denial); this includes reviews by CBER through September 30, 2011, and reviews by CDRH through December 1, 2011. Submissions without such a decision are not included in the results for each cohort shown above. We considered a cohort to be incomplete if more than 10 percent of submissions had not yet received such a decision. For this analysis, the FY 2010 expedited PMA cohort was still incomplete. Specifically, for 16.7 percent of the FY 2010 expedited PMA cohort, a decision that would permanently stop the review clock had not been made at the time we received FDA's data. As a result, it was too soon to tell what the final results for this cohort would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The percentage of expedited PMAs reviewed within 180 days for this cohort may increase or decrease as those reviews are completed. [End of figure] FDA's performance in meeting the Tier 2 performance goals for expedited PMAs fell short of the goal for four of the five completed cohorts during the years that these goals were in place. FDA met the MDUFMA Tier 2 performance goal (300 days) for the FY 2005 cohort but not for the FY 2006 or 2007 cohorts, and did not meet the MDUFA Tier 2 performance goal (280 days) for either of the completed cohorts (FY 2008 and 2009) to which the goal applied (see figure 10). FDA's performance varied for expedited PMA cohorts prior to the years that the Tier 2 goals were in place but always fell below the current goal to have 90 percent reviewed within 280 days. Figure 10: Percentage of Expedited PMAs FDA Reviewed within 300 Days and 280 Days for the Fiscal Year 2000-2010 Cohorts: [Refer to PDF for image: 2 vertical bar graphs] Expedited PMAs reviewed within 300 days (MDUFMA Tier 2 goal): Fiscal year: 2000; Complete cohort where goal was not in place (shown for context): 72.7%. Fiscal year: 2001; Complete cohort where goal was not in place (shown for context): 60.0%. Fiscal year: 2002; Complete cohort where goal was not in place (shown for context): 91.7%. Fiscal year: 2003; Complete cohort where goal was not in place (shown for context): 75.0%. Fiscal year: 2004; Complete cohort where goal was not in place (shown for context): 82.4%. Fiscal year: 2005; Complete cohort where goal was in place: 83.3%; Tier 2 performance goal: 70%. Fiscal year: 2006; Complete cohort where goal was in place: 66.7%; Tier 2 performance goal: 80%. Fiscal year: 2007; Complete cohort where goal was in place: 0.0%; Tier 2 performance goal: 90%. Fiscal year: 2008; Complete cohort where goal was not in place (shown for context): 50.0%. Fiscal year: 2009; Complete cohort where goal was not in place (shown for context): 75.0%.
Fiscal year: 2010; Incomplete cohort where goal was not in place (shown for context): 100%[A]. For FY 2005, the Tier 2 performance goal was to have 70 percent of expedited PMAs reviewed within 300 days. For FY 2006, the Tier 2 performance goal was to have 80 percent of expedited PMAs reviewed within 300 days. For FY 2007, the Tier 2 performance goal was to have 90 percent of expedited PMAs reviewed within 300 days. [End of first graph] Expedited PMAs reviewed within 280 days (MDUFA Tier 2 goal): Fiscal year: 2000; Complete cohort where goal was not in place (shown for context): 63.6%. Fiscal year: 2001; Complete cohort where goal was not in place (shown for context): 60.0%. Fiscal year: 2002; Complete cohort where goal was not in place (shown for context): 83.3%. Fiscal year: 2003; Complete cohort where goal was not in place (shown for context): 75.0%. Fiscal year: 2004; Complete cohort where goal was not in place (shown for context): 82.4%. Fiscal year: 2005; Complete cohort where goal was not in place (shown for context): 66.7%. Fiscal year: 2006; Complete cohort where goal was not in place (shown for context): 33.3%. Fiscal year: 2007; Complete cohort where goal was not in place (shown for context): 0.0%. Fiscal year: 2008; Complete cohort where goal was in place: 50.0%; Tier 2 performance goal: 90%. Fiscal year: 2009; Complete cohort where goal was in place: 75.0%; Tier 2 performance goal: 90%. Fiscal year: 2010; Incomplete cohort where goal was in place: 100%[A]; Tier 2 performance goal: 90%. For FY 2008-2010, the Tier 2 performance goal was to have 90 percent of expedited PMAs reviewed within 280 days. Source: GAO analysis of FDA data. Notes: Tier 1 and Tier 2 designations refer to the length of time allotted (for the FYs 2008 through 2011 cohorts: 180 days and 280 days, respectively) for FDA to complete its review of expedited PMA submissions. FDA did not designate PMA performance goals as either Tier 1 or Tier 2 prior to FY 2008. For FYs 2005 through 2007, there was only one goal for expedited PMAs: 300 days. Prior to FY 2005 there were no goals for expedited PMA submissions. We have aligned the performance goal in place prior to FY 2008 with the Tier 2 goal for FYs 2008 through 2011 based on sharing a similar time frame. This placement illustrates the increase in the goal percentage over time. If FDA completed its review of a submission in 180 days or less, it met the time frames for both the Tier 1 and Tier 2 goals. If the review was completed in more than 180 days but not more than 300 or 280 days (depending on the cohort), only the time frame for the Tier 2 goal was met. If the review took longer than 300 or 280 days, FDA did not meet the time frame for either goal. We treated PMA submissions as meeting the time frame for a given performance goal if they were reviewed within the goal time plus any extension to the goal time that may have been made. The only reason the goal time can be extended is if the sponsor submits a major amendment to the submission on its own initiative (i.e., unsolicited by FDA). [A] These analyses include only those expedited PMAs for which FDA or the sponsor had made a decision that would permanently stop the review clock for purposes of determining whether FDA met its performance goals (i.e., an approval, approvable, not approvable, withdrawal, or denial); this includes reviews by CBER through September 30, 2011, and reviews by CDRH through December 1, 2011. Submissions without such a decision are not included in the results for each cohort shown above.
We considered a cohort to be incomplete if more than 10 percent of submissions had not yet received such a decision. For this analysis, the FY 2010 expedited PMA cohort was still incomplete. Specifically, for 16.7 percent of the FY 2010 expedited PMA cohort, a decision that would permanently stop the review clock had not been made at the time we received FDA's data. As a result, it was too soon to tell what the final results for this cohort would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The percentage of expedited PMAs reviewed within 300 and 280 days for this cohort may decrease as those reviews are completed. [End of figure] FDA Review Time and Time to Final Decision Generally Increased for PMAs from 2003 to 2010: FDA review time for both original and expedited PMAs was highly variable but generally increased across our analysis period, while time to final decision also increased for original PMAs. Specifically, average FDA review time for original PMAs increased from 211 days in the FY 2003 cohort (the first year that user fees were in effect) to 264 days in the FY 2008 cohort, then fell in the FY 2009 cohort to 217 days (see figure 11). When we added off-the-clock time (during which FDA waited for the sponsor to provide additional information or correct deficiencies in the submission), average time to final decision for the FYs 2003 through 2008 cohorts fluctuated from year to year but trended upward from 462 days for the FY 2003 cohort to 627 days for the FY 2008 cohort.[Footnote 39] Figure 11: Average FDA Review Time and Average Time to Final Decision for Original PMAs in the Fiscal Year 2000-2010 Cohorts: [Refer to PDF for image: multiple line and plotted point graph] Fiscal year: 2000; Average time to final decision: 405 days; Average FDA review time: 201 days. Fiscal year: 2001; Average time to final decision: 593 days; Average FDA review time: 223 days. Fiscal year: 2002; Average time to final decision: 488 days; Average FDA review time: 246 days. Fiscal year: 2003; Average time to final decision: 462 days; Average FDA review time: 211 days. Fiscal year: 2004; Average time to final decision: 395 days; Average FDA review time: 225 days. Fiscal year: 2005; Average time to final decision: 580 days; Average FDA review time: 252 days. Fiscal year: 2006; Average time to final decision: 476 days; Average FDA review time: 300 days. Fiscal year: 2007; Average time to final decision: 564 days; Average FDA review time: 232 days. Fiscal year: 2008; Average time to final decision: 627 days; Average FDA review time: 264 days. Fiscal year: 2009; Average FDA review time: 217 days. Three plotted points: [A], [A], [B]. Source: GAO analysis of FDA data. Note: FDA review time refers to the time that FDA spends reviewing a submission and therefore excludes any time the sponsor may spend responding to FDA requests for additional information. Time to final decision includes both the time FDA spends reviewing a submission and the time the sponsor may spend responding to any FDA action. [A] The analysis of time to final decision includes only those original PMAs for which FDA or the sponsor had made a final decision (i.e., a decision that ends the review process such as an approval, denial, or withdrawal); this includes reviews by CBER through September 30, 2011, and reviews by CDRH through December 1, 2011. Submissions without a final decision are not included in the results for each cohort shown above. 
We considered a cohort to be incomplete if more than 10 percent of submissions had not yet received a final decision. For this analysis, the FYs 2009-2010 original PMA cohorts were still incomplete. Specifically, 22 percent of the FY 2009 original PMA cohort and 46.3 percent of the FY 2010 cohort had not yet received a final decision. As a result, it was too soon to tell what the final results for these cohorts would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The average time to final decision for these two cohorts may increase or decrease as those reviews are completed. [B] The analysis of FDA review time includes only those original PMAs for which FDA or the sponsor had made a decision that would permanently stop the review clock for purposes of determining whether FDA met its performance goals (i.e., an approval, approvable, not approvable, withdrawal, or denial); this includes reviews by CBER through September 30, 2011, and reviews by CDRH through December 1, 2011. Submissions without such a decision are not included in the results for each cohort shown above. We considered a cohort to be incomplete if more than 10 percent of submissions had not yet received such a decision. For this analysis, the FY 2010 original PMA cohort was still incomplete. Specifically, for 18.5 percent of the FY 2010 original PMA cohort, a decision that would permanently stop the review clock had not been made at the time we received FDA's data. As a result, it was too soon to tell what the final results for this cohort would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The average FDA review time for this cohort may increase or decrease as those reviews are completed. [End of figure] The results for expedited PMAs fluctuated even more dramatically than for original PMAs--likely due to the small number of submissions (about 7 per year on average). Average FDA review time for expedited PMAs generally increased over the period that user fees have been in effect, from 241 days for the FY 2003 cohort to 356 days for the FY 2008 cohort, then fell to 245 days for the FY 2009 cohort (see figure 12). The average time to final decision for expedited PMAs was highly variable but overall declined somewhat during this period, from 704 days for the FY 2003 cohort to 545 days for the FY 2009 cohort. Figure 12: Average FDA Review Time and Average Time to Final Decision for Expedited PMAs in the Fiscal Year 2000-2010 Cohorts: [Refer to PDF for image: multiple line and plotted point graph] Fiscal year: 2000; Average time to final decision: 475 days; Average FDA review time: 275 days. Fiscal year: 2001; Average time to final decision: 382 days; Average FDA review time: 289 days. Fiscal year: 2002; Average time to final decision: 459 days; Average FDA review time: 224 days. Fiscal year: 2003; Average time to final decision: 704 days; Average FDA review time: 241 days. Fiscal year: 2004; Average time to final decision: 495 days; Average FDA review time: 250 days. Fiscal year: 2005; Average time to final decision: 1,044 days; Average FDA review time: 303 days. Fiscal year: 2006; Average time to final decision: 652 days; Average FDA review time: 417 days. Fiscal year: 2007; Average time to final decision: 1,111 days; Average FDA review time: 344 days. Fiscal year: 2008; Average time to final decision: 755 days; Average FDA review time: 356 days.
Fiscal year: 2009; Average time to final decision: 545 days; Average FDA review time: 245 days. Two plotted points: [A], [B]. Source: GAO analysis of FDA data. Note: FDA review time refers to the time that FDA spends reviewing a submission and therefore excludes any time the sponsor may spend responding to FDA requests for additional information. Time to final decision includes both the time FDA spends reviewing a submission and the time the sponsor may spend responding to any FDA action. [A] The analysis of time to final decision includes only those expedited PMAs for which FDA or the sponsor had made a final decision (i.e., a decision that ends the review process such as an approval, denial, or withdrawal); this includes reviews by CBER through September 30, 2011, and reviews by CDRH through December 1, 2011. Submissions without a final decision are not included in the results for each cohort shown above. We considered a cohort to be incomplete if more than 10 percent of submissions had not yet received a final decision. For this analysis, the FY 2010 expedited PMA cohort was still incomplete. Specifically, 33 percent of the FY 2010 expedited PMA cohort had not yet received a final decision. As a result, it was too soon to tell what the final results for this cohort would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The average time to final decision for this cohort may increase or decrease as those reviews are completed. [B] The analysis of FDA review time includes only those expedited PMAs for which FDA or the sponsor had made a decision that would permanently stop the review clock for purposes of determining whether FDA met its performance goals (i.e., an approval, approvable, not approvable, withdrawal, or denial); this includes reviews by CBER through September 30, 2011, and reviews by CDRH through December 1, 2011. Submissions without such a decision are not included in the results for each cohort shown above. We considered a cohort to be incomplete if more than 10 percent of submissions had not yet received such a decision. For this analysis, the FY 2010 expedited PMA cohort was still incomplete. Specifically, for 16.7 percent of the FY 2010 expedited PMA cohort, a decision that would permanently stop the review clock had not been made at the time we received FDA's data. As a result, it was too soon to tell what the final results for this cohort would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The average FDA review time for this cohort may increase or decrease as those reviews are completed. [End of figure] The Average Number of Review Cycles Increased for Certain PMAs While the Percentage of PMAs Approved after One Review Cycle Generally Decreased: The average number of review cycles per original PMA increased 27.5 percent from 1.82 in the FY 2003 cohort (the first year that user fees were in effect) to 2.32 cycles in the FY 2008 cohort. For expedited PMAs, the average number of review cycles per submission was fairly steady at approximately 2.5 cycles until the FY 2004 cohort, then peaked at 4.0 in the FY 2006 cohort before decreasing back to 2.5 cycles in the FY 2009 cohort. We found nearly identical trends when we examined the subsets of original and expedited PMAs that received a final FDA decision of approval. 
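To make the percent-change arithmetic behind these review-cycle figures concrete, the short Python sketch below reproduces the calculation from the two cohort averages cited above. It is an illustrative sketch only, not GAO's analysis code, and the variable names are ours.

# Illustrative sketch only (not GAO's analysis code); the two averages
# are the cohort values for original PMAs cited in the text above.
avg_cycles_fy2003 = 1.82  # FY 2003 cohort, first year user fees were in effect
avg_cycles_fy2008 = 2.32  # FY 2008 cohort
percent_increase = (avg_cycles_fy2008 - avg_cycles_fy2003) / avg_cycles_fy2003 * 100
print(f"{percent_increase:.1f}%")  # prints 27.5%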
In addition, the percentage of original PMAs receiving a decision of approval at the end of the first review cycle fluctuated from FYs 2003 through 2009 but generally decreased--from 16 percent in the FY 2003 cohort to 9.8 percent in the FY 2009 cohort. Similarly, the percentage receiving a first-cycle approvable decision decreased from 12 percent in the FY 2003 cohort to 2.4 percent in the FY 2009 cohort. The percentage of expedited PMAs receiving first-cycle approval fluctuated from year to year, from 0 percent in 5 of the years we examined to a maximum of 25 percent in FY 2008. The percentage of original PMAs that ultimately received approval from FDA fluctuated from year to year but exhibited an overall decrease for the completed cohorts from FYs 2003 through 2008. Specifically, 74.0 percent of original PMAs in the FY 2003 cohort were ultimately approved, compared to 68.8 percent of the FY 2008 cohort. The percentage of expedited PMAs that were ultimately approved varied significantly from FYs 2003 through 2009, from a low of 0 percent in the FY 2007 cohort to a high of 100 percent in the FY 2006 cohort. Stakeholders Noted Issues with the Medical Device Review Process; FDA Is Taking Steps That May Address Many of These Issues: The industry groups and consumer advocacy groups we interviewed noted a number of issues related to FDA's review of medical device submissions. The issue most commonly raised by industry and consumer advocacy stakeholder groups was insufficient communication between FDA and stakeholders throughout the review process. Industry stakeholders also noted a lack of predictability and consistency in reviews and an increase in time to final decision. Consumer advocacy group stakeholders noted issues related to inadequate assurance of the safety and effectiveness of approved or cleared devices. FDA is taking steps that may address many of these issues. Stakeholders Cite Insufficient Communication between FDA and Stakeholders throughout the Review Process: Most of the three industry and four consumer advocacy group stakeholders that we interviewed told us that there is insufficient communication between FDA and stakeholders throughout the review process. For example, four stakeholders noted that FDA does not clearly communicate to stakeholders the regulatory standards that it uses to evaluate submissions. In particular, industry stakeholders noted problems with the regulatory guidance documents issued by FDA, describing these documents as often unclear, out of date, and not comprehensive. Stakeholders also noted that after sponsors submit their applications to FDA, insufficient communication from FDA prevents sponsors from learning about deficiencies in their submissions early in FDA's review. According to one of these stakeholders, if FDA communicated these deficiencies earlier in the process, sponsors would be able to correct them and would be less likely to receive a request for additional information. Two consumer advocacy group stakeholders also noted that FDA does not sufficiently seek patient input during reviews. One stakeholder noted that it is important for FDA to incorporate patient perspectives into its reviews of medical devices because patients might weigh the benefits and risks of a certain device differently than FDA reviewers.
FDA has taken or plans to take several steps that may address issues with the frequency and quality of its communications with stakeholders, including issuing new guidance documents, improving the guidance development process, and enhancing interactions between FDA and stakeholders during reviews. For example, in December 2011, FDA released draft guidance about the regulatory framework, policies, and practices underlying FDA's 510(k) review in order to enhance the transparency of this program.[Footnote 40] In addition, FDA implemented a tracking system and released a standard operating procedure (SOP) for developing guidance documents for medical device reviews to provide greater clarity, predictability, and efficiency in this process.[Footnote 41] FDA also created a new staff position to oversee the guidance development process. Additionally, according to an overview of recent FDA actions to improve its device review programs, FDA is currently enhancing its interactive review process for medical device reviews by establishing performance goals for early and substantive interactions between FDA and sponsors during reviews. [Footnote 42] This overview also notes that FDA is currently working with a coalition of patient advocacy groups on establishing mechanisms for obtaining reliable information on patient perspectives during medical device reviews.[Footnote 43] Industry Stakeholders Report a Lack of Predictability and Consistency in Reviews: The three industry stakeholders that we interviewed also told us that there is a lack of predictability and consistency in FDA's reviews of device submissions. For example, two stakeholders noted that review criteria sometimes change after a sponsor submits an application. In particular, one of these stakeholders noted that criteria sometimes change when the FDA reviewer assigned to the submission changes during the review. Additionally, stakeholders noted that there is sometimes inconsistent application of criteria across review divisions or across individual reviewers. Stakeholders noted that enhanced training for reviewers and enhanced supervisory oversight could help resolve inconsistencies in reviews and increase predictability for sponsors. In the two internal assessments of its device review programs that FDA released in August 2010, the agency found that insufficient predictability in its review programs was a significant problem.[Footnote 44] FDA has taken steps that may address issues with the predictability and consistency of its reviews of device submissions, including issuing new SOPs for reviews and enhancing training for FDA staff. 
For example, in June 2011, FDA issued an SOP to standardize the practice of quickly issuing written notices to sponsors to inform them about changes in FDA's regulatory expectations for medical device submissions.[Footnote 45] FDA also recently developed an SOP to assure greater consistency in the review of device submissions when review staff change during the review.[Footnote 46] Additionally, in April 2010, FDA began a reviewer certification program for new FDA reviewers designed to improve the consistency of reviews.[Footnote 47] According to the overview of recent FDA actions to improve its device review programs, FDA also plans to implement an experiential learning program for new reviewers to give them a better understanding of how medical devices are designed, manufactured, and used.[Footnote 48] Industry Stakeholders Note an Increase in Time to Final Decision: The three industry stakeholders we interviewed told us that the time to final decision for device submissions has increased in recent years. This is consistent with our analysis, which showed that the average time to final decision has increased for completed 510(k) and original PMA cohorts since FY 2003. Additionally, stakeholders noted that FDA has increased the number of requests for additional information, which our analysis also shows. Stakeholders told us they believe the additional information being requested is not always critical for the review of the submission. Additional information requests increase the time to final decision but not necessarily the FDA review time because FDA stops the review clock when it requests additional information from sponsors. Two of the stakeholders stated that reviewers may be requesting additional information more often due to a culture of increased risk aversion at FDA or because they want to stop the review clock in order to meet performance goals. According to FDA, the most significant contributor to the increased number of requests for additional information--and therefore increased time to final decision--is the poor quality of submissions received from sponsors. In July 2011, FDA released an analysis it conducted of review times under the 510(k) program.[Footnote 49] According to FDA, in over 80 percent of the reviews studied for this analysis, reviewers asked for additional information from sponsors due to problems with the quality of the submission.[Footnote 50] FDA officials told us that sending a request for additional information is often the only option for reviewers besides issuing a negative decision to the sponsor. FDA's analysis also found that 8 percent of its requests for additional information during the first review cycle were inappropriate. Requests for additional information were deemed inappropriate if FDA requested additional information or data for a 510(k) that (1) were not justified, (2) were not permissible as a matter of federal law or FDA policy, or (3) were unnecessary to make a substantial equivalence determination. FDA has taken steps that may address issues with the number of inappropriate requests for additional information. For example, the overview of recent FDA actions indicates the agency is developing an SOP for requests for additional information that clarifies when these requests can be made for 510(k)s, the types of requests that can be made, and the management level at which the decision must be made. 
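Because the distinction between FDA review time and time to final decision recurs throughout this report, the following minimal Python sketch illustrates how stopping the review clock during requests for additional information separates the two measures. The cycle durations are hypothetical values chosen for illustration, not FDA data, and the structure is our simplification, not FDA's tracking system.

# Hypothetical illustration, not FDA's tracking system. Each review cycle
# is modeled as (fda_days, sponsor_days): days FDA spent reviewing ("on
# the clock") and days spent waiting for the sponsor's response ("off the
# clock"). The final cycle ends with a decision, so it has no waiting time.
cycles = [(85, 120), (60, 90), (40, 0)]

# FDA review time counts only on-the-clock days; this is the measure
# compared against the performance goals.
fda_review_time = sum(fda_days for fda_days, _ in cycles)

# Time to final decision adds the off-the-clock waiting time; this is the
# elapsed time a sponsor actually experiences.
time_to_final_decision = sum(fda_days + sponsor_days
                             for fda_days, sponsor_days in cycles)

print(fda_review_time)         # 185 days
print(time_to_final_decision)  # 395 days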
Consumer Advocacy Group Stakeholders Suggest That FDA Provides Inadequate Assurance of the Safety and Effectiveness of Approved or Cleared Devices: Three of the four consumer advocacy group stakeholders with whom we spoke stated that FDA is not adequately ensuring the safety and effectiveness of the devices it approves or clears for marketing. One of these stakeholders told us that FDA prioritizes review speed over safety and effectiveness. Two stakeholders also noted that the standards FDA uses to approve or clear devices are lower than the standards that FDA uses to approve drugs, particularly for the 510(k) program. Two stakeholders also expressed concern that devices reviewed under the 510(k) program are not always sufficiently similar to their predicates and that devices whose predicates are recalled due to safety concerns do not have to be reassessed to ensure that they are safe. Finally, three stakeholders told us that FDA does not gather enough data on long-term device safety and effectiveness through methods such as postmarket analysis and device tracking. These issues are similar to those raised elsewhere, such as a public meeting to discuss the reauthorization of the medical device user fee program, a congressional hearing, and an Institute of Medicine (IOM) report. For example, during a September 14, 2010, public meeting to discuss the reauthorization, consumer advocacy groups--including two of those we interviewed for our report--urged the inclusion of safety and effectiveness improvements in the reauthorization, including raising premarket review standards for devices and increasing postmarket surveillance.[Footnote 51] Additionally, during an April 13, 2011, congressional hearing, another consumer advocacy group expressed concerns about FDA's 510(k) review process and recalls of high-risk devices that were cleared through this process.[Footnote 52] Finally, in July 2011, IOM released a report summarizing the results of an independent evaluation of the 510(k) program. FDA had requested that IOM conduct this evaluation to determine whether the 510(k) program optimally protects patients and promotes innovation. IOM concluded that clearance of a 510(k) based on substantial equivalence to a predicate device is not a determination that the cleared device is safe or effective.[Footnote 53] FDA has taken or plans to take steps that may address issues with the safety and effectiveness of approved and cleared devices, including evaluating the 510(k) program and developing new data systems. For example, FDA analyzed the safety of 510(k) devices cleared on the basis of multiple predicates by investigating an apparent association between these devices and increased reports of adverse events.[Footnote 54] FDA concluded that no clear relationship exists. FDA also conducted a public meeting to discuss the recommendations proposed in the IOM report in September 2011.[Footnote 55] FDA is also developing a device identification system that will allow FDA to better track devices that are distributed to patients, as well as an electronic reporting system that will assist with tracking and analyzing adverse events in marketed devices.[Footnote 56] Concluding Observations: While FDA has met most of the goals for the time frames within which the agency was to review and take action on 510(k) and PMA device submissions, the time that elapses before a final decision has been increasing. This is particularly true for 510(k) submissions, which comprise the bulk of FDA device reviews. 
Stakeholders we spoke with point to a number of issues that the agency could consider in addressing the cause of these time increases. FDA tracks and reports the time to final decision in its annual reports to Congress on the medical device user fee program, and its own reports reveal the same pattern we found. In its July 2011 analysis of 510(k) submissions, FDA concluded that reviewers asked for additional information from sponsors--thus stopping the clock on FDA's review time while the total time to reach a final decision continued to elapse--mainly due to problems with the quality of the submission. FDA is taking steps that may address the increasing time to final decision. It is important for the agency to monitor the impact of those steps in ensuring that safe and effective medical devices are reaching the market in a timely manner. Agency Comments: HHS reviewed a draft of this report and provided written comments, which are reprinted in appendix III. HHS generally agreed with our findings and noted that FDA has identified some of the same performance trends in its annual reports to Congress. HHS noted that because the total time to final decision includes the time industry incurs in responding to FDA's concerns, FDA and industry bear shared responsibility for the increase in this time and will need to work together to achieve improvement. HHS also noted that in January 2011, FDA announced 25 specific actions that the agency would take to improve the predictability, consistency, and transparency of its premarket medical device review programs. Since then, HHS stated, FDA has taken or is taking actions designed to create a culture change toward greater transparency, interaction, collaboration, and the appropriate balancing of benefits and risk; ensure predictable and consistent recommendations, decision making, and application of the least burdensome principle; and implement efficient processes and use of resources. HHS also provided technical comments, which we incorporated as appropriate. As agreed with your office, unless you publicly announce the contents of this report earlier, we plan no further distribution until 30 days from the report date. At that time, we will send copies of this report to the Secretary of Health and Human Services, the Commissioner of FDA, and other interested parties. In addition, the report will be available at no charge on the GAO website at [hyperlink, http://www.gao.gov]. If you or your staff have any questions about this report, please contact me at (202) 512-7114 or crossem@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made key contributions to this report are listed in appendix IV. Signed by: Marcia Crosse: Director, Health Care: [End of section] Appendix I: FDA Medical Device Review Performance for Fiscal Years (FY) 2000-2011: Table 4: FDA Premarket Notification (510(k)) Review Performance, FYs 2000-2011: Fiscal year cohorts: Total number of 510(k) submissions; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 4,242; 2001: 4,294; 2002: 4,365; Period covered by MDUFMA: 2003: 4,292; 2004: 3,711; 2005: 3,713; 2006: 3,914; 2007: 3,714; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 3,901; 2009: 4,153; 2010: 3,938; 2011[A]: 3,878. 
Fiscal year cohorts: Number of 510(k) submissions with a final FDA decision; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 4,242; 2001: 4,294; 2002: 4,365; Period covered by MDUFMA: 2003: 4,292; 2004: 3,711; 2005: 3,713; 2006: 3,914; 2007: 3,713; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 3,899; 2009: 4,148; 2010: 3,853; 2011[A]: 2,366. Fiscal year cohorts: Number reviewed in less than or equal to 90 days[B]; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 3,391; 2001: 3,279; 2002: 3,411; Period covered by MDUFMA: 2003: 3,299; 2004: 3,121; 2005: 3,381; 2006: 3,569; 2007: 3,364; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 3,651; 2009: 3,737; 2010: 3,530; 2011[A]: 2,300. Fiscal year cohorts: Percentage reviewed in less than or equal to 90 days[B]; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 79.9; 2001: 76.4; 2002: 78.1; Period covered by MDUFMA: 2003: 76.9; 2004: 84.1; 2005: 91.1; 2006: 91.2; 2007: 90.6; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 93.6; 2009: 90.1; 2010: 91.6; 2011[A]: 97.2. Fiscal year cohorts: Tier 1 goal percentage[C]; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: [Empty]; 2001: [Empty]; 2002: [Empty]; Period covered by MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005: 75; 2006: 75; 2007: 80; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 90; 2009: 90; 2010: 90; 2011[A]: 90. Fiscal year cohorts: Met Tier 1 goal[D]; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: n/a; 2001: n/a; 2002: n/a; Period covered by MDUFMA: 2003: n/a; 2004: n/a; 2005: Yes; 2006: Yes; 2007: Yes; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: Yes; 2009: Yes; 2010: Yes[E]; 2011[A]: Yes[E]. Fiscal year cohorts: Number reviewed in less than or equal to 150 days[B]; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 3,796; 2001: 3,773; 2002: 3,863; Period covered by MDUFMA: 2003: 3,773; 2004: 3,535; 2005: 3,623; 2006: 3,799; 2007: 3,591; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 3,835; 2009: 4,049; 2010: 3,795; 2011[A]: 2,361. Fiscal year cohorts: Percentage reviewed in less than or equal to 150 days[B]; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 89.5; 2001: 87.9; 2002: 88.5; Period covered by MDUFMA: 2003: 87.9; 2004: 95.3; 2005: 97.6; 2006: 97.1; 2007: 96.7; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 98.4; 2009: 97.6; 2010: 98.5; 2011[A]: 99.8. Fiscal year cohorts: Tier 2 goal percentage[C]; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: [Empty]; 2001: [Empty]; 2002: [Empty]; Period covered by MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005: [Empty]; 2006: [Empty]; 2007: [Empty]; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 98; 2009: 98; 2010: 98; 2011[A]: 98.
Fiscal year cohorts: Met Tier 2 goal[D]; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: n/a; 2001: n/a; 2002: n/a; Period covered by MDUFMA: 2003: n/a; 2004: n/a; 2005: n/a; 2006: n/a; 2007: n/a; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: Yes; 2009: Yes; 2010: Yes[E]; 2011[A]: Yes[E]. Fiscal year cohorts: Average number of review cycles per submission[F]; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 1.38; 2001: 1.43; 2002: 1.39; Period covered by MDUFMA: 2003: 1.47; 2004: 1.53; 2005: 1.61; 2006: 1.69; 2007: 1.78; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 1.83; 2009: 1.94; 2010: 2.04; 2011[A]: 1.64. Fiscal year cohorts: Of first-cycle decisions, percentage that were substantially equivalent (SE); Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 54.1; 2001: 53.8; 2002: 55.7; Period covered by MDUFMA: 2003: 54.0; 2004: 52.1; 2005: 47.7; 2006: 42.2; 2007: 36.6; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 33.3; 2009: 26.2; 2010: 20.0; 2011[A]: 21.2. Fiscal year cohorts: Among 510(k) submissions with a final decision, percentage of final decisions that were: Substantially equivalent (SE); Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 81.4; 2001: 83.5; 2002: 83.0; Period covered by MDUFMA: 2003: 85.6; 2004: 87.6; 2005: 87.9; 2006: 85.7; 2007: 83.5; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 81.0; 2009: 77.3; 2010: 75.1; 2011[A]: 87.2. Fiscal year cohorts: Among 510(k) submissions with a final decision, percentage of final decisions that were: Not substantially equivalent (NSE); Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 1.1; 2001: 1.9; 2002: 1.9; Period covered by MDUFMA: 2003: 2.9; 2004: 3.5; 2005: 3.7; 2006: 3.7; 2007: 3.7; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 3.8; 2009: 5.5; 2010: 6.4; 2011[A]: 2.4. Fiscal year cohorts: Among 510(k) submissions with a final decision, percentage of final decisions that were: Withdrawn; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 0.1; 2001: 0.1; 2002: 0.2; Period covered by MDUFMA: 2003: 0.3; 2004: 0.1; 2005: 0.2; 2006: 0.1; 2007: 0.2; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 0.1; 2009: 0.2; 2010: 0.3; 2011[A]: 0.1. Fiscal year cohorts: Among 510(k) submissions with a final decision, percentage of final decisions that were: Other; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 17.4; 2001: 14.5; 2002: 14.9; Period covered by MDUFMA: 2003: 11.3; 2004: 8.7; 2005: 8.2; 2006: 10.4; 2007: 12.7; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 15.1; 2009: 16.9; 2010: 18.3; 2011[A]: 10.4. Fiscal year cohorts: Average FDA review time (in days) for 510(k) submissions that were not reviewed within 150 days; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 195; 2001: 201; 2002: 201; Period covered by MDUFMA: 2003: 197; 2004: 195; 2005: 186; 2006: 221; 2007: 228; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 259; 2009: 282; 2010: 226; 2011[A]: 176. 
Fiscal year cohorts: Average time to final decision (in days) for 510(k) submissions that were not reviewed within 150 days; Prior to implementation of the Medical Device User Fee and Modernization Act of 2002 (MDUFMA): 2000: 295; 2001: 302; 2002: 300; Period covered by MDUFMA: 2003: 306; 2004: 325; 2005: 374; 2006: 426; 2007: 429; Period covered by the Medical Device User Fee Amendments of 2007 (MDUFA): 2008: 457; 2009: 460; 2010: 377; 2011[A]: 269. Source: GAO analysis of Food and Drug Administration (FDA) data. Note: A review cohort includes all the medical device submissions relating to a particular performance goal that were submitted in a given fiscal year. For example, all 510(k)s received by FDA from October 1, 2010, to September 30, 2011, make up the 510(k) review cohort for FY 2011. Cohorts were considered complete if fewer than 10 percent of submissions were still under review at the time we received FDA's data. All cohorts except FY 2011 were complete for the purposes of our analysis. As a result, it was too soon to tell what the final results for this cohort would be. [A] Approximately 39 percent of 510(k)s received in fiscal year (FY) 2011 were still under review at the time we received FDA's data, which cover reviews by CDRH through October 26, 2011, and reviews by CBER through December 23, 2011. As a result, it was too soon to tell what the final results for this cohort would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The percentage of 510(k)s reviewed within 90 days and within 150 days for the FY 2011 cohort may increase or decrease as those reviews are completed. The number of 510(k)s reviewed within 90 and 150 days and the average number of review cycles for the FY 2011 cohort may increase as those reviews are completed but will not decrease. [B] Only 510(k)s that had received a final decision from FDA were used to determine the number and percentage of 510(k)s reviewed within 90 days and within 150 days. [C] Fiscal years for which there was no corresponding 510(k) performance goal are denoted with [Empty]. [D] "n/a" denotes not applicable. In these years, there was no corresponding 510(k) performance goal and therefore no determination of whether the goal was met. [E] These results may change as the remaining 510(k) submissions for the FY 2010 and FY 2011 cohorts receive final decisions. [F] Cycles that were currently in progress at the time we received FDA's data were included in this analysis. The average number of review cycles for the FY 2011 cohort may increase as those reviews are completed but will not decrease. [End of table] Table 5: FDA Premarket Approval (PMA) Review Performance for Original PMAs, FYs 2000-2011: Fiscal years: Total number of submissions; Pre-MDUFMA: 2000: 73; 2001: 75; 2002: 51; MDUFMA: 2003: 50; 2004: 45; 2005: 57; 2006: 55; 2007: 37; MDUFA: 2008: 34; 2009: 41; 2010[A]: 54; 2011[A]: 43. Number reviewed in less than or equal to 180 days[B]; Pre-MDUFMA: 2000: 47; 2001: 42; 2002: 27; MDUFMA: 2003: 25; 2004: 18; 2005: 18; 2006: 19; 2007: 21; MDUFA: 2008: 21; 2009: 32; 2010[A]: 38; 2011[A]: 20. Percentage reviewed in less than or equal to 180 days[B]; Pre-MDUFMA: 2000: 67.1; 2001: 56.8; 2002: 52.9; MDUFMA: 2003: 50.0; 2004: 40.0; 2005: 32.1; 2006: 34.5; 2007: 56.8; MDUFA: 2008: 61.8; 2009: 80.0; 2010[A]: 86.4; 2011[A]: 90.9. 
Tier 1 goal percentage[C]; Pre-MDUFMA: 2000: [Empty]; 2001: [Empty]; 2002: [Empty]; MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005: [Empty]; 2006: [Empty]; 2007: 50; MDUFA: 2008: 60; 2009: 60; 2010[A]: 60; 2011[A]: 60. Met Tier 1 goal[D]; Pre-MDUFMA: 2000: n/a; 2001: n/a; 2002: n/a; MDUFMA: 2003: n/a; 2004: n/a; 2005: n/a; 2006: n/a; 2007: n/a; MDUFA: 2008: Yes; 2009: Yes; 2010[A]: Yes[E]; 2011[A]: Yes[E]. Number reviewed in less than or equal to 320 days[B]; Pre-MDUFMA: 2000: 63; 2001: 67; 2002: 40; MDUFMA: 2003: 47; 2004: 43; 2005: 50; 2006: 45; 2007: 33; MDUFA: 2008: 27; 2009: 35; 2010[A]: 44; 2011[A]: 22. Percentage reviewed in less than or equal to 320 days[B]; Pre-MDUFMA: 2000: 90.0; 2001: 90.5; 2002: 78.4; MDUFMA: 2003: 94.0; 2004: 95.6; 2005: 89.3; 2006: 81.8; 2007: 89.2; MDUFA: 2008: 79.4; 2009: 87.5; 2010[A]: 100.0; 2011[A]: 100.0. Tier 2 goal percentage[C]; Pre-MDUFMA: 2000: [Empty]; 2001: [Empty]; 2002: [Empty]; MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005: [Empty]; 2006: 80; 2007: 90; MDUFA: 2008: [Empty]; 2009: [Empty]; 2010[A]: [Empty]; 2011[A]: [Empty]. Met Tier 2 goal[D]; Pre-MDUFMA: 2000: n/a; 2001: n/a; 2002: n/a; MDUFMA: 2003: n/a; 2004: n/a; 2005: n/a; 2006: Yes; 2007: No; MDUFA: 2008: n/a; 2009: n/a; 2010[A]: n/a; 2011[A]: n/a. Number reviewed in less than or equal to 295 days[B]; Pre-MDUFMA: 2000: 58; 2001: 63; 2002: 39; MDUFMA: 2003: 43; 2004: 40; 2005: 41; 2006: 34; 2007: 29; MDUFA: 2008: 27; 2009: 35; 2010[A]: 44; 2011[A]: 22. Percentage reviewed in less than or equal to 295 days[B]; Pre-MDUFMA: 2000: 82.9; 2001: 85.1; 2002: 76.5; MDUFMA: 2003: 86.0; 2004: 88.9; 2005: 73.2; 2006: 61.8; 2007: 78.4; MDUFA: 2008: 79.4; 2009: 87.5; 2010[A]: 100.0; 2011[A]: 100.0. Tier 2 goal percentage[C]; Pre-MDUFMA: 2000: [Empty]; 2001: [Empty]; 2002: [Empty]; MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005: [Empty]; 2006: [Empty]; 2007: [Empty]; MDUFA: 2008: 90; 2009: 90; 2010[A]: 90; 2011[A]: 90. Met Tier 2 goal[D]; Pre-MDUFMA: 2000: n/a; 2001: n/a; 2002: n/a; MDUFMA: 2003: n/a; 2004: n/a; 2005: n/a; 2006: n/a; 2007: n/a; MDUFA: 2008: No; 2009: No; 2010[A]: Yes[E]; 2011[A]: Yes[E]. Average number of review cycles per submission[F]; Pre-MDUFMA: 2000: 1.77; 2001: 2.05; 2002: 2.16; MDUFMA: 2003: 1.82; 2004: 1.98; 2005: 2.54; 2006: 2.40; 2007: 2.51; MDUFA: 2008: 2.32; 2009: 2.12[G]; 2010[A]: 2.04; 2011[A]: 1.58. Of first-cycle decisions, percentage that were approval; Pre-MDUFMA: 2000: 15.1; 2001: 17.3; 2002: 9.8; MDUFMA: 2003: 16.0; 2004: 8.9; 2005: 8.8; 2006: 21.8; 2007: 8.1; MDUFA: 2008: 5.9; 2009: 9.8; 2010[A]: 7.4; 2011[A]: 9.3. Among original PMAs with a final decision, percentage of final decisions that were: Approval[G]; Pre-MDUFMA: 2000: 61.6; 2001: 69.3; 2002: 76.5; MDUFMA: 2003: 74.0; 2004: 79.5; 2005: 71.4; 2006: 90.4; 2007: 70.6; MDUFA: 2008: 68.8; 2009: 56.3; 2010[A]: 82.8; 2011[A]: 93.3. Among original PMAs with a final decision, percentage of final decisions that were: Denial[G]; Pre-MDUFMA: 2000: 0.0; 2001: 0.0; 2002: 0.0; MDUFMA: 2003: 0.0; 2004: 0.0; 2005: 0.0; 2006: 0.0; 2007: 0.0; MDUFA: 2008: 0.0; 2009: 0.0; 2010[A]: 0.0; 2011[A]: 0.0. Among original PMAs with a final decision, percentage of final decisions that were: Withdrawal[G]; Pre-MDUFMA: 2000: 31.5; 2001: 28.0; 2002: 23.5; MDUFMA: 2003: 26.0; 2004: 20.5; 2005: 28.6; 2006: 9.6; 2007: 29.4; MDUFA: 2008: 31.3; 2009: 43.8; 2010[A]: 17.2; 2011[A]: 6.7.
Average FDA review time (in days) for original PMAs that were not reviewed within 295 days[B]; Pre-MDUFMA: 2000: 383; 2001: 468; 2002: 451; MDUFMA: 2003: 333; 2004: 368; 2005: 342; 2006: 447; 2007: 422; MDUFA: 2008: 584; 2009: 571; 2010[A]: [Empty][H]; 2011[A]: [Empty][H]. Average time to final decision (in days) for original PMAs that were not reviewed within 295 days[G]; Pre-MDUFMA: 2000: 752; 2001: 1006; 2002: 688; MDUFMA: 2003: 591; 2004: 775; 2005: 727; 2006: 679; 2007: 645; MDUFA: 2008: 923; 2009: 823; 2010[A]: [Empty][H]; 2011[A]: [Empty][H]. Source: GAO analysis of FDA data. Notes: A review cohort includes all the medical device submissions relating to a particular performance goal that were submitted in a given fiscal year. For example, all original PMAs received by FDA from October 1, 2010, to September 30, 2011, make up the original PMA review cohort for FY 2011. Cohorts were considered complete if fewer than 10 percent of submissions were still under review at the time we received FDA's data. We treated PMA submissions as meeting the time frame for a given performance goal if they were reviewed within the goal time plus any extension to the goal time that may have been made. The only reason the goal time can be extended is if the sponsor submits a major amendment to the submission on its own initiative (i.e., unsolicited by FDA). [A] The FYs 2010 and 2011 original PMA cohorts were considered still incomplete. Specifically, for 18.5 percent of the FY 2010 original PMA cohort and 48.8 percent of the FY 2011 cohort, FDA had not yet made a decision that would permanently stop the review clock for purposes of determining whether FDA met its performance goals (i.e., an approval, approvable, not approvable, withdrawal, or denial) at the time we received FDA's data; this includes reviews by CBER through September 30, 2011, and reviews by CDRH through December 1, 2011. As a result, it was too soon to tell what the final results for these cohorts would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The percentage of original PMAs reviewed within 180 days for the FY 2010 and FY 2011 cohorts may increase or decrease as those reviews are completed; the number reviewed within 180 days and the number and percentage reviewed within 320 days and within 295 days may decrease as those reviews are completed. [B] Only original PMAs that had received a decision permanently stopping the review clock were used to determine the number and percentage of original PMAs reviewed within 180 days, within 320 days, and within 295 days. [C] Fiscal years for which there was no corresponding original PMA performance goal are denoted with [Empty]. [D] "n/a" denotes not applicable. In these years, there was no corresponding original PMA performance goal and therefore no determination of whether the goal was met. [E] These results may change as the remaining original PMA submissions for the FY 2010 and FY 2011 cohorts receive decisions that permanently stop the review clock for purposes of determining whether FDA met its performance goals. [F] Cycles that were currently in progress at the time we received FDA's data were included in this analysis. The average number of review cycles for the incomplete cohorts may increase as those reviews are completed but will not decrease. 
[G] This analysis includes only those original PMAs for which FDA or the sponsor had made a final decision; this includes reviews by CBER through September 30, 2011, and reviews by CDRH through December 1, 2011. For this analysis, the FYs 2009 through 2011 original PMA cohorts were considered still incomplete. Specifically, 22 percent of the FY 2009 original PMA cohort, 46.3 percent of the FY 2010 cohort, and 65.1 percent of the FY 2011 cohort had not yet received a final decision. As a result, it was too soon to tell what the final results for these cohorts would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The percentages of final decisions that were approval, denial, or withdrawal and the average time to final decision for original PMAs not meeting the 295-day time frame for the FYs 2009 through 2011 cohorts may increase or decrease as those reviews are completed. The average number of review cycles for the FYs 2009 through 2011 cohorts may increase as those reviews are completed but will not decrease. [H] For the FYs 2010 through 2011 cohorts, there were no original PMAs that had received a final decision that did not meet the 295-day time frame. [End of table] Table 6: FDA Premarket Approval (PMA) Review Performance for Expedited PMAs, FYs 2000-2011: Fiscal years: Total number of submissions; Pre-MDUFMA: 2000: 11; 2001: 10; 2002: 12; MDUFMA: 2003: 4; 2004: 17; 2005: 6; 2006: 3; 2007: 2; MDUFA: 2008: 4; 2009: 4; 2010[A]: 6; 2011[A]: 7. Number reviewed in less than or equal to 180 days[B]; Pre-MDUFMA: 2000: 2; 2001: 3; 2002: 3; MDUFMA: 2003: 2; 2004: 8; 2005: 1; 2006: 0; 2007: 0; MDUFA: 2008: 1; 2009: 2; 2010[A]: 2; 2011[A]: 1. Percentage reviewed in less than or equal to 180 days[B]; Pre-MDUFMA: 2000: 18.2; 2001: 30.0; 2002: 25.0; MDUFMA: 2003: 50.0; 2004: 47.1; 2005: 16.7; 2006: 0.0; 2007: 0.0; MDUFA: 2008: 25.0; 2009: 50.0; 2010[A]: 40.0; 2011[A]: 50.0. Tier 1 goal percentage[C]; Pre-MDUFMA: 2000: [Empty]; 2001: [Empty]; 2002: [Empty]; MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005: [Empty]; 2006: [Empty]; 2007: [Empty]; MDUFA: 2008: 50; 2009: 50; 2010[A]: 50; 2011[A]: 50. Met Tier 1 goal[D]; Pre-MDUFMA: 2000: n/a; 2001: n/a; 2002: n/a; MDUFMA: 2003: n/a; 2004: n/a; 2005: n/a; 2006: n/a; 2007: n/a; MDUFA: 2008: No; 2009: Yes; 2010[A]: No[E]; 2011[A]: Yes[E]. Number reviewed in less than or equal to 300 days[B]; Pre-MDUFMA: 2000: 8; 2001: 6; 2002: 11; MDUFMA: 2003: 3; 2004: 14; 2005: 5; 2006: 2; 2007: 0; MDUFA: 2008: 2; 2009: 3; 2010[A]: 5; 2011[A]: 2. Percentage reviewed in less than or equal to 300 days[B]; Pre-MDUFMA: 2000: 72.7; 2001: 60.0; 2002: 91.7; MDUFMA: 2003: 75.0; 2004: 82.4; 2005: 83.3; 2006: 66.7; 2007: 0.0; MDUFA: 2008: 50.0; 2009: 75.0; 2010[A]: 100.0; 2011[A]: 100.0. Less than or equal to 300 days goal percentage[C]; Pre-MDUFMA: 2000: [Empty]; 2001: [Empty]; 2002: [Empty]; MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005: 70; 2006: 80; 2007: 90; MDUFA: 2008: [Empty]; 2009: [Empty]; 2010[A]: [Empty]; 2011[A]: [Empty]. Met less than or equal to 300 days goal[D]; Pre-MDUFMA: 2000: n/a; 2001: n/a; 2002: n/a; MDUFMA: 2003: n/a; 2004: n/a; 2005: Yes; 2006: No; 2007: No; MDUFA: 2008: n/a; 2009: n/a; 2010[A]: n/a; 2011[A]: n/a. Number reviewed in less than or equal to 280 days[B]; Pre-MDUFMA: 2000: 7; 2001: 6; 2002: 10; MDUFMA: 2003: 3; 2004: 14; 2005: 4; 2006: 1; 2007: 0; MDUFA: 2008: 2; 2009: 3; 2010[A]: 5; 2011[A]: 2.
Percentage reviewed in less than or equal to 280 days[B]; Pre-MDUFMA: 2000: 63.6; 2001: 60.0; 2002: 83.3; MDUFMA: 2003: 75.0; 2004: 82.4; 2005: 66.7; 2006: 33.3; 2007: 0.0; MDUFA: 2008: 50.0; 2009: 75.0; 2010[A]: 100.0; 2011[A]: 100.0. Tier 2 goal percentage[C]; Pre-MDUFMA: 2000: [Empty]; 2001: [Empty]; 2002: [Empty]; MDUFMA: 2003: [Empty]; 2004: [Empty]; 2005: [Empty]; 2006: [Empty]; 2007: [Empty]; MDUFA: 2008: 90; 2009: 90; 2010[A]: 90; 2011[A]: 90. Met Tier 2 goal[D]; Pre-MDUFMA: 2000: n/a; 2001: n/a; 2002: n/a; MDUFMA: 2003: n/a; 2004: n/a; 2005: n/a; 2006: n/a; 2007: n/a; MDUFA: 2008: No; 2009: No; 2010[A]: Yes[E]; 2011[A]: Yes[E]. Average number of review cycles per submission[F]; Pre-MDUFMA: 2000: 2.45; 2001: 2.60; 2002: 2.08; MDUFMA: 2003: 2.75; 2004: 2.29; 2005: 3.50; 2006: 4.00; 2007: 3.00; MDUFA: 2008: 2.75; 2009: 2.50; 2010[A]: 2.00; 2011[A]: 2.00. Of first-cycle decisions, percentage that were approval; Pre-MDUFMA: 2000: 0.0; 2001: 20.0; 2002: 16.7; MDUFMA: 2003: 0.0; 2004: 17.6; 2005: 16.7; 2006: 0.0; 2007: 0.0; MDUFA: 2008: 25.0; 2009: 0.0; 2010[A]: 16.7; 2011[A]: 14.3. Among expedited PMAs with a final decision, percentage of final decisions that were: Approval[G]; Pre-MDUFMA: 2000: 81.8; 2001: 90.0; 2002: 75.0; MDUFMA: 2003: 75.0; 2004: 81.3; 2005: 66.7; 2006: 100.0; 2007: 0.0; MDUFA: 2008: 25.0; 2009: 75.0; 2010[A]: 75.0; 2011[A]: 100.0. Among expedited PMAs with a final decision, percentage of final decisions that were: Denial[G]; Pre-MDUFMA: 2000: 0.0; 2001: 0.0; 2002: 0.0; MDUFMA: 2003: 0.0; 2004: 0.0; 2005: 0.0; 2006: 0.0; 2007: 0.0; MDUFA: 2008: 25.0; 2009: 0.0; 2010[A]: 0.0; 2011[A]: 0.0. Among expedited PMAs with a final decision, percentage of final decisions that were: Withdrawal[G]; Pre-MDUFMA: 2000: 18.2; 2001: 10.0; 2002: 25.0; MDUFMA: 2003: 25.0; 2004: 18.8; 2005: 33.3; 2006: 0.0; 2007: 100.0; MDUFA: 2008: 50.0; 2009: 25.0; 2010[A]: 25.0; 2011[A]: 0.0. Average FDA review time (in days) for expedited PMAs that were not reviewed within 280 days[B]; Pre-MDUFMA: 2000: 343; 2001: 425; 2002: 308; MDUFMA: 2003: 447; 2004: 483; 2005: 427; 2006: 511; 2007: 344; MDUFA: 2008: 489; 2009: 414; 2010[A]: [Empty] [H]; 2011[A]: [Empty][H]. Average time to final decision (in days) for expedited PMAs that were not reviewed within 280 days[G]; Pre-MDUFMA: 2000: 588; 2001: 520; 2002: 334; MDUFMA: 2003: 1125; 2004: 713; 2005: 939; 2006: 792; 2007: 1111; MDUFA: 2008: 1017; 2009: 483; 2010[A]: [Empty][H]; 2011[A]: [Empty][H]. Source: GAO analysis of FDA data. Notes: A review cohort includes all the medical device submissions relating to a particular performance goal that were submitted in a given fiscal year. For example, all expedited PMAs received by FDA from October 1, 2010, to September 30, 2011, make up the expedited PMA review cohort for FY 2011. Cohorts were considered complete if fewer than 10 percent of submissions were still under review at the time we received FDA's data. All cohorts except FY 2010 and FY 2011 were complete for the purposes of our analysis. As a result, it was too soon to tell what the final results for these cohorts would be. We treated PMA submissions as meeting the time frame for a given performance goal if they were reviewed within the goal time plus any extension to the goal time that may have been made. The only reason the goal time can be extended is if the sponsor submits a major amendment to the submission on its own initiative (i.e., unsolicited by FDA). [A] The FYs 2010 and 2011 expedited PMA cohorts were considered still incomplete.
Specifically, 33 percent of the FY 2010 expedited PMA cohort and 71.4 percent of the FY 2011 cohort had not yet received a final decision; this includes reviews by CBER through September 30, 2011, and reviews by CDRH through December 1, 2011. Additionally, for 16.7 percent of the FY 2010 expedited PMA cohort and 71.4 percent of the FY 2011 cohort, FDA had not yet made a decision that would permanently stop the review clock for purposes of determining whether FDA met its performance goals (i.e., an approval, approvable, not approvable, withdrawal, or denial) at the time we received FDA's data. As a result, it was too soon to tell what the final results for these cohorts would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The percentage of expedited PMAs reviewed within 180 days for the FY 2010 and FY 2011 cohorts may increase or decrease as those reviews are completed; the number reviewed within 180 days and the number and percentage reviewed within 300 days and within 280 days may decrease as those reviews are completed. The percentages of final decisions that were approval, denial, or withdrawal and the average time to final decision for the FYs 2010 through 2011 cohorts may increase or decrease as those reviews are completed. The average number of review cycles for the FYs 2010 through 2011 cohorts may increase as those reviews are completed but will not decrease. [B] Only expedited PMAs that had received a decision permanently stopping the review clock were used to determine the number and percentage of expedited PMAs reviewed within 180, 300, and 280 days. [C] Fiscal years for which there was no corresponding expedited PMA performance goal are denoted with [Empty]. [D] "n/a" denotes not applicable. In these years, there was no corresponding expedited PMA performance goal and therefore no determination of whether the goal was met. [E] These results may change as the remaining expedited PMA submissions for the FY 2010 and FY 2011 cohorts receive decisions that permanently stop the review clock for purposes of determining whether FDA met its performance goals. [F] Cycles that were currently in progress at the time we received FDA's data were included in this analysis. The average number of review cycles for the incomplete cohorts may increase as those reviews are completed but will not decrease. [G] Only expedited PMAs that had received a final decision were used to determine the percentages of final decisions that were approval, denial, or withdrawal, and the average time to final decision for expedited PMAs not reviewed within 280 days. [H] For the FYs 2010 through 2011 cohorts, there were no expedited PMAs that had received a final decision that did not meet the 280-day time frame. [End of table] [End of section] Appendix II: Number of Full-time Equivalent (FTE) FDA Staff Supporting Medical Device User Fee Activities, FYs 2003 through 2010: Center for Devices and Radiological Health (CDRH): FDA centers and offices: Office of the Center Director (OCD)[A]; Number of FTEs in each fiscal year: 2003: 13; 2004: 12; 2005: 15; 2006: 19; 2007: 34; 2008: 35; 2009: 52; 2010: 63. FDA centers and offices: Office of Management Operations (OSM/OMO); Number of FTEs in each fiscal year: 2003: 89; 2004: 62; 2005: 64; 2006: 62; 2007: 61; 2008: 50; 2009: 48; 2010: 52.
FDA centers and offices: Office of Information Technology (OIT)[B]; Number of FTEs in each fiscal year: 2003: [Empty]; 2004: [Empty]; 2005: [Empty]; 2006: [Empty]; 2007: [Empty]; 2008: 16; 2009: 15; 2010: 17. FDA centers and offices: Office of Compliance (OC); Number of FTEs in each fiscal year: 2003: 40; 2004: 46; 2005: 54; 2006: 55; 2007: 56; 2008: 61; 2009: 60; 2010: 60. FDA centers and offices: Office of Device Evaluation (ODE); Number of FTEs in each fiscal year: 2003: 305; 2004: 301; 2005: 298; 2006: 287; 2007: 311; 2008: 326; 2009: 322; 2010: 329. FDA centers and offices: Office of Science and Engineering Laboratories (OST/OSEL); Number of FTEs in each fiscal year: 2003: 88; 2004: 94; 2005: 110; 2006: 91; 2007: 89; 2008: 86; 2009: 87; 2010: 94. FDA centers and offices: Office of Communication, Education, and Radiation Programs (OHIP/OCER); Number of FTEs in each fiscal year: 2003: 42; 2004: 42; 2005: 47; 2006: 35; 2007: 45; 2008: 43; 2009: 35; 2010: 43. FDA centers and offices: Office of Surveillance and Biometrics (OSB); Number of FTEs in each fiscal year: 2003: 86; 2004: 92; 2005: 104; 2006: 106; 2007: 98; 2008: 105; 2009: 117; 2010: 139. FDA centers and offices: Office of In Vitro Diagnostics (OIVD)[C]; Number of FTEs in each fiscal year: 2003: [Empty]; 2004: 49; 2005: 61; 2006: 67; 2007: 65; 2008: 71; 2009: 69; 2010: 105. FDA centers and offices: Committee Conference Management (CCM)[D]; Number of FTEs in each fiscal year: 2003: [Empty]; 2004: [Empty]; 2005: [Empty]; 2006: [Empty]; 2007: [Empty]; 2008: 1; 2009: 1; 2010: 1. FDA centers and offices: CDRH Total; Number of FTEs in each fiscal year: 2003: 662; 2004: 697; 2005: 753; 2006: 721; 2007: 760; 2008: 793; 2009: 805; 2010: 902. Center for Biologics Evaluation and Research (CBER): FDA centers and offices: Center Director's Office, Office of Management (OM), Office of Information Management (OIM), and Office of Communication, Outreach, and Development (OCOD); Number of FTEs in each fiscal year: 2003: 13; 2004: 15; 2005: 20; 2006: 23; 2007: 23; 2008: 23; 2009: 24; 2010: 28. FDA centers and offices: Office of Blood Research & Review; Number of FTEs in each fiscal year: 2003: 37; 2004: 43; 2005: 49; 2006: 70; 2007: 66; 2008: 66; 2009: 64; 2010: 64. FDA centers and offices: Office of Cellular, Tissue & Gene Therapies; Number of FTEs in each fiscal year: 2003: 2; 2004: 2; 2005: 2; 2006: 2; 2007: 2; 2008: 5; 2009: 6; 2010: 6. FDA centers and offices: Office of Vaccines Research & Review; Number of FTEs in each fiscal year: 2003: 1; 2004: 0; 2005: 0; 2006: 0; 2007: 1; 2008: 1; 2009: 1; 2010: 4. FDA centers and offices: Office of Therapeutics Research & Review; Number of FTEs in each fiscal year: 2003: 1; 2004: 0; 2005: 0; 2006: 0; 2007: 0; 2008: 0; 2009: 0; 2010: 0. FDA centers and offices: Office of Biostatistics & Epidemiology; Number of FTEs in each fiscal year: 2003: 1; 2004: 2; 2005: 2; 2006: 2; 2007: 2; 2008: 2; 2009: 4; 2010: 5. FDA centers and offices: Office of Compliance & Biologics Quality; Number of FTEs in each fiscal year: 2003: 5; 2004: 6; 2005: 9; 2006: 6; 2007: 8; 2008: 6; 2009: 4; 2010: 7. FDA centers and offices: CBER Total; Number of FTEs in each fiscal year: 2003: 59; 2004: 68; 2005: 81; 2006: 104; 2007: 101; 2008: 102; 2009: 103; 2010: 114. Office of Regulatory Affairs (ORA): FDA centers and offices: ORA Total; Number of FTEs in each fiscal year: 2003: 59; 2004: 60; 2005: 62; 2006: 64; 2007: 64; 2008: 62; 2009: 57; 2010: 69.
Office of the Commissioner (OC): FDA centers and offices: OC Total; Number of FTEs in each fiscal year: 2003: 77; 2004: 70; 2005: 77; 2006: 78; 2007: 90; 2008: 74; 2009: 84; 2010: 86. Shared Service (SS)[E]: FDA centers and offices: SS Total; Number of FTEs in each fiscal year: 2003: [Empty]; 2004: 20; 2005: 60; 2006: 53; 2007: 57; 2008: 64; 2009: 60; 2010: 59. All Centers and Offices Total; Number of FTEs in each fiscal year: 2003: 857; 2004: 915; 2005: 1,034; 2006: 1,020; 2007: 1,071; 2008: 1,095; 2009: 1,109; 2010: 1,230. Source: GAO analysis of FDA data. Note: All FTEs are rounded to the nearest whole number. FTEs for each fiscal year may not add to the fiscal year total due to rounding. One FTE represents 40 hours of work per week conducted by a federal government employee over the course of 1 year. FTEs do not include contractors and therefore provide a partial measure of staffing resources. [A] OCD includes Medical Device Fellowship Program employees even though the Fellows were assigned to work throughout CDRH. [B] OIT was included in the OMO FTE total prior to FY 2008. [C] OIVD did not exist prior to FY 2004. Also, the Radiology Devices Branch was moved from ODE to OIVD between FY 2009 and FY 2010. [D] CCM was included in the OMO FTE total prior to FY 2008. [E] Shared Service FTEs were not separated from the center FTEs until FY 2004. [End of table] [End of section] Appendix III: Comments from the Department of Health and Human Services: Department Of Health And Human Services: Office Of The Secretary: Assistant Secretary for Legislation: Washington, DC 20201: February 22, 2012: Marcia Crosse: Director, Health Care: U.S. Government Accountability Office: 441 G Street NW: Washington, DC 20548: Dear Ms. Crosse: Attached are comments on the U.S. Government Accountability Office's (GAO) report entitled, "Medical Devices: FDA Has Met Most Performance Goals but Device Reviews Are Taking Longer" (GAO-12-418). The Department appreciates the opportunity to review this draft section of the report prior to publication. Sincerely, Signed by: Jim R. Esquea: Assistant Secretary for Legislation: Attachment: [End of letter] General Comments Of The Department Of Health And Human Services (HHS) On The Government Accountability Office's (GAO) Draft Report Entitled, "Medical Devices: FDA Has Met Most Performance Goals But Device Reviews Are Taking Longer" (GAO-12-418): The Department appreciates the opportunity to review and comment on this draft report. As GAO notes, the Food and Drug Administration's (FDA) own annual reports to Congress[Footnote 1] regarding medical device review times reveal the same patterns that GAO has found in this report. FDA has continued to make progress during FY 2011 while implementing MDUFA II. Overall, FDA has already met or exceeded, or has the potential to meet or exceed, based on preliminary data of completed and pending reviews, 17 of 21 Tier 1 performance goals and 15 of 21 Tier 2 performance goals for the FY 2008 through FY 2011 performance goal cohorts. In addition, GAO noted, and FDA also has identified in its own reports, that the total time to a decision (FDA review time plus the time it takes a sponsor to provide requested information) has increased, yet FDA's performance in meeting its goals is by and large strong. Since the total time to a decision includes the time industry incurs in responding to FDA's concerns, FDA and industry bear shared responsibility for the time increase and will need to work together to improve performance on total time to decision.
FDA recently conducted an analysis of premarket review times under the 510(k) program and identified quality issues in more than 50 percent of 510(k) applications. These quality issues require agency staff to prepare and issue additional information letters, resulting in additional review cycles. At the same time, FDA has been instituting management changes to improve its premarket review programs, which the agency expects will help address the increase in total time to decision. FDA has conducted a thorough assessment of the 510(k) review program and an assessment of how CDRH uses science in regulatory decision-making. FDA issued two reports in August 2010 that found that FDA needed to take several critical actions to improve the predictability, consistency, and transparency of both the 510(k) and premarket approval application review programs. The agency solicited public comment on the recommendations identified in the studies and received a range of perspectives from stakeholders, including medical device companies, industry representatives, venture capitalists, health care professional organizations, third-party payers, patient and consumer advocacy groups, foreign regulatory bodies, and others. After considering the public input, in January 2011, FDA announced 25 specific actions[Footnote 2] that the agency would take to improve the predictability, consistency, and transparency of its premarket review programs. Subsequently, FDA has taken or is taking actions to: * create a culture change toward greater transparency, interaction, collaboration, and the appropriate balancing of benefits and risk; * ensure predictable and consistent recommendations, decision-making, and application of the least-burdensome principle; and * implement efficient processes and use of resources. These actions exemplify FDA's commitment to increasing the timely availability of safe and effective new medical devices to patients and healthcare providers. Finally, FDA needs adequate, stable funding to manage a device program that can approve or clear safe and effective products for patients without delay, and only timely reauthorization of a medical device user fee program that both FDA and the medical device industry support can assure such funding. Footnotes: [1] [hyperlink, http://www.fda.gov/AboutFDA/ReportsManualsForms/Reports/UserFeeReports/PerformanceReports/MDUFMA/default.htm]. [2] [hyperlink, http://www.fda.gov/downloads/AboutFDA/CentersOffices/CDRH/CDRHReports/UCM239450.pdf]. [End of section] Appendix IV: GAO Contact and Staff Acknowledgments: GAO Contact: Marcia Crosse, (202) 512-7114 or crossem@gao.gov: Staff Acknowledgments: In addition to the contact named above, Robert Copeland, Assistant Director; Carolyn Fitzgerald; Cathleen Hamann; Karen Howard; Hannah Marston Minter; Lisa Motley; Aubrey Naffis; Michael Rose; and Rachel Schulman made key contributions to this report. [End of section] Footnotes: [1] Medical devices include instruments, apparatuses, machines, and implants that are intended for use to diagnose, cure, treat, or prevent disease, or to affect the structure or any function of the body. See 21 U.S.C. § 321(h). These devices range from simple tools such as bandages and surgical clamps to complicated devices such as pacemakers. [2] See Pub. L. No. 107-250, § 102(a), 116 Stat. 1588, 1589-1600 (2002) (codified as amended at 21 U.S.C. §§ 379i and 379j). [3] A user fee is a fee assessed for goods and services provided by the federal government.
FDA collected one type of user fee--application fees--under MDUFMA. [4] FDA collects three types of user fees under MDUFA: application fees, annual establishment registration fees, and annual fees for periodic reports regarding Class III devices. [5] For the remainder of this report, we use the term "user fees" to refer to user fees submitted with device applications such as premarket approvals (PMA), premarket notifications (510(k)), and various types of PMA supplements. [6] The PMA review process is more stringent than the 510(k) review process and is generally used for higher risk devices. As part of a PMA submission, the manufacturer is required to supply evidence providing reasonable assurance that the device is safe and effective. Under the 510(k) process, FDA determines whether the device is substantially equivalent to a legally marketed device. FDA also reviews several other types of medical device submissions that are outside the scope of our work. PMAs and 510(k)s make up the majority of device submissions received by FDA. For example, our analysis of PMAs and 510(k)s included 88.6 percent of all device submissions to FDA in FY 2010. [7] See Pub. L. No. 110-85, § 201(c), 121 Stat. 823, 842-43 (2007). The performance goals are identified in letters sent by the Secretary of Health and Human Services to the Chairman of the Senate Committee on Health, Education, Labor, and Pensions and the Chairman of the House Committee on Energy and Commerce and are published on FDA's website. Each fiscal year, FDA is required to submit a report on its progress in achieving those goals and future plans for meeting them. See 21 U.S.C. § 379j-1(a). [8] FDA does not have to approve a PMA or clear a 510(k) for it to be considered acted upon. There are a number of decisions in addition to an approval or clearance decision that can end FDA's review. [9] We defined a cohort to be complete if fewer than 10 percent of submissions from that cohort were still under review at the time we received FDA's data. [10] The data for FY 2011 are preliminary because the agency had only completed its review for 61 percent of 510(k) submissions and 51 percent of PMA submissions at the time we received FDA's data. As a result, it was too soon to tell what the final results for this cohort would be. For example, it is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. [11] The first review cycle begins when a sponsor makes a submission to FDA and ends when FDA either makes a decision or contacts the sponsor in writing to request additional information. A new review cycle begins when the sponsor sends a response back to FDA. [12] When we refer to consumer advocacy groups, we are referring to groups that advocate on behalf of consumers and patients. [13] It was beyond the scope of our review to describe all issues raised by stakeholders. Such issues--including barriers to innovation in the medical device industry or the need for increased resources at FDA--have been extensively covered in other reports, forums, and hearings. [14] See 21 U.S.C. § 360c(a)(1); 21 C.F.R. pt. 860. [15] Some class III device types on the market before the enactment of the Medical Device Amendments of 1976 and those determined to be substantially equivalent to them do not currently require PMA approval for marketing. FDA has been taking steps to address these device types; once this process is completed, all class III devices will be required to go through the PMA review process.
For additional information, see GAO, Medical Devices: FDA's Premarket Review and Postmarket Safety Efforts, [hyperlink, http://www.gao.gov/products/GAO-11-556T] (Washington, D.C.: Apr. 13, 2011). [16] A small percentage of devices enter the market by other means, such as through the humanitarian device exemption process that authorizes FDA to exempt certain medical devices from the premarket review requirement to demonstrate effectiveness in order to provide an incentive for the development of devices that treat or diagnose rare diseases or conditions. See 21 U.S.C. § 360j(m); 21 C.F.R. pt. 814, subpt. H. [17] A legally marketed device to which a new device may be compared for a determination regarding substantial equivalence is a device that was legally marketed prior to 1976, a device that has been reclassified from class III to class II or I, or a device that has been found to be substantially equivalent through the 510(k) process. See 21 C.F.R. § 807.92(a)(3). [18] See Department of Health and Human Services, Food and Drug Administration, Quarterly Update on Medical Device Performance Goals (Silver Spring, Md.: July 26, 2011). [19] A cohort is comprised of all the submissions of a certain type filed in the same fiscal year. For example, all 510(k)s received by FDA from October 1, 2010, to September 30, 2011, make up the 510(k) review cohort for FY 2011. [20] Tier 1 and Tier 2 designations refer to the length of time allotted (90 days and 150 days, respectively) for FDA to complete its review of 510(k) submissions. If FDA completed its review of a submission in 90 days or less, it met the time frame for both the Tier 1 and Tier 2 goals. If the review was completed in more than 90 days but not more than 150 days, only the time frame for the Tier 2 goal was met. If the review took longer than 150 days, the time frame for neither goal was met. FDA did not designate 510(k) performance goals as either Tier 1 or Tier 2 prior to FY 2008. We have aligned the performance goals in place prior to FY 2008 with the Tier 1 goals for FYs 2008-2011 based on sharing the same 90-day time frame. This placement illustrates the increase in the goal percentage over time. We defined a 510(k) cohort to be complete if fewer than 10 percent of submissions from that cohort were still under review at the time we received FDA's data, which cover reviews by CDRH through October 26, 2011, and reviews by CBER through December 23, 2011. Using this definition, FY 2011 was the only 510(k) cohort that was incomplete. [21] When calculating whether FDA met the performance goal for a 510(k) cohort, FDA and industry have agreed to include only those submissions receiving a substantially equivalent or not substantially equivalent decision. For our analysis, we included all 510(k)s that had received a final decision, regardless of the decision received, in order to provide a broader look at FDA's review performance. [22] In its July 2011 analysis of 510(k) submissions, FDA concluded that reviewers asked for additional information from sponsors--thus stopping the clock on FDA's review time while the total time to reach a final decision continued to elapse--mainly due to problems with the quality of the submission. See U.S. Department of Health and Human Services, Food and Drug Administration, Analysis of Premarket Review Times Under the 510(k) Program (Silver Spring, Md.: July 2011). [23] For example, for FY 2012 the standard PMA application fee is $220,050 while the 510(k) fee is $4,049. 
[24] In its annual performance reports, FDA refers to these two types as original PMAs and expedited original PMAs. [25] See 21 U.S.C. § 360e(d)(5). Unmet medical need is demonstrated by meeting one of the following criteria: the device represents a breakthrough technology that provides a clinically meaningful advantage over existing technology; no approved alternative treatment or means of diagnosis exists; the device offers significant, clinically meaningful advantages over existing approved alternative treatments; or the availability of the device is in the best interest of patients. [26] An approval order informs the applicant that the PMA has been approved. An approvable letter is sent to inform the applicant that there needs to be resolution of minor deficiencies or completion of an FDA inspection. A major deficiency letter informs the applicant that the PMA lacks significant information necessary for the agency to complete its review and requests the applicant amend the submission to provide the necessary information. A not approvable letter informs the applicant that the submission cannot be approved at that time because of significant deficiencies; describes the deficiencies; and, where practical, describes the measures required to make the submission approvable. Generally, before FDA issues a not approvable letter, it will first issue a major deficiency letter to provide the applicant with an opportunity to address its concerns. A denial order notifies the applicant that the PMA is not approved and informs the applicant of its deficiencies. [27] In September 2009, FDA convened an internal 510(k) working group to conduct a comprehensive assessment of the 510(k) process. This assessment resulted in the publication of a preliminary report in August 2010, which was intended to communicate preliminary findings and recommendations and actions FDA might take to address identified areas of concern. See U.S. Department of Health and Human Services, Food and Drug Administration, 510(k) Working Group: Preliminary Report and Recommendations (Silver Spring, Md.: August 2010). Also in September 2009, FDA convened an internal task force on the utilization of science in regulatory decision making. This task force was responsible for reviewing how FDA uses science in its regulatory decision making for device reviews and making recommendations on how FDA can quickly incorporate new science--including evolving information, novel technologies, and new scientific methods--into its decision making, while maintaining as much predictability as practical. The task force released a preliminary report with its findings and recommendations in August 2010. See U.S. Department of Health and Human Services, Food and Drug Administration, Task Force on the Utilization of Science in Regulatory Decision Making, Preliminary Report and Recommendations (Silver Spring, Md.: August 2010). [28] In January 2011, after reviewing public comments on the August 2010 reports, FDA issued a plan of action for implementing the recommendations in the reports. See U.S. Department of Health and Human Services, Food and Drug Administration, Plan of Action for Implementation of 510(k) and Science Recommendations (Silver Spring, Md.: January 2011). FDA began implementing these actions in March 2011 and the majority of the actions had been implemented or were underway at the time of our report. See U.S. 
Department of Health and Human Services, Food and Drug Administration, CDRH Plan of Action for 510(k) and Science (Silver Spring, Md.: October 2011). [29] When calculating whether FDA met the performance goal for a 510(k) cohort, FDA and industry have agreed to include only those submissions receiving a substantially equivalent or not substantially equivalent decision. For our analysis, we included all 510(k)s that had received a final decision, regardless of the decision received, in order to provide a broader look at FDA's review performance. [30] Approximately 39 percent of 510(k)s received in FY 2011 were still under review at the time we received FDA's data, which cover reviews by CDRH through October 26, 2011, and reviews by CBER through December 23, 2011. As a result, it was too soon to tell what the final results for this cohort would be. For example, the percentage of completed 510(k)s that met the 90-day performance goal time frame was 97.2 percent. However, the percentage of 510(k)s reviewed within 90 days for the FY 2011 cohort may increase or decrease as those reviews are completed. [31] Approximately 39 percent of 510(k)s received in FY 2011 were still under review at the time we received FDA's data, which cover reviews by CDRH through October 26, 2011, and reviews by CBER through December 23, 2011. As a result, it was too soon to tell what the final results for this cohort would be. For example, the percentage of completed 510(k)s that met the 150-day performance goal was 99.8 percent. However, the percentage of 510(k)s reviewed within 150 days for the FY 2011 cohort may increase or decrease as those reviews are completed. [32] Approximately 39 percent of 510(k)s received in FY 2011 were still under review at the time we received FDA's data, which cover reviews by CDRH through October 26, 2011, and reviews by CBER through December 23, 2011. As a result, it was too soon to tell what the final results for this cohort would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. [33] Approximately 39 percent of 510(k)s received in FY 2011 were still under review at the time we received FDA's data, which cover reviews by CDRH through October 26, 2011, and reviews by CBER through December 23, 2011. As a result, it was too soon to tell what the final results for this cohort would be. Therefore, the percentage of 510(k)s in the FY 2011 cohort receiving a first-cycle substantially equivalent decision may increase or decrease as those reviews are completed. [34] Approximately 39 percent of 510(k)s received in FY 2011 were still under review at the time we received FDA's data, which cover reviews by CDRH through October 26, 2011, and reviews by CBER through December 23, 2011. As a result, it was too soon to tell what the final results for this cohort would be. Therefore, the percentage of 510(k)s in the FY 2011 cohort receiving a first-cycle AI letter may increase or decrease as those reviews are completed. [35] We treated PMA submissions as meeting the time frame for a given performance goal if they were reviewed within the goal time plus any extension to the goal time that may have been made. The only reason the goal time can be extended is if the sponsor submits a major amendment to the submission on its own initiative (i.e., not solicited by FDA). 
According to FDA, typical situations that might prompt a sponsor to submit an unsolicited major amendment include when the applicant obtains additional test data related to the safety or effectiveness of the device or obtains new clinical data from a previously unreported study. [36] PMA performance goals were not designated as Tier 1 or Tier 2 until FY 2008. We have aligned the performance goals in place prior to FY 2008 with the Tier 1 or Tier 2 goals for FYs 2008-2011 based on sharing the same or similar goal time frames. This placement illustrates the increase in the goal percentage over time. [37] For this analysis, the FYs 2010 and 2011 original PMA cohorts were still incomplete. Specifically, for 18.5 percent of the FY 2010 original PMA cohort and 48.8 percent of the FY 2011 cohort, a decision that would permanently stop the review clock for purposes of determining whether FDA met its performance goals had not been made at the time we received FDA's data. As a result, it was too soon to tell what the final results for these cohorts would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The percentage of original PMAs reviewed within 180 days for these two cohorts may increase or decrease as those reviews are completed. [38] For this analysis, the FYs 2010 and 2011 original PMA cohorts were still incomplete. Specifically, for 18.5 percent of the FY 2010 original PMA cohort and 48.8 percent of the FY 2011 cohort, a decision that would permanently stop the review clock for purposes of determining whether FDA met its performance goals had not been made at the time we received FDA's data. As a result, it was too soon to tell what the final results for these cohorts would be. It is possible that some of the reviews taking the most time were among those not completed when we received FDA's data. The percentage of original PMAs reviewed within 320 and 295 days for these two cohorts may increase or decrease as those reviews are completed. [39] The FY 2009 original PMA cohort is complete for purposes of calculating FDA review time but incomplete for the calculation of time to final decision because some submissions in the cohort have received a decision ending a review cycle (e.g., approvable letter) and permanently stopping the review clock for purposes of determining whether FDA met its performance goals, but have not yet received a final decision such as approval or denial that would end the review process. [40] See U.S. Department of Health and Human Services, Food and Drug Administration, Draft Guidance for Industry and Food and Drug Administration Staff. The 510(k) Program: Evaluating Substantial Equivalence in Premarket Notifications [510(k)] (Silver Spring, Md.: Dec. 27, 2011). [41] See U.S. Department of Health and Human Services, Food and Drug Administration, CDRH Guidance Development (Silver Spring, Md.: Aug. 1, 2011). [42] See U.S. Department of Health and Human Services, Food and Drug Administration, Medical Device Premarket Programs: An Overview of FDA Actions (Silver Spring, Md.: Oct. 19, 2011). Interactive review--which was established following the 2007 reauthorization of the medical device user fee program--was created to formalize a process to encourage and facilitate communication between FDA and sponsors during reviews.
[43] Additionally, FDA recently issued draft guidance regarding the factors that FDA considers when making benefit-risk determinations in order to increase the transparency of these determinations. See U.S. Department of Health and Human Services, Food and Drug Administration, Draft Guidance for Industry and Food and Drug Administration Staff. Factors to Consider when Making Benefit-Risk Determinations in Medical Device Premarket Review (Silver Spring, Md.: Aug. 15, 2011). According to FDA, the criteria in this guidance take a patient-centric approach by calling for the consideration of patients' tolerance for risk. [44] See U.S. Department of Health and Human Services, Food and Drug Administration, 510(k) Working Group: Preliminary Report and Recommendations (Silver Spring, Md.: August 2010) and U.S. Department of Health and Human Services, Food and Drug Administration, Task Force on the Utilization of Science in Regulatory Decision Making, Preliminary Report and Recommendations (Silver Spring, Md.: August 2010). FDA identified several causes of the issues noted in these assessments, including very high reviewer and manager turnover at CDRH; insufficient training for staff and industry; extremely high ratios of front-line supervisors to employees; insufficient oversight by managers; CDRH's rapidly growing workload, caused by the increasing complexity of devices and the number of submissions reviewed; unnecessary and/or inconsistent data requirements imposed on device sponsors; insufficient guidance for industry; and poor-quality submissions from industry. [45] See U.S. Department of Health and Human Services, Food and Drug Administration, CDRH Standard Operating Procedure for "Notice to Industry" Letters (Silver Spring, Md.: June 14, 2011). The SOP provides a streamlined, systematic process for communicating with industry via the guidance process and other means, as appropriate. Notice to Industry letters are short communications that describe, at a very high level, changes to scientific data requirements and FDA's reasons for those changes. Some of these letters may constitute guidance, while others will not. Because these letters are short and are overseen by upper management at the Center, they can be developed and released in roughly 3 weeks. FDA posts these letters on its website and also uses additional methods for distributing the letters to stakeholders. [46] See U.S. Department of Health and Human Services, Food and Drug Administration, SOP: Management of Review Staff Changes During the Review of a Premarket Submission (Silver Spring, Md.: Dec. 27, 2011). [47] See U.S. Department of Health and Human Services, Food and Drug Administration, Driving Biomedical Innovation: Initiatives to Improve Products for Patients (Silver Spring, Md.: October 2011). The reviewer certification program includes up to 18 months of training on specific core competencies through online training modules, instructor-led courses, and practical experience. [48] The experiential learning program will include visits to academic institutions, manufacturers, research organizations, and health care facilities to provide new reviewers with a broader view of the regulatory process for medical devices. [49] See U.S. Department of Health and Human Services, Food and Drug Administration, Analysis of Premarket Review Times Under the 510(k) Program (Silver Spring, Md.: July 2011). 
[50] Quality problems included (1) inadequate device description, (2) discrepancies throughout the submission, (3) problems with the indications for use, (4) failure to follow or otherwise address current guidance documents or recognized standards, (5) lack of performance data, and (6) lack of clinical data. [51] See U.S. Department of Health and Human Services, Food and Drug Administration, Medical Device User Fee Program Public Meeting (Hyattsville, Md.: Sept. 14, 2010). [52] See A Delicate Balance: FDA and the Reform of the Medical Device Approval Process, Hearing Before the Special Committee on Aging, United States Senate, 112th Cong. (2011) (statement of Diana Zuckerman, President, National Research Center for Women and Families, Cancer Prevention and Treatment Fund). [53] See Institute of Medicine of the National Academies, Medical Devices and the Public's Health: The FDA 510(k) Clearance Process at 35 Years (Washington, D.C.: July 29, 2011). IOM concluded that the standard for assessing substantial equivalence to a predicate device for clearance under the 510(k) program generally does not require evidence of safety or effectiveness of a device. Therefore, IOM concluded that FDA cannot evaluate the safety and effectiveness of devices as long as the standard for clearance is substantial equivalence, as directed in statute. IOM also concluded that available information on postmarket performance of devices does not provide sufficient information about potential harm or lack of effectiveness to be a useful source of data on the safety and effectiveness of marketed devices. IOM does not believe, however, that there is a public-health crisis related to unsafe or ineffective medical devices. See also GAO, Medical Devices: FDA Should Take Steps to Ensure That High-Risk Device Types Are Approved through the Most Stringent Premarket Review Process, [hyperlink, http://www.gao.gov/products/GAO-09-190] (Washington, D.C.: Jan. 15, 2009). [54] See U.S. Department of Health and Human Services, Food and Drug Administration, Medical Device Reporting (MDR) Rate in 510(k) Cleared Devices Using Multiple Predicates (Silver Spring, Md.: Oct. 14, 2011). A sponsor may cite more than one predicate device in a submission for several reasons. For example, multiple devices, each with its own predicate, may be bundled together into one submission. A sponsor may also cite multiple predicates when a single device combines the functions of more than one device. [55] See 76 Fed. Reg. 50,230 (Aug. 12, 2011). [56] FDA held a public workshop on the adoption, implementation, and use of unique device identifiers in various health-related electronic data systems in September 2011. See 76 Fed. Reg. 43,691 (July 21, 2011). According to FDA's plan of action, FDA is also currently developing proposed regulations for the unique device identifier system. [End of section] GAO’s Mission: The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO’s commitment to good government is reflected in its core values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO’s website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select “E-mail Updates.” Order by Phone: The price of each GAO publication reflects GAO’s actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO’s website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. Connect with GAO: Connect with GAO on Facebook, Flickr, Twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov]. To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; E-mail: fraudnet@gao.gov; Automated answering system: (800) 424-5454 or (202) 512-7470. Congressional Relations: Katherine Siggerud, Managing Director, siggerudk@gao.gov, (202) 512-4400, U.S. Government Accountability Office, 441 G Street NW, Room 7125, Washington, DC 20548. Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800, U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, DC 20548.