This is the accessible text file for GAO report number GAO-06-416 
entitled 'Clinical Lab Quality: CMS and Survey Organization Oversight 
Should Be Strengthened,' which was released on June 27, 2006. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Requesters: 

United States Government Accountability Office: 

GAO: 

June 2006: 

Clinical Lab Quality: 

CMS and Survey Organization Oversight Should Be Strengthened: 

GAO-06-416: 

GAO Highlights: 

Highlights of GAO-06-416, a report to congressional requesters 

Why GAO Did This Study: 

The Clinical Laboratory Improvement Amendments of 1988 (CLIA) 
strengthened and extended quality requirements for labs that perform 
tests to diagnose or treat disease. About 36,000 labs that perform 
certain complex tests must be surveyed biennially by either a state or 
one of six private accrediting organizations. CMS oversees 
implementation of CLIA requirements and the activities of survey 
organizations. GAO was asked to examine (1) the quality of lab testing; 
(2) the effectiveness of surveys, complaint investigations, and 
enforcement actions in detecting and addressing lab problems; and (3) 
the adequacy of CMS’s CLIA oversight. 

What GAO Found: 

Because of limited comparable data from CMS and survey organizations, 
too little is known about the quality of lab testing. For example, a 
standardized assessment of lab quality across survey organizations is 
not possible because of different definitions of what constitutes a 
serious quality problem. One survey organization has no systematic way 
of identifying the problematic labs it inspects. However, GAO’s 
analysis of an indicator that measures a lab's ability to consistently 
produce accurate test results suggests that lab quality may not have 
improved at hospital labs in recent years. 

GAO's analysis of available data and interviews with CMS and survey 
organizations indicate that real and potential lab quality problems are 
masked by survey, complaint, and enforcement weaknesses. Because most 
survey organizations announce the timing of biennial surveys, allowing 
labs to prepare for inspections, surveys may not provide a realistic 
picture of lab quality. Although two survey organizations that 
generally inspect hospital labs plan to begin unannounced surveys in 
2006, unannounced surveys may not be possible at physician office labs 
that have irregular hours. Survey organizations that typically inspect 
such labs, 
however, provide more advance notice about upcoming inspections than 
CMS allows states to provide. Several other factors suggest that 
surveys and complaints do not present a realistic picture of lab 
quality. Interviews with officials from a sample of states confirmed 
that some survey organizations do not cite all serious deficiencies, as 
evidenced by variability in the limited available lab survey data. 
Officials said that surveyors may be reluctant to cite deficiencies 
because they view their role as educational, not regulatory; moreover, 
CMS has instructed state surveyors not to cite some deficiencies for 
over 2 years after implementing new lab requirements. Finally, lab 
workers may file complaints infrequently because of concern about 
retaliation and a lack of understanding about how to file a complaint. 
CMS rarely imposes sanctions, even for labs with the same repeat 
deficiencies, a reflection of the educational focus of the CLIA 
program. 

CMS does not require labs to participate in a key quality assurance 
test as frequently as CLIA mandates. Although the CLIA program is 
funded by lab fees, CMS officials indicated that it has not been 
allowed to hire sufficient staff to carry out the agency's oversight 
responsibilities. 
Moreover, CMS’s principal oversight tool, intended to determine if all 
serious deficiencies were identified, lacks independence because many 
oversight reviews are conducted simultaneously with survey 
organizations' own inspections. CMS's presence may make surveyors more 
attentive to 
survey tasks than when they are not being observed. Compared to 
independent reviews, simultaneous reviews rarely identify missed 
deficiencies. Furthermore, CMS does not collect and analyze data on 
serious deficiencies identified by each survey organization and thus 
is unable to assess whether lab quality is improving or declining. Nor 
does CMS effectively analyze other key data such as the use of 
sanctions. To improve oversight, CMS is establishing a nationwide 
complaints database. CMS is also instituting annual survey organization 
performance reviews. 

What GAO Recommends: 

GAO is making recommendations to the CMS Administrator to improve CLIA 
oversight including (1) standardizing the reporting of survey 
deficiencies to permit meaningful comparisons across survey 
organizations; (2) working with survey organizations to ensure that 
educating lab workers does not preclude appropriate regulation, such as 
identifying and reporting deficiencies that affect lab testing quality; 
and (3) allowing the CLIA program to fully use revenues generated by 
the program to hire sufficient staff to fulfill its statutory 
responsibilities. CMS concurred with 11 of GAO’s 13 recommendations and 
noted that the report provided insights into areas where it can 
improve, augment, and reinforce oversight. 

[Hyperlink; http://www.gao.gov/cgi-bin/getrpt?GAO-06-416]. 

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact Leslie G. Aronovitz at 
(312) 220-7600 or aronovitzl@gao.gov. 

[End of Section] 

Contents: 

Letter: 

Results in Brief: 

Background: 

Insufficient Data Exist to Identify Extent of Serious Lab Quality 
Problems: 

Oversight Weaknesses Mask Quality Problems: 

CMS Oversight of CLIA Is Inadequate: 

Conclusions: 

Recommendations for Executive Action: 

Agency and Accrediting Organization Comments and Our Evaluation: 

Appendix I: Effects of Lab Errors on Patient Health: 

Appendix II: Labs Surveyed by State Survey Agencies and the Percentage 
with Condition-Level Deficiencies, by State in 2004: 

Appendix III: Number of Labs Subject to Surveys by State Survey 
Agencies in 2005 and Number of Labs with Sanctions, 1998 to 2004: 

Appendix IV: Comments from the Centers for Medicare & Medicaid 
Services: 

Appendix V: Comments from the College of American Pathologists: 

Appendix VI: Comments from COLA: 

Appendix VII: Comments from the Joint Commission on Accreditation of 
Healthcare Organizations: 

Appendix VIII: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Percentage of Inspection Requirements Classified as Serious, 
by Survey Organization: 

Table 2: Amount of Advance Notice Given to Labs about Upcoming 
Inspections, by Survey Organization: 

Table 3: Number of Complaints Received by CAP, 2002-2005: 

Table 4: Number of Labs Inspected with Principal Only, Principal and 
Alternative, and Alternative Only Sanctions Imposed, 1998-2004: 

Table 5: Analysis of Results of Simultaneous and Independent Validation 
Reviews of State Surveys, Fiscal Years 1999-2003: 

Table 6: Analysis of Results of Simultaneous and Independent Validation 
Reviews of Accrediting Organizations' Surveys, Fiscal Years 1999-2003: 

Table 7: Effects of Lab Errors on Patient Health: 

Figures: 

Figure 1: Types of Survey Organizations, Requirements Used to Survey 
Labs, and Percentage of Labs Surveyed by Each Organization, as of 
December 2005: 

Figure 2: Percentage of Labs with Proficiency Testing Failures from 
1999 through 2003, by Survey Organization: 

Abbreviations: 

AABB: American Association of Blood Banks: 
AOA: American Osteopathic Association: 
ASHI: American Society of Histocompatibility and Immunogenetics:  
CAP: College of American Pathologists: 
CLIA: Clinical Laboratory Improvement Amendments of 1988: 
CMS: Centers for Medicare & Medicaid Services: 
CMSO: Center for Medicaid and State Operations: 
ER: emergency room: 
JCAHO: Joint Commission on Accreditation of Healthcare Organizations:  
OSCAR: On-Line Survey, Certification, and Reporting system: 

United States Government Accountability Office: 

Washington, DC 20548: 

June 16, 2006: 

The Honorable Charles E. Grassley: 
Chairman: 
Committee on Finance: 
United States Senate: 

The Honorable Mark Souder: 
Chairman: 
The Honorable Elijah E. Cummings: 
Ranking Minority Member: 
Subcommittee on Criminal Justice, Drug Policy and Human Resources:  
Committee on Government Reform: 
House of Representatives: 

Clinical lab tests are among the most frequently billed Medicare 
procedures and, according to the American Clinical Laboratory 
Association, affect an estimated 70 percent of medical 
decisions.[Footnote 1] To improve oversight of clinical labs, Congress 
passed legislation in 1967;[Footnote 2] renewed concerns about quality, 
including errors in Pap smear tests used to diagnose cervical cancer, 
resulted in enactment of the Clinical Laboratory Improvement Amendments 
of 1988 (CLIA).[Footnote 3] In recent years, despite CLIA, lab quality 
problems in several states have raised questions about the adequacy of 
lab oversight. Lab oversight is critical because inaccurate or 
unreliable lab tests may lead to improper treatment, unnecessary mental 
and physical anguish for patients, and higher health care 
costs.[Footnote 4] 

The Centers for Medicare & Medicaid Services (CMS) is responsible for 
overseeing compliance with CLIA requirements. As of December 2005, 
there were approximately 193,000 labs nationwide, ranging from very 
small physician office labs that conduct fewer than 2,000 tests 
annually to hospital labs that conduct millions of tests each year. 
Most clinical labs regulated under CLIA must obtain a certificate from 
CMS, but only about 19 percent--those that conduct moderate- to high- 
complexity tests--undergo biennial inspections, which are also referred 
to as surveys.[Footnote 5] The surveys assess lab compliance with 
mandated personnel and testing standards. In addition, surveyed labs 
must participate in proficiency testing, a program that requires them 
to test samples with unknown characteristics that are then graded by an 
external party. Labs with serious deficiencies may be sanctioned, e.g., 
required to cease testing. Labs have a choice of being surveyed by (1) 
their state survey agency, under contract with CMS; (2) for labs in 
New York and Washington, their state CLIA-exempt program; or (3) one of 
six private accrediting organizations.[Footnote 6] State survey agency 
inspections use CLIA requirements that are intended to help ensure 
valid and reliable lab tests; the two state CLIA-exempt programs and 
six accrediting organizations survey labs using their own requirements 
that CMS has determined to be at least equivalent to CLIA's. Each 
survey organization is also responsible for investigating complaints 
about lab quality.[Footnote 7] Because of the critical importance of 
accurate lab test results and oversight, you asked us to conduct a 
nationwide assessment of (1) the quality of lab testing; (2) the 
effectiveness of surveys, complaint investigations, and enforcement 
actions in detecting problems and ensuring compliance; and (3) the 
adequacy of CMS oversight of the CLIA program. 

To determine what is known about the quality of lab testing, we 
analyzed data on serious deficiencies identified during surveys by 
state survey agencies using CMS's On-Line Survey, Certification, and 
Reporting system (OSCAR).[Footnote 8] The CLIA program inspection 
requirements are classified as either "standard-" or "condition-" 
level. Similarly, deficiencies are also characterized as standard- or 
condition-level, based on the requirement in which the deficiency 
occurs. Because condition-level requirements generally consist of one 
or more standard-level requirements, a deficiency at the condition 
level denotes a serious or systemic problem. We requested comparable 
data on serious deficiencies from state CLIA-exempt programs and the 
three largest accrediting organizations--the College of American 
Pathologists (CAP), COLA, and the Joint Commission on Accreditation of 
Healthcare Organizations (JCAHO)--which together survey about 97 
percent of accredited labs.[Footnote 9] CAP, COLA, JCAHO, and exempt- 
state programs each maintain their own separate databases. We also 
analyzed proficiency testing data--another indicator of a lab's ability 
to produce accurate test results. CMS officials generally recognize 
OSCAR and proficiency testing data to be reliable. We discussed the 
OSCAR database with CMS officials and tailored our analysis to ensure 
the accuracy of our findings. Because exempt states and accrediting 
organizations survey labs using their own requirements, we worked with 
them to develop data comparable to OSCAR deficiency data. We discussed 
our analyses with CMS and each of the survey organizations to ensure 
that we had interpreted the data correctly. Based on discussions with 
officials from the three accrediting organizations, we determined that 
they take appropriate steps to ensure the reliability of their data. 
Because it was not practical to independently test the reliability of 
accrediting organization data, we present these data as reported by 
those organizations. 

To assess the effectiveness of lab surveys, complaint investigations, 
and enforcement mechanisms in detecting problems and securing 
compliance, we reviewed the processes used to ensure the quality of 
clinical lab testing and analyzed available data related to these 
issues. We also conducted structured interviews with officials from (1) 
CMS, (2) three CMS regional offices,[Footnote 10] (3) 10 state survey 
agencies,[Footnote 11] (4) 
the New York and Washington CLIA-exempt programs, and (5) the three 
accrediting organizations. We judgmentally selected the 10 state survey 
agencies to include a mixture of states whose lab inspections 
identified a range of serious deficiencies from few to many. We also 
discussed the quality problems discovered at a Maryland hospital lab 
with a Maryland state survey agency official and interviewed 9 of the 
36 CAP surveyors who participated in surveys of this lab from 1999 
through 2003 to obtain a firsthand perspective on the CAP survey 
process.[Footnote 12] Based on our review and discussions with CMS and 
survey organization officials, we focused on several key issues, 
including the rationale for announced surveys, the ability of lab 
surveys to identify serious deficiencies, the balance struck between 
the regulatory and educational goals of lab surveys, the implications 
of CAP's use of volunteer surveyors from neighboring labs to conduct 
inspections, how survey organizations facilitate the filing of 
complaints, and the use of sanctions to encourage compliance. We 
analyzed data on the number of complaints received by each survey 
organization from 2002 through 2004 and discussed CAP's initiatives to 
encourage the filing of complaints. In addition, we determined the 
extent to which labs had the same serious problems on consecutive 
surveys and discussed with CMS the steps the agency had taken to deter 
such inconsistent compliance. 

To assess the effectiveness of CMS oversight of the CLIA program, we 
analyzed the laws and regulations that define CMS's role and authority. 
We also reviewed CMS's process for determining that the survey 
requirements and procedures of state CLIA-exempt programs and 
accrediting organizations are at least equivalent to those of CLIA. We 
analyzed the results of validation reviews that federal surveyors from 
CMS regional offices conducted for state survey agency lab inspections 
and that state survey agency staff conducted for accrediting 
organization inspections from 1999 through 2003. We also examined other 
mechanisms that CMS uses to hold survey organizations accountable for 
their performance: (1) the collection and analysis of data on surveys, 
complaints, and enforcement actions, including steps taken to address 
communication and coordination issues that became evident during a 
complaint investigation at a Maryland hospital lab; and (2) recently 
developed annual reviews that measure state survey agency compliance 
with CLIA program requirements, such as the timeliness of surveys. We 
conducted our work from January 2005 through May 2006 in accordance 
with generally accepted government auditing standards. 

Results in Brief: 

Insufficient data exist to identify the extent of serious quality 
problems at labs. When CMS implemented revised CLIA survey requirements 
in 2004, it modified historical state survey agency findings stored in 
its OSCAR database and, as a result, data prior to 2004 no longer 
reflect key survey requirements in effect at the time of those surveys. 
In addition, the lack of a straightforward method to link similar 
requirements across survey organizations makes it virtually impossible 
to assess lab quality in a standardized manner, such as identifying the 
proportion of labs with condition-level deficiencies, which indicate 
serious or systemic quality problems. Although CMS has stated that it 
believes lab quality has improved since the early 1990s, the results of 
proficiency testing--the one available data source that can be used to 
uniformly compare lab quality across survey organizations--suggest that 
lab quality may not have improved at hospital labs and that the 
improvement for physician office labs may be misleading because a 
significant number of such labs are no longer inspected. 

Weaknesses in surveys, complaint processes, and enforcement mask 
potential quality problems at labs. Lab survey findings may not 
accurately reflect the actual quality assurance process in place on a 
day-to-day basis because of several shortcomings. First, most survey 
organizations announce all surveys, allowing labs to prepare for their 
inspections. To address this problem, accrediting organizations that 
inspect hospital labs began conducting unannounced surveys in 2006. 
Second, the limited data available suggest that state survey agency 
inspections do not identify all serious deficiencies. Third, the 
balance struck between the CLIA program's educational and regulatory 
goals is sometimes inappropriately skewed toward education, which may 
also result in understatement of survey findings. For example, CMS 
instructed state survey agencies not to cite deficiencies for new lab 
quality control requirements for 2 years, in part because of a lack of 
lab "buy-in" for some of the new policies and procedures; CMS then 
extended this period and gave no specific end date. Regarding complaint 
processes, complaints are filed by a variety of sources, including lab 
workers. Few labs were the subject of a complaint each year from 2002 
through 2004--significantly less than one complaint per lab per year. 
Concern that labs can easily identify the lab workers who file 
complaints and lab workers' lack of familiarity with how to file a 
complaint may explain why so few workers report problems. Since one 
accrediting organization required each lab it inspects to display a 
poster explaining how to file a complaint, the number of complaints it 
received about lab quality has doubled. Finally, based on the large 
number of labs with proposed sanctions from 1998 through 2004 that were 
never imposed--even for labs with the same serious, condition-level 
deficiencies on consecutive surveys--it is unclear how effective CMS's 
enforcement process is at motivating labs to consistently comply with 
CLIA requirements. 

CMS's oversight of clinical lab quality is inadequate to ensure that 
labs are meeting CLIA requirements. The agency requires proficiency 
testing three times each year instead of on a quarterly basis, as 
required by CLIA. CMS is also not meeting its own requirement to determine 
in a timely manner the continued equivalency of accrediting 
organization and exempt-state inspection requirements and processes. 
For example, New York's and COLA's reviews were about 4 years and 3 
years past due, respectively, as of December 2005. CMS attributed these 
delays to having too few staff. Moreover, CMS allows accrediting 
organizations and exempt states to implement changes to their 
inspection requirements between periodic equivalency determinations 
before it has reviewed the proposed changes. Validation reviews--one of 
CMS's most 
important oversight tools--do not provide an independent assessment of 
the extent to which surveys identify all serious deficiencies because 
many are performed simultaneously with such surveys. In addition, CMS's 
requirement for validating state survey agencies' inspections is vague, 
resulting in no validation reviews in some states. Finally, CMS does 
not effectively use data to monitor survey organization activities and 
processes, such as the proportion of labs with serious deficiencies, 
proficiency testing results, or trends in complaints. Realizing that 
its existing oversight activities need to be strengthened, CMS has 
begun instituting performance reviews to measure survey organization 
compliance with its requirements and is developing protocols to ensure 
improved communication among survey organizations concerning complaints 
about lab quality. 

We are recommending that the CMS Administrator take actions that will 
standardize survey findings across survey organizations, enable it to 
compare changes over time, and make meaningful comparisons among 
organizations; strengthen survey, complaint, and enforcement processes; 
and improve CMS oversight of the CLIA program. In commenting on a draft 
of this report, CMS endorsed our overall conclusion that quality 
assurance for the nation's clinical labs should be strengthened and 
said that it would take actions in response to 11 of our 13 
recommendations. CMS disagreed with our recommendations concerning the 
frequency of proficiency testing and the extent of simultaneous 
accrediting organization validation reviews. We believe that 
implementing these recommendations is necessary to improve oversight of 
labs and accrediting organizations. CMS also provided an alternative 
assessment of lab quality, disagreed with our conclusion about the 
educational phase-in periods for new CLIA requirements, and expressed 
concern about identifying and sanctioning labs with repeat condition- 
level deficiencies. CAP, COLA, and JCAHO also provided comments on a 
draft of this report. CAP indicated that it would identify additional 
measures it could take to strengthen its own oversight, and COLA found 
merit in our recommendations to improve CMS oversight. CAP, COLA, and 
JCAHO disagreed with some of our findings and recommendations to CMS. 
We incorporated technical comments from CMS and the three accrediting 
organizations, as appropriate. 

Background: 

A clinical lab is generally defined as a facility that examines 
specimens derived from humans for the purpose of disease diagnosis, 
prevention, and treatment, or health assessment of individuals. While 
hospital and interstate labs were previously subject to regulation, 
CLIA strengthened federal requirements and extended them to most other 
clinical labs, including physician office labs. For example, CLIA 
strengthened personnel requirements for lab workers, strengthened 
proficiency testing that evaluates the accuracy of lab testing between 
surveys, and created a range of sanctions to enforce 
compliance.[Footnote 13] 

Most clinical labs regulated under CLIA must obtain a certificate from 
CMS and pay fees every 2 years to cover the costs of administering the 
CLIA program, including surveys and other oversight 
activities.[Footnote 14] The fees vary based on the complexity and 
volume of testing performed. Lab tests are categorized as waived, 
moderate, or high complexity.[Footnote 15] Approximately 81 percent of 
all labs (about 157,000) are not subject to routine biennial surveys 
because they perform (1) "waived" tests, which are examinations and 
procedures that have an insignificant risk of erroneous results, 
including those approved for home use or determined to employ 
methodologies so simple or accurate that the likelihood of erroneous 
results is negligible;[Footnote 16] or (2) tests performed with a 
microscope during the course of a patient visit on specimens that are 
not easily transportable.[Footnote 17] 

Surveyed Labs: 

CLIA establishes more stringent requirements for the 19 percent (about 
36,000) of labs performing moderate- or high-complexity testing, 
including the requirement for a survey and participation in routine 
proficiency testing. Since the early 1990s, the number and proportion 
of labs subject to surveys have declined, while the number and 
proportion conducting waived tests have increased.[Footnote 18] Surveys 
examine lab compliance with CLIA program requirements in several 
areas, including personnel qualifications, proficiency testing, quality 
control, quality assurance, and recordkeeping. 

* Personnel: CLIA sets minimum qualifications for all persons 
performing or supervising moderate- or high-complexity lab tests and 
specifies responsibilities for each position. 

* Proficiency testing: Surveyed labs must participate in an approved 
external proficiency testing program, which evaluates the accuracy of 
laboratory testing. Under this requirement, labs purchase samples with 
unknown characteristics several times each year from an approved 
proficiency testing provider.[Footnote 19] The lab is required to test 
the samples with its routine patient testing, and the results are 
returned to the testing provider to be graded. A proficiency testing 
failure is defined as unsatisfactory performance on two consecutive or 
two out of three testing events. The results of proficiency testing for 
all inspected labs are transmitted to CMS and maintained in a database. 

* Quality control: Labs must have a process for routinely monitoring 
personnel, testing equipment, and the testing environment to ensure 
proper operation and accurate results. 

* Quality assurance: Labs must follow their plan to monitor the overall 
operation of the laboratory on an ongoing basis and must resolve 
identified problems that affect the quality of their testing. 

* Recordkeeping: Labs must maintain an audit trail of testing that 
documents specimen integrity and test performance for all phases of the 
test process from the test order to the test report. 

Survey Organizations: 

In general, labs have a choice of who conducts their surveys--state 
survey agencies using CLIA inspection requirements or other survey 
organizations that use requirements CMS has determined to be at least 
equivalent to CLIA's.[Footnote 20] CMS contracts with state survey 
agencies in most states to inspect labs against CLIA 
requirements.[Footnote 21] CLIA established an approval process to 
allow states and private accrediting organizations to use their own 
requirements to survey labs.[Footnote 22] As noted earlier, New York 
and Washington operate CLIA-exempt programs and CMS has approved six 
private, nonprofit accrediting organizations to survey labs--the 
American Association of Blood Banks (AABB), the American Osteopathic 
Association (AOA), the American Society of Histocompatibility and 
Immunogenetics (ASHI), CAP, COLA, and JCAHO. The requirements of both 
state CLIA-exempt programs and accrediting organizations must be 
reviewed by CMS at least every 6 years to ensure CLIA equivalency, but 
may be more stringent than those of CLIA. For example, when inspecting 
labs engaged in moderate- and high-complexity testing, New York and some 
accrediting organizations also look at the labs' procedures for 
conducting "waived" tests, which is not required under CLIA. Figure 1 
lists the three types of survey organizations and indicates whether 
they survey labs under CLIA requirements, or use their own CLIA- 
equivalent requirements. It also shows the percentage of labs 
performing moderate- to high-complexity testing surveyed by each type of 
organization. In general, state survey agencies, COLA, and Washington's 
CLIA-exempt program survey physician office labs, while New York's CLIA-
exempt program, CAP, and JCAHO survey hospital labs. 

Figure 1: Types of Survey Organizations, Requirements Used to Survey 
Labs, and Percentage of Labs Surveyed by Each Organization, as of 
December 2005: 

[See PDF for image]  

[A] Washington is not included as it has only a CLIA-exempt program. 

[B] New York uses CLIA-equivalent requirements to inspect larger 
hospital labs under the state's CLIA-exempt program and CLIA 
requirements to inspect smaller labs, including physician office labs. 
Only the labs in the CLIA-exempt program are counted here. 

[C] Some labs are counted more than once because labs may be accredited 
by more than one organization. While some labs in New York may be 
accredited, they are still subject to biennial surveys by the state 
survey agency or the state CLIA-exempt program, because New York does 
not authorize accreditation as the basis for lab licensure. 

Source: GAO.

[End of figure]

Surveys and Complaint Investigations: 

Survey organizations (1) conduct surveys and complaint investigations, 
and (2) monitor proficiency test results submitted by surveyed labs 
three times a year. Surveys are typically conducted by former or 
current lab workers, who assess lab compliance with CLIA or CLIA- 
equivalent requirements. Most lab inspections are announced, that is, 
the lab has advance notice of when the survey will occur. Generally, 
surveyors verify that lab personnel are appropriately qualified to 
conduct testing, evaluate proficiency test records, check equipment and 
calibration to ensure that appropriate quality control measures are in 
place, and determine whether the lab has a quality assurance plan and 
uses it to, among other things, appropriately identify and resolve 
problems affecting testing quality. Surveys also include an educational 
component to assist labs in understanding how to comply with CLIA 
requirements. The duration of a survey generally depends on the size-- 
in terms of the number of tests conducted--and complexity of a lab as 
well as the number of surveyors. Thus, a survey conducted at a small 
lab may only take a few hours to complete, while a survey at a large 
hospital lab may take a survey team a full week or more. 

In addition to inspections, survey organizations are responsible for 
determining the seriousness of and investigating all complaints. For 
those complaints that are determined to pose immediate jeopardy--an 
imminent and serious threat to patient health and a significant hazard 
to public health--CMS requires that the investigation be initiated 
within 2 working days. Complaints may be investigated on-site or 
through communications between the survey organization and the lab. 
Complaint investigations for all survey organizations are unannounced. 

Lab survey requirements are classified as either "standard-" or 
"condition-" level. Generally, condition-level requirements are made up 
of one or more related standard-level requirements. For example, the 
condition-level requirement on enrollment and testing of samples 
through a proficiency testing program has two related standard-level 
requirements: (1) enrollment, which includes requirements for the lab 
to provide the name of the program it has enrolled in to the Department 
of Health and Human Services and authorize the release of testing data 
to the department; and (2) testing, which specifies that the samples 
must be tested in the same manner as any specimen and prohibits 
referring the test samples to another lab for analysis. 

Deficiencies are also characterized as standard- or condition-level 
based on the requirement in which the deficiency occurs. Deficiencies 
in standard-level requirements, that is, standard-level deficiencies, 
denote problems that generally are not serious, while condition-level 
deficiencies are cited when the problems are serious or systemic in 
nature. A serious problem is defined as an inadequacy in a lab's 
quality of services that adversely affects, or has the potential to 
adversely affect, the accuracy and reliability of patient test results. 
When deficiencies are found during surveys or complaint investigations, 
labs are required to submit a plan of correction, detailing how and 
when they will address the deficiencies. Additionally, CMS can impose 
principal or alternative sanctions, or both.[Footnote 23] Principal 
sanctions include revocation of a CLIA certificate, cancellation of the 
right to receive Medicare payments, or limits on testing. Revocation of 
a CLIA certificate is equivalent to termination from the CLIA program. 
Alternative sanctions are less severe and include civil money penalties 
or on-site monitoring.[Footnote 24] For condition-level deficiencies 
that do not involve immediate jeopardy, labs have an opportunity to 
correct the deficiencies, which we refer to as a grace period, before 
the sanctions are imposed. If a lab is unable to correct a deficiency 
during this grace period, CMS determines whether to impose a sanction 
and the type of sanction. 

CMS Oversight: 

CMS, including its 10 regional offices, oversees state and accrediting 
organization survey activities.[Footnote 25] CMS reviews and approves 
initial and subsequent applications from exempt-state programs and 
accrediting organizations to ensure CLIA equivalency. Validation 
reviews are one of CMS's primary oversight tools. Federal surveyors in 
CMS regional offices are responsible for conducting validation reviews 
of state survey agency and exempt-state program inspections, but state 
survey agency staff conduct the validation reviews of accrediting 
organization inspections.[Footnote 26] An objective of these reviews is 
to determine if all condition-level deficiencies were 
identified.[Footnote 27] These reviews are conducted within 60 days of 
a state's or 90 days of an accrediting organization's survey of a lab. 
Starting in 1999, CMS required that at least one validation review be 
conducted simultaneously with an accrediting organization's survey, a 
step intended to encourage an exchange of ideas and approaches among 
surveyors. CMS also encourages the use of simultaneous reviews of state 
survey agency inspections. By law, the number of labs selected for 
validation reviews must be sufficient to allow a reasonable estimate of 
the performance of each accrediting organization being 
assessed.[Footnote 28] CMS requires fewer validation reviews of state 
survey agency lab surveys (1 percent) than for those of exempt-state 
programs or accrediting organizations (5 percent). 

In 2003, CMS regional offices began reviewing the activities 
of state survey agencies against a set of 13 performance standards. The 
standards cover areas such as the timeliness of lab inspections, 
surveyor personnel qualifications and training, CLIA data management, 
and the handling of complaints. CMS's goal is to evaluate each state 
survey agency's ability to carry out its CLIA responsibilities and to 
make improvements. CMS is also developing performance standards for 
other survey organizations that inspect labs using their own 
CLIA-equivalent requirements. 

Insufficient Data Exist to Identify Extent of Serious Lab Quality 
Problems: 

The extent of serious quality problems at labs is unclear because CMS 
has incomplete data on condition-level deficiencies identified by state 
survey agencies prior to 2004. We also found that the lack of a 
straightforward linkage between CLIA requirements and the CLIA- 
equivalent requirements of some survey organizations makes it virtually 
impossible to assess lab quality in a standardized manner, such as 
identifying the proportion of labs with condition-level deficiencies. 
Such deficiencies indicate serious or systemic quality problems. 
Proficiency testing results--the one available data source that can be 
used to uniformly compare lab quality across survey organizations-- 
raise questions about whether lab quality has improved in recent years. 

Limited Data Are Available on the Quality of Labs Inspected by State 
Survey Agencies: 

CMS's OSCAR database contains limited data on the quality of labs 
inspected by state survey agencies and, as a result, it is not possible 
to analyze changes in the quality of lab testing over time. In January 
2004, CMS implemented revised CLIA survey requirements and modified the 
existing OSCAR data--state survey agency findings--to reflect the 
changes.[Footnote 29] The revisions affected approximately two-thirds 
of the CLIA condition-level requirements.[Footnote 30] As a result of 
the data modifications, the findings for surveys conducted prior to 
2004 no longer reflect all key condition-level requirements in effect 
at the time of those surveys.[Footnote 31] Based on the available 2004 
OSCAR data (which represent about one half of all labs surveyed by 
state survey agencies), we found that 6.3 percent of labs had condition-
level deficiencies (see app. II for data on all state survey agencies, 
including the District of Columbia).[Footnote 32] As will be discussed 
below, similar data are not available for labs surveyed by other survey 
organizations. 

Quality of Labs Inspected by Other Survey Organizations Is Very 
Difficult to Measure in a Standardized Manner: 

Differences between the inspection requirements that state survey 
agencies use to measure lab quality and those of exempt-state programs 
and accrediting organizations make it virtually impossible to measure 
lab quality in a standardized manner. Because exempt-state programs and 
accrediting organizations do not classify inspection requirements and 
related deficiencies as either standard- or condition-level, they cannot 
easily identify the number of CLIA condition-level deficiencies cited 
at the labs they survey or the proportion of surveyed labs with 
condition-level deficiencies.[Footnote 33] 

We asked exempt-state programs and accrediting organizations what 
percentage of their requirements, and any deficiencies cited for 
failure to meet those requirements, indicated serious problems that 
were equivalent to CLIA condition-level deficiencies. While only 8 
percent of CLIA requirements used by state survey agencies are 
classified as condition-level and therefore serious, the proportion of 
requirements that exempt-state programs and accrediting organizations 
classify as serious ranged from 20 to 100 percent (see table 
1). 

Table 1: Percentage of Inspection Requirements Classified as Serious, 
by Survey Organization: 

Organization: State survey agencies; 
Percentage of requirements classified as serious: 8. 

Organization: New York CLIA-exempt program; 
Percentage of requirements classified as serious: [A]. 

Organization: Washington CLIA-exempt program; 
Percentage of requirements classified as serious: [A]. 

Organization: COLA; 
Percentage of requirements classified as serious: 20. 

Organization: CAP; 
Percentage of requirements classified as serious: 80. 

Organization: JCAHO; 
Percentage of requirements classified as serious: 100.  

Sources: GAO analysis of information provided by CMS, New York, 
Washington, CAP, COLA, and JCAHO. 

[A] This state's CLIA-exempt program does not distinguish between 
serious and nonserious requirements. 

[End of table]

CAP and COLA crosswalked their recent survey findings to CLIA condition-
level requirements.[Footnote 34] Although their analyses suggested that 
about 56 to 68 percent of labs surveyed during 2004 had a 
deficiency in at least one condition-level requirement, they 
acknowledged that these proportions overstated the subset of labs with 
serious problems. JCAHO did not crosswalk its inspection requirements 
to those of CLIA because staff would have had to manually review each 
survey report to determine which deficiencies were equivalent to 
deficiencies in CLIA condition-level requirements. However, JCAHO did 
tell us that in 2004, about 90 percent of the labs it surveyed had a 
deficiency in at least one requirement and, as previously noted, JCAHO 
classifies all its requirements as serious. 

Despite the difficulty of identifying CLIA equivalent condition-level 
deficiencies, two of the three accrediting organizations we reviewed 
have systems to identify labs they survey that have serious quality 
problems. COLA estimated that about 9 percent of labs it surveyed in 
2004 were subject to closer scrutiny because of the seriousness of the 
problems identified. According to JCAHO, about 5 percent of the labs it 
surveyed in 2004 were not in compliance with a significant number of 
requirements. The third accrediting organization, CAP, has criteria for 
identifying labs that warrant greater scrutiny, but CAP officials told 
us that identifying such labs had to be accomplished on a case-by-case 
basis rather than through a database inquiry. As a result, CAP plans to 
spend in excess of $9 million during 2006 and 2007 to develop an 
integrated data system that pulls together multiple factors--survey 
results, complaints, proficiency testing, findings of other inspection 
bodies, and changes in lab directors--to enable it to readily identify 
problem labs. According to CAP officials, such labs will be targeted 
for greater monitoring, and CMS and other survey organizations will be 
notified about CAP's actions. 

Proficiency Testing Results Suggest that Quality Has Not Improved at 
Hospital Labs in Recent Years: 

Although CMS noted that proficiency testing trend data show a decrease 
in failures for labs as a whole, the data suggest that lab quality may 
not have improved at hospital labs for the period 1999 through 2003. 
Proficiency testing is an important oversight tool for survey 
organizations because it is an objective indicator of a lab's ability 
to consistently produce accurate test results and is conducted more 
frequently than surveys--three times a year versus once every 2 years. 
In the absence of comparable survey data, proficiency testing results 
provide a uniform way to assess the quality of lab testing across 
survey organizations. 

Our analysis of CMS proficiency testing data for 1999 through 2003 
suggests that there has been an increase in proficiency testing 
failures for labs inspected by CAP and JCAHO, which generally inspect 
hospital labs, and a decrease in such failures for labs surveyed by 
state survey agencies and COLA, which tend to inspect physician office 
labs (see fig. 2). CMS defines failures as unsatisfactory performance 
in two consecutive or two out of three proficiency testing events. For 
example, from 1999 through 2003 the percentage of labs surveyed by CAP 
and JCAHO that had proficiency testing failures increased from 4.1 
percent to 6.8 percent and from 6.6 percent to 7.8 percent, 
respectively. It is unclear, however, whether the decrease in failures 
for physician office labs represents an actual improvement in lab 
quality or reflects the fact that some problematic labs are no longer 
surveyed. Specifically, many physician office labs now perform waived 
tests and therefore no longer undergo surveys or participate in 
proficiency testing. Between 1998 and 2005, the percentage of labs 
subject to surveys and proficiency testing decreased from about 30 
percent to about 19 percent. 

Figure 2: Percentage of Labs with Proficiency Testing Failures from 
1999 through 2003, by Survey Organization: 

[See PDF for image] 

Note: Data include labs affiliated with each organization during each 
year. 

Source: GAO analysis of CMS proficiency testing data.

[End of figure] 

Oversight Weaknesses Mask Quality Problems: 

Weaknesses in surveys, complaint processes, and enforcement mask real 
and potential quality problems at labs. Survey weaknesses include: (1) 
inspections that most organizations announce ahead of the visit, which 
allows labs to prepare for their inspections and portray themselves in 
a manner that may not accurately reflect their day-to-day quality 
assurance processes; (2) variability in the proportion of labs with 
condition-level deficiencies in 2004, which suggests surveys are not 
conducted in a consistent manner; and (3) the goal of educating lab 
workers during surveys taking precedence over, or precluding, the 
identification and reporting of deficiencies. Furthermore, the 
significant increase in complaints since CAP took steps to help ensure 
that lab workers know how to file a complaint suggests that some 
quality problems at labs inspected by other survey organizations may 
not be reported. Finally, sanctions are not being used effectively as 
an enforcement tool to promote labs' compliance with CLIA requirements, 
as evidenced by the relatively few labs with repeat condition-level 
deficiencies on consecutive surveys from 1998 through 2004 that had 
sanctions imposed. 

Announced Surveys May Result in Unrealistic Picture of Lab Quality: 

Because labs can and do prepare for surveys, CMS regional office 
officials and most of the state survey agencies acknowledged that 
announced surveys may not always provide a realistic picture of lab 
quality.[Footnote 35] As shown in table 2, the amount of advance notice 
for surveys varies from as little as 2 weeks up to 12 weeks; until 
recently, only the New York CLIA-exempt program conducted unannounced 
surveys. Survey agency officials in two states told us that surveyors 
had inspected labs where records documenting the implementation of 
periodic quality control procedures were completed in the same 
handwriting using the same colored pen. This degree of uniformity 
raises a concern about whether the quality control occurred at all, or 
as frequently as the records suggested. A CAP surveyor told us that the 
pathologist at one lab had cleaned up, and signed off on, about 3 
months' worth of quality control records the night before the survey. In 
hearings on the questionable test results at a Maryland hospital lab, a 
worker testified that lab staff prepared frantically for their 
announced inspections. 

Table 2: Amount of Advance Notice Given to Labs about Upcoming 
Inspections, by Survey Organization: 

Survey organization: New York CLIA-exempt program; 
Amount of advance notice[A]: None. 

Survey organization: State survey agencies; 
Amount of advance notice[A]: Up to 2 weeks[B]. 

Survey organization: Washington CLIA-exempt program; 
Amount of advance notice[A]: 4 weeks. 

Survey organization: JCAHO; 
Amount of advance notice[A]: 4 weeks[C]. 

Survey organization: CAP; 
Amount of advance notice[A]: 7 weeks[D]. 

Survey organization: COLA; 
Amount of advance notice[A]: 12 weeks[E].  

Sources: CMS, New York CLIA-exempt program, Washington CLIA-exempt 
program, CAP, COLA, and JCAHO. 

[A] These numbers reflect stated policy and may not represent actual 
practice. 

[B] Advance notice permitted by CMS guidance. 

[C] In January 2006, JCAHO stopped providing labs advance notice about 
upcoming inspections. 

[D] Average actual notice provided to CAP labs for 2004. In 2006, CAP 
began conducting unannounced inspections. 

[E] COLA confirms the survey date about 8 weeks in advance. 

[End of table]

In 2006, both CAP and JCAHO began conducting unannounced inspections at 
most of the hospital labs they survey.[Footnote 36] Both CAP and JCAHO 
officials told us that the unannounced surveys will occur as early as 6 
months prior to the anniversary of a lab's prior survey. CMS and survey 
organizations that inspect physician office labs provided several 
justifications for continuing to announce inspections at such labs, 
including (1) ensuring that the lab is open and that appropriate 
personnel are available to answer surveyors' questions, and (2) 
minimizing disruptions to patient care. These justifications appear to 
be reasonable because they reflect the operating tempo at physician 
office labs. It may not be appropriate, however, to provide such labs 
with 4 to 12 weeks' advance notice, given that CMS currently limits the 
advance notice provided by state survey agencies to no more than 2 
weeks. 

Variability in Reported Survey Deficiencies Suggests that Labs Are Not 
Surveyed Consistently: 

Variability in OSCAR data for state survey agency inspections conducted 
in 2004 suggests that labs are not surveyed in a consistent manner, and 
interviews with CMS and state survey agency officials confirmed this 
hypothesis. As a result, available data likely understate the extent of 
serious quality problems at labs. In 2004, the percentage of labs that 
state survey agencies reported with condition-level deficiencies varied 
considerably by state, ranging from none in 6 states to about 25 
percent of labs in South Carolina. These data only included findings 
for about one-half of the labs surveyed by state survey agencies. Of 
the 33 states that survey more than 100 labs, 16 found condition-level 
deficiencies at fewer than 5 percent of labs, while 6 states identified 
such serious deficiencies in more than 10 percent of the labs they 
surveyed (see app. II).[Footnote 37] 

Based on interviews with CMS and 10 state survey agencies, it appears 
that at least some of this variability is due to differences in states' 
approaches to surveys as opposed to true differences in lab quality. 
For example, CMS told us that, because there is not a prescriptive 
checklist to guide the survey process, the reliance on state surveyor 
judgment will result in variations in the citing of deficiencies. To 
compensate for the unstructured nature of the state survey process, 
officials we interviewed from 2 state survey agencies told us that they 
created checklists to help ensure that surveyors looked at all of the 
critical elements during lab surveys.[Footnote 38] Furthermore, while 
some of the state survey agencies we spoke with told us that their 
surveyors always cite condition-level deficiencies that are identified 
during surveys, officials in other states said that there are 
circumstances under which condition-level deficiencies would not be 
cited. For example, according to officials from one state survey 
agency we interviewed, surveyors in that state prefer to cite multiple 
standard-level deficiencies rather than a condition-level deficiency 
because doing so allows the imposition of state law sanctions, avoiding 
what was characterized as a less efficient federal sanctions 
process.[Footnote 39] Additionally, officials from 2 other state survey 
agencies explained that surveyors consider a lab's compliance history 
when determining what deficiencies to cite, while officials from a 
third state told us that surveyors will educate lab workers, 
particularly new lab workers, about the CLIA requirements rather than 
citing CLIA condition-level deficiencies.[Footnote 40] 

Balance Between Educational and Regulatory Roles by CMS and Survey 
Organizations Appears to Be Inappropriate: 

The goal of educating lab workers sometimes takes precedence over, or 
precludes, the identification and reporting of deficiencies that affect 
the quality of lab testing. As a result, data on the quality of lab 
testing and trends in quality over time may be misleading. Although 
CLIA neither requires nor precludes an educational role for surveyors, 
the preamble to CMS's implementing regulation noted that surveys are 
intended, in part, to provide an opportunity for on-site education 
regarding accepted laboratory procedures. In addition, CMS guidance and 
training encourage state surveyors to play an educational 
role.[Footnote 41] Many state survey agency officials we interviewed 
also told us that their surveyors play a major educational role. As 
noted earlier, surveyors from one state survey agency do not cite 
condition-level deficiencies when lab workers are new but prefer to 
educate the new staff. Because CMS revised its OSCAR database in 2004, 
it is not possible to identify states that have consistently not cited 
condition-level deficiencies, data that would help to quantify the 
extent to which an educational role is substituting for appropriate 
regulation of labs.[Footnote 42] 

An inappropriate balance between the educational and regulatory role is 
also evident in some accrediting organization practices. One of the CAP 
surveyors we interviewed, who has over 30 years of lab experience, estimated 
that the majority of pathologists--individuals who generally serve as 
CAP survey team leaders--view surveys as educational, rather than as 
assessments of compliance with lab requirements. Another surveyor told 
us that CAP's survey process focuses heavily on education, and that 
some survey team leaders emphasize education more than others. For 
COLA, the process of educating labs begins even prior to a survey. For 
example, COLA encourages labs to submit a self-assessment for review 
prior to the scheduled survey so that the labs can identify COLA 
requirements with which they are not in compliance. About 20 percent of 
all labs surveyed during 2004 (616 labs) submitted a self-assessment, 
and fewer deficiencies were identified at these labs during on-site 
surveys than at labs that did not submit one.[Footnote 43] 

CMS appears to be inappropriately stressing education over regulation 
in its implementation of (1) 2003 lab quality control requirements for 
the CLIA program and (2) proficiency testing for lab technicians who 
interpret Pap smears, a test for cervical cancer. When state surveyors 
began assessing compliance with new lab quality control requirements in 
January 2004, they were instructed, for a period of 2 years, to note 
deficiencies on a cover letter to labs rather than on the survey report 
itself. Thus, such deficiencies are not recorded in the OSCAR 
database. In part because of a lack of lab "buy-in" for some of the new 
policies and procedures, CMS officials have extended the educational 
period for about another 2 years. CMS has taken a similar educational 
approach to Pap smear proficiency testing, which began in 2005. CMS 
will not cite deficiencies or impose sanctions against labs in which 
staff fail the new Pap smear proficiency testing in 2005 or 2006, as 
long as the labs and individuals involved complete such testing, 
including following the regulatory protocol for subsequent testing in 
the case of an initial failure. According to CMS, this educational 
focus allows labs and their staff to become familiar with the 
proficiency testing program and to prepare themselves for such testing, 
since there was about a 13-year time lag between the 1992 regulations 
that implemented CLIA and the 2005 implementation of Pap smear 
proficiency testing.[Footnote 44] This educational approach seems 
questionable given CMS's concern about some of the high initial 
proficiency test failure rates. 

Use of Volunteer Surveyors by CAP Raises Concerns: 

Although state survey agencies, exempt-state programs, COLA, and JCAHO 
employ dedicated staff surveyors, CAP relies primarily on volunteer 
teams consisting of lab workers from other CAP-inspected labs to 
conduct surveys.[Footnote 45] In contrast to the mandatory training and 
continuing education programs in place for the staff surveyors of other 
survey organizations, training for CAP's volunteer surveyors is 
currently optional. CAP plans to establish a mandatory training program 
beginning in mid-2006.[Footnote 46] As a condition of accreditation, 
labs inspected by CAP must survey another CAP-accredited lab of similar 
size and composition at least once every 2 years. According to data 
provided by CAP, two-thirds of volunteer surveyors who had recently 
participated in a survey had no formal training in the 3 to 5 years 
preceding the survey. Two CAP surveyors we interviewed told us that 
they had not completed any training because it was optional. Two other 
surveyors told us that they had never been notified about the existence 
of optional CAP training. While full-time surveyors employed by other 
survey organizations conduct from 30 to about 200 surveys per year, CAP 
volunteer surveyors have much less experience conducting surveys 
because they survey only about one lab each year. 

Three of the nine CAP surveyors we interviewed stated that they 
believed that mandatory training was important, but some surveyors 
wondered when lab workers would have time to complete the courses 
because of their demanding work schedules. According to CAP officials, 
however, the required training will take only 1 to 2 days and surveyors 
will have a choice of live seminars and workshops or e-learning 
completed at their own computers.[Footnote 47] For ongoing training 
requirements, CAP plans to give surveyors a choice of taking additional 
training or passing a competency evaluation. CAP will track compliance 
with its new training requirements to ensure that surveyors 
successfully complete training and demonstrate competency within 2 
years of participating in a survey. CAP's required training is less 
extensive than that required by other survey organizations. For 
example, state survey agency inspectors must complete 5 days of basic 
training and periodic advanced courses afterward, while COLA staff 
inspectors participate in a 5-week orientation program and 20 hours of 
continuing education annually. 

The use of volunteer inspectors by CAP also raises concerns about the 
appearance of a conflict of interest. These concerns arise because of 
the way CAP survey teams are structured. CAP's Commission on Laboratory 
Accreditation policy manual specifies that the inspection team leader 
is the individual responsible for the conduct of an ongoing site 
inspection, and must not be in a business, professional, or personal 
relationship that would preclude an objective inspection. Furthermore, 
the manual states that the inspection team leader is usually 
responsible for determining the size of, and assembling, the inspection 
team. However, until April 2006, CAP policy did not preclude competing 
labs from surveying one another or lab survey team members from 
soliciting business, such as referrals, from a lab at the conclusion of 
the survey.[Footnote 48] The policy was also silent about survey team 
members' business, professional, or personal relationships that could 
cloud their independence.[Footnote 49] Typically, inspection team 
leaders are pathologists who direct other labs in the community, and 
the inspection team is composed of several employees from the team 
leader's lab. 

We believe that the use of volunteers, including those from nearby 
labs, and the personal and professional relationships that may exist 
between lab staff and survey team members, create the appearance of a 
conflict of interest and could undermine the integrity of the survey 
process.[Footnote 50] Comments from some CAP surveyors we interviewed 
raise a concern about having survey team leaders who are also the day- 
to-day supervisors of team members. For example, lack of agreement 
about the seriousness of a deficiency could result in the team leader 
instructing the team to downgrade the deficiency to a recommendation, a 
less serious finding that does not appear in the inspection report. 
Team members who are subordinates to the team leader may feel that they 
have no other recourse than to follow the team leader's instructions. 
Recognizing that team members' objectivity may be compromised in this 
situation, CAP's revised conflict of interest policy instructs all 
parties to be cautious to retain objectivity in fact finding throughout 
the inspection process. 

When we discussed these findings with CAP officials, they told us that 
they plan to institute a number of initiatives to help ensure survey 
objectivity, including (1) having an independent team resurvey the same 
lab to assess the consistency of inspections, (2) having CAP staff 
centrally assign survey teams, (3) not announcing surveys, and (4) not 
notifying labs of the survey team composition prior to the survey. 

Lab Workers Who File Complaints About Quality Problems in Lab Testing 
Not Afforded Whistle-blower Protections: 

Some lab workers may not be filing complaints about quality problems at 
their labs because of anonymity concerns or because they may not be 
familiar with filing procedures. Complaints about labs can come from a 
variety of sources, including lab workers. Complaints are an important 
tool in detecting quality problems between lab surveys. For example, 
complaints about testing at a hospital lab were crucial because 
information had been concealed, complicating the detection of quality 
problems during the lab's surveys. As a result of a complaint, 
surveyors were able to substantiate inadequate calibration of testing 
equipment that could adversely affect patient care. 

Based on OSCAR data and data obtained from exempt-state programs and 
accrediting organizations for 2002 through 2004, few complaints were 
received about lab testing relative to the number of labs-- 
significantly less than one complaint per lab per year.[Footnote 51] 
The low volume of lab complaints may be related to complainants' 
concerns about anonymity and fear of retaliation for filing a 
complaint. It may be easy for a lab to determine the source of a 
complaint filed by a lab worker. For example, in some cases, either the 
nature of the complaint or the piece of testing equipment in question 
could narrow the list of possible complainants. Two CAP surveyors we 
interviewed commented that, in their opinion, it would be easy to 
determine the identity of a complainant. During congressional hearings 
in 2004, a Maryland hospital lab worker testified that she and her 
colleagues feared losing their jobs because of the complaints they 
filed. 

Because of the difficulty of protecting the anonymity of lab workers 
who file complaints, whistle-blower protections for such individuals 
are particularly important. Two of the three accrediting organizations 
we interviewed have whistle-blower protections--CAP and JCAHO.[Footnote 
52] For example, CAP implemented a comprehensive whistle-blower 
protection policy in July 2004 that includes revocation of 
accreditation or other appropriate action for any lab that directly or 
indirectly threatens, intimidates, or retaliates against a lab worker. 
While officials from New York and Washington's exempt-state programs 
told us that whistle-blower laws in their states provide some 
protection for lab workers who file complaints, officials in most of 
the other 10 states we interviewed told us that they did not have any 
whistle-blower protections or were unable to identify specific 
protections that applied to lab workers in their state. Currently, 
there are no federal whistle-blower protections specifically for 
workers in labs covered by CLIA. In 2005, legislation was introduced to 
provide whistle-blower protections to workers in labs covered by 
CLIA.[Footnote 53] 

We also found that lab workers may not know how to file a complaint. 
CAP experienced a significant increase in the number of complaints it 
received after October 2004, when it began requiring CAP-inspected labs 
to display posters on how to file complaints. Specifically, from 
October through December 2004, CAP received an average of 22 complaints 
per month, compared to an average of 11 complaints per month in the 9 
months preceding the poster requirement. As a result, the number of 
complaints about the quality of lab testing more than doubled in 2004 
and the number substantiated increased by more than 40 percent--even 
though the poster was only displayed for the last 3 months of 2004 (see 
table 3).[Footnote 54] In September 2005, COLA also began requiring 
labs to display a complaints poster similar to CAP's. It is too early, 
however, to determine the impact of COLA's new complaints poster on the 
number, type, and substantiation rate of complaints. Neither CMS nor 
JCAHO plans to require a similar complaints poster.[Footnote 55] 

Table 3: Number of Complaints Received by CAP, 2002-2005: 

Year: 2002; 
Received: 82; 
Substantiated: 39. 

Year: 2003; 
Received: 84; 
Substantiated: 40. 

Year: 2004; 
Received: 170; 
Substantiated: 70. 

Year: 2005; 
Received: 290[A]; 
Substantiated: 74 (preliminary)[B]. 

Source: CAP. 

[A] This number is as of November 30, 2005, and thus does not include 
complaints received in December. 

[B] As of November 30, 2005, CAP had substantiated 74 complaints; over 
100 complaints were still under active investigation. 

[End of table] 

Lab Sanctions Are Rarely Imposed: 

Few labs were sanctioned by CMS from 1998 through 2004--even those with 
the same condition-level deficiencies on consecutive surveys--because 
many proposed sanctions are never imposed. Our analysis of CMS 
enforcement data from 1998 through 2004 found that 501 labs were 
sanctioned, which equates to less than 3 percent of labs inspected by 
state survey agencies.[Footnote 56] The most common were principal 
sanctions, which may result in suspension or limitation of testing or 
termination from the CLIA program; few labs were subjected to 
alternative sanctions, such as directed plans of correction or civil 
monetary penalties (see table 4). Appendix III shows the number of labs 
surveyed by state survey agencies and the number of sanctioned labs 
from 1998 through 2004. 

Table 4: Number of Labs Inspected with Principal Only, Principal and 
Alternative, and Alternative Only Sanctions Imposed, 1998-2004: 

Sanction: Principal only; 
Description: 
* Revocation of CLIA certificate (termination): 
* Cancellation of approval to receive Medicare payments: 
* Limits placed on testing: 
* Suspension of testing; 
Number of labs: 269. 

Sanction: Principal and alternative; 
Description: At least one principal sanction plus at least one 
alternative sanction; 
Number of labs: 170. 

Subtotal: Number of labs: 439. 

Sanction: Alternative only; 
Description: 
* Directed plans of correction: 
* Civil money penalties: 
* State on-site monitoring: 
* Partial or full suspension of Medicare payments; 
Number of labs: 62. 

Total: Number of labs: 501. 

Source: GAO analysis of CMS lab registries. 

[End of table]

Although few labs were sanctioned from 1998 through 2004, over 9,000 
labs had sanctions proposed during that same time period.[Footnote 57] 
Before sanctions go into effect, labs are given a grace period to 
correct condition-level deficiencies, unless the deficiencies involve 
immediate jeopardy, that is, an imminent threat to patient health and 
significant hazard to public health. Most labs correct the deficiencies 
within the grace period. CMS officials told us that it was appropriate 
to give labs an opportunity to correct such deficiencies within a 
prescribed time frame and thus avoid sanctions.[Footnote 58] However, a 
principal objective of the enforcement process--one reflected in CMS 
guidance--is to motivate labs to comply with CLIA requirements, thereby 
helping to ensure the provision of accurate and reliable test results. 
Based on the large number of labs with proposed sanctions that were 
never imposed, it is unclear how effective the enforcement process is 
at motivating labs to consistently comply with CLIA requirements. 

The number of labs with the same repeat condition-level deficiencies 
from one survey to the next also raises questions about the overall 
effectiveness of the CLIA enforcement process. From 1998 through 2004, 
274 labs surveyed by state survey agencies had the same condition-level 
deficiency cited on consecutive surveys and 24 of these labs had the 
same condition-level deficiency cited on more than two 
surveys.[Footnote 59] This analysis may understate the percentage of 
labs with repeat condition-level deficiencies because OSCAR data prior 
to 2004 no longer reflect about two-thirds of the condition-level 
requirements, and the associated deficiencies, in effect at the time of 
those surveys. 
We found that only 30 of the 274 labs with repeat condition-level 
deficiencies had sanctions imposed--either principal, alternative, or 
both. According to the CLIA legislative history, congressional concern 
about labs with repeat deficiencies led to alternative sanctions to 
provide an enforcement option short of principal sanctions to encourage 
compliance. 

From 1998 through 2004, less than 1 percent of accredited labs (81) 
lost their accreditation; few of these labs were subsequently 
sanctioned by CMS and many still participate in the CLIA 
program.[Footnote 60] Our analysis of CMS reports on sanctioned labs 
found that only 9 of the 81 labs had principal or alternative 
sanctions, or both, imposed and that 1 of the 9 still performs 
moderate- to high-complexity testing.[Footnote 61] Based on a review of 
its CLIA certificate database, CMS officials told us that about half of 
the 81 labs still perform moderate- to high-complexity testing but could 
not describe the actions taken by CMS regional offices in response to 
the loss of accreditation. We contacted state survey agencies or CMS 
regional office officials to determine why no sanctions were imposed on 
3 labs that COLA concluded had cheated on proficiency testing by 
referring their samples to another lab for testing. The purpose of proficiency 
testing is to provide an objective, external evaluation of the accuracy 
of a lab's test results, which is negated when another lab analyzes the 
sample. By statute, the intentional referral of samples to another lab 
for proficiency testing is a serious deficiency that should result in 
automatic revocation of a lab's CLIA certificate for at least 1 
year.[Footnote 62] Based on our interviews, we found that the 3 labs 
were allowed to continue testing because they had initiated corrective 
actions; in effect, these labs were given an opportunity to correct a 
deficiency that appears to have required a loss of their CLIA 
certificate for at least 1 year. A fourth lab was ultimately sanctioned 
for proficiency testing cheating by CMS but was allowed to continue 
testing for almost 2 years after having its accreditation revoked. 

We also attempted to analyze data on other actions, short of revoking 
accreditation, used by accrediting organizations to encourage lab 
compliance and, in particular, how they respond to labs with serious 
repeat deficiencies. According to CMS, this information is dispersed 
across CMS regional offices. CAP officials told us that they could 
initiate four intermediate actions: probation (lab is closely watched 
to ensure correction of problems), accreditation with conditions 
(nonroutine inspection to be scheduled), suspension of a lab section, 
and cessation of a specific type of testing; suspension and probation 
were instituted in 2004. According to CAP, in 2005, 28 labs were on 
probation, 106 labs were accredited with conditions, 1 lab was 
suspended, and 7 labs were required to cease a specific type of 
testing. 
In 2004, JCAHO awarded conditional accreditation to 3 percent of the 
labs it inspects because they were not in substantial compliance with 
its survey requirements, as evidenced by the number of requirements not 
met; JCAHO conducts an on-site follow-up survey at such labs. From 2002 
through 2004, COLA required about 30 labs per year to cease testing due 
to issues identified during surveys and about 217 labs per year to 
cease performing certain tests or specialties due to unsuccessful 
proficiency testing.[Footnote 63] 

CMS Oversight of CLIA Is Inadequate: 

CMS's oversight is not adequate to help ensure that labs meet CLIA 
requirements. While CLIA requires proficiency testing quarterly, CMS 
only requires such testing three times each year. In addition, the 
agency is not meeting its responsibility to determine that accrediting 
organization and exempt-state requirements and processes continue to be 
at least equivalent to CLIA's. CMS attributed the delay in making 
equivalency determinations to having too few staff. Further, ongoing 
CMS validation reviews do not provide an independent assessment of the 
extent to which surveys identify all condition-level deficiencies-- 
primarily due to their timing. Finally, CMS does not adequately use 
data, such as the results of surveys, to monitor survey organization 
activities and processes. Realizing that its existing oversight 
activities need to be strengthened, CMS has begun instituting 
performance reviews to measure survey organization compliance with its 
standards and is developing protocols to ensure improved communication 
among survey organizations concerning complaints about lab quality. 

CMS's Implementation of Proficiency Testing Is Inconsistent with CLIA: 

CMS's decision to require proficiency testing for almost all laboratory 
tests only three times a year is inconsistent with the statutory 
requirement. CLIA requires that proficiency testing be conducted "on a 
quarterly basis, except where the Secretary determines for technical 
and scientific reasons that a particular examination or procedure may 
be tested less frequently (but not less often than twice per 
year)."[Footnote 64] The committee report on the bill that forms the 
basis for much of CLIA indicated that "proficiency testing should be 
the central element in determining a laboratory's competence, since it 
purports to measure actual test outcomes rather than merely gauging the 
potential for accurate outcomes."[Footnote 65] 

In CMS's 1992 rule implementing CLIA, the agency provided a rationale 
for reducing the frequency of proficiency testing, but did not provide 
a technical and scientific basis for reducing the frequency for 
particular procedures or tests.[Footnote 66] According to CMS's 
justification, experts were divided on the appropriate frequency of 
proficiency testing generally. In addition, CMS reasoned that requiring 
fewer proficiency testing events would give laboratories more time to 
analyze the causes of test failures before the next event and would 
enhance proficiency testing's value as an educational tool. Because 
CLIA increased the number of labs required to undergo proficiency 
testing, CMS believed that the organizations providing proficiency 
testing services would not have been able to meet the anticipated 
increase in demand. To help avoid 
anticipated delays in completing proficiency testing and reporting 
requirements, CMS reduced the frequency of testing events to three 
times per year.[Footnote 67] 

CMS's requirement for proficiency testing does not meet the conditions 
specified in the statute that must be satisfied in order to require 
testing less frequently than quarterly. The language of the statute, as 
well as relevant legislative history, indicates that a decision to 
reduce the frequency of testing should be an exception made with regard 
to a particular test, not the norm for all tests, and 
must be based on "technical and scientific" considerations related to 
that particular test.[Footnote 68] The reasons that CMS gave for 
requiring only three events per year were not based on scientific and 
technical considerations relevant to particular tests. Instead, CMS's 
decision was based on concerns of an administrative and logistical 
nature that CMS wanted to alleviate by reducing the frequency of 
testing events. 

CMS Is Late in Ensuring CLIA Equivalency of Exempt States' and 
Accrediting Organizations' Inspection Requirements and Processes: 

We found that CMS has been late in determining that exempt states' and 
accrediting organizations' inspection requirements and processes are at 
least equivalent to CLIA's. CMS must verify their equivalency and, by 
regulation, CMS requires such survey organizations to seek reapproval 
at least once every 6 years, or more frequently if deemed necessary. 
CMS establishes the time frames for when the next reapproval should 
occur, which have ranged from about 15 months to about 6 years. 
However, CMS has not completed its equivalency reviews within these 
time frames, and accrediting organizations and exempt-state programs 
have continued to operate without proper approval. Equivalency reviews 
for CAP, COLA, JCAHO, and Washington that were due to be completed 
between November 1, 1997, and April 30, 2001, were an average of about 
40 months late. In August 1995, CMS determined that New York's next 
equivalency review should be completed by June 30, 2001, but that 
review was over 4 years past due as of December 2005. Similarly, COLA's 
equivalency review was about 3 years past due. 

Because accrediting organizations and exempt-state programs may choose 
to make changes to their inspection requirements between periodic 
equivalency reviews, (1) accrediting organizations are required to 
submit changes to their inspection requirements and policies 30 days 
prior to changing their standards[Footnote 69] and (2) exempt-state 
programs are required to provide notice when they change their 
licensure or inspection requirements. 
require CMS to review equivalency when an accrediting organization or 
exempt-state program adopts new requirements, a CMS official told us 
that the agency is not required to review such changes before their 
implementation to ensure equivalency.[Footnote 70] As a result, such 
survey organizations may introduce changes that are inconsistent with 
CLIA requirements. For example, JCAHO made a significant change to its 
inspection requirements in January 2004 but did not receive CMS 
approval until 6 months later; CMS did not begin an in-depth review of 
JCAHO's revised requirements until early 2005--over a year after they 
were implemented by JCAHO.[Footnote 71] According to CMS, its review 
has identified several critical areas where JCAHO standards are less 
stringent than those of CLIA. JCAHO acknowledged the need to make some 
adjustments to its revised requirements. 

CMS officials attributed delays in making equivalency determinations 
and reviewing interim changes to having too few staff. The CLIA 
program, located in CMS's Center for Medicaid and State Operations 
(CMSO), currently has approximately 21 full-time-equivalent positions 
compared to a peak of 29 such positions several years ago. The 
reduction occurred over time through attrition. As required by statute, 
the CLIA program is funded by lab fees and since its inception the 
program's fees have exceeded expenses. As of September 30, 2005, the 
CLIA program had a carryover balance of about $70 million--far more 
than required to hire an additional six to seven staff members. 
However, CMS officials told us that because the CLIA program staff are 
part of CMSO, they are subject to the personnel limits established for 
CMSO, regardless of whether the program has sufficient funds to hire 
more staff. Although CMSO is at its authorized personnel 
allocation, the CLIA program could hire additional staff with approval 
from the Administrator. We were told that CMSO has not requested such 
approval. 

We also noted issues that raise a question about the thoroughness of 
CMS equivalency reviews because some survey organizations' procedures 
or policies appear to be less stringent than those required by CMS for 
the CLIA program. For example: 

* Accrediting organizations provide labs more advance notice about 
upcoming surveys than CMS allows state survey agencies to give to the 
labs they inspect. 

* JCAHO surveyors focus their review of lab testing on the 12 months 
prior to the survey. CMS requires that state surveyors review the 
entire 24 months of testing since the last survey.[Footnote 72] 

* While CMS requires initial and advanced surveyor training, CAP 
encourages but does not require its volunteer surveyors to participate 
in surveyor training.[Footnote 73] 

* As of August 2005, CAP's policy manual indicates that complaint 
investigations may be announced or unannounced. CMS guidance requires 
that complaint investigations be unannounced. 

Prior to 2005, CMS's equivalency determination reviews focused on the 
inspection requirements themselves, and not the procedures and policies 
used by accrediting organizations and exempt-state programs in carrying 
out oversight of labs; this focus on inspection requirements may 
explain the divergence from the policies and procedures CMS requires 
for state survey agencies. During 2006, CMS is simultaneously reviewing 
the equivalency of COLA and JCAHO inspection requirements and, for the 
first time, incorporating on-site observations of accrediting 
organization policies and systems into the review and approval process. 
For example, CMS is checking to ensure that accrediting organizations 
have adequate systems in place to track such things as (1) complaints, 
(2) correction of deficiencies, and (3) proficiency testing. 

Many CMS Validation Reviews Lack Independence and Reviews Skip Some 
State Survey Agencies: 

CMS validation reviews that are intended to evaluate lab surveys 
conducted by both states and accrediting organizations do not provide 
CMS with an independent assessment of the extent to which surveys 
identify all serious--that is, condition-level or condition-level 
equivalent--deficiencies. CMS requires its regional offices to conduct 
validation reviews of 1 percent of labs inspected by state survey 
agencies in a year. In contrast, validation reviews of 5 percent of 
labs inspected by accrediting organizations during a year are conducted 
by state survey agency personnel. CMS does not specifically require 
that validations occur in each state and some states do not have 
validation reviews each year. Furthermore, many validation reviews 
occur at the same time a survey organization conducts its inspection 
and, in our view, the collaboration between the two teams during these 
simultaneous surveys prevents an independent evaluation.[Footnote 74] 

Validation of State Survey Agency Lab Surveys: 

The requirement to validate 1 percent of labs surveyed by state survey 
agencies in a year--roughly 100 validation reviews each year--does not 
ensure sufficient oversight of state survey agencies. The validation 
review requirement, which is included in CMS regional office annual 
budget memorandums, does not specify how many validation reviews must 
be conducted in each state. While the 10 CMS regional offices generally 
validated 1 percent of the state survey agency inspections within their 
region, they often did not validate 1 percent of inspections within 
each state and, in fact, performed none in some states. From 1999 
through 2003, federal surveyors: 

* validated less than 1 percent of labs surveyed by state survey 
agencies in an average of about 25 percent of states, ranging from 7 
states in 2002 to 17 states in 2003; and: 

* did not conduct any validation reviews in an average of 16 percent of 
states per year, ranging from 3 states in 2002 to 12 states in 1999. 

In 11 states, no validation reviews were conducted in multiple years. 
For example, no validation reviews were conducted in Michigan and 
Washington, D.C., during 4 of the 5 years from 1999 through 2003. Without 
validating at least some surveys in each state, CMS is unable to 
determine if the states are appropriately identifying deficiencies. 

Seventy-five percent of validations of state lab surveys were conducted 
simultaneously from fiscal years 1999 through 2003.[Footnote 75] 
According to CMS officials, the large proportion of simultaneous 
validation reviews provides an opportunity for federal surveyors to 
share information with state surveyors, monitor their conformance with 
CLIA inspection requirements, and identify training and technical 
assistance needs. However, we found that such reviews do not provide an 
accurate assessment of state surveyors' ability to identify condition- 
level deficiencies. Of the 14 validation reviews that identified missed 
condition-level deficiencies, only 1 was a simultaneous review (see 
table 5). Validations of state surveys typically utilize one federal 
surveyor for either independent or simultaneous validation 
reviews;[Footnote 76] therefore, increasing the proportion of 
independent validation reviews to strengthen CMS oversight likely would 
not require additional federal surveyors. Moreover, conducting 
independent validation reviews eliminates the extra effort required to 
coordinate schedules to ensure that the validation reviews occur at the 
same time as the state survey. 

Table 5: Analysis of Results of Simultaneous and Independent Validation 
Reviews of State Surveys, Fiscal Years 1999-2003: 

Fiscal year: 1999; 
Number of validation reviews: 223; 
Conducted simultaneously: Number: 141; 
Conducted simultaneously: Number with condition-level deficiencies 
state surveyors missed: 0; 
Conducted independently: Number: 82; 
Conducted independently: Number with condition-level deficiencies state 
surveyors missed: 2. 

Fiscal year: 2000; 
Number of validation reviews: 224; 
Conducted simultaneously: Number: 135; 
Conducted simultaneously: Number with condition-level deficiencies 
state surveyors missed: 0; 
Conducted independently: Number: 89; 
Conducted independently: Number with condition-level deficiencies state 
surveyors missed: 5. 

Fiscal year: 2001; 
Number of validation reviews: 218; 
Conducted simultaneously: Number: 184; 
Conducted simultaneously: Number with condition-level deficiencies 
state surveyors missed: 1; 
Conducted independently: Number: 34; 
Conducted independently: Number with condition-level deficiencies state 
surveyors missed: 0. 

Fiscal year: 2002; 
Number of validation reviews: 240; 
Conducted simultaneously: Number: 199; 
Conducted simultaneously: Number with condition-level deficiencies 
state surveyors missed: 0; 
Conducted independently: Number: 41; 
Conducted independently: Number with condition-level deficiencies state 
surveyors missed: 4. 

Fiscal year: 2003; 
Number of validation reviews: 195; 
Conducted simultaneously: Number: 167; 
Conducted simultaneously: Number with condition-level deficiencies 
state surveyors missed: 0; 
Conducted independently: Number: 28; 
Conducted independently: Number with condition-level deficiencies state 
surveyors missed: 2. 

Total; 
Number of validation reviews: 1,100; 
Conducted simultaneously: Number: 826; 
Conducted simultaneously: Number with condition-level deficiencies 
state surveyors missed: 1; 
Conducted independently: Number: 274; 
Conducted independently: Number with condition-level deficiencies state 
surveyors missed: 13. 

Source: CMS. 

[End of table] 

Validation of Accrediting Organizations' Lab Surveys: 

According to CMS guidance, at least one validation review of an 
accrediting organization's survey of labs should be conducted 
simultaneously each year, but not all validation reviews should be 
simultaneous because a combination of simultaneous and independent 
reviews provides a balanced view of surveyor performance. CMS officials 
were unable to tell us exactly how many of the roughly 275 validation 
reviews conducted each year from fiscal year 1999 through fiscal year 
2003 were simultaneous.[Footnote 77] However, one of the three 
accrediting organizations we reviewed told us that a significant 
proportion of its validation reviews are conducted simultaneously: 
JCAHO estimated that 33 percent of its validation reviews were 
conducted simultaneously. COLA estimated that 9 percent of validation 
reviews conducted in 2004 and 2005 were simultaneous. Finally, CAP 
officials told us that, from 2002 through 2004, 11 percent of 
validation reviews of CAP-accredited labs were conducted 
simultaneously. 

Given the limitations of simultaneous reviews, conducting independent 
validation reviews is a more effective way of ensuring the equivalency 
of accrediting organization inspection requirements and processes 
between equivalency determinations. CMS officials told us that the 
agency's intent in instituting simultaneous reviews was for state and 
accrediting organization surveyors to share best practices, to promote 
understanding of each other's programs, and to foster accrediting 
organization improvement. They indicated that they considered it a 
learning experience both if an accrediting organization surveyor added 
a deficiency noted by a state surveyor to a survey report and vice 
versa. However, most of the state survey agency officials we 
interviewed told us that simultaneous validation reviews do not provide 
a realistic evaluation of the adequacy of accrediting organizations' 
inspection processes. In fact, CMS guidance encourages the surveyors to 
discuss their findings with each other prior to the concurrent 
conferences at which findings are reviewed with lab personnel. 

From fiscal years 1999 through 2003, state survey agency surveyors 
found condition-level deficiencies missed by accrediting organization 
surveyors on 64 validation reviews, but only 6 of these validation 
reviews were simultaneous.[Footnote 78] In contrast, 58 (91 percent) of 
the validation reviews that identified serious deficiencies missed by 
accrediting organizations were independent validation reviews. (See 
table 6.) 

Table 6: Analysis of Results of Simultaneous and Independent Validation 
Reviews of Accrediting Organizations' Surveys, Fiscal Years 1999-2003: 

Fiscal year: 1999; 
Number of validation reviews: 227; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Total: 8; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Number conducted simultaneously: 
0; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Number conducted independently: 
8. 

Fiscal year: 2000; 
Number of validation reviews: 265; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Total: 17; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Number conducted simultaneously: 
3; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Number conducted independently: 
14. 

Fiscal year: 2001; 
Number of validation reviews: 214; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Total: 8; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Number conducted simultaneously: 
1; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Number conducted independently: 
7. 

Fiscal year: 2002; 
Number of validation reviews: 317; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Total: 14; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Number conducted simultaneously: 
1; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Number conducted independently: 
13. 

Fiscal year: 2003; 
Number of validation reviews: 348; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Total: 17; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Number conducted simultaneously: 
1; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Number conducted independently: 
16. 

Total; 
Number of validation reviews: 1,371; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Total: 64; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Number conducted simultaneously: 
6; 
Validation reviews that found condition-level deficiencies missed by 
accrediting organizations' surveyors: Number conducted independently: 
58. 

Source: CMS. 

Note: While our analysis covered validation reviews for all six 
accrediting organizations, CAP, COLA, and JCAHO account for the vast 
majority of such reviews. CMS officials were unable to tell us the 
total number of validation reviews conducted simultaneously and 
independently during each fiscal year. 

[End of table]

CMS Use of Data for Oversight of CLIA Program Is Limited: 

CMS does not routinely collect and analyze data essential for effective 
oversight of the CLIA program but has initiatives to automate some 
available data to make them more accessible for analysis. Using data to 
analyze activities across survey organizations can be a powerful tool 
in improving CMS oversight of the CLIA program. Such analyses include 
identifying and addressing inconsistencies in how surveys are 
conducted. Although CMS tracks the most frequently cited deficiencies 
at labs in an effort to improve quality, it does not routinely track 
the proportion of labs, by state, in which state survey agencies 
identify condition-level deficiencies--those that denote serious or 
systemic problems. According to a CMS official, the agency has not 
evaluated variability in such deficiencies since 1999.[Footnote 79] As 
noted earlier in this report, variability in survey findings suggests 
inconsistencies in how surveys are conducted. CMS also does not require 
exempt-state programs and accrediting organizations to routinely submit 
data on serious deficiencies identified at the labs they inspect, 
unless the deficiencies pose immediate jeopardy to the public or an 
individual's health.[Footnote 80] As noted earlier, the lack of a 
common vocabulary on what constitutes a serious deficiency would make 
it virtually impossible for CMS to analyze such data even if they were 
submitted. 

We also found that CMS does not effectively use available data to 
assess clinical lab quality in areas such as proficiency testing, 
sanctions, and complaints. For example, CMS's analysis of proficiency 
testing data for all labs showed improvements over time, even though, 
as reported earlier, proficiency testing failures have increased for 
labs surveyed by CAP and JCAHO. Comprehensive analysis of the 
proficiency testing 
database is particularly valuable because it provides a uniform way to 
assess the quality of lab testing across survey organizations, which is 
not currently available for survey results. CMS is now in the process 
of automating the annual registry of sanctioned labs, which should help 
it identify important trends, such as the infrequent use of alternative 
sanctions. Automating the registry, however, will not address the lack 
of data on (1) steps taken by state survey agencies and regional 
offices when labs have their accreditation revoked or (2) interim 
steps, short of revocation of accreditation, that accrediting 
organizations take to help encourage lab compliance. CMS also lacks a 
complaints database, and therefore was unable to assess the impact of 
CAP's decision to require labs to prominently display a poster on how 
to file a complaint. CMS is developing a complaints database, which it 
plans to launch in March 2006.[Footnote 81] 

CMS Implementing Performance Reviews for Survey Organizations: 

CMS has implemented performance reviews for state survey agencies and 
is in the process of developing such reviews for accrediting 
organizations. First implemented in 2004, the annual CLIA state 
performance reviews evaluate each survey agency's ability to accomplish 
its lab oversight responsibilities.[Footnote 82] The reviews, conducted 
on-site by CMS regional office staff, measure performance in 13 areas, 
such as the timely conduct of surveys and the appropriate documentation 
of any deficiencies identified.[Footnote 83] According to CMS, the 
reviews are based on the performance-improvement model that 
characterizes much of the administration of the CLIA program.[Footnote 
84] Consequently, the primary role of regional offices in conducting 
the reviews is to provide education and support for state survey agency 
improvement. For the 2004 reviews, 38 states were required to submit 
corrective action plans to their respective CMS regional offices in at 
least 1 of the 13 areas examined. Three areas required the most 
corrective action plans: principles of documentation, proficiency 
testing desk reviews, and survey time frames.[Footnote 85] 

* Principles of documentation. CMS found that some state survey 
agencies lacked the supervisory personnel to conduct internal reviews 
intended to ensure the appropriate documentation of deficiencies. It 
also found that some state survey agencies did not follow the protocol 
instructions on how to quantify such reviews. 

* Proficiency testing desk reviews. Because of personnel shortages, 
some state survey agencies were unable to perform proficiency testing 
desk reviews between surveys, waiting instead until the next on-site 
survey to address unsuccessful proficiency testing. CMS plans to 
provide additional surveyor training on desk review requirements. 

* Survey time frames. CMS regional office staff were inconsistent in 
scoring whether state survey agencies met the established time frames 
for initial surveys. While some regions were lenient if a state missed 
the time frame by just 1 day or provided a reasonable explanation--such 
as staff turnover or illness--other regions were more stringent in 
scoring states against the standard. 

However, it is not yet clear to what extent the 2004 scores represent 
state survey agency shortcomings or a learning curve for the states in 
understanding the performance review protocols. 

In partnership with the accrediting organizations, CMS is developing 
performance standards comparable to, but different from, those 
implemented in 2004 for state survey agencies. For example, both the 
state survey agency review protocols and those proposed for accrediting 
organizations measure the timeliness of the surveys, but those proposed 
for the latter would also focus on several areas that are unique to 
accrediting organizations. The performance standards would include (1) 
timely and consistent information sharing and (2) alerting CMS about 
decisions to limit or remove accreditation in a timely manner. 
According to a CMS official, the agency plans to phase in the 
performance standards, starting with a standard on complaints. For 
example, if the CLIA complaints database is activated in March 2006, 
CMS could begin to monitor accrediting organization responsiveness to, 
and outcomes of, complaints. Because the database will contain national 
lab complaint data, CMS will be able to compare the volume and outcome 
of complaints across survey organizations. According to CMS, 
implementation of the accrediting organization performance standards 
will be a central--not regional--office responsibility. 

Conclusions: 

Clinical labs play a pivotal role in the nation's health care system by 
diagnosing many diseases, including potentially life-threatening 
diseases, so that individuals receive appropriate medical care. Given 
this important role, lab tests must be accurate and reliable. CMS and 
survey organization oversight is intended to ensure that labs produce 
reliable test results, a key objective of CLIA. Our work demonstrated 
that the oversight of clinical labs needs to be strengthened in several 
areas. 

Determining the quality of lab testing is difficult because it is 
virtually impossible to crosswalk inspection requirements across survey 
organizations. Without standardized survey findings across all survey 
organizations, CMS cannot tell whether the quality of lab testing has 
improved or worsened over time or whether deficiencies are being 
appropriately identified. 

Lab oversight has weaknesses that mask quality problems and thus make 
it difficult to determine the quality of lab testing. To help surveys 
provide a realistic picture of day-to-day operations, CAP and 
JCAHO began unannounced surveys of the labs they survey--generally 
hospital labs--in 2006. While unannounced surveys at physician office 
labs may not be practical, Washington's exempt-state program and COLA 
currently give such labs more advance notice than the 2 weeks CMS 
prescribes for labs inspected by state survey agencies. Similarly, the 
greater weight that CMS and survey organizations sometimes place on 
their educational, as opposed to their regulatory, role may lead to 
survey findings that do not accurately reflect lab quality. Educating 
labs to ensure high-quality testing should complement but not replace 
the enforcement of CLIA inspection requirements. The low number of lab 
complaints may be the result of a lack of information about how to file 
a complaint and lab workers' fear of retaliation. Because protecting 
the anonymity of lab workers who file complaints is difficult, whistle- 
blower protections for such individuals are particularly important. 
Finally, labs with the same serious deficiencies on consecutive surveys 
often escape sanctions, even though Congress authorized alternative 
sanctions to give CMS more flexibility to achieve lab compliance. 
Without the threat of real consequences, labs may not be sufficiently 
motivated to comply with CLIA inspection requirements. 

CMS's oversight is not adequate to enforce CLIA requirements. The 
agency is not requiring labs to participate in proficiency testing on a 
quarterly basis, as required by CLIA. Furthermore, CMS is not conducting 
CLIA-equivalency determinations within the time frames it established 
for such reviews, nor has it always reviewed changes to exempt-state 
and accrediting organizations' inspection requirements before their 
implementation, even though it requires their submission to ensure 
continued CLIA equivalency of their requirements. Although the CLIA 
program has generated funds, CMS agencywide staffing limitations have 
prevented the program from hiring sufficient staff to complete 
equivalency reviews in a timely manner. Many validation reviews are 
conducted at the same time a survey organization conducts its survey, 
and such simultaneous reviews may not provide a true assessment of 
surveyor performance. Independent validation reviews of accrediting 
organization surveys are critical because CMS has not conducted 
equivalency reviews within the time frames it established. We also 
found that few validation reviews of state survey agency lab 
inspections are conducted each year and that none occurred in some 
states. Because state surveyors conduct validation reviews of 
accrediting organizations to ensure the continuing CLIA equivalency of 
their inspection requirements, conducting an appropriate number of 
validation reviews of state survey agency lab inspections is critical. 
CMS also has not yet taken the lead in ensuring the availability and 
use of data from survey organizations to help it monitor their 
performance--particularly the consistency with which surveys are 
conducted. CMS is creating a new complaint database, but its plan to 
automate the existing sanctions registry will not address the lack of 
data on enforcement actions taken by state survey agencies and regional 
offices when labs have their accreditation revoked. 

Recommendations for Executive Action: 

To enable CMS to track the nature and extent of lab quality problems 
across survey organizations, we recommend that the CMS Administrator 
take the following action: 

* Work with exempt-state programs and accrediting organizations to 
standardize their categorization and reporting of survey findings in a 
way that tracks to CLIA inspection requirements and allows for 
meaningful comparisons across organizations, such as the analysis of 
trends in the citation of condition-level deficiencies. 

To ensure consistency in the oversight of labs by survey organizations, 
we recommend that the CMS Administrator take the following four 
actions: 

* Ensure that the advance notice of upcoming surveys provided to 
physician office labs is consistent with CMS's policy for advance 
notice provided by state survey agencies. 

* Ensure that regulation of labs is the primary goal of survey 
organizations and that education to improve lab quality does not 
preclude the identification and reporting of deficiencies that affect 
lab testing quality. 

* Impose appropriate sanctions on labs with consecutive condition-level 
deficiencies in the same requirements. 

* Require all survey organizations to develop, and require labs to 
prominently display, posters instructing lab workers on how to file 
anonymous complaints. 

To improve oversight of labs and survey organizations, we recommend 
that the CMS Administrator take the following eight actions: 

* Consistent with CLIA, require quarterly proficiency testing, except 
when technical and scientific considerations suggest that less frequent 
testing is appropriate for particular examinations or procedures. 

* Ensure that evaluations of exempt-state and accrediting organization 
inspection requirements take place prior to expiration of the period 
for which they are approved in order to ensure the continued 
equivalency of their requirements with CLIA's. 

* Ensure that changes to the inspection requirements of exempt states 
and accrediting organizations be reviewed prior to implementation, as 
required by regulation, to ensure that individual changes do not affect 
the overall CLIA equivalency of each organization. 

* Allow the CLIA program to utilize revenues generated by the program 
to hire sufficient staff to fulfill its statutory responsibilities. 

* Ensure that federal surveyors validate a sufficient number of 
inspections conducted by each state survey agency to allow a reasonable 
estimate of their performance, including a minimum of one independent 
validation review for each state survey agency surveyor. 

* Require that almost all validation reviews of each accrediting 
organization's surveys be independent assessments of performance. 

* Collect and routinely review standardized survey findings and other 
available information for all survey organizations to help ensure that 
CLIA requirements are being enforced and to monitor the performance of 
each organization. 

* Establish an enforcement database to monitor actions taken by state 
survey agencies and regional offices on labs that lose their 
accreditation. 

Agency and Accrediting Organization Comments and Our Evaluation: 

We provided a draft of this report to CMS, and to CAP, COLA, and JCAHO-
-the three laboratory accrediting organizations included in our review. 
CMS strongly endorsed our overall conclusion that quality assurance for 
the nation's clinical labs should be strengthened and noted that the 
report provided insights into areas where it can improve, augment, and 
reinforce oversight of both labs and accrediting organizations to 
ensure quality testing. Overall, CMS concurred with 11 of our 13 
recommendations. Despite this endorsement, however, CMS (1) provided an 
alternative assessment of lab quality, (2) disagreed that the phase-in 
of certain CLIA requirements inappropriately stressed education as 
opposed to regulation, (3) expressed concern about how to identify and 
sanction labs with repeat condition level deficiencies, (4) disagreed 
with our recommendation regarding the frequency of proficiency testing, 
and (5) stated that it was already meeting our recommendation to 
conduct almost all validation reviews of each accrediting organization 
independently. We continue to believe that implementation of these 
recommendations is necessary for the effective oversight of labs. 
(CMS's comments are reproduced in app. IV.) CAP indicated that it took 
seriously our findings and recommendations and intended to determine if 
there were additional measures it could take to strengthen its own 
oversight. COLA said that our recommendations to improve CMS oversight 
of survey organizations had merit. Nonetheless, CAP, COLA, and JCAHO 
disagreed with some of our findings and recommendations to CMS. (CAP, 
COLA, and JCAHO's comments are reproduced in app. V, VI, and VII, 
respectively.) Our evaluation first responds to CMS's and related 
accrediting organizations' comments and then addresses additional 
comments by accrediting organizations. 

Assessment of Lab Quality: 

CMS and COLA commented that lab performance has improved since the 
enactment of CLIA. In particular, CMS pointed to the substantial 
decline--from about 80 percent to about 42 percent--in the percentage 
of labs nationwide with deficiencies between 1994 and 2004. It is 
important to note that CMS's data (1) do not distinguish between 
serious condition-level deficiencies and less serious standard-level 
deficiencies,[Footnote 86] (2) include the early start-up period when 
physician office labs were first regulated,[Footnote 87] and (3) 
exclude deficiency data on the substantial number of labs surveyed by 
accrediting organizations and state CLIA-exempt programs. Due to these 
shortcomings, we do not believe that CMS's data provide an accurate 
assessment of lab quality nationwide. 

Based on the limited data available on state survey agency inspections 
of labs since 1998 and the lack of any comparable data on accrediting 
organization and exempt-state program survey findings, we concluded 
that insufficient data existed to identify the extent of serious 
quality problems at labs. CMS did not retain backup files of pre-2004 
data on deficiencies identified by state survey agencies. Although CMS 
has determined that accrediting organization and exempt-state program 
lab requirements are at least equivalent to CLIA's, there is no 
agreement across survey organizations on how to distinguish serious 
from less serious deficiencies. While CMS concurred with our 
recommendation to standardize the categorization and reporting of 
survey findings in a way that tracks to CLIA and allows meaningful 
comparisons across survey organizations, it noted that a 
straightforward linkage of requirements is limited by CMS's authority 
under the statute--that is, survey organizations are permitted by 
statute to have different requirements--and that it will approach 
implementation of our recommendation cautiously. JCAHO said that it 
agreed with the need for a common, agreed upon, taxonomy that could be 
used by all survey organizations to track serious deficiencies, but 
commented that it thought CMS's implementation of our recommendation 
would require a revamping of JCAHO's accreditation system. That was not 
the intent of our recommendation and it is clear from CMS's comments 
that its implementation of our recommendation would not require an 
overhaul of accrediting organizations' systems. CAP acknowledged the 
complexity and inherent challenges in measuring the quality of lab 
testing, but noted that it is committed to working to develop better 
systems to detect labs with serious quality problems--those that impact 
patient care. 

The statutory authority that permits standards different from CMS's 
(provided they are at least as stringent) does not impair the ability 
to develop a crosswalk that allows for meaningful comparisons across 
survey organizations--such as an analysis of trends in the citation of 
condition-level deficiencies. In fact, CMS regulations already require 
accrediting organizations and exempt-state programs to submit a 
crosswalk--detailed comparisons of their individual accreditation or 
licensure approval requirements with comparable CLIA condition-level 
requirements--when they apply and reapply for approval from 
CMS.[Footnote 88] Such a comparison is possible because CMS already 
identifies instances when accrediting organizations have missed 
condition-level requirements during validation reviews. For example, 
CMS should require survey organizations to (1) indicate which of their 
requirements relate to each CLIA condition-level requirement, and (2) 
explain which deficiencies in their requirements, if cited, should be 
considered equivalent to CLIA condition-level deficiencies. 

CMS also pointed to the steady increase in successful proficiency 
testing across all labs as an indication of improvements in lab 
quality. Our analysis of proficiency testing results suggested that lab 
quality had not improved at hospital labs in recent years. CMS 
correctly noted that the overall proportion of labs with no test 
failures increased from about 88 percent in 1998 to about 93 percent in 
2003--that is, fewer labs failed proficiency testing. However, by 
focusing on overall proficiency testing results, CMS data mask trends 
in failure rates for subsets of labs such as hospital labs. For 
example, from 1999 through 2003, the percentage of CAP-surveyed labs 
with proficiency testing failures increased from 4.1 percent to 6.8 
percent; CAP generally inspects hospital labs. CMS also commented that 
the overall improvement cannot be dismissed as a result of some labs 
being granted waived status because the more dramatic improvements 
predated the recent increase in the number of waived labs. It further 
commented that removing waived labs from the data would not result in 
improved performance rates. We question both assertions. First, the 
number of waived labs--those performing waived tests or provider-
performed microscopy--increased by about 26,600 from 1993 through 1998 
and then increased by another 
approximately 33,700 labs from 1998 through 2004. Second, CMS's comment 
suggested that it had conducted an analysis of the impact of removing 
waived labs from the proficiency testing data. However, it did not 
provide any data analysis when we subsequently asked to see the 
evidence behind its assertion. COLA also addressed this issue, and did 
not challenge our conclusion that the decrease in proficiency testing 
failures for physician office labs might not represent an actual 
improvement in lab quality, but instead could reflect the fact that 
some problematic labs are no longer surveyed. 

Educational Focus of CLIA: 

CMS agreed that it was important to maintain an appropriate balance 
between its regulatory and educational approaches to CLIA 
implementation. While CMS noted that objective review and feedback are 
the bedrock of education, it emphasized that the educational approach 
does not preclude surveyors from identifying lab deficiencies. CAP and 
COLA offered similar comments. However, we found evidence that the goal 
of educating lab workers sometimes takes precedence over, or precludes, 
the identification and reporting of deficiencies and recommended that 
CMS take steps to ensure that regulation remains the primary goal of 
surveys. To address this problem, CMS stated that it will provide 
additional state agency surveyor training, improve guidance, develop an 
action plan to promote greater consistency among surveyors, and 
institute periodic performance and consistency reviews. CMS's comments 
did not address evidence we presented that an educational emphasis may 
also prevent fulfillment of regulatory responsibilities by some 
accrediting organizations. 

CMS disagreed that the extended phase-in periods for new quality 
control requirements and proficiency testing for lab technicians who 
interpret Pap smears were inappropriate. COLA noted that federal 
requirements in many regulated industries are phased in to give 
regulated entities time to understand and effectively implement the 
requirements. CMS 
reaffirmed that, in the case of significant new requirements and for 
the time period specified by CMS, the educational approach may result 
in identified deficiencies being communicated to laboratories without a 
concomitant citation, as is the case with quality control and Pap smear 
testing requirements. As discussed in the report, we believe that CMS's 
educational phase-in periods are excessive. We found that the phase-in 
period for new quality control requirements was extended from 2 years 
to about 4 years, in part because of the lack of lab "buy-in" for some 
of the new policies and procedures. Similarly, the phase-in period for 
Pap smear proficiency testing is 2 years, despite (1) CMS's concern 
about some of the high initial test failure rates, (2) the consequences 
of inaccurate test results on patients' diagnoses and treatment, and 
(3) the approximately 13-year time lag between the 1992 implementation 
of the CLIA regulations and the commencement of Pap smear proficiency 
testing. 

Sanctioning Labs with Serious, Repeat Deficiencies: 

In commenting on our recommendation to appropriately sanction labs with 
repeat condition-level deficiencies, CMS acknowledged the need to 
carefully monitor repeat deficiencies but expressed concern that 
focusing on the condition cited may not indicate a true repeat 
deficiency because the underlying failures could have been different in 
the two consecutive surveys for those labs.[Footnote 89] CMS's 
assertion is inconsistent with its own policy on serious, repeat 
deficiencies for other providers, such as nursing homes. In general, 
immediate sanctions must be imposed on nursing homes with consecutive 
serious deficiencies, regardless of whether the deficiencies are in the 
same care area. As we have previously reported, allowing providers to 
avoid sanctions by correcting serious deficiencies contributes to an up-
and-down pattern of compliance and undermines the deterrent effect of 
sanctions.[Footnote 90] According to the CLIA legislative history, 
congressional concern about labs with repeat deficiencies led to the 
introduction of alternative sanctions such as civil money penalties as 
a substitute for more severe principal sanctions, which include 
termination from the CLIA program. 

Proficiency Testing Frequency: 

CMS disagreed with our recommendation that it require quarterly 
proficiency testing except when technical and scientific considerations 
indicate that less frequent testing is justified for particular tests; 
CMS insisted that proficiency testing three times a year was 
"appropriate." CMS stated that it and the Centers for Disease Control 
and Prevention had together determined that the reduced frequency was 
based on technical and scientific grounds. We asked for a record of the 
agencies' deliberation supporting that decision. CMS supplied a brief, 
undated narrative, which it attributed to the Centers for Disease 
Control and Prevention. It was not clear to us that this narrative was 
contemporaneous with the decision to reduce the frequency of 
proficiency testing. Moreover, the narrative focused on the relative 
costs and benefits of proficiency testing at various intervals. There 
was no analysis of technical and scientific considerations with regard 
to particular tests that presented a basis for reducing the frequency. 

Based on CMS's response, we maintain that CMS's decision to require 
proficiency testing three times a year is not authorized by CLIA. CMS 
did not dispute that, according to CLIA, it must base the decision to 
reduce the frequency of proficiency testing on scientific and technical 
considerations relevant to particular tests, and that the decision to 
reduce the frequency is in the nature of an exception to the norm of 
quarterly testing. CMS also acknowledged that the "public explanation" 
contained in the preamble to the rule setting the proficiency testing 
requirement at three times a year referred only to general concerns 
about the perceived burden associated with quarterly testing. For 
example, CMS stated in its final rule that the prospect of reduced 
frequency would provide a "needed respite" to both laboratories and 
proficiency test providers. In sum, CMS has not adhered to the 
conditions set out in the statute for reducing the frequency of 
proficiency testing and has implemented a policy that is not supported 
by statutory authority. 

Equivalency of Accrediting Organization and Exempt State Programs: 

CMS acknowledged the need to complete timely equivalency reviews of 
accrediting organization and exempt-state requirements, which were an 
average of about 40 months late for the 3-1/2-year period we examined. 
Regarding interim changes made between periodic equivalency reviews, 
CMS agreed with our recommendation and stated it would review such 
interim changes for both accrediting organizations and exempt-state 
programs prior to implementation, as required by regulation.[Footnote 
91] 

Furthermore, CMS indicated that changes to accrediting organization 
requirements did not necessarily impact CLIA equivalency determinations 
because accrediting organizations may have more stringent requirements 
than CLIA's. While this may be true, CMS must still review the changes to 
determine whether CLIA equivalency is affected. For example, in 2005, 
when it reviewed JCAHO's revised standards a year after they were 
implemented, CMS identified several critical areas where JCAHO's 
standards were less stringent than those of CLIA. 

CMS acknowledged that a significant increase in workload and the 
decline in CLIA program staff were factors that contributed to delays 
in making equivalency determinations and reviewing interim changes. 
Although CMS stated that it reserved the right to manage the work 
within available resources and its assessment of priorities, it also 
made a commitment to explore our recommendation to utilize revenues 
generated by the CLIA program to hire sufficient staff to fulfill its 
responsibilities. We believe that additional staff would not only 
improve the timeliness of equivalency reviews, but also their 
thoroughness. 

Accrediting Organization Validation Reviews: 

CMS stated that, consistent with our recommendation, 88 percent of 
accrediting organization validation reviews were conducted 
independently in calendar year 2005. However, our recommendation was to 
require that almost all validation reviews of each accrediting 
organization's surveys be conducted independently. CMS's comments do 
not indicate the proportion of independent validation reviews conducted 
for each accrediting organization. Because CMS did not begin collecting 
data on the number of simultaneous accrediting organization validation 
reviews performed until August 2003, we relied on estimates from 
accrediting organizations.[Footnote 92] CMS did not challenge JCAHO's 
estimate that 33 percent of its validation reviews were simultaneous, 
compared to about 10 percent for CAP and COLA. We do not believe that 
performing an estimated 33 percent of JCAHO's validation reviews 
simultaneously is consistent with our recommendation. 

COLA commented that simultaneous validation reviews are useful in 
assuring consistency and in providing an understanding of processes 
across survey organizations. It also questioned the accuracy of most of 
the missed condition-level deficiencies identified by CMS during 
independent validation reviews. We did not assess the process CMS uses 
to identify such missed deficiencies, but based on a discussion with 
CMS officials, it appears that the process is thorough and time-
consuming. 
JCAHO commented that we had misinterpreted the results of simultaneous 
and independent validation reviews of accrediting organizations 
because, by JCAHO's estimate, the proportion of missed condition-level 
deficiencies is roughly equivalent for both types of surveys. We did 
not find the assumptions behind JCAHO's estimate convincing, given the 
lack of data on the actual number of simultaneous versus independent 
validation reviews conducted for each accrediting organization. 
Furthermore, most of the state survey agency officials we interviewed, 
whose inspectors conduct accrediting organization validation reviews, 
told us that simultaneous validation reviews do not provide a realistic 
evaluation of the adequacy of accrediting organizations' inspection 
processes. 

Additional Comments by Accrediting Organizations: 

Additional CAP comments. CAP commented that we underestimated the value 
of using lab professionals in the inspection process and that we 
provided no factual evidence that their use was less effective than 
other models. In contrast to CAP, other survey organizations employ 
dedicated staff surveyors who have mandatory and continuing education 
requirements. In addition, such dedicated surveyors conduct from 30 to 
about 200 surveys per year compared to CAP's lab professionals who 
volunteer to perform about 1 survey per year. CAP partially addressed 
our concern about the lack of mandatory training for its volunteer 
surveyors. It plans to begin requiring training for survey team leaders 
in July 2006 and for survey team members in 2007. However, CAP's 
proposed new mandatory training is much less extensive than that 
required by other survey organizations. 

Moreover, we reported that some CAP surveyors we interviewed raised a 
concern about having survey team leaders who are also the day-to-day 
supervisors of team members. For example, lack of agreement about the 
seriousness of a deficiency could result in the team leader instructing 
the team to downgrade the deficiency to a recommendation, a less 
serious finding that does not appear in the inspection report. Team 
members who are subordinates to the team leader may feel that they have 
no other recourse than to follow the team leader's instructions. CAP 
recently revised its conflict-of-interest policy, which now instructs 
all parties to be careful to retain objectivity in fact finding 
throughout the inspection process. We do not believe that this change 
in CAP's conflict-of-interest policy addresses the concerns raised by 
the CAP surveyors we interviewed. In its comments, CAP indicated that 
it would continue to closely monitor this issue to determine if further 
actions were necessary. 

Additional COLA comments. COLA disagreed with our assertion that 
announced surveys may result in an unrealistic picture of lab 
quality--a conclusion supported by CMS regional office staff and most 
state survey agency officials we interviewed. We acknowledged that 
unannounced surveys of the physician office labs typically surveyed by 
COLA and state survey agencies were not practical, given the 
unpredictable operating hours of such labs and the need to minimize 
disruptions to patient care. However, we recommended that the advance 
notice be limited to the 2 weeks permitted by CMS for state survey 
agencies. COLA currently provides up to 12 weeks of advance notice. 
COLA contends that providing up to 6 months of advance notice before a 
survey would only improve the lab's operation more quickly if the lab 
took that opportunity to review COLA's self-assessment questions and 
correct any missing or incorrect processes or documentation. We believe 
that COLA's example underscores the importance of our recommendation; 
such actions should be an ongoing process at labs--not a reaction to an 
upcoming inspection. 

Additional JCAHO comments. JCAHO said that our recommendation that all 
survey organizations develop and require labs to prominently display 
posters that instruct lab workers on how to file anonymous complaints 
was too narrow and prescriptive and may inadvertently prevent 
organizations from using other, more effective ways to educate lab 
workers on this topic. JCAHO did not explain how implementing our 
recommendation would limit other initiatives. In fact, CMS's comments 
identified a number of promising approaches that it believed could 
supplement posters. JCAHO also said that our analysis of the increase 
in CAP complaints after CAP required posters in the labs it inspects 
failed to recognize a broad national trend. JCAHO indicated that it 
also experienced a dramatic increase in lab complaints between 2004 and 
2005 without the use of posters.[Footnote 93] This increase may be 
related to JCAHO's July 2005 requirement for labs to educate staff on 
how to report concerns. CAP told us that during the 3 months after it 
required a poster to be displayed, it observed an immediate increase 
in the number of complaints. Thus, CAP lab complaints increased by over 
100 percent in 2004 compared to 2003 and by another approximately 71 
percent in 2005. We continue to believe that CAP's experience suggests 
that complaint posters can be an important way to encourage lab workers 
to communicate their concerns. 

CMS, CAP, COLA, and JCAHO also provided technical comments which we 
incorporated as appropriate. 

As arranged with your offices, unless you publicly announce its 
contents earlier, we plan no further distribution of this report until 
30 days after its issue date. At that time, we will send copies to the 
Administrator of the Centers for Medicare & Medicaid Services and 
appropriate congressional committees. We will also make copies 
available to others upon request. In addition, the report will be 
available at no charge on the GAO Web site at http://www.gao.gov. 

If you or your staff have any questions about this report, please 
contact me at (312) 220-7600 or aronovitzl@gao.gov. Contact points for 
our Offices of Congressional Relations and Public Affairs may be found 
on the last page of this report. GAO staff who made major contributions 
to this report are listed in appendix VIII. 

Signed by: 

Leslie G. Aronovitz: 
Director, Health Care: 

[End of section] 

Appendix I: Effects of Lab Errors on Patient Health: 

This appendix contains examples of lab errors and their consequences, 
illustrating the importance of the quality of lab testing and the 
effects of lab errors on patient health. The examples in table 7 are 
summarized from case studies in the journal Laboratory Errors and 
Patient Safety. 

Table 7: Effects of Lab Errors on Patient Health: 

Description of lab error: Example 1: Delayed reporting of elevated lab 
value: 
* A 59-year-old woman with a history of a rapid and irregular heart 
beat and stroke is taking coumadin, a blood thinning agent. Her primary 
care physician has lab tests completed regularly to ensure that the 
coumadin dose is sufficient to maintain a lab value in the range of 2-
3. The result of one Friday's test was 5.7, a high value outside her 
target range, indicating that her blood was too thin and clotted too 
slowly. This lab test value was documented, but the patient's primary 
care physician was not notified about the elevated value. The following 
Monday morning a lead technologist noticed that the physician had not 
been called and immediately contacted a nurse at the physician's 
office. The nurse, alarmed that neither the physician's office nor the 
patient had previously been notified, tried to contact the patient 
at home. The patient had gone to the emergency room (ER) and was 
admitted to the hospital with an even higher blood clotting value of 
7.2. With hospital treatment, her blood clotting value was reduced to 
2.3 and she was discharged 3 days later; 
* The physician ordering the test did not receive notice of the high 
blood clotting lab value for 3 days, even though the lab result was 
documented within an hour of collection. In addition, the lab failed to 
follow up on the situation within an appropriate time frame; 
Effects of error on patient health: 
* There was a delay in the diagnosis of a critically elevated blood 
clotting level; 
* The patient took coumadin inappropriately for 2 days;
* The patient experienced significant bleeding in her digestive tract 
related to her impaired blood clotting status; 
* The patient had to be hospitalized to stabilize her health condition; 
* The patient was exposed to blood products, putting her at risk for a 
transfusion reaction and exposure to infectious agents. 

Description of lab error: Example 2: Human recording and data entry 
error: 
* A 60-year-old man with a history of chronic liver disease went to the 
ER with a 36-hour history of chest pain. The ER physician ordered a 
cardiac blood test that resulted in the patient's being diagnosed with 
a heart attack. The patient was started on multiple medications and 
admitted to the cardiac intensive care unit at the hospital. After 10 
hours in intensive care, a second cardiac blood test was run, and 
because of a significant discrepancy between its results and those of 
the first test, a rapid investigation of the situation ensued. The investigation 
revealed that the results of the original ER test were recorded 
inaccurately. The patient was retested by cardiac specialty physicians, 
reevaluated, and thought to be stable. He was removed from heart 
monitors, taken off all cardiac medications, discharged from the 
hospital, and asked to return for a routine appointment in 2 days. The 
patient continued to have mild chest pain during the 2 days after he 
was discharged. During his return visit, the patient was found to have 
a stomach ulcer. The patient's chest pain was actually referred pain 
caused by the ulcer. The patient was then treated correctly; 
* An investigation of the error revealed that it was a human recording 
and data entry error. The result from a different patient's blood test 
had been entered incorrectly into this patient's record because the two 
patients' blood tests had been run within a few minutes of each other; 
Effects of error on patient health: 
* The patient inappropriately received several unnecessary medications; 
* The patient was unnecessarily admitted to the cardiac intensive care 
unit, endured the painful placement of intravenous lines, and was 
attached to several heart monitors; 
* The patient suffered significant anxiety related to thinking that he 
had suffered a heart attack; 
* During the 2-day delay between hospital admission and follow-up 
appointment, the patient remained symptomatic for his actual 
condition--an ulcer; 
* If the patient's ulcer had been more severe and had started to bleed 
internally, the inappropriate administration of heart attack 
medications could have had harmful or even catastrophic effects on the 
patient's health. 

Description of lab error: Example 3: Inaccurate review of lab test 
results: 
* A woman who had been using birth control was referred to a 
dermatologist for severe acne. The dermatologist wanted to prescribe a 
drug known to cause birth defects so she ordered two pregnancy tests, 
one initially and the second 2 weeks later. The first test came back 
negative. When the dermatologist called for the results of the second 
test, a lab worker incorrectly told her that the pregnancy test was 
negative. The patient was given a prescription for the acne drug and 
advised to avoid pregnancy while taking this medication. Three days 
after the patient started taking the prescription, the dermatologist 
saw in the patient's record that the second pregnancy test for this 
patient was actually positive. Upon instruction, the patient stopped 
taking the medication and her number of prenatal visits was increased 
so she could be monitored for possible birth defects; 
* The investigation revealed that the laboratory employee, an 
experienced technician who was busy when the incident occurred, 
reported the result of the patient's first pregnancy test when asked 
for results from the second pregnancy test. Both the physician and the 
technician neglected to follow lab policies requiring that the date and 
time of the test be included when results are communicated orally; 
Effects of error on patient health: 
* The patient inappropriately received a drug known to cause birth 
defects; 
* This necessitated increased monitoring of the patient's pregnancy; 
* Although the patient's pregnancy was ultimately unaffected, she 
experienced anxiety throughout. She continues to worry that her child 
is or will be negatively affected in the future because she took this 
drug early in her pregnancy. 

Description of lab error: Example 4: Inappropriate judgment concerning 
a microbiology lab test result: 
* After returning from a trip to West Africa, a 30-year-old woman went 
to the ER after experiencing high fever, chills, and a headache. The 
woman was tested for malaria, a potentially deadly disease transmitted 
via mosquito bites. The test result was negative and the patient was 
sent home on ibuprofen. Four days later she returned to the ER 
suffering from continued high fever, listlessness, and a severe 
headache. She was tested again for malaria. This time the lab test 
result was positive for a moderate case of the disease; 
* The laboratory director reviewed the testing from the initial ER 
visit and found that the tests had been positive for malaria. The cause 
for not identifying malaria initially was inappropriate judgment. Since 
there was a low pretest probability of a positive result, technicians 
assumed that there would be a negative result; 
Effects of error on patient health: 
* The patient experienced fever and other significant symptoms during 
the 4-day delay in receiving appropriate care; 
* After the malaria diagnosis was made, the patient was admitted to the 
hospital and successfully treated. There was no permanent disability. 

Sources: All examples were summarized from case studies in the journal 
Laboratory Errors and Patient Safety. Example 1: Volume 2, Issue 
1(2005): 10; Example 2: Volume 1, Issue 4(2005): 6; Example 3: Volume 
1, Issue 3(2004): 6; Example 4: Volume 1, Issue 1(2004): 5. 

[End of table] 

[End of section] 

Appendix II: Labs Surveyed by State Survey Agencies and the Percentage 
with Condition-Level Deficiencies, by State in 2004: 

Table 8: Labs Surveyed by State Survey Agencies and the Percentage with 
Condition-Level Deficiencies, by State in 2004: 

State[A]: Alabama; 
Number of labs surveyed: 236; 
Percentage of labs surveyed with reported condition-level deficiencies: 
3.4. 

State[A]: Alaska; 
Number of labs surveyed: 24; 
Percentage of labs surveyed with reported condition-level deficiencies: 
12.5. 

State[A]: Arizona; 
Number of labs surveyed: 122; 
Percentage of labs surveyed with reported condition-level deficiencies: 
4.1. 

State[A]: Arkansas; 
Number of labs surveyed: 197; 
Percentage of labs surveyed with reported condition-level deficiencies: 
9.1. 

State[A]: California; 
Number of labs surveyed: 666; 
Percentage of labs surveyed with reported condition-level deficiencies: 
3.9. 

State[A]: Colorado; 
Number of labs surveyed: 173; 
Percentage of labs surveyed with reported condition-level deficiencies: 
10.4. 

State[A]: Connecticut; 
Number of labs surveyed: 140; 
Percentage of labs surveyed with reported condition-level deficiencies: 
0.0. 

State[A]: Delaware; 
Number of labs surveyed: 18; 
Percentage of labs surveyed with reported condition-level deficiencies: 
0.0. 

State[A]: District of Columbia; 
Number of labs surveyed: 15; 
Percentage of labs surveyed with reported condition-level deficiencies: 
13.3. 

State[A]: Florida; 
Number of labs surveyed: 664; 
Percentage of labs surveyed with reported condition-level deficiencies: 
4.1. 

State[A]: Georgia; 
Number of labs surveyed: 369; 
Percentage of labs surveyed with reported condition-level deficiencies: 
4.3. 

State[A]: Hawaii; 
Number of labs surveyed: 57; 
Percentage of labs surveyed with reported condition-level deficiencies: 
5.3. 

State[A]: Idaho; 
Number of labs surveyed: 89; 
Percentage of labs surveyed with reported condition-level deficiencies: 
0.0. 

State[A]: Illinois; 
Number of labs surveyed: 292; 
Percentage of labs surveyed with reported condition-level deficiencies: 
7.5. 

State[A]: Indiana; 
Number of labs surveyed: 112; 
Percentage of labs surveyed with reported condition-level deficiencies: 
9.8. 

State[A]: Iowa; 
Number of labs surveyed: 141; 
Percentage of labs surveyed with reported condition-level deficiencies: 
2.1. 

State[A]: Kansas; 
Number of labs surveyed: 156; 
Percentage of labs surveyed with reported condition-level deficiencies: 
2.6. 

State[A]: Kentucky; 
Number of labs surveyed: 217; 
Percentage of labs surveyed with reported condition-level deficiencies: 
3.2. 

State[A]: Louisiana; 
Number of labs surveyed: 157; 
Percentage of labs surveyed with reported condition-level deficiencies: 
12.1. 

State[A]: Maine; 
Number of labs surveyed: 32; 
Percentage of labs surveyed with reported condition-level deficiencies: 
0.0. 

State[A]: Maryland; 
Number of labs surveyed: 180; 
Percentage of labs surveyed with reported condition-level deficiencies: 
3.3. 

State[A]: Massachusetts; 
Number of labs surveyed: 217; 
Percentage of labs surveyed with reported condition-level deficiencies: 
4.6. 

State[A]: Michigan; 
Number of labs surveyed: 192; 
Percentage of labs surveyed with reported condition-level deficiencies: 
2.6. 

State[A]: Minnesota; 
Number of labs surveyed: 155; 
Percentage of labs surveyed with reported condition-level deficiencies: 
3.2. 

State[A]: Mississippi; 
Number of labs surveyed: 202; 
Percentage of labs surveyed with reported condition-level deficiencies: 
7.9. 

State[A]: Missouri; 
Number of labs surveyed: 238; 
Percentage of labs surveyed with reported condition-level deficiencies: 
2.9. 

State[A]: Montana; 
Number of labs surveyed: 41; 
Percentage of labs surveyed with reported condition-level deficiencies: 
7.3. 

State[A]: Nebraska; 
Number of labs surveyed: 127; 
Percentage of labs surveyed with reported condition-level deficiencies: 
8.7. 

State[A]: Nevada; 
Number of labs surveyed: 98; 
Percentage of labs surveyed with reported condition-level deficiencies: 
4.1. 

State[A]: New Hampshire; 
Number of labs surveyed: 22; 
Percentage of labs surveyed with reported condition-level deficiencies: 
9.1. 

State[A]: New Jersey; 
Number of labs surveyed: 302; 
Percentage of labs surveyed with reported condition-level deficiencies: 
9.9. 

State[A]: New Mexico; 
Number of labs surveyed: 50; 
Percentage of labs surveyed with reported condition-level deficiencies: 
10.0. 

State[A]: New York[B]; 
Number of labs surveyed: 545; 
Percentage of labs surveyed with reported condition-level deficiencies: 
8.1. 

State[A]: North Carolina; 
Number of labs surveyed: 321; 
Percentage of labs surveyed with reported condition-level deficiencies: 
13.4. 

State[A]: North Dakota; 
Number of labs surveyed: 30; 
Percentage of labs surveyed with reported condition-level deficiencies: 
10.0. 

State[A]: Ohio; 
Number of labs surveyed: 221; 
Percentage of labs surveyed with reported condition-level deficiencies: 
5.9. 

State[A]: Oklahoma; 
Number of labs surveyed: 158; 
Percentage of labs surveyed with reported condition-level deficiencies: 
10.8. 

State[A]: Oregon; 
Number of labs surveyed: 147; 
Percentage of labs surveyed with reported condition-level deficiencies: 
4.1. 

State[A]: Pennsylvania; 
Number of labs surveyed: 354; 
Percentage of labs surveyed with reported condition-level deficiencies: 
1.4. 

State[A]: Rhode Island; 
Number of labs surveyed: 36; 
Percentage of labs surveyed with reported condition-level deficiencies: 
0.0. 

State[A]: South Carolina; 
Number of labs surveyed: 143; 
Percentage of labs surveyed with reported condition-level deficiencies: 
25.2. 

State[A]: South Dakota; 
Number of labs surveyed: 66; 
Percentage of labs surveyed with reported condition-level deficiencies: 
1.5. 

State[A]: Tennessee; 
Number of labs surveyed: 403; 
Percentage of labs surveyed with reported condition-level deficiencies: 
10.7. 

State[A]: Texas; 
Number of labs surveyed: 651; 
Percentage of labs surveyed with reported condition-level deficiencies: 
6.5. 

State[A]: Utah; 
Number of labs surveyed: 95; 
Percentage of labs surveyed with reported condition-level deficiencies: 
6.3. 

State[A]: Vermont; 
Number of labs surveyed: 33; 
Percentage of labs surveyed with reported condition-level deficiencies: 
0.0. 

State[A]: Virginia; 
Number of labs surveyed: 271; 
Percentage of labs surveyed with reported condition-level deficiencies: 
5.2. 

State[A]: West Virginia; 
Number of labs surveyed: 85; 
Percentage of labs surveyed with reported condition-level deficiencies: 
7.1. 

State[A]: Wisconsin; 
Number of labs surveyed: 226; 
Percentage of labs surveyed with reported condition-level deficiencies: 
9.7. 

State[A]: Wyoming; 
Number of labs surveyed: 23; 
Percentage of labs surveyed with reported condition-level deficiencies: 
8.7. 

State[A]: Nation; 
Number of labs surveyed: 9,509; 
Percentage of labs surveyed with reported condition-level deficiencies: 
6.3. 

Source: GAO analysis of OSCAR data as of May 15, 2006. 

[A] Washington is not included because it operates only a CLIA-exempt 
program. 

[B] Excludes labs surveyed under the state's CLIA-exempt program. 

Note: Includes surveys conducted from January 12, 2004, through 
December 31, 2004. We excluded surveys conducted from January 1 to 
January 11, 2004, because surveyors began using new CLIA inspection 
requirements on January 12. 

[End of table] 

[End of section] 

Appendix III: Number of Labs Subject to Surveys by State Survey 
Agencies in 2005 and Number of Labs with Sanctions, 1998 to 2004: 

Table 9: Number of Labs Subject to Surveys by State Survey Agencies in 
2005 and Number of Labs with Sanctions, 1998 to 2004: 

State[A]: Alabama; 
Number of labs (2005): 491; 
Number of labs with sanctions (1998-2004): 1. 

State[A]: Alaska; 
Number of labs (2005): 50; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: Arizona; 
Number of labs (2005): 265; 
Number of labs with sanctions (1998-2004): 7. 

State[A]: Arkansas; 
Number of labs (2005): 398; 
Number of labs with sanctions (1998-2004): 14. 

State[A]: California; 
Number of labs (2005): 1,570; 
Number of labs with sanctions (1998-2004): 134. 

State[A]: Colorado; 
Number of labs (2005): 310; 
Number of labs with sanctions (1998-2004): 6. 

State[A]: Connecticut; 
Number of labs (2005): 246; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: Delaware; 
Number of labs (2005): 46; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: District of Columbia; 
Number of labs (2005): 35; 
Number of labs with sanctions (1998-2004): 6. 

State[A]: Florida; 
Number of labs (2005): 1,268; 
Number of labs with sanctions (1998-2004): 5. 

State[A]: Georgia; 
Number of labs (2005): 737; 
Number of labs with sanctions (1998-2004): 5. 

State[A]: Hawaii; 
Number of labs (2005): 81; 
Number of labs with sanctions (1998-2004): 4. 

State[A]: Idaho; 
Number of labs (2005): 203; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: Illinois; 
Number of labs (2005): 517; 
Number of labs with sanctions (1998-2004): 15. 

State[A]: Indiana; 
Number of labs (2005): 274; 
Number of labs with sanctions (1998-2004): 1. 

State[A]: Iowa; 
Number of labs (2005): 281; 
Number of labs with sanctions (1998-2004): 1. 

State[A]: Kansas; 
Number of labs (2005): 279; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: Kentucky; 
Number of labs (2005): 386; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: Louisiana; 
Number of labs (2005): 396; 
Number of labs with sanctions (1998-2004): 7. 

State[A]: Maine; 
Number of labs (2005): 89; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: Maryland; 
Number of labs (2005): 467; 
Number of labs with sanctions (1998-2004): 28. 

State[A]: Massachusetts; 
Number of labs (2005): 409; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: Michigan; 
Number of labs (2005): 387; 
Number of labs with sanctions (1998-2004): 62. 

State[A]: Minnesota; 
Number of labs (2005): 274; 
Number of labs with sanctions (1998-2004): 1. 

State[A]: Mississippi; 
Number of labs (2005): 441; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: Missouri; 
Number of labs (2005): 394; 
Number of labs with sanctions (1998-2004): 5. 

State[A]: Montana; 
Number of labs (2005): 93; 
Number of labs with sanctions (1998-2004): 4. 

State[A]: Nebraska; 
Number of labs (2005): 254; 
Number of labs with sanctions (1998-2004): 4. 

State[A]: Nevada; 
Number of labs (2005): 159; 
Number of labs with sanctions (1998-2004): 3. 

State[A]: New Hampshire; 
Number of labs (2005): 89; 
Number of labs with sanctions (1998-2004): 1. 

State[A]: New Jersey; 
Number of labs (2005): 533; 
Number of labs with sanctions (1998-2004): 16. 

State[A]: New Mexico; 
Number of labs (2005): 113; 
Number of labs with sanctions (1998-2004): 3. 

State[A]: New York; 
Number of labs (2005): 1,125; 
Number of labs with sanctions (1998-2004): 75. 

State[A]: North Carolina; 
Number of labs (2005): 676; 
Number of labs with sanctions (1998-2004): 1. 

State[A]: North Dakota; 
Number of labs (2005): 74; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: Ohio; 
Number of labs (2005): 428; 
Number of labs with sanctions (1998-2004): 12. 

State[A]: Oklahoma;
Number of labs (2005): 292;
Number of labs with sanctions (1998-2004): 6. 

State[A]: Oregon; 
Number of labs (2005): 270; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: Pennsylvania; 
Number of labs (2005): 749; 
Number of labs with sanctions (1998-2004): 7. 

State[A]: Rhode Island; 
Number of labs (2005): 75; 
Number of labs with sanctions (1998-2004): 1. 

State[A]: South Carolina; 
Number of labs (2005): 315; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: South Dakota; 
Number of labs (2005): 116; 
Number of labs with sanctions (1998-2004): 2. 

State[A]: Tennessee; 
Number of labs (2005): 705; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: Texas; 
Number of labs (2005): 1,854; 
Number of labs with sanctions (1998-2004): 26. 

State[A]: Utah; 
Number of labs (2005): 201; 
Number of labs with sanctions (1998-2004): 13. 

State[A]: Vermont; 
Number of labs (2005): 46; 
Number of labs with sanctions (1998-2004): 0. 

State[A]: Virginia; 
Number of labs (2005): 540; 
Number of labs with sanctions (1998-2004): 17. 

State[A]: West Virginia; 
Number of labs (2005): 143; 
Number of labs with sanctions (1998-2004): 6. 

State[A]: Wisconsin; 
Number of labs (2005): 480; 
Number of labs with sanctions (1998-2004): 1. 

State[A]: Wyoming; 
Number of labs (2005): 54; 
Number of labs with sanctions (1998-2004): 1. 

Total; 
Number of labs (2005): 19,678; 
Number of labs with sanctions (1998-2004): 501. 

Source: GAO analysis of CMS lab registries and CLIA database. 

[A] Washington is not included because it operates only a CLIA-exempt 
program. 

[End of table] 

[End of section] 

Appendix IV: Comments from the Centers for Medicare & Medicaid 
Services: 

DEPARTMENT OF HEALTH & HUMAN SERVICES: 
Centers for Medicare & Medicaid Services: 
200 Independence Avenue SW: 
Washington, DC 20201: 

DATE: MAY 17 2006: 

TO: Leslie G. Aronovitz, General Accounting Office: 

FROM: Mark B. McClellan, M.D., Ph.D: 
Administrator: 

SUBJECT: General Accountability Office's (GAO) Draft Report "Clinical 
Lab Quality: CMS and Survey Organization Oversight Should Be 
Strengthened" (GAO-06-416): 

Thank you for the opportunity to review and comment on the subject GAO 
report. 

The Centers for Medicare & Medicaid Services (CMS) strongly endorses 
the overall GAO recommendation that quality assurance for the Nation's 
clinical laboratories be strengthened. To this end CMS has undertaken 
many initiatives in the past several years under auspices of the 
Clinical Laboratory Improvement Amendments of 1988 (CLIA) to strengthen 
standards and increase the quality of laboratory services. Such 
initiatives include the following: 

* Improved Quality Control Requirements: We improved quality control 
requirements for all non-waived testing under CLIA via regulations 
promulgated in 2003, followed by extensive training. 

* Cytology Proficiency Testing: We implemented statutorily-required 
cytology proficiency testing for individuals who examine Pap smears and 
are also revising regulations governing cytology proficiency testing. 
More than 12,000 cytotechnologists and pathologists underwent such 
testing in 2005 (the first year of national testing). 

* Complaint Tracking System: We designed and implemented an improved 
complaint tracking system and other informational infrastructure in 
2006 (S&C Administrative Information Memo 06-13 dated March 28, 2006). 
Further improvements are underway for implementation in 2007. 

* Performance Standards for States: We implemented a national system of 
performance metrics for performance of State survey agencies. We review 
each State's performance annually, and require plans for improvement 
wherever performance remains below the thresholds of acceptability. In 
2005, thirty-three States implemented plans for improvement in at least 
one of thirteen possible areas. 

* Performance Standards for Accrediting Organizations: We initiated a 
comprehensive effort in 2005 to develop performance standards for 
accrediting organizations. 

* Accrediting Organization Response & Improvement. We initiated a 
practice of convening all accrediting organizations together to review 
common issues and promote improvement in surveys. A new agreement for 
improved data sharing, communications, and performance has been one 
outcome of these regular meetings. In addition, a new rapid response 
alert agreement will enable faster, more efficient and more coordinated 
responses to situations with significant quality or public health 
implications. 

* Data and Analysis: We are currently examining methods by which data 
may be more effectively employed in the oversight of both State 
agencies and accrediting organizations. We are planning for overhaul of 
the CLIA databases and analytical systems used to manage the program. 

* Waived Lab Quality Project: We initiated a special project to improve 
performance in those laboratories that qualify for a certificate of 
waiver (i.e., are not required to be inspected provided they follow 
manufacturers' test instructions appropriately). Of those laboratories 
in the sample that received a CMS revisit (n=459), more than 70 percent 
improved their performance in adhering to manufacturers' instructions 
subsequent to the CMS reviews and initial visit[Footnote 94]. 

The above actions build on a track record of progressively improved 
performance by laboratories that coincides with CMS' implementation of 
the CLIA requirements enacted by Congress in 1988. 

For example, the percentage of laboratories determined to have 
deficiencies declined by 42 percent between 1994 and 2004 (Figure 1). 
The decline was most pronounced in the early years of implementation, 
before leveling off in 2004 subsequent to CMS' strengthening of the 
quality control requirements[Footnote 95]. 

Figure 1: Labs with Deficiency Citations: 

[See PDF for image] 

[End of figure] 

[See PDF for image] 

[End of figure] 

The CMS also made it a national goal to improve the results of 
proficiency testing of laboratories[Footnote 96]. Laboratories are 
required by statute to undergo proficiency testing for certain 
specialties and analytes that are specified in the regulations, if the 
laboratory's business includes those specialties (for example, tests 
for blood type, blood chemistry, blood cell counts, viruses, parasites, 
fungus, bacteria, and blood antibodies). The proficiency tests are 
administered at least three times per year (except for once-per-year 
testing in cytology proficiency). The GAO highlights the importance of 
proficiency testing as a measure of performance. While 
the GAO points to a recent increase in failures among some labs between 
1999 and 2003, the overall performance of all labs has improved, 
especially when viewed over a longer time period. 

[See PDF for image] 

[End of figure] 

Figure 2 shows the performance of all labs (both accrediting 
organization and CMS certified). In 1996, 87.4 percent of all labs 
enrolled in proficiency testing had no test failures. The proficiency 
testing rates improved to 88.1 percent in 1998, to 91.9 percent in 
2000, and to 92.8 percent in 2003. These improvements cannot be 
dismissed as a result of some laboratories being granted "waived 
status," since (a) the most dramatic improvements pre-dated the recent 
increase in the number of waived labs, and (b) removing waived labs 
from the data would not result in improved performance rates (i.e., 
labs with a CLIA certificate of waiver did not have a higher failure 
rate under their previous certificate of compliance or certificate of 
accreditation, when they were subject to proficiency-testing). 

Balancing Enforcement and Educational Roles: 

The CMS believes that a careful survey process and identification of 
regulatory deficiencies, combined with our quality improvement and 
educational approach to CLIA implementation, has been a very effective 
strategy and has yielded substantial benefits to the public. The GAO 
report, however, expresses concern about whether CMS is adequately 
balancing regulatory and educational roles. In particular, the GAO 
states: 

CMS appears to be inappropriately stressing education over regulation 
in its implementation of (1) 2003 laboratory quality control 
requirements for the CLIA program and (2) proficiency testing for lab 
technicians who interpret Pap smears, a test for cervical cancer. 

We agree that a careful balance must be maintained. We also believe 
that our response to specific GAO recommendations (discussed below) 
demonstrates our commitment to maintaining an appropriate balance. 

We strongly disagree, however, with GAO's statement regarding the 2003 
quality control regulations and the cytology proficiency testing for 
Pap smears. In both instances we strengthened quality requirements and 
public protections. It was entirely fitting that we emphasize education 
of providers and full opportunity for laboratories to understand the 
new requirements, implement them appropriately, upgrade their systems 
and practices, and make necessary corrections without unnecessary 
sanctions. 

The CMS "educational approach" does not mean that surveyors refrain 
from identifying deficiencies on the part of laboratories; in fact, 
objective review and feedback is the bedrock of education. The 
educational approach does mean that we limit the exercise of sanctions 
in certain circumstances, particularly in cases of new requirements 
when the motivational power of sanctions is unnecessary (and may even 
be counterproductive). 

The CMS published the improved quality control requirements in final 
form in January, 2003 as the CLIA Quality System Rule. The rule became 
effective April 24, 2003. Prior to this rule, laboratories that 
performed moderate complexity testing were only held to certain limited 
quality control requirements. Laboratories performing high complexity 
testing were already subject to the more stringent quality control 
requirements prior to the 2003 rule. The 2003 rule applied the more 
stringent quality control requirements to laboratories performing 
moderate complexity testing. As a result, all laboratories performing 
non-waived testing (i.e., moderate and high complexity testing) have 
subsequently been subject to the same quality control requirements. 

Changes in the 2003 CMS rule required laboratories performing moderate 
complexity testing to upgrade their quality control. CMS adopted an 
educational period to allow those laboratories more time to understand 
and comply with the more stringent quality control requirements that 
applied to them for the first time, and have committed ourselves in the 
rule promulgation process to implement the change in a manner that 
would limit the immediate impact on laboratories. 

In addition, the educational period has been important to permit CMS to 
evaluate various technological changes and work with the Clinical and 
Laboratory Standards Institute (CLSI) to (a) develop two new documents 
(one for manufacturers and one for laboratories) that provide guidance 
for quality control; (b) evaluate new quality control systems that do 
not fully comport with the 2003 regulations but which may offer useful 
innovation; and (c) develop technical assistance for laboratories in 
risk management (one of their quality control responsibilities). 

The CMS is also consulting with experts in the field of laboratory 
science to enable us to make future refinements to the quality control 
interpretive guidelines and also provide laboratories with more 
effective, practical applications for quality control. 

Figure 3 shows the overall decline in the percentage of laboratories in 
which surveyors identified the top two quality control deficiencies. 
This trend suggests that our approach produces the desired results in 
the form of improved quality. 

[See PDF for image] 

[End of figure] 

With regard to Pap smear proficiency testing, several factors 
contributed to a delay in implementing this provision of the 1988 law 
on a national basis. More recently, CMS successfully worked with 
potential testing programs and with laboratories to begin such testing 
in 2005. More than 12,000 cytotechnologists and pathologists were 
tested in that first year of national testing. Such national testing 
represented a significant accomplishment to ensure that the full 
protections afforded by CLIA have been put into effect. 

CMS' educational approach refrains from punitive sanctions only if (a) 
each laboratory enrolls its staff in proficiency testing, (b) ensures 
the staff take the test, and (c) ensures that proper regulatory 
procedures are followed in the event that an individual fails the test 
(e.g., undertakes additional education and is re-tested). Under our 
enforcement discretion, we are not currently sanctioning laboratories 
if they are unable to ensure that 100 percent of staff achieves a 
passing score, provided the laboratory follows the requirements for re- 
testing. 

CMS believes our educational approach strikes an appropriate balance 
between (a) the impositions of sanctions for anything less than 
immediate 100 percent compliance by all laboratories, and (b) any 
lessening of expectations for proficiency and protection of the public 
that are not fully reviewed and supported by both evidence and good 
practice. 

For those laboratories that continue to incur repeat deficiencies, CMS 
will use a progressive enforcement approach. In that context CMS is 
appreciative of the GAO's external review, which provides insight into 
areas where we may improve, augment, and reinforce our oversight of 
both laboratories and accrediting organizations to ensure quality 
testing. 

Responses to Specific GAO Recommendations: 

Each GAO recommendation is quoted in italics below, followed by our 
comment and plan of action. We added numbering to the GAO 
recommendations for ease of reference. 

GAO Recommendation #1: Work with exempt state programs and accrediting 
organizations to standardize their categorization and reporting of 
survey findings in a way that tracks to CLIA inspection requirements 
and allows for meaningful comparisons across organizations, such as the 
analysis of trends in the citation of condition-level deficiencies. 

CMS Response: We endorse this concept but will be cautious as to its 
scope. In our experience, a straightforward linkage of accrediting 
organization requirements to CLIA condition-level requirements is 
limited by our authority under the statute, and still may not make it 
fully possible to assess labs in a standardized manner. 

First, the law permits each accrediting organization to have different 
requirements compared to CMS, so long as their requirements are at 
least equivalent to CMS requirements. 

Second, accrediting organization requirements may exceed CMS 
requirements (so their standard may not have a CMS equivalent). 

Third, standardization of requirements does not automatically provide a 
total picture of the adequacy of an accrediting organization's survey 
and will not reduce the need for CMS to analyze in-depth those 
accrediting organization surveys that are subject to validation 
review[Footnote 97]. 

Fourth, after multiple review cycles, CMS has verified that the 
accrediting organization's published standards are at least equivalent 
to, if not more stringent than the CLIA regulations. We believe the 
more important issue in accrediting organization oversight is the 
accrediting organization's enforcement of their standards. 
Demonstrating that an accrediting organization is enforcing its 
standards through comprehensive policies, procedures and internal 
monitoring processes is vital to the effectiveness of a program. An 
accrediting organization can have the highest standards, but if not 
enforced appropriately, these standards hold little value in ensuring 
laboratory quality. Toward that end, CMS has re-focused its approval 
and oversight of accrediting organizations to concentrate on outcomes. 
This re-focusing is not only a more efficient use of CMS resources, but 
also a more effective approach overall in overseeing accrediting 
organizations. 

To supplement the validations and other information about accrediting 
organizations, CMS, through the Partners for Laboratory Oversight 
process, has convened a workgroup of accrediting organizations and CMS 
representatives to develop data-driven performance indicators similar 
to the State Agency Performance Review (SAPR) program that CMS utilizes 
to monitor State agency performance of CLIA responsibilities and 
adherence to policies. The accrediting organization indicators would 
monitor routinely, for example, whether biennial surveys were conducted 
timely, and whether laboratories that failed proficiency testing or 
incurred serious deficiencies corrected their problems promptly or had 
sanctions imposed. The Partners for Laboratory Oversight effort engages 
an exceptional collection of expertise and experience in laboratory 
oversight. By organizing the "best of the best" in a collaborative 
endeavor involving all accrediting organizations, we hope that 
accrediting organizations will make further improvements as well as 
advance the state of the art for laboratory quality. 

CMS Action: 

1(a) Categorization of Findings: CMS will work with exempt state- 
programs and accrediting organizations to promote greater 
standardization of categorizing and reporting survey findings in a way 
that enables improved tracking to CLIA inspection requirements and 
allows for more meaningful comparisons across organizations, such as 
the analysis of trends in the citation of condition-level deficiencies. 

GAO Recommendation #2: Ensure that the advance notice of upcoming 
surveys provided to physician office labs is consistent with CMS' 
policy for advance notice provided by state survey agencies. 

CMS Response: We agree. CMS will require any accrediting organization 
using announced surveys to reduce its lead time to be consistent with 
CMS policy governing actions of State survey agencies. 

CMS Action: 

2(a) Advance Notice in Small Labs: CMS will ensure that the advance 
notice of upcoming surveys provided to physician office labs is 
consistent with CMS' policy for advance notice provided by State survey 
agencies. 

2(b) Consistency: CMS will work with accrediting organizations and 
State survey agencies to promote unannounced surveys in larger labs and 
achieve greater consistency among all oversight organizations. 

GAO Recommendation #3: Ensure that regulation of labs is the primary 
goal of survey organizations and that education to improve lab quality 
does not preclude the identification and reporting of deficiencies that 
affect lab testing quality. 

CMS Response: We agree that education to improve lab quality should 
never preclude the identification of deficiencies that affect lab 
testing quality, and that regulation of labs is the primary goal of 
survey organizations. In the case of significant new requirements, and 
only within certain areas for the time period specified by CMS, the 
educational approach may include the possibility of identified 
deficiencies being communicated to laboratories without a concomitant 
citation. Currently, such allowance primarily applies to two 
situations: 

* Quality control requirements that were new in the 2003 regulation for 
labs conducting moderate complexity testing; 

* Cytology proficiency testing that was newly implemented on a national 
basis in 2005. 

For the reasons explained previously, we do not anticipate a change in 
this policy. 

CMS Action: 

3(a) Consistency Action Plan: CMS will ensure that a CMS Consistency 
Workgroup comprised of Regional Office and Central Office CLIA staff 
formulates an action plan to increase consistency. 

3(b) Guidance: CMS will develop protocols or refinements to surveyor 
guidance to ensure an appropriate balance between the enforcement and 
educational functions of the survey process. 

3(c) Training: CMS will provide additional training for surveyors and 
management on the differences between the "educational approach" and 
the "outcome oriented survey process", including concentrated training 
on which survey findings require citation without any variation. 

3(d) Performance & Consistency Review: CMS will ensure Central and 
Regional Office data review of key identified data sets, on a periodic 
basis, to determine if observed variations are truly significant and to 
identify any significant trends. This increased communication between 
Central Office, Regional Offices, & State agencies as they work to 
explain and understand the variations will lead to decreased 
variability and enhanced consistency over time. 

GAO Recommendation #4: Impose appropriate sanctions on labs with 
consecutive condition-level deficiencies in the same requirements. 

CMS Response: This recommendation is already CMS policy; the issue is 
our approach to implementation of the policy. CMS' policy of 
progressive enforcement involves the imposition of sanctions on 
laboratories failing to correct deficiencies that affect the quality 
of laboratory testing, with sanctions increasing in severity in the 
event of continuing failures. By looking only at the category of failure (the 
"conditions"), however, it is not possible to determine whether a 
laboratory has consecutively failed in the same requirement. 

For example, the laboratory could fail in proficiency testing in one 
year due to neonatal testing, and fail in proficiency testing in a 
completely different division of the laboratory the next year (e.g., 
virology). In regard to laboratories with consecutive condition-level 
deficiencies, the data presented by GAO would not permit us to assess 
whether there is a serious problem because the underlying failures 
could have been different in the two consecutive surveys for those 
laboratories that the GAO included in its report. Nonetheless, we agree 
that the issue is important and that labs that consistently fail to 
assure quality must be subject to consistently stronger remedial 
action. 

CMS Action: 

4(a) Monitoring & Data Analysis: CMS will carefully monitor citations 
of repeat deficiencies as part of the overall redesign of the CMS 
information system (converting from the Online Survey and Certification 
Reporting System (OSCAR) database to the ASPEN information system). 

4(b) Follow-up System: CMS will review the data with State survey 
agencies and accrediting organizations for the purpose of ensuring that 
the laboratories with true repeat deficiencies have accelerated and 
progressive enforcement actions imposed, if the deficiencies are not 
corrected expeditiously and effectively[Footnote 98]. 

GAO Recommendation #5: Require all survey organizations to develop, and 
require labs to prominently display, posters instructing laboratory 
workers on how to file anonymous complaints[Footnote 99].  

CMS Response: Complaints from clients or laboratory workers can be an 
extremely important vehicle for identifying problems. For that reason, 
CMS follows up on all complaints. Information about filing complaints 
has already been included in the updated Surveyor and Laboratory 
Interpretive Guideline document and most States already have a Hotline 
for the receipt of complaints. 

In March 2006, CMS also implemented a new, more sophisticated data 
system to receive and track complaints. The system will significantly 
facilitate State agency documentation and follow-up of complaints to 
their conclusion. 

CMS Action: 

5(a) Filing Complaints: CMS will take action to promote greater 
awareness of the opportunity and methods to file a complaint with CMS, 
State survey agencies, and accrediting organizations regarding the 
quality of laboratory services. Such actions may include: 

* Providing a complaint filing "fact sheet" and model complaint poster 
on our website; 

* Issuing a CLIA brochure regarding complaint filing; 

* Encouraging State agencies and partners to publicize the complaint 
process through their websites and publications; 

* Working with the laboratory industry to use publications to highlight 
the importance of complaint filing by laboratory workers to promote 
laboratory excellence; and 

* Considering requirements for all laboratories to display posters 
instructing laboratory workers on how to file anonymous complaints. 

5(b) Complaint Information Sharing: CMS will work with accrediting 
organizations and States to increase the sharing of information 
regarding complaints and complaint investigations. 

5(c) Complaint Tracking and Response: CMS will seek to augment its 
complaint tracking system to build in the capability for accrediting 
organizations to transmit their complaint data to that system, thereby 
enabling a national complaint information database (or repository) for 
the first time. In addition to supporting CMS' monitoring of its own 
complaint follow-up, such a system would assist the accrediting 
organizations in following up on complaints they receive in a timely 
manner. 

GAO Recommendation #6: Consistent with CLIA, require quarterly 
proficiency testing, except when technical and scientific 
considerations suggest that less frequent testing is appropriate for 
particular examinations or procedures. 

CMS Response: CMS has already made this determination. While the public 
explanation emphasized limiting the burden on laboratories, CMS, in 
conjunction with the Centers for Disease Control and Prevention, 
concluded on both technical and scientific grounds that proficiency 
testing three times per year was appropriate. 

GAO Recommendation #7: Ensure that evaluations of exempt State and 
accrediting organization inspection requirements take place prior to 
expiration of the period for which they are approved in order to ensure 
the continued equivalency of their requirements with CLIA. 

CMS Response: We recognize the need to complete timely reviews. 
However, we reserve the right to manage the work within available 
resources and assessment of priorities. Initially, we deemed 
accreditation organizations and exempt States for periods of less than 
6 years. This allowed us to perform multiple assessments to evaluate 
their programs and assure their standards were consistently equivalent 
to those of CLIA. Over the years we have found that the accrediting 
organizations have been consistent in regard to equivalency of 
standards. To ensure continued equivalency or more stringent 
requirements than those of CLIA, we are refocusing our approval process 
and oversight on evaluating how exempt States and accrediting 
organizations are enforcing their standards and assessing patient 
testing outcomes through the validation survey process. We are managing 
the risk appropriately. For accrediting organizations, we are 
developing performance measures through our partners, and are using the 
validation process to monitor the outcomes of their survey processes, 
as well as CLIA compliance. The CMS-convened Partners for Laboratory 
Oversight group has already raised the bar by collaborating to 
facilitate increased effectiveness, knowledge and consistency for all 
participating entities, with the aim of improving their application and 
assessment of compliance for CLIA purposes. 

CMS Action: 

7(a) Timely Review of Accrediting Organization Standards: CMS will 
ensure, within the limits of available resources and priorities, that 
evaluation of exempt State and accrediting organization inspection 
requirements takes place prior to expiration of the period for which 
they are approved in order to ensure the continued equivalency of their 
requirements with CLIA. 

GAO Recommendation #8: Ensure that changes to the inspection 
requirements of exempt states and accrediting organizations are 
reviewed prior to implementation, as required by regulation, to ensure 
that individual changes do not affect the overall CLIA equivalency of 
each organization. 

CMS Response: It is correct that the accreditation organization must 
submit changes to CMS 30 days prior to their implementation [42 CFR 
493.557(a)(13)]. However, the regulatory language does not specify a 
time period for the review of this information by CMS. Additionally, 
State exemption has no similar requirement. Since accreditation 
organizations' requirements may be more stringent than CLIA, changes to 
requirements do not necessarily impact CLIA equivalency determinations. 

The approval of accrediting organizations is only one portion of CMS' 
oversight responsibilities. While we appreciate the value of timely 
review, we reserve the right to manage the work within available 
resources and assessment of priorities. Due to the potential for 
concerns about accrediting organization performance (versus equivalency 
of standards), CMS increased the percentage of validation surveys 
performed per year from an initial 1 percent to the current level of 
2.5 percent. 
CMS also receives anecdotal information regarding accrediting 
organization performance from State agencies and specific concerns 
through the complaint process. 

CMS Action: 

8(a) Timely Review of Accrediting Organization Changes: CMS will, 
within the limits of available resources and priorities, ensure that 
changes to the inspection requirements of exempt states and accrediting 
organizations are reviewed prior to implementation, as required by 
regulation, to ensure that individual changes do not affect the overall 
CLIA equivalency of each organization. 

GAO Recommendation #9: Allow the CLIA program to utilize revenues 
generated by the program to hire sufficient staff to fulfill its 
statutory responsibilities. 

CMS Response: CMS has faced a decline in CLIA program staff even as our 
workload has increased significantly. We will therefore explore this 
GAO recommendation. 

CMS Action: 

9(a) CLIA Staffing: CMS will consider adjustments to CLIA staffing in 
CMS Central and Regional Offices, and will establish CLIA staffing 
levels consistent with workload and available CLIA revenues. 

GAO Recommendation #10: Ensure that Federal surveyors validate a 
sufficient number of inspections conducted by each State survey agency 
to allow a reasonable estimate of their performance, including a 
minimum of one independent validation review for each State survey 
agency surveyor. 

CMS Response: In its recommendation to perform a sufficient number of 
surveys "to allow a reasonable estimate of their performance," GAO 
quotes from the language in section 353(e)(2)(D) of the Public Health 
Service Act (which contains the CLIA statute), which pertains only to 
the evaluation of approved laboratory accreditation organizations, not 
the State agency. There is no statutory requirement regarding the 
number of surveys to be performed in each State to assess surveyor 
competency. Nevertheless, we agree that oversight of State agency and 
surveyor competency is important and that Federal surveyors should 
conduct a sufficient number of Federal Monitoring Surveys to allow for 
a reasonable estimate of State agency performance. In CY 2004, CMS 
instituted the CLIA State Agency Performance Review, a more 
comprehensive State agency oversight mechanism. The CLIA State Agency 
Performance Review includes indicators that measure the mechanisms for 
improvement in response to findings of our Federal Monitoring Surveys 
concerning individual surveyor competency assessments. 

Types of Federal Monitoring Surveys include: 

Comparative. The Regional Office surveyor(s) survey the laboratory 
after the State agency surveyor(s). This type of survey is also called 
Look-Behind and can be considered an independent survey of the 
laboratory. The GAO would consider this to be an "independent" 
validation survey. 

Observational. The Regional Office surveyor accompanies the State 
agency surveyor(s) during the laboratory survey and interacts as 
necessary to provide guidance to the State agency surveyor(s) at 
appropriate times. 

Participatory. The Regional Office surveyor and State agency 
surveyor(s) identify deficiencies during the laboratory survey. 

The Federal Monitoring Survey is a powerful educational tool for 
surveyor training. Observational and participatory Federal Monitoring 
Surveys are balanced by the comparative survey. We estimate that 
comparative surveys accounted for about 15 percent of all CLIA 
oversight surveys during the period GAO studied. 

We agree that the comparative survey or "independent validation review" 
offers a truer assessment of surveyor competency than the observational 
or participatory Federal Monitoring Survey, and for that reason 
continue to have the comparative survey as a tool available to Federal 
surveyors for their oversight responsibilities. We are convinced that 
Federal surveyors exercise appropriate judgment as to when to select 
or not select the comparative survey to fulfill their responsibilities 
for surveyor competency assessment. One must also consider that 
comparative Federal Monitoring Surveys can be disruptive to 
laboratories, as they require two separate surveys conducted during 
different time frames to separately determine laboratory compliance 
with CLIA. 

CMS Action: 

10(a) Validating State Agency Performance: CMS will increase its 
efforts to ensure that Federal Monitoring Surveys are performed 
annually in each State in numbers sufficient to allow a reasonable 
estimate of State agency performance, including increasing the number 
of "independent" reviews. 

10(b) Independent Validation Review: CMS will ensure that at least one 
comparative Federal Monitoring Survey is performed for each surveyor 
every year. 

10(c) Strengthen Training: CMS will strengthen its training focus and 
application of the outcome-oriented approach to surveying for 
laboratory compliance with 42 CFR §493 by incorporating additional 
specific examples and case studies of deficiencies that demonstrate non-
compliance in current and future training of laboratory surveyors. 

GAO Recommendation #11: Require that almost all validation reviews of 
each accrediting organization's surveys be an independent assessment of 
performance. 

CMS Response: We reviewed the statistics provided by GAO regarding the 
numbers of validation surveys performed simultaneously with the 
laboratory accreditation organizations, as well as our statistics 
regarding validation surveys. The numbers given for CAP (11 percent), 
COLA (9 percent) and JCAHO (33 percent) equate to the numbers of 
simultaneous validation surveys per year for each organization that are 
shown here in Figure 4. 

Forty-seven is consistent with the number historically recounted by the 
staff in the CMS regional offices that authorize the validation 
surveys: an average of about 1 simultaneous validation survey per State 
per year. It is also consistent with statistics in the CLIA data system 
for 
calendar year 2005 (45 simultaneous validation surveys). 

FIGURE 4: SIMULTANEOUS VALIDATION SURVEYS (Annual Nationwide Data). 

[See PDF for Image] 

[End of table] 

The number of validation surveys performed nationwide has increased in 
recent years to almost 400, to ensure that CMS is adequately overseeing 
accrediting organization performance. At the 
present level of 1 simultaneous validation survey per State, 
simultaneous validation surveys constitute about 12 percent of the 
total number of validation surveys performed. Conversely, about 88 
percent of the total validation surveys are performed independently, 
which equates to the recommendation that almost all validation surveys 
be an independent assessment of performance. We believe 12-15 percent 
is a reasonable proportion to reserve for the opportunities afforded by 
simultaneous validation surveys, such as: 

* Promoting understanding of each other's programs; 

* Sharing of best practices; and 

* Fostering improvements in accreditation organizations' survey 
processes. 

CMS Action: 

11(a) Ensure Validation Surveys: CMS will continue to monitor and 
ensure that the vast preponderance of validation surveys for 
accrediting organizations takes the form of independent assessments. 

GAO Recommendation #12: Collect and routinely review standardized 
survey findings and other available information for all survey 
organizations to help ensure that CLIA requirements are being enforced 
and to monitor the performance of each organization. 

CMS Response: We strongly endorse the value of collecting and reviewing 
survey findings and other available information to monitor, sustain, 
and improve performance. For this reason we instituted standardized 
mechanisms for State survey agency performance through the State Agency 
Performance Review (SAPR) protocols. Those protocols utilize standard 
indicators of performance and data. More recently we initiated 
development of a similar system for application to the performance of 
accrediting organizations. We believe that the accrediting organization 
Performance Measures under development will effectively enhance current 
methods to fulfill our oversight responsibilities for accrediting 
organizations. 

With regard to the "standardized" aspect of this recommendation, we 
will put emphasis on improving our methods of standardizing 
interpretations of survey outcomes, even though the standards of each 
accrediting organization may be different[Footnote 100]. 

CMS Action: 

12(a) Collection & Review of Accrediting Organization Survey Findings: 
CMS will explore methods to expand its collection, review, and analysis 
of survey findings and the follow-up actions of accrediting 
organizations in order to monitor, sustain and improve performance of 
accrediting organizations. 

GAO Recommendation #13: Establish an enforcement database to monitor 
actions taken by state survey agencies and regional offices on labs 
that lose their accreditation. 

CMS Response: We agree that laboratories losing accreditation due to 
CLIA quality issues require close attention to ensure they are not 
erroneously deemed CLIA compliant. Our development efforts for 
enforcement management, and planned future system enhancements, will 
assist us in tracking and monitoring such cases. We will also be 
working closely with our state agencies, regional offices and 
accreditation organizations to review present procedures to ensure that 
actions taken are appropriate and timely. 

CMS Action: 

13(a) CMS Enforcement Database: CMS will complete the development of 
the CMS CLIA enforcement database to track and monitor labs that may 
require federal enforcement actions. 

[End of section] 

Appendix V: Comments from the College of American Pathologists: 

College of American Pathologists: 
325 Waukegan Road, Northfield, Illinois 60093-2750: 
800-323-4040: 

Advancing Excellence: 

Direct Response To:
DIVISION OF GOVERNMENT AND PROFESSIONAL AFFAIRS 1350 I Street, NW, 
Suite 590 Washington, DC 20005-3305: 
202-354-7100 Fax: 202-354-7155 800- 392-9994: 
[Hyperlink, http://www.cap.org]: 

May 16, 2006: 

Leslie G. Aronovitz: 
Director, Health Care: 
Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Ms. Aronovitz: 

As an organization dedicated to improving laboratory medicine and 
patient care, the College of American Pathologists (CAP) takes 
seriously the findings and recommendations of the Government 
Accountability Office (GAO) report titled Clinical Lab Quality: CMS and 
Survey Organization Oversight Should Be Strengthened. We were pleased 
to work with the GAO on this report and appreciate the opportunity to 
provide comments. 

In general, we believe that the Clinical Laboratory Improvement 
Amendments (CLIA) provide for adequate federal oversight for ensuring 
accurate laboratory testing and promoting ongoing quality improvement. 
In 2004, the College initiated its own evaluation of its Laboratory 
Accreditation Program as a result of the events at Maryland General 
Hospital. The College's comprehensive review, which considered 
independent, external recommendations, resulted in the approval of a 
series of initiatives designed to strengthen the program, many of which 
are acknowledged by the GAO in this report. The CAP, however, is 
continuously looking for ways to improve and will be closely examining 
the findings of the GAO with respect to the CAP Laboratory 
Accreditation Program (LAP) to determine if there are additional 
measures that the College should take to strengthen its program. The 
CAP also will continue to work closely with the Centers for Medicare 
and Medicaid Services and other stakeholders to develop consensus on 
best practices in assessing laboratory quality. 

The GAO was charged with examining the quality of laboratory testing 
and was unable to make a determination about this issue. The CAP 
believes there are inherent challenges to measuring the quality of 
laboratory testing due to the complexity of the issue, and the CAP is 
working to develop better systems for detecting laboratories with 
quality issues that potentially impact upon patient care. As noted in 
the report, the CAP is investing $9 million over the next two 
years in new information systems and processes to strengthen our 
ability to monitor a laboratory for sustained compliance throughout its 
two-year accreditation cycle.  

CAP also will be developing a sophisticated computer program that will 
integrate quality factors, such as proficiency testing results and 
trend analysis, inspection findings, and complaints, into a knowledge 
management system. That system will be used to support more effective 
accreditation decision-making based on a comprehensive, 
multi-dimensional assessment of laboratory performance. 

Ultimately, the most important outcome for patients is consistently 
accurate laboratory results. As noted by GAO, the only comparative data 
available at this time to evaluate the quality of laboratory results in 
a systematic way is proficiency testing data. The CAP believes that the 
proficiency testing data cited in the report, for the most part, 
demonstrates that laboratories accredited by the CAP perform better on 
proficiency testing than those that are not. We believe that is a 
relevant measure of the quality of testing performed by laboratories 
accredited by the CAP. 

However, quality improvement is never a static exercise. As is noted in 
the report, the College's laboratory accreditation program continues to 
undergo change. Many of our recent initiatives directly address issues 
of concern raised by the GAO. For example: 

* Unannounced Inspections: The LAP's move to unannounced inspections 
directly addresses the GAO's concerns related to the accreditation 
system's ability to emphasize continuous regulatory compliance and adds 
credibility to the accreditation survey's conclusions as to the 
laboratory's ability to provide quality patient care. 

* Complaint Posters and Whistleblower Policies: The CRP's "anonymous 
complaint" poster, which, as of October 2004 must be displayed in any 
CAP accredited laboratory, and the CAP whistleblower protection policy 
both address some of the GAO's concerns about identifying emerging 
laboratory quality problems. 

* Mandatory Inspector Training: The LAP's new mandatory inspector 
training addresses the GAO's concerns about using active and current 
laboratory professionals to conduct CAP surveys. This training will 
supplement their professional experience with specific guidance on 
inspection techniques. 

While we believe the GAO provides valuable insights and information to 
consider, there are also portions of the report where we have a 
different perspective. 

The GAO expresses concerns about our program's use of practicing 
laboratory professionals as inspectors. The CAP believes that the GAO 
underestimates the value of utilizing laboratory professionals in the 
inspection process. CAP accredited laboratories voluntarily choose CAP 
accreditation, which includes requirements that are more stringent than 
CLIA. We believe that this dedication to enhanced quality by laboratory 
professionals demonstrates a commitment to undertake more than is 
required by the federal government to assure quality laboratory 
testing. 

Further, the report provides no factual evidence that the CAP's use of 
laboratory professionals is less effective than other inspection 
models. In fact, the available evidence suggests that the CAP system is 
comparable to other models. 

The GAO also raises concerns that the use of volunteer inspectors 
creates the appearance of conflicts of interest, but also notes steps 
the College has taken to address this issue. It is important to note 
that the College has always had policies and procedures to protect 
against conflicts of interest interfering with the objectivity of the 
inspection process. In addition, a component of our recent initiative 
to strengthen the LAP was to tighten and make more comprehensive our 
existing conflict of interest policies and procedures. The CAP expects 
that our enhanced policies and procedures will more effectively guard 
against potential conflicts impairing the inspection process, and the 
CAP will continue to closely monitor this issue to determine if further 
actions are necessary. 

The GAO also raises concerns regarding the educational versus the 
regulatory aspects of CLIA. As the GAO correctly notes, CLIA neither 
requires nor precludes an educational role for surveyors. The College 
believes that these dual objectives are not mutually exclusive and that 
education is an inherent and important outcome of the inspection 
process of identifying and correcting deficiencies. 

Conclusion: 

The CAP accreditation program is dedicated to a single mission: raising 
the quality of laboratory testing to improve patient care. As with the 
laboratories we accredit, we are committed to the continuous 
improvement of our program and, therefore, take seriously the analysis 
provided in this report. We believe our actions demonstrate this 
commitment. The CAP will continue to work closely with the Centers for 
Medicare and Medicaid Services and other stakeholders on further 
improving laboratory quality and the laboratory inspection process. 

Sincerely, 

Signed By: 

Thomas M. Sodeman, MD, FCAP: 
President: 

[End of section] 

Appendix VI: Comments from COLA: 

COLA Working Together for Excellence in Healthcare: 

CHAIR: 
Isabel V. Hoverman, MD: 
Austin, Texas: 
American College of Physicians (ACP): 

Douglas A. Beigel Chief Executive Officer: 

SECRETARY: 
Mary E. Frank, MD: 
Rohnert Park, California: 
American Academy of Family Physicians (AAFP): 

TREASURER: 
Herman I. Abromowitz, MD: 
Dayton, Ohio: 
American Medical Association (AMA): 

BOARD OF DIRECTORS: 
AAFP: 
Richard Wherry, MD: 
Dahlonega, Georgia: 

Daniel J. Van Durme, MD: 
Tampa, Florida: 

AMA: 
Veronica C. Santilli, MD: 
Brooklyn, New York: 

J. Edward Hill, MD: 
Tupelo, Mississippi: 

ACP: 
Donna E. Sweet, MD: 
Wichita, Kansas: 

W. James Stackhouse, MD, FACP: 
Goldsboro, North Carolina: 

AOA: 
William M. Silverman, DO, FACOFP: 
Maitland, Florida: 

CHIEF EXECUTIVE OFFICER: 
Douglas A. Beigel: 

May 16, 2006: 

Mr. Walter Ochinko: 
441 G St. NW, Room 5022: 
Washington, D.C. 20548: 

Dear Mr. Ochinko: 

COLA appreciates the opportunity to review and comment on your draft 
report "CLINICAL LAB QUALITY: CMS and Survey Organization Oversight 
Should be Strengthened (GAO-06-416)". As you know, during the course of 
your study, we responded to numerous written and verbal inquiries from 
the GAO and performed several in-depth data analyses. As COLA assisted 
you in this effort, we welcomed your critical, but focused examination 
of CLIA oversight and we used the process to discover opportunities to 
improve our accreditation program. Your report reinforces and validates 
for COLA many of the guiding principles that were used to design the 
most widely used laboratory accreditation program in the U.S. You 
specifically mentioned surveyor training and consistency of assessments 
as key factors to a strengthened laboratory oversight system. COLA is 
proud of its significant and extensive surveyor training program and 
our high level of inter-rater reliability. 

However, COLA has a number of concerns regarding several assumptions 
and conclusions the GAO has drawn in the report. Specifically, we 
disagree with your suggestion that education and enforcement are 
mutually exclusive. We feel that enforcement can successfully be 
coupled with education so that laboratories can learn tools they need 
for compliance. Furthermore, your findings draw conclusions regarding 
notice for onsite inspections and overall laboratory preparedness that 
run contrary to a quality improvement philosophy. While you argue that 
data on laboratory improvement are misleading, we are confident that 
laboratory quality has improved since the promulgation of CLIA 
regulations in 1992. Finally, your recommendations for mechanisms to 
improve CMS oversight of survey organizations have merit; however, some 
are rooted in overstated problems and incorrect conclusions. We have 
addressed these substantive areas in our comments below. We have 
appended some technical comments under separate cover. 

Substantive Comments: 

Education should not be confused with lack of accountability: 

Education is a critical and key component to the reasonable and 
appropriate implementation and enforcement of laboratory performance 
requirements. COLA feels that such an educational approach is essential 
to the desired outcome of real improvement in laboratory performance 
and to prevent the continuation of deficiencies across inspection 
cycles. Many federal requirements in many regulated industries are 
"phased-in" in order to allow regulated entities the time to understand 
and effectively implement the requirements. There is little benefit to 
the laboratory and no benefit to public health and safety in 
establishing expectations that laboratories cannot meet. Education 
is essential to the improvement process so as to empower laboratories 
to meet or exceed the minimum expectations. COLA takes its enforcement 
responsibilities very seriously and we are proud of our consistent 
track record in the appropriate enforcement of CLIA. We are also proud 
of our unwavering commitment to laboratory quality improvement. You are 
absolutely correct that COLA begins educating laboratories upon 
enrollment in our accreditation program. This approach allows us to 
have meaningful improvement-based interactions with our accredited 
laboratories over the course of their two year accreditation period. 
The onsite inspection is but one aspect of the program. As we described 
to you during the course of your examination, we continually monitor 
laboratory performance on Proficiency Testing and we regularly 
communicate with and educate laboratories before and after an onsite 
survey. 

COLA's comprehensive surveyor training program, in conjunction with 
individual surveyor exposure to hundreds of laboratories yearly, 
provides the COLA surveyors with a unique opportunity to share their 
knowledge and experience during a laboratory's onsite survey. It has 
always been the goal of COLA's Accreditation program to bring 
laboratories, particularly the smaller Physician Office Laboratory 
(POL) with its less experienced staff, into compliance with the law by 
a combination of approaches that identify the deficiencies present 
and share with the lab the correct way to assure quality patient 
testing. COLA then follows up on the identified deficiencies and 
requires an evidence-based response from the laboratory before its 
accreditation is approved or continued. COLA cites ALL problems in the 
lab. 

While COLA laboratory inspections are highly educational, we enforce 
100% of our [CMS-approved] accreditation requirements. 

Laboratory Preparation: 

We disagree with your assertion that allowing a laboratory to prepare 
for a survey masks the discovery of laboratory problems. We know of no 
research that would support such a conclusion. 

While much of a laboratory's evidence of compliance is documentary, 
there is little of this evidence that can be fabricated in a short 
period of time. Personnel qualifications, root cause analyses of "out 
of limit" Quality Control (QC), failed Proficiency Testing (PT), the 
release of patient results when QC is out of limits, incorrect 
frequency of QC, the use of expired reagents, proper specimen 
identification, proper report elements - are all virtually impossible 
to create retrospectively after a survey is scheduled. If contact from 
a laboratory's accrediting organization about its upcoming survey 
occurs with three or even six months' notice, it would only improve the 
lab's operation more quickly if the lab took that opportunity to review 
its self-assessment questions and correct any missing or incorrect 
processes or documentation. 

Clearly, laboratories that "fix" or complete records immediately prior 
to an announced onsite inspection (an example used in your report) have 
critical management and laboratory operations issues. Our surveyors are 
trained to spot these problems as well as others that may arise when a 
laboratory attempts to "fix" documents or data just before an onsite 
survey. 

As you know, documentation is only one part of the onsite assessment. 
The qualitative, interactive assessment of the laboratory, coupled with 
the ongoing participation in proficiency testing, provides COLA with a 
more accurate picture of the overall quality of the laboratory. 

It is important to note that, logistically, any reduction in the final 
notice of scheduled surveys to laboratories will likely raise the cost 
of inspecting laboratories by the national survey organizations. The 
increased costs will ultimately be borne by the laboratories themselves 
and will be further increased by any necessary rescheduling as a result 
of scheduled inspections that could not be performed because of a 
variety of reasons, including staff vacations, laboratory operating 
hours, etc. Moreover, the root causes of problems like those at Maryland 
General Hospital were not discovered by unannounced or short notice 
inspections; they were brought to light by whistleblowers. Lab personnel 
who could be potential whistleblowers may not be available during 
unannounced and short notice surveys. 

We completely agree with your conclusion that unannounced inspections 
in the smaller lab environment are impractical and have a negative 
impact on patient care. We have discussed the same with CMS, JCAHO and 
others. 

COLA exists to help improve laboratory medicine and patient care, the 
primary tenets of which are lasting improvement mechanisms and quality 
systems generated by committed, informed, and prepared laboratory 
directors and staff. CLIA's intent is to improve the quality of 
laboratory testing. As new technologies emerge and laboratory testing 
evolves, it is more important than ever that the industry understand the 
principles of quality laboratory testing and apply them correctly. 

Laboratory Quality has Indeed Improved: 

We were disappointed that you seemingly discounted the improved 
proficiency testing performance by COLA accredited laboratories and 
further intimated that overall laboratory quality has not improved. The 
percentage of COLA laboratories that fail PT has decreased. COLA is 
vigilant in the continual monitoring of proficiency testing performance 
by our laboratories. We educate laboratories on how to remedy 
proficiency testing problems and ultimately on how to ensure that all 
tests are performed in a controlled and analytically sound fashion. 
COLA is proud of the fact that our program is having a positive impact 
on laboratories and on patient care. 

Statistically, your assertion that the explanation for improved PT 
performance by COLA labs is that poorly performing laboratories have 
withdrawn from oversight (by performing only waived testing) is 
remarkable on the following counts: 

1) It should be encouraging that poorly performing laboratories that 
are beyond help are °weeded out" and no longer perform clinical 
laboratory testing other than waived procedures. As you know, COLA 
takes PT performance very seriously and requires laboratories to cease 
testing problem analytes or specialties on the basis of systemic and 
continual PT failures. 

2) COLA now accredits more laboratories than it has in the past 10 
years. We are delighted to see our rolls of accredited laboratories 
filled by conscientious, quality-minded laboratories. 

Data that COLA provided to the GAO (but that were not used in the draft 
report) show that, in general, condition-level deficiencies declined in 
laboratories that have been surveyed over multiple years. We view this 
as evidence that the quality of laboratories subject to continual and 
regular oversight has improved. 

We also note that since the advent of CLIA, there have been a number of 
contributing factors which affect both the smaller and more 
sophisticated laboratory and their performance on Proficiency Testing. 
For example, testing technologies and reagent stability have improved 
tremendously, especially in consistency, reliability, and ease of use. 
Proficiency testing results should have been expected to improve. The 
fact that they have not improved in larger labs may be related to the 
difficulties in attracting and retaining a well-trained workforce in 
laboratory medicine. 

CMS Improvements: 

In general, we believe that your recommendations for how CMS can 
improve the oversight of survey organizations have merit. For example, 
we believe that CMS should hire sufficient staff. However, we believe 
that you have fundamentally misunderstood and/or overstated the 
shortcomings you have identified in the following: 

1) The validation process of survey organizations (by CMS) can 
certainly be improved. However, we feel that simultaneous validations 
are often of benefit and helpful to assuring some consistency and 
predictability between surveying groups. Simultaneous validations 
should not be assumed to be improper or ineffective. In fact, CMS and 
survey organizations have greatly improved their understanding of each 
other's processes because of simultaneous validations. This type of 
cooperation is critical to the effective public/private partnership 
that CLIA oversight has become. 

2) Furthermore, the report asserts that simultaneous validations are 
not independent and do not identify as many condition-level 
deficiencies. In our experience, most of the discrepant findings in 
non-simultaneous validation reports are, in fact, incorrect because 
they are based on false assumptions. For example, CMS' validation 
process (by virtue of its structure) fails to recognize that COLA's 
continual monitoring and enforcement of PT performance is not 
confined to the onsite survey alone. In all, despite the potential 
improvements in the validation process, no accrediting organization 
has come close to the 20% discrepancy threshold defined in CLIA. 

3) We agree that the deeming approval and renewal process (by CMS of 
survey organizations) should be timely, consistent, standardized, and 
thorough; however, we caution that the GAO, Congress, and CMS respect 
that the approved accrediting organizations are all unique in approach 
and methodology. All survey organizations meet the CLIA-required 
validation thresholds; all thoroughly investigate complaints; and all 
take immediate action when risk-of-harm situations are evident. In 
short, we oversee labs differently, and laboratory quality has 
improved. We do not agree that standardized surveys would be inherently 
more accurate or appropriate for the wide range of laboratories and 
laboratory environments currently overseen by CMS and the accrediting 
organizations[Footnote 101]. 

Critical Factors: 

As we noted above, we are very pleased with the many accomplishments of 
the CLIA program and COLA's accreditation program in particular in 
improving laboratory quality over the years. Generally, quality of 
laboratory testing can be measured in two ways: 1) by evaluating the 
quality of laboratories and providing resources to assist them in 
correcting deficient practices; and 2) by preventing laboratories that 
do not meet quality standards from continuing to provide clinical 
laboratory testing. We are appropriately achieving this outcome. 

When CLIA was enacted, there were many problems that have since been 
rectified. COLA, as the first CLIA-approved accrediting organization, 
was designed to promote quality improvement and excellent patient care 
through an interactive approach combining effective enforcement, 
oversight, and education. When the CLIA regulations were promulgated, 
many laboratories were unfamiliar with the concepts of "quality 
assurance", "quality control", and "proficiency testing". We committed 
ourselves to a program of comprehensive surveyor training, coupled with 
consistent, efficient survey methodologies to instill a culture of 
quality in our accredited laboratories. COLA's surveyors, all of whom 
are employed by COLA, are cross-trained in multiple laboratory 
disciplines, quality systems, and more importantly, trained in 
communications, conflict management, investigation, and root cause 
analysis techniques. 

We applaud the GAO for noting the critical role played by trained, 
professional surveyors. 

Thank you for the opportunity to comment on this important report. 

Sincerely, 

Signed By: 

Douglas A. Beigel: 
Chief Executive Officer: 
COLA: 

[End of section] 

Appendix VII: Comments from the Joint Commission on Accreditation of 
Healthcare Organizations: 

Joint Commission on Accreditation of Healthcare Organizations: 
Setting the Standard for Quality in Health Care: 

May 15, 2006: 

Mr. David Walker: 
Comptroller General: 
Government Accountability Office: 
441 G Street, N.W. 
Washington, DC: 

Dear Mr. Walker: 

We would like to thank the Government Accountability Office for the 
opportunity to review the draft report CLINICAL LAB QUALITY: CMS and 
Survey Organization Oversight Should be Strengthened (GAO-06-416). 
Soliciting the views of entities that are the subject of your review 
helps to ensure accuracy and provides context for your assessment. The 
Joint Commission congratulates the GAO on its efforts made in 
conducting this study of the quality of testing in our nation's 
clinical laboratories, the effectiveness of survey organization 
oversight of laboratory performance, and CMS' oversight of the CLIA 
program. 

Ensuring that our nation's laboratories are providing high quality and 
safe services is one of the Joint Commission's highest priorities. Most 
clinical diagnoses are based on the results of lab tests; yet, the 
level of quality in our nation's laboratories often goes unnoticed 
because these health care providers rarely have direct patient contact. 
Recognizing the critical and essential importance of laboratory 
services, the Joint Commission has designated the laboratory as an 
essential service and therefore factors the laboratory's accreditation 
status into the hospital's accreditation decision. This policy 
underscores the patient care implications of laboratory quality and 
the need for hospital leadership to pay particular attention to 
laboratory processes and outcomes. 

We would next like to comment on the GAO recommendation that CMS 
standardize the categorization and reporting of survey findings. While 
this recommendation has the theoretical potential to simplify 
administrative oversight of the program, it has several serious 
shortcomings. First, compliance with this GAO recommendation would 
require revamping of our entire accreditation system. GAO fails to 
recognize that the Joint Commission, like its colleague accrediting 
bodies, uses a different and more sophisticated approach to assess 
laboratory performance. This 
recommendation essentially assumes that CLIA requirements and 
categorizations are the "gold standard;" however our experience has 
shown that the Joint Commission's systems approach is much more 
effective in identifying the causes of performance problems and helping 
labs correct the underlying factors that contribute to these problems. 
The Joint Commission's approach also drives sustained improvement in 
lab performance. Second, the concept underlying federal reliance on 
private sector accreditation is that it provides a level of flexibility 
not available in a regulatory environment. When Congress established 
the accreditation option, with the caveat that the relevant standards 
"meet or exceed" federal regulations, it recognized that other 
approaches can be more innovative and more effective at ensuring 
quality and patient safety. As an accrediting organization, the Joint 
Commission is not a contractor to CMS, in contrast to the CMS 
relationship with state survey agencies. 

Notwithstanding the foregoing, the Joint Commission believes that CMS 
could and should play a role in developing a common, agreed-upon 
taxonomy that could be used by all laboratory survey organizations to 
track serious deficiencies. As the draft GAO report notes, state survey 
agency determinations that condition-level requirements are out of 
compliance are highly subjective and, by their nature, inconsistent. If 
all survey organizations were to agree on criteria as to what 
constitutes a serious deficiency, this would create the desired 
comparability without requiring accrediting organizations to change the 
ways in which they categorize and report findings. We believe that the 
GAO should replace its current recommendation to standardize the 
categorization and reporting of survey findings with a new one that 
directs CMS to take the lead in coordinating a joint effort to develop 
a common definition of what constitutes a serious deficiency. 

The Joint Commission further disagrees with the GAO recommendation that 
CMS uniformly impose more sanctions on labs with repeat condition-level 
deficiencies. A number of variables may contribute to circumstances in 
which a lab is found to have consecutive condition-level deficiencies 
for the same requirement, including a lack of sufficient resources or a 
lack of understanding of the tools needed to fix the problem. Further, 
the same condition may be found to be out of compliance as a result of 
the contribution of different standards to the overall deficiency. 
Determining when to employ a punitive versus an educational (or 
collaborative) approach to promoting compliance can be a difficult 
judgment. Quality experts maintain that an educational approach is the 
best way to evaluate weaknesses and achieve and sustain improved 
performance over time. The following three types of behaviors have been 
identified as contributing to poor performance[Footnote 102]: 

* Human error. Inadvertent action; inadvertently doing other than what 
should have been done; slip, lapse, or mistake. 

* At-risk behavior. Behavior that increases risk, where risk is not 
recognized, or is mistakenly believed to be justified. 

* Reckless behavior. Behavioral choice to consciously disregard a 
substantial and unjustifiable risk. 

The most appropriate way to manage reckless behavior is through 
remedial (or disciplinary) action, such as sanctions; however, we 
contend that most labs with consecutive condition-level deficiencies 
are actually exhibiting at-risk behavior. The best way to manage at- 
risk behavior is to remove the negative incentives, create incentives 
for healthy behaviors, and increase situational awareness. The 
patient safety literature overwhelmingly supports the conclusion that 
punishment encourages organizations to cover up problems[Footnote 103, 
104]. Thus, the Joint Commission believes that GAO's call for CMS to impose 
more sanctions on laboratories with repeat condition-level deficiencies 
is likely to be counterproductive. 

Next, to ensure consistency of laboratory oversight by survey 
organizations, the GAO recommends that all survey organizations develop 
and require labs to prominently display posters that instruct lab 
workers on how to file anonymous complaints. The Joint Commission 
believes that this recommendation is too narrow and prescriptive, and 
may inadvertently limit organizations from using other more effective 
ways to educate lab workers on how to file a complaint. Furthermore, 
GAO's commentary on the increase in complaints received by CAP after it 
required the display of posters neglects to recognize a broad national 
trend. During the same period, the Joint Commission also experienced a 
dramatic increase in lab-related complaints. The number of complaints 
the Joint Commission received rose from 44 in 2004 to 69 in 2005, an 
increase of about 57 percent, without any requirement for displaying 
posters. 

We further believe that GAO has misinterpreted its validation survey 
data. It concludes that "independent" surveys-more commonly referred to 
as look-behind surveys-are more effective than simultaneous surveys in 
identifying condition-level deficiencies that were missed by 
accrediting organizations. However, the data presented in the draft 
report do not support this assertion. The Joint Commission estimates 
that 3 percent of the simultaneous validation surveys resulted in 
findings of condition-level deficiencies missed by accrediting 
organization surveyors, compared to the identification of such findings 
in 5 percent of the "independent" validation. Thus, the proportion of 
condition-level findings is roughly equivalent in both types of 
surveys. 

Finally, while the GAO's lengthy and detailed review addresses many 
issues associated with laboratory quality, it fails to address a 
long-acknowledged shortcoming of CLIA requirements: the qualifications 
and 
supply of lab personnel. The Joint Commission believes that the 
personnel standards currently required by CLIA are insufficient to 
adequately protect patients and the public health. For example, CLIA 
requires only an Associate degree and minimal lab training to perform 
tests of high complexity and has no personnel requirements for waived 
tests. Today, the problems underlying failures in laboratory 
performance that are most commonly cited by experts in the field are 
the growing shortage of laboratory technologists and the inadequacy of 
their training. These shortcomings become especially glaring in the 
face of the expanding array and increasing complexity of laboratory 
tests in hospitals. By not addressing this serious regulatory 
shortcoming in the scope of its review, GAO has missed an important 
opportunity to leverage potential improvements in laboratory 
performance and protect the public interest. 

The Joint Commission is submitting technical comments on the GAO draft 
report as an attachment to this letter. If you have any questions 
concerning these comments, please contact Trisha Kurtz of my staff at 
202.783.6655. 

Sincerely, 

Signed By: 

Dennis O'Leary: 
President and CEO: 

Enclosure: 

[End of section] 

Appendix VIII: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Leslie G. Aronovitz (312) 220-7600 or aronovitzl@gao.gov: 

Acknowledgments: 

In addition to the contact named above, Walter Ochinko, Assistant 
Director; Lucia P. Fort; Dan Lee; Kevin Milne; Dean Mohs; Elizabeth T. 
Morrison; Michelle Rosenberg; and Elizabeth Scherer made key 
contributions to this report. 

[End of Section] 

FOOTNOTES 

[1] Medicare is a federal health care program serving elderly and 
disabled individuals. 

[2] Clinical Laboratories Improvement Act of 1967, Pub. L. No. 90-174, 
§5, 81 Stat. 533, 536. 

[3] Pub. L. No. 100-578, 102 Stat. 2903. 

[4] Appendix I provides examples of the effect of lab errors on patient 
health. 

[5] Labs obtain a CLIA certificate that corresponds to the complexity 
of the testing they conduct. Generally, each lab has one certificate 
but a large hospital with multiple labs may have a corresponding number 
of certificates. By regulation, labs that are within a hospital campus 
and under common direction are allowed to file either a single 
application for a certificate or multiple applications for multiple 
certificates. 

[6] CMS contracts with state survey agencies in the District of 
Columbia and 49 states (including New York but not Washington) to 
survey labs under CLIA requirements. Labs in Washington are surveyed 
either by the state's CLIA-exempt program or by an accrediting 
organization. Labs in New York are surveyed either by the state survey 
agency or New York's CLIA-exempt program. New York does not authorize 
accreditation as a basis for lab licensure. 

[7] Throughout this report, we use the term "survey organizations" when 
referring collectively to state survey agencies, the two state CLIA- 
exempt programs, and accrediting organizations. 

[8] In addition to the results of state survey agency lab inspections, 
CMS's OSCAR database stores other information on labs registered under 
the CLIA program including: (1) labs' CLIA certificate history, such as 
switching from being inspected by accrediting organizations to being 
inspected by state survey agencies; (2) the results of complaint 
investigations; (3) labs' billing history; and (4) proficiency testing 
enrollment and performance data. 

[9] COLA was formerly known as the Commission on Office Laboratory 
Accreditation. 

[10] New York, Philadelphia, and Seattle. 

[11] California, Colorado, Connecticut, Idaho, Louisiana, Michigan, 
Nebraska, North Carolina, Pennsylvania, and South Carolina. 

[12] A 2004 complaint investigation conducted by the Maryland state 
survey agency found that personnel at this lab falsified records to 
conceal problems with HIV and hepatitis testing equipment and that the 
lab provided hundreds of patients with potentially erroneous test 
results. These problems were not detected during the 2003 CAP 
inspection or a prior complaint investigation conducted by the state 
survey agency in 2002. 

[13] Implementation of CLIA was phased in over a number of years. CLIA 
does not apply to forensic laboratories, research laboratories that do 
not report patient-specific results, drug testing laboratories 
certified by the Substance Abuse and Mental Health Services 
Administration, and Veterans Administration laboratories. 

[14] Labs surveyed by either the New York or Washington CLIA-exempt 
programs do not obtain a CLIA certificate and do not pay fees to CMS. 
According to CMS, labs are billed a year in advance of the surveys to 
provide adequate time for them to pay their fees and for states to 
perform surveys prior to the expiration of labs' CLIA certificates. 

[15] In January 2000, the Food and Drug Administration assumed 
responsibility for categorizing tests conducted by labs from the 
Centers for Disease Control and Prevention. 

[16] Pregnancy and blood sugar screenings are examples of such tests. 
Labs conducting waived tests are only required to follow manufacturers' 
instructions and to limit testing to Food and Drug Administration- 
approved or cleared methods. 

[17] Known as provider-performed microscopy procedures, such tests must 
be performed by a physician or other qualified provider as defined in 
CLIA regulations. Labs conducting such tests are required to have 
written procedures for the tests they perform and must also satisfy 
applicable proficiency testing requirements and have a system to ensure 
the competency of testing personnel. These labs and those performing 
waived tests are subject to complaint investigations. See 42 C.F.R. § 
493.1773(f)(2005). 

[18] From 1998 through December 2005, the proportion subject to surveys 
has declined from about 30 percent to about 19 percent, while the 
proportion of labs that are not surveyed because they perform waived 
testing has increased from 70 percent to 81 percent. 

[19] Proficiency testing providers are private companies or state lab 
departments and are approved by CMS annually. 

[20] Some labs, including Indian Health Service labs, are surveyed by 
federal surveyors located in CMS's regional offices. 

[21] CMS contracts with state survey agencies in the District of 
Columbia and 49 states (including New York but not Washington) to 
survey labs under CLIA requirements. 

[22] Prior to CLIA, CMS was not required to routinely determine the 
equivalency of accrediting organization and state CLIA-exempt program 
requirements. 

[23] State survey agencies may propose the imposition of sanctions for 
noncompliance but only CMS can impose sanctions. Accrediting 
organizations and exempt-state programs may revoke accreditation or 
remove a lab's state license, respectively, for noncompliance with 
their CLIA-equivalent requirements. While CMS regional office staff 
determine whether loss of accreditation should also result in 
revocation of a lab's CLIA certificate, loss of state licensure is 
tantamount to CLIA certificate revocation. 

[24] Because of congressional concern that available remedies were too 
limited and could dissuade CMS from enforcement, CLIA gave CMS 
additional tools, called alternative sanctions, to help motivate labs 
to comply with quality requirements. 

[25] The Centers for Disease Control and Prevention is also responsible 
for carrying out certain CLIA-related tasks, including (1) developing 
and evaluating technical standards for lab testing components; (2) 
working with CMS and the Food and Drug Administration to determine the 
regulatory impact of lab technical standards; (3) conducting lab 
research and analysis; and (4) facilitating the CLIA Advisory 
Committee, which makes recommendations to improve the CLIA program. 

[26] Unlike validation reviews of accrediting organization surveys, CMS 
refers to the validation of state surveys as Federal Monitoring 
Surveys. Because of their similar objective, we refer to all such 
surveys as validation reviews in this report. We refer to validation 
reviews that occur at the same time as the lab survey as simultaneous. 
Conversely, validation reviews that occur after the lab survey are 
referred to as independent validations. 

[27] According to CMS, the criterion for identifying a missed 
deficiency is whether it is reasonable to conclude that a condition- 
level deficiency was present at the time a survey organization 
conducted its survey but was not noted in the survey organization's 
findings. 

[28] See 42 U.S.C. § 263a(e)(2)(D)(2000). By regulation, a similar 
requirement applies to validation reviews of labs under exempt-state 
programs. 42 C.F.R. § 493.563(b)(2)(2005). 

[29] CMS published the regulations for the new requirements in January 
2003 and surveyors began using the new requirements on January 12, 
2004. 

[30] For example, some condition-level requirements were reorganized 
and some were consolidated. 

[31] When we asked for access to backup files, CMS told us that it did 
not have backup files of the original pre-2004 survey data. 

[32] We excluded survey results for the period January 1 through 
January 11, 2004, because CMS modified OSCAR data for findings prior to 
January 12, 2004, to reflect revised CLIA requirements. 

[33] Although CMS reviews the requirements of exempt-state programs and 
accrediting organizations to ensure that they are at least equivalent 
to CLIA's, there is not necessarily a one-to-one match with CLIA 
requirements. Thus, one CLIA condition-level requirement may equal 
several accrediting organization requirements or vice versa. For 
example, CMS's condition-level requirement for successful lab 
participation in approved proficiency testing corresponds to at least 
19 CAP, 3 COLA, and 4 JCAHO requirements. 

[34] This effort took CAP and COLA about 4 months to complete. 

[35] CMS does require unannounced surveys for: (1) complaint 
investigations, (2) follow-up surveys conducted to verify correction of 
deficiencies, and (3) nonroutine surveys conducted when there is reason 
to believe a lab is operating in a manner that constitutes a risk to 
human health. In contrast, all nursing home surveys are required to be 
unannounced to help ensure that homes do not cover up problems that may 
exist when surveyors are not present. See GAO, California Nursing 
Homes: Care Problems Persist Despite Federal and State Oversight, GAO/ 
HEHS-98-202 (Washington, D.C.: July 27, 1998). 

[36] JCAHO implemented its unannounced surveys in January 2006 and CAP 
began phasing in unannounced inspections in the spring of 2006. CAP and 
JCAHO will continue to provide Department of Defense and prison labs 
that they survey with advance notice to enable surveyors to obtain the 
security clearances required to enter such facilities. 

[37] Our analysis excluded state survey agencies that inspect 
relatively few labs because even a small change in the number of labs 
with condition-level deficiencies can produce a large percentage-point 
change. 

[38] While checklists may be useful, some state survey agencies told us 
that use of the checklists may result in insufficient probing and 
observation. CAP officials told us that they plan to move beyond their 
current emphasis on requiring documentation of lab processes by 
adopting probing techniques that require direct interaction with lab 
staff, observation of testing, and new survey tools to guide inspectors 
in assessing compliance with requirements. COLA's survey process includes 
a list of questions that surveyors must answer by asking probing 
questions of, and interacting with, lab staff. JCAHO's process, 
introduced in 2004, uses computer-based algorithms when making 
compliance determinations. 

[39] When state survey agencies cite condition-level deficiencies, the 
CMS regional office for that state determines what, if any, sanctions 
should be imposed. 

[40] One example cited by a state survey agency involved a complaint 
investigation of a transfusion-related fatality, the result of a lab 
worker mixing up patient samples. Because the lab had already 
instituted extensive corrective actions by the time the surveyor 
arrived, the survey agency cited a standard-level deficiency for 
documentation errors rather than a condition-level deficiency. We 
discussed this case with CMS officials who told us that because the 
problem had been addressed, there was essentially no condition-level 
deficiency to cite. 

[41] While combining the roles of educator and regulator may not be 
unique, it is different from the exclusively regulatory role state 
surveyors under contract with CMS play for other provider groups, such 
as nursing homes and home health agencies. 

[42] Using OSCAR trend data, we were able to identify state survey 
agencies that educated instead of regulated home health agencies. See 
GAO, Medicare Home Health Agencies: Weaknesses in Federal and State 
Oversight Mask Potential Quality Issues, GAO-02-382 (Washington, D.C.: 
July 19, 2002). 

[43] This percentage includes both labs preparing for their initial 
survey and those with prior surveys. According to COLA officials, newer 
labs are most likely to perform the self-assessment; over time, they 
believe that the vast majority of COLA-inspected labs have completed 
the self-assessment. 

[44] Because of lab testing errors that led to women's deaths from 
cervical cancer, Congress required a specific type of proficiency 
testing for individuals who interpret the results of Pap smear tests, 
which requires examining glass slides under a microscope. Although CLIA 
was enacted in 1988, CMS told us that cost, the inability to find a 
national testing provider, and other technical issues delayed 
establishing a Pap smear proficiency testing program until 2005. 

[45] As of November 2005, CAP also employed 11 full-time surveyors. 
Historically, CAP staff surveyors were responsible for inspections of 
smaller labs that conduct less complex tests. Increasingly, staff 
surveyors will (1) accompany survey teams assigned to labs with a large 
number of deficiencies on their prior survey, and (2) assist teams in 
conducting either a lab's initial survey or the team leader's initial 
survey. Staff surveyors will also conduct more nonroutine surveys, such 
as investigating a complaint or following up on performance concerns 
raised during surveys. 

[46] Currently, CAP volunteer surveyors are encouraged to participate 
in surveyor training at least once every 3 years. In July 2006, CAP 
plans to begin requiring survey team leaders to complete mandatory 
training. Mandatory training for survey team members is targeted to 
begin in 2007. 

[47] In contrast, CAP staff surveyors complete a 6-month training 
program before they are allowed to conduct surveys independently. 

[48] In explaining this policy, CAP notes that it believes team leaders 
and inspectors will conduct the inspection of a competing lab 
professionally and in an objective manner. 

[49] In April 2006, CAP issued a revised conflict of interest policy 
that addresses these concerns. 

[50] According to CAP, 57 percent of surveys in 2004 were conducted by 
surveyors who worked in nearby labs. For example, surveyors who 
inspected a Maryland hospital lab from 1999 through 2003 worked in labs 
that were from 5 to 42 miles away. The remaining 43 percent were 
conducted by surveyors who did not work in nearby labs and who 
therefore required air travel to carry out the survey. 

[51] The modifications to OSCAR did not affect data on the number of 
complaints. The complaint information in OSCAR excludes complaints that 
do not require an on-site survey. 

[52] COLA does not have a formal whistle-blower policy. COLA officials 
told us that they promptly investigate all complaints, many of them 
from former lab employees, and keep the identity of the complainants 
anonymous. 

[53] H.R. 686, 109th Cong. (2005). 

[54] CAP plans to hire an additional staff person to investigate 
complaints. 

[55] Effective July 2005, JCAHO required labs to educate staff on how 
to report concerns about lab quality to the Joint Commission, but does 
not specify use of a poster to do so. 

[56] According to CMS, sanctions generally result from deficiencies 
identified during an inspection by a state survey agency. 

[57] Since CMS data list only the number of labs with proposed 
sanctions by year, this number may double-count labs that had proposed 
sanctions in multiple years. 

[58] Labs surveyed by states have 23 days to correct immediate jeopardy 
deficiencies, 90 days to correct all other condition-level 
deficiencies, and up to 12 months to correct standard-level 
deficiencies. CAP and COLA give labs 30 days to correct all 
deficiencies, and effective July 1, 2005, JCAHO reduced the time labs 
have to correct deficiencies from 90 to 45 days. Labs may also appeal 
proposed sanctions, and depending on the outcome of such appeals, 
sanctions may be dismissed. 

[59] Thirty-three states and the District of Columbia had at least one 
lab with the same repeat condition-level deficiency. 

[60] Twenty CAP-, 51 COLA-, and 10 JCAHO-inspected labs had their 
accreditation revoked. After notice of revocation of accreditation, a 
lab retains its CLIA certificate and may continue to test specimens for 
45 days while a state survey agency inspects the lab and makes a 
recommendation concerning the lab's continued participation in the CLIA 
program to the responsible CMS regional office, unless CMS takes action 
sooner. Potential outcomes include (1) termination from the CLIA 
program; (2) determination that the lab meets CLIA requirements because 
they are less stringent than those of the accrediting organization, 
resulting in the lab switching to state survey agency oversight; (3) 
the lab's return to compliance and reapplication to be surveyed by an 
accrediting organization; or (4) cessation of moderate- to high- 
complexity testing and assumption of a CLIA certificate that only 
allows less complex, waived testing. 

[61] We created a database for the sanctions data contained in CMS's 
annual lab registry reports to Congress from 1998 through 2004 and were 
able to identify labs that both lost accreditation and had a sanction 
imposed. 

[62] Pub. L. No. 100-578, § 2, 102 Stat. at 2911, codified at 42 U.S.C. 
§ 263a(i)(4)(2000). 

[63] In addition to suspension of or limits on testing, COLA uses 
directed plans of correction, such as requiring a lab to hire a 
consultant or participate in specific training. 

[64] Pub. L. No. 100-578, § 2, 102 Stat. 2903, 2907-08, 42 U.S.C. § 
263a(f)(3)(2000). 

[65] H.R. Rep. 100-899 at 28 (1988) reprinted in 1988 U.S.C.C.A.N. 
3828, 3849. 

[66] See 57 Fed. Reg. 7002, 7128-29 (1992). Prior to 1992, CMS required 
proficiency testing quarterly. See 55 Fed. Reg. 9538 (1990). 

[67] According to a CMS official, the adoption of less frequent 
proficiency testing was accompanied by an increase in the number of 
specimens subject to proficiency testing from two every 3 months to 
five every 4 months. 

[68] The committee report provided examples of technical and scientific 
considerations justifying an exception to the quarterly testing 
requirement. Those examples also stress the significance of excepting 
tests from the quarterly testing requirement on the basis of 
circumstances presented by the individual test. See H.R. Rep. 100-899 
at 29 (1988), 1988 U.S.C.C.A.N. at 3850. That quarterly testing is 
intended to be the norm is further evidenced by the committee report's 
recommendation that the number of quarters that a laboratory had failed 
to pass proficiency testing determine the severity of sanctions 
imposed. See id. at 30, 1988 U.S.C.C.A.N. at 3851. 

[69] For example, CAP typically reorganized, consolidated, or changed 
its requirements two times a year. As a result of these changes, about 
1,000 requirements were removed and about 1,200 requirements were added 
over 5 years. COLA has changed its requirements five times since it was 
first approved by CMS in 1993. JCAHO officials stated that they make 
changes to their requirements each year. 

[70] See 42 C.F.R. § 493.573(a)(3)(2005). 

[71] On January 1, 2004, JCAHO launched a new accreditation process, 
which included a substantial consolidation of lab requirements. 
Additionally, JCAHO began using new methodologies, including a new 
software program that analyzes data to help focus on-site surveys on 
priority areas and a tracer methodology to track patients and specimens 
through the continuum of lab services. 

[72] According to JCAHO, the organization plans to change its survey 
process to include a review of a random sample of the other records and 
documents over the entire 24-month period. 

[73] As noted earlier, CAP plans to begin requiring mandatory surveyor 
training in mid-2006. 

[74] Simultaneous surveys resemble "observational" federal oversight 
surveys conducted at nursing homes. We previously reported that such 
surveys, in which federal surveyors accompany and observe state 
surveyors during the routine inspection of a nursing home, are not a 
realistic assessment of state surveyor performance because CMS's 
presence may make state surveyors more attentive to their survey tasks 
than when they are not being observed, a phenomenon known as the 
Hawthorne effect. See GAO, Nursing Home Care: Enhanced Oversight of 
State Programs Would Better Ensure Quality, GAO/HEHS-00-6 (Washington, 
D.C.: Nov. 4, 1999). 

[75] These validation reviews include both exempt-state and state 
survey agency lab surveys. 

[76] State survey agencies employ 96 full-time-equivalent staff to 
survey labs. 

[77] CMS did not begin tracking this information until August 2003. 

[78] Both survey organizations submit their findings to CMS, which then 
compares the findings to determine whether accrediting organization 
surveyors missed any condition-level deficiencies. Examples of 
condition-level deficiencies missed by accrediting organizations 
include: (1) lab did not correctly calculate the results of tests used 
to monitor patients using a blood-thinning medication, which could 
result in serious medical complications such as internal bleeding; (2) 
lab did not follow manufacturer's instructions for calibration checks 
of certain test equipment; and (3) lab director failed to provide 
overall management and direction of lab, such as ensuring timely 
enrollment in a proficiency testing program and corrective actions 
following a proficiency testing failure. 

[79] Our nursing home work found a similar absence of analysis by CMS 
regarding trends in serious problems identified by state surveyors. See 
GAO, Nursing Home Quality: Prevalence of Serious Problems, While 
Declining, Reinforces Importance of Enhanced Oversight, GAO-03-561 
(Washington, D.C.: July 15, 2003). 

[80] Officials from a state survey agency told us that, while they do 
not want to receive routine survey reports from accrediting 
organizations, they do want to receive information when accrediting 
organizations identify problem labs--before a lab loses its 
accreditation and becomes the responsibility of the state survey 
agency. When an accrediting organization revokes a lab's accreditation, 
the state survey agency becomes responsible for determining whether to 
recommend revocation of the lab's CLIA certificate to the appropriate 
CMS regional office. 

[81] In response to communication problems highlighted by the complaint 
investigations at a Maryland hospital lab, CMS has been meeting 
regularly with officials from exempt-state programs, accrediting 
organizations, state survey agencies, and its regional offices to 
discuss and improve coordination and data sharing--particularly the 
handling of complaints. The Maryland state survey agency identified 
deficiencies during a 2004 complaint investigation but did not inform 
CAP, the organization responsible for surveys of the Maryland hospital 
lab. CAP officials told us that they first learned about the 2004 
complaint investigation that revealed problems with HIV and hepatitis 
testing equipment from newspapers. 

[82] According to a CMS official, the Seattle regional office was in 
the forefront of developing a set of performance standards for 
Washington's CLIA-exempt program. The standards, though not identical 
to those implemented for state survey agencies, have been in place for 
several years. As of November 2004, the New York regional office was 
developing similar performance standards for the New York CLIA-exempt 
program. 

[83] The 13 areas are personnel qualifications, financial management, 
completion of workload targets, survey selection and scheduling, 
outcome-oriented survey process, acceptable plan of correction, 
complaints, ongoing training activities, data management, survey time 
frames, proficiency testing desk review, principles of documentation, 
and enforcement. 

[84] In explaining the purpose of the performance reviews to state 
survey agencies, CMS noted that the reviews were designed to serve as 
an additional opportunity to further the agency's efforts to educate 
and support state survey agencies. The goal is to promote 
optimal performance by identifying areas needing improvement and 
corrective action. Survey agencies are expected to have systems in 
place for monitoring and evaluating the efficiency of their corrective 
actions. 

[85] During desk reviews, state survey agency staff track the 
proficiency testing results of labs using data reports from proficiency 
testing providers and request that labs initiate corrective actions 
when the results fall below certain thresholds with some frequency. 

[86] We asked CMS officials to provide the data points for fig. 1 in 
the agency's comments on our draft report. The data provided by CMS 
show that the percentage of labs with condition-level deficiencies 
remained relatively constant over time--fluctuating between 6 and 8 
percent from 1996 through 2005. However, the pre-2004 and post-2004 
data are not comparable because CMS revised CLIA survey requirements in 
January 2004. Additionally, the data likely understate the actual 
number of condition-level deficiencies. As noted earlier, state survey 
agencies do not consistently cite all condition-level deficiencies 
identified during inspections and CMS has instructed states not to 
include deficiencies related to the new 2003 lab quality control 
requirements in a lab's survey report. 

[87] Generally, our analysis focused on the period 1998 through 2004, 
recognizing that the early years of CLIA implementation were probably 
atypical because the law expanded oversight to previously unregulated 
physician office labs and many labs shifted from more complex testing 
that required routine inspections to less complex testing that did not. 

[88] See 42 C.F.R. § 493.553(a)(1)(2005). 

[89] JCAHO expressed a similar concern. 

[90] See GAO, Nursing Homes: Additional Steps Needed to Strengthen 
Enforcement of Federal Quality Standards, GAO/HEHS-99-46 (Washington, 
D.C.: Mar. 18, 1999). 

[91] CMS's comments explained that its regulations require 30-day 
advance notice by accrediting organizations but not by exempt-state 
programs. 

[92] Our analysis of accrediting organization validation reviews 
covered fiscal year 1999 through fiscal year 2003. We excluded fiscal 
year 2004 because CMS had not yet completed its analysis of condition- 
level deficiencies missed by accrediting organizations. 

[93] JCAHO miscalculated the increase in complaints from 2004 to 2005. 
According to the data provided by JCAHO, the increase was 57 percent, 
not 64 percent. JCAHO's 2005 complaint data were not available when we 
initially collected data on complaints received and substantiated by 
survey organizations. 

[94] Of the 2 percent sample of waived laboratories (2,422) for FY 
2005, preliminary data indicate that 751 laboratories did not adhere to 
manufacturers' instructions. Of the 751 laboratories, 459 received a 
revisit. Of those 459, initial data indicate that more than 70 percent 
showed improved adherence to manufacturers' instructions after an 
initial determination of nonconformance on the first visit. 

[95] Improvements in the CMS information system have prevented the 
verification of data prior to 2004. However, the trend line (especially 
the substantial decline in deficiencies after 1994-1995 subsequent to 
CMS implementation of CLIA regulations) is consistent with a variety of 
other measures. 

[96] The initiative was stimulated by the Government Performance and 
Results Act (GPRA), enacted by Congress to promote performance 
improvement in areas of national concern. The GPRA goal included both 
(a) increasing the percentage of labs that fully enroll their staff in 
proficiency testing, and (b) improving the proficiency of the 
laboratories, as measured by the results of proficiency tests. 

[97] For example, we might equate an accrediting organization's 
requirement for proficiency testing enrollment with CMS's CLIA condition-
level requirement for proficiency testing enrollment. Beneath the 
surface, however, we must be aware that proficiency testing enrollment 
applies to the laboratory's enrollment in proficiency testing for a 
great many potential analytes. If an accrediting organization-to-CLIA 
linkage is based only on lack of enrollment in testing, regardless of 
how many analytes were omitted, the assessment of quality would be 
woefully incomplete, since it would not capture all the serious 
deficiencies that have occurred. In our CLIA 
validation review of accreditation organizations, a CMS team manually 
reviews and compares the entire narrative findings of the CLIA 
validation inspections to those of the accrediting organization 
inspections. The entire narratives are compared and not limited to 
whether or not the accrediting organization found a deficiency in 
proficiency testing enrollment. Otherwise, the picture would be 
incomplete and our review would be inadequate. We need to know more 
about the analytes involved. If the CLIA validation inspection found 
that the laboratory failed to enroll in proficiency testing for 2 
analytes, e.g., prothrombin time and glucose, but the accrediting 
organization inspection found that the laboratory failed to enroll in 
proficiency testing for only 1 of those 2 analytes, prothrombin time, 
the review identifies an inadequacy on the part of the accrediting 
organization--the accrediting organization inspection has failed to 
identify a serious flaw in the laboratory's practices that can 
negatively impact the quality of the laboratory's testing, with 
outcomes as severe as death. In a worst-case scenario, the laboratory's lack 
of enrollment in proficiency testing for the analyte glucose can result 
in inaccurate and unreliable testing results, which could affect the 
health status of a diabetic patient. The flawed testing results could 
directly result in patient fatality from diabetic shock. If the 
laboratory performs thousands of tests each year under those 
circumstances, thousands of patients are at risk. 

[98] CMS provides laboratories with an opportunity to correct their 
problems prior to the imposition of sanctions. If the problem 
represents a threat to patient health and safety, then the time frame 
for correction is either very short or the laboratory is required to 
cease testing. Most laboratories find the threat of sanctions to be an 
enormous incentive and quickly correct their problems. The desired 
outcome in CLIA is regulatory compliance, high quality, and prompt and 
effective remedy of problems. For CMS-certified laboratories, 396 
laboratories received a notice of a proposed sanction and, of those, 93 
failed to take prompt corrective action and had sanctions imposed in 
2005. The 2005 Laboratory Registry contains 236 laboratories listed for 
all oversight entities as having sanctions imposed. The number of cases 
in which sanctions were threatened is approximately four times the 
sanction level, indicating that in the vast preponderance of cases the 
laboratories responded quickly to the potential for sanctions. CMS also 
assessed $4.4 million in civil monetary penalties. 

[99] CMS data consistently indicate approximately 200 complaints 
alleged per year. This relatively low number may suggest either that 
quality is good or that clients and workers do not know the avenues by 
which to lodge a complaint. 

[100] The CLIA regulations do not require that an accreditation 
organization's or exempt State's standards be the same as CLIA's. Rather, 
the accreditation organization and exempt State's requirements, taken 
as a whole, must be equivalent to or more stringent than those of CLIA. 
The majority of the deemed organizations and exempt States' 
requirements are at a level that elevates the quality of testing and 
the standard of practice. CLIA, on the other hand, represents minimum 
requirements, and is sometimes less rigorous than the routine standard 
of clinical laboratory practice. 

Because their standards can be more stringent than CLIA, the 
accrediting organizations and exempt States can hold the labs to higher 
quality requirements. For example, CAP requires proficiency testing for 
all analytes, not just those that are specified at Subpart I, and the 
JCAHO has quality standards for waived tests. Standardization would 
make our reviews easier, but would weaken the accreditation 
organization standards that are more stringent than CLIA, restrain 
marketplace-enriching standard development, and change their unique 
corporate identity and organizational autonomy. 

[101] No evidence has been presented to show that the oversight as 
described in the regulation has not been sufficient. CMS should provide 
for an effective process to review accrediting organizations at time of 
approval and renewal. If process and procedure review were adequate, 
there would not be any question of equivalency. Approving laboratory 
accrediting organizations should be likened to accrediting colleges--it 
is done on a periodic basis, and every college does not provide the 
exact same path to a Bachelor of Science degree. Every laboratory 
accrediting organization does not need to provide the same path to 
accreditation as long as the agreed-to minimums are met. 

[102] See: The Just Culture Community, located at www.justculture.org. 

[103] Leape L, Bates DW, Cullen DJ, et al. Systems analysis of adverse 
drug events. JAMA 1995; 274:35-43.
 
[104] Reason JT. Understanding adverse events: The human factor. In: 
Vincent C, ed. Clinical Risk Management: Enhancing Patient Safety. 2nd 
ed. London: BMJ; 2001:9-30. 

GAO's Mission: 

The Government Accountability Office, the investigative arm of 
Congress, exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics. 

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading. 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office 

441 G Street NW, Room LM 

Washington, D.C. 20548: 

To order by Phone: 

Voice: (202) 512-6000: 

TDD: (202) 512-2537: 

Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm 

E-mail: fraudnet@gao.gov 

Automated answering system: (800) 424-5454 or (202) 512-7470: 

Public Affairs: 

Jeff Nelligan, managing director, 

NelliganJ@gao.gov 

(202) 512-4800 

U.S. Government Accountability Office, 

441 G Street NW, Room 7149 

Washington, D.C. 20548: