This is the accessible text file for GAO report number GAO-02-279 
entitled 'Nursing Homes: Federal Efforts to Monitor Resident 
Assessment Data Should Complement State Activities' which was released 
on February 15, 2002. 

This text file was formatted by the U.S. General Accounting Office 
(GAO) to be accessible to users with visual impairments, as part of a 
longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the 
printed version. The portable document format (PDF) file is an exact 
electronic replica of the printed version. We welcome your feedback. 
Please E-mail your comments regarding the contents or accessibility 
features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

United States General Accounting Office: 
GAO: 

Report to Congressional Requesters: 

February 2002: 

Nursing Homes: 

Federal Efforts to Monitor Resident Assessment Data Should Complement
State Activities: 

GAO-02-279: 

Contents: 

Letter: 

Results in Brief: 

Background: 

Only Eleven States Conduct Separate On-Site or Off-Site Reviews of MDS 
Accuracy: 

States Attempt to Improve MDS Data Accuracy through On-Site Reviews, 
Training, and Other Remedies: 

CMS' MDS Review Program Could Better Leverage Existing State and 
Federal Accuracy Activities: 

Conclusions: 

Recommendations for Executive Action: 

Agency and State Comments and Our Evaluation: 

Appendix I: Summary of State On-Site MDS Reviews As of January 2001: 

Appendix II: Comments from the Centers for Medicare and Medicaid 
Services: 

Tables: 

Table 1: States with and without MDS Review Programs as of January 
2001: 

Table 2: MDS Assessments with Errors in Five States with On-Site MDS 
Review Programs: 

Table 3: Implementation Schedule for CMS' MDS Accuracy Review Program: 

Figures: 

Figure 1: MDS Elements Identified By Nine States As Having High 
Potential for MDS Errors: 

Abbreviations: 

ADL: activities of daily living: 

CMS: Centers for Medicare and Medicaid Services: 

DAVE: data assessment and verification: 

HCFA: Health Care Financing Administration: 

HHS: Health and Human Services: 

MDS: minimum data set: 

OIG: Office of Inspector General: 

OASIS: Outcome and Assessment Information Set: 

PPS: prospective payment system: 

SNF: skilled nursing facilities: 

[End of section] 

United States General Accounting Office: 
Washington, DC 20548: 

February 15, 2002: 

The Honorable Charles E. Grassley: 
Ranking Minority Member: 
Committee on Finance: 
United States Senate: 

The Honorable Larry Craig: 
Ranking Minority Member: 
Special Committee on Aging: 
United States Senate: 

Nursing homes play an important role in the health care system of the 
United States. More than 40 percent of elderly Americans will use a 
nursing home at some time in their lives. Such facilities provide 
skilled nursing, therapy, or supportive care to older individuals who 
do not need the intensive medical care provided by hospitals, but for 
whom receiving care at home is not feasible. Under the Medicare and 
Medicaid programs, nursing homes were expected to receive $58 billion 
in 2001, with a federal share of approximately $38 billion. Nursing 
homes that participate in these programs are required to periodically 
assess the care needs of residents in order to develop an appropriate 
plan of care. Such resident assessment data are known as the minimum 
data set (MDS).[Footnote 1] The federal government contracts with 
states to periodically inspect or survey nursing homes, and state 
surveyors use MDS data to help assess the quality of resident care. 
[Footnote 2] Medicare and some state Medicaid programs also use MDS 
data to adjust nursing home payments to account for variation in 
resident care needs. 

Thus, the accuracy of MDS data has implications for the identification 
of quality problems and the level of nursing home payments. 

MDS accuracy is one of many areas that state surveyors are expected to 
examine during periodic nursing home surveys. Federal guidance for 
state surveyors regarding the accuracy of MDS assessments focuses on 
whether appropriate personnel completed or coordinated the assessments 
and whether there are any indications that the assessments were 
falsified. This guidance also instructs surveyors to conduct a check of 
specific MDS items to ensure that the resident's condition is 
appropriately characterized. Concerns exist, however, that state 
surveyors already have too many tasks and that, as a result, the 
survey process may not adequately address MDS accuracy. In addition, 
our prior work on nursing home quality issues has identified 
weaknesses in the survey process that raise questions about the 
thoroughness and consistency of state surveys.[Footnote 3] 

In response to your request, we assessed (1) how states monitor the 
accuracy of MDS data compiled by nursing homes through review programs 
separate from their standard nursing home survey process, (2) how 
states attempt to improve the data's accuracy where there are
indications of problems, and (3) how the federal government ensures 
the accuracy of MDS data. We surveyed the 50 states and the District 
of Columbia to determine whether states had a separate MDS review 
program—distinct from any MDS oversight that might occur during the 
periodic nursing home surveys performed by all states. We then 
conducted structured interviews with officials in 10 of the 11 states 
that indicated they had separate MDS review programs.[Footnote 4] We 
also interviewed staff from the Centers for Medicare and Medicaid 
Services (CMS), an agency within the Department of Health and Human 
Services (HHS) that manages the Medicare and Medicaid programs, who 
were responsible for developing the agency's MDS review program. 
[Footnote 5] In addition, we reviewed regulations, literature, and 
other documents relating to MDS data. We performed our work from 
December 2000 through January 2002 in accordance with generally 
accepted government auditing standards. 

Results in Brief: 

Eleven states have established separate MDS review programs, apart 
from their standard nursing home survey process, to monitor the 
accuracy of resident assessment data compiled by nursing homes. An 
additional seven states reported that they plan to do so. According to 
officials in the 10 states with MDS accuracy review programs in 
operation as of January 2001, these programs were established 
primarily because of the important role played by MDS data in setting 
Medicaid payments and identifying quality of care problems. While 
routine nursing home surveys provide an opportunity to examine the 
accuracy of MDS data, officials in some of the 10 states with separate 
MDS review programs told us that surveyors do not have sufficient time 
to focus on the data's accuracy because of other survey tasks. To 
assess the accuracy of the MDS data, 9 of the 10 states conduct 
periodic on-site reviews in all or a significant portion of their 
nursing homes. These reviews include checking a sample of a home's MDS 
assessments and determining whether the basis for the assessments is 
adequately documented in residents' medical records. In addition, 
these reviews often include interviews of nursing home personnel 
familiar with residents and observations of the residents themselves. 
Such corroborating evidence provides reviewers increased assurance 
that an MDS assessment accurately reflects the resident's condition. 
States with on-site review programs reported that the discrepancies 
they identified between MDS assessments and the supporting 
documentation, also called "MDS errors," typically resulted from 
differences in clinical interpretation or mistakes, such as a 
misunderstanding of MDS definitions. Two of the 10 states were able to 
tell us the amount of the recoupments they obtained from nursing homes 
due to Medicaid overpayments based on inaccurate MDS assessments. For 
example, West Virginia received $1 million from one nursing home 
relating to MDS errors associated with physical therapy services. 

States with separate MDS review programs identified a variety of 
approaches to improving MDS accuracy. State officials highlighted the 
on-site review process itself and provider education activities as 
their primary approaches. On-site reviews heighten facility staff 
awareness of the importance of MDS data and can lead to the correction 
of practices that contribute to MDS errors. Some officials said that 
on-site reviews provide a valuable opportunity for informal training 
and coaching staff about completing and documenting MDS assessments, 
which is important given the types of MDS errors found and the high 
staff turnover in nursing homes. Identifying areas of confusion by 
nursing home staff during on-site MDS reviews is also useful in 
guiding the focus of formal training sessions conducted outside of the 
nursing home. State officials reported that they also have one or more 
remedies at their disposal to help improve accuracy, such as requiring 
nursing homes to prepare a corrective action plan or imposing 
financial penalties on nursing homes when serious or extensive errors 
in MDS data are found. Indiana, for example, requires facilities to 
submit a corrective action plan detailing how the facility will 
address errors identified during an on-site review. In addition, Maine 
has collected approximately $390,000 in financial penalties since late 
1995 from facilities with MDS errors. Finally, officials from five 
states told us that their MDS review efforts have resulted in a 
notable decrease in MDS errors across all facilities. For example, the 
average percentage of assessments with MDS errors that resulted in a 
payment change since initiation of their separate review programs has 
decreased from about 85 percent to 10 percent of assessments in South 
Dakota and from 75 percent to 30 percent of assessments in Indiana. 

Following the 1998 implementation of Medicare's MDS-based payment 
system, the Health Care Financing Administration (HCFA) began building 
the foundation for its own separate review program—distinct from state 
efforts—intended to ensure the accuracy of MDS data for all nursing 
home residents. In the course of developing and testing various 
accuracy review approaches, an agency contractor found widespread MDS 
errors that resulted in a change in the Medicare payment level for two-
thirds of the resident assessments sampled. Its on-site visits proved 
to be a very effective method of assessing accuracy. As a result, the 
contractor recommended that any MDS reviews involve on-site visits, at 
least for the first few years of any national review program, along 
with certain off-site analysis to help target homes and areas for 
review. In September 2001, CMS awarded a new contract to establish a 
national MDS accuracy review program. As currently planned, CMS' MDS 
review activities are projected to involve roughly 1 percent of the 
estimated 14.7 million MDS assessments expected to be completed in 
2001, with on-site reviews in fewer than 200 of the nation's 17,000 
nursing homes each year. In contrast, states that conduct separate MDS 
reviews typically examine from 10 to 40 percent of assessments 
completed in all or a significant portion of their nursing homes. The 
CMS contractor is required to coordinate its activities with ongoing 
state and federal efforts. For example, to avoid unnecessary overlap, 
the contractor is instructed to coordinate with states regarding the 
selection of facilities and the timing of visits. However, the 
contractor is not specifically tasked with assessing the adequacy of 
each state's MDS accuracy activities. While CMS' approach may yield 
some broad sense of the accuracy of MDS assessments on an aggregate 
level, it appears to be insufficient to provide confidence about the 
accuracy of MDS assessments in the vast bulk of nursing homes 
nationwide. 

Given the substantial level of effort and resources already invested 
at the state and federal levels to oversee nursing home quality of 
care, including periodic inspections at each home nationwide, we 
believe that CMS should reorient its MDS accuracy program so that it 
complements and leverages existing state review activities and its own 
established nursing home oversight efforts. Therefore, we are making 
recommendations to the administrator of CMS that include determining 
the adequacy of each state's efforts to ensure MDS accuracy and 
providing additional guidance and technical assistance to individual 
states as needed; routinely monitoring state review activities and 
progress as part of CMS' own ongoing federal oversight of nursing home 
quality; and ensuring that states and nursing homes have sufficient 
documentation to support the full MDS assessment. 

In commenting on a draft of this report, CMS agreed with the 
importance of assessing and monitoring the adequacy of state MDS 
accuracy efforts. CMS recognized that the MDS affects both 
reimbursement and care planning, and that it is essential that the 
assessment data reflect the resident's health status so that residents 
receive appropriate, quality care and providers are appropriately 
reimbursed. While CMS' comments suggested that its current efforts may 
be sufficient to assess and improve state performance, we do not 
believe they will result in the systematic assessment and monitoring 
of each state's MDS accuracy that we recommended. CMS did not agree 
with our recommendation on the need for sufficient documentation to 
support the full MDS assessment, expressing concern about potential 
duplicative effort and unnecessary burden for nursing homes. In our 
view, documentation need not duplicate the MDS, but should demonstrate 
that the higher-level summary judgments about a resident's condition 
and needs entered on the MDS can be independently validated. Given the 
importance of MDS data in adjusting nursing home payments and guiding 
resident care, ensuring their integrity is critical to achieving their 
intended purposes. 

Background: 

The nation's 17,000 nursing homes play an essential role in our health 
care system, providing services to 1.6 million elderly and disabled 
persons who are temporarily or permanently unable to care for 
themselves but who do not require the level of care furnished in an 
acute care hospital. Depending on the identified needs of each 
resident, as determined through MDS assessments, nursing homes provide 
a variety of services, including nursing and custodial care, physical, 
occupational, and speech therapy, and medical social services. 
[Footnote 6] The majority of nursing home residents have their care 
paid for by Medicaid, a joint federal-state program for certain low-
income individuals. Almost all nursing homes serve Medicaid residents, 
while more than 14,000 nursing homes are also Medicare-certified. 
Medicare, the federal health care program for elderly and disabled 
Americans, pays for post-hospital nursing home stays if a beneficiary 
needs skilled nursing or rehabilitative services.[Footnote 7] Medicare-
covered skilled nursing home days account for approximately 9 percent 
of total nursing home days. Medicare beneficiaries tend to have 
shorter nursing home stays and receive more rehabilitation services 
than individuals covered by Medicaid. 

MDS Used to Assess Nursing Home Residents: 

Since 1991, nursing homes have been required to develop a plan of care 
for each resident based on the periodic collection of MDS data. The 
MDS contains individual assessment items covering 17 areas, such as 
mood and behavior, physical functioning, and skin conditions. MDS 
assessments of each resident are conducted in the first 14 days after 
admission and are used to develop a care plan.[Footnote 8] A range of 
professionals, including nurses, attending physicians, social workers, 
activities professionals, and occupational, speech, and physical 
therapists, complete designated parts of the MDS.[Footnote 9] 
Assessing a resident's condition in certain areas requires 
observation, often over a period of days. For example, nursing staff 
must assess the degree of resident assistance needed during the 
previous 7 days—none, supervised, limited, extensive, or total 
dependence—to carry out the activities of daily living (ADL), such as 
using a toilet, eating, or dressing. To obtain this information, staff 
completing the MDS assessments are required to communicate with direct 
care staff, such as nursing assistants or activities aides, who have 
worked with the resident over different time periods. These staff have 
first-hand knowledge of the resident and will often be the primary and 
most reliable source of information regarding resident performance of 
different activities. While a registered nurse is required to verify 
that the MDS assessment is complete, each professional staff member 
who contributed to the assessment must sign and attest to the accuracy 
of his or her portion of the assessment. 
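
To make the ADL coding described above concrete, the following minimal 
sketch (in Python; illustrative only) uses the five assistance levels 
named in this report, while the numeric codes and the summary rule are 
hypothetical rather than actual MDS scoring logic: 

from enum import IntEnum

class ADLSelfPerformance(IntEnum):
    # The five assistance levels named in this report, ordered by dependence.
    NONE = 0
    SUPERVISED = 1
    LIMITED = 2
    EXTENSIVE = 3
    TOTAL_DEPENDENCE = 4

def summarize_adl(observations: list[ADLSelfPerformance]) -> ADLSelfPerformance:
    # Hypothetical summary rule: report the most dependent level seen by
    # direct care staff across shifts during the 7-day observation period.
    return max(observations)

# Example: toileting observations gathered from different shifts.
week = [ADLSelfPerformance.LIMITED, ADLSelfPerformance.EXTENSIVE,
        ADLSelfPerformance.LIMITED]
print(summarize_adl(week).name)  # EXTENSIVE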

MDS Used in Quality Oversight and as Basis for Payments: 

MDS data are also submitted by nursing homes to states and CMS for use 
in the nursing home survey process and to serve as the basis for 
adjusting payments. CMS contracts with states to periodically survey 
nursing homes to review the quality of care and assure that the 
services delivered meet the residents' assessed needs. In fiscal year 
2001, the federal government spent about $278 million on the nursing 
home survey process.[Footnote 10] Effective July 1999, the agency 
instructed states to begin using quality indicators derived from MDS 
data to review the care provided to a nursing home's residents before 
state surveyors actually visit the home to conduct a survey.[Footnote 
11] Quality indicators are essentially numeric warning signs of 
potential care problems, such as greater-than-expected instances of 
weight loss, dehydration, or pressure sores among a nursing home's 
residents. They are used to rank a facility in 24 areas compared with 
other nursing homes in a state. In addition, by using the quality 
indicators before the on-site visit to select a preliminary sample of 
residents to review, surveyors should be better prepared to identify 
potential care problems. 
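
As a rough illustration of how a quality indicator might flag 
potential care problems, the sketch below (Python; the indicator, 
data, and ranking approach are simplified assumptions, not CMS's 
actual methodology) computes one indicator and ranks facilities within 
a state: 

def weight_loss_prevalence(residents):
    # Share of a home's residents whose latest MDS flags weight loss.
    flagged = sum(1 for r in residents if r["weight_loss"])
    return flagged / len(residents)

def rank_facilities(state_data):
    # Rank each facility in the state, highest prevalence (worst) first,
    # so surveyors can select a preliminary sample before the visit.
    rates = {home: weight_loss_prevalence(res) for home, res in state_data.items()}
    return sorted(rates.items(), key=lambda item: item[1], reverse=True)

# Example: two facilities with a few residents' MDS flags each.
state_data = {
    "Home A": [{"weight_loss": True}, {"weight_loss": False}],
    "Home B": [{"weight_loss": False}, {"weight_loss": False}],
}
print(rank_facilities(state_data))  # Home A ranks first (0.5 vs. 0.0)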

In addition to quality oversight, some state Medicaid programs and 
Medicare use MDS data to adjust nursing home payments to reflect the 
expected resource needs of their residents. Such payment systems are 
commonly known as "case-mix" reimbursement systems. Because not all 
residents require the same amount of care, the rate paid for each 
resident is adjusted using a classification system that groups 
residents based on their expected costs of care. Facilities use MDS 
data to assign residents to case-mix categories or groups that are 
defined according to clinical condition, functional status, and 
expected use of services. In Medicare, these case-mix groups are known 
as resource utilization groups. Each case-mix group represents 
beneficiaries who have similar nursing and therapy needs. As of 
January 2001, 18 states had introduced such payment systems for their 
Medicaid programs.[Footnote 12] As directed by the Congress, HCFA in 
1998 implemented a prospective payment system (PPS) for skilled 
nursing facilities (SNF)—nursing homes that are certified to serve 
Medicare beneficiaries. The SNF PPS also uses MDS data to adjust 
nursing home payments. 
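
The following minimal sketch (Python) illustrates the case-mix idea; 
the group names echo terms used in this report, but the thresholds, 
ADL score, and per-diem rates are invented for illustration and do not 
reproduce the actual resource utilization groups: 

def case_mix_group(therapy_minutes, adl_score):
    # Assign a resident to a case-mix group from MDS data.
    # Thresholds are hypothetical; the real classification is far more detailed.
    if therapy_minutes >= 325:
        return "high rehabilitation"
    if therapy_minutes >= 45:
        return "low rehabilitation"
    return "clinically complex" if adl_score >= 10 else "reduced physical function"

# Illustrative per-diem rates: higher expected resource use, higher payment.
RATES = {"high rehabilitation": 290.0, "low rehabilitation": 210.0,
         "clinically complex": 180.0, "reduced physical function": 140.0}

group = case_mix_group(therapy_minutes=380, adl_score=7)
print(group, RATES[group])  # high rehabilitation 290.0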

States and CMS use the term "accuracy reviews" to describe efforts 
that help ensure MDS assessments accurately reflect residents' 
conditions. Review activities can be performed on-site—that is, at 
the nursing home—or off-site. On-site reviews generally consist of 
documentation reviews to determine whether the resident's medical 
record supports the MDS assessment completed by the facility.[Footnote 
13] If the MDS assessment is recent, the review may also include 
direct observation of the resident and interviews with nursing home 
staff who have recently evaluated or treated the resident. 

While documentation reviews may also be conducted outside of the 
nursing home, other off-site reviews of MDS data include examining 
trends across facilities.[Footnote 14] For example, off-site review 
activities could involve the examination of monthly reports showing 
the distribution of residents' case-mix categories across different 
facilities in a state. Similarly, off-site reviews could also involve 
an examination of particular MDS elements, such as the distribution of 
ADLs within and across nursing homes to identify aberrant or 
inconsistent patterns that may indicate the need for further 
investigation. Off-site and on-site reviews may also be combined as a 
way of leveraging limited resources to conduct MDS accuracy activities. 
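
To make the off-site approach concrete, the sketch below (Python; 
facility data and the flagging threshold are hypothetical) flags 
facilities whose distribution of residents across case-mix categories 
shifts sharply between two reporting periods: 

def distribution_shift(prev, curr):
    # Total variation distance between two case-mix distributions:
    # 0.0 means identical mixes, 1.0 means completely disjoint mixes.
    categories = set(prev) | set(curr)
    p_total, c_total = sum(prev.values()), sum(curr.values())
    return 0.5 * sum(abs(prev.get(k, 0) / p_total - curr.get(k, 0) / c_total)
                     for k in categories)

def flag_for_review(monthly_reports, threshold=0.25):
    # Flag facilities whose month-over-month shift exceeds the threshold,
    # indicating a possible need for further investigation.
    return [home for home, (prev, curr) in monthly_reports.items()
            if distribution_shift(prev, curr) > threshold]

# Example: Home A's residents shift sharply into rehabilitation categories.
reports = {"Home A": ({"rehab": 10, "complex": 40}, {"rehab": 35, "complex": 15}),
           "Home B": ({"rehab": 20, "complex": 30}, {"rehab": 22, "complex": 28})}
print(flag_for_review(reports))  # ['Home A']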

Only Eleven States Conduct Separate On-Site or Off-Site Reviews of MDS 
Accuracy: 

Eleven states conduct separate MDS accuracy reviews apart from their 
standard nursing home survey process. Ten of these states' reviews 
were in operation as of January 2001. An additional seven states 
reported 
that they intend to initiate similar accuracy reviews.[Footnote 15] 
All 18 of these states either currently use an MDS-based Medicaid 
payment system or plan to implement such a system. The remaining 33 
states have no plans to implement separate MDS review programs and 
currently rely on their periodic nursing home surveys for MDS 
oversight.[Footnote 16] In all but one of the states with separate MDS 
review programs operating as of January 2001, accuracy reviews entail 
periodic on-site visits to nursing homes. The reviews focus on whether 
a sample of MDS assessments completed by the facility is supported by 
residents' medical records. If the MDS assessments reviewed are recent 
enough that residents are still in the facility and their health 
status has not changed, the on-site review may also be supplemented 
with interviews of nursing home staff familiar with the residents, as 
well as observations of the residents themselves, to validate the 
record review. About half of these states also conduct off-site data 
analyses in which reviewers look for significant changes or outliers, 
such as facilities with unexplained large shifts in the distribution 
of residents across case-mix categories over a short period. Officials 
primarily attributed the errors found during their on-site reviews to 
differences in clinical interpretation and mistakes, such as a 
misunderstanding of MDS definitions. A few of these states have been 
able to show some recoupments of Medicaid payments since the 
implementation of their on-site review programs. 

Most States Do Not Have Separate MDS Review Programs: 

Of the 50 states and the District of Columbia, only 11 conduct 
accuracy reviews of MDS data that are separate from the state's 
nursing home survey process.[Footnote 17] (See table 1.) These 11 
states provide care to approximately 22 percent of the nation's 
nursing home residents and all but one have an MDS-based payment 
system (Virginia began conducting MDS accuracy reviews in April 2001 
in anticipation of adopting such a payment system in 2002). Seven 
additional states plan to initiate separate MDS reviews—three 
currently have an MDS-based payment system and four are planning to 
implement such a payment system. Officials in the 10 states with 
separate, longer standing MDS review programs said that the primary 
reason for implementing reviews was to ensure the accuracy of the MDS 
data used in their payment systems. Several of these states also 
indicated that the use of MDS data in generating quality indicators 
was another important consideration. Vermont officials, in particular, 
emphasized the link to quality of care, noting that the state had 
created its own MDS-based quality indicators prior to HCFA's 
requirement to use quality indicators in nursing home surveys. A state 
official told us it was critical that the MDS data be accurate because 
Vermont was making this information available to the public as well as 
using it internally as a normal part of the nursing home survey 
process. 

Table 1: States with and without MDS Review Programs as of January 
2001: 

States with separate MDS review programs: 

Type of payment system: MDS-based payment system; 
States: Indiana, Iowa, Maine, Mississippi, Ohio, Pennsylvania, South
Dakota, Vermont, Washington, West Virginia; 
State totals: 10. 

Type of payment system: Planning to adopt MDS-based payment system; 
States: Virginia (reviews began April 2001); 
State totals: 1. 

States planning separate MDS review programs: 

Type of payment system: MDS-based payment system; 
States: Idaho, Kentucky, New Hampshire; 
State totals: 3. 

Type of payment system: Planning to adopt MDS-based payment system; 
States: Georgia, Minnesota,[A] New Jersey, Utah; 
State totals: 4. 

Type of payment system: Subtotal; 
State totals: 18. 

States with no plans to establish separate MDS review programs: 

Type of payment system: MDS-based payment system; 
States: Colorado,[B] Florida, Kansas, Nebraska, North Dakota; 
State totals: 5. 

Type of payment system: No MDS-based payment system; 
States: Alaska, Alabama, Arkansas, Arizona, California, Connecticut, 
District of Columbia, Delaware, Hawaii, Illinois,[A] Louisiana, 
Massachusetts,[A] Maryland,[B] Michigan, Missouri, Montana,[A] North 
Carolina, New Mexico, Nevada, New York,[A] Oklahoma, Oregon, Rhode 
Island, South Carolina, Tennessee, Texas,[A] Wisconsin, Wyoming; 
State totals: 28. 

Type of payment system: Subtotal; 
State totals: 33. 

Type of payment system: Total; 
State totals: 51. 

Note: States' decisions regarding whether to adopt an MDS-based 
payment system and MDS review program may have changed since the time 
of our data collection (January 2001). For example, a Kentucky 
official told us that the state implemented a separate MDS review 
program in October 2001, and Montana has shifted to an MDS-based 
payment system. 

[A] Although these states do not conduct a separate review of MDS 
data, they do conduct separate reviews of data that are linked to 
their state's Medicaid payment system. For example, Texas has a non-
MDS-based case-mix payment system called the Texas Index for Level of 
Effort that is based on a recipient's condition, ADLs, and the level 
of staff intervention. 

[B] Colorado and Maryland officials volunteered that they had 
conducted onetime reviews of MDS data, but are not planning to 
regularly continue these reviews. Colorado's state survey agency 
conducted an MDS review of 90 nursing homes (40 percent of homes) in 
the summer of 2000 and Maryland officials participated in a HCFA-
funded project to conduct on-site reviews from May through July 2000 
at 5 percent of its nursing homes. 

Source: GAO survey of 50 states and the District of Columbia. 

[End of table] 

To varying degrees, three major factors influenced the decision of 33 
states not to establish separate MDS review programs. First, the
majority—28 states—do not have MDS-based Medicaid payment systems. 
Second, some states cited the cost of conducting separate reviews. 
Kansas, for example, reported a lack of funding and staff resources as 
the reason for halting a brief period of on-site visits in 1996 as a 
follow-up to nursing home surveys. Arkansas as well reported 
insufficient staff for conducting a separate review of MDS data. 
[Footnote 18] Finally, officials in about one-third of the states 
without separate MDS reviews volunteered that they had some assurance 
of the accuracy of MDS data either because of training programs for 
persons responsible for completing MDS assessments or because of the 
nursing home survey process.[Footnote 19] For example, Missouri 
operates a state-funded quality improvement project in which nurses 
with MDS training visit facilities to assist staff with the MDS 
process and use of quality indicator reports. North Carolina also 
reported that its quarterly training sessions provide MDS training to 
approximately 800 providers a year. Regarding standard surveys, 
Connecticut and Maryland reported that their nursing home survey teams 
reviewed MDS assessments to determine if they were completed correctly 
and if the assessment data matched surveyor observations of the 
resident. In Connecticut, surveyors may also review a sample of 
facility MDS assessments for possible errors whenever they identify 
aberrant or questionable data on the quality indicator reports. 

Officials in the 10 states with separate, longer standing MDS review 
programs generally said that the survey process itself does not detect 
MDS accuracy issues as effectively as separate MDS review programs. 
[Footnote 20] Some noted that nursing home surveyors do not have time 
to thoroughly review MDS accuracy and often review a smaller sample 
size than MDS reviewers. The surveyors' primary focus, they indicated, 
was on quality of care and resident outcomes—not accuracy of MDS data. 
For example, surveyors would look at whether the resident needed 
therapy and whether it was provided. In contrast, the MDS reviewer 
would calculate the total number of occupational, speech, and physical 
therapy minutes to ensure that the resident was placed in the 
appropriate case-mix category. Officials in Iowa similarly noted that 
surveyors do not usually cite MDS accuracy as a specific concern 
unless there are egregious MDS errors, again, because the focus of the 
survey process is on quality of care. 

States with Separate MDS Review Programs Emphasize On-Site Oversight, 
but Also Conduct Off-Site Monitoring: 

Nine of the 10 states with separate, longer standing MDS accuracy 
review programs use on-site reviews to test the accuracy of MDS data, 
generally visiting all or a significant portion of facilities in the 
state at least annually, if not more frequently. (See app. I for a 
summary of state on-site review programs.) Due to a lack of staff, one 
state—West Virginia—limits its MDS reviews to off-site analysis of 
facility-specific monthly data. Most of these states have been 
operating their MDS review programs for 7 years or longer and 
developed them within a year of implementing an MDS-based payment 
system. Three of the nine states arrive at the facility unannounced, 
while the other six provide advance notice ranging from 48 hours to 2 
weeks. 

The sample of facility MDS assessments reviewed by each state varies 
considerably. Assessment sample sizes generally range from 10 to 40 
percent of a nursing home's total residents, but some states select a 
specific number of residents, not a percentage, and a few specifically 
target residents in particular case-mix categories. For example, 
Indiana selects a sample of 40 percent—or no less than 25 residents—
across all major case-mix categories, while Ohio's sample can be based 
on a particular case-mix category, such as residents classified as 
"clinically complex.[Footnote 21] Iowa officials told us that its 
reviewers select at least 25 percent of a facility's residents, with a 
minimum of 5 residents, while Pennsylvania chooses 15 residents from 
each facility, regardless of case-mix category or facility size. Some 
states expand the resident sample when differences between the MDS 
assessment and supporting documentation reach a certain threshold. 
[Footnote 22] For example, if the on-site review for the initial 
sample in Iowa finds that 25 percent or more of the MDS assessments 
have errors, a supplemental random sample is selected for review. 
While a few states limit their sample to Medicaid residents only, most 
select assessments to review from the entire nursing home's population. 
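
The sampling rules reported by Iowa can be stated compactly. In the 
sketch below (Python), the 25 percent sample, the minimum of 5 
residents, and the 25 percent error threshold come from this report; 
the size of the supplemental sample is a hypothetical placeholder, 
since the report does not specify it: 

import math
import random

def initial_sample(residents):
    # Iowa's reported rule: review at least 25 percent of a facility's
    # residents, with a minimum of 5.
    n = max(5, math.ceil(0.25 * len(residents)))
    return random.sample(residents, min(n, len(residents)))

def needs_supplemental(error_count, sample_size):
    # Expand the review when 25 percent or more of the sampled MDS
    # assessments have errors.
    return error_count / sample_size >= 0.25

residents = [f"resident-{i}" for i in range(40)]
sample = initial_sample(residents)  # 10 of 40 residents
if needs_supplemental(error_count=3, sample_size=len(sample)):
    # 3 of 10 is 30 percent, so a supplemental random sample is drawn.
    remaining = [r for r in residents if r not in sample]
    sample += random.sample(remaining, 5)  # supplemental size is hypothetical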

On-site reviews generally involve a comparison of the documentation in 
the resident's medical record to the MDS assessment prepared by the 
facility.[Footnote 23] Generally, the on-site process also allows 
reviewers to interview nursing home staff and to directly observe 
residents, permitting a better understanding of the documentation in a 
resident's medical record and clarifying any discrepancies that may 
exist. Staff interviews and resident observations can enhance the 
reviewer's understanding of the resident's condition and allow a more 
thorough MDS review than one relying primarily on documentation. 
However, as the interval between the facility's MDS assessment and the 
on-site review increases, staff interviews and resident observations 
become less reliable and more difficult to conduct.[Footnote 24] For 
example, staff knowledge of a particular patient may fade over time, 
the patient's health status may change, or the patient may be 
discharged from the facility. Pennsylvania officials, who reported 
reviewing assessments that were 6 to 12 months old, told us that the
state's MDS reviews tended to identify whether the nursing home had 
adequate documentation. Reviewing such old assessments tends to focus 
the review process on the adequacy of the documentation rather than on 
whether the MDS assessment was accurate.[Footnote 25] Four of the nine 
states review assessments between 30 and 90 days old, a process that 
likely increases the value of interviews and observation. The 
combination of interviews and observations can be valuable, but 
limiting reviews to only recent MDS assessments and providing homes 
advance notice may undermine the effectiveness of on-site reviews. 
[Footnote 26] Under such circumstances, facilities have an opportunity 
to focus on the accuracy of their recent assessments, particularly if 
the nursing home knows when their reviews will occur, instead of 
adopting facility-wide practices that increase the accuracy of all MDS 
assessments. 

Based on their on-site reviews, officials in the nine states 
identified seven areas as having a high potential for MDS errors, with 
two areas most often identified as being among the highest potential 
for error: (1) mood and behavior and (2) nursing rehabilitation and 
restorative care.[Footnote 27] (See figure 1.) Assessments of resident 
mood and behavior are used to calculate quality indicators and, along 
with nursing rehabilitation and restorative care, are often important 
in determining nursing home payments.[Footnote 28] CMS indicated that 
several of the MDS elements cited in figure 1 were also identified by 
a CMS contractor as areas of concern. Officials in most states with 
separate on-site review programs told us that errors discovered during 
their on-site reviews often resulted from differences in clinical 
interpretation or mistakes, such as a misunderstanding of MDS 
definitions by those responsible for completing MDS assessments. 
Officials in only four of the nine states were able to tell us whether 
the errors identified in their MDS reviews on average resulted in a 
case-mix category that was too high or too low. Two of these states 
reported roughly equal numbers of MDS errors that inappropriately 
placed a resident in either a higher or lower case-mix category; a 
third indicated that errors more often resulted in higher payments; 
and a fourth found that errors typically resulted in payments that 
were too low. None of the nine states track whether quality indicator 
data were affected by MDS errors. 

Figure 1: MDS Elements Identified By Nine States As Having High 
Potential for MDS Errors: 

[Refer to PDF for figure: vertical bar graph] 

Mood and behavior: 
Identified as 1 of top 3 high potential areas: 4 states; 
Identified as high potential area: 5 states. 

Nursing rehabilitation and restorative care: 
Identified as 1 of top 3 high potential areas: 3 states; 
Identified as high potential area: 5 states. 

ADLs: 
Identified as 1 of top 3 high potential areas: 5 states; 
Identified as high potential area: 2 states. 

Therapy[A]: 
Identified as 1 of top 3 high potential areas: 5 states; 
Identified as high potential area: 2 states. 

Physician visits or orders[B]: 
Identified as 1 of top 3 high potential areas: 3 states; 
Identified as high potential area: 2 states. 

Toileting plans[C]: 
Identified as 1 of top 3 high potential areas: 3 states; 
Identified as high potential area: 1 state. 

Skin conditions: 
Identified as 1 of top 3 high potential areas: 2 states; 
Identified as high potential area: 0 states. 

Note: We asked states to identify areas of the MDS assessment that 
have a high potential for MDS errors. State responses were included in 
this figure if two or more states identified an area as "high 
potential." 

[A] Staff record the number of days and total minutes of therapy, such 
as physical or occupational therapy, received by a resident in the 
last 7 days. 

[B] Staff record the number of days during the last 14-day period in 
which a physician has examined the resident or changed the care 
directions for the resident. The latter is known as physician orders. 

[C] Staff members record scheduled times each day that they perform 
any of the following tasks: (1) take the resident to the bathroom, (2) 
give the resident a urinal, or (3) remind the resident to go to the 
bathroom. 

Source: Interviews with officials from nine states with separate on-
site review programs in operation as of January 2001. 

[End of figure] 

Two of the 10 states with MDS review programs were able to tell us the 
amount of Medicaid recoupments resulting from inaccurate MDS 
assessments. From state fiscal years 1994 through 1997, South Dakota 
officials reported that the state had recouped about $360,000 as a 
result of recalculating nursing home payments after MDS reviews. West 
Virginia received $1 million in 1999 related to MDS errors for 
physical therapy discovered during a 1995 on-site review at a nursing 
home. Officials in five additional states told us that they 
recalculate nursing home payments when MDS errors are found, but could 
not provide the amount recovered.[Footnote 29] 

Of the 10 states with longer standing MDS review programs, four use 
off-site analyses to supplement their on-site reviews, while one state 
relies on off-site analyses exclusively. Both Maine and Washington 
examine MDS data off-site to monitor changes by facility in the mix of 
residents across case-mix categories. Such changes may help identify 
aberrant or inconsistent patterns that may indicate the need for 
further investigation. Ohio, a state with approximately 1,000 
facilities—more than any other state that conducts MDS reviews—
analyzes data off-site to identify facilities with increased Medicaid 
payments and changes in case-mix categories to select the 
approximately 20 percent of facilities visited each year.[Footnote 30] 
West Virginia has eliminated its on-site reviews and now focuses 
solely on analyzing monthly reports for its 141 facilities, looking, 
for example, for significant changes in case-mix categories or ADLs 
across consecutive MDS assessments. In addition to informally sharing 
results 
of off-site reviews with the state nursing home surveyors, West 
Virginia is trying to formalize a process in which off-site reviews 
could trigger additional on-site or off-site documentation reviews. 

States Attempt to Improve MDS Data Accuracy through On-Site Reviews, 
Training, and Other Remedies: 

Officials in the nine states with on-site review programs consistently 
cited three features of their review programs that strengthened the 
ability of nursing home staff to complete accurate MDS assessments and 
thus decrease errors: (1) the actual presence of reviewers, (2) 
provider education, and (3) remedies that include corrective action 
plans and financial penalties. On-site reviews, for example, 
underscore the state's interest in MDS accuracy and provide an 
opportunity to train and coach those who are responsible for 
completing MDS assessments. Similarly, the errors discovered during on-
site reviews guide the development of more formal training sessions 
that are offered by the state outside of the nursing home. Requiring 
nursing homes to prepare corrective action plans and imposing 
financial penalties signal the importance of MDS accuracy to 
facilities and are tools to improve the accuracy of the MDS data. As a 
result of these efforts, some states have been able to show a notable 
decrease in their overall error rates. 

Most of the nine states view on-site visits and training as 
interrelated elements that form the foundation of their MDS review 
programs. State officials said that nursing homes pay more attention 
to properly documenting and completing the MDS assessments because 
reviewers visit the facilities regularly. On-site visits also allow 
reviewers to discuss MDS documentation issues or requirements with 
staff, providing an opportunity for informal MDS training. For 
example, Indiana officials told us that 2 to 3 hours of education are 
a routine part of each facility's MDS review. Noting the high staff 
turnover rates in nursing homes, many states reported that frequent 
training for the staff responsible for completing MDS assessments is 
critical.[Footnote 31] Officials in seven of the nine states with on-
site reviews told us that high staff turnover was one of the top three 
factors contributing to MDS errors in their states. In addition, many 
of the reasons cited for MDS errors—such as a misunderstanding of MDS 
definitions and other mistakes—reinforce the need for training. 
[Footnote 32] 

States with on-site reviews use the process to guide provider 
education activities—both on-site and off-site. For example, during 
Pennsylvania's annual MDS reviews of all nursing homes, state 
reviewers determine the types of training needed. According to state 
officials, the state uses the results of these reviews to shape and 
provide facility-specific training, if it is needed, within a month of 
the review and subsequently conducts a follow-up visit to see if the 
facility is improving in these areas. They indicated that all 685 
homes visited during 2000, the first year of this approach, were 
provided with some type of training. To improve MDS accuracy, several 
states also provide voluntary training opportunities outside of the 
nursing home. Maine, Iowa, Indiana, and South Dakota, for example, 
provide MDS training regularly throughout the state, rotating the 
location of the training by region so that it is accessible to staff 
from all facilities. 

While states generally emphasized on-site reviews and training as the 
primary ways to improve the accuracy of the MDS data, some reported 
that they have also instituted certain remedies, such as corrective 
action plans and financial penalties. Indiana and Pennsylvania, for 
example, require facilities to submit a corrective action plan 
detailing how the facility will address errors identified during an on-
site review. Two states—Maine and Indiana—impose financial penalties. 
[Footnote 33] Maine has instituted financial penalties for recurring 
serious errors, collecting approximately $390,000 since late 1995. 
Maine also requires facilities with any MDS errors that result in a 
case-mix category change to complete and submit a corrected MDS 
assessment for the resident.[Footnote 34] While Indiana imposes 
financial penalties, it does not view them as the primary tool for 
improving MDS accuracy.[Footnote 35] Rather, officials attributed a 
decrease in MDS errors to the education of providers and the on-site 
presence of reviewers. Other remedies cited by states include 
conducting more frequent on-site MDS reviews and referring suspected 
cases of fraud to their state's Medicaid Fraud Control Unit. 

Five of the nine states that conduct on-site MDS reviews told us that 
their efforts have resulted in a notable decrease in MDS errors across 
all facilities since the implementation of their review programs. (See 
table 2.) South Dakota officials, for example, reported that the 
percentage of assessments with MDS errors across facilities had 
decreased from approximately 85 percent to 10 percent since the 
implementation of the state's MDS review program in 1993. Similarly, 
Indiana reported a decrease in the statewide average error rate from 
75 percent to 30 percent of assessments in 1 year's time. Four states 
could not provide these data. In calculating these decreases, three of 
the five states—Indiana, Maine, and South Dakota—define MDS errors as 
an unsupported MDS assessment that caused the case-mix category to be 
inaccurate.[Footnote 36] Iowa's definition, however, includes MDS 
elements that are not supported by medical record documentation, 
observation, or interviews, regardless of whether the MDS error 
changed the case-mix category. Similarly, while Pennsylvania does not 
limit errors to those that changed the case-mix category, it defines 
errors as a subset of MDS elements that are not supported by the 
medical record. 

Table 2: MDS Assessments with Errors in Five States with On-Site MDS 
Review Programs (in percent): 

State: Indiana; 
Initial MDS error rate: 75%; 
Subsequent MDS error rate: 30%; 
Time of initial and subsequent error rate: 1999, 2000. 

State: Iowa; 
Initial MDS error rate: 32%; 
Subsequent MDS error rate: 22%; 
Time of initial and subsequent error rate: July, December 2000. 

State: Maine[A]; 
Initial MDS error rate: 21%; 
Subsequent MDS error rate: 10%; 
Time of initial and subsequent error rate: 1995, 2000. 

State: Pennsylvania; 
Initial MDS error rate: 20%; 
Subsequent MDS error rate: 15%; 
Time of initial and subsequent error rate: 2000, 2001. 

State: South Dakota; 
Initial MDS error rate: 85%; 
Subsequent MDS error rate: 10%; 
Time of initial and subsequent error rate: 1993, 1998. 

[A] Errors that result in changes for a subset of case-mix categories 
were used to calculate these error rates. 

Source: Data provided by Indiana, Iowa, Maine, Pennsylvania, and South 
Dakota. 

[End of table] 
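
Because the five states define an "MDS error" differently, the rates 
in table 2 are not directly comparable. The sketch below (Python; all 
data are hypothetical) contrasts the two definitions described above: 

def error_rate(assessments, payment_linked_only):
    # payment_linked_only=True (Indiana, Maine, South Dakota): count only
    # unsupported assessments that made the case-mix category inaccurate.
    # payment_linked_only=False (Iowa's broader definition): count any
    # assessment with an element unsupported by documentation,
    # observation, or interview.
    if payment_linked_only:
        errors = [a for a in assessments if a["unsupported"] and a["category_changed"]]
    else:
        errors = [a for a in assessments if a["unsupported"]]
    return len(errors) / len(assessments)

assessments = [
    {"unsupported": True,  "category_changed": True},
    {"unsupported": True,  "category_changed": False},
    {"unsupported": False, "category_changed": False},
    {"unsupported": False, "category_changed": False},
]
print(error_rate(assessments, payment_linked_only=True))   # 0.25
print(error_rate(assessments, payment_linked_only=False))  # 0.50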

CMS' MDS Review Program Could Better Leverage Existing State and 
Federal Accuracy Activities: 

Following implementation of Medicare's MDS-based payment system in 
1998, HCFA began building the foundation for its own separate review 
program—distinct from state efforts—to help ensure the accuracy of 
MDS data. In the course of developing and testing accuracy review 
approaches, its contractor found widespread MDS errors that resulted 
in a change in Medicare payment categories for 67 percent of the 
resident assessments sampled. In September 2001, CMS awarded a new 
contract to implement a nationwide MDS review program over a 2- to 3-
year period.[Footnote 38] Despite the benefits of on-site reviews, as 
demonstrated by states with separate review programs, the current plan 
involves conducting on-site reviews in fewer than 200 of the nation's 
17,000 nursing homes each year. In addition, the contractor's combined 
on-site and off-site reviews to evaluate MDS accuracy will involve 
only about 1 percent of the approximately 14.7 million MDS assessments 
expected to be prepared in 2001. In contrast, states that conduct 
separate on-site MDS reviews typically visit all or a significant 
portion of their nursing homes and generally examine from 10 to 40 
percent of assessments. While CMS' approach may yield some broad sense 
of the accuracy of MDS assessments on an aggregate level, it may be 
insufficient to help ensure the accuracy of MDS assessments in most of 
the nation's nursing homes. At present, it does not appear that CMS 
plans to leverage the considerable resources already devoted to state 
nursing home surveys and states' separate MDS review programs that 
together entail a routine on-site presence in all nursing homes 
nationwide. Nor does it plan to more systematically evaluate the 
performance of state survey agencies regarding MDS accuracy through 
its own federal comparative surveys. Finally, CMS is not requiring 
nursing homes to provide documentation for the full MDS assessment, 
which could undermine the efficacy of its MDS reviews. 

Testing of MDS Accuracy Approaches Identified Widespread Accuracy 
Problems: 

In September 1998, HCFA contracted with Abt Associates to develop and 
test various on-site and off-site approaches for verifying and 
improving the accuracy of MDS data. Two of the approaches resembled 
state on-site MDS reviews and the off-site documentation reviews 
performed by CMS contractors that review Medicare claims.[Footnote 39] 
Another approach used off-site data analysis to target facilities for 
on-site review.[Footnote 40] To determine the effectiveness of the 
approaches tested in identifying MDS inaccuracies, Abt compared the 
errors found under each approach to those found in its "reference 
standard"—-independent assessments performed by MDS-trained nurses 
hired by Abt for approximately 600 residents in 30 facilities in three 
states.[Footnote 41] Abt found errors in every facility, with little 
variation in the percentage of assessments with errors across 
facilities. On average, the errors found affected case-mix categories 
in 67 percent of the sampled Medicare assessments. Abt concluded that 
the errors did not result in systematic overpayments or underpayments 
to facilities even though there were more errors that placed residents 
in too high as opposed to too low a case-mix category. Abt did not 
determine, however, the extent to which errors affected quality 
indicators. 

Due to the prevalence of errors, Abt recommended a review program that 
included periodically visiting all facilities during the program's 
first several years. Recognizing the expense of visiting every 
facility, however, Abt also recommended eventually transitioning to 
the use of off-site mechanisms to target facilities and specific 
assessments for on-site review. Abt also made recommendations to 
address the underlying causes of MDS errors: simplifying the MDS 
assessment tool, clarifying certain MDS definitions (particularly for 
ADLs), and improving MDS training for facilities.[Footnote 42] 

The Federal MDS Review Program Is Too Limited to Evaluate State-Level 
Accuracy Assurance Efforts: 

Building on the work of Abt Associates, in the summer of 2000, the 
agency began formulating its own distinct nationwide review program to 
address long-term MDS monitoring needs. The agency developed a request 
for proposal for MDS data assessment and verification activities and 
sought proposals from its 12 program safeguard contractors.[Footnote 
43] On September 28, 2001, CMS awarded a 3-year contract for 
approximately $26 million to Computer Sciences Corporation. The 
contract calls for the initiation of on-site and off-site reviews by 
late spring 2002, but the full scope of MDS review activities will not 
be underway until the second year of the contract.[Footnote 44] (See 
table 3.) 

Table 3: Implementation Schedule for CMS' MDS Accuracy Review Program: 

Phase: Developmental; 
Time period: October 2001 through May 2002; 
Review activities: 
* Test a combination of the most promising components from Abt's 
earlier assessment of various on-site and off-site approaches. 
* Recommend the appropriate balance between on-site and off-site 
reviews. 
* Identify and develop new approaches for monitoring MDS accuracy. 
* Begin to identify communication and collaboration strategies for 
federal and state accuracy reviews, such as coordinating with states. 

Phase: Initial implementation; 
Time period: April 2002 through September 2002; 
Review activities: 
* Begin conducting on-site and off-site accuracy reviews. 
* Continue to evaluate the efficacy of the accuracy review approaches 
being implemented and identify areas of risk. 
* Conduct ongoing data surveillance, such as monitoring and 
identifying trends in payments based on MDS data.[A] 

Phase: Full implementation; 
Time period: October 2002 through September 2003; 
Review activities: 
* Perform ongoing data analysis and the full scope of data assessment 
and verification activities.[B] 
* Implement training and education activities to ensure that those 
responsible for MDS data understand and accurately complete MDS 
assessments. This approach is expected to include a method for 
communicating how the contractor will continually refine and improve 
accuracy review processes. 

Note: The contract covers 1 year with two additional 1-year options. 
Currently, full implementation would occur in the second year of the 
contract. The third year of the contract may also include on-site 
enforcement surveys and special studies concerning the accuracy of 
reported Medicare and Medicaid data. 

[A] For example, one of the contractor's tasks is to analyze MDS data 
reported by nursing homes that serve Medicare beneficiaries to 
determine whether differences in case-mix categories relate to changes 
in the patient's health status or changes in how providers are 
reporting MDS data. 

[B] For example, while continuing on-site and off-site MDS reviews, 
the contractor will also be required to calculate error rates for paid 
claims for Medicare-covered services. 

Source: DAVE contract statement of work for CMS' review program for 
MDS accuracy. 

[End of table] 

Despite this broad approach, the contractor is not specifically tasked 
with assessing the adequacy of each state's MDS reviews. Instead, it 
is required to develop a strategy for coordinating its review 
activities with other state and federal oversight, such as the 
selection of facilities and the timing of visits, to avoid unnecessary 
overlap with routine nursing home surveys or states' separate MDS 
review programs. This approach does not appear to build on the 
benefits of on-site visits that are already occurring as part of state 
review activities. Rather, the contract specifies independent federal 
on-site and off-site reviews of roughly 1 percent of the approximately
14.7 million MDS assessments expected to be prepared in 2001—80,000 
during the first contract year and 130,000 per year thereafter. 
[Footnote 45] The contractor, however, tentatively recommended that 
the majority of reviews, about 90 percent, be conducted off-site. 
According to CMS, these off-site reviews could include a range of 
activities, such as the off-site targeting approaches developed by Abt 
or medical record reviews similar to those conducted by CMS 
contractors for purposes of reviewing Medicare claims. In addition, 
the contractor is expected to conduct a range of off-site data 
analyses that could include a large number of MDS assessments. The 
remaining 10 percent of MDS assessments—representing fewer than 200 
of the nation's 17,000 nursing homes—would be reviewed on-site each 
year. This limited on-site presence is inconsistent with Abt's earlier 
recommendation regarding the benefits of on-site reviews in detecting 
accuracy problems, and with the view of almost all of the states with 
separate MDS review programs that an on-site presence at a significant 
number of their nursing homes is central to their review efforts. 

While CMS' approach may yield some broad sense of the accuracy of MDS 
assessments on an aggregate level, it appears to be insufficient to 
provide confidence about the accuracy of MDS assessments in the vast 
bulk of nursing homes nationwide. Given the substantial resources 
invested in on-site nursing home visits associated with standard 
surveys or states' separate MDS review programs, CMS' MDS review 
program could view states' routine presence as the cornerstone of its 
program and instead focus its efforts on ensuring the adequacy of 
state reviews. CMS could build on its established federal monitoring 
survey process for nursing home oversight. The agency is required by 
statute to annually resurvey at least 5 percent of all nursing homes 
that participate in Medicare and Medicaid. One of the ways CMS 
accomplishes this requirement is by conducting nursing home 
comparative surveys to independently assess the states' performance in 
their nursing home survey process. During a comparative survey, a 
federal team independently surveys a nursing home recently inspected 
by a state in order to compare and contrast the results. These federal 
comparative surveys have been found to be most effective when 
completed in close proximity to the state survey and involve the same 
sample of nursing home residents to the maximum extent possible. For 
similar reasons, Abt also attempted to review recently completed MDS 
assessments. 

Finally, a potential issue that could undermine the efficacy of the 
federal MDS accuracy reviews involves the level of documentation 
required to support an MDS assessment. CMS requires specific 
documentation for some MDS elements, but officials said that the MDS 
itself—which can simply consist of checking off boxes or selecting 
multiple-choice answers on the assessment form—generally constitutes 
support for the assessment without any additional documentation. CMS 
officials consider the MDS assessment form to have equal weight with 
the other components of the medical record, such as physician notes 
and documentation of services provided. As a result, CMS asserts that 
the assessment must be consistent with, but need not duplicate, the 
medical record. In contrast, most of the nine states with separate on-
site review programs require that support for each MDS element that 
they review be independently documented in the medical record. State 
officials told us that certain MDS elements, such as ADLs, are 
important to thoroughly document because they require observation of 
many activities by different nursing home staff over several days. As 
a result, some of these states require the use of separate flow charts 
or tables to better document ADLs. Similarly, some states require 
documentation for short-term memory loss rather than accepting a 
nursing home's assertion that a resident has this condition. CMS' 
training manual describes several appropriate tests for identifying 
memory loss, such as having a resident describe a recent event. In one 
of its December 2000 reports, the HHS OIG recommended that nursing 
homes be required to establish an "audit trail" to support certain MDS 
elements. HCFA disagreed, noting that it does not expect all 
information in the MDS to be duplicated elsewhere in the medical 
record. However, given the uses of MDS data, especially in adjusting 
nursing home payments and producing quality indicators, documenting 
the basis for the MDS assessments in the medical record is critical to 
assessing their accuracy. 

Conclusions: 

In complying with federal nursing home participation and quality 
requirements, about 17,000 nursing homes were expected to produce 
almost 15 million MDS assessments during 2001 on behalf of their 
residents. This substantial investment of nursing home staff time 
contributes to multiple functions, including establishing patient care 
plans, assisting with quality oversight, and setting nursing home 
payments that account for variation in resident care needs. While some 
states, particularly those with MDS-based Medicaid payment systems, 
stated that ensuring MDS accuracy requires establishing a separate MDS 
review program, many others rely on standard nursing home surveys to 
assess the data's accuracy. Flexibility in designing accuracy review 
programs that fit specific state needs, however, should not preclude 
achieving the important goal of ensuring accountability across state 
programs. It is CMS' responsibility to consistently ensure that states 
are fulfilling statutory requirements to accurately assess and provide 
for the care needs of nursing home residents. 

The level of federal financial support for state MDS accuracy 
activities is already substantial. The federal government pays up to 
75 percent of the cost of separate state MDS review activities and in 
fiscal year 2001 contributed $278 million toward the cost of the state 
nursing home survey process, which is intended in part to review MDS 
accuracy. Instead of establishing a distinct but limited federal 
review program, CMS could reorient the thrust of that program to 
complement ongoing state MDS accuracy efforts, which could prove a 
more efficient and effective means of achieving its stated goals. Such 
a shift in focus should include (1) taking full advantage of the 
periodic on-site visits already conducted at every nursing home 
nationwide through the routine state survey process, (2) ensuring that 
the federal MDS review process is designed and sufficient to 
consistently assess the performance of all states' reviews for MDS 
accuracy, and (3) providing additional guidance, training, and other 
technical assistance to states as needed to facilitate their efforts. 
With its established federal monitoring system for nursing home 
surveys—especially the comparative survey process—CMS already has a 
ready mechanism that it can use to systematically assess state 
performance on this important task. Finally, to help 
improve the effectiveness of MDS review activities, CMS should take 
steps to ensure that each MDS assessment is adequately supported in 
the medical record. 

Recommendations for Executive Action: 

With the goal of complementing and leveraging the considerable federal 
and state resources already devoted to nursing home surveys and to 
separate MDS accuracy review programs, we recommend that the 
administrator of CMS: 

* review the adequacy of current state efforts to ensure the accuracy 
of MDS data, and provide, where necessary, additional guidance, 
training, and technical assistance; 

* monitor the adequacy of state MDS accuracy activities on an ongoing 
basis, such as through the use of the established federal comparative 
survey process; and 

* provide guidance to state agencies and nursing homes that sufficient 
evidentiary documentation to support the full MDS assessment be 
included in residents' medical records. 

Agency and State Comments and Our Evaluation: 

We provided a draft of this report to CMS and the 10 states with 
separate MDS accuracy programs for their review and comment. (See app. 
II for CMS' comments.) CMS agreed with the importance of assessing and 
monitoring the adequacy of state MDS accuracy efforts. CMS also 
recognized that the MDS affects reimbursement and care planning and 
that it is essential that the assessment data reflect the resident's 
health status, so that the resident receives appropriate, quality 
care and providers are reimbursed appropriately. However, CMS' 
comments did not indicate that it planned to implement our 
recommendations and reorient its MDS review program.[Footnote 46] 
Rather, CMS' comments suggested that its current efforts provide 
adequate oversight of state activities and complement state efforts. 

While CMS stated that it currently evaluates, assesses, and monitors 
the accuracy of the MDS through the nursing home survey process, it 
also acknowledged the wide variation in the adequacy of current state 
accuracy review efforts. Our work in the 10 states with separate MDS 
review programs raised serious questions about the thoroughness and 
adequacy of the nursing home survey process for reviewing MDS 
accuracy. Officials in many of these states said that the survey 
process itself does not detect MDS accuracy issues as effectively as 
separate MDS review programs. Surveyors, we were told, do not have 
time to thoroughly review MDS accuracy and their focus is on quality 
of care and resident outcomes, not accuracy of MDS data. 

In response to our recommendations on assessing and monitoring the 
adequacy of each state's MDS reviews, CMS commented that it would 
consider adding a new standard to the state performance expectations 
that the agency initiated in October 2000. CMS indicated that the 
state agency performance review program would result in a more 
comprehensive assessment of state activities related to MDS accuracy 
than could be obtained through the comparative survey process. CMS 
also outlined planned analytic activities—such as a review of existing 
state and private sector MDS review methodologies and instruments, 
ongoing communications with states to share the knowledge gained, and 
comprehensive analyses of MDS data to identify systemic accuracy 
problems within states as well as across states—that it believes will 
help to evaluate state performance. 

We agree that some of CMS' proposed analytic activities could provide 
useful feedback to states on problem areas at the provider, state, 
region, and national levels. Similarly, the addition of MDS accuracy 
activities to its state performance standards for nursing home 
surveys, which CMS is considering, has merit. While CMS plans to 
consider adding a new standard to its state agency performance review 
program, the agency has a mechanism in place—the comparative survey 
process—that it could readily use to systematically assess state 
performance. However, CMS apparently does not intend to do so. Based 
on our discussions with agency officials, it does not appear that CMS' 
approach will yield a consistent evaluation of each state's 
performance. We continue to believe that assessment and routine 
monitoring of each state's efforts should be the cornerstone of CMS' 
review program. As we previously noted, the agency's proposed on-site 
and off-site reviews of MDS assessments are too limited to 
systematically assess MDS accuracy in each state and would consume 
resources that could be devoted to complementing and overseeing 
ongoing state activities. A comprehensive review of the adequacy of 
state MDS accuracy activities, particularly in those states without a 
separate review program, is essential to establish a baseline and to 
allow CMS to more efficiently target additional guidance, training, or 
technical assistance that it acknowledged is necessary. 

CMS did not agree with our recommendation that it should provide 
guidance to states regarding adequate documentation in the medical 
record for each MDS assessment. CMS stated that requiring 
documentation of all MDS items places an unnecessary burden on 
facilities. Skilled reviewers, it stated, should be able to assess the 
accuracy of completed MDS assessments through a combination of medical 
record review, observation, and interviews. CMS further stated that 
requiring duplicative documentation might result in documentation that 
is manufactured and of questionable accuracy. Of course, the potential 
for manufactured data could also be an issue with the MDS, when 
supporting documentation is absent or limited. Without adequate 
documentation, it is unclear whether the nursing home staff 
sufficiently observed the resident to determine his or her care needs 
or merely checked off a box on the assessment form. We continue to 
believe, as do most of the states with separate MDS review programs, 
that requiring documentation for the full MDS assessment is necessary 
to ensure the accuracy of MDS data. In our view, however, this 
documentation need not be duplicative of that which is already in the 
medical record but rather demonstrative of the basis for the higher-
level summary judgments about a resident's condition. Some states have 
already developed tools to accomplish this, and in commenting on a 
draft of this report, two states said that CMS should establish 
documentation requirements for responses on the MDS. In addition, the 
discrepancies cited by the HHS OIG in its studies stemmed from 
inconsistencies between MDS assessments and documentation in 
residents' medical records. The OIG acknowledged that the results of 
its analyses were limited by the information available in the medical 
record—for example, when a facility MDS assessment was based on 
resident observation, the facility may not have documented these 
observations in the medical record. The importance of adequate 
documentation is further reinforced by the fact that using interviews 
and observation to validate MDS assessments may often not be possible, 
particularly for residents who have been discharged from the nursing 
home before an MDS accuracy review. Given the importance of MDS data 
in adjusting nursing home payments and guiding resident care, 
documenting the basis for the MDS assessment—in a way that can be 
independently validated—is critical to achieving its intended purposes. 

CMS provided additional clarifying information that we incorporated as 
appropriate. In addition, the states that commented on the draft 
report generally concurred with our findings and provided technical 
comments that we incorporated as appropriate. 

As agreed with your offices, unless you publicly announce the contents 
of this report earlier, we will not distribute it until 30 days after 
its date. At that time, we will send copies to the administrator of 
CMS; appropriate congressional committees; and other interested 
parties. We will also make copies available to others upon request. 

If you or your staff have any questions, please call me at (202) 512-
7114 or Walter Ochinko at (202) 512-7157. Major contributors to this 
report include Carol Carter, Laura Sutton Elsberg, Leslie Gordon, and 
Sandra Gove. 

Signed by: 

Kathryn G. Allen: 
Director, Health Care—Medicaid and Private Insurance Issues: 

[End of section] 

Appendix I: Summary of State On-Site MDS Reviews As of January 2001: 

State[A]: Iowa; 
Number of nursing homes[B]: 465; 
Year state began: MDS-based payment system/MDS reviews: 2000 
(payment); 2000 (reviews); 
Review combined with nursing home surveys? No; 
Survey findings used in planning MDS reviews? No; 
Reviews done on-site, off-site, or both? Both; 
Frequency of on-site reviews (all facilities unless otherwise 
noted)[C]: Annually; 
Number of MDS assessments reviewed at each facility: At least 25 
percent with a minimum of 5 residents; 
Average time lapse between facility MDS and state review: 90 days; 
Reported importance of interviews/observations versus medical record 
review[D]: Interviews and observations are less important than medical
record review; 
State definition of "error": MDS element not supported by record,
observation, or interview; 
Facility error rate calculated: Yes; 
Examples of remedies and efforts to recoup Medicaid payments[E]: Make 
referrals to state survey agency; conduct additional reviews; provide 
on-site education; 
Accuracy and other trends: During the first 2 quarters of reviews, 
error rate decreased from 32 percent to 22 percent; 
Other features of on-site reviews: State provides voluntary training 
sessions on completing and submitting MDS assessments. State officials 
noted that provider education is a strong focus of their MDS review 
program. 

State[A]: Indiana; 
Number of nursing homes[B]: 562; 
Year state began: MDS-based payment system/MDS reviews: 1998 
(payment); 1998 (reviews); 
Review combined with nursing home surveys? No; 
Survey findings used in planning MDS reviews? No; 
Reviews done on-site, off-site, or both? On-site only; 
Frequency of on-site reviews (all facilities unless otherwise 
noted)[C]: At least every 15 months; 
Number of MDS assessments reviewed at each facility: 40 percent—or no 
fewer than 25 residents; 
Average time lapse between facility MDS and state review: State 
reviews most recent MDS assessment; 
Reported importance of interviews/observations versus medical record 
review[D]: Interviews and observations are as important as medical 
record review; 
State definition of "error": Assessment caused resident to be placed 
in the wrong case-mix category[F]; 
Facility error rate calculated: Yes; 
Examples of remedies and efforts to recoup Medicaid payments[E]: 
Impose financial penalties by reducing the administrative component of a
facility’s Medicaid payment; facility must submit plan and is subject 
to revisit; recalculate case-mix category and Medicaid rates; 
Accuracy and other trends: State officials link decreases in MDS error 
rates to the presence of on-site reviewers and the education of 
providers; 
Other features of on-site reviews: State publishes annual guidelines 
for providers on documentation needed to support MDS data. 

State[A]: Maine; 
Number of nursing homes[B]: 126; 
Year state began: MDS-based payment system/MDS reviews: 1993 
(payment); 1994 (reviews); 
Review combined with nursing home surveys? No; 
Survey findings used in planning MDS reviews? No[G]; 
Reviews done on-site, off-site, or both? Both; 
Frequency of on-site reviews (all facilities unless otherwise 
noted)[C]: Quarterly; 
Number of MDS assessments reviewed at each facility: Minimum of 10 
assessments per facility; 
Average time lapse between facility MDS and state review: 76 days; 
Reported importance of interviews/observations versus medical record 
review[D]: Interviews and observations are as important as medical 
record review[H]; 
State definition of "error": MDS element not supported by record,
observation, or interview[H]; 
Facility error rate calculated: Yes[H]; 
Examples of remedies and efforts to recoup Medicaid payments[E]: 
Conduct more frequent reviews; impose financial penalties;[H] request 
MDS reassessment from facility; 
Accuracy and other trends: While problems continue in some MDS 
elements, others show improvement, such as ADLs; 
Other features of on-site reviews: Reviewers bring portable computers 
to facilities and, using state-designed software, review MDS data. 

State[A]: Mississippi; 
Number of nursing homes[B]: 191; 
Year state began: MDS-based payment system/MDS reviews: 1988 
(payment); 1992 (reviews); 
Review combined with nursing home surveys? No; 
Survey findings used in planning MDS reviews? No; 
Reviews done on-site, off-site, or both? On-site only; 
Frequency of on-site reviews (all facilities unless otherwise 
noted)[C]: Annually; 
Number of MDS assessments reviewed at each facility: At least 20 
percent of residents in facility; 
Average time lapse between facility MDS and state review: 45 days; 
Reported importance of interviews/observations versus medical record 
review[D]: Interviews and observations are as important as medical 
record review; 
State definition of "error": Assessment caused resident to be placed 
in the wrong case-mix category; 
Facility error rate calculated: No; 
Examples of remedies and efforts to recoup Medicaid payments[E]: 
Revisit facilities where problems have been identified; recalculate 
case-mix category and Medicaid rates; 
Accuracy and other trends: Facilities with poor MDS reviews tend to
receive many survey deficiencies; 
Other features of on-site reviews: State published guidelines for 
providers on documentation needed to support MDS data. 

State[A]: Ohio; 
Number of nursing homes[B]: 1,009; 
Year state began: MDS-based payment system/MDS reviews: 1993 
(payment); 1994 (reviews); 
Review combined with nursing home surveys? No; 
Survey findings used in planning MDS reviews? Not usually[I]; 
Reviews done on-site, off-site, or both? Both; 
Frequency of on-site reviews (all facilities unless otherwise 
noted)[C]: Annually[J]; 
Number of MDS assessments reviewed at each facility: Ranging from all 
to 50 residents, based on facility size; 
Average time lapse between facility MDS and state review: State 
reviews most recent MDS assessment for the reporting quarter; 
Reported importance of interviews/observations versus medical record 
review[D]: Interviews and observations are less important than medical
record review; 
State definition of "error": Assessment caused resident to be placed 
in the wrong case-mix category; 
Facility error rate calculated: Yes; 
Examples of remedies and efforts to recoup Medicaid payments[E]: 
Revisit facilities where problems have been identified; recalculate 
case-mix category and Medicaid rates; 
Accuracy and other trends: When the state recalculated case-mix 
categories, the adjusted payments decreased about 99 percent of the time; 
Other features of on-site reviews: State has taken the following 
steps to address MDS errors: training, a Web site, an MDS newsletter, 
and providing the results of MDS reviews. 

State[A]: Pennsylvania; 
Number of nursing homes[B]: 774; 
Year state began: MDS-based payment system/MDS reviews: 1996 (payment); 
1994 (reviews); 
Review combined with nursing home surveys? No; 
Survey findings used in planning MDS reviews? No; 
Reviews done on-site, off-site, or both? On-site only; 
Frequency of on-site reviews (all facilities unless otherwise 
noted)[C]: Annually; 
Number of MDS assessments reviewed at each facility: 15 randomly 
selected residents from assessments actually used in the rate-setting
process; 
Average time lapse between facility MDS and state review: 6-12 months; 
Reported importance of interviews/observations versus medical record 
review[D]: Interviews and observations are less important than medical
record review; 
State definition of "error": Positive MDS element not supported by 
record[K]; 
Facility error rate calculated: Yes; 
Examples of remedies and efforts to recoup Medicaid payments[E]: 
Conduct more frequent reviews; provide training within 1 month; require
corrective action plan; 
Accuracy and other trends: State officials expect that their new MDS 
review process will ultimately lead to a decrease in error rates; 
Other features of on-site reviews: Under the restructured MDS review 
process, facilities are reviewed more frequently, issues are identified 
more quickly, and training is provided almost immediately to nursing 
facility staff. 

State[A]: South Dakota; 
Number of nursing homes[B]: 113; 
Year state began: MDS-based payment system/MDS reviews: 1993 (payment); 
1993 (reviews); 
Review combined with nursing home surveys? No; 
Survey findings used in planning MDS reviews? Not usually[I]; 
Reviews done on-site, off-site, or both? On-site only; 
Frequency of on-site reviews (all facilities unless otherwise 
noted)[C]: Every 15 months; 
Number of MDS assessments reviewed at each facility: At least 25 
percent of residents in facility; 
Average time lapse between facility MDS and state review: 14-30 days; 
Reported importance of interviews/observations versus medical record 
review[D]: Interviews and observations are as important as medical 
record review; 
State definition of "error": Assessment caused resident to be placed 
in the wrong case-mix category; 
Facility error rate calculated: Yes; 
Examples of remedies and efforts to recoup Medicaid payments[E]: 
Revisit facilities where problems have been identified; recalculate
case-mix category and Medicaid rates; 
Accuracy and other trends: Since the state has been reviewing MDS 
data, the error rate has decreased from about 85 percent to 10 percent; 
Other features of on-site reviews: On-site reviews also include 
independent assessments and inter-rater reliability checks. 

State[A]: Vermont; 
Number of nursing homes[B]: 43; 
Year state began: MDS-based payment system/MDS reviews: 1992 (payment); 
1992 (reviews); 
Review combined with nursing home surveys? No, but same staff conduct
reviews and surveys; 
Survey findings used in planning MDS reviews? Not usually[I]; 
Reviews done on-site, off-site, or both? On-site only; 
Frequency of on-site reviews (all facilities unless otherwise 
noted)[C]: At least annually; 
Number of MDS assessments reviewed at each facility: 10 percent
predetermined and/or random sample of all residents in all units; 
Average time lapse between facility MDS and state review: MDS never
older than 90 days; 
Reported importance of interviews/observations versus medical record 
review[D]: Interviews and observations are as important as medical 
record review; 
State definition of "error": MDS element not supported by record,
observation, or interview (effective 10/1/01); 
Facility error rate calculated: Yes (effective 10/1/01); 
Examples of remedies and efforts to recoup Medicaid payments[E]: 
Impose financial penalties (none imposed to date); revisit facilities
where problems have been identified; recalculate case-mix category and 
Medicaid rates; 
Accuracy and other trends: State officials told us that Vermont 
facilities do not have serious MDS accuracy issues; 
Other features of on-site reviews: Vermont tried to combine MDS 
reviews with nursing home surveys, but found that it detracted from the
survey process. 

State[A]: Washington; 
Number of nursing homes[B]: 271; 
Year state began: MDS-based payment system/MDS reviews: 1998 (payment); 
1998 (reviews); 
Review combined with nursing home surveys? No, but staff participate in
surveys about 6 times per year; 
Survey findings used in planning MDS reviews? Yes; 
Reviews done on-site, off-site, or both? Both; 
Frequency of on-site reviews (all facilities unless otherwise 
noted)[C]: Annually (Staff also conduct quarterly quality review 
audits); 
Number of MDS assessments reviewed at each facility: Approximately
20 percent, depending on facility size; 
Average time lapse between facility MDS and state review: 45-60 days; 
Reported importance of interviews/observations versus medical record 
review[D]: Interviews and observations are as important as medical 
record review; 
State definition of "error": Assessment caused resident to be placed 
in the wrong case-mix category; 
Facility error rate calculated: Yes; 
Examples of remedies and efforts to recoup Medicaid payments[E]: 
Impose financial penalties (none imposed to date); revisit facilities
where problems have been identified; recalculate case-mix category and 
Medicaid rates; 
Accuracy and other trends: The types of MDS errors that commonly 
recur relate to misapplication of MDS definitions and may in large 
part be due to facility staff turnover. In commenting on a draft of 
this report, officials told us that these errors are consistent with 
those found in other states with MDS-based payment systems; 
Other features of on-site reviews: State plans to publish the results 
of MDS accuracy reviews on a Web page to prevent simple but recurring 
errors. 

[A] Virginia is not included because of the newness of its MDS review 
program (began operating in April 2001). We have included the nine 
other states with longer standing on-site review programs. 

[B] Source: CMS Nursing Home Compare Web site, [hyperlink, 
http://www.medicare.gov/nhcompare/Search], printed 6/8/01. 

[C] This column reflects the frequency of initial reviews for each 
facility. Some states conduct follow-up reviews more frequently for 
facilities where problems have been identified. 

[D] We asked states to select from the following categories: more 
important, equally important, and less important. 

[E] In addition, all nine states reported that they refer cases of 
suspected fraud to their state's Medicaid Fraud Control Unit. 

[F] Indiana officials added the following language to characterize MDS 
errors: An error occurs when the audit findings are different from the 
facility's transmitted MDS data and those differences result in a 
different case-mix category. 

[G] Survey findings may be used to plan MDS reviews, although this has 
not occurred yet. 

[H] Financial penalties and facility error rates, however, are only 
based on errors that result in changes for a subset of case-mix 
categories. 

[I] Survey findings are occasionally used in planning MDS reviews. 

[J] Staff use risk analysis to select approximately 200 facilities per 
year for on-site reviews. 

[K] Pennsylvania reviews only those MDS elements that have a positive 
response. For example, if a facility responded "no" or left an MDS 
element blank, that item would not be reviewed for accuracy, even if 
it could affect the case-mix category for that particular resident. 

[End of table] 
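
As the table shows, most states calculate a facility error rate, but 
under two different definitions of "error." The following is a 
minimal Python sketch of a rate calculation under each definition; 
the record structure and field names are hypothetical, for 
illustration only. 

def error_rate(reviewed, definition):
    # reviewed: one dict per reviewed MDS assessment, with
    # 'unsupported_elements' (count of elements not supported by the
    # record, observation, or interview) and 'case_mix_changed' (True
    # if correcting the assessment changes the case-mix category).
    if definition == "unsupported_element":
        errors = sum(1 for a in reviewed if a["unsupported_elements"] > 0)
    elif definition == "case_mix_change":
        errors = sum(1 for a in reviewed if a["case_mix_changed"])
    else:
        raise ValueError(definition)
    return errors / len(reviewed)

sample = [
    {"unsupported_elements": 2, "case_mix_changed": True},
    {"unsupported_elements": 1, "case_mix_changed": False},
    {"unsupported_elements": 0, "case_mix_changed": False},
    {"unsupported_elements": 0, "case_mix_changed": False},
]
print(error_rate(sample, "unsupported_element"))  # 0.5
print(error_rate(sample, "case_mix_change"))      # 0.25

Because the case-mix definition counts only payment-relevant errors, 
a facility can have many unsupported elements yet a low error rate 
under that standard, which is one reason rates reported by different 
states are not directly comparable. 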

[End of section] 

Appendix II: Comments from the Centers for Medicare and Medicaid 
Services: 

Department Of Health & Human Services: 
Centers for Medicare & Medicaid Services: 
Administrator: 
Washington, DC 20201: 

Date: January 30, 2002: 

To: Kathryn G. Allen: 
Associate Director: 
Health Care—Medicaid and Private Insurance Issues: 

From: [Signed by] Thomas A. Scully: 
Administrator: 
	
Subject: General Accounting Office Draft Report, Nursing Homes: 
Federal Efforts to Monitor Resident Assessment Data Should Complement 
State Activities (GAO-02-279): 

Thank you for the opportunity to review and comment on the above-
referenced report regarding Federal and state efforts to monitor 
resident assessment data and to ensure the accuracy of the minimum 
data set (MDS). The Centers for Medicare & Medicaid Services (CMS) 
recognizes the coding of items on the MDS impacts reimbursement and 
care planning. It is essential that the assessment data reflect the 
resident's health status, so that the resident may receive the 
appropriate quality care and that providers are reimbursed 
appropriately. 

When automation requirements were implemented in 1998, CMS devoted 
significant resources to the development of an accuracy improvement 
program. We instructed a contractor to develop MDS accuracy review 
protocols. Then we funded a program safeguard contractor, known as the 
Data Assessment and Verification (DAVE) contractor, to audit and 
verify MDS data. 

The CMS also developed and implemented a major MDS system enhancement 
that provided new mechanisms to correct inaccurate information 
residing in the MDS database. Accuracy was significantly improved with 
the addition of this system (e.g., approximately 66 percent reduction 
in the proportion of records in the database containing invalid data 
values). 

We currently evaluate, assess, and monitor the accuracy of the MDS 
through the nursing home survey process. According to Task 5C of 
Appendix P Survey Procedures for Long-Term Care, "after observing and 
talking with the resident, the surveyor conducts a comprehensive 
review which includes the following: a check of specific items on the 
MDS for accurate coding for the resident's condition. The specific 
items to be checked will be based on the Quality Indicators (QIs) 
identified for the resident on the Resident Level Summary. 

"At least 2 of the QIs identified for the resident must be matched 
against the QI definitions and against evidence other than the MDS to 
verify that the resident's condition is accurately recorded in the 
MDS. Keep in mind that you are verifying that the resident's condition 
was accurately assessed at the time the MDS was completed." 

We appreciate the effort that went into this report and the 
opportunity to review and comment on the issues it raises. Our 
comments on the GAO recommendations follow. 

GAO Recommendation: 

With the goal of complementing and leveraging the considerable 
Federal, State, and nursing home resources already devoted to nursing 
home surveys and to separate MDS accuracy review programs, we 
recommend that the Administrator of CMS: 

* Review the adequacy of current state efforts to ensure the accuracy 
of MDS data, and provide, where necessary, additional guidance, 
training or technical assistance. 

CMS Response: 

We agree that assessing the adequacy of state efforts to ensure the 
accuracy of MDS data is an important oversight function. Development 
of analytic tools to monitor and compare State activities is included 
in the DAVE scope of work. The CMS considered alternatives to MDS and 
Outcome and Assessment Information Set (OASIS) accuracy verification 
before deciding on the more centralized focus of the DAVE contract. 
The CMS thinks that this national approach to accuracy is better 
positioned to impact accuracy across all states, recognizing that 
current state efforts vary widely in adequacy and reflect different 
special interests within states. 

The DAVE contract includes many tasks that will evaluate state 
performance related to MDS accuracy and the provision of training and 
technical assistance. During the early phases of the DAVE contract, 
the contractor will review existing data dependent tools
and instruments (e.g., state agency and private sector entities) used 
to monitor MDS data accuracy. This assessment will include how 
methodologies used in existing systems can be blended into their 
review efforts. Further, CMS and the DAVE contractor will have ongoing 
communications with the state agency communities to discuss activities
necessary to support data assessment and verification efforts and to 
share the knowledge gained. 

The DAVE reports can be developed to identify systemic accuracy 
problems within states (i.e., facilities with consistently high 
numbers of residents classified as clinically complex solely due to 
the number of physician orders, high numbers of patients receiving 
ultra high therapy, etc.), as well as across states. 
Establishing national baseline thresholds for MDS and OASIS data and 
applicable associated claims will furnish the national,
state, and provider level evidence we need to address areas of concern 
for CMS: program integrity; beneficiary health and safety; and quality 
improvement. While some states and fiscal intermediaries (FIs) are 
already doing a limited level of this analysis, they lack the
data and staff resources to do an ongoing comprehensive analysis. The 
DAVE contractor can give the state agencies and FIs analytic files. 
The states and FIs can provide feedback on major problem areas at 
provider, state, region, and national levels. The DAVE on-site reviews 
can then complement the state efforts. 

The CMS understands the need for continual state and provider MDS 
training to improve the accuracy of MDS assessments. The CMS funded an 
accuracy protocol development contract. In August 1999, we analyzed 
the findings of this contract and published two sets of questions and 
answers, released in March 2001 and July 2001 on CMS's Web site. The 
questions and answers address the areas of concern identified in 
"Figure 1" of the GAO report. (Note the reference standard data are 
from facilities located in states with existing auditing systems, 
(Pennsylvania, Washington, Ohio). This information was also used as 
the basis for the development of a special MDS 2.0 training session 
for state MDS coordinators that was provided during the July 2001 MDS 
conference. The same information will be used to guide future 
revisions of the MDS instrument. 

GAO Recommendation: 

* Monitor the adequacy of state MDS accuracy activities on an ongoing 
basis, such as through the use of the established federal comparative 
survey process. 

CMS Response: 

We agree that state agency training and oversight functions are 
crucial to ensuring accuracy of MDS data. We believe that, under the 
DAVE contract, we will be able to significantly upgrade our 
capabilities to monitor state agency activity. The DAVE scope of work 
includes reports that CMS can use to evaluate MDS accuracy on state, 
regional, and national levels. These reports will provide CMS with the 
baseline data needed to analyze provider data by state and region. 
Once baselines are established, we can use statistical analysis to 
highlight aberrant coding patterns that impact quality and 
reimbursement. We plan to communicate this information to state 
agencies so they can better focus their training efforts. At the same 
time, we will be able to use the DAVE reports to review the 
effectiveness of state training and oversight activities. 

The CMS monitors state survey agencies' performance in several areas 
to ensure that standards are met and to identify any necessary 
corrective action. Each year we identify the performance areas to be 
included in the evaluation. Since CMS funds states in order to provide 
training and technical assistance to nursing homes, we will consider 
this area for inclusion in future state survey agency performance 
review protocols. We are confident that the state agency performance 
review program will result in a more comprehensive assessment of state 
activities related to MDS accuracy (through training and technical 
assistance) than could be obtained through the comparative survey 
process. 

GAO Recommendation: 
* Provide guidance to state agencies and nursing homes that sufficient 
evidentiary documentation to support the full MDS assessment be 
included in residents' medical records. 

CMS Response: 

We do not agree that duplicative documentation of MDS items is 
necessary or desirable. The MDS, as a clinical assessment, is an 
integral part of the resident's record. The CMS's position is that 
additional documentation for all MDS items creates unnecessary
burden for facilities. There are, however, by exception, just a few MDS 
items for which a second source of documentation is required. 

The MDS, in conjunction with other clinical documentation, provides a 
full view of the beneficiary's clinical course in a given time period. 
Validation of MDS responses generally requires a review of information 
from medical records and other sources. In evaluating assessments, the 
reviewer must be able to exercise clinical judgment in determining the 
plausibility of MDS responses in light of other information in the 
resident's medical record. If we were to require duplicative 
documentation to support comparative reviews, that documentation may 
be manufactured for the sole purpose of satisfying a comparative 
audit, and may be of questionable accuracy. Skilled reviewers/auditors 
are able to assess the accuracy of completed MDS assessments through 
a combination 
of reviews, the comprehensive record, observations, and interviews. 

[End of section] 

Footnotes: 

[1] The Omnibus Budget Reconciliation Act of 1987 required the 
Secretary of Health and Human Services to specify a minimum data set 
of core elements to use in conducting comprehensive assessments of 
patient conditions and care needs. See 42 U.S.C. § 1395i-3; 42 U.S.C. 
§ 1396r. By mid-1991, the requirement to assess and plan for resident 
care had been implemented in all nursing homes that serve Medicare and 
Medicaid beneficiaries. MDS data are collected for all residents in 
these facilities, including Medicare, Medicaid, and private pay 
patients. 

[2] The federal government has responsibility for establishing 
requirements that nursing homes must meet to participate in publicly 
funded programs. Every nursing home that receives Medicare or Medicaid 
funding must undergo a standard survey conducted on average every 12 
months and no less than once every 15 months. Under its contracts with 
states, the federal government funds 100 percent of costs associated 
with certifying that nursing homes meet Medicare requirements and 75 
percent of the costs associated with Medicaid standards. 

[3] See Nursing Homes: Sustained Efforts Are Essential to Realize 
Potential of the Quality Initiatives [hyperlink, 
http://www.gao.gov/products/GAO/HEHS-00-197], Sept. 28, 2000. 

[4] These 10 states are Iowa, Indiana, Maine, Mississippi, Ohio, 
Pennsylvania, South Dakota, Vermont, Washington, and West Virginia. 
Due to the newness of Virginia's MDS review program (implemented in 
April 2001), we focused on the experience of the 10 states with longer 
standing programs. In addition, about one-third of the states without 
separate MDS review programs volunteered additional information 
regarding the ways in which the accuracy of MDS data may be addressed 
through the nursing home survey process or training programs offered 
by the state. 

[5] On June 14, 2001, the Secretary of HHS changed the name of the 
Health Care Financing Administration (HCFA) to CMS. In this report, we 
will continue to refer to HCFA where our findings apply to the 
organizational structure and operations associated with that name. 

[6] For patients with an advanced illness, medical social services 
generally help the patient and family cope with the logistics of daily 
life, including financial and legal planning and mobilizing community 
resources that may be available to the patient. Such services may also 
include counseling the patient and family to address emotions and 
other issues related to the advanced illness. 

[7] To qualify, a Medicare beneficiary must require daily skilled 
nursing or rehabilitative therapy services, generally within 30 days 
of a hospital stay of at least 3 days in length, and must be admitted 
to the nursing home for a condition related to the hospitalization. 

[8] MDS assessments are conducted for all nursing home residents 
within 14 days of admission and at quarterly and yearly intervals 
unless there is a significant change in condition. Accommodating their 
shorter nursing home stays, Medicare beneficiaries in a Medicare-
covered stay are assessed on or before the 5th, 14th, and 30th day of 
their stays and every 30 days thereafter. 
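
A minimal Python sketch of this schedule for a Medicare-covered stay, 
simplified to ignore the "on or before" flexibility and reassessments 
for significant changes in condition: 

def medicare_assessment_days(length_of_stay):
    # Assessments on days 5, 14, and 30, then every 30 days thereafter.
    days = [d for d in (5, 14, 30) if d <= length_of_stay]
    d = 60
    while d <= length_of_stay:
        days.append(d)
        d += 30
    return days

print(medicare_assessment_days(100))  # [5, 14, 30, 60, 90]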

[9] In a recent study, the HHS Office of Inspector General (OIG) 
reported that almost all of the facilities in its study had a position 
of MDS coordinator. Eighty-one percent were registered nurses, and the 
remainder were either licensed practical nurses, licensed vocational 
nurses, or social workers. See HHS OIG, Nursing Home Resident 
Assessment: Quality of Care, OEI-02-99-00040 (Washington, D.C.: HHS, 
Dec. 2000). 

[10] To assess state survey agency performance in fulfilling 
contractual obligations, CMS is required by statute to conduct federal 
oversight surveys in at least 5 percent of the nursing homes in each 
state within 2 months of the state's completion of its survey. CMS 
fulfills this requirement by conducting a combination of (1) 
comparative surveys, in which a federal team independently surveys a 
nursing home recently inspected by a state in order to compare and 
contrast the results, and (2) observational surveys where federal 
surveyors accompany a state survey team to a nursing home to watch the 
conduct of the survey, provide immediate feedback, and later rate the 
team's performance. Comparative surveys offer a more accurate picture 
of the adequacy of state survey activities than do observational 
surveys, which primarily are used to help identify training needs. 
HCFA surveyors found deficiencies that were more serious than those 
identified by state surveyors in about 70 percent of the 157 
comparative surveys they conducted between October 1998 and May 2000. 
See [hyperlink, http://www.gao.gov/products/GAO/HEHS-00-197], Sept. 
28, 2000. 

[11] Quality indicators were developed in a HCFA-funded project at the 
University of Wisconsin. See Center for Health Systems Research and 
Analysis, Facility Guide for the Nursing Home Quality Indicators 
(University of Wisconsin-Madison: Sept. 1999). 

[12] We refer to these states as having "MDS-based payment systems." 

[13] Each nursing home resident has a medical record where information 
about the resident is documented. In addition to the current plan of 
care, examples of medical record documentation include: (1) recent 
physician notes, (2) results of recent tests, and (3) documentation of 
services provided. Nursing home staff use this documentation to 
complete each MDS assessment. Maintaining an adequate level of 
documentation in the medical record improves the ability of staff to 
complete the MDS accurately, particularly for areas that require 
observation over a period of days. Some states assert that determining 
the degree of assistance that a resident requires with ADLs, such as 
bathing, dressing and toileting, requires repeated observation over 
several days, thus increasing the need for documentation. 

[14] CMS' current review of SNF PPS claims is an example of an off-
site documentation review. CMS contracts with fiscal intermediaries to 
process Medicare claims and to conduct reviews that use medical 
records requested from nursing homes to ensure that claims for 
Medicare payments are adequately supported. For fiscal years 2000 and 
2001, such contracts required fiscal intermediaries to review 0.5 
percent and 1 to 3 percent, respectively, of total SNF PPS claims. 

[15] In January 2002, we learned that one of these states—Kentucky—had 
implemented its MDS review program in October 2001. Our analysis, 
however, is based on the 10 programs in operation as of January 2001. 

[16] The District of Columbia is included as one of the 33 states that 
has no plans to implement a separate MDS review program. In this 
report, we generally refer to the District of Columbia as a state. 

[17] Since separate MDS accuracy reviews are associated with states' 
Medicaid programs, the costs can be considered administrative 
expenses. In general, the federal government pays 75 percent of the 
cost for review activities performed by skilled professional medical 
personnel, such as registered nurses, and 50 percent for other 
personnel costs. States are responsible for the remaining costs. 

[18] A few of the 10 states that carry out separate MDS reviews have 
structured their programs to reduce the costs of on-site reviews. For 
example, Ohio uses off-site data analysis to target a subset of 
facilities for further on-site review. However, West Virginia, which 
conducted on-site reviews until 1998, cited a lack of staff as the 
major reason for switching to an off-site-only review approach. 

[19] These 13 states include: Connecticut, Florida, Kansas, Maryland, 
Michigan, Missouri, Montana, North Carolina, Nevada, Oregon, South 
Carolina, Tennessee, and Wisconsin. Because states volunteered this 
information, there may be other states that conduct similar activities 
that provide some assurance of the accuracy of MDS data. 

[20] Two of the 10 states with MDS accuracy programs closely 
coordinate their reviews with state nursing home surveys—Vermont and 
Washington. In Vermont, 12 registered nurses separately conduct both 
the MDS accuracy reviews and nursing home surveys. Vermont officials 
told us that they had previously tried combining these processes but 
decided to separate them because of the heavy workload. In Washington, 
the nurses who conduct nursing home surveys and MDS reviews are 
located in the same department, and therefore coordinate closely by 
sharing reports and other information. The quality assurance nurses 
who conduct the MDS reviews are surveyor trained and participate in 
nursing home surveys about six times per year. Even so, Washington 
officials cited the importance of having a separate MDS review process 
aside from the nursing home surveys. 

[21] Generally, patients classified as clinically complex may have 
conditions such as burns, pneumonia, internal bleeding, or dehydration. 

[22] States with on-site reviews generally define MDS errors as an 
unsupported MDS assessment, or they use a stricter standard of an 
unsupported MDS assessment that results in a change in the resident's 
case-mix category. None of the states identify whether an MDS error 
results in a quality indicator change. 

[23] To strengthen the on-site review process, a few states—Iowa, 
South Dakota, and Vermont—conduct interrater reliability checks and 
one of these states, South Dakota, also conducts independent 
assessments. During an interrater reliability check, two reviewers 
examine the same MDS assessment and medical record separately and 
compare their findings to determine if they are correctly and 
consistently identifying MDS errors. For independent assessments, 
reviewers complete a separate MDS assessment using all of the 
available information at the facility and then compare it to the 
original assessment completed by the facility. In two recent reports, 
the HHS OIG also conducted independent assessments based on medical 
record documentation for 640 residents. See HHS OIG, OEI-02-99-00040, 
Dec. 2000, and Nursing Home Resident Assessment: Resource Utilization 
Groups, OEI-02-99-00041 (Washington, D.C.: HHS, Dec. 2000). 
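
A minimal Python sketch of the comparison behind an interrater 
reliability check as described in this footnote; the element names 
and the simple percent-agreement measure are illustrative assumptions. 

def percent_agreement(reviewer_a, reviewer_b):
    # Each argument maps an MDS element to a reviewer's finding
    # (True if the element was judged to be in error).
    shared = reviewer_a.keys() & reviewer_b.keys()
    agree = sum(1 for k in shared if reviewer_a[k] == reviewer_b[k])
    return agree / len(shared)

a = {"ADL_bathing": True, "short_term_memory": False, "pneumonia": False}
b = {"ADL_bathing": True, "short_term_memory": True, "pneumonia": False}
print(percent_agreement(a, b))  # ~0.67: reviewers disagreed on one element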

[24] The nine states with on-site reviews had different criteria 
regarding when the assessment was too old to use interviews and 
observations as corroborating evidence. For example, one state 
reported that interviews and observations become less useful for an 
MDS assessment completed 14 days prior to the state review, while 
another state cited 180 days. 

[25] Similarly, the HHS OIG acknowledged that its documentation review 
of MDS assessments up to 11 months old did not permit a specific 
determination of why differences occurred, only whether the MDS was 
consistent with the rest of the medical record. See HHS OIG, OEI-02-99-
00041 and OEI-02-99-00040, Dec. 2000. 

[26] We previously reported that the timing of some nursing home 
surveys makes them predictable, allowing facilities to mask certain 
deficiencies if they choose to do so. See [hyperlink, 
http://www.gao.gov/products/GAO/HEHS-00-197], p. 11. 

[27] Nursing rehabilitation and restorative care are interventions 
that assist or promote the resident's ability to attain his or her 
maximum functional potential. Some examples include passive or active 
range of motion movements, amputation care, and splint or brace 
assistance. 

[28] For example, 2 of the 24 quality indicators are based on behavior 
areas assessed in the MDS, such as residents being verbally abusive, 
physically abusive, or showing symptoms of depression. 

[29] At the time of our interviews, three states did not recalculate 
Medicaid payments as a result of errors found during MDS reviews—
Maine, Pennsylvania, and Iowa. 

[30] Although Virginia had not begun its reviews at the time of our 
data collection, state officials told us that they planned to use off-
site data analysis to target approximately 20 facilities—7 percent—
per month for on-site review. 

[31] We recently testified on the problem of nurse and nurse aide 
retention in a range of health care settings, including nursing homes. 
See Nursing Workforce: Recruitment and Retention of Nurses and Nurse 
Aides Is a Growing Concern (GAO-01-750T, May 17, 2001). In addition, 
the HHS OIG recently reported that about 60 percent of MDS 
coordinators had worked 1 year or less in that role at their current 
nursing home and over 65 percent had no prior experience as an MDS 
coordinator. See HHS OIG, OEI-02-99-00040, Dec. 2000. 

[32] HCFA provided guidance in March and July 2001 to facilities 
regarding the completion of MDS assessments. HCFA last published 
similar guidance in August 1996. A few state officials noted the long 
lapse between the publication of the two guides and told us that 
clearer and more timely guidance on MDS definitions was needed. 
However, CMS' Long Term Care Facility Resident Assessment Instrument 
User's Manual, which provides guidance on completing MDS assessments, 
has not been updated since 1995. 

[33] Vermont and Washington also told us that financial penalties are 
an available remedy, but had not imposed them as of early 2001. 

[34] In Maine, facilities are instructed to follow CMS' correction 
policy guidelines for MDS errors that do not result in a case-mix 
category change. In commenting on a draft of this report, CMS noted 
the development and implementation of its policy, which provided a new 
mechanism for facilities to correct inaccurate information in the MDS 
database. This new policy has significantly decreased the ability of 
facilities to submit certain types of inaccurate MDS data, such as 
entering a "5" for a particular MDS element, when the only available 
choices are "1-4." Under this policy, CMS has seen a reduction of 
approximately 66 percent in the proportion of records in the database 
containing invalid data values. 
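
A minimal Python sketch of the kind of range check this policy 
enables, using the example above of a "5" entered where the only 
valid choices are 1 through 4; the element name is hypothetical. 

VALID_RANGES = {"SAMPLE_ELEMENT": range(1, 5)}  # valid codes are 1-4

def is_valid(element, value):
    # Unknown elements default to an empty range and are flagged invalid.
    return value in VALID_RANGES.get(element, range(0))

print(is_valid("SAMPLE_ELEMENT", 3))  # True
print(is_valid("SAMPLE_ELEMENT", 5))  # False: an invalid data value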

[35] Indiana imposes financial penalties if more than 35 percent of a 
facility's MDS assessments have errors. State officials told us that 
very few facilities—roughly 3 to 4 each quarter—have errors that are 
significant enough to trigger financial penalties. 

[36] In Maine, only a subset of these case-mix category changes is 
used to calculate an error rate. 

[37] Pennsylvania reviews only those MDS elements that have a positive 
response. For example, if a facility responded "no" or left an MDS 
element blank, that item would not be reviewed for accuracy, even if 
it could affect the case-mix category for that particular resident. 

[38] CMS refers to the contractor responsible for this program as the 
data assessment and verification (DAVE) contractor. 

[39] Similar to the separate MDS reviews conducted by the states, Abt 
reviewed a subset of MDS items at a sample of nursing homes that met 
certain criteria, e.g., they were important in determining case-mix 
categories or calculating quality indicators or were suspected of 
being underreported. Abt reviewers used information from medical 
records as well as interviews and observations with staff and 
residents to determine whether the selected items on the MDS 
assessments were accurate. 

[40] One off-site approach tested relied on analyzing certain MDS 
"trigger" items, such as pneumonia, that are likely to be in error 
when found in a certain pattern on two consecutive MDS assessments for 
the same resident. Off-site data analysis under this approach could be 
used to identify facilities for on-site review that have a high 
proportion of residents shown as having pneumonia—one potential 
trigger item—across two or more MDS assessments. 
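
A minimal Python sketch of this trigger-item targeting, assuming a 
simple flat data layout and an arbitrary 20 percent threshold, 
neither of which is specified in the source. 

from collections import defaultdict

def flag_facilities(assessments, threshold=0.20):
    # assessments: (facility_id, resident_id, seq, has_pneumonia)
    # tuples, where seq orders a resident's consecutive assessments.
    by_resident = defaultdict(list)
    for fac, res, seq, pneu in assessments:
        by_resident[(fac, res)].append((seq, pneu))

    hits, residents = defaultdict(int), defaultdict(set)
    for (fac, res), recs in by_resident.items():
        recs.sort()
        residents[fac].add(res)
        # Count residents coded with pneumonia on two consecutive assessments.
        if any(a[1] and b[1] for a, b in zip(recs, recs[1:])):
            hits[fac] += 1

    return [fac for fac in residents
            if hits[fac] / len(residents[fac]) >= threshold]

data = [("F1", "r1", 1, True), ("F1", "r1", 2, True),
        ("F1", "r2", 1, False), ("F2", "r3", 1, True),
        ("F2", "r3", 2, False)]
print(flag_facilities(data))  # ['F1']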

[41] The nurses conducted assessments over several days and shifts 
using all available documentation—medical record reviews, interviews, 
and observations—to replicate as closely as possible the observation 
period the facility used to make its assessments of those same 
residents. Because Abt found too few assessments meeting its original 
criteria—completed by the facility up to 14 days prior to the visits—
it augmented its sample with assessments that were up to 35 days old. 

[42] Similar to Abt, the HHS OIG concluded that differences found 
between MDS assessments and the supporting documentation indicated 
confusion or difficulties with the MDS assessment instrument and the 
need for enhanced training. The HHS OIG found differences in 76 
percent of the Medicare assessments reviewed. ADLs and the number of 
minutes recorded for therapy, specifically occupational and physical 
therapy, provided the greatest source of differences. 

[43] Program safeguard contractors were authorized by the Health 
Insurance Portability and Accountability Act of 1996, which allowed 
HCFA to contract with specialized entities to identify program 
integrity concerns. See 42 U.S.C. § 1395ddd. In May 1999, HCFA 
selected a pool of 12 contractors that can bid on proposed contracts 
covering these types of activities. See Medicare: Opportunities and 
Challenges in Contracting for Program Safeguards (GAO-01-616, May 18, 
2001). 

[44] Although the contractor will first focus on conducting MDS 
accuracy activities, the contractor is also required to establish a 
review program for the Outcome and Assessment Information Set (OASIS), 
the data used as the basis for home health payments and quality 
measures. 

[45] The reviews would encompass assessments from all payer sources. 
According to CMS, the number of assessments to be reviewed is a target 
that is subject to change. 

[46] CMS refers to the contractor responsible for this program as the 
DAVE contractor. 

[End of section] 

GAO’s Mission: 

The General Accounting Office, the investigative arm of Congress, 
exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and 
accountability of the federal government for the American people. GAO 
examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO’s commitment to good government is reflected in its 
core values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO’s Web site [hyperlink, 
http://www.gao.gov] contains abstracts and full text files of current 
reports and testimony and an expanding archive of older products. The 
Web site features a search engine to help you locate documents using 
key words and phrases. You can print these documents in their 
entirety, including charts and other graphics. 

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as “Today’s Reports,” on 
its Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
[hyperlink, http://www.gao.gov] and select “Subscribe to daily E-mail 
alert for newly released products” under the GAO Reports heading. 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are 
$2 each. A check or money order should be made out to the 
Superintendent of Documents. GAO also accepts VISA and Mastercard. 
Orders for 100 or more copies mailed to a single address are 
discounted 25 percent. Orders should be sent to: 

U.S. General Accounting Office: 441 G Street NW, Room LM: 
Washington, D.C. 20548: 

To order by Phone: 
Voice: (202) 512-6000: 
TDD: (202) 512-2537: 
Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Public Affairs: 

Jeff Nelligan, managing director, 
NelliganJ@gao.gov: 
(202) 512-4800: 
U.S. General Accounting Office: 
441 G Street NW, Room 7149:
Washington, D.C. 20548: