This is the accessible text file for GAO report number GAO-05-40 
entitled 'Aviation Safety: FAA Needs to Strengthen the Management of 
Its Designee Programs' which was released on November 16, 2004.

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately.

Report to the Ranking Democratic Member, Subcommittee on Aviation, 
Committee on Transportation and Infrastructure, House of 
Representatives:

October 2004:

AVIATION SAFETY:

FAA Needs to Strengthen the Management of Its Designee Programs:

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-40]:

GAO Highlights:

Highlights of GAO-05-40, a report to the Ranking Democratic Member, 
Subcommittee on Aviation, House Committee on Transportation and 
Infrastructure: 

Why GAO Did This Study:

The safety of the flying public and the reliability of the nation’s 
aircraft depend, in part, on the Federal Aviation Administration’s 
(FAA) regulation and certification of the aviation industry. FAA 
delegates the vast majority of its safety certification activities to 
about 13,600 private persons and organizations, known as “designees,” 
which are currently grouped into 18 different programs. Among other 
tasks, designees perform physical examinations to ensure that pilots 
are medically fit to fly and examine the airworthiness of aircraft. 

GAO reviewed (1) the strengths of FAA’s designee programs, (2) the 
weaknesses of those programs and factors contributing to those 
weaknesses, and (3) potential improvements to the programs.

What GAO Found:

The key strength of FAA’s designee programs is their ability to 
leverage agency resources. Allowing technically qualified individuals 
and organizations to perform 90 percent of certification activities 
enables FAA to better concentrate its limited staff resources on the 
most safety-critical functions, such as certifying new and complex
aircraft designs. For the aviation industry, designee programs enable 
individuals and companies to obtain required FAA certifications—such as 
approvals of aircraft designs—in a timely manner, thus reducing delays 
and costs to industry that might result from scheduling direct reviews 
by FAA. For example, officials from Boeing told us that using designees 
has added significantly to the company’s ability to improve daily 
operations by decreasing certification time.

Inconsistent FAA oversight and application of program policies are key 
weaknesses of the designee programs. FAA headquarters has evaluated 
only 6 of the 18 designee programs over the last 7 years. FAA conducted 
the evaluations on an ad hoc basis and lacks requirements or criteria 
for periodically evaluating these programs. FAA uses these evaluations 
to determine whether designee programs are complying with agency 
policies. In addition, FAA field offices do not always oversee designee 
activities according to agency policy. For example, a recent FAA study 
found that inspectors were not reviewing designated pilot examiners’ 
work on an annual basis as policy requires. Potential reasons for 
inconsistent oversight include (1) incomplete databases that FAA uses 
to manage its oversight of designees, (2) workload demands for FAA 
staff that limit the time spent on designee oversight, and (3) the lack 
of adequate training for FAA staff who oversee designees. While we did 
not find a direct link between inconsistent oversight of these programs 
and specific safety problems, the lack of consistent oversight limits 
FAA’s assurance that designees perform their work according to federal 
standards.

Opportunities exist for FAA to improve (1) program oversight to ensure 
consistent compliance with existing policies by FAA staff and (2) the 
completeness of databases used in designee oversight. For example, FAA 
could evaluate more of its field offices and designees—efforts modeled 
partly on the assessments conducted by some FAA regional offices—to 
ascertain the extent to which policies are being followed. 

Aircraft Undergoing Certification at Organizational Designee Facility: 

[See PDF for image]

[End of figure]

What GAO Recommends:

GAO recommends that FAA: (1) establish a program to evaluate all 
designee programs, giving priority to those programs that have not been 
evaluated, (2) develop mechanisms to improve compliance with existing 
designee oversight policies, and (3) upgrade its databases to provide 
complete and consistent information on all designee programs and the 
extent to which oversight is occurring.

FAA officials generally agreed with our recommendations, but expressed 
concerns about our use of an expert panel to identify weaknesses in the 
programs.

www.gao.gov/cgi-bin/getrpt?GAO-05-40.

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact JayEtta Z. Hecker, (202) 
512-2834, heckerj@gao.gov.

[End of section]

Contents:

Letter:

Results in Brief:

Background:

Designee Programs Leverage FAA Resources and Provide Industry with 
Timely Certification Reviews:

FAA's Lack of Consistent Oversight of Designee Programs Is Affected by 
Incomplete Data, Workload Demands, and Lack of Training:

FAA Has Potential Opportunities to Improve Designee Programs:

Conclusions:

Recommendations for Executive Action:

Agency Comments:

Appendixes:

Appendix I: Objectives, Scope, and Methodology:

Appendix II: Experts Participating on GAO's Panel:

Appendix III: Roles and Responsibilities of Designees:

Appendix IV: Survey Instrument and Results:

Appendix V: GAO Contacts and Staff Acknowledgments:

GAO Contacts:

Staff Acknowledgments:

Bibliography:

Tables:

Table 1: Comparison of Designee Programs Administered by Three FAA 
Offices:

Table 2: Experts' Ranking of Top Strengths of the Designee Programs:

Table 3: Experts' Ranking of Top 5 Oversight Weaknesses:

Table 4: Experts' Ranking of Top Ways to Improve FAA's Designee 
Programs:

Table 5: Organizations Interviewed by GAO During Site Visits:

Table 6: The Number of Panelists Participating in Each Phase and 
Response Rate:

Table 7: Experts' Responses to GAO's Survey:

Figures:

Figure 1: FAA Offices That Manage the Different Designee Programs and 
Numbers of Designees (as of May 2004):

Figure 2: Designees Support FAA Throughout the United States:

Abbreviations: 

AME: Aviation Medical Examiner:

DAS: Designated Alteration Station:

DER: Designated Engineering Representative:

DOT: Department of Transportation:

DPE: Designated Pilot Examiner:

FAA: Federal Aviation Administration:

GAO: Government Accountability Office:

NVIS: National Vital Information Subsystem:

ODA: organization designation authorization:

PTRS: Program Tracking and Reporting Subsystem:

Letter October 8, 2004:

The Honorable Peter A. DeFazio: 
Ranking Democratic Member: 
Subcommittee on Aviation: 
Committee on Transportation and Infrastructure: 
House of Representatives:

Dear Mr. DeFazio:

The safety of the flying public and the reliability of the nation's 
aircraft depend, in part, on the Federal Aviation Administration's 
(FAA) regulation and certification of the aviation industry. Although 
FAA staff perform many activities crucial to maintaining the safety of 
air transportation, since the 1920s, FAA has depended on 
congressionally authorized designee programs to help the agency ensure 
that the aviation industry meets certain safety standards. FAA's 
designee programs authorize about 13,400 private individuals and about 
180 organizations nationwide, known as "designees," to act as 
representatives of the agency to conduct many safety certification 
activities, such as administering flight tests to pilots, inspecting 
repair work by maintenance facilities, conducting medical examinations 
of pilots, and approving designs for aircraft parts. These designees 
are currently grouped into 18 different programs and are overseen by 
three FAA offices--Flight Standards Service, Aerospace Medicine, and 
Aircraft Certification Service--all of which are under the Office of 
the Associate Administrator for Regulation and Certification. Given the 
vastness of the U.S. aviation industry, designees enable FAA to carry 
out thousands of certification functions each year. FAA staff[Footnote 
1] are responsible for overseeing the work of individual designees and 
ensuring that organizational designees (also referred to as 
"delegations")--companies such as repair stations that have been 
delegated the authority to perform inspections of aircraft that have 
undergone major repairs--have systems in place, including staff and 
procedures, to perform the delegated functions. Organizational 
designees are responsible for overseeing their employees who perform 
the delegated functions. Based, in part, on congressional direction, 
FAA plans to change its designee programs within the next several years 
so that the agency can rely more on organizational rather than 
individual designees.

In response to your request, this report addresses the following 
questions: (1) What are the strengths of FAA's designee programs? (2) 
What are the weaknesses of the programs and the factors that contribute 
to those weaknesses? and (3) What can be done to address the identified 
weaknesses or otherwise improve the programs?

To address these questions, we obtained and analyzed information from a 
variety of sources. We identified 62 aviation experts with knowledge 
and expertise in FAA's designee programs, who participated on a Web-
based panel that provided the group's views on the strengths and 
weaknesses of the designee programs and ways to improve the programs. 
An initial list of experts was identified through referrals by FAA 
officials, the National Air Traffic Controllers Association, the 
Professional Airway System Specialists, and the Aerospace Repair 
Station Association and through citations in the literature on 
aviation. We then asked these initially identified experts for 
additional experts. We continued this process until we had about 10 to 
20 experts in each of four categories: (1) designees, (2) FAA 
inspectors and engineers, (3) independent experts and university 
academics, and (4) private sector and aviation industry associations. 
We obtained the experts' views through an iterative, controlled 
feedback process that gathered individual views and then allowed each 
participant to respond to the entire panel's comments.

In addition, we obtained and analyzed information from FAA databases 
that maintain records on designees for fiscal years 1998 through 2003. 
We assessed the reliability of the databases and found the data 
sufficiently reliable for the types of analyses that we conducted for 
this report--including nationwide analyses of the number of designees 
by program, the geographical location of designees, and the number of 
designees per FAA staff responsible for designee oversight. However, 
some databases lacked specific data needed for oversight, as we 
discuss later in this report. We also conducted semi-
structured interviews with FAA officials, representatives of FAA 
inspectors and engineers who oversee designees, and designees in 
Seattle, Atlanta, Los Angeles, and Oklahoma City to obtain information 
on FAA's oversight of designees. We also interviewed officials from 
Transport Canada (the Canadian civil aviation authority) to obtain 
descriptive information on its designee programs. In addition, we 
reviewed past studies of FAA's use of designees by us, the Department 
of Transportation's (DOT) Office of Inspector General, and others. We 
conducted our work from April 2003 through October 2004 in accordance 
with generally accepted government auditing standards. Additional 
information on our methodology and on the experts who participated on 
our panel is found in appendixes I and II.

Results in Brief:

Designees perform more than 90 percent of FAA's certification 
activities, thus greatly leveraging the agency's resources. FAA 
officials believe that, by permitting nearly 13,400 technically 
qualified individuals and about 180 organizations to perform thousands 
of certification tasks each year, the designee programs allow the 
agency to concentrate on what it considers to be the most critical safety 
designees conduct routine certification functions, such as approvals of 
aircraft technologies that the agency and designees have had previous 
experience with, FAA focuses on new and complex aircraft designs or 
design changes. In addition, the use of designees expands FAA's access 
to technical expertise within the aviation community. For the aviation 
industry, the designee programs enable individuals and organizations to 
obtain required FAA certifications--such as approvals of the design, 
production, and airworthiness of aircraft--in a timely manner, thus 
reducing delays and costs to the industry that might result from 
scheduling direct reviews by FAA staff. For example, officials from an 
aircraft manufacturer told us that the use of designees has added 
significantly to the company's ability to improve daily 
operations by decreasing certification delivery time and increasing the 
flexibility and utilization of company resources. In addition, 
designees are convenient to the aviation industry due to their wide 
dispersal throughout the United States.

FAA's inconsistent monitoring of its designee programs and oversight of 
its designees are key weaknesses of the programs. For example, while 
FAA has evaluated 6 of its 18 designee programs over the last 7 years 
and has plans to evaluate 2 more, it has no plans to evaluate the 
remaining programs because of limited resources, according to a program 
official. FAA conducted these evaluations on an ad hoc basis, usually at 
the request of FAA headquarters directors or regional office managers. 
The agency does not have requirements or criteria for periodically 
evaluating these programs. FAA uses these evaluations to determine 
whether designee programs are being carried out in compliance with 
agency policies. However, FAA has not implemented some recommendations 
from these evaluations. For example, a 2000 evaluation of designated 
alteration stations recommended that FAA establish a process to 
periodically assess the effectiveness and applicability of existing 
oversight policies concerning designated alteration stations and 
consider feedback from FAA field offices and designees as part of that 
process. The agency has not implemented this recommendation. In 
addition, we found that FAA field offices do not consistently implement 
agency policies on monitoring, selecting, and terminating designees. 
For example, inspectors in one region were not reviewing designated 
pilot examiners' work on an annual basis or conducting oversight as 
required by agency policy. The primary goal of FAA's standards and 
policies, and its oversight of designees, is the safety of U.S. 
aviation. While we did not find systematic safety problems associated 
with FAA's oversight of designees, the agency's inconsistent oversight 
limits its assurance that the designees' work is performed uniformly in 
accordance with those standards and policies. Finally, we identified 
several factors that may have hindered FAA's ability to systematically 
monitor the designee programs and consistently apply designee oversight 
policies. First, FAA's oversight is hampered, in part, by the limited 
usefulness of some agency databases that are designed to capture 
information on designees. While all the databases have descriptive 
information on designees, such as their types of designations and 
status (i.e., active or terminated), the databases lack complete 
and consistent information on designees' performance and do not provide 
a comprehensive picture of whether FAA staff are carrying out their 
responsibilities to oversee designees. Second, the workload demands on 
FAA staff may limit the time they spend on designee oversight. Finally, 
FAA does not require refresher training for all staff who oversee 
designees, thereby increasing the risk that some staff do not retain 
the information, skills, and competencies required to perform their 
oversight responsibilities.

Opportunities exist for FAA to address these weaknesses by improving 
(1) oversight of the designee programs to ensure consistent compliance 
with existing policies by FAA inspectors, engineers, and flight 
surgeons and (2) the accuracy and comprehensiveness of computerized 
information on designees so that the databases can be more useful tools 
for designee oversight. Those opportunities were identified by experts 
on our panel and by our review of practices within FAA and procedures 
adopted by other countries in administering their programs. For 
example, FAA could more consistently conduct internal evaluations of 
its field offices and designee programs--evaluations modeled in part on 
the assessments performed by some regional and program offices--to 
ascertain the extent to which its policies and procedures are being 
followed. FAA's internal review of designated pilot examiners in one 
regional office could provide a model for evaluations that could be 
performed by other FAA regions and for other designee programs. The 
review, which was based on (1) a comprehensive statistical analysis of 
designee activity in the region, (2) a survey of pilots who were tested 
by those designees, and (3) audits of designee files and surveillance 
reports by FAA inspectors, provided a reasonable method to assess 
program outcomes, identify the root causes of the lack of compliance 
with agency policy, and develop corrective action plans to address the 
root causes. Accurate, comprehensive data on FAA oversight and designee 
activities are integral to monitoring and evaluating the programs. The 
database used by FAA's Office of Aerospace Medicine to monitor the 
activities and performance of aviation medical examiners provides 
information and capabilities that could serve as a model for the other offices-
-Flight Standards Service and Aircraft Certification Service--that lack 
comprehensive databases on designee activities. Although this database 
was designed to simplify the processing of airmen medical certification 
information, Aerospace Medicine uses it to extract information on the 
status of aviation medical examiners and monitor their activity levels. 
Careful consideration of such opportunities is important both because 
of the central importance that the designee programs hold for FAA and 
because of the agency's plans to expand the use of organizational 
designees, which will further transform FAA's role to that of 
monitoring the performance of organizations rather than overseeing the 
individuals who perform the certification activities. Transport Canada, 
which expanded its use of organizational designees in the late 1980s, 
identified the establishment of standardized oversight practices and 
frequent audits of Canadian designees as important components of its 
programs.

To improve management control of the designee programs, and thus 
increase assurance that designees meet FAA's performance standards, we 
recommend that the Secretary of Transportation direct the FAA 
Administrator to establish a program to evaluate all designee programs, 
giving priority to those programs that have not been evaluated, and 
develop mechanisms to more consistently monitor and improve compliance 
with existing designee oversight policies, including identifying and 
sharing best practices among FAA programs and field offices. We also 
recommend that FAA strengthen the effectiveness of its designee 
databases by improving the consistency and completeness of information 
on designees' activities and performance and on FAA oversight. FAA 
officials generally agreed with these recommendations. However, the 
agency expressed concerns about our methodology for obtaining expert 
opinions of the designee programs. Further information is provided in 
the "Agency Comments" section of this report.

Background:

FAA has relied on designee programs since the 1920s to help the agency 
meet its responsibility for ensuring that the aviation industry meets 
FAA's safety standards.[Footnote 2] The programs authorize private 
persons and organizations, known as individual and organizational 
designees, respectively, to act on behalf of the agency to perform many 
activities to ensure the safety of air transportation. Of the nearly 
13,600 designees nationwide, approximately 13,400 are individual 
designees and about 180 are organizational designees, as of May 2004. 
These designees are grouped into 18 different programs and are overseen 
by three FAA offices--Flight Standards Service, Aerospace Medicine, and 
Aircraft Certification Service--all of which are under the Office of 
the Associate Administrator for Regulation and Certification. Figure 1 
shows the 18 different designee programs, the number of designees, and 
the FAA offices that manage them.

Figure 1: FAA Offices That Manage the Different Designee Programs and 
Numbers of Designees (as of May 2004):

[See PDF for image] 

[End of figure] 

Designees perform a large percentage of certification activities on 
behalf of FAA, such as determining whether aircraft designs, 
manufacturing, and maintenance meet specific safety standards and 
certifying the competency of persons who operate aircraft. FAA policy 
calls for the agency to delegate activities by evaluating the risk 
involved with such delegation; assessing whether the aviation industry 
has the experience to perform designated tasks; and delegating 
activities with defined standards, processes, and oversight procedures. 
FAA policy also states that some tasks are not delegated. For example, 
FAA does not permit designees to make rules, conduct surveillance or 
enforcement activities against aircraft manufacturers and airlines, or 
issue and modify aircraft type and production certificates.

Individual and organizational designees' roles and responsibilities 
vary according to program. For example, individual designees, such as 
engineering designees, evaluate whether aircraft designs meet FAA 
safety standards, designated mechanic examiners administer practical 
tests to mechanic applicants, designated pilot examiners administer 
practical tests to pilot applicants, and aviation medical examiners 
certify that pilots are medically fit to operate aircraft. Most 
individual designees can charge service fees to applicants. Most 
organizational designees perform activities similar to those of 
individual designees, but the organization holds the designation rather 
than the employees who work for it.[Footnote 3] The organization is 
responsible for managing, overseeing, and training its employees who 
perform the delegated functions. Organizational designees must develop 
procedures manuals that describe how the organizations will comply with 
FAA requirements and describe their internal evaluation processes, 
including internal auditing procedures. An example of an organizational 
designee is a designated alteration station, which is a company that 
can issue supplemental type certificates, which are required for 
aircraft that have been modified from their original design. Further 
information on the roles and responsibilities of the various types of 
designees is presented in appendix III.

FAA policy calls for selecting and appointing designees based on 
several factors, including designees' experience and qualifications, 
FAA field or program offices' ability to oversee designees, and the 
need for particular types of designees. Although the selection and 
appointment policies and procedures differ somewhat for different 
designee types, these policies generally call for specific and thorough 
technical reviews of the designee applicants' qualifications, including 
verifying the applicants' work experience, testing the applicants' 
knowledge and skills, and examining on-the-job performance. According 
to FAA policy, FAA officials or flight surgeons evaluate the 
applicants' experience and qualifications and determine whether to 
appoint or deny the applicant's request for designation.

FAA's field and program offices are responsible for supervising, 
monitoring, and tracking designees' activities to ensure that designees 
are performing their authorized functions in accordance with the 
appropriate regulations, policies, and procedures. FAA policy states 
that its inspectors, engineers, and flight surgeons should ensure the 
integrity of the designee programs by evaluating designee performance, 
interacting with designees on a regular basis, and evaluating technical 
data prepared by designees. For instance, FAA inspectors are expected 
to oversee designated pilot examiners by verifying their attendance at 
required training seminars and meetings, ensuring that they have 
developed and implemented a plan of action for the practical tests they 
conduct on pilot applicants, observing annually at least one practical 
test administered to a pilot applicant, and verifying that the designee 
has sufficient work activity to justify continuance of the designation. 
By comparison, FAA inspectors and engineers are expected to oversee 
organizational designees by ensuring that the organizations' procedures 
manuals comply with FAA policies on approving the design, production, 
and airworthiness of aircraft and assessing the technical capabilities 
of the organization. In addition, FAA officials are expected to provide 
guidance and oversight of organizational designees by participating in 
many aspects of major approvals. For instance, FAA officials provide 
guidance and oversight for projects involving new aircraft design 
concepts and technology.

Most designees' appointments are effective for 1 year, with the 
exception of individual and organizational designated airworthiness 
representatives, who are appointed for up to 5 years, and all other 
types of organizational designees, whose appointments do not expire. 
FAA can terminate designees for various reasons, including insufficient 
work activity, unacceptable performance, lapse of qualifications, and 
lack of FAA need or ability to manage them. Designees can generally 
appeal FAA's decision to terminate them, except when the decision to 
terminate has been based on FAA's lack of resources to manage them. 
Table 1 compares aspects of designee oversight, including how designees 
are selected and terminated, among the three FAA program offices with 
designee responsibilities.

Table 1: Comparison of Designee Programs Administered by Three FAA 
Offices:

Program areas: Designee selection; 
Office of Aircraft Certification Service: Local FAA panel reviews 
designee applicants' qualifications and makes appointment; 
Office of Flight Standards Service: National selection board (National 
Examiner Board) reviews designee applicants' qualifications and creates 
a list of qualified candidates; Field office managers make appointment 
from the list of qualified candidates; 
Office of Aerospace Medicine: FAA regional flight surgeons review the 
qualifications of designee applicants and make appointments.

Program areas: Designee oversight; 
Office of Aircraft Certification Service: FAA inspectors or engineers 
are required to annually witness the performance of designees; FAA is 
required to conduct a technical evaluation and an Aircraft 
Certification Systems Evaluation Program[A] evaluation of delegated 
organizations every 2 years; Organizational designees are required to 
perform and document self-evaluation activities; 
Office of Flight Standards Service: FAA inspectors are required to 
conduct annual surveillance of most designees; Organizational designees 
are required to perform and document self-evaluation activities; 
Office of Aerospace Medicine: FAA regional flight surgeons are not 
required to conduct site visits of designees, but are required to 
assess designee performance in order to renew authorizations.

Program areas: Database used to monitor designees; 
Office of Aircraft Certification Service: Designee Information Network; 
Office of Flight Standards Service: Program Tracking and Reporting 
Subsystem and National Vital Information Subsystem; 
Office of Aerospace Medicine: Airmen Medical Certification Information 
Subsystem.

Program areas: Training for designees and FAA staff who oversee 
designees; 
Office of Aircraft Certification Service: Designees are required to 
attend initial indoctrination and refresher training every 2 years; FAA 
staff are required to attend initial training in areas of 
specialization and take the Delegation Management Course. Refresher 
training is not required for staff; Organizational designees are 
responsible for training authorized representatives who perform 
delegated functions.[B]; 
Office of Flight Standards Service: Designees are required to attend 
initial indoctrination and refresher training every 2 years; FAA staff 
are required to attend initial training in areas of specialization. A 
specific training course on designee oversight has not been developed. 
Refresher training is not required for staff; Organizational designees 
are responsible for training authorized representatives who perform 
delegated functions.[B]; 
Office of Aerospace Medicine: Designees and FAA staff are required to 
attend initial indoctrination and refresher training every 3 years.

Program areas: Termination of designees; 
Office of Aircraft Certification Service: Field office managers 
terminate designees; 
Office of Flight Standards Service: Field office managers terminate 
designees; 
Office of Aerospace Medicine: Regional flight surgeons terminate 
designees. 

Source: GAO analysis of FAA information.

[A] Aircraft Certification Systems Evaluation Program evaluations were 
designed to determine if FAA-delegated facilities are complying with 
the requirements of applicable federal regulations and the procedures 
established to meet those requirements.

[B] Training covers such areas as functions delegated to the 
authorization, the organization's processes and procedures, and FAA 
policy and guidance material.

[End of table]

FAA has proposed expanding the number of organizational designees and 
reducing the number of individual designees by creating an organization 
designation authorization (ODA) program. The ODA program would allow 
FAA to expand and standardize the approval functions of organizational 
designees and expand eligibility for organizational designees, 
including organizations not eligible under current FAA rules. 
Organizational designees under the current programs would be phased out 
during the first 3 years of implementing the new program and would be 
expected to reapply for an ODA. FAA 
issued a Notice of Proposed Rulemaking for the ODA program in January 
2004. FAA has received many comments in opposition to the proposed 
program, including several raising concerns that it would provide less 
specific and less technical oversight by FAA and would, over time, 
reduce the safety of the flying public. However, FAA has also received 
comments stating that the proposed program would improve the 
effectiveness of the agency's oversight of designees.

In addition, FAA has been mandated to develop and implement a certified 
design organization program, which would affect some designees 
currently responsible for approving the design and production of 
aircraft, and aircraft parts and equipment.[Footnote 4] Under this 
program, certain organizational designees that design and produce 
aircraft parts and equipment would no longer be designees; rather, they 
would conduct their approval functions under a newly created FAA 
certificate. As a certificate holder, the certified design 
organizations would be subject to more formal processes when FAA grants 
or revokes the certificate. FAA would develop those processes as part 
of its requirement to develop a plan to implement a certified design 
organization program by 2007.

Designee Programs Leverage FAA Resources and Provide Industry with 
Timely Certification Reviews:

Designees perform more than 90 percent of FAA's certification 
activities, thus greatly leveraging the agency's resources and enabling 
staff to concentrate on other areas of aviation safety, according to 
our panel of experts, FAA and industry officials, and FAA staff who 
oversee designees. The approximately 13,600 designees augment FAA's 
workforce of about 4,100 inspection staff who are responsible for 
ensuring industry's adherence to FAA regulations. According to FAA 
officials, designees are crucial to the certification process because 
they conduct routine activities, thereby allowing the agency to target 
its direct involvement to the most critical certification functions. 
For example, designated airworthiness representatives and designated 
manufacturing inspection representatives routinely support company 
efforts to perform design enhancements by conducting design conformity 
inspections in accordance with established procedures, while FAA's 
Aircraft Certification Service focuses on new and complex aircraft 
designs or design changes. This information is consistent with the 
strengths of FAA's designee programs identified by our expert 
panel. Table 2 shows the top five strengths identified by our expert 
panel. There was considerable agreement among the experts on these 
strengths. All were identified as a "great" or "very great" strength of 
the designee programs by most of the panelists. No more than 2 of the 
62 participating experts felt that these strengths had "no" importance 
toward accomplishing FAA's safety responsibilities. (See app. IV for 
additional strengths identified by our expert panel.)

Table 2: Experts' Ranking of Top Strengths of the Designee Programs:

Ranking: 1; 
Strength: Use of designees expands available FAA resources.

Ranking: 2; 
Strength: Use of designees allows for more timely approvals than would 
be possible without designees.

Ranking: 3; 
Strength: Use of designees expands available technical expertise and 
specialization.

Ranking: 4; 
Strength: Designees provide greater scheduling flexibility and access 
to the public.

Ranking: 5; 
Strength: Use of designees enables FAA staff to concentrate on other 
areas of aviation safety.

Source: GAO analysis of expert panel information.

Note: Rankings based on responses from 62 experts and the frequency of 
responses indicating a "great" or "very great" strength.

[End of table]

According to all of the private industry experts on our panel and many 
of the other panelists, the use of designees allows the aviation 
industry and others to obtain more timely approvals and issuance of 
aircraft certifications than would be possible if FAA staff were solely 
responsible for those tasks. The designee programs provide more timely 
service to the aviation industry, while assuring the airworthiness of 
aeronautical products by utilizing aviation industry expertise to 
perform many certification activities under the oversight of FAA, 
according to agency officials. In addition, the designee programs 
provide the industry with greater scheduling flexibility and access to 
aviation safety-related services, such as access to aircraft and pilot 
certification services. For example, Boeing officials told us that the 
use of designees has added significantly to the company's ability to 
improve daily operations by providing consistent 
certification processes, decreasing certification delivery time, and 
increasing the flexibility and utilization of Boeing resources, which 
could reduce costs. Many experts on our panel also concurred that the 
designee programs are convenient to the aviation industry, as aviation 
organizations are able to control their production deadlines and not 
depend on FAA's schedule for certification and approval. Figure 2 shows 
the geographic distribution of designees and their wide dispersal 
throughout the United States.

Figure 2: Designees Support FAA Throughout the United States:

[See PDF for image] 

[End of figure] 

Additionally, the use of designees expands FAA's access to technical 
expertise within the aviation community, as many designees are industry 
experts. Forty-six of the 62 experts on our panel thought this was a 
"great" or "very great" strength of the designee programs, including 
all of the experts from the aviation industry. For example, designated 
engineering representatives review thousands of calculations, tests, 
and data involved in aircraft designs on behalf of the agency to 
ensure compliance with FAA regulations. Other designees, such as 
designated manufacturing inspection representatives and designated 
airworthiness representatives, are technical experts in the "production 
conformity"[Footnote 5] or inspection of certain aircraft products or 
parts and issue certificates or approvals for engines, propellers, and 
other aircraft parts. Still other designees are aviation medical 
examiners--physicians who have been delegated the authority to perform 
physical examinations to determine if applicants are qualified to 
receive airman medical certificates[Footnote 6] and student pilot 
certificates.

FAA's Lack of Consistent Oversight of Designee Programs Is Affected by 
Incomplete Data, Workload Demands, and Lack of Training:

Our work shows that inconsistent oversight is a key weakness of the 
designee programs. Oversight occurs at two levels: at FAA headquarters, 
which is responsible for monitoring the practices of its field offices, 
and at FAA field offices that are directly overseeing designees. First, 
while FAA has evaluated 6 of its 18 designee programs since 1997 and 
plans to evaluate 2 more programs, it has no plans to evaluate the 
remaining programs because of limited resources. Moreover, the agency 
has not implemented some key recommendations from these evaluations. 
Second, FAA field offices do not always oversee designee activities 
according to FAA policy, nor do the field offices apply consistent 
criteria for selecting and terminating designees. The primary goal of 
FAA's standards and policies, and its oversight of designees, is the 
safety of U.S. aviation. While we did not find systematic safety 
problems associated with FAA's oversight of designees, the agency's 
inconsistent oversight limits its assurance that the designees' work is 
performed uniformly in accordance with those standards and policies. 
FAA's ability to systematically evaluate the designee programs and 
consistently apply its designee oversight policies may be impeded by 
three conditions: (1) incomplete data on FAA's oversight of designee 
activities, (2) workload demands placed on FAA staff who oversee 
designees, and (3) the lack of adequate training for FAA staff who 
perform oversight duties.

FAA Provides Inconsistent Monitoring of Its Field Offices:

To monitor the effectiveness of its designee programs and determine 
whether field offices are following FAA policy in their oversight of 
designees, FAA has evaluated only 6 of its 18 designee programs over 
the last 7 years. These evaluations encompass about 35 percent of FAA's 
designees. Moreover, these evaluations vary in quality and 
comprehensiveness. While FAA has plans to evaluate two additional 
designee programs over the next several years, it does not plan to 
evaluate the other 10 designee programs because of limited resources, 
according to a program official. FAA conducts evaluations of its 
designee programs on an ad hoc basis, usually at the request of FAA 
headquarters directors or regional office managers, and uses these 
evaluations to determine whether the programs are being implemented in 
accordance with agency policies. The agency does not have requirements 
or criteria for periodically evaluating these programs and identifying 
the root causes for field offices and staff not consistently following 
FAA policies. According to FAA officials, the agency is developing 
quality management standards that will be used to evaluate field 
offices, including their oversight of designee programs. Both Flight 
Standards and Aircraft Certification Services plan to obtain approval 
of their quality management standards in 2006, but have no timeframe 
for conducting additional evaluations. While Aerospace Medicine has not 
evaluated its designee program, it uses regular management meetings 
with all the regional flight surgeons to monitor field oversight 
activities.

For the 11 designee programs within Flight Standards Service, the 
office has evaluated the designated pilot examiner program in some 
field offices and has plans to evaluate oversight practices for aircrew 
program designees in 2005 and designated mechanic examiners by 2006. 
However, the office has no current plans to review the oversight 
practices for the additional eight types of designees because of 
limited resources, according to a program official. In 2000, FAA's 
Flight Standards Service created a Quality Assurance Team to undertake 
standardized evaluations of its field offices to determine how they are 
conducting business, identify deficient areas, and make improvements as 
needed.[Footnote 7] As of July 2004, the Quality Assurance Team had 
evaluated the oversight of designated pilot examiners at 60 out of 104 
Flight Standards field offices to determine whether each office is 
following FAA policies and standards. The team plans to assess the 
designated pilot examiner oversight practices of the remaining field 
offices in 2005. Among the completed evaluations, Flight Standards has 
identified program weaknesses, such as computerized data records that 
lack information on required surveillance of designees. The evaluation 
process calls for reporting any identified deficiencies to the 
appropriate offices and regions for corrective action. However, the 
evaluations by the Quality Assurance Team do not identify the root 
causes or reasons for field offices and staff not consistently 
following FAA policies and standards. According to program officials, 
root causes of the problems are not identified because that is not the 
purpose of the audits.

In addition, in 2000, Flight Standards' Southwest Region reviewed the 
designated pilot examiner program in its nine field offices. While the 
review did not find any pilots who had been inappropriately 
certificated, it did find that inspectors were not reviewing pilot 
examiners' work on an annual basis or conducting oversight as required 
by FAA policy.[Footnote 8] The review by the Southwest Region was more 
comprehensive than the reviews undertaken by the Quality Assurance 
Team. Both the region and the Quality Assurance Team audited data on 
designees that were maintained in office files and in a computerized 
database for compliance with agency policy. However, unlike the Quality 
Assurance Team, the Southwest Region also gathered and analyzed 
information on designee activity, surveyed newly certificated pilots, 
and conducted a 2-day conference with designated pilot examiners from 
the region. This more rigorous evaluation allowed the region to assess 
the outcomes of this designee program, identify root causes of the lack 
of compliance with agency policy, and develop corrective action plans, 
including increased training for inspectors, to address the root 
causes. Flight Standards has not applied this more comprehensive 
evaluation to its other eight regions or other designee programs to see 
if similar problems exist and to take any needed corrective action.

By comparison, from 1997 through 2000, FAA's Aircraft Certification 
Service assessed five[Footnote 9] of its six designee programs and took 
action to identify and correct the root causes of some identified 
weaknesses.[Footnote 10] For example, in 2000,[Footnote 11] the office 
assessed one designee program--designated alteration stations--in the 
aftermath of the fatal crash of Swissair Flight 111 in 1998, which 
killed 229 passengers and crewmembers. The Transportation Safety Board 
of Canada, which investigated the crash, suspected that an 
entertainment system, the installation of which had been approved by an 
FAA designee, may have been one factor contributing to a deadly 
electrical fire on board the aircraft.[Footnote 12] The Board concluded 
that FAA's designee program did not ensure that the designated 
alteration station employed personnel with sufficient aircraft-
specific knowledge to appropriately assess the integration of the 
entertainment system's power supply with aircraft power. In response to 
the Canadian report, in 1999, FAA investigated its oversight of the 
designated alteration station involved in the crash and concluded that 
FAA's oversight of the designee that installed the entertainment 
systems was in accordance with FAA policy.[Footnote 13] However, the 
report went on to note that aspects of FAA's policy for overseeing 
designated alteration stations lacked clarity and needed revision. To 
address this problem, the report recommended a nationwide study of 
FAA's oversight of designated alteration stations. This subsequent 
study, conducted in 2000, found general oversight weaknesses, including 
the lack of a national standard policy on management and oversight of 
designated alteration stations and a general lack of FAA supervision of 
these designees. To address the root cause of the problems identified, 
the 2000 study recommended revisions to FAA's order concerning 
oversight of designated alteration stations, which were made and issued 
in August 2002. The 2000 review further recommended that the office 
establish a process to periodically assess the effectiveness and 
applicability of existing policies concerning designated alteration 
stations and consider feedback from FAA field offices and designees. 
The Aircraft Certification Service has not implemented this 
recommendation to directly assess the policies in place, but continues 
to rely on informal feedback from FAA field offices and industry.

In addition, FAA has not fully implemented its 2002 policy to conduct 
technical evaluations of 49 organizational designees, located primarily 
in the Aircraft Certification Service.[Footnote 14] Technical 
evaluations allow the agency to determine whether the products and data 
produced by the organizations are technically acceptable and comply 
with FAA policies. According to FAA officials, the agency had conducted 
10 technical evaluations as of June 2004. FAA is allowing 
organizational designees time to perform approvals under their new 
procedures before performing the technical evaluations, according to 
the agency. In the meantime, according to FAA officials, these 
organizational designees are being evaluated under the current Aircraft 
Certification Systems Evaluation Program, which requires an evaluation 
every 2 years.

Field Offices Provide Inconsistent Oversight of Designees:

Concerns about the consistency and adequacy of designee oversight that 
FAA field offices provide have been raised in previous 
reports,[Footnote 15] including FAA's evaluations of various designee 
programs, which we discussed earlier in this report; by individuals we 
interviewed during site visits; and by our expert panel. Table 3 shows 
the top five oversight weaknesses identified by our experts. The top-
ranked weakness--inconsistent oversight by FAA offices--was identified 
as a "great" or "very great" weakness by 36 of the 62 experts. No more 
than 6 of the 62 experts felt that these top five factors posed "no 
weakness" and between 5 and 13 other experts--believed these factors 
presented "little" weakness. (See app. IV for additional weaknesses 
identified by our expert panel.)

Table 3: Experts' Ranking of Top 5 Oversight Weaknesses:

Ranking: 1; 
Weakness: FAA offices' level of oversight and interpretation of rules 
are inconsistent.

Ranking: 2; 
Weakness: Inactive, unqualified, or poor-performing designees are not 
identified and removed expeditiously.

Ranking: 3; 
Weakness: It is difficult to terminate poor-performing designees.

Ranking: 4; 
Weakness: Inadequate surveillance and oversight of designees.

Ranking: 5; 
Weakness: FAA has not made oversight of designees a high enough 
priority.

Source: GAO analysis of expert panel information.

Note: Rankings based on responses from 62 experts and the frequency of 
responses indicating a "great" or "very great" weakness.

[End of table]

Designees and industry officials that we spoke with indicated that 
FAA's level of oversight and interpretation of rules are inconsistent 
among regions and among offices within a region. For example, several 
designees whom we spoke to provided the example of one Aircraft 
Certification field office that was stricter in its application of FAA 
standards than other offices--i.e., the stricter office would not 
approve submittals for supplemental type certificates that would be 
approved by other FAA offices. As a result, applicants tend to "shop 
around" to find those offices that will provide expedited approvals, 
according to these designees. Another designee and an aviation parts 
manufacturer told us that FAA field offices required different 
paperwork and interpreted FAA rules differently for the same work. For 
example, a manufacturer of fortified cockpit doors found that field 
offices in Los Angeles and Seattle interpreted regulations differently 
and required different paperwork to process the same type of approval. 
Designated mechanic examiners that we spoke with provided similar 
examples of inconsistencies among field offices. They cited instances 
in which one field office would reject applications that another field 
office would approve. Further, an industry representative that we spoke 
with provided examples of inconsistencies among FAA offices concerning 
whether approval in the form of a supplemental type certificate is 
needed--with some offices requiring a supplemental type certificate and 
other offices considering the same type of manufacturing or maintenance 
work minor and requiring no approval. A designated engineering 
representative noted that different FAA staff required different levels 
of detail in the standard FAA form that engineering designees submit to 
show their completed work. Another industry representative noted the 
lack of standardized requirements for data submittals from certain 
types of designees, such as designated engineering representatives. A 
standardized checklist would help various FAA field offices to 
consistently interpret regulations, according to the industry 
representative. According to FAA officials, in certain cases, there are 
reasons for inconsistent application of rules. For example, in the case 
of cockpit doors, the projects typically varied across offices 
depending on data submitted by previous applicants and the capability 
of the applicant. In order to reduce unnecessary administrative burdens 
on applicants, FAA's policy specifies that once an applicant has 
demonstrated that a design change meets FAA requirements, subsequent 
applicants for a similar alteration may not be required to conduct all 
the same tests required of the previous applicant, according to FAA 
officials. Agency officials further stated that checklists are created 
for each project and that standardized checklists cannot be used 
because each project is unique. This was disputed by FAA staff that we 
spoke with, some of whom had created standardized checklists to use for 
all the designees that they oversaw.

We also found that, in some cases, the ability of FAA field offices to 
oversee designees is affected by designees working outside of their 
normal locality and by the amount of written detail about that work 
provided to FAA. FAA policy allows designees to work outside of 
their assigned geographic area but, in certain circumstances, requires 
designees to notify the local FAA office.[Footnote 16] This situation 
can occur, for example, when specialized engineering expertise is 
needed by an aviation parts manufacturer and the closest designee with 
that expertise is located in a remote FAA region, in which case the 
company may request the services of a designee from outside the region. 
We spoke with one designated engineering representative based in 
Atlanta who regularly worked outside his geographic area. In 2001, 7 of 
12 projects that he approved as a designee were outside the Atlanta 
area; in 2002, 20 of 33 projects were outside the area; and in 2003, 4 
of 28 projects were outside the area. When he works out of his 
geographic area, he normally contacts the field office where he is 
conducting his work only after the work is completed and submits the 
required paperwork to his FAA office in Atlanta upon completion of a 
project. He and other designated engineering representatives told us 
that they are likely to include minimal details in the forms submitted 
to FAA because that information can be requested under the Freedom of 
Information Act. An FAA engineer also told us that designated 
engineering representatives may be reluctant to include details on how 
they certify aviation products. Since FAA inspectors have little 
opportunity to witness the work being performed by designees that work 
outside their area, inspectors rely heavily on paperwork reviews. When 
the paperwork provides insufficient details about the designees' 
activities, FAA staff spend additional time requesting the needed 
information from designees, according to an FAA engineer.

In addition, Flight Standards Service staff told us that more direction 
and clarity were needed concerning the amount of surveillance that 
inspectors should be conducting over designees. Policy guidance 
describes how inspectors are to conduct surveillance of designees, and 
the service develops a national workplan each year that determines the 
number of inspections of designees that inspectors and engineers will 
perform. Several FAA field office managers that we spoke with believed 
that the oversight called for in the national workplan does not allow 
them to target oversight to those designees that need more or less 
surveillance. In addition, according to several FAA inspectors that we 
spoke with, it is difficult during their site visits of designees to 
identify those who are improperly certifying applicants or conducting 
inappropriate activities, such as approving parts beyond their 
authorization. The inspectors told us that they usually find out about 
improper designee activities by noticing mistakes on the forms 
submitted by designees and receiving complaints from designees' 
clients. A designee that we spoke with further explained that, because 
FAA visits are arranged in advance, designees have time to make sure 
things are done correctly during the visit. Flight surgeons, by 
comparison, are not required to conduct site visits of designees. Due 
to the limited number of staff and resources available to conduct site 
visits, flight surgeons generally conduct those visits only after 
problems have been identified by others, such as through client 
complaints.

We also found that field offices did not consistently follow 
established policy for selecting designees. While we did not find 
evidence that unqualified designees were selected, this situation may 
result in not selecting the best-qualified candidates. Nineteen of the 
62 experts on our panel believed that FAA does not consistently follow 
its own designee selection criteria[Footnote 17]--which are based on 
designee candidates' experience and qualifications, FAA field offices' 
ability to oversee designees, and the need for particular types of 
designees[Footnote 18]--but rather appoints designees based on personal 
associations. Moreover, 9 of the 17 FAA inspectors and engineers on our 
panel rated the practice of awarding delegation status based on 
personal associations with FAA management as a "great" or "very great" 
weakness of the designee programs. FAA policy requires multiple parties 
to review applicants' qualifications and reach consensus on appointment 
decisions, but we found that field offices sometimes add their own 
criteria. For example, Flight Standards Service has established a 
National Examiners Board to review all designee applications and 
prepare a list of qualified candidates from which field office managers 
must select designees. The board was established to provide an 
objective, standardized process and to move away from previous ad hoc 
appointment practices, which were often based on personal acquaintance. 
However, we found that this process
does not always work as intended. For example, in a Flight Standards 
field office that we visited, an applicant for designated airworthiness 
representative is required to have a letter of recommendation from the 
manager of the field office. According to an inspector at that field 
office, this practice has resulted in screening out otherwise qualified 
individuals. According to FAA officials, personal association is an 
important factor in selecting and appointing designees. They consider
personal knowledge and experience with the applicant an important 
consideration in the selection process, without which it is difficult 
to know whether applicants have the necessary qualifications and 
abilities.

In addition, FAA's internal evaluations confirm our finding that FAA 
offices provide inconsistent oversight and interpretation of rules 
concerning designees, which limits the agency's assurance that 
designees are performing certification work properly. For
example, as mentioned previously in this report, in 2000, an FAA 
evaluation of designee pilot examiner oversight in one region found 
that inspectors were not conducting oversight as required by agency 
policy.[Footnote 19] That review further found that up to 30 percent of 
the designated pilot examiners in the region were not conducting 
complete practical tests of pilot certificate applicants and not 
consistently holding pilot applicants to the standards of the practical 
test.[Footnote 20] In addition, an FAA-industry study found that 
project approvals by certain designated engineering representatives, 
which do not require FAA review, combined with the lack of designee and 
FAA technical expertise in certain specialized areas, have resulted in 
designs that were deficient or not in compliance with FAA regulations.
[Footnote 21]

We also found that FAA offices do not always identify and remove 
inactive or poor-performing designees expeditiously, which may be due
to reluctance on the part of managers, engineers, and inspectors to 
take disciplinary action. FAA policy calls for providing counseling, 
remedial training, or limiting or terminating designees' authority for 
insufficient work activity and poor performance. For example, since 
1998, Aircraft Certification Service has terminated approximately 770 
designees for such reasons as insufficient activity, lapse in 
qualifications, or lack of care.[Footnote 22] However, a 2002 study 
conducted jointly by FAA and industry found that it was the perception 
of some FAA field staff who oversee designees that terminating 
designees is difficult because of fear of litigation. According to the 
report, this perception had resulted in little, if any, disciplinary 
action being taken against designees when it may be warranted.[Footnote 
23]

Our interviews with FAA field office managers and staff confirmed that 
they are reluctant to take disciplinary action against designees. For 
example, managers in the Seattle and Oklahoma City field offices and 
inspectors and engineers in the Atlanta and Los Angeles field offices 
told us that rather than take disciplinary action against poor-
performing designees, they wait and terminate the designee during the 
renewal process, as long as the designee has not committed any criminal 
acts. According to these officials, FAA field offices prefer not to 
renew poor-performing designees rather than terminate them because FAA 
management wants to avoid the legal appeals that designees can make if 
the agency decides to terminate them for poor performance. According to
FAA field inspectors that we spoke with, it is difficult for them to 
terminate poor-performing designees--such as those who continue to omit 
information in their documented work despite training and counseling--
because the process is lengthy and time-consuming. According to one FAA 
engineer, when she tried to remove a designated engineering 
representative for making incorrect approvals, she was required by FAA 
policy to first notify the designee of FAA's intent to terminate the 
designation, and then to document the specific reasons for the 
recommended removal. The process took 2 to 3 years, according to the 
engineer. After designees are removed, they are allowed up to two
appeals, which can further lengthen the removal process. FAA officials 
acknowledged that misunderstandings of the removal process among 
inspector staff will continue without the development of specific 
guidance and training on the designee termination process. Our analysis 
of data from the Aircraft Certification Service found that, over the 
last 5 years, the office terminated 15 designees because of "lack of 
care or judgment" and terminated 121 others by not renewing their 
designations.

In addition, FAA field and program office managers have some discretion 
over terminating poor performing and inactive designees, but because 
FAA's criteria for terminating designees are not specifically defined, 
each field and program office determines when poor performance or lack 
of activity constitutes grounds for termination. According to a manager 
in FAA's Civil Aerospace Medical Institute, one region may terminate an 
aviation medical examiner who is consistently more than 30 days late in 
transmitting medical certification data, while another region may 
terminate an aviation medical examiner who is consistently more than 60 
days late. An FAA engineer told us that designees in the Aircraft 
Certification Service are seldom terminated because of low activity 
level. Of the approximately 770 designees for Aircraft Certification 
Service that were terminated since 1998, according to information we 
analyzed in FAA's Designee Information Network database, about 230 (30 
percent) were terminated for inactivity. In addition, a manager for a 
Flight Standards field office told us that the criteria the office uses 
for terminating poor-performing designees include whether the 
termination will result in a loss of income for individual designees. 
This criterion is neither included in FAA policy nor considered by 
other field or program officials with whom we spoke.

Consistent application of oversight policies is important to ensure 
that designees follow FAA policies and that they remain free from 
pressures from employers or clients that may lead them to bypass those 
policies. For example, in 1999, FAA found that designated mechanic 
examiners in the Orlando, Florida, area had not adhered to FAA's 
standards and had fraudulently indicated that hundreds of mechanic 
applicants had passed the certification examination. This resulted in 
FAA retesting many of the mechanics. In addition, some designated 
engineering representatives are salaried employees of the manufacturers 
whose products they are approving on behalf of FAA. In one case, a 
designee told us that another designated engineering representative was 
an executive officer of the company whose products he was approving, 
creating an apparent conflict of interest. The designee also told us 
that designees are under pressure by their employers to certify 
products. He stated that designated manufacturing inspection 
representatives are sometimes pressured by their employers to approve 
aviation products for export, under the threat of being fired. 
According to FAA officials, agency policy discourages the appointment 
of designated engineering representatives who are executives within a 
company and whose primary job duties are schedule-driven and devoted to 
the output of the company's saleable products. Other designees, 
such as designated pilot examiners, are employed by flight schools and 
test pilot applicants for those schools. Since those designees depend 
upon the flight school for employment and referral of applicants, there 
could be an incentive for the designated pilot examiner to compromise 
the integrity of pilot tests. Such situations present the potential 
risk that designees may be pressured by employers to bypass FAA 
requirements in order to meet schedules or attract additional students. 
FAA officials acknowledge that an inherent conflict of interest exists 
in the designee programs, but did not view it as a weakness because 
designees can be held liable for deficiencies in their work. However, 
several FAA field managers and inspectors expressed concerns to us that 
smaller organizations, such as repair shops, may be willing to risk 
liability and bypass agency requirements.

Poor Data, FAA Staff Workload, and Insufficient Training for FAA Staff 
May Contribute to Oversight Weaknesses:

FAA's oversight of its field offices and designees is hampered by the 
lack of comprehensive information in some of the agency's databases 
that are used to capture information on designees,[Footnote 24] the 
workload demands facing FAA staff who oversee designees, and 
insufficient training for FAA staff on designee oversight.

Designee Databases:

The databases for the offices of Flight Standards Service and Aircraft 
Certification Service were not designed to capture information 
concerning oversight performed by the managing offices and do not 
provide a comprehensive picture of FAA engineers' and inspectors' 
oversight activities or the activity levels of designees. For example, 
FAA policies require FAA inspectors in the Aircraft Certification 
Service who oversee manufacturing designees to update the designee 
management database, Designee Information Network, every time they 
oversee or monitor a designee's performance. However, no data field is 
provided to capture information on these oversight visits. A field for 
comments is available for FAA staff to indicate when a designee 
performance evaluation was conducted, but our review of the data files 
for 1998 through 2003 found that this information was not consistently 
noted. Moreover, FAA policy does not require its engineers to document 
their oversight of engineering designees in the database. Thus, FAA 
cannot readily ascertain how often staff in the Aircraft Certification 
Service monitored and evaluated designees, other than the minimum 
levels required to renew the designees' authority.[Footnote 25] 
According to officials in that office, information on how often staff 
review designee performance is recorded in designees' paper case files, 
which are maintained at the field or program offices. In addition, the 
Designee Information Network does not contain information on the number 
and type of approvals that the individual designees are conducting. As 
a result, FAA lacks a single, comprehensive data source that could 
facilitate designee oversight by providing FAA a means to prioritize 
oversight activities and manage engineer workload. According to FAA 
officials, the fact that all oversight information is not captured in a 
single database does not directly affect the agency's ability to 
effectively oversee designees.

Two other databases--the Program Tracking and Reporting Subsystem 
(PTRS) and National Vital Information Subsystem (NVIS)--used by Flight 
Standards Service inspectors to monitor designees also do not 
completely track designees' activity levels. According to FAA officials, 
PTRS was designed to track activities by FAA inspectors, such as noting 
when FAA inspectors conducted surveillances of designees, while NVIS 
was developed to track basic profile information on designees, such as 
their names, addresses, types of certification, designated 
authorizations, and status. PTRS can be used to track the activity 
levels of designees, but it requires FAA inspectors to input the data 
each time they receive a certification package from a designee, and 
past reviews have found problems with incomplete information in the 
database. For example, in 2003, the Quality
Assurance Team mentioned earlier found that required information on 
designee oversight was missing from the two databases or was incorrect. 
The team noted that records indicating the type of surveillance 
conducted (such as an observation of a complete or partial test) were 
missing from PTRS and that records in NVIS lacked renewal dates and 
contained inaccurate information on designee training and 
authorizations. By comparison, Aerospace Medicine has one database--
Airmen Medical Certification Information Subsystem--to track 
information on aviation medical examiners, including information on the 
number of medical certificates issued by each medical examiner and 
demographic, training, and oversight information for each designee. Our 
review of that database found reasonably complete information; we did 
not check the accuracy of the information.

FAA Staff Workload:

FAA's oversight of the designee programs may also be weak, in part, 
because of the workload demands facing agency staff who oversee 
designees. In addition, the amount of time that FAA staff spend on 
other aviation safety activities, such as monitoring air carrier 
operations, affects the amount of time spent on designee oversight. FAA 
policy recognizes that each designee oversight scenario is unique and 
allows variations in determining the extent of oversight needed to meet 
minimum annual requirements. FAA policy also states that the ability to 
provide adequate oversight depends on balancing the level of FAA 
staffing with the agency's workload and the number of designees. FAA 
policy, however, does not specify an acceptable workload for meeting 
this criterion. For example, each managing office must periodically 
verify adequate FAA staffing numbers based on the type and amount of 
work performed by staff who oversee designees. FAA policy provides 
no further guidance for determining adequate numbers for proper 
oversight. FAA officials stated that the level of specificity in the 
guidance is adequate for determining staff workload related to 
designees and 
that it would be difficult to determine an exact staffing ratio because 
of factors such as the size of facilities, the experience of designees, 
and the complexity of projects. However, the lack of clear policy 
guidance and staffing standards results in wide variation in the ratio 
of designees to FAA staff among offices and programs and makes it 
difficult for the agency to measure and account for its staff 
resources. For example, our review of FAA data showed that, on average, 
the ratio of designees to FAA staff is about 6 to 1 in the Aircraft 
Certification Service, about 5 to 1 in Flight Standards, and about 440 
to 1 in Aerospace Medicine. The ratios for individual FAA staff ranged 
from 1 designee to 1 FAA staff in several Aircraft Certification 
offices to about 870 designees to 1 FAA staff in Aerospace Medicine. 
Information we gathered from site visits at three of FAA's nine regions 
also showed a wide range of workload ratios. For example, information 
we gathered at Flight Standards' Northwest Mountain Region showed 
ratios among field offices ranging from 1 designee to 1 inspector to 
100 designees to 1 inspector. Variations in the ratios of designees to 
FAA staff are due to the type of designee and the complexity of their 
work, according to FAA officials. However, several engineers in the 
Aircraft Certification Service with whom we spoke expressed concerns 
that a designee-to-staff ratio higher than 10 to 1 limits the time they 
have to adequately monitor the work performed by designees. One 
Aircraft Certification engineer told us that while he was currently 
responsible for overseeing 10 designated engineering representatives, 
in the past, he had been responsible for between 30 and 60 designees, 
which was too many to adequately oversee. Flight Standards Service 
officials acknowledged that staffing standards need to be established. 
The National Academy of Sciences is currently evaluating the staffing 
standards for the Office of Regulation and Certification, which 
encompasses Flight Standards, Aircraft Certification, and Aerospace 
Medicine, and expects to complete the study in 2005.

Past reports by us and others pointed out that escalating workloads 
and/or high turnover rates for FAA staff continue to diminish FAA's 
ability to oversee designees. For example, over 10 years ago, we 
reported that, in response to a dramatically escalating workload, FAA 
had delegated aircraft certification duties to designees without 
defining a clear role for its staff to ensure that they were 
effectively involved in the certification process.[Footnote 26] Since 
then, FAA has issued comprehensive policies governing the selection, 
appointment, and oversight of individual and organizational designees. 
We also pointed out high turnover rates (107 percent over the previous 
10 years) for FAA engineers who oversee designees. In addition, 
internal FAA documents from 2000 cited the disparity between the 
Aircraft Certification Service's workload and its staffing levels, 
noting that staff resources had not kept pace with increasing workload. 
A 2002 study prepared for FAA, which updated the information in our 
1993 report, confirmed that the two FAA field offices--Seattle and
Los Angeles--responsible for the majority of commercial transport 
airplane oversight still had high turnover rates (115 percent over an 
8-year period) and that over 50 percent of the engineers in those 
offices had less than 5 years of FAA experience.[Footnote 27] The 
report further noted that the consistently high turnover rate and 
associated low experience levels were indicators of the limited time 
available for FAA engineers to acquire the necessary experience and to 
understand the increasingly complex systems and human factors 
associated with modern aircraft, which are among the skills needed to 
oversee the work of designees. FAA noted that the annual turnover rate 
of engineers at the Seattle and Los Angeles field offices had declined 
in recent years, indicating that from fiscal years 1999 through 2004, 
the average annual rates were 3 percent and 4 percent, respectively, for 
the two offices.[Footnote 28]

In addition, designees told us that FAA staff who oversee designated 
engineering representatives change frequently. A designee monitored by 
a Seattle field office told us that he estimated that every 3 years he 
reported to a different FAA staff person. Another designee told us that 
in the last 5 years, he had reported to six different FAA staff. As a 
result of these frequent changes, the designees felt frustrated by the 
amount of time that it took to establish a good working relationship 
with each new FAA staff person. We found a similar
situation in an Atlanta field office, where an FAA engineer explained 
that high turnover of engineers in the office made it difficult to 
oversee the activities of designated engineering representatives. The 
difficulty arises, according to the FAA engineer, because designees 
typically submit forms at the end of each quarter to document their 
activities, which FAA engineers then review. When a designee's FAA 
advisor changes during a quarter, the only information that the new 
advisor has concerning the designee's work is the information contained 
in the form, because the new advisor does not have information 
concerning discussions between the prior FAA staff person and the 
designee. Furthermore, as we mentioned earlier in this report, both an 
FAA engineer and designated engineering representatives told us that 
designated engineering representatives are reluctant to include details 
on how they certified a product, fearing that the information could be 
requested and made public under the Freedom of Information Act.

For some designee programs, FAA provided us with information on how the 
size of its workforce has changed over time in comparison with the 
number of designees it oversees. For example, based on FAA's staffing 
information, the ratio of designees to engineers and inspectors in the 
Aircraft Certification Service decreased slightly from 6.7 to 1 in 
fiscal year 1999 to 6.5 to 1 in fiscal year 2003. 
However, FAA could not provide similar information for Flight Standards 
Service or Aerospace Medicine to determine how the agency's workforce 
has changed over time in comparison to designees. Some members of our 
expert panel commented that the number of FAA staff who oversee 
designees has not increased at the same rate the aviation industry has 
grown. Experts also stated that FAA staff do not have time to provide 
adequate oversight of the designees for whom they are responsible. 
Additionally, FAA inspectors and engineers that we spoke with commented 
that as FAA's dependence on designees continues to increase, their 
ability to conduct oversight--consisting of designee supervision, 
monitoring, and tracking, as required by FAA policy--will continue to 
decrease. According to some FAA engineers that we spoke with, dramatic 
increases in their workload have left them able to review only a 
minimal percentage of the work conducted by designees.

The situation in Aerospace Medicine provides another example of 
workload issues potentially hampering oversight. Between July 2002 and 
June 2003, the nine regional flight surgeons in Aerospace Medicine, 
each heading a team of about three or four FAA staff, collectively 
monitored over 4,900 designated medical examiners, who conducted more 
than 420,000 medical
examinations. Given high workload demands on the flight surgeons and 
their staff, in many cases, they are not able to perform site 
inspections to ensure that designee offices and facilities meet FAA 
standards, according to Aerospace Medicine officials. These officials 
also noted that site visits would help FAA ensure that designees are in 
compliance with FAA's facility and equipment requirements, such as 
verifying that the designees have access to acceptable facilities to 
perform physical examinations, meet minimum vision and hearing test 
equipment standards, and have access to approved diagnostic 
instruments. According to regional flight surgeons, due to the limited 
number of staff and resources available to conduct site visits, they 
primarily conduct those visits only after problems arise due to 
unprofessional behavior or unethical practices on the part of the 
designated examiners. Such questionable designee practices are brought 
to the attention of regional flight surgeons by the Civil Aerospace 
Medical Institute, by FAA field staff, and through complaints from the 
designees' clients. According to FAA officials, limited resources also 
hinder the flight surgeons' ability to identify unprofessional or 
unethical designated medical examiners.

Inspectors in Flight Standards also told us of workload demands 
affecting designee oversight. For instance, one FAA inspector provided 
an example of a designated pilot examiner who conducted approximately 
400 practical tests in 1 year. FAA policy calls for inspectors to 
conduct one annual inspection of each designated pilot examiner and to 
carry out additional surveillance of pilot examiners who perform more 
than 50 practical tests per quarter. Because of high workload, however, 
the inspector was able to conduct only the one annual inspection of 
this high-activity designee and was not able to conduct the required 
additional surveillance.

The ability of FAA staff to oversee designees is also affected by the 
amount of time that they spend on a wide variety of other aviation 
safety activities and the priorities that are given to the various 
activities. For example, FAA officials from Flight Standards Service 
commented that inspectors are also responsible for other activities 
such as taking enforcement actions, evaluating air carrier operations, 
monitoring general aviation activities, and conducting accident 
investigations. Several FAA engineers that we spoke with said that 
their first work priority was to conduct accident investigations and 
draft airworthiness directives; their second priority was to draft 
policy and regulations; and their third priority was designee 
oversight. FAA staff that we interviewed estimated that they spend 
about 5 to 15 percent of their time overseeing designees, depending 
largely on the number of designees for whom they are responsible. 
According to one estimate by an FAA engineer who is responsible for 
overseeing 25 designees in the Aircraft Certification Service, 
approximately 10 percent of his time--or about 4 hours per week--is 
devoted to designee oversight. Inspectors and engineers also pointed 
out that poor-performing designees can significantly increase their 
workload as they require greater surveillance and more frequent 
interactions.

Training for FAA Staff:

FAA's oversight of the designee programs may also be weak, in part, 
because of insufficient training for staff who oversee designees. 
Twenty-one of the 62 experts on our panel cited a lack of training in 
designee oversight for FAA inspectors and engineers as a "great" or 
"very great" weakness of the designee programs. Six out of 15 FAA 
inspectors or engineers on our expert panel considered this situation 
to be a "great" or "very great" weakness. (Six experts felt the lack of 
training was not a weakness, and 6 other experts felt it posed little 
weakness.) Flight Standards Service officials acknowledged that 
additional oversight training would be helpful to address training 
weaknesses.

FAA's Aircraft Certification Service and Aerospace Medicine have 
established initial training requirements for newly hired staff, which 
include courses on designee oversight. For example, the Aircraft 
Certification Service requires staff to take the Delegation Management 
Job Functions Course, which focuses on overseeing designees and is 
designed to teach the skills necessary to select, supervise, and 
terminate designees. FAA's Aerospace Medicine requires regional flight 
surgeons to take initial training on policies and regulations 
pertaining to designees. Aerospace Medicine staff who assist flight 
surgeons do not receive initial training concerning designees, but 
periodically attend training at the Civil Aerospace Medical Institute 
in Oklahoma City or are informed of relevant policy changes through 
teleconferences, according to officials in the office. By comparison, 
Flight Standards Service does not provide initial training to its 
inspectors on designee oversight. Instead, this office requires new 
inspectors to attend initial training in their areas of specialization. 
Flight Standards is currently evaluating the Delegation Management 
Course used by Aircraft Certification to determine if the course meets 
inspectors' needs for overseeing designees, according to several 
officials in Flight Standards.

Once inspectors and engineers in Flight Standards and Aircraft 
Certification services have fulfilled their initial training 
requirements, they are encouraged, but not required, to participate in 
refresher training. In contrast, FAA requires designees to receive 
formal refresher training every 2 or 3 years. By not requiring its 
oversight staff to take refresher training, FAA cannot maintain 
reasonable assurance that its inspectors and engineers stay current on 
changes to policies and procedures. In fact, one FAA manager told us 
that, in his office, FAA engineers who oversee designees needed 
additional training, especially in the area of managing designees. In 
addition, several experts on our panel stated that, given the disparity 
in training requirements, designees could gain better knowledge of 
FAA's policies and procedures than the FAA staff who oversee them. FAA 
officials stated that inspectors and
engineers receive training through workshops, video training sessions, 
and FAA Academy courses. However, they are not required to take 
refresher training, as designees are. This is in contrast to 
regional flight surgeons, who are required to attend refresher training 
every 3 years, which is the same training required for designees.

Additionally, previous recommendations for improving inspector 
training have not been fully implemented. For example, as mentioned 
previously in this report, in 2000, FAA found that inspectors in field 
offices in the Southwest Region were not reviewing designated pilot 
examiners' work on an annual basis or conducting oversight as 
required. The report recommended that the Southwest Region conduct
standardized initial and refresher training for FAA inspectors, 
supervisors, and managers on the agency's oversight policies and 
procedures pertaining to designated pilot examiners. In response to the 
recommendation, the region implemented a training course that included 
briefings at each field office to raise the awareness of FAA inspectors 
concerning the importance of designee oversight, to explain current 
policy, and to offer techniques for effective oversight. The region 
also used the briefings as the basis of a curriculum for new training 
courses for FAA inspectors and has recommended that such courses be 
made available to all Flight Standards Service inspectors nationwide. 
According to agency officials, FAA plans to issue a national policy 
based on this recommendation in October 2004 and expects the policy to 
be fully implemented by 2005.

FAA Has Potential Opportunities to Improve Designee Programs:

Experts on our panel, best practices within FAA, and practices adopted 
by other countries in administering their respective designee programs, 
including experiences in implementing organizational delegation 
systems, suggest that there are potential opportunities for FAA to 
improve (1) program oversight to ensure consistent compliance with 
existing policies by FAA inspectors, engineers, and flight surgeons and 
(2) the accuracy and comprehensiveness of computerized information on 
designees so that the databases can be more useful tools for designee 
oversight. Given the central importance that the designee programs hold 
for FAA and future agency plans to expand the use of organizational 
designees with the creation of the ODA program, FAA has incentives to 
carefully consider such opportunities.

Several Opportunities Identified to Improve Oversight of Designee 
Programs:

Our work indicated that additional opportunities exist to improve FAA's 
oversight of its designee programs to ensure consistent compliance with 
existing policies by FAA inspectors, engineers, and flight surgeons. 
For example, our expert panel offered a number of suggestions to 
improve the designee programs that address some of the weaknesses we 
identified, including improvements in selecting and terminating 
designees and ensuring that FAA staff who oversee designees are 
knowledgeable about FAA policy. In addition, many experts agreed that 
it was important for FAA to hold designees accountable for their 
findings.[Footnote 29] For example, one expert pointed out that the 
designated engineering representative and organizational designee 
programs should be overhauled so that the designees are responsible and 
accountable for certifications and that FAA needed to put in place a 
process to monitor that additional responsibility. An FAA official told 
us that accountability is a central part of the agency's designee 
programs, since failure to perform delegated functions in accordance 
with agency
standards and expectations will result in removal of the delegation. In 
addition, all of the experts on our panel indicated that it was 
important for FAA to conduct audits of existing designee programs to 
determine if field offices are providing adequate oversight.[Footnote 
30] As we mentioned previously in this report, FAA has audited only 6 
of its 18 designee programs. Table 4 lists the top-ranked actions in 
terms of importance and feasibility identified by the experts; these 
actions were identified as "high" or "highest" in importance and 
feasibility for implementation by most of our experts. Appendix IV 
provides a complete list of suggestions made by our expert panel.

Table 4: Experts' Ranking of Top Ways to Improve FAA's Designee 
Programs:

Ranking: 1; 
Suggested improvement: Hold designees accountable for their findings.

Ranking: 2; 
Suggested improvement: Ensure that FAA employees who oversee designees 
are knowledgeable about the regulations, policies, and processes 
applicable to the designees' particular specialization.

Ranking: 3; 
Suggested improvement: Select designees according to their 
qualifications and experience rather than on personal associations 
with FAA managers.

Ranking: 4; 
Suggested improvement: Clearly define and consistently follow the 
criteria for selecting designees.

Ranking: 5; 
Suggested improvement: Increase penalties (including the ability to 
terminate their status as designees) for individual and organizational 
designees found to violate standards or who do not exercise proper 
judgment.

Source: GAO analysis of expert panel information.

Note: Rankings based on responses from 62 experts and the frequency of 
responses indicating a "high" or "highest" importance to implement.

[End of table]

Consistent evaluation and monitoring of designee activities are crucial 
to holding designees accountable for their findings, and some FAA 
offices
have best practices that may be broadly applicable across the designee 
programs. For example, as we discussed earlier in this report, FAA's 
internal review of pilot examiners in the Southwest Region was 
conducted to determine whether the designees in the region were 
conducting valid practical tests of general aviation pilot applicants
and to determine the quality of FAA oversight provided by field offices 
in the region. Findings from the internal review were based on a 
comprehensive statistical analysis of pilot examiners' activities in 
the region, a survey of newly certified private pilots in the region, 
audits of pilot examiner files, surveillance reports from FAA 
inspectors, and interviews with field office managers and staff. The 
review provided a reasonable method to assess program outcomes, 
identify the root causes of noncompliance with agency policy, and 
develop corrective action plans to address those causes. FAA's
Organization Effectiveness Branch Manager commented that the 
methodology for the internal review was reliable and suggested that 
the review was informative for developing regional policy. The Branch 
Manager also commented that, in order to address FAA national policy, 
a national survey would be necessary. Flight Standards Service has not 
expanded its use of this methodology to other regions or to other 
designee programs.

Canada's practice of systematically evaluating and/or monitoring its 
designee programs provides additional examples of opportunities for 
improving FAA's oversight of its organizational designee programs and 
its plans to implement ODA. Transport Canada oversees both individual 
and organizational designees (which are called "delegates") and 
focuses on aircraft design and design modifications.[Footnote 31] 
Transport Canada oversees delegates using regional offices and 
headquarters staff, similar to FAA. FAA, however, oversees a much 
larger number of designees. For example, Canada has approximately 760 
aviation medical examiners and 80,000 pilots, while the United States 
has about 5,000 aviation medical examiners and about 630,000 pilots. 
Transport Canada has implemented a policy to provide a consistent and 
standard approach for conducting safety oversight of its organizational 
delegates, which includes conducting audits of delegated organizations 
on a cycle ranging from 6 to 36 months--an initial audit within 6 
months of certification and comprehensive follow-up audits on a 
recurring basis. Transport Canada has also established a centralized 
standardization office to ensure that field offices are consistently 
interpreting rules and procedures. The centralized office evaluates and 
approves technical submissions from applicants and delegated 
organizations to determine compliance with regulations. The office is 
also responsible for the development, coordination, and implementation 
of a national audit plan for delegated organizations. By
comparison, FAA policy calls for conducting annual inspections, and 
procedural audits and technical evaluations every 2 years. Annual 
inspections focus on a review of the system that the delegated 
organization has in place to perform the delegated functions and a 
review of the activities conducted by individuals. As mentioned 
previously, FAA had conducted 10 of 49 technical evaluations as of 
June 2004. According to FAA, it has established centralized offices 
responsible for standardization of policies. However, our work has 
shown that FAA field offices do not implement policies in a standard 
manner, as discussed earlier in this report.

Transport Canada's experiences in developing an organizational 
delegation system in the late 1980s also provide relevant lessons for 
FAA as it begins developing the ODA program. According to the Chief of 
Delegation and Quality Divisions in Transport Canada, an inconsistent 
level of oversight was a major challenge that Transport Canada faced as 
it implemented its organizational delegation system. To address this 
challenge, the agency established a centralized standardization office 
to ensure that field offices are consistently interpreting rules and 
procedures. Based on this experience, the Transport Canada official 
told us that FAA needs to plan for the inconsistencies that will arise 
during the implementation of the ODA program. The larger size of the 
U.S. designee programs increases the likelihood that the level of 
oversight will be inconsistent, according to the Transport Canada 
official. Moreover, the official commented that, in hindsight, 
Transport Canada should have developed and conducted audits of 
organizational delegates early in the implementation process. The 
Canadian official 
told us that Transport Canada did not conduct audits early on because 
staff were preoccupied with reviewing and approving organizations' 
procedures manuals. Transport Canada's quality assurance review later 
determined that the agency was neither auditing organizational 
delegates on time nor conducting audit follow-ups, which contributed to 
inconsistent oversight.

Database Monitoring Performance of Aviation Medical Examiners Could Be 
Model for Other Designee Databases:

Accurate, comprehensive information on designee activities is an 
important prerequisite for designee oversight and is integral to 
monitoring and evaluating the programs. The Airmen Medical 
Certification Information Subsystem--a database used by FAA's Office of 
Aerospace Medicine to monitor the performance of aviation medical 
examiners--provides a model for the other designee programs. Although 
this database was designed to simplify the processing of airmen medical 
certification information, Aerospace Medicine also uses it as a tool to 
oversee aviation medical examiner designees and monitor their activity 
levels. For instance, regional FAA flight surgeons use information from 
the database to determine whether they need to more closely monitor 
aviation medical examiners with high activity levels or how long it 
takes examiners to transmit medical information to FAA. Each flight 
surgeon is 
periodically provided performance data for his or her designees that 
include the number of medical certificates issued by each designee, the 
number of errors found in those certificates, and the number of 
accidents and incidents involving pilots who received medical 
certificates from designated medical examiners, according to an 
Aerospace Medicine 
official. Additionally, according to FAA officials, regional flight 
surgeons also link data from the database with the Airmen Registry to 
determine the regions where FAA needs additional examiners. 
Applying this model to Flight Standards Service and Aircraft 
Certification Service would provide those offices and their inspectors 
and engineers with more detailed performance information on designees 
and would provide a foundation for more consistent oversight of the 
numerous designee programs.

FAA officials agreed that improvements to these databases were needed 
but expressed concern that making the upgrades would cost $50 million, 
which could leave other safety programs with less funding. Such 
concerns might be addressed by 
looking for ways to share the costs of the designee programs with the 
aviation industry, as other federal agencies do by charging user fees 
to process applications for approvals or licenses. For instance, 
the Food and Drug Administration charges pharmaceutical companies 
application fees to recover the cost of the agency's review of new 
drugs.[Footnote 32] As another example, U.S. Customs and Border 
Protection charges fees to brokers--private individuals and companies 
that are licensed and regulated by the agency to aid importers and 
exporters in moving merchandise through Customs. Brokers pay Customs a 
$100 permit application fee and a $125 annual user fee. FAA does not 
charge designees an initial application fee or a renewal fee, which 
could help recover the cost of processing these applications, because 
it has been prohibited by law from promulgating new user fees since 
1997.[Footnote 33] Moreover, designees charge companies and the general 
public fees to have a product certified or to perform a pilot practical 
test. Some designees earn $60,000 or more a year and have made 
designated activities their sole source of income. FAA inspectors, 
engineers, and flight surgeons, on the other hand, provide the same 
service free of charge as a function of their government employment. In 
prior 
reports, we have stated our belief that, to the extent possible, 
commercial users of the aviation system should pay their share of the 
costs that they impose on the system.[Footnote 34] Charging designees 
fees to offset FAA's cost of administering the designee programs would 
be analogous.

Conclusions:

Designees perform a valuable function for FAA and the aviation 
industry, enabling FAA to leverage its staff resources and enabling 
industry to obtain FAA-issued certificates in a timely manner. By using 
designees, however, FAA places great trust in the integrity and honesty 
of designees to adhere to the same requirements, instructions, and 
procedures as FAA staff do; therefore, periodic validation and 
consistent oversight by FAA staff are necessary to ensure that such 
trust is well placed. To date, FAA has not ensured that the oversight
process for its many designee programs is implemented consistently by 
different field offices. While we did not find systematic safety 
problems associated with FAA's oversight of designees, the agency's 
inconsistent oversight limits its assurance that the designees' work is 
performed in accordance with the agency's standards and policies. We 
found examples of weaknesses in FAA's designee programs--such as 
inspectors with too great a workload to conduct required surveillance 
of designees--that underscore the need for FAA to ensure that its staff 
are consistently following agency policy concerning designee oversight 
and to validate those policies and their application by periodic 
evaluations. However, FAA has evaluated only 6 of its 18 designee 
programs to date. Our study indicated that the reasons for FAA's 
inconsistent oversight may include limitations in the designee data 
that FAA maintains, heavy staff workloads, and potentially inadequate 
training for FAA staff overseeing designees. FAA lacks a comprehensive 
information system to effectively monitor and oversee the thousands of 
activities performed by designees. Without such information, FAA 
management cannot readily determine whether its field staff are 
overseeing designees according to policy or whether designees are 
performing according to FAA's standards. Heavy workloads for FAA staff 
responsible for overseeing designees might preclude thorough 
assessment--or any assessment--of some designees' performance. 
Finally, by not requiring refresher training for FAA staff, the agency 
increases the risk that staff do not retain the information, skills, 
and competencies required to perform their oversight responsibilities. 
Potential opportunities exist for FAA to address these weaknesses and 
provide more consistent oversight of the designee programs by expanding 
the use of existing agency practices, such as the Office of Aerospace 
Medicine's practice of maintaining information on aviation medical 
examiners' performance and activity levels and using that information 
to support designee oversight. Charging application and renewal 
fees to designees to help offset the cost of administering these 
programs would be in line with practices by other agencies and prior 
GAO reports on cost-sharing with the aviation industry. However, FAA is 
prohibited, by law, from imposing new user fees unless they are 
specifically authorized by law.

It is especially important for FAA to consider ways to improve the 
oversight of its designee programs as the agency moves forward with the 
organization designation authorization program, which would expand the 
number and types of organizational designees and further transform 
FAA's role to that of monitoring the performance of others. Moreover, 
concerns have been raised that under the proposed program FAA would 
provide less specific and less technical oversight of the new 
organizational designees than under the current program. Expanding the 
good oversight practices already used within FAA for some designee 
programs, examining lessons that may be learned from Canada's oversight 
of organizational designees, and considering the improvements suggested 
by our expert panel would increase FAA's assurance that its designees 
are meeting FAA safety standards and that any future changes to the 
designee programs maintain those standards.

Recommendations for Executive Action:

To improve management control of the designee programs, and thus 
increase assurance that designees meet FAA's performance standards, GAO 
recommends that the Secretary of Transportation direct the FAA 
Administrator to take the following three actions:

1. Establish a program to evaluate all designee programs, placing a 
priority on those 12 programs that have not been evaluated. At a 
minimum, the evaluations should examine field office compliance with 
existing policies, identify root causes of noncompliance with those 
policies, and establish and monitor corrective action plans.

2. Develop mechanisms to improve the compliance of FAA program and 
field offices with existing policies concerning designee oversight. The 
mechanisms should include additional training for staff who directly 
oversee designees. As part of this effort, FAA should identify best 
oversight practices that can be shared by all FAA program and field 
offices, identify lessons learned from the program evaluations, and 
incorporate, as appropriate, suggestions from our expert panel.

3. Enhance the effectiveness of FAA designee oversight tools, such as 
databases, by improving the consistency and completeness of information 
on designees' activities and performance and FAA oversight. To the 
extent necessary, FAA should examine charging fees to designees to help 
pay for the costs of such efforts. If FAA identifies a need for such 
fees, the agency should request the Congress to authorize them.

Agency Comments:

We provided a draft of this report to DOT for review and comment. FAA's 
Deputy Associate Administrator for Regulation and Certification and 
other DOT officials provided oral comments. DOT generally agreed with 
our recommendations and acknowledged that automating the data 
concerning oversight of designees and enhancing training for FAA 
employees who oversee designees are useful steps to enhance the 
programs. The department also provided clarifying comments and 
technical corrections, which we incorporated as appropriate. In 
addition, the department noted that designee programs have been a 
cornerstone of aviation safety for 50 years. The constantly improving 
level of safety in the U.S. aviation system is due, in no small 
measure, to the professional performance of the thousands of designees 
who evaluate aircraft designs, assess pilot capability, or conduct the 
myriad other reviews that designees perform, according to DOT. DOT also 
pointed out that statistics and data show that every day of the year, 
the pilots and aircraft that pass through these designee systems fly 
safely from departure to destination.

However, DOT officials expressed concern about the use of the Delphi 
method in our review of 18 different programs with nearly 14,000 
designees. First, they emphasized that, at best, the Delphi method 
provides a means to consolidate and prioritize expert opinion, but even 
under the best of circumstances, the results are opinion, not 
necessarily factual data. The use of Delphi was further complicated in 
this particular case, according to DOT, by the span of knowledge that 
would be necessary to be considered an "expert" on designees when the 
scope of expertise runs from aviation medicine, to aircraft engineering 
and production methods, to parachute rigging. They stated that no 
individual could be considered an expert in all the programs and that 
the solicitation of opinions from the panel of experts would reflect 
the 
specific experience of each individual--but could not be considered a 
general statement of the strengths or weaknesses of all the programs. 
By consolidating the responses from individuals with expertise from 
these diverse fields, the officials questioned whether the results 
could be useful for guiding decisions to improve any of the individual 
designee programs. Further, the DOT officials cautioned that the Delphi 
results should be carefully qualified in the final report, along with 
explicit statements about the limitations on the use of the 
information.

We disagree with DOT's characterization of our use of the Delphi 
method; furthermore, we believe we used this methodology (which is 
described in detail in app. I) appropriately. In particular, we used a 
"modified" version of the Delphi method in order to compensate for some 
the limitations inherent in the Delphi method as well as to adapt the 
method to the specific needs of this engagement. For example, we 
created a Web-based panel that allowed us to include many more experts 
than if we had convened a live panel. In addition, the Web-based panel 
allowed us to keep the experts' identities anonymous, minimizing the 
biasing effects often associated with live group discussions. We also 
carefully selected the experts, starting with a list provided by FAA, 
and took into consideration that not every panelist would possess 
expertise in many of the designee programs. To help adjust for that 
fact, during the first round of questions, we asked experts to indicate 
whether their responses referred only to specific designee programs 
and, in a few cases, experts did so. During the second round, the 
experts were given the option of responding "do not know" or "no 
opinion" to each question. In short, while DOT criticizes the
responses from the experts as "opinions," we believe the responses are 
more appropriately characterized as carefully considered judgments of 
systematically selected experts. Lastly, as described below, the report 
focuses only on issues that were identified by both the panel and other 
sources.

Second, in reviewing a draft of this report, DOT officials expressed 
concern about the way the Delphi results had been presented. They 
emphasized, for example, that while the draft mentioned the number of 
respondents who considered a factor a "great" or "very great" weakness, 
the draft should also state the number who considered a factor to be of 
"no" or "little" weakness. Presenting what DOT considers both 
ends of the response spectrum in the body of the report would allow a 
full understanding of the results, according to the department. We 
agreed that the number of experts responding "no" and "little" should 
also be presented whenever the responses to individual questions were 
mentioned in the report, and we revised the report accordingly.

Finally, DOT officials emphasized the need to consider what they called 
the "totality" of the questions and responses in order to 
evaluate any inconsistencies among responses. For example, they said 
that while our report uses the responses from a single question to 
indicate concern regarding the selection process for designees, the 
responses from other questions could be interpreted to conclude that 
there was little concern about the competency of the designees that 
were selected or the quality of their work. These officials felt that, 
taken together, the responses present a different perspective on the 
outcome of the designee selection process than the first question 
alone. DOT officials stated that our highlighting the 
responses to one question without balancing them with the results of 
others presents an incomplete picture of the panel's overall findings 
and could mislead those who read the report but do not look at the 
details in appendix IV. We disagree with DOT's characterization of our 
analysis. First, we considered all responses from the expert panel and 
provided them in their entirety in the appendix. Furthermore, in the 
body of the report, we focus only on issues that were identified by 
multiple sources. For example, the report highlights the issue of 
selecting designees based on personal association because it was 
identified by other sources during our field work and our review of 
prior evaluations of the designee programs. Other issues raised by some 
of the panel experts concerning the selection process were not 
identified by other work we conducted and, therefore, not highlighted 
in the report.

As agreed with your office, unless you publicly announce the contents 
of this report earlier, we plan no further distribution until 21 days 
from the report date. At that time, we will send copies of this report 
to interested congressional committees, the Secretary of 
Transportation, and the Administrator, FAA. We will also make copies 
available to others upon request. In addition, the report will be 
available at no charge on the GAO Web site at 
[Hyperlink, http://www.gao.gov].

Please call me at (202) 512-2834 if you or your staff have any 
questions concerning this report. Major contributors to this report are 
listed in appendix V.

Sincerely yours,

Signed by:

JayEtta Z. Hecker: 
Director, Physical Infrastructure Issues:

[End of section]

Appendixes:

Appendix I: Objectives, Scope, and Methodology:

This report addresses the following research questions: (1) What are 
the strengths of FAA's designee programs? (2) What are the weaknesses 
of the programs and the factors that contribute to those weaknesses? 
and (3) What can be done to address the identified weaknesses or 
otherwise improve the programs?

To address these questions, we used a variety of methods and sources of 
information. We obtained and analyzed data for fiscal years 1998 
through 2003 from four Federal Aviation Administration (FAA) 
databases[Footnote 35] that maintain records on designees. We assessed 
the reliability of the databases by (1) performing electronic testing 
of required data elements; (2) reviewing existing information about the 
data and the system that produced them; and (3) interviewing agency 
officials knowledgeable about the data to learn how the information 
system was structured, controlled, and used. We determined that the 
data were sufficiently reliable for our purposes of describing the 
number of designees by program, identifying the geographical location 
of designees, and calculating the number of designees per FAA staff 
responsible for designee oversight. However, specific data needed for 
oversight were missing from some databases, as we discuss in this 
report.
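
To illustrate the kind of electronic testing of required data 
elements described above, the sketch below checks records for missing 
or blank required fields. It is a minimal illustration of the 
approach under stated assumptions, not FAA's code: the field names 
and sample records are hypothetical.

# Minimal sketch of electronic testing for required data elements.
# Field names and records are hypothetical, not actual FAA data.
REQUIRED_FIELDS = ["designee_id", "program", "region", "expiration_date"]

records = [
    {"designee_id": "D-001", "program": "DER", "region": "NM",
     "expiration_date": "2005-03-31"},
    {"designee_id": "D-002", "program": "DPE", "region": "",
     "expiration_date": "2004-12-31"},
]

def missing_fields(record):
    """Return the required fields that are absent or blank in a record."""
    return [f for f in REQUIRED_FIELDS if not record.get(f)]

for record in records:
    gaps = missing_fields(record)
    if gaps:
        print(f"{record['designee_id']}: missing {', '.join(gaps)}")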

In addition, we reviewed FAA program guidance concerning designee 
management to obtain an understanding of designee roles and 
responsibilities. We did not verify how FAA delegates authorized 
functions and what certification activities were delegated. We also 
reviewed FAA's Notice of Proposed Rulemaking on the organization 
designation authorization program and public comments on the proposed 
rule, conducted computer literature searches to obtain information on 
other countries' designee programs, and interviewed officials from the 
Canadian civil aviation authority. In addition, we reviewed past 
studies, by us and others, of FAA's designee programs. (See the 
bibliography at the end of this report.) We identified recommendations 
that had been made to improve the programs and determined whether those 
recommendations had been acted upon by the agency. The information 
obtained from the reports and the databases was not equally 
comprehensive or available for all types of designees.

We obtained information and data on FAA's designee programs on visits 
to four locations--Los Angeles, Seattle, Atlanta, and Oklahoma City. 
 We selected the locations based on (1) number of designees in the 
region; (2) activity-level of designees; (3) ratio of inspectors, 
engineers, or flight surgeons to designees; and (4) location of both 
Aircraft Certification directorate offices and Flight Standards Service 
regional offices. Additionally, these offices were selected because of 
the following: (1) the Seattle office has the largest number of 
aircraft certification designees, (2) the Atlanta office has the 
largest number of flight standards designees along with the most 
certification activity, and (3) the Oklahoma City office manages some 
designee data and is the location of FAA's training institute. We 
interviewed individual FAA inspectors and engineers who oversee 
designees at the offices we visited as well as officials from the 
National Air Traffic Controllers Association and Professional Airway 
System Specialists--unions that represent FAA inspectors and engineers. 
We also interviewed designees in Los Angeles, Seattle, and Atlanta. The 
cities and organizations where we conducted our work are shown in table 
5.

Table 5: Organizations Interviewed by GAO During Site Visits:

Location: Seattle, WA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Aircraft Certification. 

Location: Seattle, WA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Aircraft Certification: Transport 
Airplane Directorate. 

Location: Seattle, WA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Aircraft Certification: Manufacturing 
Inspection Office. 

Location: Seattle, WA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Aircraft Certification: Manufacturing 
Inspection District Office. 

Location: Seattle, WA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Aircraft Certification: Manufacturing 
Inspection Satellite Office. 

Location: Seattle, WA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Aircraft Certification: Boeing 
Certificate Management Office.

Location: Seattle, WA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Aerospace Medicine. 

Location: Seattle, WA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Flight Standards Service, Northwest 
Mountain Region. 

Location: Seattle, WA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Flight Standards Service, Northwest 
Mountain Region: Seattle Flight Standards District Office.

Location: Seattle, WA, area; 
Type of entity: Organizational designated airworthiness representative; 
Organization: The Boeing Company. 

Location: Seattle, WA, area; 
Type of entity: Designated airworthiness representative; 
Organization: Pacific Propellers. 

Location: Seattle, WA, area; 
Type of entity: Designated alteration station; 
Organization: Goodrich Aviation Technical Services, Inc. 

Location: Seattle, WA, area; 
Type of entity: Special Federal Aviation Regulations No. 36, repair 
station; 
Organization: Alaska Airlines. 

Location: Atlanta, GA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Aircraft Certification. 

Location: Atlanta, GA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Aircraft Certification: Small Airplane 
Directorate.

Location: Atlanta, GA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Aerospace Medicine. 

Location: Atlanta, GA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Flight Standards Service, Southern 
Region. 

Location: Atlanta, GA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Flight Standards Service, Southern 
Region: Flight Standards Regional Office.

Location: Atlanta, GA, area; 
Type of entity: Designated engineering representative; 
Organization: Garrett Aviation.

Location: Atlanta, GA, area; 
Type of entity: Designated engineering representative; 
Organization: Propulsion Consultants Inc. 

Location: Atlanta, GA, area; 
Type of entity: Designated engineering representative; 
Organization: Delta Airlines.

Location: Los Angeles, CA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Aircraft Certification. 

Location: Los Angeles, CA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Aircraft Certification: Manufacturing 
Inspection District Office.

Location: Los Angeles, CA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Aerospace Medicine. 

Location: Los Angeles, CA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Flight Standards Service, Western 
Pacific Region. 

Location: Los Angeles, CA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Flight Standards Service, Western 
Pacific Region: Los Angeles Flight Standards District Office. 

Location: Los Angeles, CA, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Flight Standards Service, Western 
Pacific Region: Riverside Flight Standards District Office.

Location: Los Angeles, CA, area; 
Type of entity: Designated pilot examiner; 
Organization: Aviation Services. 

Location: Los Angeles, CA, area; 
Type of entity: Designated airworthiness representative/Designated 
engineering representative; 
Organization: CDO Associates. 

Location: Oklahoma City, OK, area; 
Type of entity: Federal government; 
Organization: FAA's Aviation Data Systems. 

Location: Oklahoma City, OK, area; 
Type of entity: Federal government; 
Organization: FAA's Designee Standardization Branch. 

Location: Oklahoma City, OK, area; 
Type of entity: Federal government; 
Organization: FAA's Delegation and Continued Airworthiness Programs 
Branch. 

Location: Oklahoma City, OK, area; 
Type of entity: Federal government; 
Organization: FAA's Medical Systems Branch. 

Location: Oklahoma City, OK, area; 
Type of entity: Federal government; 
Organization: FAA's Aerospace Medical Education Division. 

Location: Oklahoma City, OK, area; 
Type of entity: Federal government; 
Organization: FAA's Civil Aerospace Medical Institute. 

Location: Oklahoma City, OK, area; 
Type of entity: Federal government; 
Organization: FAA's Aerospace Human Factors Research Division. 

Location: Oklahoma City, OK, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Flight Standards Service, Southwest 
Region. 

Location: Oklahoma City, OK, area; 
Type of entity: Federal government; 
Organization: FAA's Office of Flight Standards Service, Southwest 
Region: Oklahoma City Flight Standards District Office.

Source: GAO.

[End of table]

In addition, we convened a Web-based panel of experts selected for 
their knowledge and expertise in the area of FAA's designee programs. 
An initial list of experts was identified through referrals by FAA 
officials, the National Air Traffic Controllers Association, the 
Professional Airway System Specialists, and the Aeronautical Repair 
Station Association and through citations in the literature on
aviation. We then asked these initially identified experts to refer 
us to additional experts. We continued this process until we had 
about 10 to
20 experts in each of four categories: (1) designees, (2) FAA 
inspectors and engineers, (3) independent experts and university 
academics, and (4) private sector and aviation industry associations. 
(See app. II for the list of participating experts.)
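
The referral process described above amounts to a simple 
snowball-sampling loop. The sketch below is a hypothetical 
illustration of that logic; the expert names, categories, and 
referral links are placeholders, and the loop stops adding to a 
category once it reaches the upper end of the target range.

# Hypothetical sketch of referral-driven ("snowball") recruitment;
# the names, categories, and referral links below are placeholders.
TARGET_MAX = 20  # upper end of the 10-to-20 target range per category

referrals = {"expert_a": ["expert_b", "expert_c"], "expert_b": ["expert_d"]}
category = {"expert_a": "designee", "expert_b": "FAA staff",
            "expert_c": "independent", "expert_d": "industry"}

def recruit(initial):
    panel, frontier = {}, list(initial)
    while frontier:
        expert = frontier.pop()
        cat = category.get(expert)
        count = sum(1 for c in panel.values() if c == cat)
        if expert in panel or cat is None or count >= TARGET_MAX:
            continue  # skip duplicates, unknowns, and full categories
        panel[expert] = cat
        frontier.extend(referrals.get(expert, []))  # follow new referrals
    return panel

print(recruit(["expert_a"]))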

To structure and gather expert opinion from the panel, we employed a 
modified version of the Delphi method.[Footnote 36] To obtain opinions 
from the large, diverse group of experts, we incorporated an iterative 
and controlled feedback process--an important feature of the Delphi 
method. We did not encourage experts to arrive at a consensus or to 
make forecasts. During this process, we obtained opinions from the 
experts
using questionnaires administered over the Internet. The experts' 
identities were kept anonymous during this step of the process. The 
anonymity of this approach helped minimize potential biasing effects 
often associated with live group discussions. Biasing effects of live 
expert discussion sessions may include the dominance of individuals and 
group pressure for conformity.[Footnote 37] The dominance bias would 
tend to limit the input of less dominant individuals, and the group 
pressure bias would tend to suppress true opinion, particularly on more 
controversial issues. These concerns were particularly important given 
the need for a broad range of expertise from individuals with varying 
backgrounds and perspectives. Also, by creating a Web-based panel we 
were able to include many more experts than we could have if we had 
convened a live panel.

In the first phase of the expert panel, which ran from October 2 to 31, 
2003, we asked the panelists to respond to three open-ended questions: 
(1) What, if any, are the three most significant strengths of the FAA 
designee programs? (2) What, if any, are the three most significant 
weaknesses of the FAA designee programs? and (3) What, if any, are your
suggestions for addressing the weaknesses of or otherwise improving the 
FAA designee programs? We further asked them to indicate those 
responses that referred only to specific types of designees. The three 
questions, based on our study objectives, were pre-tested to ensure 
that the questionnaire was clear and unambiguous, did not place undue 
burden on individuals completing it, and was independent and unbiased. 
We made relevant changes before we deployed the first questionnaire to 
all participants on the Internet.

We performed a content analysis of the responses to the open-ended 
questions in order to compile a list of all the strengths, weaknesses, 
and improvements mentioned by the experts. When responses were 
unclear, we contacted the experts for clarification. About 25 percent 
of the
coded responses were reviewed by an independent coder to ensure that 
the initial coding decisions were consistent and valid. To maintain 
standards of methodological integrity, any disagreements in coding 
between the coder and reviewer were discussed until consensus was 
reached.
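
The independent review can be expressed as a simple agreement check 
over a 25 percent sample. The sketch below is a hypothetical 
illustration; the response identifiers and codes are placeholders, 
and the flagged disagreements stand in for the items that would be 
discussed to consensus.

import random

# Hypothetical sketch of the 25 percent inter-coder review; the coded
# responses and codes below are placeholders.
primary = {1: "oversight", 2: "selection", 3: "training", 4: "workload",
           5: "oversight", 6: "selection", 7: "training", 8: "workload"}

random.seed(0)
sample = random.sample(sorted(primary), k=len(primary) // 4)  # ~25 percent

# The independent coder recodes the sampled responses; here we flip
# one code to fake a disagreement for illustration.
independent = {i: primary[i] for i in sample}
independent[sample[0]] = "other"

agreed = [i for i in sample if primary[i] == independent[i]]
print(f"agreement: {len(agreed)} of {len(sample)} sampled responses")
print("discuss to consensus:", sorted(i for i in sample if i not in agreed))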

The content-coded results from the phase I questionnaire consisted of a
list of distinct and specific strengths, weaknesses, and suggested 
improvements, which were used to construct the phase II questionnaire. 
The phase II questionnaire also served as a feedback mechanism to the 
panelists about what other experts thought were important strengths and 
weaknesses. The phase II questionnaire was also pre-tested, revised, 
and then administered on the Internet from January 5 to March 30, 2004.

In phase II, the panelists rated the strengths, weaknesses, and 
suggested improvements on various relevant dimensions using a five-
category scale (e.g., "no weakness" to "very great weakness," or 
"definitely infeasible" to "definitely feasible"). In analyzing the 
responses to the phase II questionnaire, we calculated the frequency of 
responses to identify the strongest levels of opinions on each item 
regarding the strength, weakness, or attractiveness (based on 
importance and feasibility) of suggested improvements. We ranked the 
results by the number of responses in the top two categories (e.g., 
the combined number of "great weakness" and "very great weakness" 
responses), so that the most frequently identified items ranked 
highest.
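
As a concrete illustration of this ranking rule, the short sketch 
below applies it to the question 7 tallies ("overall weaknesses") 
reported in appendix IV; the code itself is ours and merely restates 
the rule.

# Ranking survey items by the combined count of responses in the top
# two categories, using the question 7 tallies from appendix IV.
tallies = {  # item: (count of "great", count of "very great")
    "FAA oversight": (14, 12),
    "Workload of FAA inspectors and engineers": (15, 11),
    "Training for FAA inspectors and engineers": (12, 10),
    "Designee selection process": (10, 7),
    "Training for designees": (11, 6),
    "Designee activities": (12, 4),
}

ranked = sorted(tallies, key=lambda item: sum(tallies[item]), reverse=True)
for rank, item in enumerate(ranked, start=1):
    print(rank, item, "top-two count:", sum(tallies[item]))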

Initially, 78 experts agreed to participate in the panel. Fifty-eight 
panelists actually completed the phase I questionnaire, resulting in a 
response rate of 74 percent. There was some attrition during the 
subsequent phase. Of the 76 experts who agreed to participate in phase 
II, 62 actually completed the questionnaire (including some who did not 
participate in phase I). This resulted in an 82 percent response rate
for phase II (see table 6).

Table 6: The Number of Panelists Participating in Each Phase and 
Response Rate:

Phase: I; 
Experts who agreed to participate: 78; 
Experts responding to questionnaire: 58; 
Response rate: 74%.

Phase: II; 
Experts who agreed to participate: 76; 
Experts responding to questionnaire: 62; 
Response rate: 82%.

Source: GAO.

[End of table]
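
As a quick check of the rates in table 6, each response rate is 
simply the number of completed questionnaires divided by the number 
of experts who agreed to participate:

# Verifying the response rates reported in table 6.
phases = {"I": (58, 78), "II": (62, 76)}  # phase: (responded, agreed)
for phase, (responded, agreed) in phases.items():
    print(f"Phase {phase}: {responded}/{agreed} = {responded / agreed:.0%}")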

We conducted our work between April 2003 and October 2004 in 
accordance with generally accepted government auditing standards.

[End of section]

Appendix II: Experts Participating on GAO's Panel:

Independent expert or university affiliation: 
Roger Bacchieri, Chair, 
Air Traffic Management Division, 
Daniel Webster College.

Independent expert or university affiliation: 
Patricia Backer, Chair, 
Department of Aviation and Technology, 
San Jose State University.

Independent expert or university affiliation: 
William Caldwell, Chair, 
Department of Aviation, 
Central Missouri State University.

Independent expert or university affiliation: 
Thomas J. Connolly, Associate Dean, 
College of Aviation, 
Embry-Riddle Aeronautical University.

Independent expert or university affiliation: 
Bart J. Crotty, 
Aviation Safety/Security Consultant, 
former FAA Airworthiness Inspector, 
former FAA Designated Airworthiness Representative.

Independent expert or university affiliation: 
Alfred Dickinson, Director, 
Aviation Safety Program, 
University of Southern California.

Independent expert or university affiliation: 
Carey L. Freeman, Chair, 
Aviation Department, 
Hampton University.

Independent expert or university affiliation: 
Jim Frisbee, 
Aviation Consultant, 
former Director of Quality Assurance, 
Northwest Airlines.

Independent expert or university affiliation: 
Larry Gross, 
Associate Professor of Aviation Technology, 
Purdue University.

Independent expert or university affiliation: 
Gary Kitely, Executive Director, 
Council on Aviation Accreditation.

Independent expert or university affiliation: 
Nick Lacey, 
Aviation Consultant, 
Morten Beyer and Agnew, 
former Director of FAA's Flight Standards Service.

Independent expert or university affiliation: 
Doug Latia, 
Associate Professor, 
Aviation Technology Department, 
Purdue University.

Independent expert or university affiliation: 
Fred Leonelli, 
former manager of FAA's Aircraft Maintenance Division.

Independent expert or university affiliation: 
Kent Lovelace, Chair, 
Department of Aerospace, 
University of North Dakota.

Independent expert or university affiliation: 
Jacqueline B. Sanders, 
Assistant to the Provost, 
Mercer County Community College.

Independent expert or university affiliation: 
Glynn Dale Sistrunk, Chair, 
Department of Professional Aviation, 
Louisiana Tech University.

Aviation industry: 
Mark Arcelle, 
Senior Manager of Fleet Engineering, 
FedEx Express.

Aviation industry: 
Melissa Bailey, 
Vice President of Air Traffic Regulation and Certification, 
Aircraft Owners and Pilots Association.

Aviation industry: 
Tony Broderick, 
Aviation Safety Consultant, 
former FAA Associate Administrator.

Aviation industry: 
Eric Byer, 
Manager of Government Industry Affairs, 
National Air Transportation Association.

Aviation industry: 
Aubrey Carter, 
General Manager of Enabling Technology, 
Delta Airlines.

Aviation industry: 
Elias Cotti, 
Director of Technical Operations, 
National Business Aviation Association.

Aviation industry: 
Brian Finnegan, President, 
Professional Aviation Maintenance Association.

Aviation industry: 
John Frisbee, 
Manager of Quality Assurance, 
Champion Airline.

Aviation industry: 
Rick Hoy, Manager, 
Regulatory Compliance, 
Delta Airlines.

Aviation industry: 
Sarah Macleod, Executive Director, 
Aeronautical Repair Station Association.

Aviation industry: 
Doug MacNair, Vice President, 
Government Relations, 
Experimental Aircraft Association.

Aviation industry: 
Nick Mateo, Senior Director, 
Technical Services, 
Continental Airlines.

Aviation industry: 
Thomas McSweeny, 
Director of International Safety and Regulatory Affairs, Boeing, 
former FAA Associate Administrator of Regulation and Certification, 
former FAA Director of Aircraft Certification Service.

Aviation industry: 
Rick Oehme, Vice President, 
Quality and Engineering, 
America West Airlines.

Aviation industry: 
Richard Peri, Vice President, 
Government and Industry Affairs, 
Aircraft Electronics Association.

Aviation industry: 
Robert Robeson, Jr., Vice President of Civil Aviation, 
Aerospace Industries Association.

Aviation industry: 
Stan Sorscher, Labor Representative, 
Society of Professional Engineering Employees in Aerospace.

Aviation industry: 
Ronald Swanda, Vice President of Operations, 
General Aviation Manufacturers Association.

Aviation industry: 
Mark Szkirpan, Senior Specialist of Regulatory Affairs, 
American Airlines.

Designees: 
David Bryman, D.O., 
Senior Aviation Medical Examiner.

Designees: 
Thomas W. Carroll, 
Designated Airworthiness Representative, 
former FAA Supervisory Aviation Safety Inspector.

Designees: 
Harold Coralnick, M.D., 
Senior Aviation Medical Examiner.

Designees: 
Dominick P. DaCosta, 
Designated Airworthiness Representative, 
Designated Engineering Representative, 
Chief Executive Officer of DERS Group Inc.

Designees: 
Joseph Kilpatrick, 
Designated Engineering Representative.

Designees: 
Osvaldo Lopez, 
Designated Engineering Representative.

Designees: 
Joe Norris, 
Designated Airworthiness Representative.

Designees: 
David Orfant, 
CDO Associates, 
Designated Airworthiness Representative (Manufacturing and 
Maintenance), 
Designated Engineering Representative.

Designees: 
Thomas C. Willis, 
Designated Airworthiness Representative (Maintenance). 

Source: GAO.

Note: In addition to the experts listed above, 7 inspectors from FAA's 
Flight Standards Service, 10 engineers from FAA's Aircraft 
Certification Service, and 1 other designee participated on the panel.

[End of table]

[End of section]

Appendix III: Roles and Responsibilities of Designees:

Designee type: Individual designees: Aviation Medical Examiners; 
Responsibilities: Authorized to accept applications and perform 
physical examinations necessary to determine qualification for the 
issuance of airman medical certificates and combined medical/student 
pilot certificates. Designees can issue, defer, or deny the 
certificates, as appropriate.

Designee type: Individual designees: Designated Engineering 
Representatives; 
Responsibilities: Authorized to examine and approve certain 
engineering technical data for their employer. Designees can either be 
employed by a company or act as free agents.

Designee type: Individual designees: Designated Manufacturing 
Inspection Representatives; 
Responsibilities: Authorized to perform conformity inspections[A] and 
issue airworthiness certificates and approvals for products and parts 
produced by FAA-approved production approval holders.[B] Production 
approval holders or their authorized suppliers employ this type of 
designee.

Designee type: Individual designees: Training Center Evaluators; 
Responsibilities: Authorized to accept applications and conduct 
practical tests leading to the issuance of pilot and flight instructor 
certificates.

Designee type: Individual designees: Designated Pilot Examiners; 
Responsibilities: Authorized to accept applications for flight tests, 
conduct those tests, and issue temporary pilot certificates to 
qualified applicants.

Designee type: Individual designees: Aircrew Program Designees; 
Responsibilities: Authorized to perform airman certification in one
type of aircraft for an operator's pilots who have been trained under 
the operator's FAA-approved training program.

Designee type: Individual designees: Designated Airworthiness 
Representatives (maintenance); 
Responsibilities: Authorized to perform certain inspections, including 
issuing recurrent airworthiness certificates and approvals for 
maintenance conducted by repair stations and air carriers.

Designee type: Individual designees: Designated Airworthiness 
Representatives (manufacturing); 
Responsibilities: Authorized to perform conformity inspections and 
issue airworthiness certificates and approvals for products and parts 
produced by FAA-approved production approval holders. Designees are 
independent
individuals, but may be employed by the production approval holder.

Designee type: Individual designees: Designated Mechanic Examiners; 
Responsibilities: Authorized to accept applications for and conduct 
oral and practical tests for issuing mechanic certificates.

Designee type: Individual designees: Designated Parachute Rigger 
Examiners; 
Responsibilities: Authorized to accept applications for, and conduct, 
oral and practical tests for issuing parachute rigger certificates.

Designee type: Individual designees: Designated Aircraft Dispatcher 
Examiners; 
Responsibilities: Authorized to accept applications for, and conduct, 
written and practical tests necessary for issuing aircraft dispatcher 
certificates and, at the discretion of a local Flight Standards 
inspector, issue temporary aircraft dispatcher certificates to 
qualified applicants.

Designee type: Individual designees: Designated Flight Engineer 
Examiners; 
Responsibilities: Authorized to perform airman certification for an 
operator's flight engineer candidates who have been trained under the 
operator's FAA-approved training program.

Designee type: Individual designees: Computer Testing Designee; 
Responsibilities: Authorized to administer computerized airman 
knowledge tests through computer test sites located throughout the 
United States and authorized foreign locations.

Designee type: Organizational designees: Organizational Designated 
Airworthiness Representatives (maintenance); 
Responsibilities: Organizations that (1) hold repair station 
certificates with appropriate ratings or air carrier operating 
certificates with FAA-approved Continuous Airworthiness Maintenance 
programs and (2) are authorized to issue recurrent airworthiness 
certificates and export airworthiness approvals for certain products.

Designee type: Organizational designees: Organizational Designated 
Airworthiness Representatives (manufacturing); 
Responsibilities: Organizations that hold FAA production approvals and 
are authorized to issue airworthiness certificates and approvals and 
make conformity determinations.

Designee type: Organizational designees: Designated Alteration 
Stations; 
Responsibilities: Companies that hold a current domestic repair 
station certificate and are manufacturers of a product for which they 
have alteration authority. The designees are authorized to issue 
supplemental type certificates, perform prototype conformity 
inspections,[C] and issue experimental airworthiness certificates for 
the purpose of flight-testing and the standard airworthiness 
certificate after the supplemental type certificate has been issued.

Designee type: Organizational designees: Special Federal Aviation 
Regulations No. 36, Repair Stations; 
Responsibilities: Companies that are authorized to generate engineering 
technical data that are acceptable to the FAA. These data can be used 
only by the specific designee for major repairs.

Designee type: Organizational designees: Delegation Option 
Authorizations; 
Responsibilities: Companies that are authorized to obtain type 
certificates, approve type design changes, conduct conformity 
inspections, and issue airworthiness certificates and approvals. 

Source: GAO analysis of FAA documents.

[A] Conformity inspection is an assessment necessary to determine that 
aviation products and related parts conform to an approved design and 
can be operated safely.

[B] Production approval holders are aircraft manufacturers that hold a 
type or production certificate and can produce modification or 
replacement parts.

[C] Prototype conformity inspection is an examination to verify an 
applicant's compliance with federal regulations and determine that 
prototype products and related parts conform to proposed design 
drawings and specifications.

[End of table]

[End of section]

Appendix IV: Survey Instrument and Results:

This appendix presents the results from the expert panel on the 
identified strengths and weaknesses of the designee programs and on 
what can be done to address those weaknesses or otherwise improve the 
programs. Included here are the questions and the rankings of 
responses, which were developed based on the frequency of responses 
to the two questionnaires completed by members of the panel (referred 
to as "phase I" and "phase II"). We administered the questionnaires 
for phases I and II over the Internet.

As discussed in appendix I, in phase I of the expert panel, we asked 
the panelists to respond to open-ended questions about the identified 
strengths, weaknesses, and the potential of other alternatives to 
improve FAA's designee programs. We performed a content analysis on the 
responses to the open-ended questions in order to develop closed-ended
questions for phase II of the expert panel. The purpose of the second 
phase was to provide the panelists with the opportunity to consider the 
other panelists' responses to the first phase and to respond in a 
structured, quantifiable way. Phase II consisted of 64 closed-ended 
questions on the categorized responses to phase I. Sixty-two of the 76 
experts completed the phase II survey (about 82 percent response rate). 
Table 7 summarizes the results from phase II, ranked by the number of 
responses in the top two categories (e.g., the combined number of 
"great" and "very great" responses), so that the most frequently 
identified items appear highest.

Table 7: Experts' Responses to GAO's Survey:

Strengths of FAA's designee programs: 

1; How important, if at all, is each of the following strengths of 
FAA's designee programs toward accomplishing FAA's safety 
responsibilities?

Strengths: a; Use of designees expands available FAA resources; 
No: 0; 
Some: 3; 
Moderate: 9; 
Great: 16; 
Very great: 33; 
Don’t know/No opinion: 1; 
No response: 0.

Strengths: b; Use of designees allows for more timely approvals than by 
not using designees; 
No: 0; 
Some: 3; 
Moderate: 10; 
Great: 15; 
Very great: 33; 
Don’t know/No opinion: 1; 
No response: 0.

Strengths: c; Use of designees expands available technical expertise; 
No: 2; 
Some: 2; 
Moderate: 11; 
Great: 19; 
Very great: 27; 
Don’t know/No opinion: 1; 
No response: 0.

Strengths: d; Use of designees enables FAA staff to concentrate on 
other areas of aviation safety; 
No: 2; 
Some: 5; 
Moderate: 13; 
Great: 19; 
Very great: 20; 
Don’t know/No opinion: 2; 
No response: 1.

Strengths: e; Designees provide greater scheduling flexibility and 
access to the public; 
No: 1; 
Some: 7; 
Moderate: 12; 
Great: 12; 
Very great: 27; 
Don’t know/No opinion: 2; 
No response: 1.

Strengths: f; Use of designees allows for greater geographic coverage; 
No: 3; 
Some: 7; 
Moderate: 12; 
Great: 15; 
Very great: 23; 
Don’t know/No opinion: 1; 
No response: 1.

Strengths: g; Designees also perform a liaison role, improving 
relations between FAA and the aviation community; 
No: 5; 
Some: 12; 
Moderate: 16; 
Great: 19; 
Very great: 7; 
Don’t know/No opinion: 3; 
No response: 0.

Strengths: h; Designees help educate FAA engineers and inspectors; 
No: 10; 
Some: 7; 
Moderate: 15; 
Great: 17; 
Very great: 8; 
Don’t know/No opinion: 4; 
No response: 1.

Strengths: i; Designees provide consistent certification because they 
receive recurrent training; 
No: 7; 
Some: 14; 
Moderate: 16; 
Great: 13; 
Very great: 7; 
Don’t know/No opinion: 4; 
No response: 1.

Strengths: j; Designees provide a pool of resources from which to draw 
when filling positions at FAA; 
No: 11; 
Some: 15; 
Moderate: 16; 
Great: 10; 
Very great: 4; 
Don’t know/No opinion: 5; 
No response: 1.

Weaknesses of FAA's designee programs: 

2; How much of a weakness is each of the following factors related to 
the workload of FAA inspectors and aircraft certification engineers who 
oversee designees?

Factors: a; Numbers of FAA inspectors and engineers not increasing 
commensurate with industry growth; 
No: 1; 
Some: 5; 
Moderate: 17; 
Great: 12; 
Very great: 20; 
Don’t know/No opinion: 7; 
No response: 0.

Factors: b; Backlog of work submitted by designees awaiting approval/
concurrence by FAA; 
No: 2; 
Some: 6; 
Moderate: 18; 
Great: 16; 
Very great: 11; 
Don’t know/No opinion: 8; 
No response: 1.

Factors: c; FAA inspectors and engineers do not have enough time to 
provide adequate oversight of designees for whom they are responsible; 
No: 5; 
Some: 4; 
Moderate: 16; 
Great: 17; 
Very great: 10; 
Don’t know/No opinion: 8; 
No response: 2.

Factors: d; Insufficient number of FAA inspectors/engineers compared 
with designees to provide adequate oversight; 
No: 3; 
Some: 8; 
Moderate: 23; 
Great: 13; 
Very great: 8; 
Don’t know/No opinion: 7; 
No response: 0.

Factors: e; Applies only to Designated Pilot Examiners (DPE): High 
turnover rate of FAA inspectors responsible for overseeing DPEs; 
No: 2; 
Some: 2; 
Moderate: 7; 
Great: 5; 
Very great: 2; 
Don’t know/No opinion: 25; 
No response: 19.

3; How much of a weakness is each of the following factors related to 
the designee selection process?

Factors: a; Local FAA offices appoint designees based on personal 
associations rather than qualifications and experiences; 
No: 7; 
Some: 15; 
Moderate: 12; 
Great: 10; 
Very great: 11; 
Don’t know/No opinion: 7; 
No response: 0.

Factors: b; Shortage of designees in some geographic areas and in 
certain specializations; 
No: 4; 
Some: 6; 
Moderate: 23; 
Great: 15; 
Very great: 5; 
Don’t know/No opinion: 9; 
No response: 0.

Factors: c; FAA limits the number of designees; 
No: 8; 
Some: 7; 
Moderate: 18; 
Great: 13; 
Very great: 7; 
Don’t know/No opinion: 8; 
No response: 1.

Factors: d; FAA does not follow its own selection criteria; 
No: 4; 
Some: 17; 
Moderate: 13; 
Great: 11; 
Very great: 8; 
Don’t know/No opinion: 8; 
No response: 1.

Factors: e; The selection process lacks sufficient rigor to ensure 
that designees are competent and will perform high quality work; 
No: 3; 
Some: 22; 
Moderate: 14; 
Great: 9; 
Very great: 9; 
Don’t know/No opinion: 4; 
No response: 1.

Factors: f; Variation in the qualifications of designees; 
No: 3; 
Some: 19; 
Moderate: 19; 
Great: 13; 
Very great: 4; 
Don’t know/No opinion: 3; 
No response: 1.

Factors: g; The application process for becoming a designee takes a 
long time; 
No: 8; 
Some: 18; 
Moderate: 15; 
Great: 10; 
Very great: 3; 
Don’t know/No opinion: 8; 
No response: 0.

Factors: h; The selection process is not well defined; 
No: 8; 
Some: 24; 
Moderate: 13; 
Great: 8; 
Very great: 4; 
Don’t know/No opinion: 4; 
No response: 1.

4; How much of a weakness is each of the following factors related to 
designee activities?

Factors: a; Applicants for certification shop for "easy" designees; 
No: 0; 
Some: 11; 
Moderate: 14; 
Great: 11; 
Very great: 18; 
Don’t know/No opinion: 6; 
No response: 2.

Factors: b; Employer pressure or financial incentives may lead to 
conflicts of interest; 
No: 10; 
Some: 13; 
Moderate: 15; 
Great: 8; 
Very great: 12; 
Don’t know/No opinion: 4; 
No response: 0.

Factors: c; Some applicants for certification are unfamiliar with FAA 
requirements and designees' authority limits; 
No: 7; 
Some: 12; 
Moderate: 21; 
Great: 12; 
Very great: 6; 
Don’t know/No opinion: 1; 
No response: 3.

Factors: d; Designees' fees are inconsistent and unregulated; 
No: 8; 
Some: 15; 
Moderate: 12; 
Great: 9; 
Very great: 8; 
Don’t know/No opinion: 7; 
No response: 3.

Factors: e; Designees perform beyond their delegated authority; 
No: 7; 
Some: 21; 
Moderate: 10; 
Great: 10; 
Very great: 6; 
Don’t know/No opinion: 6; 
No response: 2.

Factors: f; Designees provide inconsistent service; 
No: 6; 
Some: 14; 
Moderate: 25; 
Great: 9; 
Very great: 6; 
Don’t know/No opinion: 2; 
No response: 0.

Factors: g; Designees perform more activities in less time than 
standards would seem to require; 
No: 8; 
Some: 19; 
Moderate: 11; 
Great: 10; 
Very great: 4; 
Don’t know/No opinion: 9; 
No response: 1.

Factors: h; Designees are not current on regulations and orders; 
No: 11; 
Some: 19; 
Moderate: 18; 
Great: 9; 
Very great: 4; 
Don’t know/No opinion: 1; 
No response: 0.

Factors: i; Designees are constrained geographically; 
No: 9; 
Some: 20; 
Moderate: 15; 
Great: 10; 
Very great: 3; 
Don’t know/No opinion: 3; 
No response: 2.

Factors: j; Companies with organizational designations appoint 
inexperienced engineers to make approvals and do not train them in the 
certification process; 
No: 8; 
Some: 12; 
Moderate: 8; 
Great: 5; 
Very great: 7; 
Don’t know/No opinion: 19; 
No response: 3.

Factors: k; Erroneous certification by designees; 
No: 7; 
Some: 18; 
Moderate: 15; 
Great: 5; 
Very great: 7; 
Don’t know/No opinion: 7; 
No response: 3.

Factors: l; Designees do not understand their full authority; 
No: 8; 
Some: 19; 
Moderate: 18; 
Great: 5; 
Very great: 6; 
Don’t know/No opinion: 4; 
No response: 2.

Factors: m; The current scope of organizational delegation is narrow; 
No: 9; 
Some: 18; 
Moderate: 10; 
Great: 8; 
Very great: 3; 
Don’t know/No opinion: 12; 
No response: 2.

Factors: n; Designees perform outside of their jurisdiction without the 
knowledge and authorization of local FAA offices; 
No: 9; 
Some: 17; 
Moderate: 12; 
Great: 5; 
Very great: 4; 
Don’t know/No opinion: 15; 
No response: 0.

Factors: o; Limitations on the approval authority of designees; 
No: 10; 
Some: 23; 
Moderate: 16; 
Great: 5; 
Very great: 2; 
Don’t know/No opinion: 4; 
No response: 2.

Factors: p; Applies only to Designated Mechanic Examiners: Inflexible 
procedures for testing candidates for A&P certificates; 
No: 4; 
Some: 3; 
Moderate: 6; 
Great: 2; 
Very great: 2; 
Don’t know/No opinion: 25; 
No response: 20.

5; How much of a weakness is each of the following factors related to 
FAA oversight?

Factors: a; FAA offices' level of oversight and interpretation of rules 
are inconsistent; 
No: 2; 
Some: 5; 
Moderate: 17; 
Great: 16; 
Very great: 20; 
Don’t know/No opinion: 0; 
No response: 2.

Factors: b; Inactive, unqualified, or poorly performing designees are 
not identified or removed expeditiously; 
No: 1; 
Some: 8; 
Moderate: 22; 
Great: 15; 
Very great: 12; 
Don’t know/No opinion: 3; 
No response: 1.

Factors: c; It is difficult to terminate poorly performing designees; 
No: 2; 
Some: 9; 
Moderate: 16; 
Great: 6; 
Very great: 17; 
Don’t know/No opinion: 12; 
No response: 0.

Factors: d; Inadequate surveillance and oversight of designees; 
No: 6; 
Some: 13; 
Moderate: 15; 
Great: 8; 
Very great: 14; 
Don’t know/No opinion: 4; 
No response: 2.

Factors: e; FAA has not made oversight of designees a high enough 
priority; 
No: 4; 
Some: 9; 
Moderate: 18; 
Great: 12; 
Very great: 8; 
Don’t know/No opinion: 8; 
No response: 3.

Factors: f; Multitude of bulletins, advisory circulars, and other 
documents from FAA have resulted in conflicting information and 
procedures; 
No: 1; 
Some: 11; 
Moderate: 23; 
Great: 10; 
Very great: 10; 
Don’t know/No opinion: 2; 
No response: 5.

Factors: g; FAA management does not agree with engineers' or 
inspectors' judgment about disciplining or removing poorly performing 
designees; 
No: 4; 
Some: 11; 
Moderate: 7; 
Great: 9; 
Very great: 10; 
Don’t know/No opinion: 18; 
No response: 3.

Factors: h; Oversight process is burdensome for FAA staff; 
No: 6; 
Some: 11; 
Moderate: 17; 
Great: 8; 
Very great: 10; 
Don’t know/No opinion: 9; 
No response: 1.

Factors: i; Designees are not held accountable for their findings; 
No: 10; 
Some: 19; 
Moderate: 8; 
Great: 7; 
Very great: 11; 
Don’t know/No opinion: 5; 
No response: 2.

Factors: j; FAA does not terminate poorly performing organizational 
designees because that would put an entire company out of business; 
No: 4; 
Some: 13; 
Moderate: 10; 
Great: 9; 
Very great: 8; 
Don’t know/No opinion: 17; 
No response: 1.

Factors: k; FAA does not have adequate authority to impose penalties on 
certain types of designees; 
No: 9; 
Some: 14; 
Moderate: 11; 
Great: 8; 
Very great: 8; 
Don’t know/No opinion: 10; 
No response: 2.

Factors: l; Lack of FAA process to evaluate the designee programs; 
No: 5; 
Some: 11; 
Moderate: 19; 
Great: 10; 
Very great: 5; 
Don’t know/No opinion: 7; 
No response: 5.

Factors: m; Lack of independent review of data. Designees perform the 
analysis of the data that they then approve. The data are not reviewed 
by a different person; 
No: 6; 
Some: 15; 
Moderate: 10; 
Great: 9; 
Very great: 5; 
Don’t know/No opinion: 12; 
No response: 5.

Factors: n; FAA engineers duplicate efforts of designees; 
No: 4; 
Some: 20; 
Moderate: 8; 
Great: 4; 
Very great: 6; 
Don’t know/No opinion: 17; 
No response: 3.

Factors: o; FAA engineers are reluctant to delegate routine activities 
to designees; 
No: 6; 
Some: 11; 
Moderate: 10; 
Great: 6; 
Very great: 4; 
Don’t know/No opinion: 21; 
No response: 4.

Factors: p; FAA management pressures FAA engineers to give designees' 
findings less scrutiny than standards require; 
No: 4; 
Some: 14; 
Moderate: 3; 
Great: 1; 
Very great: 9; 
Don’t know/No opinion: 25; 
No response: 6.

Factors: q; FAA inspectors and engineers lack the level of professional 
experience necessary to oversee designees; 
No: 9; 
Some: 19; 
Moderate: 12; 
Great: 7; 
Very great: 2; 
Don’t know/No opinion: 9; 
No response: 4.

Factors: r; The designee programs lack formal methods of appeal when 
designees' privileges are revoked; 
No: 14; 
Some: 13; 
Moderate: 12; 
Great: 5; 
Very great: 2; 
Don’t know/No opinion: 12; 
No response: 4.

Factors: s; Designees as well as the FAA inspectors/engineers who 
oversee them have little or no familiarity with the products upon which 
findings are being made; 
No: 6; 
Some: 23; 
Moderate: 12; 
Great: 2; 
Very great: 4; 
Don’t know/No opinion: 10; 
No response: 5.

Factors: t; FAA field office staffs do not have complete knowledge of 
designees within their jurisdictions; 
No: 5; 
Some: 13; 
Moderate: 23; 
Great: 3; 
Very great: 3; 
Don’t know/No opinion: 10; 
No response: 5.

Factors: u; Applies only to Aviation Medical Examiners (AME): Error 
letters are inaccurate indicators of an AME's performance; 
No: 1; 
Some: 0; 
Moderate: 3; 
Great: 1; 
Very great: 1; 
Don’t know/No opinion: 31; 
No response: 25.

6; How much of a weakness is each of the following factors related to 
training for designees, FAA inspectors, and FAA engineers?

Factors: a; FAA engineers and inspectors do not receive adequate 
training in designee oversight; 
No: 6; 
Some: 6; 
Moderate: 12; 
Great: 14; 
Very great: 7; 
Don’t know/No opinion: 12; 
No response: 5.

Factors: b; Lack of adequate and accessible designee training; 
No: 5; 
Some: 15; 
Moderate: 19; 
Great: 10; 
Very great: 4; 
Don’t know/No opinion: 5; 
No response: 4.

Factors: c; Designees are technically well versed in the area in which 
they are authorized but poorly educated in the relevant regulations; 
No: 8; 
Some: 22; 
Moderate: 13; 
Great: 7; 
Very great: 5; 
Don’t know/No opinion: 3; 
No response: 4.

Factors: d; Seminar instructors for designee training are not current 
or knowledgeable in the subject matter; 
No: 14; 
Some: 12; 
Moderate: 14; 
Great: 6; 
Very great: 4; 
Don’t know/No opinion: 8; 
No response: 4.

Factors: e; Training disparity between FAA engineers and designees 
results in designees being more current on new orders, advisories, and 
policies; 
No: 12; 
Some: 13; 
Moderate: 10; 
Great: 4; 
Very great: 5; 
Don’t know/No opinion: 13; 
No response: 5.

Overall Weaknesses of FAA’s Designee Programs: 

7; How much weakness, overall, is there in each of the following main 
areas of FAA's designee programs?

Weakness: a; FAA oversight; 
No: 2; 
Some: 10; 
Moderate: 18; 
Great: 14; 
Very great: 12; 
Don’t know/No opinion: 5; 
No response: 1.

Weakness: b; Workload of FAA inspectors and aircraft certification 
engineers who oversee designees; 
No: 4; 
Some: 5; 
Moderate: 16; 
Great: 15; 
Very great: 11; 
Don’t know/No opinion: 10; 
No response: 1.

Weakness: c; Training for FAA inspectors and FAA engineers who oversee 
designees; 
No: 3; 
Some: 5; 
Moderate: 18; 
Great: 12; 
Very great: 10; 
Don’t know/No opinion: 11; 
No response: 3.

Weakness: d; Designee selection process; 
No: 4; 
Some: 18; 
Moderate: 15; 
Great: 10; 
Very great: 7; 
Don’t know/No opinion: 6; 
No response: 2.

Weakness: e; Training for designees; 
No: 7; 
Some: 13; 
Moderate: 17; 
Great: 11; 
Very great: 6; 
Don’t know/No opinion: 4; 
No response: 4.

Weakness: f; Designee activities; 
No: 6; 
Some: 20; 
Moderate: 12; 
Great: 12; 
Very great: 4; 
Don’t know/No opinion: 6; 
No response: 2.

Addressing Weaknesses of or Otherwise Improving FAA 
Inspector/Engineer Workload: 

8; Increase the number of engineers/inspectors so that FAA staff have 
more time available for oversight of designees: 

Question: a; How important is it to implement this improvement? 
No: 4; 
Some: 10; 
Moderate: 14; 
Great: 17; 
Very great: 12; 
Don’t know/No opinion: 4; 
No response: 1.

Question: b; How feasible is it to implement this improvement? 
No: 2; 
Some: 8; 
Moderate: 21; 
Great: 12; 
Very great: 11; 
Don’t know/No opinion: 7; 
No response: 1.

9; Increase the priority given to the oversight of designees within 
FAA: 

Question: a; How important is it to implement this improvement? 
No: 2; 
Some: 8; 
Moderate: 17; 
Great: 24; 
Very great: 4; 
Don’t know/No opinion: 5; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 6; 
Moderate: 13; 
Great: 21; 
Very great: 14; 
Don’t know/No opinion: 6; 
No response: 2.

10; Establish a specific ratio of FAA engineers/inspectors to 
designees: 

Question: a; How important is it to implement this improvement? 
No: 5; 
Some: 15; 
Moderate: 11; 
Great: 15; 
Very great: 6; 
Don’t know/No opinion: 8; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 6; 
Moderate: 16; 
Great: 13; 
Very great: 16; 
Don’t know/No opinion: 8; 
No response: 3.

Addressing Weaknesses of or Otherwise Improving the Designee 
Selection Process: 

11; Select designees according to their qualifications and experience 
rather than on personal associations with FAA managers: 

Question: a; How important is it to implement this improvement? 
No: 0; 
Some: 6; 
Moderate: 7; 
Great: 19; 
Very great: 28; 
Don’t know/No opinion: 1; 
No response: 1.

Question: b; How feasible is it to implement this improvement? 
No: 1; 
Some: 2; 
Moderate: 6; 
Great: 10; 
Very great: 40; 
Don’t know/No opinion: 2; 
No response: 1.

12; Clearly define and consistently follow the criteria for selecting 
designees: 

Question: a; How important is it to implement this improvement? 
No: 0; 
Some: 5; 
Moderate: 7; 
Great: 23; 
Very great: 24; 
Don’t know/No opinion: 1; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 1; 
Some: 1; 
Moderate: 7; 
Great: 24; 
Very great: 24; 
Don’t know/No opinion: 3; 
No response: 2.

13; Establish a review process for determining demand for designees by 
type, specialty, activity level, and geographic location: 

Question: a; How important is it to implement this improvement? 
No: 5; 
Some: 9; 
Moderate: 20; 
Great: 16; 
Very great: 8; 
Don’t know/No opinion: 2; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 3; 
Some: 1; 
Moderate: 11; 
Great: 28; 
Very great: 11; 
Don’t know/No opinion: 4; 
No response: 4.

14; Streamline procedures for the appointment of designees: 

Question: a; How important is it to implement this improvement? 
No: 4; 
Some: 15; 
Moderate: 18; 
Great: 14; 
Very great: 7; 
Don’t know/No opinion: 3; 
No response: 1.

Question: b; How feasible is it to implement this improvement? 
No: 1; 
Some: 1; 
Moderate: 15; 
Great: 25; 
Very great: 16; 
Don’t know/No opinion: 3; 
No response: 1.

15; Centralize the designee selection process: 

Question: a; How important is it to implement this improvement? 
No: 16; 
Some: 6; 
Moderate: 14; 
Great: 14; 
Very great: 9; 
Don’t know/No opinion: 2; 
No response: 1.

Question: b; How feasible is it to implement this improvement? 
No: 7; 
Some: 9; 
Moderate: 13; 
Great: 14; 
Very great: 13; 
Don’t know/No opinion: 5; 
No response: 1.

Addressing Weaknesses of or Otherwise Improving Designee Activities: 

16; Improve FAA communication with designees, including communications 
on regulations and orders and complicated certification situations: 

Question: a; How important is it to implement this improvement? 
No: 0; 
Some: 8; 
Moderate: 8; 
Great: 24; 
Very great: 18; 
Don’t know/No opinion: 3; 
No response: 1.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 0; 
Moderate: 5; 
Great: 28; 
Very great: 23; 
Don’t know/No opinion: 4; 
No response: 2.

17; Clarify designations, including authority and limits: 

Question: a; How important is it to implement this improvement? 
No: 1; 
Some: 8; 
Moderate: 8; 
Great: 25; 
Very great: 14; 
Don’t know/No opinion: 4; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 1; 
Moderate: 5; 
Great: 24; 
Very great: 25; 
Don’t know/No opinion: 5; 
No response: 2.

18; Make company or organizational designees part of a different group 
within the company than the group seeking the certification: 

Question: a; How important is it to implement this improvement? 
No: 6; 
Some: 6; 
Moderate: 13; 
Great: 14; 
Very great: 14; 
Don’t know/No opinion: 8; 
No response: 1.

Question: b; How feasible is it to implement this improvement? 
No: 3; 
Some: 8; 
Moderate: 11; 
Great: 17; 
Very great: 13; 
Don’t know/No opinion: 9; 
No response: 1.

19; Determine if there are additional safety-critical areas that should 
be beyond the scope of designees' authority: 

Question: a; How important is it to implement this improvement? 
No: 5; 
Some: 8; 
Moderate: 12; 
Great: 16; 
Very great: 12; 
Don’t know/No opinion: 8; 
No response: 1.

Question: b; How feasible is it to implement this improvement? 
No: 2; 
Some: 4; 
Moderate: 11; 
Great: 21; 
Very great: 15; 
Don’t know/No opinion: 8; 
No response: 1.

20; Provide individual designees with identification cards listing 
their delegated authorizations that could be requested by and displayed 
to customers: 

Question: a; How important is it to implement this improvement? 
No: 6; 
Some: 7; 
Moderate: 17; 
Great: 11; 
Very great: 16; 
Don’t know/No opinion: 3; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 1; 
Some: 0; 
Moderate: 9; 
Great: 16; 
Very great: 30; 
Don’t know/No opinion: 5; 
No response: 1.

21; Increase FAA participation in complex approvals conducted by a 
Designated Alteration Station (DAS): 

Question: a; How important is it to implement this improvement? 
No: 1; 
Some: 5; 
Moderate: 9; 
Great: 14; 
Very great: 13; 
Don’t know/No opinion: 17; 
No response: 3.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 0; 
Moderate: 5; 
Great: 22; 
Very great: 14; 
Don’t know/No opinion: 18; 
No response: 3.

22; Implement FAA's Organization Designation Authorization proposal 
and provide training for FAA employees on how to oversee a delegated 
organization: 

Question: a; How important is it to implement this improvement? 
No: 8; 
Some: 5; 
Moderate: 9; 
Great: 14; 
Very great: 12; 
Don’t know/No opinion: 11; 
No response: 3.

Question: b; How feasible is it to implement this improvement? 
No: 1; 
Some: 3; 
Moderate: 8; 
Great: 21; 
Very great: 13; 
Don’t know/No opinion: 13; 
No response: 3.

23; Require designees performing work outside of their geographic 
boundaries to notify their home FAA office and the FAA office where the 
work is being performed: 

Question: a; How important is it to implement this improvement? 
No: 8; 
Some: 11; 
Moderate: 13; 
Great: 16; 
Very great: 8; 
Don’t know/No opinion: 4; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 2; 
Some: 2; 
Moderate: 4; 
Great: 18; 
Very great: 27; 
Don’t know/No opinion: 5; 
No response: 4.

24; Implement legislative proposal to establish "certified design 
organizations" (also called "design organization certificates"): 

Question: a; How important is it to implement this improvement? 
No: 14; 
Some: 5; 
Moderate: 9; 
Great: 8; 
Very great: 7; 
Don’t know/No opinion: 16; 
No response: 3.

Question: b; How feasible is it to implement this improvement? 
No: 2; 
Some: 6; 
Moderate: 15; 
Great: 10; 
Very great: 9; 
Don’t know/No opinion: 17; 
No response: 3.

25; Develop a fee structure for what designees may charge: 

Question: a; How important is it to implement this improvement? 
No: 23; 
Some: 16; 
Moderate: 5; 
Great: 8; 
Very great: 7; 
Don’t know/No opinion: 1; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 9; 
Some: 12; 
Moderate: 10; 
Great: 9; 
Very great: 13; 
Don’t know/No opinion: 5; 
No response: 4.

26; Provide designees with broader authority: 

Question: a; How important is it to implement this improvement? 
No: 13; 
Some: 16; 
Moderate: 13; 
Great: 11; 
Very great: 3; 
Don’t know/No opinion: 4; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 3; 
Some: 2; 
Moderate: 17; 
Great: 24; 
Very great: 9; 
Don’t know/No opinion: 4; 
No response: 3.

27; Make public the fees charged by designees: 

Question: a; How important is it to implement this improvement? 
No: 20; 
Some: 12; 
Moderate: 14; 
Great: 5; 
Very great: 7; 
Don’t know/No opinion: 2; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 9; 
Some: 4; 
Moderate: 17; 
Great: 11; 
Very great: 15; 
Don’t know/No opinion: 4; 
No response: 2.

28; Establish a standard for limiting the number of certifications that 
a designee can perform in a given period of time: 

Question: a; How important is it to implement this improvement? 
No: 23; 
Some: 14; 
Moderate: 10; 
Great: 9; 
Very great: 2; 
Don’t know/No opinion: 3; 
No response: 1.

Question: b; How feasible is it to implement this improvement? 
No: 8; 
Some: 11; 
Moderate: 15; 
Great: 13; 
Very great: 8; 
Don’t know/No opinion: 6; 
No response: 1.

29; Assign designees to applicants instead of allowing applicants to 
choose designees: 

Question: a; How important is it to implement this improvement? 
No: 24; 
Some: 13; 
Moderate: 11; 
Great: 4; 
Very great: 4; 
Don’t know/No opinion: 4; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 12; 
Some: 10; 
Moderate: 12; 
Great: 9; 
Very great: 9; 
Don’t know/No opinion: 8; 
No response: 2.

Addressing Weaknesses of or Otherwise Improving FAA Oversight: 

30; Hold designees accountable for their findings: 

Question: a; How important is it to implement this improvement? 
No: 1; 
Some: 0; 
Moderate: 4; 
Great: 22; 
Very great: 31; 
Don’t know/No opinion: 0; 
No response: 4.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 3; 
Moderate: 6; 
Great: 16; 
Very great: 30; 
Don’t know/No opinion: 2; 
No response: 5.

31; Ensure that FAA employees who oversee designees are knowledgeable 
about the regulations, policies, and processes applicable to the 
designee's particular specialization: 

Question: a; How important is it to implement this improvement? 
No: 0; 
Some: 0; 
Moderate: 6; 
Great: 17; 
Very great: 32; 
Don’t know/No opinion: 1; 
No response: 6.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 0; 
Moderate: 6; 
Great: 19; 
Very great: 30; 
Don’t know/No opinion: 2; 
No response: 5.

32; Increase penalties (including the ability to terminate their 
status as designees) for individual and organizational designees who 
violate standards or do not exercise proper judgment: 

Question: a; How important is it to implement this improvement? 
No: 4; 
Some: 3; 
Moderate: 8; 
Great: 15; 
Very great: 28; 
Don’t know/No opinion: 2; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 2; 
Moderate: 4; 
Great: 27; 
Very great: 24; 
Don’t know/No opinion: 3; 
No response: 2.

33; Establish strict criteria and process for identifying and removing 
designees that are underperforming, unqualified, or inactive: 

Question: a; How important is it to implement this improvement? 
No: 0; 
Some: 7; 
Moderate: 11; 
Great: 20; 
Very great: 22; 
Don’t know/No opinion: 2; 
No response: 0.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 1; 
Moderate: 11; 
Great: 22; 
Very great: 25; 
Don’t know/No opinion: 2; 
No response: 1.

34; Improve coordination among the regional offices and headquarters 
to standardize designee oversight: 

Question: a; How important is it to implement this improvement? 
No: 1; 
Some: 3; 
Moderate: 15; 
Great: 20; 
Very great: 18; 
Don’t know/No opinion: 2; 
No response: 3.

Question: b; How feasible is it to implement this improvement? 
No: 1; 
Some: 4; 
Moderate: 9; 
Great: 24; 
Very great: 18; 
Don’t know/No opinion: 3; 
No response: 3.

35; Obtain feedback from users, designees, and other stakeholders 
regarding the certification process and quality of oversight: 

Question: a; How important is it to implement this improvement? 
No: 2; 
Some: 5; 
Moderate: 14; 
Great: 18; 
Very great: 19; 
Don’t know/No opinion: 1; 
No response: 3.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 4; 
Moderate: 6; 
Great: 28; 
Very great: 17; 
Don’t know/No opinion: 3; 
No response: 4.

36; Conduct audits to determine if designees have been given adequate 
oversight: 

Question: a; How important is it to implement this improvement? 
No: 0; 
Some: 7; 
Moderate: 15; 
Great: 26; 
Very great: 9; 
Don’t know/No opinion: 2; 
No response: 3.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 3; 
Moderate: 9; 
Great: 27; 
Very great: 16; 
Don’t know/No opinion: 4; 
No response: 3.

37; Improve FAA's public relations with those in the aviation community 
who use designees by providing timely, knowledgeable responses to 
public inquiries: 

Question: a; How important is it to implement this improvement? 
No: 3; 
Some: 2; 
Moderate: 16; 
Great: 22; 
Very great: 12; 
Don’t know/No opinion: 3; 
No response: 4.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 1; 
Moderate: 8; 
Great: 29; 
Very great: 15; 
Don’t know/No opinion: 5; 
No response: 4.

38; Establish a "whistleblower" program that would grant protection to 
FAA employees who identify problems with the designee programs: 

Question: a; How important is it to implement this improvement? 
No: 10; 
Some: 4; 
Moderate: 9; 
Great: 17; 
Very great: 15; 
Don’t know/No opinion: 3; 
No response: 4.

Question: b; How feasible is it to implement this improvement? 
No: 4; 
Some: 9; 
Moderate: 8; 
Great: 16; 
Very great: 18; 
Don’t know/No opinion: 3; 
No response: 4.

39; Develop competency testing and performance standards for designees: 

Question: a; How important is it to implement this improvement? 
No: 3; 
Some: 9; 
Moderate: 14; 
Great: 18; 
Very great: 13; 
Don’t know/No opinion: 1; 
No response: 4.

Question: b; How feasible is it to implement this improvement? 
No: 3; 
Some: 6; 
Moderate: 11; 
Great: 21; 
Very great: 13; 
Don’t know/No opinion: 3; 
No response: 5.

40; Increase FAA management's support for engineers' and inspectors' 
judgment in disciplining poor-performing designees: 

Question: a; How important is it to implement this improvement? 
No: 2; 
Some: 8; 
Moderate: 9; 
Great: 18; 
Very great: 12; 
Don’t know/No opinion: 7; 
No response: 6.

Question: b; How feasible is it to implement this improvement? 
No: 2; 
Some: 4; 
Moderate: 15; 
Great: 17; 
Very great: 11; 
Don’t know/No opinion: 6; 
No response: 7.

41; Develop a formal process of appeal for designees facing discipline 
or termination: 

Question: a; How important is it to implement this improvement? 
No: 1; 
Some: 8; 
Moderate: 14; 
Great: 19; 
Very great: 9; 
Don’t know/No opinion: 4; 
No response: 7.

Question: b; How feasible is it to implement this improvement? 
No: 1; 
Some: 1; 
Moderate: 10; 
Great: 20; 
Very great: 18; 
Don’t know/No opinion: 6; 
No response: 6.

42; Increase requirements for oversight and surveillance to be 
conducted by FAA inspectors and engineers: 

Question: a; How important is it to implement this improvement? 
No: 8; 
Some: 6; 
Moderate: 11; 
Great: 17; 
Very great: 11; 
Don’t know/No opinion: 4; 
No response: 5.

Question: b; How feasible is it to implement this improvement? 
No: 4; 
Some: 4; 
Moderate: 18; 
Great: 14; 
Very great: 12; 
Don’t know/No opinion: 6; 
No response: 4.

43; Assign oversight responsibility to FAA aircraft certification 
offices based on their knowledge of the product involved rather than 
on the geographic location of the designee: 

Question: a; How important is it to implement this improvement? 
No: 5; 
Some: 6; 
Moderate: 11; 
Great: 18; 
Very great: 8; 
Don’t know/No opinion: 7; 
No response: 7.

Question: b; How feasible is it to implement this improvement? 
No: 2; 
Some: 4; 
Moderate: 15; 
Great: 13; 
Very great: 11; 
Don’t know/No opinion: 10; 
No response: 7.

44; Renew designees based on performance standards, rather than 
allowing renewal to be automatic: 

Question: a; How important is it to implement this improvement? 
No: 4; 
Some: 11; 
Moderate: 13; 
Great: 12; 
Very great: 14; 
Don’t know/No opinion: 3; 
No response: 5.

Question: b; How feasible is it to implement this improvement? 
No: 2; 
Some: 3; 
Moderate: 11; 
Great: 20; 
Very great: 18; 
Don’t know/No opinion: 4; 
No response: 4.

45; Make FAA engineers responsible for understanding and approving the 
results of designee actions rather than checking only the paperwork 
associated with those actions: 

Question: a; How important is it to implement this improvement? 
No: 8; 
Some: 9; 
Moderate: 9; 
Great: 12; 
Very great: 12; 
Don’t know/No opinion: 6; 
No response: 6.

Question: b; How feasible is it to implement this improvement? 
No: 6; 
Some: 9; 
Moderate: 10; 
Great: 15; 
Very great: 9; 
Don’t know/No opinion: 7; 
No response: 6.

46; Reduce the administrative (paperwork) burden of designee oversight: 

Question: a; How important is it to implement this improvement? 
No: 3; 
Some: 8; 
Moderate: 18; 
Great: 19; 
Very great: 5; 
Don’t know/No opinion: 4; 
No response: 5.

Question: b; How feasible is it to implement this improvement? 
No: 3; 
Some: 5; 
Moderate: 18; 
Great: 18; 
Very great: 8; 
Don’t know/No opinion: 6; 
No response: 4.

47; Establish a panel of senior FAA inspectors/engineers to review 
allegations of impropriety by designees, and give the panel the 
authority to impose penalties: 

Question: a; How important is it to implement this improvement? 
No: 6; 
Some: 12; 
Moderate: 14; 
Great: 11; 
Very great: 12; 
Don’t know/No opinion: 5; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 5; 
Some: 7; 
Moderate: 16; 
Great: 14; 
Very great: 12; 
Don’t know/No opinion: 6; 
No response: 2.

48; Develop an automated system that allows designees to complete and 
submit documents electronically only when the documents are completed 
correctly: 

Question: a; How important is it to implement this improvement? 
No: 5; 
Some: 11; 
Moderate: 14; 
Great: 16; 
Very great: 7; 
Don’t know/No opinion: 4; 
No response: 5.

Question: b; How feasible is it to implement this improvement? 
No: 1; 
Some: 5; 
Moderate: 14; 
Great: 19; 
Very great: 12; 
Don’t know/No opinion: 6; 
No response: 5.

49; Develop specific statements or checklists that identify the steps 
in the certification process and the extent of the designee's 
authority: 

Question: a; How important is it to implement this improvement? 
No: 4; 
Some: 5; 
Moderate: 24; 
Great: 16; 
Very great: 4; 
Don’t know/No opinion: 4; 
No response: 5.

Question: b; How feasible is it to implement this improvement? 
No: 1; 
Some: 1; 
Moderate: 17; 
Great: 21; 
Very great: 11; 
Don’t know/No opinion: 7; 
No response: 4.

50; Eliminate geographic boundaries imposed on aircraft certification 
designees: 

Question: a; How important is it to implement this improvement? 
No: 10; 
Some: 7; 
Moderate: 9; 
Great: 9; 
Very great: 10; 
Don’t know/No opinion: 12; 
No response: 5.

Question: b; How feasible is it to implement this improvement? 
No: 2; 
Some: 3; 
Moderate: 10; 
Great: 14; 
Very great: 12; 
Don’t know/No opinion: 15; 
No response: 6.

51; Have FAA inspectors and engineers who oversee designees report to 
a central FAA focal point who is independent of their supervisors: 

Question: a; How important is it to implement this improvement? 
No: 11; 
Some: 5; 
Moderate: 17; 
Great: 13; 
Very great: 3; 
Don’t know/No opinion: 8; 
No response: 5.

Question: b; How feasible is it to implement this improvement? 
No: 5; 
Some: 8; 
Moderate: 19; 
Great: 11; 
Very great: 6; 
Don’t know/No opinion: 7; 
No response: 6.

52; Prohibit designees from approving any documents that they have 
produced: 

Question: a; How important is it to implement this improvement? 
No: 12; 
Some: 13; 
Moderate: 9; 
Great: 6; 
Very great: 9; 
Don’t know/No opinion: 8; 
No response: 5.

Question: b; How feasible is it to implement this improvement? 
No: 8; 
Some: 7; 
Moderate: 11; 
Great: 10; 
Very great: 13; 
Don’t know/No opinion: 8; 
No response: 5.

53; Applies only to Designated Engineering Representatives (DER): Make 
the selection and oversight process for company DERs the same as for 
consultant DERs: 

Question: a; How important is it to implement this improvement? 
No: 5; 
Some: 4; 
Moderate: 12; 
Great: 8; 
Very great: 7; 
Don’t know/No opinion: 16; 
No response: 10.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 2; 
Moderate: 8; 
Great: 14; 
Very great: 11; 
Don’t know/No opinion: 17; 
No response: 10.

54; Limit the ability of designees to contest their removal: 

Question: a; How important is it to implement this improvement? 
No: 20; 
Some: 15; 
Moderate: 9; 
Great: 7; 
Very great: 4; 
Don’t know/No opinion: 6; 
No response: 1.

Question: b; How feasible is it to implement this improvement? 
No: 7; 
Some: 7; 
Moderate: 18; 
Great: 12; 
Very great: 9; 
Don’t know/No opinion: 7; 
No response: 2.

Addressing Weaknesses with or Otherwise Improving Training: 

55; Improve availability of training for FAA inspectors and engineers 
to advance technical competence related to oversight of designees: 

Question: a; How important is it to implement this improvement? 
No: 2; 
Some: 4; 
Moderate: 8; 
Great: 20; 
Very great: 22; 
Don’t know/No opinion: 3; 
No response: 3.

Question: b; How feasible is it to implement this improvement? 
No: 1; 
Some: 1; 
Moderate: 7; 
Great: 30; 
Very great: 16; 
Don’t know/No opinion: 3; 
No response: 4.

56; Ensure standard training of designees within specific specialties 
to improve consistency of their work: 

Question: a; How important is it to implement this improvement? 
No: 1; 
Some: 3; 
Moderate: 14; 
Great: 20; 
Very great: 20; 
Don’t know/No opinion: 0; 
No response: 4.

Question: b; How feasible is it to implement this improvement? 
No: 1; 
Some: 3; 
Moderate: 9; 
Great: 24; 
Very great: 20; 
Don’t know/No opinion: 1; 
No response: 4.

57; Require consistent training for all designees with the same skill 
designation to improve the consistency among designees: 

Question: a; How important is it to implement this improvement? 
No: 1; 
Some: 6; 
Moderate: 13; 
Great: 27; 
Very great: 13; 
Don’t know/No opinion: 0; 
No response: 2.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 5; 
Moderate: 6; 
Great: 31; 
Very great: 16; 
Don’t know/No opinion: 1; 
No response: 3.

58; Increase the number of subject-matter workshops for designees, with 
instruction provided by industry experts, FAA specialists, engineers, 
and designees: 

Question: a; How important is it to implement this improvement? 
No: 1; 
Some: 5; 
Moderate: 11; 
Great: 23; 
Very great: 17; 
Don’t know/No opinion: 1; 
No response: 4.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 1; 
Moderate: 13; 
Great: 22; 
Very great: 20; 
Don’t know/No opinion: 2; 
No response: 4.

59; Require FAA inspectors and engineers to receive recurrent training 
related to the oversight of designees: 

Question: a; How important is it to implement this improvement? 
No: 2; 
Some: 3; 
Moderate: 14; 
Great: 22; 
Very great: 14; 
Don’t know/No opinion: 3; 
No response: 4.

Question: b; How feasible is it to implement this improvement? 
No: 1; 
Some: 2; 
Moderate: 10; 
Great: 27; 
Very great: 14; 
Don’t know/No opinion: 4; 
No response: 4.

60; Require additional training for designees in regulations that apply 
to their work: 

Question: a; How important is it to implement this improvement? 
No: 1; 
Some: 10; 
Moderate: 14; 
Great: 22; 
Very great: 9; 
Don’t know/No opinion: 2; 
No response: 4.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 0; 
Moderate: 10; 
Great: 32; 
Very great: 15; 
Don’t know/No opinion: 2; 
No response: 3.

61; Improve and expand designee training, including routine skills 
testing: 

Question: a; How important is it to implement this improvement? 
No: 2; 
Some: 11; 
Moderate: 11; 
Great: 16; 
Very great: 15; 
Don’t know/No opinion: 2; 
No response: 5.

Question: b; How feasible is it to implement this improvement? 
No: 2; 
Some: 4; 
Moderate: 13; 
Great: 24; 
Very great: 12; 
Don’t know/No opinion: 3; 
No response: 4.

62; Have experienced designees mentor designee candidates: 

Question: a; How important is it to implement this improvement? 
No: 3; 
Some: 6; 
Moderate: 19; 
Great: 23; 
Very great: 6; 
Don’t know/No opinion: 1; 
No response: 4.

Question: b; How feasible is it to implement this improvement? 
No: 2; 
Some: 3; 
Moderate: 18; 
Great: 18; 
Very great: 15; 
Don’t know/No opinion: 1; 
No response: 5.

63; Make the training and standardization seminar for designees an 
annual requirement: 

Question: a; How important is it to implement this improvement? 
No: 9; 
Some: 5; 
Moderate: 15; 
Great: 13; 
Very great: 15; 
Don’t know/No opinion: 1; 
No response: 4.

Question: b; How feasible is it to implement this improvement? 
No: 4; 
Some: 3; 
Moderate: 11; 
Great: 23; 
Very great: 16; 
Don’t know/No opinion: 1; 
No response: 4.

64; Applies only to Designated Alteration Station (DAS): Require 
additional training for FAA inspectors and engineers in areas such as 
designee selection and oversight, regulations that pertain to the 
activities of designees, and the recognition of a management structure 
that provides appropriate direction and support for DAS operations: 

Question: a; How important is it to implement this improvement? 
No: 1; 
Some: 2; 
Moderate: 6; 
Great: 10; 
Very great: 13; 
Don’t know/No opinion: 18; 
No response: 12.

Question: b; How feasible is it to implement this improvement? 
No: 0; 
Some: 2; 
Moderate: 6; 
Great: 11; 
Very great: 14; 
Don’t know/No opinion: 18; 
No response: 11. 

Source: GAO analysis of expert panel information.

[End of table]

[End of section]

Appendix V: GAO Contacts and Staff Acknowledgments:

GAO Contacts:

JayEtta Z. Hecker (202) 512-2834: 
Gerald Dillingham (202) 512-2834: 
Teresa Spisak (202) 512-3952:

Staff Acknowledgments:

In addition to the above individuals, Howard Cott, Colin Fallon, Isidro 
Gomez, Curtis Groves, Brandon Haller, David Hooper, Jennifer Kim, Rosa 
Leung, Elizabeth A. Marchak, and Larry Thomas made key contributions to 
this report.

[End of section]

Bibliography:

Booz-Allen & Hamilton, Challenge 2000 Recommendations for Future Safety 
Regulation: Shifting Roles and Responsibilities Between FAA and 
Industry (Prepared for Federal Aviation Administration, Office of 
Policy, Planning, and International Aviation) (McLean, VA: Apr. 19, 
1996).

Department of Transportation, Office of Inspector General, Report on 
the FAA's Designated Pilot Examiner Program, E5-FA-4-007 (Washington, 
D.C.: Feb. 25, 1994).

Department of Transportation, Office of Inspector General, Pilot 
Examiner Program, R2-FA-7-001 (Washington, D.C.: Oct. 22, 1996).

Federal Aviation Administration, Designated Engineering Representative 
Oversight Team Report (Washington, D.C.: Oct. 11, 1994).

Federal Aviation Administration, Aircraft Certification Service: DER 
Oversight Evaluation (Washington, D.C.: Sept. 11, 1997).

Federal Aviation Administration, Designated Alteration Station System 
Assessment Final Report (Washington, D.C.: Sept. 21, 2000).

Federal Aviation Administration, Commercial Airplane Certification 
Process Study: An Evaluation of Selected Aircraft Certification, 
Operations, and Maintenance Processes (Washington, D.C.: March 2002).

GAO, Aviation Safety: FAA Generally Agrees With But Is Slow in 
Implementing Safety Recommendations, 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-96-193]
(Washington, D.C.: September 1996).

GAO, Aircraft Certification: New FAA Approach Needed to Meet Challenges 
of Advanced Technology, 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-93-155]
(Washington, D.C.: September 1993).

National Research Council, Improving the Continued Airworthiness of 
Civil Aircraft: A Strategy for the FAA's Aircraft Certification Service 
(Washington, D.C.: 1998).

RTCA, Final Report of RTCA Task Force 4: Certification (Washington, 
D.C.: Feb. 26, 1999).

(540056):

FOOTNOTES

[1] Those staff are safety inspectors in Flight Standards Service, 
engineers in Aircraft Certification Service, and flight surgeons in 
Aerospace Medicine.

[2] Title 49, U.S.C. 44702(d) provides FAA's legislative authority to 
use designees, and Title 14, C.F.R., Part 183 sets out the types of 
designations FAA may issue and the process for selecting designees.

[3] Such employees, who actually perform the delegated activities, are 
referred to as "authorized representatives."

[4] Public Law 108-176, Vision 100 - Century of Aviation 
Reauthorization Act, requires FAA to develop a plan for implementing a 
certified design organization program by 2007.

[5] Production conformity is an inspection necessary to determine that 
aviation products and related parts conform to an approved design and 
can be operated safely. 

[6] A pilot must have both a pilot certificate and a medical 
certificate in order to fly an aircraft, with the exception of glider 
and balloon pilots, who are not required to have a medical certificate. 
The pilot certificate never expires. The medical certificate must be 
updated every 6 months to 3 years, depending on the type of pilot 
certificate (e.g., airline transport pilots must have their medical 
certificate updated more frequently than private pilots).

[7] The Quality Assurance Team was established as a result of a 1999 
recommendation by the International Civil Aviation Organization that 
Flight Standards Service conduct standardized evaluations of its field 
offices.

[8] Federal Aviation Administration, Southwest Region General Aviation 
Pilot Examiner Review Final Report (Fort Worth, TX: Sept. 1, 2000). 

[9] See Federal Aviation Administration, Designated Alteration Station 
System Assessment Final Report (Sept. 21, 2000); Aircraft Certification 
Service Evaluation of the Airworthiness Designee Management Program 
(Dec. 1998); and Aircraft Certification Service DER Oversight 
Evaluation (Sept. 11, 1997).

[10] The office has not assessed its smallest designee program--the 
delegation option authorization program, which has six designated 
organizations.

[11] Federal Aviation Administration, Designated Alteration Station 
System Assessment Final Report (Sept. 21, 2000).

[12] Transportation Safety Board of Canada, Aviation Investigation 
Report, In-Flight Fire Leading to Collision with Water, Swissair 
Transport Limited McDonnell Douglas MD-11 HB-IWF, Peggy's Cove, Nova 
Scotia 5 nm SW, 2 September 1998, report number A98H0003 (no date).

[13] Federal Aviation Administration, Special Certification Review Team 
Report on: Santa Barbara Aerospace STC ST00236LA-D Swissair Model MD-11 
Airplane In-flight Entertainment System (June 14, 1999).

[14] These include 31 designated alteration stations, 12 Special 
Federal Aviation Regulations No. 36 (repair stations), and 6 delegation 
option authorizations.

[15] See bibliography at the end of this report.

[16] When designated engineering representatives conduct work related 
to field approvals outside of their assigned geographic areas, they are 
not required to contact the field office where they are conducting that 
work. On the other hand, when their work is related to issuing type 
certificates or supplemental type certificates outside their assigned 
geographic area, they are required to contact the FAA field office 
where they are conducting that work.

[17] Nineteen experts indicated this factor was a "great" or "very 
great" weakness of the designee programs; 4 experts felt this factor 
was not a weakness; and 17 experts felt that it posed "little" 
weakness.

[18] Each type of designee has unique qualification requirements, which 
are defined in FAA Order 8100.8 Chapter 4.

[19] Federal Aviation Administration, Southwest Region General Aviation 
Pilot Examiner Review Final Report (Fort Worth, TX: Sept. 1, 2000).

[20] Practical test standards cover areas of aircraft operation, such 
as flight procedures and flight maneuvers, in which pilot applicants 
must demonstrate their knowledge and skills before receiving pilot 
certificates. FAA developed these standards for FAA inspectors and 
designated pilot examiners to use when administering practical tests 
to pilot applicants.

[21] Federal Aviation Administration, Commercial Airplane 
Certification Process Study: An Evaluation of Selected Aircraft 
Certification, Operations, and Maintenance Processes (Washington, 
D.C.: March 2002).

[22] During that time period, 2,850 additional designees were 
terminated for reasons not associated with disciplinary action, such as 
change of employment, retirement, or the request of the designee. 

[23] See footnote 21.

[24] The four databases are the National Vital Information Subsystem 
and Program Tracking and Reporting Subsystem used by Flight Standards 
Service, the Designee Information Network used by Aircraft 
Certification Service, and the Airmen Medical Certification Information 
Subsystem used by Aerospace Medicine.

[25] Most designees' appointments are effective for 1 year, with the 
exception of individual and organizational designated airworthiness 
representatives, who are appointed for up to 5 years, and all other 
types of organizational designees, which are appointed indefinitely. 
According to FAA policy, the minimum level of oversight requires FAA 
engineers and inspectors to review designees' files for project 
activity in order to renew the designees' authority.

[26] GAO, Aircraft Certification: New FAA Approach Needed to Meet 
Challenges of Advanced Technology, GAO/RCED-93-155 (Washington, 
D.C.: Sept. 16, 1993).

[27] See footnote 21.

[28] The turnover rates reported in the two studies were cumulative 
over the time period, while FAA provided information on an annual 
basis. The turnover rates from FAA, therefore, are not comparable to 
the rates from the two studies.

[29] Only one expert indicated that greater accountability of designees 
was not necessary. 

[30] In response to this question, two experts had no opinion and three 
experts declined to answer.

[31] Established in the 1980s, Canada's two types of organizational 
delegates, Design Approval Organizations and Airworthiness Engineering 
Organizations, are authorized to evaluate and approve technical data to 
determine compliance with safety requirements.

[32] The user fee program was established by the Prescription Drug User 
Fee Act of 1992.

[33] P.L. 105-66 (October 27, 1997).

[34] GAO, Transportation Financing: Challenges in Meeting Long-Term 
Funding Needs for FAA, Amtrak, and the Nation's Highways, GAO/T-RCED-
97-151 (Washington, D.C.: May 7, 1997); GAO, Airport and Airway Trust 
Fund: Issues Raised by Proposal to Replace the Airline Ticket Tax, GAO/
RCED-97-23 (Washington, D.C.: Dec. 9, 1996).

[35] The four databases are the (1) National Vital Information 
Subsystem, (2) Program Tracking and Reporting Subsystem, (3) Designee 
Information Network, and (4) Airmen Medical Certification Information 
Subsystem.

[36] For examples of recent use of this methodology see, GAO, Drinking 
Water: Experts' Views on How Future Federal Funding Can Best Be Spent 
to Improve Security, GAO-04-29 (Washington, D.C.: Oct. 31, 2003); 
International Trade: Experts' Advice for Small Businesses Seeking 
Foreign Patents, GAO-03-910 (Washington, D.C.: June 26, 2003); Economic 
Models of Cattle Prices: How USDA Can Act to Improve Models to Explain 
Cattle Prices, GAO-02-246 (Washington, D.C.: Mar. 15, 2002); 
Environmental Protection: Federal Incentives Could Help Promote Land 
Use That Protects Air and Water Quality, GAO-02-12 (Washington, D.C.: 
Oct. 31, 2001).

[37] James P. Wright, "Delphi-Systematic Opinion Gathering," The GAO 
Review (Spring 1972): 20-27.

GAO's Mission:

The Government Accountability Office, the investigative arm of 
Congress, exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics.

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading.

Order by Mail or Phone:

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to:

U.S. Government Accountability Office

441 G Street NW, Room LM

Washington, D.C. 20548:

To order by Phone:

Voice: (202) 512-6000:

TDD: (202) 512-2537:

Fax: (202) 512-6061:

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:

Web site: www.gao.gov/fraudnet/fraudnet.htm

E-mail: fraudnet@gao.gov

Automated answering system: (800) 424-5454 or (202) 512-7470:

Public Affairs:

Jeff Nelligan, managing director,

NelliganJ@gao.gov

(202) 512-4800

U.S. Government Accountability Office,

441 G Street NW, Room 7149

Washington, D.C. 20548: