This is the accessible text file for GAO report number GAO-12-59 entitled 'IRS Management: Cost Estimate for New Information Reporting System Needs to be Made More Reliable' which was released on January 31, 2012. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: Report to Congressional Committees: January 2012: IRS Management: Cost Estimate for New Information Reporting System Needs to be Made More Reliable: GAO-12-59: GAO Highlights: Highlights of GAO-12-59, a report to congressional committees. 
Why GAO Did This Study: The Internal Revenue Service (IRS) began developing the Information Reporting and Document Matching (IRDM) program in fiscal year 2009 to enhance IRS’s ability to automatically compare different sources of tax information and thus improve its capacity to identify and address taxpayer noncompliance. GAO’s May 2011 report recommended that IRS follow best practices from GAO’s Cost Estimating and Assessment Guide if IRS updated the cost estimate for building IRDM systems. IRS provided a new cost estimate for IRDM in August 2011. In this report, GAO assessed the extent to which (1) the IRDM funding request is supported by a reliable cost estimate and, if not reliably supported, why not; and (2) IRS’s practices for capturing data on IRDM’s actual costs and comparing them to estimated costs--known as earned value management (EVM)--generate reliable performance data. GAO compared IRS’s 2011 IRDM cost estimate to criteria in GAO’s cost guide and analyzed IRDM’s earned value management data. What GAO Found: The 2011 Information Reporting and Document Matching (IRDM) cost estimate, used to justify the program’s projected budgets of $115 million for fiscal years 2012 through 2016, generally does not meet best practices for reliability. As shown in the table below, the cost estimate did not fully meet any of the four best practices for a reliable cost estimate. Table: Best Practices for a Reliable Cost Estimate and IRDM Assessment: Best practice: Comprehensive: the estimate should cover the entire program over its full life-cycle; IRDM rating: Partially meets. Best practice: Well documented: the estimate should be supported by detailed documentation; IRDM rating: Minimally meets. Best practice: Accurate: the estimate should provide unbiased results that are not overly conservative or optimistic; IRDM rating: Minimally meets. Best practice: Credible: the estimate should check for and discuss any limitations; IRDM rating: Does not meet. 
Source: GAO analysis of IRS’s 2011 IRDM cost estimate. [End of table] For example, the cost estimate minimally meets best practices for a well documented estimate because, among other things, the Internal Revenue Service (IRS) did not provide detailed support for staff resources, and the cost estimate documentation justified only about 6 of the 86 requested full-time equivalent staff for IRDM. If documentation does not provide source data or cannot explain the calculations underlying the cost elements, the estimate’s credibility may suffer. Although IRS has an independent office of cost estimators that can develop and update cost estimates using cost modeling software that generally follows GAO’s best practices, this office did not develop the 2011 IRDM cost estimate. IRS policy does not require project teams to work with the office to update cost estimates. Additionally, IRS’s cost estimation guidance for project managers is inconsistent regarding how cost estimates should be related to a budget, an inconsistency that could lead to budget requests that do not accurately estimate program funding needs. The IRDM program’s earned value management (EVM) data did not meet data reliability criteria in the areas GAO reviewed. For example, the IRDM project schedule was not properly sequenced—meaning activities were not properly linked in the order in which they are to be carried out. In addition, surveillance was not conducted on IRDM’s EVM system, as required by the Office of Management and Budget and the Department of the Treasury. Surveillance involves having qualified staff review an EVM system. Because IRDM’s 2011 cost estimate is based on unreliable EVM data, it does not provide adequate support for IRDM’s budget requests. Until IRS addresses deficiencies in the EVM data, it cannot provide a reliable cost estimate for IRDM. 
What GAO Recommends: GAO recommends that IRS ensure that IRDM has a reliable cost estimate, require certain project teams to work with its Estimation Program Office, improve cost estimation guidance, and improve the reliability of IRDM’s EVM data. IRS agreed with one, partially agreed with one, and disagreed with two of GAO’s recommendations. GAO generally disagrees with IRS’s concerns, and still believes the recommendations have merit. View [hyperlink, http://www.gao.gov/products/GAO-12-59]. For more information, contact Michael Brostek at (202) 512-9110 or brostekm@gao.gov. [End of section] Contents: Letter: Background: IRDM's 2011 Cost Estimate Does Not Meet Best Practices for Reliability and Does Not Fully Support the Program's Budget: Unreliable EVM Data Raise Additional Concerns about IRDM Cost Estimate: Conclusions: Recommendations for Executive Action: Agency Comments and Our Evaluation: Appendix I: Scope and Methodology: Appendix II: Assessment of MITS's Current IRDM Cost Estimate: Appendix III: High Level Assessment of the Reliability of IRDM's EVM System: Appendix IV: Comments from the Internal Revenue Service: Appendix V: GAO Contact and Staff Acknowledgments: Tables: Table 1: Comparison of IRDM 2007 Preliminary Estimate and the 2009 SCBE (dollars, in millions): Table 2: IRDM Program Budget and Actual Amount Spent, Fiscal Years 2009 through 2011 (dollars, in millions): Table 3: MITS IRDM Cost Estimate Alignment with Best Practices: Table 4: IRDM EVM Data Reliability: Figures: Figure 1: 2011 IRDM Cost Estimate Alignment with Best Practices for Comprehensiveness: Figure 2: 2011 IRDM Cost Estimate Alignment with Best Practices for Being Well Documented: Figure 3: 2011 IRDM Cost Estimate Alignment with Best Practices for Accuracy: Figure 4: 2011 IRDM Cost Estimate Alignment with Best Practices for Credibility: Abbreviations: ANSI: American National Standards Institute: DME: Development/modernization/enhancement: EPO: Estimation Program Office: EVM: Earned 
Value Management: FTE: Full-time equivalent: IRDM: Information Reporting and Document Matching: IRS: Internal Revenue Service: IT: Information technology: MITS: Modernization and Information Technology Services: SCBE: Solution concept-based estimate: WBS: Work Breakdown Structure: [End of section] United States Government Accountability Office: Washington, DC 20548: January 31, 2012: The Honorable Susan M. Collins: Ranking Member: Committee on Homeland Security and Governmental Affairs: United States Senate: The Honorable Darrell E. Issa: Chairman: The Honorable Elijah E. Cummings: Ranking Member: Committee on Oversight and Government Reform: United States House of Representatives: The financing of the federal government depends largely on the Internal Revenue Service's (IRS) efforts to collect taxes. To help carry out this work, IRS initiated the Information Reporting and Document Matching (IRDM) program in fiscal year 2009. IRS plans for the IRDM program to build information technology (IT) systems that automatically compare--that is, match--different sources of tax information to improve tax compliance.[Footnote 1] In May 2011, we issued a report that assessed IRS's 2009 cost estimate for building IRDM IT systems and found that it did not fully meet best practices. [Footnote 2] We recommended that for any future updates to the IRDM cost estimate, IRS ensure that the revised estimate be developed in a manner that reflects the four characteristics of a reliable cost estimate described in our report.[Footnote 3] Since then, IRS provided a new cost estimate for IRDM in August 2011. Having a reliable cost estimate--a summation of individual cost elements using established methods and valid data--is vital for making informed budgetary decisions and ensuring that a project is implemented as planned. 
In an environment where federal funds are scarce, it is imperative that IT projects, such as IRDM, be implemented as planned, not only because of their value to the government,[Footnote 4] but also because every dollar spent on one program will mean one less to fund other efforts. Our objectives were to determine the extent to which: (1) the IRDM funding request is supported by a reliable cost estimate and, if not reliably supported, why not; and (2) IRS's practices for capturing IRDM's actual costs and comparing them to estimated costs--known as "earned value management" (EVM)--generate reliable performance data. We are making four recommendations that warrant management's consideration. This report builds on our May 2011 report[Footnote 5] and further analysis conducted from June 2011 to August 2011, which was used to provide technical assistance to Congress. For this report, we compared the current 2011 IRDM cost estimate to the best practices in our Cost Estimating and Assessment Guide.[Footnote 6] We also compared IRS guidance on cost estimation to criteria from our cost guide. We shared our cost guide as well as our preliminary analysis of the IRDM 2011 cost estimate with program officials. When warranted, we updated our analyses based on the agency's response and additional documentation provided to us. To assess IRDM's practices for capturing actual costs and comparing them to estimated costs, we used high level criteria on EVM data reliability from our cost guide; EVM is a project management approach that uses actual project data to provide reports on project status. We also compared IRDM's EVM system with guidance in the Department of the Treasury's Earned Value Management Guide. As part of that analysis, we assessed IRDM's implementation of three guidelines from the American National Standards Institute (ANSI), which Treasury's guidance states IRDM should be following. 
We selected the three guidelines to represent some of the fundamental steps for maintaining a reliable EVM system, as identified in our cost guide, and because these guidelines are also Treasury Department requirements. For both research objectives, we interviewed officials with IRS's Modernization and Information Technology Services (MITS) division, including those from the IRDM Program Management Office, and the Investment Planning and Management Office, which includes the Estimation Program Office (EPO).[Footnote 7] We conducted this performance audit from August 2011 through January 2012 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. We determined that the IRS data we used were sufficiently reliable for our purposes. We also made appropriate attribution indicating the sources of the data. We are making recommendations to IRS to improve data reliability in the future. See appendix I for more information on our scope and methodology. Background: IRDM Program: IRS initiated the IRDM program, in part, to implement new information reporting requirements, but more generally to increase voluntary compliance with tax laws by expanding and maximizing IRS's ability to match existing and future information returns with tax return data and establishing a new business information matching program. Previously, IRS had only matched information returns to individuals' and sole proprietors' tax returns.[Footnote 8] Under IRDM, IRS plans to build several new IT systems and enhance some existing systems as well as implement numerous organizational and process changes. 
IRS plans for IRDM to use information returns to identify individual and business tax returns that are likely sources of revenue, which the current individual tax return matching system is not designed to identify. IRDM implementation is led by IRS's Small Business/Self Employed division and MITS, which is leading the IRDM IT system development. [Footnote 9] IRS Cost Estimation Process and Guidance: Cost estimates are a vital factor for sound management decision making and they aid in the formation of a project's budget. IRS uses cost estimates, in part, to justify budget requests and prioritize the selection of IT projects for possible funding. After an IT project is approved, the cost estimate is later used as a starting point for developing the performance measurement baseline for EVM, a project management approach that, if implemented appropriately, provides management important tools such as objective reports of project status and early warning signs of impending schedule delays and cost overruns. Data from a reliable performance management system, such as EVM, are necessary inputs for an updated cost estimate, among other things. OMB issued guidance on managing IT projects, which discusses cost estimation and refers to our cost guide for how to meet cost estimating requirements.[Footnote 10] OMB guidelines state that cost estimates should be continuously updated based on the latest information available to ensure that they are current, accurate, and valid. According to our cost guide, effective program and cost control requires ongoing revisions to the cost estimate, budget, and projected estimates at completion. Specifically, our guide states that estimates should be continuously updated with actual costs incurred to that point so that significant cost, schedule, or performance variances can be examined. 
In addition, it says that cost estimates should be updated to reflect significant changes to a project's scope or specifications and when certain projects approach key milestones. [Footnote 11] Within MITS, project managers and EPO are involved in estimating program costs. EPO is an independent group of cost estimation experts that assists project teams by developing and updating cost estimates using a standard documented process. Project managers are responsible for maintaining a program's cost estimate. EPO only becomes involved in updating a program's cost estimate at the request of project managers, according to EPO officials. IRS procedures for developing, using, and updating cost estimates and EVM are described in several guidance documents, specifically: * EPO's Estimator's Reference Guide, which is used by EPO staff, is the general resource on the processes and procedures for developing and delivering IT cost estimates. It discusses the technical aspects of updating cost estimates, such as what documents are used in cost modeling once a project has begun. * IRS's Information Technology Investment Planning and Management Guide (Investment Guide) outlines the framework for selecting, managing, and evaluating IRS IT projects. The Investment Guide includes discussions of how IT projects are selected using cost information, and how managers should use cost information to monitor a project. Project managers are responsible for managing cost, schedule, and performance for a project. * MITS's Estimation Procedures document describes IRS's organizational approach to cost estimation, applicable to all IRS projects. The document is directed at project managers and it includes discussions of the steps and staff roles necessary to develop an estimate, and the circumstances when EPO typically becomes involved with updating a cost estimate. 
The document states that project managers are responsible for monitoring project progress and suggests initiating assistance from EPO if a project meets certain thresholds. * The Department of the Treasury's Earned Value Management Guide provides guidance for implementing EVM on major IRS projects. Program or project managers have the ultimate responsibility for implementing and monitoring the EVM system for their program or project. IRDM Program Cost Estimates: IRS developed two cost estimates for IRDM early in the program. In 2007, IRS developed a preliminary cost estimate for budgetary purposes when very little program information was available (referred to in this report as the 2007 preliminary cost estimate).[Footnote 12] As shown in table 1, in 2007, IRDM system development was estimated to cost about $5 million in fiscal year 2009 and about $23 million per year thereafter.[Footnote 13] In 2009, EPO developed a solution concept-based estimate (SCBE, referred to in this report as the 2009 SCBE),[Footnote 14] which was more rigorous than the 2007 preliminary estimate. The 2009 SCBE was developed before program implementation began, when MITS had more information than it did in 2007, but system design plans were still under development.[Footnote 15] The 2009 SCBE was about $36 million less through the first 4 years of the project than the 2007 preliminary estimate. Table 1: Comparison of IRDM 2007 Preliminary Estimate and the 2009 SCBE: IRDM Estimate: 2007 preliminary estimate[A]; Fiscal year: 2009: $5.1 million; Fiscal year: 2010: $23.0 million; Fiscal year: 2011: $22.9 million; Fiscal year: 2012: $23.2 million; Fiscal year: 2013: $23.0 million; Fiscal year: 2014: $23.0 million; Fiscal year: 2015: $23.0 million; Fiscal year: 2016: $23.0 million; Total, FY 2009 through FY 2012: $74.2 million. 
IRDM Estimate: 2009 SCBE[B]; Fiscal year: 2009: $2.6 million; Fiscal year: 2010: $13.4 million; Fiscal year: 2011: $13.3 million; Fiscal year: 2012: $9.3 million; Fiscal year: 2013: Not estimated; Fiscal year: 2014: Not estimated; Fiscal year: 2015: Not estimated; Fiscal year: 2016: Not estimated; Total, FY 2009 through FY 2012: $38.6 million. IRDM Estimate: Difference between the 2007 preliminary estimate and SCBE; Fiscal year: 2009: $2.5 million; Fiscal year: 2010: $9.6 million; Fiscal year: 2011: $9.6 million; Fiscal year: 2012: $13.9 million; Fiscal year: 2013: N/A; Fiscal year: 2014: N/A; Fiscal year: 2015: N/A; Fiscal year: 2016: N/A; Total, FY 2009 through FY 2012: $35.6 million. Source: GAO analysis of IRDM's Exhibit 300 and the 2009 SCBE. [A] Annual costs based on the preliminary estimate are reported in the Exhibit 300. The Exhibit 300 is a document required by OMB to support IT projects. It includes the project's desired outcome and budget justification. The 2007 preliminary estimate provided information from fiscal year 2009 through fiscal year 2016. [B] The 2009 SCBE only went through fiscal year 2012. [End of table] IRS used the 2007 preliminary estimate to justify the initial IRDM budget. According to a MITS official, the 2009 SCBE was not used for budgetary purposes because program specifications were undergoing modification in fiscal year 2010, requiring the full $23-million-per-year funding. For example, in 2010, IRS made several changes to the complexity of the IRDM program, which included dividing it into four projects. One of the projects required restructuring as a separate development effort, using different resources and technologies. Changes also necessitated using new software development methods not previously used within IRS, and additional contracting support. According to MITS officials, such changes increased IRDM funding needs above the amounts supported by the 2009 SCBE. 
As a result, IRS did not revise its initial funding request for future fiscal years using the SCBE and instead relied on the 2007 preliminary estimate. From fiscal years 2009 through 2011, IRS received about $52 million in total funding for IRDM, of which it had spent about $46 million through fiscal year 2011 (including $2.6 million carried over from fiscal year 2009 funds and $5.8 million carried over from fiscal year 2010), as shown in table 2. Table 2: IRDM Program Budget and Actual Amount Spent, Fiscal Years 2009 through 2011: Budget; Fiscal year: 2009: $5.6 million; Fiscal year: 2010: $22.9 million; Fiscal year: 2011: $23.0 million; Total FY 2009 through 2011: $51.5 million. Actual amount spent from budget; Fiscal year: 2009: $3.0 million; Fiscal year: 2010: $15.8 million; Fiscal year: 2011: $18.8 million; Total FY 2009 through 2011: $37.6 million. Difference between budget and amount spent[A]; Fiscal year: 2009: $2.6 million; Fiscal year: 2010: $7.1 million; Fiscal year: 2011: $4.2 million; Total FY 2009 through 2011: $13.9 million. Source: IRS officials, IRDM budget documents, and actual spending data from IRS's financial information system. [A] The actual amount spent does not include money from the previous fiscal year that was carried over and spent. The amounts are not cumulative. In FY 2010, IRDM spent $2.6 million in funds carried over from FY 2009, and in FY 2011 IRDM spent $5.8 million of funds carried over from FY 2010. [End of table] In our May 2011 report,[Footnote 16] we assessed the 2009 SCBE because it was the most rigorous IRDM cost estimate available at the time and the 2007 preliminary estimate lacked documentation for a complete review. We found that the 2009 SCBE did not fully follow best practices. We recommended that if IRS updated the cost estimate, it should follow best practices from our cost guide. In response to our report, IRS said it would update the 2009 SCBE. 
IRS subsequently decided not to revise the estimate because, according to officials, they already have a plan, schedule, and funding; the program is not over budget; and the risks associated with IRDM and the program's size do not warrant an update. Over the summer of 2011, MITS provided us with additional cost information. Officials referred to these documents as IRDM's new cost estimate, and they were used to support IRS's fiscal year 2012 IRDM budget. Consequently, in this report, we refer to the materials provided to us as the 2011 cost estimate. This estimate was not an SCBE, nor was it developed by EPO. The 2011 cost estimate was based on several data sources, including IRDM's Exhibit 300, EVM data, spend plans, and schedule with work breakdown structures (WBS).[Footnote 17] MITS projects IRDM to cost $115 million for fiscal years 2012 through 2016, or about $23 million per year. IRDM's 2011 Cost Estimate Does Not Meet Best Practices for Reliability and Does Not Fully Support the Program's Budget: IRDM's Fiscal Year 2012 and Projected Budget Requests Are Not Supported by a Reliable Cost Estimate: According to best practices established by our cost guide, a cost estimate should be comprehensive, well documented, accurate, and credible.[Footnote 18] We assessed the 2011 cost estimate against cost estimation best practices because IRS told us the estimate was used to support its budget requests for fiscal year 2012 and beyond. While the 2011 IRDM cost estimate shows some characteristics of a reliable cost estimate, it does not fully meet best practices. The estimate partially meets best practices for a comprehensive cost estimate, as shown in figure 1.[Footnote 19] It reflects the current program schedule and contains information about the program's technical characteristics. 
The estimate provided some details about costs for IRDM's fiscal year 2012 budget request, but the cost estimate does not cover the program's entire life-cycle.[Footnote 20] Without fully accounting for life-cycle costs, management may have difficulty successfully planning program resource requirements and making informed resource-planning decisions. IRS defined assumptions used to estimate some IRDM costs, but did not provide the assumptions used to estimate labor or program operations costs. Furthermore, IRS did not include ground rules[Footnote 21] used to develop the estimate. Unless ground rules and assumptions are clearly defined, the cost estimate will not have a basis to identify and mitigate areas of potential risk. Figure 1: 2011 IRDM Cost Estimate Alignment with Best Practices for Comprehensiveness: [Refer to PDF for image: interactive table] Source: GAO analysis of IRS’s 2011 IRDM cost estimate and GAO-09-3SP. A full text version of this graphic is available in appendix II. [End of figure] The estimate minimally meets best practices for a well documented cost estimate, as shown in figure 2. IRS provided supporting information for some staff resources, but detailed data for the staffing level requested for fiscal year 2012 were missing. The cost estimate documentation says that the labor cost justification was captured in the resource loaded project schedules,[Footnote 22] but we found that these schedules justified only about 6 of the 86 requested full-time equivalent (FTE) staff for IRDM. Furthermore, although IRS officials cited the WBS as the basis for cost projections, we found no evidence linking the WBS to cost. Multiple documents linked software and hardware specifications to cost, but they did not provide consistent cost information. 
As a best practice, documentation should describe the source data used and the estimating methodology, and show step-by-step how the estimate was developed. Without a well documented cost estimate, the program's credibility may suffer because the documentation cannot explain the rationale of the methodology or the calculations underlying the cost elements. Figure 2: 2011 IRDM Cost Estimate Alignment with Best Practices for Being Well Documented: [Refer to PDF for image: interactive table] Source: GAO analysis of IRS’s 2011 IRDM cost estimate and GAO-09-3SP. A full text version of this graphic is available in appendix II. [End of figure] The estimate minimally meets best practices for an accurate cost estimate, as shown in figure 3. Calculations in the estimate are mathematically correct. However, documentation that IRS provided to support IRDM's estimated hardware and software costs did not match estimates in IRDM's spend plans. IRS officials said the discrepancies occurred because the spend plans were developed using more recent cost information for software purchases that was not included in the supporting documentation. Additionally, the estimate does not list any confidence levels or provide a range of possible costs. According to best practices, unless an estimate is based on an assessment of the most likely costs and reflects the degree of uncertainty given all of the risks considered, management will not be able to make informed decisions. IRDM uses EVM to identify variances between planned and actual costs, but as discussed below, we found IRDM's EVM data to be unreliable, and there was no evidence that IRS uses actual cost data to evaluate whether cost projections are realistic. 
Figure 3: 2011 IRDM Cost Estimate Alignment with Best Practices for Accuracy: [Refer to PDF for image: interactive table] Source: GAO analysis of IRS’s 2011 IRDM cost estimate and GAO-09-3SP. A full text version of this graphic is available in appendix II. [End of figure] The estimate does not meet best practices for a credible cost estimate, as shown in figure 4. For example, the estimate was not crosschecked[Footnote 23] or assessed for risk and uncertainty. According to best practices, an estimate without risk and uncertainty analysis[Footnote 24] can be unrealistic because it does not assess how the cost estimate would be affected if, for example, the schedule slipped, the mission changed, or a proposed solution did not meet users' needs. In addition, IRS did not perform a sensitivity analysis. [Footnote 25] Further, there is no evidence that another office performed a separate cost estimate--referred to as an independent cost estimate--to validate the 2011 cost estimate. In previous work we found that because of limited resources, IRS generally does an additional independent cost estimate only for its largest programs, and according to officials, IRDM is not considered a large enough program in terms of its funding level.[Footnote 26] While some of MITS's cost estimates are done by EPO--which is independent of program management offices--the 2011 cost estimate was done by the IRDM program office. Figure 4: 2011 IRDM Cost Estimate Alignment with Best Practices for Credibility: [Refer to PDF for image: interactive table] Source: GAO analysis of IRS’s 2011 IRDM cost estimate and GAO-09-3SP. A full text version of this graphic is available in appendix II. 
[End of figure] Because the 2011 cost estimate does not meet best practices, it does not provide reliable support for IRDM's fiscal year 2012 budget request or any of the projected budget requests. IRS officials said current IRS policy does not require projects to routinely re-estimate project cost. The IRDM program office--which does not use the same software or modeling techniques as EPO--relied on spend plans, EVM, and other documents to estimate costs. In July 2011, MITS officials said that it would take 90 days for IRDM and EPO staff to complete a new cost estimate for IRDM. When considering FTEs and time, a new cost estimate developed by IRDM and EPO staff would require a total of about eight staff months.[Footnote 27] IRDM Could Benefit From EPO Assistance on Updating Cost Estimates: EPO has specialized cost estimation tools, such as software that incorporates many best practices from our cost guide, and expertise that project teams can leverage to update cost estimates. If used correctly, EPO estimation procedures could help IRDM management maintain reliable cost information for use in budget requests. EPO officials said they did not work with the IRDM team to maintain an updated SCBE because the team did not seek their assistance. IRDM was not required to do so because MITS guidance, as of September 2011, does not require project teams to consult with EPO when updating a cost estimate. Without EPO involvement, IRS has less assurance that cost estimate updates will follow best practices. Our cost guide states that cost estimates should be (1) updated to reflect actual costs and changes (i.e., significant modifications to a project's scope or specifications) in order to keep the estimate current as the program passes through new phases and milestones and (2) updated if there are significant cost, schedule, or performance variances. 
The continual updating of the cost estimate as the program matures not only results in a higher-quality estimate, but also gives cost estimators the opportunity to collect data for use in future estimates as well as incorporate lessons learned. Our cost guide also states that cost estimation work should be done by a central independent estimating organization, and estimators should monitor programs to determine whether preliminary information and assumptions remain relevant and accurate. Unlike the IRDM project team, EPO has the following characteristics that could help provide more reliable cost estimate updates: * EPO is able to use robust cost estimation techniques, including the SEER-SEM software cost estimation model.[Footnote 28] SEER-SEM analyzes project histories and cost relationships to produce cost estimates and can estimate costs consistent with best practices--such as adjusting for risk and incorporating the results of a sensitivity analysis. When EPO estimators validate or update a project's funding requirements, they tailor SEER-SEM to the project and use it to consider actual cost data from the project team, according to EPO's Estimator's Reference Guide. Estimators calibrate the model to include the schedule for remaining work and evaluate and revise key cost drivers, according to the guidance. As mentioned previously, the 2011 IRDM cost estimate did not use a cost estimation model. * EPO, as a whole, has more cost estimation experience than the IRDM project team. EPO's six cost estimators have 43 years of combined cost estimation experience. They all have received training in SEER-SEM and other cost estimation models. In addition, EPO officials said that project teams generally would not have the technical skills to update an SCBE using cost estimation models. Although the IRDM project team has some cost estimation experience and relevant training, IRDM officials do not have SEER-SEM training or experience. 
Further, our analysis of IRDM's 2011 cost estimate illustrates that estimate updates done by project teams may not result in reliable cost information. According to EPO officials, an updated estimate developed by EPO would also be independent, more holistic, and would include elements that project teams may miss. * EPO-produced updates can help build a historical record of IRS cost estimate data. According to our cost guide, historical data are crucial to developing high-quality cost estimates because estimators usually develop estimates for new programs by relying on data from programs that already exist and adjusting for any differences. EPO officials told us that they are working to build a historical database that compares estimated costs to actual costs. As IRS's central estimating organization, EPO is uniquely qualified to use cost estimate updates as an opportunity to obtain data that are consistent with other estimates and to use the data to build a historical cost database, which can help ensure that future cost estimates are credible. According to MITS officials, it is up to project teams to seek EPO assistance. However, MITS's Estimation Procedures document, which provides cost estimation guidance to project teams, suggests some cost and schedule variance and project size thresholds that, if exceeded, should cause project managers to contact EPO for an updated estimate. It recommends EPO involvement in updating estimates when: * cost or schedule variance is 10 percent or greater for major projects; * cost or schedule variance is 25 percent or greater for non-major projects;[Footnote 29] or: * a project with development/modernization/enhancement (DME)[Footnote 30] costs greater than $5 million reaches milestone 3.[Footnote 31] IRDM meets the first threshold. Specifically, IRDM meets IRS's criteria for a "major" project because IRDM's projected life-cycle costs are about $166 million, based on funding projections submitted to OMB. 
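The variance formulas underlying these thresholds, and the threshold check itself, can be sketched in Python. This is an illustration only: the function names and the example figures are our assumptions, and whether a project is "major" is passed in rather than derived, since the guidance's exact criteria are not reproduced in this report.

```python
def variance_pcts(ev, ac, pv):
    """Standard earned value variance percentages.

    ev: earned value, ac: actual cost, pv: planned value (budgeted
    cost of work scheduled), all in the same currency units.
    """
    cv_pct = (ev - ac) / ev * 100.0   # cost variance as a % of earned value
    sv_pct = (ev - pv) / pv * 100.0   # schedule variance as a % of planned value
    return cv_pct, sv_pct


def needs_epo_update(is_major, cv_pct, sv_pct,
                     dme_cost_millions=0.0, at_milestone_3=False):
    """Apply the Estimation Procedures thresholds described above:
    10% variance for major projects, 25% for non-major projects, or
    DME costs greater than $5 million at milestone 3."""
    threshold = 10.0 if is_major else 25.0
    if abs(cv_pct) >= threshold or abs(sv_pct) >= threshold:
        return True
    return dme_cost_millions > 5.0 and at_milestone_3
```

With hypothetical inputs of EV = 100, AC = 82, and PV = 115, `variance_pcts` yields an 18 percent cost variance and roughly a negative 13 percent schedule variance, either of which would exceed the 10 percent threshold for a major project and trigger EPO involvement.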
Also, according to IRDM EVM data, as of September 30, 2011, the program had a greater than 18 percent cost variance and an almost 13 percent schedule variance. IRDM officials said they did not work with EPO to update the 2009 SCBE because it was not required by current MITS guidance. MITS officials said it is not feasible to require EPO to update all cost estimates for IT projects. EPO is a relatively new office that began developing cost estimates in 2006. Most of its work has focused on developing estimates for proposed projects, rather than updating estimates for existing projects. The Estimation Procedures document was developed in September 2011, and as of October 2011, EPO had six cost estimators on staff, according to officials. As a result, EPO officials said that project teams generally update IT project cost estimates without EPO assistance and that, as of October 2011, EPO has been involved in few cost estimate updates. However, senior MITS and EPO officials said they would like EPO to have a greater role in cost estimate updates. If IRS does not have reliable cost estimate updates, projects may face risks and their budget requests may not be adequately justified to inform decision making; these outcomes could be even more significant for projects with cost or schedule variances or high DME costs, such as IRDM. MITS Guidance Has Inconsistencies and Could Better Link Cost Estimation Procedures with Budget Requests: MITS guidance documents used by project managers do not clearly discuss the appropriate uses of different types of cost estimates. According to current guidance used by EPO estimators, non-SCBE cost estimates are less rigorous and are not for use in budgets; but, as stated above, neither the IRDM initial budget request nor the current and projected budgets were developed using information from the 2009 SCBE. Three IRS guidance documents describe the relationship between cost estimates and budgets. 
However, the documents are directed at different audiences, do not present consistent information, and contain different levels of detail. Specifically: * EPO's Estimator's Reference Guide, used by EPO estimators, states that budgets for IT projects should be established using SCBEs. IRS's SCBEs rely on cost estimation methods that incorporate best practices from our cost guide, including considerations of risk, and provide a level of confidence associated with the estimate. According to our cost guide, for management to make good decisions, the program estimate must reflect the degree of uncertainty, so that a level of confidence can be given about the estimate. The Estimator's Reference Guide also discusses techniques that estimators can use to update SCBEs, and aligns that process with annual budget submissions. Although this guidance contains many best practices, it is directed at cost estimators; therefore, project managers do not typically have access to it or use it. * The MITS Investment Guide, directed at MITS project managers, discusses the role of EPO in developing initial rough estimates and budget-ready SCBEs and states that, if a project does not yet have an SCBE, a rough estimate may be used as a placeholder in a budget request. The guide requires that MITS staff work to ensure that, if an SCBE exceeds an initial rough estimate, the project's scope and SCBE fit within the appropriated budget. However, the guide does not discuss how, if at all, a budget request should be adjusted if an SCBE provides an estimated cost that is lower than the budget, or how any future cost information should be incorporated into budgets. * The Estimation Procedures document, directed at MITS project managers, does not define types of cost estimates or discuss whether they are appropriate for budget decisions. 
EPO officials said they did not believe it is necessary to characterize the different types of cost estimates in the Estimation Procedures document because such definitions are not needed for the document's intent, which is to define the organizational approach to estimation. The document states that if updated cost estimates indicate that a project's budget needs to change, the changes must be approved. However, it does not specify who should approve such changes. Without consistent guidance about what types of cost estimates are appropriate for budget requests, project teams may not use the best information available. Our cost guide states that, as a best practice, an estimate intended to support budgetary decisions should cover the project's entire life-cycle and should be supported by a description of the program's technical characteristics, which would be found in an SCBE. Using a cost estimate that lacks sufficient rigor--such as a preliminary cost estimate, instead of an SCBE--could lead to budget requests that do not accurately reflect program funding needs. For example, the 2007 preliminary IRDM estimate lacks an uncertainty analysis, which would provide a basis for adjusting the estimate to reflect unknown facts and circumstances that could affect costs; as a result, IRDM managers do not have assurance that the program's funding level remains appropriate. Further, not providing project managers with guidance on how to incorporate new cost information--either from an SCBE that has replaced a preliminary estimate or from an updated cost estimate--into budget requests could result in requests that do not reflect current or accurate funding needs for a project. Unreliable EVM Data Raise Additional Concerns about IRDM Cost Estimate: IRS provided EVM data in the 2011 IRDM cost estimate to justify its budget requests, but we found that the program's EVM data are not reliable in any of the areas we reviewed. 
Reliable data on actual performance, obtained from an EVM system, are a necessary input if an updated cost estimate is to be considered accurate and credible. Because IRDM's 2011 cost estimate is based on unreliable EVM data, it does not provide adequate support for IRDM's budget requests. Until IRS addresses deficiencies in its EVM data, it cannot provide reliable cost estimate updates for IRDM. EVM data reliability deficiencies, such as those we observed for IRDM, are common in federal agencies, and we have also previously reported on inconsistencies in implementation of EVM for IT projects at Treasury bureaus.[Footnote 32] Our cost guide identifies top-level EVM data reliability tasks for IT projects, which are also included in OMB guidance.[Footnote 33] We assessed IRDM EVM data and IRS's processes against three data reliability tasks: * Maintain an EVM system that is compliant with the agency's scaling of American National Standards Institute (ANSI) guidelines. ANSI has a national EVM standard, comprising 32 guidelines, which define acceptable methods for agencies to evaluate an EVM system to determine if cost, schedule, and technical performance data can be relied on for program management. * Conduct an integrated baseline review for the program. An integrated baseline review is an evaluation of a program's baseline plan to determine whether all program requirements have been addressed, risks have been identified, mitigation plans are in place, and available and planned resources are sufficient to complete the work. * Using qualified staff, conduct surveillance on the EVM system. Surveillance involves reviewing a contractor's EVM system to observe ANSI compliance and how well a contractor is using its EVM system to manage cost, schedule, and technical performance. Treasury's EVM guidance requires a project of IRDM's size to follow an abbreviated set of 10 ANSI guidelines, and to conduct surveillance on the EVM system. 
Other departments also scale ANSI guidelines according to the size of projects, which could result in some agencies not fully following certain best practices. Further, projects like IRDM, according to Treasury guidance, only need to complete an independent baseline validation--which, although not defined in the guidance, appears to be a less rigorous version of an integrated baseline review. Following ANSI guidelines and conducting an integrated baseline review and surveillance can help ensure that EVM data can indicate how well a program is performing in terms of cost, schedule, and technical matters. This performance information is necessary for proactive program management and risk mitigation, and to maintain a reliable cost estimate. Where applicable, we assessed IRDM's EVM data against the standards cited in the Treasury guidance. IRDM's EVM System Is Not Compliant with Key Required Guidelines: We assessed IRDM on three ANSI guidelines, which are fundamental elements for an EVM system and are included in Treasury's abbreviated 10 EVM guidelines.[Footnote 34] We found that IRDM's EVM system is not compliant with these guidelines. For an overview of our findings on IRDM's EVM data reliability, see table 4 in appendix III. For each selected guideline we found: * WBS: This ANSI guideline states that authorized work elements for the program should be defined, which typically includes using a WBS tailored for effective internal control. Further, a project's schedule,[Footnote 35] cost estimate, and EVM system should be based on the same WBS, according to our cost guide. The WBSs used in IRDM's schedules do not match the WBS used for EVM. The WBSs in four of the five IRDM schedules reflect detailed project-level tasks, while the WBS used for the EVM data is only broken down by contractor and government efforts, and does not include any project-level data. 
The WBS used to inform the 2011 IRDM cost estimate was not broken down by contractor or government data, and it did not provide costs for detailed tasks. Without a WBS from which to measure progress and which serves as a consistent framework for the schedules and EVM, there is no basis for reliable EVM data, according to our cost guide. An IRS official said that the WBS in the IRDM schedules is not the primary source of financial information for IRDM. Instead, officials use IRS's financial tracking system, which is much less detailed than the schedules' WBS, to obtain project-level data for EVM. This technique for gathering EVM data at a project level is contrary to the purpose of EVM, which is to integrate cost, schedule, and technical data from detailed work packages[Footnote 36] that can be monitored for variances against the original plan. Since a resource-loaded schedule forms the foundation for the EVM baseline, both the schedule and the EVM data should be based on the same WBS, according to our cost guide.[Footnote 37] Because the financial tracking system only provides project-level data, and financial information cannot be traced to the WBS elements in the schedules, the cost associated with the project tasks is unknown. * Sequencing: This ANSI guideline states that projects should have a schedule that describes the sequence of work--that is, a list of activities in the order in which they are to be carried out--and identifies significant task interdependencies required to meet project requirements. All of the IRDM schedules had significant problems with sequencing. For example, many predecessor and successor tasks were not linked to one another; these links are necessary for properly sequencing work so that the schedule will update in response to changes. Because the schedules are missing so many of these "logic links," they will not automatically recalculate forecasted start and finish dates of remaining activities. 
Thus, when activities are late, the forecasted dates for affected successor activities will not automatically recalculate. IRS officials said they are aware of some issues with missing links. As a result of these missing links in all of the IRDM schedules we reviewed, IRDM's schedules are not reliable. Because the schedules form the basis for the performance measurement baseline[Footnote 38] used to track cost and schedule variances in an EVM system, the data from the IRDM EVM system are not reliable. * Time-phased budget baseline: This ANSI guideline states that a program should establish and maintain a time-phased budget baseline,[Footnote 39] at the control account level, against which performance can be measured. Resources must be accounted for in a schedule in order to develop this baseline, according to our cost guide. IRS was unable to show evidence that it has established and maintained a time-phased budget baseline for IRDM. Specifically, the program does not have one overall schedule, and none of the IRDM schedules we reviewed were completely resource-loaded. IRS officials said that although they do not have a program-level schedule for IRDM, the individual project schedules are linked through interdependencies. However, we could not identify these links in our analysis. IRS officials said the resource-loaded schedules do not show all project resources because some resources, such as contractor personnel being used across projects, are used in more than one IRDM project. Without schedules that include the resources needed to complete tasks, IRS was unable to demonstrate that it had established and maintained a time-phased budget baseline. This baseline is a critical EVM component for measuring IRDM's performance. IRS Did Not Validate the Baseline for IRDM's EVM System: According to Treasury EVM guidance, an independent baseline validation should be conducted for a DME project like IRDM. 
As stated above, GAO and OMB consider an integrated baseline review to be a key element of EVM data reliability, while Treasury guidance allows projects like IRDM to complete a less-rigorous independent baseline validation. IRS did not conduct an integrated baseline review or an independent baseline validation for IRDM. IRS officials told us that many of the activities typically done during an integrated baseline review were performed--such as developing a WBS and schedules. However, as previously discussed, we identified problems with the WBSs, the schedules, and the baseline process. Without conducting a comprehensive integrated baseline review or independent baseline validation and resolving any issues identified, IRS has not sufficiently evaluated the validity of IRDM's baseline. This calls into question the reliability of IRDM's EVM data and could affect the program's ability to identify and mitigate risks. IRS Is Not Conducting Surveillance on IRDM's EVM System: OMB and Treasury require surveillance on EVM systems, and, according to our cost guide, surveillance should be conducted to check whether the EVM system summarizes timely and reliable cost, schedule, and performance information, among other things. IRS officials said IRS is not performing surveillance on the IRDM EVM system because they did not believe it was required and because, according to officials, IRS does not have staff with the necessary technical skills to conduct surveillance. IRS officials said Treasury is reviewing whether Department-level surveillance would be efficient. It is important for the agency to conduct surveillance of EVM systems to ensure that contractors are following their own processes and satisfying ANSI guidelines. Conclusions: The 2011 IRDM cost estimate does not fully meet best practices for a reliable estimate. It is important for IRDM to have a cost estimate that meets best practices to inform budgetary decisions and ensure that the program is implemented as planned. 
This standard is particularly important in a budgetary environment with scarce resources. IRDM's 2011 cost estimate could be improved by using EPO's expertise to ensure the cost estimate follows best practices from our cost guide. Additionally, more consistent guidance on using cost estimates to develop budget requests could help program managers for IRDM, as well as other programs, make budget decisions that are supported by reliable cost information. IRS could increase the credibility of IRDM's 2011 cost estimate by ensuring that IRDM's EVM data are reliable. Such reliability could also allow IRS to update projected costs for the remainder of IRDM's implementation. Using a WBS that is developed using best practices from our cost guide could provide a baseline from which to measure progress, a key component of an EVM system. Similarly, developing a single integrated schedule for IRDM that contains all resources needed to implement the program could provide more meaningful EVM data. Finally, providing oversight of the EVM system, through an independent baseline validation and EVM surveillance, could help identify potential program risks and any possible issues with contractor performance. Recommendations for Executive Action: To improve the quality of cost and budget information for IRS IT projects, we recommend that the Commissioner of Internal Revenue take the following four actions: 1. Ensure that the IRDM life-cycle cost estimate is reliable and that budget requests are justified by a reliable cost estimate that follows best practices. 2. Require project managers to consult with EPO to determine if projects could benefit from EPO assistance in updating cost estimates for programs that exceed thresholds recommended by MITS's Estimation Procedures document. For those projects where EPO does not update the cost estimate, IRS should require that the decision and rationale be documented. 
EPO should use the information from updated cost estimates to develop a historical repository of cost estimation data. 3. Review all guidance applicable to cost estimates and take steps to ensure that the guidance documents are consistent with one another. As a first step, IRS guidance should require the use of current and reliable project cost estimates to inform budget requests, in accordance with the Estimator's Reference Guide. 4. Improve the reliability of IRDM's EVM data, specifically: * address WBS issues by developing an EVM baseline for IRDM that reflects the same WBS as the detailed schedule and IRDM cost estimate; * address sequencing issues and enable the development of a time-phased budget baseline by creating a single integrated master schedule for IRDM that is properly sequenced and resource-loaded so that effective and meaningful EVM data can be obtained to better manage the program; * conduct an independent baseline validation for the IRDM EVM baseline; and: * conduct independent surveillance of EVM systems to ensure that data are reliable. Agency Comments and Our Evaluation: We provided a draft of this report to the Commissioner of Internal Revenue for his review and comment. We received written comments from the Deputy Commissioner for Operations Support, which are reprinted in appendix IV. We sought clarification on IRS's written response regarding whether it agreed with two of our recommendations, and on a reference to OMB guidance. IRS provided us with additional comments, which are summarized below. In addition, the agency provided technical comments, which we incorporated into the report as appropriate. IRS agreed with one of our four recommendations, partially agreed with another, and disagreed with two. 
While IRS's comment letter did not address the recommendation, the Director of Risk Management in MITS's Strategy and Planning Office told us the agency agrees with the recommendation to require certain IT project managers to consult with EPO about updating cost estimates, documenting decisions not to update cost estimates, and placing data from updated cost estimates in a repository. Similarly, IRS's comment letter did not address our recommendation to ensure that its cost estimation guidance is consistent. However, IRS officials said they partially agree with this recommendation and have taken steps to ensure that their estimation practices and procedures follow consistent, documented guidance. They noted, however, that in all instances IRS IT cost estimates will be based on the best information available at the time the estimate is requested or required, as opposed to our recommendation that IRS require the use of current and reliable cost estimates to inform all budget requests. We note in our report that using an unreliable cost estimate could lead to budget requests that do not accurately reflect program funding needs. Once developed in conformance with guidance and best practices, current and reliable cost estimates can be maintained through normal required monitoring and cost tracking procedures, unless significant changes in project circumstances warrant updating the cost estimate. IRS disagreed with our recommendation to ensure that the IRDM life-cycle cost estimate is reliable and that budget requests are justified by a reliable cost estimate that follows best practices. In its comment letter, IRS wrote that implementing the recommendation would require it to spend resources that do not directly contribute to the successful implementation of the IRDM program, and that the IRDM program is within its budget and schedule. 
At the end of fiscal year 2011, IRDM was under budget by more than 18 percent and behind schedule by almost 13 percent, equal to over 4 months behind schedule, according to IRDM's EVM data. Because the IRDM program does not have a reliable cost estimate, budget authorities do not have reliable information to determine an appropriate funding level. Such variances indicate that the current funding level may not be appropriate. As we reported, IRS estimated in July 2011 that it would take about 90 days, comprising 8 staff months (or a direct staff cost likely less than $200,000, according to our calculations using a top government salary rate and general benefits rate), to complete a new cost estimate for IRDM. While we agree that federal resources are tight, we believe that such an investment could produce benefits that not only improve the reliability of the IRDM cost estimate, but also better ensure that IRS requests the correct amount of resources for IRDM to achieve successful implementation. Further, the benefits of a new estimate would stretch beyond that program and provide an important foundation for improving IRS cost estimates in general. The OMB Capital Programming Guide directs agencies to develop sound cost estimates based on our cost guide and states that, during the budget process, the credibility of the costs will be examined, and OMB and the Congress will hold agencies accountable for meeting the schedule and performance goals within the cost estimates. More reliable cost estimates enable Congress and other budget authorities to make more complete and informed decisions. IRS also disagreed with our recommendation to improve the reliability of EVM data, and stated that OMB's revisions to Circular A-11 remove EVM system requirements due to negative cost benefit. We disagree with IRS, as Circular A-11, Appendix J and the Capital Programming Guide still contain language that directs agencies to use EVM for major projects. 
Former Circular A-11, section 300.7, instructed agencies to use EVM system requirements to identify areas where problems are occurring when reporting on ongoing investments. While current guidance for Exhibit 300 no longer explicitly discusses EVM reporting on ongoing investments under section 300.7, other sections of the guidance still direct the use of EVM for managing IT capital assets and state that, in general, cost, schedule, and performance goals are to be controlled and monitored by using an EVM system. Moreover, our report assessed the reliability of IRDM's EVM data because the data were included in IRDM's 2011 cost estimate and the data are used to track the program's progress. Regardless of OMB requirements, any data used for cost estimation and program management, particularly when they help to support a budget request, should be reliable. We will send copies of this report to the Chairmen and Ranking Members of Senate and House committees and subcommittees that have appropriation, authorization, and oversight responsibilities for IRS. We will also send copies to the Commissioner of Internal Revenue, the Secretary of the Treasury, the Chairman of the IRS Oversight Board, and the Acting Director of the Office of Management and Budget. Copies are also available at no charge on the GAO Web site at [hyperlink, http://www.gao.gov]. If you or your staffs have any questions concerning this report, please contact me at (202) 512-9110 or brostekm@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix V. 
Signed by: Michael Brostek: Director, Tax Issues Strategic Issues: [End of section] Appendix I: Scope and Methodology: This report builds on our May 2011 report[Footnote 40] on the Internal Revenue Service's (IRS) Information Reporting and Document Matching (IRDM) program and analysis performed during a budget justification review of the IRDM program, conducted from June 2011 to August 2011, which we provided to Congress as technical assistance. In the May 2011 report, we assessed IRDM's 2009 solution-concept based cost estimate. For the budget justification work and this report, we assessed IRDM's 2007 preliminary cost estimate. Also for this report, we assessed IRDM's 2011 cost estimate. To assess the extent to which the IRDM funding request is supported by a reliable cost estimate, and if not reliably supported, why not, we compared the Modernization and Information Technology Services (MITS) division's 2011 IRDM cost estimate with the characteristics of a high-quality cost estimate, identified in the GAO Cost Estimating and Assessment Guide.[Footnote 41] The 2011 IRDM cost estimate documentation included spend plans, the Exhibit 300, Earned Value Management (EVM) data, project schedules, and other documents.[Footnote 42] Our cost guide, which is based on extensive research of best practices for estimating program schedules and costs, indicates that a high-quality, valid, and reliable cost estimate should be well documented, comprehensive, accurate, and credible; we analyzed the cost estimating practices used by MITS against these characteristics and rated each characteristic as: Met, Substantially Met, Partially Met, Minimally Met, or Not Met. 
To do so, we scored each of the individual key practices associated with cost and scheduling best practices on a scale of 1-5 (Does Not Meet = 1, Minimally Meets = 2, Partially Meets = 3, Substantially Meets = 4, and Meets = 5), and then averaged the individual practice scores to determine the overall rating. We shared our cost guide as well as our preliminary analysis of the IRDM 2011 cost estimate with program officials. When warranted, we updated our analyses based on the agency's response and additional documentation provided to us. Once we determined that the 2011 cost estimate did not fully meet best practices, we determined why this occurred. To this end, we interviewed officials in IRDM's Program Management Office and MITS's Estimation Program Office (EPO). We also compared IRS guidance that addresses cost estimation--the EPO's Estimator's Reference Guide, MITS's Estimation Procedures document, and IRS's Information Technology Investment Planning and Management Guide--to criteria in our cost guide. 
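The scoring method described above can be sketched as follows. This is a simplified illustration; mapping the average back to a label by rounding to the nearest whole score is our assumption, as the methodology states only that the individual scores are averaged to determine the overall rating.

```python
# Scale from the methodology: Does Not Meet = 1 through Meets = 5.
RATING_SCALE = {
    "Does Not Meet": 1,
    "Minimally Meets": 2,
    "Partially Meets": 3,
    "Substantially Meets": 4,
    "Meets": 5,
}
SCORE_TO_RATING = {score: label for label, score in RATING_SCALE.items()}


def overall_rating(practice_ratings):
    """Average the key-practice scores for one characteristic and
    map the average back to a rating label (rounding is assumed)."""
    scores = [RATING_SCALE[label] for label in practice_ratings]
    avg = sum(scores) / len(scores)
    return SCORE_TO_RATING[round(avg)], avg
```

For example, key practices rated "Meets", "Partially Meets", and "Partially Meets" would average roughly 3.7, yielding an overall rating of "Substantially Meets" under this rounding assumption.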
To assess the extent to which IRS's practices for capturing IRDM's actual costs and comparing them to estimated costs, or EVM, generate reliable performance data, we compared the EVM data for IRDM and IRS's process for maintaining the data to the high-level EVM data reliability tasks outlined in our cost guide.[Footnote 43] We assessed the extent to which IRDM's EVM data adhered to three of the American National Standards Institute's (ANSI)[Footnote 44] 32 guidelines; we selected the three guidelines to represent some of the fundamental steps for maintaining a reliable EVM system, as identified in our cost guide, and because these guidelines are also included in the Department of the Treasury's Earned Value Management Guide, which applies to IRS (see appendix III for a list of the data reliability tasks and ANSI guidelines we reviewed).[Footnote 45] In situations where Treasury's Earned Value Management Guide did not require certain OMB or GAO best practices, we assessed IRDM's EVM practices against the Treasury guidance. To do this analysis, we compared the work breakdown structures used in IRDM's EVM system, schedules, and cost estimates and identified differences in each. We also assessed each of the five IRDM schedules[Footnote 46] against scheduling best practices for ensuring that the activities are sequenced and related using network logic, as identified in our cost guide. For both objectives, we interviewed IRS officials in the MITS division, specifically, officials from the IRDM Program Management Office and the Investment Planning and Management Office, which includes EPO. We spoke primarily with officials at IRS Headquarters in Washington, D.C., and IRS's division office in New Carrollton, Maryland, where the officials responsible for IRDM are located. 
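The network-logic check applied to the schedules can be sketched as follows. This is a hypothetical illustration: the input format, a mapping from each activity to the list of its predecessor IDs, is our assumption, not the format of IRDM's schedule files.

```python
def activities_missing_logic(schedule):
    """Flag activities with neither a predecessor nor a successor.

    schedule maps an activity ID to the list of its predecessor IDs;
    successors are inferred by inversion. A schedule's true start and
    finish milestones legitimately lack one of the two links, so a
    real assessment would exempt them; this sketch does not.
    """
    has_successor = set()
    for predecessors in schedule.values():
        has_successor.update(predecessors)
    return sorted(
        activity
        for activity, predecessors in schedule.items()
        if not predecessors and activity not in has_successor
    )
```

Activities returned by this check are "dangling": if a predecessor slips, nothing forces the scheduling tool to recalculate their forecasted start and finish dates, which is the kind of missing-logic-link problem described in this report.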
To assess the reliability of the cost estimate data that we used to support findings in this report, we reviewed relevant program documentation, such as cost estimation spreadsheets, as available, to substantiate evidence obtained from interviews with knowledgeable agency officials. We found the data we used to be sufficiently reliable for the purposes of our report. As appropriate, we attributed the sources of the data. We are making recommendations to IRS to improve data reliability in the future. We conducted this performance audit from August 2011 through January 2012 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. [End of section] Appendix II: Assessment of MITS's Current IRDM Cost Estimate: The following table outlines our assessment of the extent to which the Internal Revenue Service's (IRS) 2011 Information Reporting and Document Matching (IRDM) program cost estimate meets best practices, depicted in figures 1-4. Table 3: MITS IRDM Cost Estimate Alignment with Best Practices: Best practices characteristics: A comprehensive cost estimate; Overall assessment: Partially meets best practices for a comprehensive cost estimate. Best practices characteristics: A comprehensive cost estimate: Includes all life-cycle costs. A life-cycle cost estimate provides a complete and structured accounting of all resources and associated cost elements required to develop, produce, deploy, and sustain a particular program. 
It should cover the inception of the program through its retirement; Assessment of whether best practices met: The estimate includes costs through FY 2014, but IRS plans for the project to continue through FY 2016, and detailed costs are only provided for FY 2012. (Minimally meets); Effect: A life-cycle cost estimate should encompass all past, present, and future costs for every aspect of the program, regardless of funding source, including all government and contractor costs. Life-cycle cost estimates that include all costs can enhance program managers' decision making by allowing them to evaluate design trade-off studies on a total cost basis as well as on a technical and performance basis. Best practices characteristics: A comprehensive cost estimate: Completely defines the program, reflects the current schedule, and is technically reasonable. The cost estimate should be based on a documented technical baseline description, which provides a common definition of the program, including detailed technical, program, and schedule descriptions of the system; Assessment of whether best practices met: The estimate reflects the current project schedule and contains high-level information about technical specifications but lacks details that would completely define the program. (Partially meets); Effect: Understanding the program--including the acquisition strategy, technical definition, characteristics, system design features, and technologies to be included--is key to developing a credible cost estimate. Without these data, the cost estimator will not be able to identify the technical and program parameters that will bind the cost estimate. Best practices characteristics: A comprehensive cost estimate: Has a product-oriented work breakdown structure (WBS), traceable to the program's technical scope, at an appropriate level of detail. 
A WBS provides a basic framework for a variety of related activities like estimating costs, developing schedules, identifying resources and potential risks, and providing the means for measuring program status using EVM. It is product-oriented if it allows a program to track cost and schedule by defined deliverables, such as a hardware or software component; Assessment of whether best practices met: Each of the four IRDM projects has a product-oriented WBS, but the WBSs are not consistent and they are not traceable to the cost estimate. (Partially meets); Effect: A WBS provides a necessary framework for the program to develop a schedule and cost plan that can easily track technical accomplishments. A standard, product-oriented WBS facilitates the tracking of resource allocations and expenditures, which can give the agency insight to reliably estimate the cost of future similar programs. Best practices characteristics: A comprehensive cost estimate: Documents all cost-influencing ground rules and assumptions. Cost estimates are typically based on limited information and therefore need to be bound by ground rules and assumptions. Ground rules are a set of estimating standards that provide guidance and common definitions, while assumptions are judgments about past, present, or future conditions that may affect the estimate. Any risks associated with assumptions should be identified and traced to specific WBS elements; Assessment of whether best practices met: No ground rules were documented. Several documents included assumptions, but none discussed associated risks. (Minimally meets); Effect: Unless ground rules and assumptions are clearly documented, the cost estimate will not have a basis for assessing potential risks. Furthermore, the estimate cannot be reconstructed when the original estimators are no longer available. 
Best practices characteristics: A well documented cost estimate should: Overall assessment: Minimally meets best practices for a well documented cost estimate. Best practices characteristics: A well documented cost estimate should: Capture the source data used, the reliability of the data, and how the data were made compatible with other data in the estimate. Data should be collected from primary sources. The source, content, time, and units should be adequately documented. Further, data should be analyzed to determine accuracy and reliability, and to identify cost drivers; Assessment of whether best practices met: Data sources are listed, but the estimate is not consistent with the source data that it cites, bringing into question the reliability of the data. (Minimally meets); Effect: Data are the foundation of every cost estimate. Depending on data quality, the estimate can range anywhere from a mere guess to a highly defensible cost position. Data are often in many different forms and need to be adjusted before being used. The cost estimator needs information about the source and reliability of the data in order to know whether the data collected can be used directly or need to be modified. Best practices characteristics: A well documented cost estimate should: Describe the calculations and the methodology used to derive each element's cost. Documentation should describe what calculation methods are used, as well as how they were applied, and explain any anomalies; Assessment of whether best practices met: Documentation does not fully explain how IRS derived estimated costs. (Partially meets); Effect: Poorly documented cost estimates can cause a program's credibility to suffer because the documentation cannot explain the rationale of the methodology or the calculations. Estimates that lack sufficient documentation are not useful for updates or information sharing and can hinder understanding and proper use. 
Best practices characteristics: A well documented cost estimate should: Describe how the estimate was developed. The data supporting the estimate should be available and adequately documented so that the estimate can be easily updated to reflect actual costs or program changes; Assessment of whether best practices met: Labor cost calculations are described at a high level, but the staffing level is not traceable to the schedule. Hardware and software calculations are not described, and costs are not consistent across documents. (Minimally meets); Effect: Without good documentation, management and oversight organizations will not be convinced that the estimate is credible; supporting data, lessons learned, and reasons why costs changed will not be available for future use; questions about the approach or data used to create the estimate cannot be answered; and the scope of the analysis cannot be thoroughly defined. Best practices characteristics: A well documented cost estimate should: Discuss the technical baseline description. A technical baseline description provides a common definition of the program, including detailed technical, program, and schedule descriptions of the system, for a cost estimate to be built on. The data in the technical baseline should be consistent with the cost estimate; Assessment of whether best practices met: Documentation links hardware and software specifications to costs, but project-specific costs are only provided for FY 2012. (Minimally meets); Effect: Because the technical baseline is intended to serve as the basis for developing a cost estimate, it should be discussed in the cost estimate documentation. Without a technical baseline, the cost estimate will not be based on a comprehensive program description and will lack specific information regarding technical and program risks. Best practices characteristics: A well documented cost estimate should: Provide evidence of management review and acceptance. 
There should be a briefing to management, including a clear explanation of how the cost estimate was derived. Management's acceptance of the cost estimate should be documented; Assessment of whether best practices met: The estimate was developed by IRDM managers, not cost estimators, and while it was reviewed within the IRDM program, there is no evidence of review by the top IRDM managers. (Minimally meets); Effect: A cost estimate is not considered valid until management has approved it. It is imperative that management understand how the estimate was developed, including the risks associated with the underlying data and methods. Best practices characteristics: An accurate cost estimate: Overall assessment: Minimally meets best practices for an accurate cost estimate. Best practices characteristics: An accurate cost estimate: Produces unbiased results. Cost estimates should have an uncertainty analysis, which determines where the estimate falls against the range of all possible costs; Assessment of whether best practices met: No confidence levels are listed and documents do not provide a range of possible costs. Neither risk nor uncertainty is mentioned. (Does not meet); Effect: A cost estimate is biased if the estimated work is overly conservative or too optimistic. Unless the estimate is based on an assessment of the most likely costs and reflects the degree of uncertainty given all of the risks considered, management will not be able to make informed decisions. Best practices characteristics: An accurate cost estimate: Is properly adjusted for inflation. Cost data should be adjusted for inflation to ensure that comparisons and projections are valid. Data should also be normalized to constant year dollars to remove the effects of inflation. Also, inflation assumptions must be well documented; Assessment of whether best practices met: Documentation contains no evidence that the cost estimate is adjusted for inflation. 
(Does not meet); Effect: Adjusting for inflation is important because in the development of an estimate, cost data must be expressed in like terms. If a mistake is made or the inflation amount is not correct, cost overruns can result. Best practices characteristics: An accurate cost estimate: Contains few mistakes. Results should be checked for accuracy, double-counting, and omissions. Validating that a cost estimate is accurate requires thoroughly understanding and investigating how the cost model was constructed; Assessment of whether best practices met: Documentation does not specify whether the cost estimate went through a quality control process, and there are inconsistencies among documents, but calculations are accurate. (Partially meets); Effect: Without access to estimate details, one cannot be certain that calculations are accurate or expressed consistently. Best practices characteristics: An accurate cost estimate: Is regularly updated to reflect significant program changes. The cost estimate should be updated to reflect significant program changes, such as changes to schedules or other assumptions. Updates should also reflect actual costs so that the estimate always reflects the current program status; Assessment of whether best practices met: IRS does not maintain an updated IRDM cost estimate in a single document. However, IRDM costs are tracked through the program's WBS and EVM system, and IRS evaluates them to develop spend plans, but we found that the EVM data are not reliable. (Minimally meets); Effect: If a cost estimate is not updated, it can become more difficult to analyze changes in program costs and collect cost and schedule data to support future cost estimates. The cost estimate should be updated when the technical baseline changes; otherwise, it will lack credibility. A properly updated cost estimate can provide decision makers with accurate information for assessing alternative decisions. 
Best practices characteristics: An accurate cost estimate: Documents and explains variances between planned and actual costs. Variances between planned and actual costs should be documented, explained, and reviewed. For any elements whose actual costs or schedules differ from the estimate, the estimate should discuss variances and lessons learned; Assessment of whether best practices met: Variances between planned and actual costs, and explanations for the variances, are documented in EVM data, but the EVM data are unreliable. (Minimally meets); Effect: Without a documented comparison between the current estimate (updated with actual costs) and the old estimate, cost estimators cannot determine the level of variance between the two estimates. That is, the estimators cannot see how well they are estimating and how the program is changing over time. Best practices characteristics: An accurate cost estimate: Reflects cost estimating experiences from comparable programs. The estimate should be based on historical cost estimation data and actual experiences from other comparable programs. These data should be reliable and relevant to the new program; Assessment of whether best practices met: Documentation contains no evidence that the estimate is based on a model that uses historical records of cost estimating and actual experiences from comparable programs. (Does not meet); Effect: Historical data provide the cost estimator with insight into actual costs on similar programs, including any cost growth that occurred after the original estimate. As a result, historical data can be used to challenge optimistic assumptions and bring more realism to a cost estimate. Best practices characteristics: A credible cost estimate includes: Overall assessment: Does not meet best practices for a credible cost estimate. Best practices characteristics: A credible cost estimate includes: A sensitivity analysis that identifies a range of possible costs based on varying inputs. 
A sensitivity analysis examines how changes to key assumptions and inputs affect the estimate. The estimate should identify key cost drivers, examine their parameters and assumptions, and re-estimate the total cost by varying each parameter between its minimum and maximum range; Assessment of whether best practices met: Documentation contains no evidence that a sensitivity analysis was performed. (Does not meet); Effect: Because uncertainty cannot be avoided, it is necessary to identify the cost elements that represent the most risk. A sensitivity analysis reveals how the cost estimate is affected by a change in a single assumption, which helps the cost estimator to understand which variable(s) most affects the cost estimate. Any sources of variation should be well documented and traceable. Best practices characteristics: A credible cost estimate includes: A risk and uncertainty analysis. A risk and uncertainty analysis recognizes the potential for error and attempts to quantify it by identifying the effects of changing key cost drivers; Assessment of whether best practices met: Documentation contains no evidence that a risk analysis or an uncertainty analysis was done. (Does not meet); Effect: For management to make good decisions, the program estimate must reflect the degree of uncertainty, so that a level of confidence can be given about the estimate. An estimate without risk and uncertainty analysis is unrealistic because it does not assess the variability in the cost estimate from such effects as schedules slipping, missions changing, and proposed solutions not meeting users' needs. Best practices characteristics: A credible cost estimate includes: Cross-checking of major cost elements. A cross-check is done by using a different cost estimation method to see if it produces similar results; Assessment of whether best practices met: Documentation contains no evidence that cross-checks were performed. 
(Does not meet); Effect: If a cross-check demonstrates that alternative methods can produce similar results, then confidence in the estimate increases, leading to greater credibility. Best practices characteristics: A credible cost estimate includes: A comparison to an independent cost estimate conducted by another organization. A second, independent cost estimate should be performed by an organization outside of the program office's influence. It should be based on the same technical baseline, ground rules, and assumptions as the original estimate; Assessment of whether best practices met: Documentation contains no evidence that an independent cost estimate was performed. (Does not meet); Effect: An independent cost estimate is considered one of the best and most reliable methods for validating an estimate. It provides an independent view of expected program costs that tests the program office's estimate for reasonableness. Without an independent cost estimate, decision makers will lack insight into a program's potential costs because independent cost estimates frequently use different methods and are less burdened with organizational bias. Source: GAO analysis of IRS's 2011 IRDM cost estimate and GAO-09-3SP. [End of table] [End of section] Appendix III: High-Level Assessment of the Reliability of IRDM's EVM System: Table 4: IRDM EVM Data Reliability: Data reliability task: Maintain an earned value management (EVM) system that is compliant with the agency's scaling of ANSI guidelines[A]; Was task met? No; Explanation of GAO assessment: IRDM did not meet the three ANSI guidelines we assessed. Data reliability task: Work Elements: Define the authorized work elements for the program, typically done using a WBS; Was task met? No; Explanation of GAO assessment: The WBS used in IRDM's schedules did not match the WBS used for EVM. 
Data reliability task: Sequencing: Schedule the authorized work in a manner that describes the sequence of work and identifies significant task interdependencies; Was task met? No; Explanation of GAO assessment: The IRDM schedules did not have proper sequencing; related project activities were not linked. Data reliability task: Time-phased budget baseline:[B] At the control account level, establish and maintain a time-phased budget baseline, against which program performance can be measured; Was task met? No; Explanation of GAO assessment: IRS was unable to show evidence that it established a time-phased budget baseline for IRDM. IRDM does not have one overall schedule, and none of the four IRDM project schedules were completely resource loaded. Data reliability task: Conduct an integrated baseline review, or independent baseline validation, for the program[C]; Was task met? No; Explanation of GAO assessment: Neither an integrated baseline review nor an independent baseline validation was conducted for IRDM. Data reliability task: Using independent and qualified staff, conduct surveillance on the EVM system[D]; Was task met? No; Explanation of GAO assessment: Surveillance on the IRDM EVM system was not conducted, and IRS does not have staff with sufficient technical skills to conduct surveillance. Sources: GAO analysis of IRDM EVM data, WBSs, and project schedules, as well as OMB and Treasury guidance and GAO-09-3SP. [A] The American National Standards Institute (ANSI) has a national EVM standard, comprising 32 guidelines, that enables agencies to evaluate an EVM system to determine if cost, schedule, and technical performance data can be relied on for program management. [B] The time-phased budget baseline, against which performance is measured, is formed from the performance measurement baseline, which is essentially the resource consumption plan for the program. Deviations from the baseline identify areas where management should focus attention. 
A performance measurement baseline represents the cumulative value of a program's planned work over time. This baseline takes into account the program activities that occur in a sequenced order, based on finite resources, with budgets representing those resources spread over time. [C] An integrated baseline review is an assessment done by program management and contractors to verify that the program baseline is adequate and realistically portrays all authorized work according to the schedule. For a project of IRDM's size, guidance from the Department of the Treasury states that an independent baseline validation, which appears to be a less rigorous version of an integrated baseline review, can be performed. [D] Surveillance is reviewing a contractor's EVM system to observe ANSI compliance and how well a contractor is using its EVM system to manage cost, schedule, and technical performance. [End of table] [End of section] Appendix IV: Comments from the Internal Revenue Service: Department of The Treasury: Internal Revenue Service: Deputy Commissioner: Washington, D.C. 20224: January 19, 2012: Mr. Michael Brostek: Director, Tax Issues, Strategic Issues Team: U.S. Government Accountability Office: 441 G Street, NW: Washington, DC 20548: Dear Mr. Brostek: Thank you for the opportunity to comment on the Government Accountability Office (GAO) draft report titled IRS Management: Cost Estimate for New Information Reporting System Needs to be Made More Reliable (GAO-12-59). Information reporting is essential to increase voluntary compliance by expanding and maximizing the matching program. We recognize the importance of these provisions and the impact they have on compliance. We are pleased to report that the IRS has completed components of the Information Return and Document Matching (IRDM) program. We are also on track to deliver additional information technology (IT) components to automatically compare different sources of tax information to improve compliance. 
This program continues to deliver within budget, schedule, and scope for both interim and overall milestones. Given the successful record of on time, schedule, and budget delivery for this program, we are concerned with the draft report from GAO regarding the IRDM cost estimate. Specifically, between Fiscal Year (FY) 2009 through FY 2011, IRS received $52 million for IRDM of which IRS spent $46 million. In its report, GAO recognizes that federal funds are scarce, and it is imperative that information technology (IT) projects such as IRDM are implemented as planned. The IRS agrees and has demonstrated this approach through continued successful implementation of the IRDM program. The report, however, implies that the IRS should spend additional resources to redo cost estimates for a program that is on schedule and within budget. The IRS disagrees with the recommendation to spend resources creating additional documentation or following practices that do not directly contribute to the successful implementation of the IRDM program or other IT programs that are meeting all governing requirements. The IRS is committed to cost estimation as one component of IT investment management, and as GAO recognized, our IT Estimation Program Office (EPO) has expertise in cost estimation best practices. The IRS will continue to leverage its cost estimation expertise and processes as appropriate in managing its IT investments. With regard to Earned Value Management (EVM), the Office of Management and Budget has revised Circular A-11 to remove EVM system requirements due to negative cost benefit. Therefore, the IRS disagrees with GAO's recommendations in this area. We appreciate your continued support and guidance. If you have any questions, please contact me, or a member of your staff may contact Terence V. Milholland, Chief Technology Officer, at (202) 622-6800. 
Sincerely, Signed by: Beth Tucker: Deputy Commissioner for Operations Support: [End of section] Appendix V: GAO Contact and Staff Acknowledgments: GAO Contact: Michael Brostek, (202) 512-9110 or brostekm@gao.gov: Staff Acknowledgments: In addition to the contact named above, Libby Mixon, Assistant Director; Laurel Ball; Amy Bowser; Bill Cordrey; Jennifer Echard; Robert Gebhart; Paul Middleton; Donna Miller; Sabine Paul; Karen Richey; Stacy Steele; and Lindsay Swenson made key contributions to this report. [End of section] Footnotes: [1] IRDM systems will automatically match income or expense information reported by third parties to income or expenses reported on a taxpayer's tax returns. For example, for certain securities sales, brokers must report to the IRS cost basis information (generally, the difference between the gross proceeds from the securities sale and the original purchase price, net of any fees or commissions) on the information return, Form 1099-B, "Proceeds from Broker and Barter Exchange Transactions." IRDM systems will then compare this information to an individual's tax return data on securities sales, such as that found on the Form 1040 Schedule D, "Capital Gains and Losses," enabling it to identify potential compliance problems. [2] GAO, Information Reporting: IRS Could Improve Cost Basis and Transaction Settlement Reporting Implementation, [hyperlink, http://www.gao.gov/products/GAO-11-557] (Washington, D.C.: May 19, 2011). [3] A reliable cost estimate is (1) comprehensive, (2) well documented, (3) accurate, and (4) credible. See GAO, GAO Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs, [hyperlink, http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009), for a full description of each of the four characteristics. [4] The IRDM system will increase information that is available for compliance purposes and potentially raise significant revenue. 
Two components of the IRDM program, cost basis and transaction settlement reporting, are expected to generate more than $16 billion over the next 10 years, according to the Joint Committee on Taxation in 2008. [5] [hyperlink, http://www.gao.gov/products/GAO-11-557]. [6] See GAO, Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs, [hyperlink, http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: Mar. 9, 2009). [7] EPO is IRS's cost estimation organization. [8] A sole proprietor is an individual who owns an unincorporated business by himself or herself. [9] For more information on the IRDM program, including IT system development plans, see [hyperlink, http://www.gao.gov/products/GAO-11-557]. [10] Office of Management and Budget, Capital Programming Guide: Supplement to Circular A-11, Planning, Budgeting, and Acquisition of Capital Assets, (Executive Office of the President, Washington, D.C.: August 2011). [11] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. [12] In our May report, [hyperlink, http://www.gao.gov/products/GAO-11-557], based on information from MITS officials, we noted an initial estimate developed in 2008 for IRDM's initial budget request. We obtained additional information on IRDM cost estimates indicating that the initial estimate was provided for the budget request in 2007, and subsequently confirmed this information with MITS officials. [13] IRS plans for IRDM system development funding to continue through FY 2016. [14] A solution concept based estimate relies on a document, referred to as a solution concept, which explains a project proposal's functional scope and technical solution. [15] The 2009 IRDM SCBE did not follow MITS's standard cost estimation process because of time constraints. For this estimate, MITS heavily modified its estimation techniques and used broad assumptions. [16] [hyperlink, http://www.gao.gov/products/GAO-11-557]. 
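The matching arithmetic that footnote 1 describes for Form 1099-B and Schedule D data can be sketched as follows. The function names, tolerance parameter, and dollar figures are hypothetical illustrations, not IRS systems or data:

```python
# Illustrative sketch of the income matching described in footnote 1.
# All names and figures are hypothetical, not IRS systems or data.

def broker_reported_gain(gross_proceeds, purchase_price, fees):
    """Gain a broker would report on Form 1099-B: gross proceeds less
    the original purchase price, net of any fees or commissions."""
    return gross_proceeds - purchase_price - fees

def flag_mismatch(reported_1099b_gain, schedule_d_gain, tolerance=0.0):
    """Flag a potential compliance problem when the taxpayer's
    Schedule D gain falls short of the broker-reported gain."""
    return (reported_1099b_gain - schedule_d_gain) > tolerance

# A broker reports a $2,000 sale of stock bought for $1,500 with $50 in fees.
gain = broker_reported_gain(2000.00, 1500.00, 50.00)  # 450.00
print(flag_mismatch(gain, schedule_d_gain=100.00))    # True: gain underreported
print(flag_mismatch(gain, schedule_d_gain=450.00))    # False: amounts match
```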
[17] The Exhibit 300 is a document required by OMB to support IT project budgets. It includes the project's desired outcome and budget justification. A WBS shows the requirements that must be accomplished to develop a program, and provides the basis for identifying resources and tasks for developing a program cost estimate. It provides a basic framework for estimating costs, developing schedules, identifying resources, determining where risks may occur, and providing the means for measuring program status using EVM. [18] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. [19] We rated the extent to which IRS met each best practice on the following scale: "Meets," IRS provided complete evidence that satisfies the entire criterion; "Substantially meets," IRS provided evidence that satisfies a large portion of the criterion; "Partially meets," IRS provided evidence that satisfies about half of the criterion; "Minimally meets," IRS provided evidence that satisfies a small portion of the criterion; and "Does not meet," IRS provided no evidence that satisfies the criterion. See appendix I for our full scope and methodology. [20] IRS plans for IRDM development to continue through fiscal year 2016. [21] Ground rules are a set of estimating standards that provide guidance and minimize conflicts in definitions, while assumptions are judgments about past, present, or future conditions that may affect the estimate. [22] A resource-loaded schedule is a schedule in which the staff, facilities, and materials needed to complete each activity are assigned to the activities that use them; it should be based on the project's WBS. [23] Cross-checks use alternative estimation methods to see if they produce similar results, and could enhance the estimate's reliability. [24] Risk and uncertainty refer to the fact that because a cost estimate is a forecast, there is always a chance that the actual cost will differ from the estimate. 
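The risk and uncertainty analysis that footnote 24 describes is commonly performed with Monte Carlo simulation over assumed cost ranges. The sketch below uses hypothetical cost elements, ranges, and dollar figures, not the IRDM estimate:

```python
# Minimal Monte Carlo sketch of a risk and uncertainty analysis.
# The cost elements, ranges, and dollar figures are hypothetical,
# not drawn from the IRDM estimate.
import random

random.seed(1)

# (minimum, most likely, maximum) cost for each element, in $ millions.
elements = {
    "software development": (20.0, 30.0, 50.0),
    "hardware": (5.0, 8.0, 12.0),
    "program management": (4.0, 6.0, 9.0),
}

def simulate_total_cost(trials=10_000):
    """Draw each element from a triangular distribution and sum."""
    totals = []
    for _ in range(trials):
        totals.append(sum(random.triangular(lo, hi, mode)
                          for lo, mode, hi in elements.values()))
    return sorted(totals)

totals = simulate_total_cost()
# Compare the point estimate (sum of most likely values) with an
# 80-percent confidence level from the simulated distribution.
point_estimate = sum(mode for _, mode, _ in elements.values())
p80 = totals[int(0.80 * len(totals))]
print(f"point estimate: ${point_estimate:.1f}M, 80% confidence: ${p80:.1f}M")
```

Because the assumed ranges are skewed toward overruns, the 80-percent confidence figure exceeds the point estimate, which is the kind of insight the best practice is meant to give decision makers.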
[25] A sensitivity analysis examines the effects of changing assumptions and estimating procedures to highlight elements that are cost-sensitive. [26] Although IRDM is not among IRS's largest programs by funding level, it is one of IRS's top seven investments, as determined by funding level and mission impact. According to our cost guide, for an estimate to be credible, it must be compared to an independent cost estimate. However, program managers and the cost estimating team should define the scope of the estimate based on its intended purpose, including the appropriate level of detail for an independent cost estimate. [27] IRS officials said it would take 1.5 FTEs for estimators working for 90 days (about 4.5 staff months), 1.5 FTEs from IRDM staff working for 60 days to assist with the estimate (about 3 staff months), and 0.6 additional IRDM FTEs working for 30 days (about 0.6 staff months). According to our calculations using a top government salary rate and general benefits rate, the direct staff cost associated with updating the cost estimate would likely be less than $200,000. [28] SEER mathematical models are commercially available cost estimation models, derived from extensive software project histories, behavioral models, and metrics. [29] IRS defines major investments as those that, among other things, have an overall life-cycle cost of greater than $50 million or an annual budget of greater than $5 million. Investments that do not meet these criteria are considered non-major. [30] DME is a term used by OMB to describe the program cost for new investments, changes or modifications to existing systems to improve capability or performance, changes mandated by the Congress or agency leadership, personnel costs for investment management, and direct support. For major IT investments, this amount should equal the sum of amounts reported for planning and acquisition plus the associated FTE costs reported in the Exhibit 300. 
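The staff-month arithmetic in footnote 27 can be checked with a short calculation. The salary and benefit rates below are assumptions for illustration; the report does not state the actual rates used:

```python
# Check of the staff-month arithmetic in footnote 27. The salary and
# benefit rates below are assumptions for illustration; the actual
# rates used are not given in the report.

def staff_months(ftes, days, days_per_month=30):
    """Convert an FTE commitment over a number of days to staff months."""
    return ftes * days / days_per_month

effort = (
    staff_months(1.5, 90)    # estimators: about 4.5 staff months
    + staff_months(1.5, 60)  # IRDM staff assisting: about 3 staff months
    + staff_months(0.6, 30)  # additional IRDM staff: about 0.6 staff months
)

# Assumed top government salary of $180,000 per year plus a 30 percent
# benefits rate gives a loaded monthly cost.
monthly_cost = 180_000 / 12 * 1.30

direct_cost = effort * monthly_cost
print(f"{effort:.1f} staff months, about ${direct_cost:,.0f}")
# Under these assumed rates, the total stays below the report's
# figure of likely less than $200,000.
```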
[31] A milestone is a point in time when management reviews updated cost, progress, and risk information. According to a MITS document, IRDM and eight other programs have DME costs exceeding $5 million in calendar year 2011. IRDM's DME costs exceed $5 million, but the program is not scheduled to exit milestone 3 until the summer of 2012. [32] In a review of 16 federal programs using EVM, we found that many programs did not fully implement practices to ensure data reliability. For example, 13 of the programs had deficiencies in program schedules, which undermined the quality of EVM data. Additionally, some programs did not conduct an integrated baseline review or ongoing EVM surveillance. The inconsistent application of EVM across the programs exists in part because of the weaknesses in some of the agencies' policies, as well as a lack of enforcement of the EVM policy. See GAO, Information Technology: Agencies Need to Improve the Implementation and Use of Earned Value Techniques to Help Manage Major System Acquisitions, [hyperlink, http://www.gao.gov/products/GAO-10-2] (Washington, D.C.: Oct. 8, 2009). In a review of the use of EVM at six selected Treasury projects, we found that none of the projects fully implemented practices to ensure that the data from their EVM systems were reliable. The review included projects at the Financial Management Service, IRS, the Bureau of Public Debt, and the Treasury departmental offices. See GAO, Information Technology: Treasury Needs to Better Define and Implement Its Earned Value Management Policy, [hyperlink, http://www.gao.gov/products/GAO-08-951] (Washington, D.C.: Sept. 22, 2008). [33] See [hyperlink, http://www.gao.gov/products/GAO-09-3SP] and Office of Management and Budget, Capital Programming Guide: Supplement to Circular A-11, Planning, Budgeting, and Acquisition of Capital Assets (Executive Office of the President, Washington, D.C.: August 2011). [34] We assessed IRDM's adherence to ANSI guidelines 1, 6, and 8. 
For DME projects costing less than $20 million, such as IRDM, Treasury EVM guidance states that a project only needs to follow a set of 10 ANSI guidelines (numbers 1, 2, 3, 6, 7, 8, 16, 22, 27, and 28). Treasury's 10 guidelines include 9 of the 10 guidelines that are considered by experts to be the most critical to follow. Treasury's list does not include guideline 9. For a list of the 32 ANSI guidelines, see [hyperlink, http://www.gao.gov/products/GAO-09-3SP], pp. 212-213. [35] A schedule provides a time sequence for the duration of a program's activities and helps clarify both the dates for major milestones and the activities that drive the schedule. A program schedule also provides the vehicle for developing a time-phased budget baseline. [36] Work packages are detailed tasks that are typically 4 to 6 weeks long. [37] The WBS should be used as the outline for the schedule because the WBS defines the work in lower levels of detail. Thus its framework provides the starting point for defining all activities and tasks that will be used to develop the schedule. Furthermore, by breaking the work into smaller, more manageable work elements, the WBS can be used to integrate the scheduled activities and costs for accomplishing EVM. A WBS, therefore, is an essential part of EVM cost, schedule, and technical monitoring because it provides a consistent framework from which to measure progress. [38] A performance measurement baseline is used in EVM to detect deviations from the plan and give insight into problems and potential impacts. [39] The time-phased budget baseline, against which performance is measured, is formed from the performance measurement baseline, which is essentially the resource consumption plan for the program. Deviations from the baseline identify areas where management should focus attention. A performance measurement baseline represents the cumulative value of a program's planned work over time. 
It takes into account the program activities that occur in a sequenced order, based on finite resources, with budgets representing those resources spread over time. [40] GAO, Information Reporting: IRS Could Improve Cost Basis and Transaction Settlement Reporting Implementation, [hyperlink, http://www.gao.gov/products/GAO-11-557] (Washington, D.C.: May 19, 2011). [41] See GAO, Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs, [hyperlink, http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009). [42] The Exhibit 300 is a document required by the Office of Management and Budget to support IT projects. It includes the project's desired outcome and budget justification. Earned value management is a project management tool that integrates the technical scope of work with schedule and cost elements for investment planning and control. [43] See [hyperlink, http://www.gao.gov/products/GAO-09-3SP], p. 98, for a list of EVM data reliability tasks; we assessed whether IRDM is implementing the first three. [44] ANSI has a national EVM standard, composed of 32 guidelines, that defines acceptable methods for agencies to evaluate an EVM system to determine whether cost, schedule, and technical performance data can be relied on for program management. We assessed whether IRDM was following guideline numbers 1, 6, and 8. [45] OMB provides guidance for cost estimation and EVM, but some departments and agencies may scale ANSI guidelines to fit specific projects. See Office of Management and Budget, Capital Programming Guide: Supplement to Circular A-11, Planning, Budgeting, and Acquisition of Capital Assets (Washington, D.C.: Executive Office of the President, August 2011). [46] IRS officials provided five schedules for IRDM: one for each of the four IRDM projects and a schedule depicting IRDM's progress following IRS Enterprise Lifecycle guidance. 
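The EVM mechanics described in notes 38 and 39 can be sketched briefly: the performance measurement baseline supplies a planned value of work over time, and deviations of the value of work actually earned from both the plan and the actual cost flag areas where management should focus attention. The sketch below is illustrative only; the dollar figures are hypothetical and are not drawn from the IRDM program or this report.

```python
# Illustrative sketch of standard EVM variance analysis.
# All dollar figures (in millions) are hypothetical, not IRDM data.

def evm_metrics(planned_value, earned_value, actual_cost):
    """Compute standard EVM variances and performance indices."""
    return {
        "cost_variance": earned_value - actual_cost,        # > 0 means under budget
        "schedule_variance": earned_value - planned_value,  # > 0 means ahead of plan
        "cost_performance_index": earned_value / actual_cost,
        "schedule_performance_index": earned_value / planned_value,
    }

# Hypothetical snapshot: $4.0M of work planned to date, $3.5M of work
# actually completed (earned), and $3.8M actually spent.
snapshot = evm_metrics(planned_value=4.0, earned_value=3.5, actual_cost=3.8)
# Negative variances here (cost variance -0.3, schedule variance -0.5)
# would direct management attention to a cost overrun and schedule slippage.
```

Performance indices below 1.0 (here, roughly 0.92 for cost and 0.875 for schedule) signal the same deviations in ratio form, which is why reliable baseline and actual-cost data are a precondition for the EVM analyses this report discusses.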
[End of section] GAO’s Mission: The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO’s commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO’s website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select “E-mail Updates.” Order by Phone: The price of each GAO publication reflects GAO’s actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO’s website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. Connect with GAO: Connect with GAO on Facebook, Flickr, Twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov]. 
To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; E-mail: fraudnet@gao.gov; Automated answering system: (800) 424-5454 or (202) 512-7470. Congressional Relations: Katherine Siggerud, Managing Director, siggerudk@gao.gov, (202) 512-4400, U.S. Government Accountability Office, 441 G Street NW, Room 7125, Washington, DC 20548. Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800, U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, DC 20548.