Guidehouse LLP; Jacobs Technology, Inc.

B-420860.1; B-420860.2; B-420860.3, October 13, 2022
Highlights

Guidehouse LLP, of Falls Church, Virginia, and Jacobs Technology, Inc., of Tullahoma, Tennessee, protest the award of an integration support contract (ISC) to BAE Systems Technology Solutions & Services Inc., of Rockville, Maryland, under request for proposals (RFP) No. FA8207-21-R-0001. The Department of the Air Force issued the solicitation for systems engineering and integration services in support of the Intercontinental Ballistic Missile (ICBM) organization. Guidehouse and Jacobs challenge the agency's evaluation of professional employee compensation plans, the assessments of cost realism, and the resulting award decision. Jacobs also alleges that the agency unreasonably failed to identify risk in BAE's proposal.

We sustain the protests.

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: Guidehouse LLP; Jacobs Technology, Inc.

File: B-420860.1; B-420860.2; B-420860.3

Date: October 13, 2022

Jason A. Carey, Esq.; J. Hunter Bennett, Esq., Andrew R. Guy, Esq., and Jennifer Bentley, Esq., Covington & Burling, LLP, for Guidehouse LLP; and Brian P. Waagner, Esq., Steven A. Neeley, Esq., George E. Stewart, Esq., and Leah C. Kaiser, Esq., Husch Blackwell LLP, for Jacobs Technology, Inc., the protesters.
Jamie F. Tabb, Esq., Tyler E. Robinson, Esq., Elizabeth Krabill McIntyre, Esq., and John M. Satira, Esq., Vinson & Elkins LLP, for BAE Systems Technology Solutions & Services Inc., the intervenor.
Colonel Frank Yoon, Josephine R. Farinelli, Esq., and Isabelle P. Cutting, Esq., Department of the Air Force, for the agency.
Samantha S. Lee, Esq., and Peter H. Tran, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1. Protests challenging the agency’s evaluation of proposed professional employee compensation plans are sustained where the record does not demonstrate that the agency conducted an evaluation in accordance with the requirements of Federal Acquisition Regulation provision 52.222-46.

2. Protest challenging the agency’s evaluation of the awardee’s operational capability proposal is denied where the evaluation was reasonable and consistent with the terms of the solicitation.

DECISION

Guidehouse LLP, of Falls Church, Virginia, and Jacobs Technology, Inc., of Tullahoma, Tennessee, protest the award of an integration support contract (ISC) to BAE Systems Technology Solutions & Services Inc., of Rockville, Maryland, under request for proposals (RFP) No. FA8207‑21‑R‑0001. The Department of the Air Force issued the solicitation for systems engineering and integration services in support of the Intercontinental Ballistic Missile (ICBM) organization. Guidehouse and Jacobs challenge the agency’s evaluation of professional employee compensation plans, the assessments of cost realism, and the resulting award decision. Jacobs also alleges that the agency unreasonably failed to identify risk in BAE’s proposal.

We sustain the protests.

BACKGROUND

The mission of the ICBM organization at Hill Air Force Base, Utah, is to manage 400 combat-capable nuclear missiles that are safe, secure, and effective. Contracting Officer’s Statement (COS) at 2.[1] The ICBM organization includes the program office for the current nuclear deterrent force, the Minuteman III (MMIII), and the program office for the next generation weapon system, the Ground Based Strategic Deterrent (GBSD). Agency Report (AR), Tab 4b, Performance Work Statement (PWS) at 3.

The ISC provides systems engineering and integration services to the ICBM organization. Memorandum of Law (MOL) at 5. The Air Force awarded the predecessor contract (FA-8214-13-C-0001), known as ISC 1.0, to BAE in August 2013. Id. Near the end of the expected term of ISC 1.0, the Air Force began developing requirements for a follow-on contract. Id. On November 20, 2020, pursuant to the procedures of Federal Acquisition Regulation (FAR) part 15, the Air Force issued solicitation No. FA8207-21‑R‑0001 (at issue here) for the follow-on contract known as ISC 2.0. AR, Tab 4a, RFP at 1. The RFP contemplated the award of a single indefinite-delivery, indefinite-quantity (IDIQ) contract with a 5-year base period and the potential for an 18-year overall period of performance. COS at 2; AR, Tab 4c, RFP Sections L&M at 1; RFP at 2.

Although the RFP included both cost-reimbursement and fixed-price contract line item numbers (CLINs), it is primarily a cost-plus-award-fee (CPAF) vehicle. COS at 2-3. The Air Force anticipated awarding five individual awardable task orders (ATOs) with the basic contract. AR, Tab 4c, RFP Sections L&M at 1.

The solicitation provided that award would be made on a best-value tradeoff basis, considering the following evaluation factors in descending order of importance: (1) technical capability; (2) mission capability; and (3) cost. Id. at 28. The technical capability factor comprised three equally important subfactors, each to be assessed with an adjectival technical rating and an adjectival risk rating: (1) test anomaly root cause; (2) systems engineering; and (3) digital engineering.[2] Id. The mission capability factor comprised four subfactors: (1) workforce management; (2) operational capability; (3) transition approach; and (4) small business participation. Id. Within this evaluation factor, subfactors one through three would also be assessed with adjectival ratings and were listed in descending order of importance; subfactor four (small business participation) would be rated only on an acceptable/unacceptable basis. Id. Ratings for the non-cost evaluation factors would be assigned only at the subfactor level; no rating would be provided for the individual non-cost factors. When combined, all non-cost factors were significantly more important than cost. Id.

Relevant here, the solicitation advised offerors that FAR provision 52.222-46, Evaluation of Compensation for Professional Employees, was incorporated into the solicitation, and offerors were to submit professional employee compensation plans (PECPs) that would be evaluated under the workforce management subfactor. AR, Tab 4c, RFP Sections L&M at 32. The solicitation also provided that the agency would evaluate the realism of proposed labor rates as part of the cost evaluation. Id. at 35.

The Air Force received proposals from five offerors by the January 22, 2021, deadline for receipt of proposals. COS at 30-31. Upon evaluation of these proposals, the agency established a competitive range and entered into discussions with all five offerors, and subsequently eliminated two of the offerors such that only Guidehouse, Jacobs, and BAE remained. Id. at 52-53. The Air Force then requested that the offerors submit final proposal revisions by December 15. Id. at 53.

The agency summarized its evaluation of proposals as follows:

 

                                Guidehouse      Jacobs          BAE
TECHNICAL CAPABILITY
  Test Anomaly Root Cause       Good            Good            Acceptable
  Systems Engineering           Outstanding     Good            Outstanding
  Digital Engineering           Outstanding     Outstanding     Outstanding
MISSION CAPABILITY
  Workforce Management          Acceptable      Acceptable      Acceptable
  Operational Capability        Acceptable      Acceptable      Acceptable
  Transition Approach           Acceptable      Good            Good
  Small Business Participation  Acceptable      Acceptable      Acceptable
COST                            $4,140,478,486  $3,794,670,369  $3,331,901,783

 

AR, Tab 16a, Source Selection Decision Document (SSDD) at 4. The evaluators assessed an adjectival risk rating of low under every subfactor for all three offerors. Id.

The source selection authority (SSA) agreed with the analysis and recommendations of the source selection evaluation board (SSEB) and the source selection advisory council (SSAC) that BAE’s proposal represented the best value. Id. at 9. The SSA explained that although “Guidehouse offered the superior technical proposal overall, and the Jacobs’ technical approach was slightly better than BAE’s[,]” neither proposal merited the associated price premium, “particularly when many of their technical advantages are unquantifiable and unknown since their ultimate value will depend on the frequency of new efforts to sustain the MMIII weapon system during the performance of this contract.” Id. The agency therefore made award to BAE. Id. at 9‑10.

After receiving notices of award and debriefings, Guidehouse and Jacobs timely filed these protests with our Office.

DISCUSSION

The protesters challenge several aspects of the agency’s evaluation of proposals and award decision. First, Guidehouse and Jacobs allege that the agency failed to meaningfully evaluate total compensation plans under FAR provision 52.222-46, and that the Air Force conducted a flawed cost realism analysis. Protest at 21-33; Comments at 7-29; Supp. Comments at 3-23; Jacobs Protest at 10-18; Jacobs Comments & Supp. Protest at 2‑25, 27-29; Jacobs Supp. Comments at 2-12. Next, Jacobs asserts that the agency should have identified risk in BAE’s proposal based on BAE’s subcontracting approach. Jacobs Comments & Supp. Protest at 29; Jacobs Supp. Comments at 12-14. Finally, both protesters assert that the best-value tradeoff decision was improper. Protest at 34-37; Comments at 29-32; Jacobs Protest at 18-19; Jacobs Comments & Supp. Protest at 25-27.

As discussed below, we find the Air Force’s evaluation and consideration of PECPs to be unreasonable and inconsistent with the plain meaning of FAR provision 52.222-46. In recompetitions like the procurement here, FAR provision 52.222-46 requires the agency to conduct a two-part evaluation: first, of how the proposed compensation compares to incumbent compensation, and second, of whether the proposed compensation is realistic.

First, we find that the Air Force did not reasonably determine whether offerors were proposing to compensate their professional employees at levels lower than those under the incumbent contract. The Air Force determined that it could conduct only a limited comparison of fully burdened labor rates from the incumbent contract to proposed fixed price rates for only a subset of the ISC 2.0 labor categories, and that such a comparison would not be meaningful. Notwithstanding the agency’s reservations, the Air Force proceeded with the comparison and relied on this admittedly flawed basis of comparison to determine that BAE’s proposed professional compensation was not lower than the professional compensation for the incumbent contract.

Second, we find that under the cost factor, the Air Force’s cost realism analysis did not compare costs on a common basis. Finally, we find that the Air Force departed from the solicitation’s requirement for nationally competitive compensation and misunderstood BAE’s proposed approach to recruiting and retaining personnel. We therefore sustain the protests on these bases.

Although we do not address all of the protesters’ arguments in this decision, we have reviewed each argument and find that only those discussed below provide a basis to sustain the protest.

Evaluation of Professional Employee Compensation and Cost Realism

Both Guidehouse and Jacobs argue that the Air Force failed to properly evaluate offerors’ total compensation plans and that the agency conducted a flawed cost realism evaluation of the realism of the proposed labor rates. Protest at 21‑33; Comments at 7‑29; Supp. Comments at 3-23; Jacobs Protest at 10-18; Jacobs Comments & Supp. Protest at 2‑25, 27-29; Jacobs Supp. Comments at 2-12.

Under the workforce management subfactor of the mission capability factor, the solicitation required each offeror to address an “approach to provide and manage a qualified, stable workforce.” AR, Tab 4c, RFP Sections L&M at 11. This approach needed to first address the offeror’s recruiting and hiring plan specific to the labor categories that the Air Force identified as high demand/low density (HD/LD).[3] Id. Next, offerors were required to describe their approach to workforce retention and submit PECPs “meeting the requirements of FAR 52.222-46.” Id. at 12. Compensation plans were to “include how compensation remains competitive when compared nationally and regionally, as well as other forms of compensation/motivation that will be used to recruit and retain high quality employees.” Id. Because the PECPs would be submitted as attachments to the Volume II, Mission Capability proposals, the RFP instructed offerors not to include any rates or cost information in the compensation plan. Id. Instead, the solicitation “required that salaries and fringe benefits be included in the labor rates provided in Volume III, Cost/Price [of the proposals].” Id.

For the cost volume submission, an offeror was required to provide, among other things, a completed LRM and a basis of estimate. Id. at 16-19. The Air Force provided the LRM as a spreadsheet with the solicitation, and it specified the labor categories, hours, other direct costs, and travel for each ATO.[4] AR, Tab 4g, LRM. To complete the LRM, offerors were to “populate the indicated pricing and cost cells specified in the LRM worksheets, [in accordance with the RFP’s instructions] and in the worksheets themselves.”[5] AR, Tab 4c, RFP Sections L&M at 17. The cost cells for each labor category were direct labor rate, fringe benefits, overhead, general and administrative burden, and cost of money. AR, Tab 4g, LRM. The LRM also included a separate worksheet that offerors were to populate with a fixed price rate for each labor category; the agency expected the fixed price rates to be the same as the fully burdened rates for the CPAF CLINs. AR, Tab 4c, RFP Sections L&M at 18.

For the basis of estimate, each offeror was required to “provide the basis of proposed direct and indirect labor rates, e.g. actuals/current individual salaries, Forward Pricing Rate Agreement (FPRA), Forward Pricing Rate Proposal/Submission (FPRP/S), labor survey, etc.” AR, Tab 4c, RFP Sections L&M at 19. The RFP also required a breakdown of each “base labor rate for each labor category, a breakout of all indirect rates applied, and fee.” Id.

According to the RFP, under the workforce management subfactor, the agency’s evaluation would include reviewing the PECP “and supporting information . . . per FAR 52.222-46 to assure it reflects a sound management approach and understanding of the contract requirements.” Id. at 32. The solicitation warned that “[p]rofessional compensation that is unrealistically low or not in reasonable relationship to the various job categories . . . may be viewed as evidence of failure to comprehend the complexity of the contract requirements.” Id. at 32-33. In addition, the RFP explained that “professional compensation proposed will be considered in terms of its impact upon recruiting and retention, and its consistency with a total plan for compensation” by comparing direct labor rates to current market rates and comparing the fringe benefit rate to the current market. Id. at 33.

Under the cost factor, the solicitation committed the Air Force to evaluating “the realism of each Offeror’s proposed labor rates for all Cost CLINs.” Id. at 34. The RFP continued that the evaluation would “include assessment of the extent to which proposed labor rate costs are in line with historical data, market rates,” forward pricing rates, “etc., and the extent to which ATO costs are sufficient for the work to be performed.” Id. at 35. In addition, the RFP warned “that unrealistically low rates may result in an Offeror’s proposal being removed from consideration of an award.” Id.

Comparison of Professional Employee Compensation to Incumbent Compensation Rates

Guidehouse and Jacobs first argue that the Air Force’s evaluation of PECPs was inadequate because the Air Force did not assess whether offerors were proposing compensation levels lower than those under the predecessor contract, ISC 1.0. Protest at 24-25, 28-33; Comments at 8-13; Jacobs Protest at 10-13; Jacobs Comments & Supp. Protest at 22-25. The agency responds that its evaluation was reasonable, asserting that it performed the best possible comparison with the available information. MOL at 14-19; Supp. MOL at 6-9; Jacobs MOL at 15-23.

The purpose of FAR provision 52.222-46 is to evaluate whether offerors will obtain and keep the quality of professional services needed for adequate contract performance, and to evaluate whether offerors understand the nature of the work to be performed. Obsidian Sols. Grp., LLC, B-416343, B-416343.3, Aug. 8, 2018, 2018 CPD ¶ 274 at 7. The provision requires that the agency evaluate an offeror’s total compensation plan (salaries and fringe benefits) by considering its impact on recruiting and retention, its realism, and its consistency with a total plan for compensation. FAR 52.222-46(a).

Our decisions have found that where requirements are performed under an existing contract and then recompeted, such as here, FAR provision 52.222-46(b)[6] requires an agency to determine whether a proposal “envision[s] compensation levels lower than those of predecessor contractors” by comparing proposed compensation rates to those of the incumbent. See, e.g., SURVICE Eng’g Co., LLC, B-414519, July 5, 2017, 2017 CPD ¶ 237 at 6-7. If the agency determines the awardee’s proposal envisions lower compensation levels compared to the incumbent contractor, then the agency must further evaluate the awardee’s proposed compensation plan on the basis of maintaining program continuity, among other considerations. Id.

This provision therefore requires, as a threshold matter, that an agency compare the incumbent professional compensation to proposed professional compensation. This comparison allows the agency to determine whether a proposal “envision[s] compensation levels lower than those of [the] predecessor contractor” and should accordingly be subject to additional evaluation. FAR 52.222-46(b). Our Office has sustained protests when agencies failed to conduct such a comparison. SURVICE Eng’g Co., LLC, supra at 7 (sustaining protest where solicitation incorporated FAR provision 52.222-46 and the agency failed to compare compensation plans of the awardee and incumbent contractor); see also Wackenhut Int’l, Inc., B-286193, Dec. 11, 2000, 2001 CPD ¶ 8 at 7.

Here, the record reflects that the Air Force created a PECP “evaluation tool” that included a worksheet designed to allow the Air Force to compare proposed compensation to compensation from ISC 1.0, the incumbent contract. AR, Tab 10a17, PECP Evaluation Tool Development Process at 3-4. According to the Air Force, however, because ISC 1.0 was a fixed-price-level-of-effort (FPLOE) type contract, it could obtain only the “fixed price labor matrix for the ISC 1.0 contract,” which includes “fully loaded rates[.]” MOL at 6-7.[7] In other words, the agency did not have the actual compensation levels for the incumbent contract; it had only the fully loaded labor rates the contractor charged the government.

After “degrouping” or decoupling some labor categories from each other in the ISC 1.0 labor matrix, the Air Force then compared the resulting labor category titles to those in the solicitation, identifying 94 matching labor category titles between ISC 1.0 and ISC 2.0. Id. at 7. When comparing “the educational and experience requirements of these 94 matches,” the Air Force found that ISC 2.0 generally required “higher levels of education and greater experience,” such that only 18 labor categories were still identified as “matches.” Id. at 7.

Based on this labor category analysis and the difference in contract type between ISC 1.0 and ISC 2.0, the Air Force determined that “[d]irect [l]abor on the ISC 1.0 contract was not reasonably and readily available for comparative evaluation,” AR, Tab 14a, SSEB Evaluation at 49, and “a predecessor rate analysis could not be made from equal points of comparison.” MOL at 6. The agency therefore concluded that such a comparison would not be meaningful. AR, Tab 10a17, PECP Evaluation Tool Development Process at 4. In the SSEB report, the evaluators stated that the agency attempted the comparison but “determined that the results did not provide any indication of an Offeror[’s] capability for ‘maintaining program continuity, uninterrupted high-quality work, and availability of required competent professional service employees.’” AR, Tab 14e, SSEB Evaluation, attach. E at 10-11 (quoting FAR 52.222-46). The Air Force, however, continued with the comparison.

Specifically, although the Air Force ultimately identified only 18 labor categories between ISC 1.0 and ISC 2.0 that it could “match” based on title as well as education and experience requirements, the agency performed a “top level evaluation” for the 94 labor categories matched based on labor category titles alone. AR, Tab 10a17, PECP Evaluation Tool Development Process at 3-4; AR, Tab 11a.4, Guidehouse PECP Rate Evaluation Tool at FPR Predecessor Rate Eval Tab. For those 94 labor categories, the agency compared each offeror’s proposed fixed-price rates to the FPLOE rates from ISC 1.0, and then averaged the percentage differences for an “aggregate average” difference for each offeror. AR, Tab 10a17, PECP Evaluation Tool Development Process at 3-4; AR, Tab 11a.4, Guidehouse PECP Rate Evaluation Tool at FPR Predecessor Rate Eval Tab. The evaluators found, for example, that the awardee, BAE, “was proposing on average [DELETED] percent higher FFP [firm-fixed-price] most comparable rates than the ‘predecessor contract’ FPLOE rates.” AR, Tab 14a, SSEB Evaluation at 49. As a result, the evaluators concluded that BAE’s “rates proposed do not indicate lack of sound management judgment, nor a lack of understanding of the requirement, and are, therefore acceptable.” Id.

While an agency has the discretion to decide upon an appropriate and reasonable method for the evaluation of proposals, an agency may not use an evaluation method that produces a misleading result. 6K Sys., Inc.--Protest and Costs, B-408124.3, B‑408124.4, Dec. 9, 2013, 2014 CPD ¶ 347 at 10. We find that the agency’s approach here produced a misleading result, in that the agency simultaneously concluded that a comparison to the FPLOE labor rates from the incumbent contract was not helpful in determining if offerors were paying less compensation than the incumbent contract, but, at the same time, relied on that comparison to conclude that the awardee was not proposing lower compensation for essentially the same work under the proposed contract.

Setting aside the flawed basis of evaluation discussed above, in response to the protest, the Air Force alternatively asserts that it nonetheless met its obligation to evaluate professional compensation because it developed, instead, “reasonable alternatives against which to evaluate” the compensation rates. MOL at 19. Specifically, the agency maintains that it reasonably evaluated offerors’ professional compensation by comparing the compensation to market compensation rates. In this connection, the agency asserts that “GAO does not require an agency to compare proposed rates to predecessor rates where predecessor rates are not available or where predecessor rates do not offer a meaningful point of comparison because they are fully burdened.” Id. at 16 (citing Obsidian Sols. Grp., LLC, supra at 9; Target Media Mid Atlantic, Inc., B-412468.8, June 27, 2017, 2017 CPD ¶ 208 at 6).

In certain circumstances, our Office has found reasonable an agency’s thorough evaluation of offerors’ PECPs even when an agency did not compare proposed compensation to incumbent compensation. See Systems Implementers, Inc.; Transcend Tech. Sys., LLC, B-418963.5 et al., June 1, 2022, 2022 CPD ¶ 138 at 25-26. In Systems Implementers, we agreed with the agency, finding that the agency did not have sufficient information to conduct a meaningful comparison of offerors’ proposed compensation to compensation of the incumbent contractor. There, “[i]n light of the unreliability of the information available to the agency about the incumbent contract’s compensation rates,” we found unobjectionable the agency’s comparison of offerors’ proposed rates to a salary baseline that the agency generated using market survey data. Id. at 25. In other words, when a traditional comparison between proposed compensation and incumbent compensation is not possible or meaningful, the agency may instead use other measures to assess how compensation compares.

For the reasons discussed below, however, we do not find that the Air Force used a reasonable alternative method to evaluate offerors’ professional compensation here. Specifically, the agency’s comparison of proposed direct labor rates and fringe benefit rates to the market rates developed by the agency was unreasonable and inconsistent with the solicitation. As a result, the record does not demonstrate that the Air Force reasonably determined whether proposed compensation is lower than incumbent compensation. Accordingly, we find that the agency failed to reasonably evaluate whether BAE offered “lowered compensation for essentially the same professional work” as envisioned by FAR provision 52.222-46. See The Bionetics Corp., B-419727, July 13, 2021, 2021 CPD ¶ 259 at 7-8.

Agency Alternatives to Comparison of Proposed Compensation--Market Rates

The agency contends that, through its cost realism evaluation, it developed “reasonable alternatives” (i.e., market rates) to incumbent contract data “against which to evaluate” the compensation rates and otherwise assessed whether the proposed compensation was realistic, meeting its obligations under both FAR provision 52.222-46 and the solicitation. MOL at 19. The protesters, however, challenge the agency’s underlying cost realism evaluation, and thereby the agency’s “reasonable alternatives” to evaluation of professional compensation. Protest at 25-26; Comments at 13-29. The agency responds that its cost “realism analysis was rationally performed using recognized market data and adequately documented.” MOL at 20.

When an agency evaluates proposals for the award of a cost-reimbursement contract, it must perform a cost realism analysis to determine the extent to which an offeror’s proposed costs are realistic for the work to be performed. FAR 15.305(a)(1); FAR 15.404‑1(d); Nat’l Gov’t Servs., Inc., B-412142, Dec. 30, 2015, 2016 CPD ¶ 8 at 8. In conducting such a cost realism analysis, an agency is not required to conduct an in-depth cost analysis, or to verify each and every item in assessing cost realism; rather, the evaluation requires the exercise of informed judgment by the contracting agency. Cascade Gen., Inc., B-283872, Jan. 18, 2000, 2000 CPD ¶ 14 at 8. Additionally, an agency’s cost realism analysis need not achieve scientific certainty; rather, the methodology employed must be reasonable and realistic in view of other cost information reasonably available to the agency as of the time of its evaluation. CSI, Inc.; Visual Awareness Techs. & Consulting, Inc., B-407332.5 et al., Jan. 12, 2015, 2015 CPD ¶ 35 at 6. Our review of an agency’s cost realism evaluation is limited to determining whether the cost analysis is reasonably based and not arbitrary. Jacobs COGEMA, LLC, B-290125.2, B-290125.3, Dec. 18, 2002, 2003 CPD ¶ 16 at 26.

Here, the agency’s cost realism analysis began with its PECP evaluation, which the agency explained “supported realism, but was not the sole determination of realism.” AR, Tab 14a, SSEB Evaluation at 75-76. The record reflects that the agency, for evaluation purposes, created its own market rates for direct labor (for each labor category) and for a fringe benefit rate (overall) for use in the evaluation. AR, Tab 10a.17, PECP Evaluation Tool Development Process at 1-3. For the direct labor market rate, the agency gathered salary data from the Bureau of Labor Statistics (BLS) and salary.com for both the Ogden, Utah region where Hill Air Force Base is located and at the national level. Id. at 1-2. The agency then averaged the BLS data and the salary.com data to arrive at one regional and one national rate for each labor category. Id.

Next, the agency compared the direct labor rates from the offerors’ LRMs to the regional and national market salary rates that the agency had created. Id. at 2-3. According to the Air Force, the evaluators decided that any rate more than 20 percent below the agency’s national market rate and also more than 20 percent below the agency’s regional market rate would be “flagged unacceptable” and raised in discussions with the offeror.[8] Id. at 2-3. Ultimately, the Air Force concluded that all the direct labor rates proposed by the three offerors were realistic, because no proposed rate was more than 20 percent below the agency’s national and regional market rates. AR, Tab 14a, SSEB Evaluation at 49, 75-76 (BAE); id. at 121, 143 (Guidehouse); Jacobs AR, Tab 14a, SSEB Evaluation at 190, 218 (Jacobs).

Similarly, the agency also created its own market rate for fringe benefits by averaging fringe benefit rates for management professionals from BLS and for civilian personnel from the Department of Defense. AR, Tab 10a17, PECP Evaluation Tool Development Process at 3. The agency then compared this market rate to offerors’ fringe benefit rates to identify any proposed rate more than 20 percent below the agency’s market rate. Id.

Market Rate Analysis Designed to Discern Realism

The protesters first argue that this market rate analysis did not provide a reasonable substitute for the comparison to incumbent compensation because the agency’s analysis was designed for the purpose of its cost realism evaluation, not a professional compensation evaluation under FAR provision 52.222-46. Specifically, the agency’s market rate analysis deemed realistic any direct labor rate and fringe benefit rate that was up to 20 percent less than the agency’s market rates. Protest at 25-26; Jacobs Protest at 13‑18. In other words, the agency’s analysis was narrowly focused on identifying exceedingly low rates.

As we have explained, in recompetitions like the procurement here--where there is a requirement to evaluate PECP--the agency has an obligation to determine whether the compensation proposed is lower than incumbent compensation. The Bionetics Corp., supra at 7-8. In circumstances where the agency did not have access to the incumbent’s salary and fringe benefit information, we have found evaluations reasonable where an agency developed its own baseline against which to evaluate proposed professional compensation, such as market rates or the standard pay scale for United States civilian government employees. See Systems Implementers, Inc., supra at 25-26; Obsidian Sols. Grp., LLC, supra at 9-10.

Here, the agency asserts that the market rates it developed for salaries and fringe benefits were its best estimate of incumbent compensation under the circumstances. See MOL at 18-19. The record, however, reflects that the agency did not use these rates to identify whether the offerors were proposing compensation less than the incumbent compensation, but instead whether any proposed rate was “unrealistically low” as compared to the market for similar professionals more generally. See AR, Tab 10a17, PECP Evaluation Tool Development Process at 2. Indeed, the agency only defends the analysis as a reasonable way to identify rates that were unrealistically low. See, e.g., Jacobs Supp. MOL at 6-7 (asserting that its approach was a reasonable “cost realism evaluation method”). Instead of attempting to identify whether any proposed rates were lower than the estimated market rates, which the agency has argued during the protest were a valid stand-in for incumbent compensation, the agency only flagged rates that were more than 20 percent lower. We therefore agree that the agency’s comparison did not in fact provide an assessment of whether offerors were proposing salaries or fringe benefits lower than the incumbent, as required by the solicitation.

Evaluation of Direct Labor Rates

The protesters also argue that the agency’s evaluation of BAE’s direct labor rates as if those rates reflected salaries was flawed. Comments at 16-22; Jacobs Comments & Supp. Protest at 2-10. As stated above, after developing its market rates, the agency continued its realism assessment by comparing offerors’ direct labor rates to the agency-developed market salary rates. The record reflects that both Guidehouse and Jacobs proposed blended direct labor rates in their LRMs that relied on each firm’s own direct labor rates and estimates of their subcontractors’ direct labor rates; whereas BAE proposed blended direct labor rates in its LRM that relied on the firm’s own direct labor rate and its subcontractors’ fully burdened labor rates. Comments at 16-22; Jacobs Comments & Supp. Protest at 2-10. The protesters contend that “BAE’s so called blended ‘direct’ rates were artificially inflated with fully burdened subcontractor rates,” and thereby rendered unreasonable the agency’s evaluation of those rates. Comments at 16-22; Jacobs Comments & Supp. Protest at 2-10.

In response, the agency contends that BAE’s approach was consistent with the solicitation’s instructions that a direct labor rate means “Direct Labor cost or direct subcontract cost per hour to Offeror[,]” which the agency argues meant that the rate should be a blend of the prime contractor’s unburdened labor rate and subcontractors’ burdened labor rates. Supp. MOL at 2-3 (citing AR, Tab 4g, LRM). Thus, according to the agency, any complaints by Guidehouse or Jacobs about the agency’s reliance on BAE’s burdened direct labor rate are an untimely challenge to the terms of the solicitation.

We need not decide if it was reasonable for BAE and the agency to understand that the direct labor rate might be a blend of a prime direct labor rate and a subcontractor burdened labor rate, because the ultimate issue is how the agency relied on direct labor rates in the cost realism evaluation. It is a fundamental principle of government procurement that a contracting agency must provide a common basis for competition and may not disparately evaluate offerors with regard to the same requirements. See, e.g., Lockheed Martin Info. Sys., B-292836 et al., Dec. 18, 2003, 2003 CPD ¶ 230 at 11‑12; Rockwell Elec. Com. Corp., B‑286201 et al., Dec. 14, 2000, 2001 CPD ¶ 65 at 5. While it is up to the agency to decide upon some appropriate and reasonable method for evaluating offerors’ costs, an agency may not use an evaluation method that produces a misleading result. See Bristol-Myers Squibb Co., B-294944.2, Jan. 18, 2005, 2005 CPD ¶ 16 at 4; AirTrak Travel et al., B-292101 et al., June 30, 2003, 2003 CPD ¶ 117 at 22. The method chosen must include some reasonable basis for evaluating or comparing the relative costs of proposals, so as to establish whether one offeror's proposal would be more or less costly than another’s. AirTrak Travel, supra.

Here, the Air Force’s comparison appears to have relied on the following: (1) BAE’s blended direct labor rate, which was at least partially burdened with subcontractor indirect costs; (2) unburdened direct labor rates from the agency’s market research of salaries; and (3) the protesters’ unburdened blended direct labor rates. For example, BAE proposed a partially burdened direct labor rate of $[DELETED] for Senior Systems Analyst. AR, Tab 6b6, BAE LRM at Cost Elements-KFS-OffB Tab, Row 137. According to Jacobs, adjusting BAE’s rate to remove a reasonable estimate of the amount attributable to subcontractor burden reduces BAE’s rate to $[DELETED]. Jacobs Comments & Supp. Protest at 9.

The Air Force’s regional and national market salary rates for Senior Systems Analyst were $[DELETED] and $[DELETED]. AR, Tab 10a4, Guidehouse PECP Evaluation Tool at Labor Category Balance Check Tab, Row 124. The protesters proposed unburdened direct labor rates of $[DELETED] (Jacobs) and $[DELETED] (Guidehouse). Jacobs AR, Tab 21.c5, Jacobs LRM at Cost Elements-KFS-OffB Tab, Row 137; AR, Tab 12b5, Guidehouse LRM at Cost Elements-KFS-OffB Tab, Row 137. In essence, the protesters contend that using the properly adjusted rates would have led the Air Force to conclude that BAE’s rates were too low. See Comments at 18-22; Jacobs Comments & Supp. Protest at 7‑10.

As the agency’s evaluation relies on a comparison between rates that contain subcontractor indirect costs and rates that do not, offerors were not evaluated for cost realism on a common basis. As a result, we cannot find that the agency’s method reasonably established whether offerors’ proposals would be realistic. AirTrak Travel, supra. Similarly, we cannot conclude that the agency’s evaluation of professional compensation was reasonable where the agency compared BAE’s partially burdened labor rates to the agency’s unburdened market rates. See, e.g., MicroTechnologies, LLC, B-413091, B-413091.2, Aug. 11, 2016, 2016 CPD ¶ 219 at 11‑12 (sustaining protest where agency compared burdened rates to unburdened market survey information).

Because the agency relied on the cost realism analysis to assess professional compensation, the protesters further argue that the agency’s cost realism evaluation of direct labor rates was not consistent with the solicitation’s stated basis for evaluating professional compensation. Protest at 25-26; Comments at 13-16; Jacobs Protest at 16; Jacobs Comments & Supp. Protest at 16-19. According to the protesters, the solicitation specified that proposed compensation should be competitive not only regionally, but also nationally, yet the agency only flagged, as unrealistic, those rates that fell below the 20 percent threshold for both markets, i.e., the agency did not flag proposed rates that fell below the 20 percent threshold for only one of the market rates. Protest at 25‑26; Comments at 13‑16; Jacobs Protest at 16; Jacobs Comments & Supp. Protest at 16-19.

The Air Force does not dispute that its analysis identified rates as too low only when they were below both the (higher) national rates and the (lower) regional rates. MOL at 24‑25. For example, BAE’s direct labor rate for junior configuration management engineer was $[DELETED], which was [DELETED] percent lower than the agency’s regional market average of $[DELETED] and [DELETED] percent lower than the agency’s national market average of $[DELETED]. See AR, Tab 10a4, Guidehouse PECP Evaluation Tool at FPR Evaluation 20% Low Tab, Row 89. The record confirms that even though this rate was more than 20 percent below the agency’s national market average, the rate was not flagged for further analysis or considered unrealistic because it was within 20 percent of the agency’s regional market average.[9] See Protest at 25-26; Comments at 13-16.

This evaluation approach was inconsistent with the terms of the solicitation, which stated that the agency would analyze the adequacy and realism of professional compensation to ensure it was competitive not only within the Utah region but also nationally. AR, Tab 4c, RFP Sections L&M at 11. This aspect of the analysis was therefore also unreasonable. See Valkyrie Enters., LLC, B-415633.3, July 11, 2019, 2019 CPD ¶ 255 at 9.

Evaluation of Narrative Professional Employee Compensation Plan

Finally, Guidehouse argues that the Air Force misread BAE’s proposal. According to Guidehouse, the agency wrongly concluded in its evaluation of BAE’s PECP that BAE proposed to pay all HD/LD personnel an average of [DELETED] more than non-HD/LD personnel, when in reality, BAE proposed to pay only a subset of HD/LD personnel the [DELETED] premium. Comments at 22-23. The intervenor acknowledges that BAE’s “[DELETED] higher average salary for HD/LD positions was proposed [for] those [positions] [DELETED]” rather than all HD/LD positions. Intervenor Supp. Comments at 12. The Air Force denies that it misunderstood BAE’s proposal, and asserts that, in any event, Guidehouse cannot establish prejudice from any potential misunderstanding in this regard, because “this would not change the Air Force’s determination that BAE’s labor rates were realistic.” Supp. MOL at 5.

The contemporaneous evaluation documents reflect that the agency found BAE’s PECP adequate, in part, based on a misunderstanding of BAE’s proposal. As explained above, the record reflects that BAE did not propose a premium for all HD/LD positions; the premium applied to only a subset of these positions. See AR, Tab 14a, SSEB Evaluation at 48. Nevertheless, before concluding that BAE’s proposed PECP “provides sufficient evidence of the Offeror’s ability to remain competitive nationally and regionally,” the evaluators described BAE’s salary structure as though it applied to all HD/LD positions. Specifically, the evaluation record states that “[i]n an effort to attract and retain key ICBM talent, [BAE] has tailored a compensation strategy specifically for HD/LD employees, offering an average of [DELETED] higher than non-HD/LD positions.” Id. at 48-49. We find this aspect of the agency’s evaluation to also be unreasonable.

Despite these findings, our Office will not sustain a protest unless the protester demonstrates a reasonable possibility that it was competitively prejudiced by the agency’s actions, that is, unless the protester demonstrates that, but for the agency’s actions, it would have had a substantial chance of receiving the award. Raytheon Co., B-409651, B‑409651.2, July 9, 2014, 2014 CPD ¶ 207 at 17. We resolve any doubts regarding prejudice in favor of a protester. Intelsat Gen. Corp., B-412097, B-412097.2, Dec. 23, 2015, 2016 CPD ¶ 30 at 19-20. Here, had the Air Force properly evaluated proposed compensation plans under FAR provision 52.222-46 and the cost realism of labor rates, it is possible that the agency may have found sufficient risk in BAE’s proposal to result in Guidehouse’s or Jacobs’s proposal being the best value to the government. Thus, we find that there is a reasonable possibility of prejudice to the protesters, and on this basis, we sustain the protests.

Alleged Subcontracting Risk

Jacobs also argues that the agency “did not properly consider the risk posed by BAE subcontracting over a quarter of its work.” Jacobs Comments & Supp. Protest at 29; Jacobs Supp. Comments at 12-13. According to Jacobs, the fact that “BAE proposed to subcontract 26 percent of the work to 19 separate subcontractors” created “coordination, integration, and efficiency risks, management risk, and cost risk” that should have resulted in the Air Force assessing a weakness in BAE’s proposal. Id. In Jacobs’s view, the weakness should have been assessed given that section L of the proposal preparation instructions required offerors to address how they will provide effective work processes. Specifically, Jacobs notes that for the operational capability subfactor, each offeror was to “describe the operational approach for providing the services and products delineated in the PWS” by addressing, among other things, the “[a]pproach for an integrated, effective, and efficient work process. . . . ” AR, Tab 4c, RFP Sections L&M at 11. Jacobs contends that the Air Force acted unreasonably by failing to assess risk in BAE’s proposal for relying--according to Jacobs--too heavily on subcontractors resulting in a lack of “accountability” and “transparency.” Jacobs Comments & Supp. Protest at 20; Jacobs Supp. Comments at 12-13.

The Air Force responds that the solicitation provided no basis to assign a weakness to BAE for proposing to subcontract a portion of the work. Jacobs Supp. MOL at 14-15. The Air Force notes that Jacobs also proposed to subcontract--in this instance, 15 percent of the work--such that Jacobs cannot establish that the agency’s evaluation was unreasonable or unfairly prejudiced Jacobs. Id.

In reviewing a protest challenging an agency’s evaluation, our Office will not reevaluate proposals nor substitute our judgment for that of the agency, as the evaluation of proposals is a matter within the agency’s discretion. Sterling Med. Assocs., B-418674, B-418674.2, July 23, 2020, 2020 CPD ¶ 255 at 4. Rather, we will review the record to determine whether the agency’s evaluation was reasonable and consistent with the stated evaluation criteria and with applicable procurement statutes and regulations. Id.; Arctic Slope Mission Servs. LLC, B-417244, Apr. 8, 2019, 2019 CPD ¶ 140 at 8. A protester’s disagreement with the agency’s evaluation judgments, without more, does not render those judgments unreasonable. Id.; Serco Inc., B-407797.3, B-407797.4, Nov. 8, 2013, 2013 CPD ¶ 264 at 8.

Here, Jacobs’s contention that the agency ignored BAE’s plan to rely on subcontractors is not supported by the record. Rather, the record shows that the agency understood and evaluated BAE’s proposal under operational approach--including BAE’s approach for an integrated, effective, and efficient work process--and we find nothing unreasonable with the agency’s assessment that there were no weaknesses and only low risk. See, e.g., Jacobs AR, Tab 14c, SSEB Comparative Analysis Report at 19. The protester’s arguments to the contrary represent nothing more than its disagreement with the evaluators’ judgments, which, without more, is insufficient to establish that the agency acted unreasonably. See, e.g., Serco Inc., supra at 9 (denying protest that agency failed to assess performance risk in awardee's proposal where it was clear from the record that the agency had considered the risk associated with the awardee's approach and concluded it was very low). Accordingly, this protest allegation is denied.

Best-Value Award Decision

The protesters also challenge the agency’s best-value determination. Guidehouse contends that the agency’s selection decision was unreasonable because it was “marred” by errors in the underlying evaluations. Comments at 29-30. Additionally, both protesters contend that the agency departed from the RFP’s stated evaluation criteria and failed to meaningfully consider their technical advantages. Comments at 30-32; Jacobs Comments & Supp. Protest at 25-27.

The Air Force maintains that the tradeoff analysis and best-value determination were well-documented and rational, and that the agency need only show that the SSA was aware of the relative merits and costs of competing proposals. MOL at 27-30; Jacobs MOL at 32-36. The agency notes that while source selection decisions must be documented, there is no requirement for extensive documentation of every consideration factored into a tradeoff decision. Id.

In reviewing an agency’s source selection decision, we examine the supporting record to determine if it was reasonable and consistent with the solicitation’s evaluation criteria and applicable procurement statutes and regulations. Technology Concepts & Design, Inc., B-403949.2, B-403949.3, Mar. 25, 2011, 2011 CPD ¶ 78 at 8. The evaluation of proposals and consideration of their relative merits should be based upon a qualitative assessment of proposals consistent with the solicitation’s evaluation scheme. NOVA Corp., B-408046, B-408046.2, June 4, 2013, 2013 CPD ¶ 127 at 5. Where a solicitation provides for a tradeoff between the cost and non-cost factors, an agency properly may select a lower-cost, lower-rated proposal if the agency reasonably concludes that the cost premium involved in selecting a higher-rated, higher-priced proposal is not justified in light of the acceptable level of technical competence available at a lower cost. i4 Now Sols., Inc., B-412369, Jan. 27, 2016, 2016 CPD ¶ 47 at 15. However, a tradeoff analysis that fails to furnish any explanation as to why a higher-rated proposal does not in fact offer technical advantages or why those technical advantages are not worth a cost premium does not satisfy the requirement for a documented tradeoff rationale, particularly where, as here, cost is secondary to technical considerations under the RFP’s evaluation scheme. Blue Rock Structures, Inc., B-293134, Feb. 6, 2004, 2004 CPD ¶ 63 at 6.

Here, the SSA began by discussing the evaluation of BAE, Guidehouse, and Jacobs under each evaluation subfactor, explaining that the SSA concurred with the findings and conclusions of the SSEB and SSAC. AR, Tab 16a, SSDD at 3-9. The source selection decision concludes with an “award determination” section that addresses the tradeoff between the cost and non-cost factors for the three proposals. Id. at 9. The tradeoff includes an acknowledgment that “Guidehouse offered the superior technical proposal overall, and the Jacobs’ technical approach was slightly better than BAE’s.” The rationale identified for why Guidehouse’s and Jacobs’s higher-rated proposals were not worth the cost premiums, however, is that the SSA considered the “proposals and the benefits of all the strengths assigned” and adopted the SSAC’s conclusion that:

[N]one of these strengths, either individually or in the aggregate, come close to providing hundreds of millions of value to the Government, let alone over $462M in the case of Jacobs or $808M in the case of Guidehouse, particularly when many of their technical advantages are unquantifiable and unknown since their ultimate value will depend on the frequency of new efforts to sustain the MMIII weapon system during the performance of this contract.

Id. at 9. Although the agency is correct that the SSA need not “monetize” or assign specific dollar values to strengths, MOL at 29, we have also found that general statements, without any meaningful comparison of the proposals, are insufficient to justify cost/technical tradeoff decisions. ManTech Advanced Sys. Int’l Inc., B-415497, Jan. 18, 2018, 2018 CPD ¶ 60 at 6; see also VariQ Corp., B-414650.11, B-414650.15, May 30, 2018, 2018 CPD ¶ 199 at 11 (sustaining protest where the record did not include a meaningful explanation for the source selection authority’s determination that certain strengths were more substantial). Additionally, even where the source selection document contains summaries of the strengths and weaknesses of the proposals, our Office will sustain a protest where the record does not reflect a qualitative comparison of those strengths and weaknesses. West Coast Gen. Corp., B‑411916.2, Dec. 14, 2015, 2015 CPD ¶ 392 at 12.

The Air Force asserts that the SSA did explain the analysis by characterizing the benefits of Guidehouse’s and Jacobs’s proposals as uncertain under the circumstances here. MOL at 28-29; Jacobs MOL at 34-35. Specifically, the Air Force asserts that it awarded this contract as an IDIQ vehicle because its needs are uncertain given that the “MMIII weapon system is aging and the Air Force’s capabilities are being transitioned to the new GBSD, Sentinel weapon system, which has not yet been fully developed.” Id. Agencies are, however, still obligated to adequately document rational bases for their source selection decisions when awarding IDIQ contracts. See Qbase, LLC et al., B‑416377.9 et al., Nov. 13, 2020, 2020 CPD ¶ 367 at 14-19; Novetta, Inc., B‑414672.4, B-414672.7, Oct. 9, 2018, 2018 CPD ¶ 349 at 23‑25.

In light of our determination that certain aspects of the evaluation of professional employee compensation and cost realism were not reasonable, and our corresponding recommendations, we need not determine whether the agency departed from the solicitation’s stated evaluation scheme when making its best-value tradeoff. Innovative Test Asset Sols., LLC, B-411687, B‑411687.2, Oct. 2, 2015, 2016 CPD ¶ 68 at 19 n.26. The agency, however, may want to consider this discussion when performing a new best-value tradeoff consistent with our recommendation below.

RECOMMENDATION

As detailed above, we find the agency’s evaluation of professional employee compensation and cost realism to be unreasonable in certain regards. We recommend that the Air Force reevaluate proposals, consistent with this decision, ensuring that the evaluation is performed on a common and consistent basis for all offerors. We also recommend that the Air Force perform a new best-value tradeoff and make a new source selection decision. In the event the reevaluation results in the selection of an offeror other than BAE, we recommend that the agency terminate the contract awarded to BAE for the convenience of the government and award the contract to the offeror found to represent the best value, if otherwise proper. We also recommend that Guidehouse and Jacobs be reimbursed the costs of filing and pursuing their protests, including reasonable attorneys’ fees. 4 C.F.R. § 21.8(d)(1). Guidehouse and Jacobs should submit their certified claims for costs, detailing the time expended and costs incurred, directly to the contracting agency within 60 days after receipt of this decision. 4 C.F.R. § 21.8(f)(1).

The protests are sustained.

Edda Emmanuelli Perez
General Counsel

 

[1] The protests were developed separately. For ease of reference, citations are to the filings and record in Guidehouse LLP, B-420860.1, except where identified as Jacobs. Also, references to page numbers throughout the decision are to the sequential pagination numbering provided by the agency in its report to our Office.

[2] The adjectival technical ratings were blue/outstanding, purple/good, green/acceptable, yellow/marginal, and red/unacceptable. AR, Tab 4c, RFP Sections L&M at 29. The adjectival technical risk ratings were low, moderate, high, and unacceptable. Id. at 30.

[3] The RFP included an attachment, listing a subset of the labor categories identified in the labor rate matrix (LRM) with “limited labor pools, which makes recruiting and hiring fully qualified personnel challenging,” i.e., HD/LD labor categories. Id.

[4] As discussed above, the Air Force anticipated awarding five individual ATOs with the basic contract. AR, Tab 4c, RFP Sections L&M at 1. The solicitation provided that the agency would rely on the ATOs to calculate the total evaluated cost for each offeror. Id. at 17.

[5] The LRM also included worksheets for the total evaluated costs and for each ATO that did not require any input from offerors; instead, the worksheets would “auto-populate” from the “cost elements worksheets” and from the award fee and indirect rates worksheet. Id. at 17-19.

[6] This provision provides in relevant part:

(b) . . . Additionally, proposals envisioning compensation levels lower than those of predecessor contractors for the same work will be evaluated on the basis of maintaining program continuity, uninterrupted high-quality work, and availability of required competent professional service employees. Offerors are cautioned that lowered compensation for essentially the same professional work may indicate lack of sound management judgment and lack of understanding of the requirement.

FAR 52.222-46(b).

[7] The matrix specified that the rates were “fully loaded rates and include[d] all applicable direct labor, indirect labor, general & administrative (G&A), facilities capital cost of money (FCCOM) costs, and profit[,]” and did not provide any breakdown. MOL at 6.

[8] The protesters also challenge the agency’s selection of a 20 percent threshold for realism and selection of salary survey data. See, e.g., Jacobs Comments & Supp. Protest at 10-16. The record shows, however, that the agency reasonably compared proposed costs against BLS and salary.com survey data available as of the date proposals were due, and adjusted costs using independent judgment. On this record, we do not find a basis within these challenges to sustain the protest. See Sabre Sys., Inc., B-420090.3, June 1, 2022, 2022 CPD ¶ 137 at 3.

[9] The contemporaneous record reflects that the agency intended to flag any rates that were equal to 20 percent (or more) below the regional and national rates, but ultimately flagged only rates that were more than 20 percent below (i.e., the agency did not flag rates that its calculation showed were equal to 20 percent). See AR, Tab 10a17, PECP Evaluation Tool Development Process at 2-3; Jacobs Supp. COS at 11-12. Jacobs asserts that this error provides an independent basis to sustain its protest, identifying five labor categories that were not flagged or raised in discussions with BAE because they were only (exactly) 20 percent lower than the regional market rate. Jacobs Comments & Supp. Protest at 19-20. The agency concedes that it made a calculation error, but asserts that the protest argument should be denied because the 20 percent threshold was not specified in the solicitation and the effect was negligible. Jacobs Supp. MOL at 7-8. Because of our analysis of the agency’s PECP and cost realism overall, we need not determine whether this error was prejudicial here, but instead identify this concern so the agency may consider it during its reevaluation.
