Booz Allen Hamilton, Inc.

B-409355; B-409355.2; Mar 19, 2014

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: Booz Allen Hamilton, Inc.

File: B-409355; B-409355.2

Date: March 19, 2014

Mark D. Colley, Esq., Kristen E. Ittig, Esq., Dominique L. Casimir, Esq., and Dana E. Peterson, Esq., Arnold & Porter LLP, for the protester.
Robert J. Symon, Esq., Daniel P. Golden, Esq., and Aron C. Beezley, Esq., Bradley Arant Boult Cummings LLP, for Jacobs Technology Inc., the intervenor.
Andre Long, Esq., Department of the Navy, for the agency.
Jennifer D. Westfall-McGrail, Esq., and Edward Goldstein, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1. Protest that in its past performance evaluation agency improperly considered only most recent annual rating of awardee’s past performance on very relevant contract is denied where solicitation did not require review of performance over more extended period.

2. Protest arguments that (1) adjustments to awardee’s proposed costs were indicative of a lack of understanding, and (2) awardee’s proposed compensation plan posed a risk to successful program execution are denied where record demonstrates that evaluators had reasonable basis for concluding otherwise.

DECISION

Booz Allen Hamilton Inc. (BAH), of Ridgecrest, California, protests the issuance of a task order to Jacobs Technology Inc. (Jacobs), of Fort Walton, Florida, pursuant to request for proposals (RFP) No. N68936-11-R-0056, issued by the Department of the Navy, Naval Air Warfare Center Weapons Division (NAWCWD) for information technology systems and services, primarily at China Lake and Point Mugu, California. The protester challenges the agency’s evaluation of proposals.

We deny the protest.

BACKGROUND

The RFP was issued on June 28, 2013 to firms holding contracts under the General Services Administration’s Alliant government-wide acquisition contract.[1] The solicitation contemplated the issuance of a single cost-plus-fixed-fee/level-of-effort order for a base period of one year, with four 1-year options. The solicitation advised that the order would be issued on a best-value basis, considering technical, past performance, and cost/price factors, with technical and past performance of equal weight, and, when combined, of significantly greater importance than cost/price.

Under the technical factor, the RFP provided for consideration of the following four criteria (the first and third of which are of particular relevance to this protest): (1) workforce; (2) understanding of the work; (3) management plan; and (4) software development approach. With regard to workforce, offerors were to provide information regarding the education, experience, and security clearance of each proposed employee; describe the labor categories they intended to use; submit resumes for key personnel; and provide a workforce hour matrix by labor category. With regard to their management plans, offerors were to describe their overall management approach; use of subcontractors; procedures for qualifying, recruiting, and retaining personnel; and transition plans. Regarding the transition plans, the solicitation advised that “tasks under this solicitation are vital to the Government and must be continued without interruption,” and that “a smooth workplace changeover from an incumbent with no loss of service and minimal loss of corporate knowledge” was required. RFP, amend. 0001, at 136-137.

The solicitation provided that, under the technical factor, the agency would separately rate the quality of the offeror’s proposed solution and the risk associated with the approach.[2] As relevant here, a technical rating of outstanding reflected an exceptional approach and understanding of the requirements, with multiple strengths and no deficiencies, whereas a rating of good reflected a thorough approach and understanding of the requirements, with at least one strength and no deficiencies. Further, a risk rating of low reflected an approach with little potential to cause disruption of schedule, increased cost, or degradation of performance, whereas a risk rating of moderate reflected an approach that could potentially cause any of the above.

Under the past performance factor, the solicitation provided for consideration of the offeror’s team’s performance on similar efforts currently ongoing or completed within the five years prior to release of the RFP here. Past performance was first to be assessed for relevance; then, based on the quality of the recent and relevant performance, the agency would assign performance confidence ratings of substantial, satisfactory, limited, no, or unknown confidence.

Regarding cost, the RFP provided for a realism analysis of offerors’ proposed costs to determine whether they reflected an understanding of the solicitation’s requirements and consistency with the various elements of the offeror’s proposal. In this connection, the RFP provided that “[u]nrealistically low costs or inconsistencies between the technical and cost proposals may be assessed as proposal risk and could be considered weaknesses under the technical factor.” Id. at 149. The solicitation further provided that if proposed costs, including labor and/or indirect rates, were considered unrealistic, they might be adjusted upward to reflect more realistic costs.

The agency received five proposals prior to the August 20 closing date. After each member of the agency’s technical evaluation team individually reviewed each proposal, the team members gathered as a group to decide on consensus findings and ratings. A cost evaluation team separately analyzed the realism of offerors’ cost proposals. Evaluation ratings and costs (both as proposed and as adjusted) were as follows:

Offeror     Technical               Past Performance        Proposed Cost   Adjusted Cost
Jacobs      Outstanding/Low Risk    Substantial Confidence  $64,384,082     $71,239,312
BAH         Good/Moderate Risk      Substantial Confidence  $71,799,102     $76,121,576
Offeror A   Good/Moderate Risk      Substantial Confidence  $83,346,693     $83,372,342
Offeror B   Marginal/Moderate Risk  Substantial Confidence  $72,418,625     $73,395,557
Offeror C   Marginal/Moderate Risk  Substantial Confidence  $79,199,542     $80,144,356

Technical and Past Performance Evaluation Consensus Report at 2.

The evaluators identified 16 strengths, 1 weakness (with low performance risk), and no significant weaknesses or deficiencies in Jacobs’ proposal. They identified 8 strengths, 3 weaknesses (one with low, and two with moderate, performance risk), and no significant weaknesses or deficiencies in the protester’s proposal. The weaknesses in BAH’s proposal were as follows:

  • Concerns surround the ability to fill billets and avoid a break in service. Of the [deleted] proposed personnel, [deleted] are prospective. Though the offeror plans to hire incumbent personnel upon award, the offeror failed to adequately address how they would mitigate this risk to ensure that a sufficient workforce is ready to hit the ground running on the effective date. For example, the offeror proposes to replace the [deleted] with their current employees. This is considered a moderate performance risk due to the loss of corporate knowledge.

  • The Workforce Qualification Spreadsheet (WQS) is inconsistent with what is stated on page 1 in the Technical Volume, Section 2.1 Workforce. On page 1 in the Technical Volume, the Offeror lists [deleted] incumbent personnel and [deleted] total personnel. The WQS lists [deleted] incumbent personnel and [deleted] total personnel. This risk is considered low because it has little potential to cause disruption of schedule, increase cost, or degradation of performance.

  • Enterprise Architect is proposed to be filled with a current [deleted] employee with a “Public Trust” security clearance level, which will not meet the requirements of the position and is therefore considered a weakness. For example, providing database design, architecture and migration requires a higher level clearance than “Public Trust.” This is considered to be a moderate risk because a person with a “Public Trust” security clearance level cannot be issued a CAC card, preventing access to information systems.
Id. at 14.

 

After considering the findings of the technical and cost evaluation teams, the source selection authority (SSA) selected Jacobs’ proposal as representing the best value to the government, noting that it had received superior technical and past performance ratings and that it had both the lowest proposed and the lowest evaluated cost. Source Selection Decision, Nov. 26, 2013, at 4. By letter dated December 5, the contracting officer notified the protester of Jacobs’ selection. After the agency furnished the protester with a requested debriefing, BAH protested to our Office.[3]

DISCUSSION

BAH raises a number of objections to the agency’s evaluation. The protester argues that the Navy’s evaluation of Jacobs’ past performance was unreasonable, and that the agency improperly failed to consider whether the adjustments made to Jacobs’ proposed costs were indicative of a lack of understanding. BAH further argues that some of the adjustments made to its own proposed costs as part of the agency’s cost realism analysis were unwarranted. BAH also challenges the agency’s evaluation of its own technical proposal, arguing that the findings of weakness were unreasonable and reflected unequal treatment as compared to the evaluation of Jacobs’ proposal; that it was entitled to three additional strengths for approaches for which Jacobs received credit and that were also found in BAH’s proposal; and that the evaluation record was inadequately documented. As discussed below, we find the protester’s complaints pertaining to the evaluation of Jacobs’ proposal and to the sufficiency of the evaluation record to be without merit. We also find that the protester suffered no prejudice as a result of the alleged errors in the evaluation of its own technical and cost proposals, and accordingly do not address its specific complaints pertaining to these areas.

Jacobs’ Past Performance

The evaluators rated Jacobs’ past performance (as well as the past performance of every other offeror) as meriting substantial confidence. The rating was based on CPARS (Contractor Performance Assessment Reporting System) records for eight contracts performed by Jacobs and one performed by a member of its team. One of the contracts was considered very relevant and five were considered relevant. For the very relevant contract, the predecessor contract to the one here, Jacobs’ performance was rated as very good for quality of product/service, schedule, cost control, business relations, and management of key personnel, and as exceptional for utilization of small business; for all of the relevant contracts, Jacobs (and its team member’s) performance was rated as exceptional in all applicable areas.

BAH’s objection to the evaluation of Jacobs’ past performance focuses on Jacobs’ performance under the predecessor contract. The protester argues that the agency evaluators unreasonably considered only the most recent CPARS record, which assessed Jacobs’ performance between January 31, 2012 and July 31, 2012. The protester maintains that the RFP provided for consideration of the preceding five years of performance, and that had the evaluators reviewed Jacobs’ performance over this full period, they would have found that Jacobs had encountered performance problems. In the foregoing connection, BAH submitted with its protest a declaration from one of its employees, who represented that he had repeatedly spoken with numerous NAWCWD employees who had worked with Jacobs on the predecessor contract, and that these individuals had “expressed their frustration” regarding Jacobs’ performance on the contract. Protest, Exh. G. The declarant went on to identify several specific issues pertaining to Jacobs’ performance of which he had allegedly been informed.

The evaluation of past performance is a matter within the discretion of the contracting agency. In reviewing an agency’s evaluation of past performance, we will not reevaluate proposals, but instead will examine the agency’s evaluation to ensure that it was reasonable and consistent with the solicitation. Maywood Closure Co., LLC, B-408343 et al., Aug. 23, 2013, 2013 CPD ¶ 199 at 5.

The protester’s argument that it was improper for the agency to consider only the most recent CPARS record pertaining to Jacobs’ performance is without merit. The solicitation provided that offerors’ past performance would be evaluated based on contracts or subcontracts for similar efforts that were ongoing or completed within the last five years, RFP, amend. 0001, at 148, not that the agency would consider the full five years of performance. Moreover, the record shows that the agency was consistent in its approach of using the most recent annual report in the CPARS in its evaluation of offerors’ past performance; that is, there is no evidence that the evaluators deviated from their general approach to evaluating offeror past performance in evaluating Jacobs’ performance under the contract in question. With regard to the substance of the BAH declarant’s allegations, the agency submitted declarations from two cognizant government employees, the head of the NAWCWD Information Technology/Information Management (IT/IM) Department and the contracting officer’s representative for the NAWCWD IT/IM Department contract, both of whom expressed disagreement with the BAH declarant’s allegations pertaining to issues with Jacobs’ performance. The contracting officer’s representative noted, for example, that “[t]he official evaluation/assessment records on the IT/IM contract with Jacobs reflect very good to outstanding scores, including IA related tasks, which are based on the evaluation feedback/input from the Government subject matter experts.” Agency Report, Exh. 19. Since the record supports neither the protester’s argument that the agency improperly limited the scope of its review of Jacobs’ performance on the predecessor contract, nor its argument that agency officials have been dissatisfied with Jacobs’ performance, we deny this basis for protest.

Cost Realism Evaluation of Jacobs

Next, BAH argues that the agency improperly failed to consider whether the adjustments made to Jacobs’ proposed costs were indicative of a lack of understanding. In this connection (and as previously noted), the RFP here provided that “[u]nrealistically low costs or inconsistencies between the technical and cost proposals may be assessed as proposal risk and could be considered weaknesses under the technical factor.” RFP, amend. 0001, at 149. In a related vein, the protester argues that the evaluators unreasonably failed to identify Jacobs’ proposed direct labor rates, which are on average slightly lower than the rates Jacobs is currently paying its employees under the incumbent contract, as posing a risk to successful contract performance.

When an agency intends to award a cost-reimbursement contract, a cost realism analysis must be performed to determine the extent to which the offeror’s proposed costs represent what the costs are likely to be, given the offeror’s unique technical approach. This analysis includes independently reviewing and evaluating specific elements of each offeror’s cost estimate to determine whether the estimated proposed cost elements are realistic for the work to be performed, reflect a clear understanding of the requirements, and are consistent with the unique technical approach described in the offeror’s proposal. Federal Acquisition Regulation (FAR) § 15.404-1(d)(1); Advanced Commc’n Sys., Inc., B-283650 et al., Dec. 16, 1999, 2000 CPD ¶ 3 at 5. Based on the results of the cost realism analysis, an offeror’s proposed costs should be adjusted when appropriate. FAR § 15.404‑1(d)(2)(ii). Our review of an agency’s cost realism evaluation is limited to determining whether the cost analysis is reasonably based and adequately documented. Metro Mach. Corp., B-402567, B‑402567.2, June 3, 2010, 2010 CPD ¶ 132 at 6.

The record fails to support the protester’s first allegation. The mere fact that the evaluators did not find the adjustments to Jacobs’ proposed costs to be indicative of a lack of understanding does not demonstrate that they failed to consider that possibility. According to the cost realism report for Jacobs, the agency’s cost analysis included consideration of whether Jacobs’ proposed costs reflected an understanding of the requirements. Jacobs Cost Realism Report at 11. Furthermore, the record demonstrates that the evaluators had a reasonable basis for not regarding the understatement of costs as indicative of a lack of understanding. In this connection, the cost adjustments were made to account for potential cost overruns associated with [deleted] proposed by Jacobs. The SSA summarized the basis for the adjustment to Jacobs’ proposed costs as follows:

Cost realism analysis determined that Jacobs’ proposal was understated. Jacobs’s proposal ($64,384,082) intends to establish a [deleted] that will significantly lower Jacobs’s indirect cost structure. The proposal contained all necessary information to establish the [deleted]; however, the cost evaluation team utilized a conservative approach to establish Jacobs’s evaluated cost/price ($71,239,312) based on worst case scenarios regarding escalation and other subjective aspects of their proposed rates.

 

Source Selection Decision at 3-4. Nothing in the foregoing summary suggests that the adjustment in question concerned Jacobs’ understanding of the agency’s technical requirements.

Regarding the protester’s second argument, in analyzing the realism of Jacobs’ proposed costs, the cost evaluators noted that Jacobs had proposed an average salary reduction of [deleted] affecting approximately [deleted] of the current workforce. Jacobs Cost Realism Report at 3. The evaluators further noted that even after the proposed reductions, the wage rates for all employees covered by the Service Contract Act were in accordance with the applicable wage determination, and the rates for all professional employees exceeded the 10th percentile rates for similar positions provided by Salary.com. Id. at 3-4. The evaluators acknowledged that Jacobs’ plan to reduce wage rates raised concern, but concluded as follows:

While the overall rate decreases do cause concern regarding employee retention, the offeror has specified . . . that the wages for [deleted] will not be decreased. The technical evaluation team assessed the decreases and has determined that they pose little risk to successful program execution overall. Therefore, based on this analysis and given the prevailing economic realities and budget pressures in the Government sector, Jacobs’ proposed direct labor rates are considered realistic, prior to application of escalation . . . .

 

Id. at 4. While the protester takes issue with the evaluators’ conclusion that reducing the wages of [deleted] will have little negative impact on successful program execution, it has not demonstrated that the evaluators’ judgment was unreasonable. In this connection, a protester’s mere disagreement with the agency’s conclusions does not demonstrate that an evaluation was unreasonable. Visual Connections, LLC, B-407625, Dec. 31, 2012, 2013 CPD ¶ 18 at 4.[4]

Adequacy of Evaluation Record

BAH also argues that the evaluation record is inadequately documented in that it provides no explanation for why several weaknesses and a deficiency in Jacobs’ proposal identified by the evaluators on their individual evaluation worksheets did not make their way into the consensus evaluation report.[5]

We recognize that it is not unusual for individual evaluator ratings to differ from one another, or from the consensus ratings eventually assigned. SRA International, Inc., B-407709.5, B-407709.6, Dec. 3, 2013, 2013 CPD ¶ 281 at 10-11; Systems Research and Applications Corp.; Booz Allen Hamilton, Inc., supra, at 18. Indeed, the reconciling of such differences among evaluators’ viewpoints is the ultimate purpose of a consensus evaluation. J5 Sys., Inc., B‑406800, Aug. 31, 2012, 2012 CPD ¶ 252 at 13; Hi-Tec Sys., Inc., B-402590, B-402590.2, June 7, 2010, 2010 CPD ¶ 156 at 5. Likewise, we are unaware of any requirement that every individual evaluator’s scoring sheet track the final evaluation report, or that the evaluation record document the various changes in evaluators’ viewpoints. J5 Sys., Inc., supra, at 13 n.15; see Smart Innovative Solutions, B-400323.3, Nov. 19, 2008, 2008 CPD ¶ 220 at 3. The overriding concern for our purposes is not whether an agency’s final evaluation conclusions are consistent with earlier evaluation conclusions (individual or group), but whether they are reasonable and consistent with the stated evaluation criteria, and reasonably reflect the relative merits of the proposals. See, e.g., URS Fed. Tech. Servs., Inc., B-405922.2, B-405922.3, May 9, 2012, 2012 CPD ¶ 155 at 9 (a consensus rating need not be the same as the rating initially assigned by the individual evaluators); J5 Sys., Inc., supra, at 13; Naiad Inflatables of Newport, B-405221, Sept. 19, 2011, 2012 CPD ¶ 37 at 11.

Here, we find nothing unreasonable in the existence of differences between the evaluators’ preliminary findings and the final consensus evaluation findings of Jacobs’ proposal. In performing its evaluation of offerors’ proposals, an agency commonly relies upon multiple evaluators who often perform individual assessments before the evaluation team reaches consensus as to the evaluation findings. SRA International, Inc., supra. In doing so, it is not uncommon for the final group evaluation to differ from individual evaluator findings. As noted above, there is simply no requirement that agencies document why evaluation judgments changed during the course of the evaluation process. Rather, agencies are required to adequately document the final evaluation conclusions on which their source selection decision was based, and we review the record to determine the rationality of the final evaluation conclusions. Id. In sum, we will not find an evaluation record to be inadequately documented simply because it does not explain how all differences of opinion among the individual evaluators were resolved.

Evaluation of BAH’s Proposal

Finally, the protester takes issue with the agency’s technical evaluation and realism analysis of its own proposal, arguing that its proposal, like Jacobs’, should have received a technical rating of outstanding and that one of the upward adjustments made to its cost proposal was not justified.[6]

As noted above, we do not address these arguments because it is clear from the record here that even if the protester’s proposal had received a technical rating of outstanding/low risk, and even if no adjustments had been made to its proposed costs, BAH would still not have been in line for award ahead of Jacobs, whose proposal received a technical rating of outstanding/low risk and a past performance rating equivalent to the protester’s--and whose costs, even as adjusted, were lower than the protester’s costs as proposed.[7] In the foregoing connection, assuming the protester’s allegations pertaining to the evaluation of its own technical proposal, if sustained, would support the argument that the proposal merited a technical rating of outstanding, the allegations do not support the argument that the protester’s proposal should have been found superior to Jacobs’ under the technical factor. Competitive prejudice is an essential element of every viable protest, and where the record shows that the protester objecting to the evaluation of its own proposal would not be in line for award even if its objections were to be sustained, the requisite prejudice is absent. As a result, there is no basis for us to consider the allegations further. Orion Int’l Techs., Inc., B-293256, Feb. 18, 2004, 2004 CPD ¶ 118 at 3.

The protest is denied.

Susan A. Poling
General Counsel



[1] The Alliant contract is a multiple-award, indefinite-delivery, indefinite-quantity contract for various information technology services.

[2] Technical quality ratings were outstanding, good, acceptable, marginal, and unacceptable; risk ratings were low, moderate, and high.

[3] As the value of this task order is in excess of $10 million, this procurement is within our jurisdiction to hear protests related to the issuance of task orders under multiple-award indefinite-delivery, indefinite-quantity contracts. 41 U.S.C. § 4106(f)(1)(B).

[4] BAH also complains that while the cost report states that the technical evaluation team assessed the decreases and determined that they pose little risk to successful program execution overall, the technical evaluation record includes no discussion on this point. We do not regard the absence of such discussion from the technical evaluation report as problematic given that the issue was raised in the context of the cost realism analysis, and the findings are recorded in the cost realism report.

[5] The protester maintains that we have sustained protests where the agency’s documentation fails to show how the evaluators resolved weaknesses and risks attributed to the awardee’s proposal at an earlier stage in the evaluation process. In support of its argument, BAH cites our decisions BAE Sys. Info. and Elec. Sys. Integration Inc., B-408565 et al., Nov. 13, 2013, 2013 CPD ¶ 278 and Systems Research and Applications Corp.; Booz Allen Hamilton, Inc., B-299818 et al., Sept. 6, 2007, 2008 CPD ¶ 28. These cases, however, do not stand for the proposition that an evaluation record is inadequately documented if it does not provide an explanation for all discrepancies between the findings of the individual evaluators and the consensus findings reflected in the team evaluation report. Rather, BAE was sustained on the basis that post-discussion findings of weakness disappeared without explanation as to how they had been resolved, and Systems Research was sustained on the basis that the record failed to establish that the consensus ratings reasonably reflected the relative merits of the proposals.

[6] In its initial protest, BAH objected to three aspects of the agency’s cost realism analysis: (1) the upward adjustment of its escalation rate; (2) the upward adjustment of its labor rates for two labor categories [deleted]; and (3) the upward adjustment of its overhead rate. After the agency responded to these complaints in its report, BAH withdrew its challenges to the adjustments of its escalation and overhead rates. BAH Supplemental Protest and Comments on the Agency Report, Jan. 27, 2014, at 31 n.9. The agency points out that the protester’s abandonment of its argument pertaining to the upward adjustment of its escalation rate is significant given that the adjustment for escalation accounted for all but approximately [deleted] of the total upward adjustment of approximately $4.3M. Supplemental Agency Report, Feb. 13, 2014, at 15.

[7] As noted above, Jacobs received the highest technical rating of outstanding/low risk based on 16 strengths and one minor weakness, whereas BAH’s rating of good/moderate risk was based on 8 strengths and 3 weaknesses, two with moderate performance risk. BAH claims that it was entitled to three additional strengths since three of the strengths identified in Jacobs’ proposal were similar to the features offered in BAH’s proposal, and that the weaknesses were unfairly assigned such that it should not have had any weaknesses or Jacobs should have been assigned similar weaknesses. However, even if the additional strengths were credited and the allegedly unfair weaknesses removed, it is clear from the evaluation record here that the protester’s proposal would still not have been higher rated than Jacobs’ proposal under the technical factor, meaning that Jacobs would still remain in line for award ahead of BAH based on its lower evaluated costs and the absence of offsetting advantages.
