
Qbase, LLC; Perspecta Enterprise Solutions, LLC; and Northrop Grumman Systems Corporation

B-416377.9, B-416377.10, B-416377.11, B-416377.12, B-416377.13, B-416377.14  Nov 13, 2020

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

 

Decision

Matter of:  Qbase, LLC; Perspecta Enterprise Solutions, LLC; and Northrop Grumman Systems Corporation

File:  B-416377.9; B-416377.10; B-416377.11; B-416377.12; B-416377.13; B-416377.14

Date:  November 13, 2020

Richard J. Conway, Esq., Adam Proujansky, Esq., and Michael J. Slattery, Esq., Blank Rome LLP, for Qbase, LLC; Daniel R. Forman, Esq., Eric M. Ransom, Esq., James G. Peyster, Esq., and William B. O’Reilly, Esq., Crowell & Moring LLP, for Perspecta Enterprise Solutions, LLC; and Richard A. Sauber, Esq., Deneen J. Melander, Esq., and Courtney L. Millian, Esq., Robbins, Russell, Englert, Orseck, Untereiner & Sauber LLP, for Northrop Grumman Systems Corporation, the protesters.
James Y. Boland, Esq., and Christopher G. Griesedieck, Esq., Venable LLP, for MetroStar Systems, Inc.; Gary J. Campbell, Esq., G. Matthew Koehl, Esq., and Lidiya Kurin, Esq., Womble Bond Dickinson (US) LLP, for Booz Allen Hamilton Inc.; and Carla J. Weiss, Esq., and Noah B. Bleicher, Esq., Jenner & Block, LLP, for SRA International, the intervenors.
Andrew J. Baker, Esq., and Christopher Radcliffe, Esq., Department of Justice, for the agency.
Alexander O. Levine, Esq., Sarah T. Zaffina, Esq., and Jennifer D. Westfall-McGrail, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1.  Protest contending that exchanges conducted with two offerors were improper discussions is denied where the exchanges amounted to clarifications of vague information contained within the offerors’ proposals and did not afford the offerors the opportunity to revise their proposals. 

2.  Protest challenging agency’s evaluation of an awardee’s past performance and corporate experience is denied where the agency reasonably considered and accounted for the awardee’s failure to provide the minimum number of relevant past performance and corporate experience references.

3.  Protest challenging agency’s best-value determination is sustained where the record reflects that the agency performed a mechanical tradeoff analysis that failed to meaningfully consider price and resulted in the exclusion of technically acceptable proposals.

DECISION
 

Qbase, LLC, a small business located in Herndon, Virginia; Perspecta Enterprise Solutions, LLC, located in Herndon, Virginia; and Northrop Grumman Systems Corporation, located in Herndon, Virginia, protest the award of seven indefinite-delivery, indefinite-quantity (IDIQ) contracts under request for proposals (RFP) No. DJJP-17-RFP-1022, issued by the Department of Justice (DOJ) for information technology support services.[1]  The protesters argue that the agency unequally and unreasonably evaluated proposals after our Office sustained a prior protest of this procurement, unequally and improperly conducted discussions, and failed to properly consider price in its best-value tradeoff determination.

We sustain the protest.

BACKGROUND

On February 22, 2017, DOJ issued the RFP, seeking contractor assistance in support of the agency’s Information Technology Support Services-5 (ITSS-5) program.  The base period of performance will be from the date of award through September 30, 2022; the solicitation also contains a 5-year option period.  Agency Report (AR), Tab 1, RFP at 30.[2]  The agency anticipated award of approximately 15 contracts, six on an unrestricted basis and nine to service-disabled veteran-owned small businesses; this protest concerns the award of contracts on an unrestricted basis.  Id. at 88.  The total estimated value for the base period and option period is $4.5 billion.  Id. at 43.

The solicitation anticipated that the agency would evaluate proposals in two phases.  In phase one, the agency was to evaluate technical and price proposals.  The RFP provided for the evaluation of five technical subfactors as part of this phase:  corporate experience, past performance, architectural attributes experience, management, and mandatory technical certifications.  Id. at 73. 

The mandatory technical certifications subfactor was evaluated as either achieved or not achieved, with the agency assessing “whether or not the offeror has the required certification [under International Organization for Standardization (ISO) 9001], and . . . has either achieved [Capability Maturity Model Integration (CMMI)] Level 2 or 3, OR has a complete, realistic and well-supported plan for achieving CMMI Level 2 or 3 within a reasonable time after award.”[3]  Id. at 89.  A proposal that did not meet these requirements would not be selected for phase two.

The remaining phase one subfactors were evaluated for relative merit.  Corporate experience was significantly more important than each of the other subfactors.  Id. at 87.  Past performance and architectural attributes were relatively equal in importance, and both subfactors were significantly more important than the management subfactor.  Id. 

The solicitation contemplated that the most highly rated offerors after phase one would be selected to submit a proposal for phase two.  In phase two, proposals were to be evaluated for their technical proficiency and also for their responses to two sample task order scenarios.  Id. at 89.  Each sample task order response was of equal importance, and each was significantly more important than the technical proficiency factor.  Id. at 87-88.  DOJ reserved the right to award either or both task orders to the offeror whose proposal provided the best value to the agency.

DOJ would perform an evaluation of each offeror’s technical and price proposals to determine which proposals were most advantageous to the government, with technical merit being significantly more important than price.  Id. at 90.  The RFP anticipated that the agency’s best-value tradeoff determination would consider each offeror’s overall technical rating for phase one and its overall technical rating for phase two, with the phase two rating being considerably more important than the phase one rating.  Id. at 88.  The solicitation stated that, between substantially equal technical proposals, the proposed price would be the determining factor in selecting a proposal for award.   

The protesters submitted timely proposals under the procurement’s unrestricted track.  DOJ selected 14 offerors to proceed to phase two, including the three protesters and the seven eventual awardees.  On March 28, 2019, all 14 offerors timely submitted phase two proposals. 

On December 19, 2019, the agency awarded contracts to AceInfo, NTT, BAH, SRA, and CACI.

Following these awards, MetroStar and Perspecta filed protests, which our Office docketed as B-416377.5 and B-416377.6 respectively.  On April 2, 2020, our Office sustained MetroStar’s protest and denied Perspecta’s protest.  MetroStar Sys., Inc.,  B-416377.5, B-416377.8, Apr. 2, 2020, 2020 CPD ¶ 135; Perspecta Enter. Sols., LLC,  B-416377.6, B-416377.7, Apr. 2, 2020, 2020 CPD ¶ 136. 

In MetroStar Sys., Inc., supra, our Office found that DOJ had improperly credited SRA and BAH with meeting the ISO 9001 certification requirement.  In this respect, we determined that both awardees’ proposals included certifications that did not indicate that they applied to SRA and BAH themselves at the time of proposal submission.  In addition, we found that the agency unreasonably credited CACI with the corporate experience and past performance of affiliated entities, where the record did not show that those affiliates would provide resources or be relied upon during contract performance.

In Perspecta Enter. Sols., LLC, supra, our Office denied the protest but found that the agency had made several non-prejudicial errors in its evaluation of the proposals of Perspecta, BAH, NTT, SRA, and CACI.

Following these two decisions, the agency reevaluated portions of the proposals of MetroStar, CACI, SRA, and BAH, and also examined whether the evaluation errors noted by our Office with respect to Perspecta’s proposal were prejudicial.  AR, Tab 9, Technical Evaluation Addendum at 1.  Following this reassessment, the agency evaluated relevant proposals under the phase one and phase two subfactors as follows:

Phase One Subfactors

Offeror    | Corporate Experience | Past Performance | Architectural Attributes Experience | Management   | Technical Certification | Combined Phase One Technical Rating
SRA        | Excellent            | Excellent        | Very Good                           | Very Good    | Achieved                | Excellent
AceInfo    | Very Good            | Very Good        | Excellent                           | Satisfactory | Achieved                | Very Good
NTT        | Very Good            | Very Good        | Excellent                           | Satisfactory | Achieved                | Very Good
BAH        | Very Good            | Very Good        | Excellent                           | Satisfactory | Achieved                | Very Good
CACI       | Very Good            | Very Good        | Satisfactory                        | Very Good    | Achieved                | Very Good
BAE        | Very Good            | Excellent        | Satisfactory                        | Satisfactory | Achieved                | Very Good
MetroStar  | Very Good            | Very Good        | Satisfactory                        | Very Good    | Achieved                | Very Good
Perspecta  | Satisfactory         | Excellent        | Excellent                           | Very Good    | Achieved                | Very Good
Qbase      | Very Good            | Excellent        | Very Good                           | Satisfactory | Achieved                | Very Good
Northrop   | Satisfactory         | Very Good        | Very Good                           | Very Good    | Achieved                | Very Good

AR, Tab 4, Phase One Technical Consensus Recommendation Report at 3; AR, Tab 9, Technical Evaluation Report Addendum at 3.

Phase Two Subfactors

Offeror    | Sample Task Order One | Sample Task Order Two | Technical Proficiency | Phase Two Combined Rating
SRA        | Excellent             | Very Good             | Very Good             | Very Good
AceInfo    | Very Good             | Satisfactory          | Very Good             | Very Good
NTT        | Very Good             | Very Good             | Satisfactory          | Very Good
BAH        | Very Good             | Very Good             | Very Good             | Very Good
CACI       | Very Good             | Very Good             | Very Good             | Very Good
BAE        | Very Good             | Satisfactory          | Satisfactory          | Satisfactory
MetroStar  | Satisfactory          | Very Good             | Satisfactory          | Satisfactory
Perspecta  | Satisfactory          | Satisfactory          | Satisfactory          | Satisfactory
Qbase      | Satisfactory          | Marginal              | Satisfactory          | Satisfactory
Northrop   | Satisfactory          | Marginal              | Satisfactory          | Satisfactory

AR, Tab 10, Recommendation Report at 3.  This led to the following summary ratings and prices:

 

Offeror    | Phase One Combined Rating | Phase Two Combined Rating | Overall Technical Rating | Total Evaluated Price[4]
SRA        | Excellent                 | Very Good                 | Very Good                | $203,707,393
AceInfo    | Very Good                 | Very Good                 | Very Good                | $237,282,324
NTT        | Very Good                 | Very Good                 | Very Good                | $238,149,728
BAH        | Very Good                 | Very Good                 | Very Good                | $262,162,951
CACI       | Very Good                 | Very Good                 | Very Good                | $196,703,612
BAE        | Very Good                 | Satisfactory              | Satisfactory             | $214,086,012
MetroStar  | Very Good                 | Satisfactory              | Satisfactory             | $287,441,714
Perspecta  | Very Good                 | Satisfactory              | Satisfactory             | $286,175,619
Qbase      | Very Good                 | Satisfactory              | Satisfactory             | $167,321,196
Northrop   | Very Good                 | Satisfactory              | Satisfactory             | $255,848,790

Id. at 3, 5.

Based on this evaluation, the agency determined that the seven highest-ranked technical proposals presented the government with the best value.  AR, Tab 11, Award Decision at 7. 

On June 29, 2020, the agency awarded contracts to the seven highest-rated offerors:  SRA, AceInfo, NTT, BAH, CACI, BAE, and MetroStar.  These protests followed.

DISCUSSION

The protesters challenge multiple aspects of the agency’s evaluation and best-value tradeoff determination.  The protesters argue that the agency unequally and improperly conducted discussions with SRA and BAH in an effort to resolve the failure of those offerors to meet the RFP’s ISO 9001 certification requirement.[5]  Northrop and Perspecta additionally contend that the agency unreasonably overlooked CACI’s failure to provide the required number of past performance and corporate experience references specified by the RFP instructions.  Northrop also argues that the agency unequally and unreasonably evaluated its proposal under the phase two subfactors.  The protesters further contend that the agency unreasonably failed to reevaluate their proposals as part of the corrective action taken in response to our decision in MetroStar.  Last, the protesters argue that the agency engaged in a mechanical comparison of proposals and failed to meaningfully consider price in its best-value tradeoff determination.[6]

As discussed below, we deny the protesters’ arguments pertaining to the ISO 9001 certification requirement, the number of CACI references, and the agency’s failure to reevaluate proposals as part of its earlier corrective action.  We sustain the protesters’ challenge to the agency’s best-value tradeoff determination and also find two errors in DOJ’s phase two evaluation of Northrop’s proposal.

ISO 9001 Certification Requirement

The protesters challenge exchanges DOJ conducted with BAH and SRA with respect to the certifications included in their proposals.  For context, our Office previously concluded in MetroStar, supra, that the agency had unreasonably determined that BAH and SRA had provided the ISO 9001 certification required by the RFP.  We reached this conclusion because there was no evidence in BAH’s or SRA’s proposal that either offeror (as opposed to an affiliate) had an ISO 9001 certification in place at the time of its proposal submission.

By way of background, the solicitation stated that the agency would evaluate “whether or not the offeror has the required certification for ISO 9001,” and provided that “[p]roposals will be eliminated from the competition and will receive no further consideration if they do not contain the mandatory ISO 9001 technical certification.”  RFP at 87 & 89 (emphasis omitted).  The RFP further required the offeror to be the entity that “has the required certification for ISO 9001.”  RFP at 89.  Responding to offerors’ questions about this requirement, the agency further explained that the offeror itself, and not its subcontractor, must be the one with the ISO certification.  AR, Tab 2, Phase One Questions and Answers (Q&As), at No. 21.  Further, in several related questions, offerors asked the agency if it was acceptable for the offeror to have the certification in progress or to have a plan in place for achieving ISO 9001 certification after award.  The agency responded each time that this would not be acceptable and that the certification was required for the prime offeror at the time of response submission.  Id. at Nos. 28, 32, 41 & 42.

In MetroStar, supra, our Office concluded that BAH and SRA had not provided evidence that they had ISO 9001 certifications in place at the time of proposal submission.  That is, although these offerors provided certifications in their proposals, it was not clear from the proposals that these certifications applied to the offerors’ own quality management systems.  MetroStar, supra at 6.  For example, the certification provided by BAH listed a different address than the address listed in BAH’s proposal and stated that it applied to the “management of provisioning of services including systems engineering, system administration and management consulting to the federal government by Booz Allen’s Corporate Quality Office.”  AR, Tab 13.2, BAH Phase One Technical Proposal at 46.

For SRA, the certification was issued to SRA’s parent entity, CSRA, Inc., and stated that it was for a quality management system managed by the “CSRA Defense Training Division’s [program management office] located in Orlando, FL,” and that “[u]pon award, CSRA will incorporate the ITSS-5 IDIQ program into this ISO 9001:2015 certified [quality management system].”  AR, Tab 13.1, SRA Technical Proposal at Vol. 1, 37-38.  Due to the absence of evidence that these certifications applied to BAH’s and SRA’s quality management systems, our Office concluded that it was unreasonable for the agency to credit these offerors with meeting the applicable requirement. 

Following our Office’s decision in MetroStar, supra, the agency conducted exchanges with BAH and SRA.  Specifically, DOJ asked both offerors to “clarify whether the ISO 9001 certification possessed by [the entity named in its ISO certification], as included in your proposal, covered at the time of proposal submission the quality management system that would be used in performance of task orders awarded under this contract.”  AR, Tab 14.2, BAH Clarification Response at 1; AR, Tab 14.1, SRA Clarification Response at 1.[7]  

The protesters[8] contend that these exchanges amounted to unequal discussions because BAH and SRA were permitted to remedy material omissions in their proposals by submitting new information that was necessary to determine the technical acceptability of their proposals.

Clarifications are “limited exchanges” between an agency and an offeror for the purpose of clarifying certain aspects of a proposal, and do not give an offeror the opportunity to revise or modify its proposal.  Federal Acquisition Regulation (FAR) 15.306(a)(2).  Discussions, on the other hand, occur when an agency communicates with an offeror for the purpose of obtaining information essential to determining the acceptability of a proposal, or provides the offeror with an opportunity to revise or modify its proposal in some material respect.  Highmark Medicare Servs., Inc., et al., B-401062.5 et al., Oct. 29, 2010, 2010 CPD ¶ 285 at 11; see FAR 15.306(d).  In situations where there is a dispute regarding whether an exchange between an agency and an offeror constituted discussions, the acid test is whether an offeror has been afforded an opportunity to revise or modify its proposal.  Priority One Servs., Inc., B-288836, B-288836.2, Dec. 17, 2001, 2002 CPD ¶ 79 at 5.

Here, we find that the exchanges in question are better categorized as clarifications rather than discussions.  In this respect, neither offeror was invited, or permitted, to revise its proposal; instead, each was simply asked to verify and clarify the unclear information included in its proposal, i.e., whether the ISO 9001 certification applied to the offeror’s own quality management systems at the time of proposal submission.  Neither offeror was permitted to alter its proposal by submitting a new or revised ISO 9001 certification.  Moreover, the fact that clarifying information was required did not mean that the proposals were noncompliant with the RFP requirement for ISO 9001 certification.  Instead, it meant that the proposals were unclear regarding compliance with this requirement. 

In this regard, the exchanges conducted by the agency were similar to those in L & G Tech. Servs., Inc., B-408080.2, Nov. 6, 2013, 2014 CPD ¶ 47.  In that case, our Office considered exchanges conducted with an offeror for the purposes of verifying the offeror’s intent to comply with applicable subcontracting obligations during contract performance.  We found that this verification constituted a clarification because the offeror was not provided an opportunity to revise its proposal and, instead, was asked to explain an aspect of its proposal that was otherwise vague.  Similarly, here, neither BAH nor SRA was provided an opportunity to revise its proposal or supply information required by the solicitation.  Instead, BAH and SRA were simply asked to clarify an aspect of their proposals that was otherwise vague.  We find that these exchanges amounted to permissible clarifications.  

The protesters[9] contend that, even if the exchanges at issue do not constitute discussions, they nonetheless were insufficient to demonstrate that BAH and SRA met the ISO 9001 certification requirement.  In this respect, the protesters argue that the explanations proffered by both offerors did not establish that the certifications provided in their proposals were issued to, and applied to, the offerors themselves, rather than affiliates.  The protesters assert that to comply with the RFP requirement, BAH and SRA had to be the entities holding the relevant certificates. 

The evaluation of technical proposals is a matter within an agency’s discretion.  Acquisition Servs. Corp., B-409570.2, June 18, 2014, 2014 CPD ¶ 197 at 7.  In reviewing an agency’s evaluation, we will not reevaluate technical proposals, but instead will examine the agency’s evaluation to ensure that it was reasonable and consistent with the solicitation’s stated evaluation criteria and with procurement statutes and regulations.  Technology & Telecomms. Consultants, Inc., B-415029, Oct. 16, 2017, 2017 CPD ¶ 320 at 3.  A protester’s disagreement with the agency’s judgment, without more, is insufficient to establish that an evaluation was improper.  Technica LLC, B‑413546.4, B-413546.5, July 10, 2017, 2017 CPD ¶ 217 at 5.

Here, we find that the agency reasonably credited BAH and SRA with meeting the solicitation’s ISO 9001 certification requirement.  In this regard, we note that an ISO 9001 certification is a certification based on an audit of an entity’s quality management system.  See SRA Comments, B-416377.10, at 5.  As we noted in MetroStar, supra at 6, neither BAH nor SRA adequately explained how the certification provided with its proposal applied to the offeror’s own (as opposed to an affiliate’s) quality management system.  With their clarification responses, both BAH and SRA provided reasonable explanations for how these certifications applied to their quality management systems.

In the case of SRA’s certification, SRA explained that it shared its quality management system with its parent company, CSRA Inc., noting that at the time of proposal submission, “SRA and CSRA operated as an integrated enterprise, with shared management, common policies, and shared resources . . . [and that CSRA’s] ISO 9001-compliant quality management system processes . . . applied across the CSRA enterprise, including throughout SRA, to govern the performance of all contracts and task orders.”  AR, Tab 14.1, SRA Clarification Response at 3.  In the case of BAH’s certification, BAH explained that the certificate provided was issued in the name of one of its program offices, which was not a separate entity or affiliate, and that the certificate covered BAH and the quality management system “that would be utilized in performance of task orders awarded under the ITSS-5 contract.”  AR, Tab 14.2, BAH Clarification Response at 4. 

Based on these explanations, we find that the agency reasonably credited both BAH and SRA with having included ISO 9001 certifications that applied to the quality management systems they would utilize during contract performance.  While the protesters contend that the solicitation required the certifications to be issued to BAH and SRA themselves, we do not agree, particularly because ISO 9001 certifications are issued to cover specific quality management systems, not specific entities.  Thus, the fact that a certificate lists an entity that is different from the offeror is of little practical importance if the certification nonetheless covers the offeror’s quality management system that will be used during contract performance.  Nor do we find anything in the solicitation that requires otherwise.  Accordingly, we find that the agency reasonably credited BAH and SRA with meeting the requirement to have a certified ISO 9001-compliant quality management system in place at the time of proposal submission.

Corporate Experience and Past Performance

Perspecta and Northrop challenge DOJ’s reevaluation of CACI’s corporate experience and past performance.  In MetroStar, our Office concluded that DOJ had unreasonably evaluated CACI’s corporate experience and past performance by crediting CACI for references submitted by CACI’s affiliates when CACI’s proposal failed to demonstrate the meaningful involvement of those affiliates.  Following our decision, the agency excluded these references in its reevaluation of CACI’s past performance and corporate experience.  Instead, DOJ evaluated CACI’s past performance and corporate experience based on the one remaining reference, the incumbent contract performed by CACI’s affiliate CACI-ISS.[10]  Based on this one contract reference, the agency rated CACI’s corporate experience as remaining very good since the incumbent’s “ITSS-4 experience alone demonstrates [CACI’s] capability within the functional areas . . . of the [statement of work] with a focus on the six categories of service. . . .”  AR, Tab 9, Phase Two Technical Evaluation Report at 5.  For past performance, the agency downgraded CACI’s rating from excellent to very good, finding that the contract was relevant to the ITSS-5 work in terms of size, scope, and complexity and that the quality of work performed indicated a strong likelihood of success.  Id. at 4.

Perspecta and Northrop argue that the agency should have disqualified CACI since it failed to meet material requirements.  In this respect, the RFP instructions required the prime offeror to “provide three (3) directly relevant past or present references” in the corporate experience section of its proposal.  RFP at 75.  In addition, the solicitation stated that the agency must receive five past performance questionnaires for the prime offeror.  Id. at 77.  Perspecta and Northrop argue that a reasonable evaluation of CACI’s corporate experience and past performance would have led to the disqualification of CACI’s proposal based on its failure to meet these reference requirements. 

With respect to past performance, we find that while DOJ did not receive five past performance questionnaires for CACI, it reasonably accounted for this failure by downgrading CACI’s past performance rating.  This was consistent with the solicitation, which stated that the failure of an offeror’s references to submit the past performance questionnaire within the required timeframe “may result in the inability of the [g]overnment to evaluate an offeror’s past performance and may affect the overall evaluation.”  Id. at 78.  In other words, although the solicitation required five questionnaires for each prime offeror, it also accounted for the possibility of an evaluation based on fewer than five.  Under these circumstances, we think that the solicitation did not require the agency to disqualify an offeror for failing to provide the required number of references.

With respect to corporate experience, we find that the agency reasonably determined that the incumbent ITSS contract met the functional areas of the statement of work, while demonstrating high quality experience.  See AR, Tab 9.1, Amended Technical Evaluation of CACI at 1.  In this respect, the agency noted four strengths relating to this reference, including, for example, that it “demonstrated a very good portfolio of experience in areas of emerging technologies and initiatives similar to DOJ objectives under its ITSS-4 reference.”  Id.  The agency also considered the risks stemming from the fact that CACI only provided one relevant reference, with DOJ assigning both a weakness and a risk based on this fact.  Id.  Ultimately, however, the agency concluded that, on balance, the strengths outweighed the risks and weaknesses such that CACI’s corporate experience warranted a very good rating.  We find this conclusion reasonably accounted for CACI’s failure to meet the solicitation requirements while still considering the depth and breadth of CACI’s experience on projects similar in size, scope, and complexity to the requirements specified in the RFP, as DOJ was required to do under the evaluation criteria for this subfactor.[11] 

Evaluation of Northrop

Northrop challenges several additional elements of the agency’s phase two evaluation as unreasonable and unequal.  We have reviewed these challenges and find that two have merit.[12]  In this regard, Northrop asserts that it should have received a major strength, under the sample task order one subfactor, for proposing personnel that would be available on day one of the contract.  See Northrop Protest, exh. 4, Sample Task Order One Proposal Excerpts at B-1-B-3.  Northrop contends that DOJ’s failure to assign such a strength amounted to disparate treatment since both MetroStar and NTT received major strengths for proposing that their personnel would be available on day one of the contract.  The agency responds to this argument by asserting that Northrop did not deserve a strength for this aspect of its proposal because Northrop’s proposal pledged a 60-day transition period.  We find no evidence within the contemporaneous record, however, to suggest that this was the reason that Northrop was not credited with a strength for proposing that its personnel would be available on day one.  Nor does the agency contend that this was the contemporaneous reason why Northrop’s proposal was not credited with a major strength for this approach.  Accordingly, we find that the agency has not reasonably explained its disparate evaluation treatment with regard to this aspect of Northrop’s proposal.

In addition, Northrop notes that our decision in Perspecta, supra, stated that “where a proposed web software developer did have extensive web development experience, the agency noted that fact and assigned a strength.”  Id. at 11-12.  Northrop contends that its proposal should have been similarly assigned a strength, under the sample task order two staffing and key personnel subfactor, for the extensive experience of its web developer, who had [DELETED].  Northrop Protest, exh. 5, Sample Task Order Two Proposal Excerpts at Tab C, 13-14.  Northrop contends that the failure to assign a strength in this regard amounted to unequal treatment. 

The agency largely does not respond to the merits of this contention, and instead asserts that the protest argument is untimely because Northrop knew that its proposal had not been assigned a strength for this experience in December 2019, when Northrop received a debriefing following DOJ’s initial award decision.  We disagree because the argument in question challenges the agency’s unequal treatment of Northrop’s proposal.  This unequal treatment was first revealed to Northrop following our Office’s decision in Perspecta, which noted the fact that other offerors were assigned strengths for having a web developer with extensive web development experience.  Having learned of this unequal treatment at the time of our decision, Northrop timely protested this treatment following the agency’s reevaluation of proposals occurring subsequent to our decision.[13]  As the agency has largely not challenged the substance of this protest ground, we agree with Northrop that the agency unequally failed to credit Northrop’s proposal for proposing a web developer with extensive web development experience.

Failure to Re-evaluate Protesters’ Proposals as Part of Corrective Action

Perspecta also argues that to implement our recommended corrective action and make a reasonable revised best-value determination, DOJ was obligated to correct the evaluation errors identified in both the Perspecta and MetroStar decisions.  In this regard, Perspecta contends that even though our prior Perspecta decision found that the agency’s errors did not prejudice Perspecta with respect to the initial award decision, these same errors were prejudicial in relation to the revised award decision and DOJ should have reevaluated Perspecta’s proposal.  Qbase and Northrop also argue that DOJ was required to reevaluate their proposals consistent with our recommendation in MetroStar and that the agency’s failure to do so rendered the best‑value determination unreasonable.[14]  We reject the protesters’ contention that our recommendation in MetroStar required the agency to reevaluate the protesters’ proposals.

As a general rule, the details of implementing recommendations of our Office are within the sound discretion and judgment of the contracting agency, and we will not question an agency’s ultimate manner of compliance, so long as it remedies the procurement impropriety that was the basis for our recommendation.  AXIS Mgmt Group, LLC, B‑408575.2, May 9, 2014, 2014 CPD ¶ 150 at 4.

The concerns expressed in MetroStar were that the agency improperly credited two awardees for having mandatory certifications at the time of proposal submission, and unreasonably credited an awardee with the corporate experience and past performance of affiliated entities.  None of the protesters here were parties to that protest.  Moreover, none of our findings related to the agency’s unreasonable evaluation were in regard to Northrop’s or Qbase’s proposals.  While, in MetroStar, our Office recommended that “the agency reevaluate proposals in a manner consistent with the terms of the solicitation and this decision, and make a new source selection decision based on that reevaluation,” we find no basis to object to the agency’s decision not to reevaluate Northrop’s and Qbase’s proposals as part of its corrective action.  MetroStar, supra at 10.

With respect to Perspecta, the record demonstrates that the agency reconsidered Perspecta’s proposal in its reevaluation and resolved the evaluation improprieties identified in MetroStar and Perspecta.[15]  AR, Tab 11, Award Decision at 5‑6.  We have no basis to question this aspect of the agency’s reevaluation.

Best-Value Determination

Finally, the protesters raise various challenges to the agency’s best‑value determination.  In particular, the protesters argue that the agency’s tradeoff analysis was unreasonable because the agency failed to conduct a tradeoff between lower‑priced, technically acceptable proposals and higher-priced, higher‑rated technical proposals.  The protesters assert that the agency failed to meaningfully consider price in its tradeoff analysis.[16]  As explained below, the record shows that DOJ performed a mechanical tradeoff that relied exclusively on adjectival ratings, excluded technically acceptable proposals without any consideration of the price of those proposals, and, in general, did not meaningfully consider price.  Accordingly, we sustain the protesters’ challenges to the tradeoff analysis.   

Source selection officials have considerable discretion in determining the manner and extent to which they will make use of technical and price evaluation results, and their judgments are governed only by the tests of rationality and consistency with the stated evaluation criteria.  The SI Org., Inc., B‑410496, B‑410496.2, Jan. 7, 2015, 2015 CPD ¶ 29 at 14.  Where, as here, a solicitation provides that technical factors are more important than price in source selection, selecting a technically superior, higher‑priced proposal is proper where the agency reasonably concludes that the price premium is justified in light of the proposal’s technical superiority.  The MIL Corp., B‑294836, Dec. 30, 2004, 2005 CPD ¶ 29 at 8.

The agency’s conclusion, however, must be adequately documented and supported by a rational explanation as to why the higher-rated proposal is, in fact, superior, and why its technical superiority warrants paying a price premium.  Arcadis U.S., Inc., B‑412828, June 16, 2016, 2016 CPD ¶ 198 at 10; Cyberdata Techs., Inc., B‑406692, Aug. 8, 2012, 2012 CPD ¶ 230 at 5.  Overall, the documentation must show, not merely the tradeoff decision or business judgment made, but the rationale for that decision or judgment. FAR 15.308; Blue Rock Structures, Inc., B‑293134, Feb. 6, 2004, 2004 CPD ¶ 63 at 5.

Here, rather than documenting a reasonable basis for the tradeoffs made, the record indicates that the agency mechanically made award to the seven offerors whose proposals exhibited, in descending order, the best combination of adjectival ratings under the non‑price factors.  In this respect, DOJ arranged similarly rated proposals into groups and made awards to all offerors in those groups--regardless of price--until DOJ reached what it referred to as a “logical break” in the proposals.  AR, Tab 10, Recommendation Report at 7.  The first “logical break” appeared after the top five proposals, all of which received an overall very good rating.  AR, Tab 10, Recommendation Report at 7; AR, Tab 11, Award Decision at 4.  The next break occurred after the next two proposals, which were rated satisfactory overall, and each of which had received a very good rating for one phase two evaluation subfactor.  AR, Tab 10, Recommendation Report at 7; AR, Tab 11, Award Decision at 4‑5.  The record reflects, however, that each “logical break” among the proposals was based exclusively on the adjectival ratings assigned to proposals‑‑the agency failed to discuss the qualitative differences between the proposals.

For instance, in declining to recommend an award for Perspecta--also rated satisfactory overall and priced $1,266,095 lower than MetroStar--the technical evaluation panel observed that Perspecta was rated satisfactory "across the board" in the phase two evaluation.  AR, Tab 10, Recommendation Report at 7.  In this respect, the technical evaluators noted that even though MetroStar and BAE received an overall satisfactory rating, they each received one very good rating for one of the phase two sample task orders, and this was considered an advantage.  Id.  In contrast, the evaluators saw no compelling reason to recommend award to Perspecta, which received a satisfactory rating under each of the phase two subfactors but did not receive any higher ratings.  Id.  The source selection authority (SSA) observed that the one very good subfactor rating BAE and MetroStar each received for a sample task order distinguished their offers from Perspecta’s proposal and the other overall satisfactory proposals.  AR, Tab 11, Award Decision at 5.  The SSA reconsidered Perspecta’s technical rating in light of the errors identified in our earlier Perspecta decision, and determined that correcting the errors in the initial evaluation did not alter the rating or relative position of Perspecta’s or any other offeror’s proposal.  AR, Tab 11, Award Decision at 5-6; see also AR, Tab 9, Technical Evaluation Report Addendum at 8.  According to the SSA, Perspecta’s proposal would still have received a satisfactory rating, "and no [v]ery [g]ood rating for any of the [p]hase 2 factors."[17]  AR, Tab 11, Award Decision at 6.

Neither the technical evaluation panel’s nor the source selection authority’s analysis, however, examined whether a proposal such as Perspecta’s, with the same overall rating as MetroStar’s and a lower price, might be among those proposals offering the best value to the government notwithstanding the fact that it did not receive a very good rating for a subfactor in the phase 2 evaluation.  In sum, without any consideration of the underlying differences between the technical proposals or price, Perspecta was not recommended for award because its proposal was not assigned the necessary combination of adjectival ratings.

Further, in determining to award contracts to MetroStar and BAE, both rated satisfactory overall, in addition to the five offerors rated very good overall, the SSA stated that

BAE and MetroStar are the only two vendors among the nine vendors receiving an overall Satisfactory rating that offer the clear advantage of being rated better than merely Satisfactory (that is, Very Good) in one of the two sample task order selection factors.  In contrast, Perspecta earned Satisfactory ratings across the board in Phase 2, resulting in an overall Satisfactory rating; this contrasts with the value-added Very Good ratings of BAE and MetroStar that distinguish their offers from Perspecta and the rest of the overall Satisfactory offers.[18]

AR, Tab 11, Award Decision at 5 (emphasis added).  Based on their ratings, the source selection authority concluded that Perspecta, Qbase, Northrop, and the other satisfactory offerors did not have sufficient technical merit to warrant an award.  Id. at 6 (“While not rising to the level of [v]ery [g]ood overall for [p]hase 2, . . . MetroStar and BAE, unlike Perspecta and the others receiving an overall [s]atisfactory rating, achieved [v]ery [g]ood ratings in at least one of the [p]hase 2 factors.”).

We have long recognized that an agency’s source selection decision cannot be based on a mechanical comparison of the offerors’ technical scores or ratings, but must rest upon a qualitative assessment of the underlying technical differences among competing proposals.  See The MIL Corp., supra, at 8 (sustaining protest where agency mechanically made award to all proposals that received “blue” ratings on the two non-price factors, and declined to make award to any proposal that did not receive a “blue” rating for the non-price factors).  Here, in adopting such a mechanical approach, DOJ failed to make a qualitative assessment of the technical differences among the competing proposals in order to determine whether the perceived technical superiority of those proposals receiving the highest overall ratings justified paying the price premium associated with those proposals.

Furthermore, in a tradeoff source selection process, an agency cannot eliminate a technically acceptable proposal from consideration for award without taking into account the relative cost of that proposal to the government.  See, e.g., Cyberdata Techs., Inc., supra, at 5 (protest sustained where technically acceptable proposal excluded from consideration for award without consideration of its price); System Eng’g Int’l, Inc., B‑402754, July 20, 2010, 2010 CPD ¶ 167 at 5 (protest sustained where record shows that agency in best‑value procurement performed tradeoff between two higher‑rated, higher‑priced quotations but did not consider the lower prices submitted by other lower‑rated, technically acceptable vendors).

Here, the record shows that price was not considered in any meaningful way in the source selection decision.  In this respect, the record shows that price had no material impact on an offeror’s ability to be selected for award.  After the price evaluation in phase one, the technical evaluation panel noted that the price model used in the RFP to calculate the total evaluated price “was not intended to, and in fact does not, predict the actual cost the government will incur.”  AR, Tab 10, Recommendation Report at 8 (emphasis omitted).  Rather, the actual cost would be determined through the task order competition.  Id.  Moreover, the technical evaluation panel observed that even though sample task order pricing was not an evaluation factor, based upon its review of sample task order pricing, there was little correlation between the total evaluated price and the proposed sample task order prices.  Id. at 8‑9.  The technical evaluation panel concluded that because the total evaluated price could not predict performance costs, it should not be a basis for selecting a lower‑rated offeror.[19]  Id. at 9. 

Once the higher-rated proposals were identified, the agency did not perform a price/technical tradeoff; rather, award was based strictly on technical merit.  The technical evaluation panel determined that because technical factors were significantly more important than price, there needed “to be some additional compelling price, technical or other reason,” to displace a higher-rated offeror.  AR, Tab 10, Recommendation Report at 9.  No such reason was found, and therefore, there was no justification for selecting a lower-priced, lower‑rated offeror.  Id.  The panel went on to conclude that it was unnecessary and did not “make sense” to conduct a tradeoff of every higher-rated, higher-priced proposal against every other lower-priced, lower-rated offeror possessing the same overall adjectival rating.  Id.  This was especially true in light of the panel finding “the pricing of the 14 vendors, though disparate numerically, to be within a reasonable band and for evaluation purposes, substantially equal.”  Id.

In a tradeoff source selection process, however, an agency may not so minimize the impact of price to make it merely a nominal evaluation factor because the essence of the tradeoff process is an evaluation of price in relation to the perceived benefits of an offeror’s proposal.  Sevatec, Inc., supra, at 8 (citing FAR 15.101-1(c)); Electronic Design, Inc., B‑279662.2 et al., Aug. 31, 1998, 98‑2 CPD ¶ 69 at 8.  Contrary to DOJ’s selection process here, there is no exception to the requirement set forth in the Competition in Contracting Act of 1984 (CICA) that cost or price to the government be considered in selecting proposals for award because the selected awardees will be provided the opportunity to compete for task orders under the awarded contract.  But see 41 U.S.C. § 3306(c)(3) (limited exception where the agency intends to make a contract award to each qualifying offeror[20]).  In the absence of any tradeoff analysis considering price, we cannot conclude that the agency’s best‑value determination is reasonable.

While we acknowledge that the agency repeatedly stated that the total evaluated price does not measure actual costs to the government and that therefore no price-based justification exists to replace any of the highest-rated proposals with a lower‑rated, technically inferior proposal, we find such consideration of price to be nominal.  See Cyberdata Techs., Inc., supra, at 5 n.1 (sustaining protest where agency emphasized the importance of technical superiority and concluded that selection of the lower-priced proposals “would be at the reduction of technical quality and not worth a trade-off to that extent.”).  That the agency gave only nominal consideration to price is further reflected by the fact that the agency determined the pricing of all offerors to be substantially equal for evaluation purposes, despite a difference of $120 million between the lowest and highest evaluated prices.

We also recognize that there is no need for extensive documentation of every consideration factored into a tradeoff decision, see, e.g., AGVIQ, LLC, B‑413586, Nov. 2, 2016, 2016 CPD ¶ 303 at 5; however, an agency that fails to adequately document its source selection decision bears the risk that our Office may be unable to determine whether the decision was proper.  See CSR, Inc., B‑413973, B‑413973.2, Jan. 13, 2017, 2017 CPD ¶ 64 at 12 (sustaining protest where the agency failed to adequately document how it found the awardee’s proposal was superior to that of the protester, when the proposals were equally rated).  Although proposals with the same adjectival ratings are not necessarily of equal quality, a source selection official’s finding that one proposal is technically superior to another, notwithstanding equal ratings, must be adequately documented.  See ERC Inc., B‑407297, B‑407297.2, Nov. 19, 2012, 2012 CPD ¶ 321 at 6‑7.  The record before us is devoid of any explanation as to how the agency determined MetroStar’s and BAE’s proposals were technically superior to the other proposals with the same overall adjectival ratings, except to say that MetroStar and BAE each received a very good rating for one of the phase 2 subfactors.  We find the agency’s documentation inadequate. 

In sum, the agency’s best-value determination is unreasonable because the agency performed a mechanical tradeoff analysis that failed to meaningfully consider price and resulted in the exclusion of technically acceptable proposals from consideration for award. 

Competitive Prejudice

Prejudice is an essential element of a viable protest.  AdvanceMed Corp., B-414373, May 25, 2017, 2017 CPD ¶ 160 at 16.  We will sustain a protest where a protester demonstrates a reasonable possibility that it was prejudiced by the agency’s actions. Id. at 16-17.  Here, had the agency qualitatively compared the underlying technical strengths and weaknesses of the proposals and meaningfully considered price in its best-value tradeoff decision, it might have selected one or more of the protesters’ proposals for award.  In these circumstances, we resolve doubts regarding competitive prejudice in favor of the protester.  Id.  

RECOMMENDATION

For the reasons discussed above, we conclude that DOJ’s phase two evaluation of Northrop’s proposal was unreasonable and that the agency’s best-value determination was unreasonable and prejudicial to Qbase, Perspecta, and Northrop.  We recommend that, consistent with this decision, DOJ reevaluate Northrop’s proposal, conduct and document a new best-value tradeoff analysis of the phase two proposals, and prepare a new source selection decision with appropriate consideration given to all evaluation factors, or take such other steps permitted by applicable procurement laws and regulations.  We also recommend that the agency reimburse the protesters’ reasonable costs associated with filing and pursuing their protests, including attorneys’ fees.  4 C.F.R. § 21.8(d)(1).  The protesters’ certified claims for costs, detailing the time expended and costs incurred, must be submitted to the agency within 60 days after the receipt of this decision.  4 C.F.R. § 21.8(f).

The protest is sustained.

Thomas H. Armstrong
General Counsel

 

[1] The seven awardees are:  Ace Info Solutions, Inc. (AceInfo); Booz Allen Hamilton (BAH); CACI, Inc.-Federal (CACI); SRA International, A CSRA Inc. Company (SRA); NTT Data Federal Services (NTT); MetroStar Systems, Inc. (MetroStar); and BAE Systems Technology Solutions & Services, Inc. (BAE).

[2]  Citations to the RFP are to the version of the solicitation incorporating amendment 7. Unless otherwise noted, citations to tab numbers in the agency report are common across all three protests.

[3] ISO-9000 standards (including ISO 9001) are a series of internationally recognized quality assurance standards.  See LBM Inc., B-286271, Dec. 1, 2000, 2000 CPD ¶ 194 at 2 n.2.

[4] The total evaluated price used in this table reflects an upward adjustment of 10 percent for non-small business concerns.

[5] Qbase does not challenge the agency’s exchanges with BAH, or its decision to recognize BAH’s ISO 9001 certificate, but does challenge DOJ’s exchanges with SRA and subsequent decision to credit SRA’s proposal with meeting the certification requirement. 

[6] While we do not address in detail every argument raised by the protesters, we have reviewed each issue and, with the exception of those issues discussed herein, do not find any basis to sustain the protest. 

[7] The agency also asked a second question, to be answered if the offeror responded to the first question by stating that its certification did not cover its quality management system at the time the offeror submitted its proposal.  That question asked the offeror to clarify how it would obtain, prior to contract performance, an ISO 9001 certification applying to its quality management system.  Because the agency determined that both offerors answered the first question in the affirmative, we need not consider whether the second question invited discussions with BAH and SRA.  In this regard, the agency did not consider the answers provided to the second question, rendering the question academic.

[8] As discussed above, Qbase only challenges the agency’s exchanges with SRA, not BAH.

[9] As noted above, Qbase does not challenge the ISO 9001 certification provided by BAH, and instead limited its challenge to the certification provided by SRA.

[10] In contrast to other affiliates relied upon by CACI, our Office found that CACI’s proposal did demonstrate the meaningful involvement of CACI-ISS.  MetroStar, supra at 8.  Accordingly, our decision concluded that it was reasonable for the agency to consider the performance of CACI-ISS under the ITSS-4 contract in the evaluation of CACI’s corporate experience and past performance.

[11] Moreover, even to the extent the agency’s failure to disqualify CACI may be viewed as a relaxation of a solicitation requirement, the record fails to show that either Perspecta or Northrop was competitively prejudiced by this relaxation.  See McDonald–Bradley, B–270126, Feb. 8, 1996, 96-1 CPD ¶ 54 at 3 (competitive prejudice is necessary before we will sustain a protest; where the record does not demonstrate that the protester would have had a reasonable chance of receiving award but for the agency’s actions, we will not sustain a protest, even if deficiencies in the procurement process are found).  In this respect, neither offeror has demonstrated that it would have been able to materially improve its corporate experience had the agency permitted it to submit fewer references.  Nor does such an outcome appear likely since the corporate experience subfactor was intended to evaluate the depth and breadth of an offeror’s experience on projects similar in size, scope, and complexity to the instant requirement.  RFP at 88.

[12] The agency argues that even if these challenges are meritorious, any errors did not competitively prejudice Northrop.  As discussed below, we find that Northrop was competitively prejudiced as a result of errors made in DOJ’s best-value tradeoff and that a new best-value tradeoff may lead to Northrop receiving an award.  Under such circumstances, we find that Northrop has adequately established competitive prejudice with respect to the two errors discussed below.

[13] We note that had Northrop protested this disparate treatment during the pendency of the agency’s corrective action, such a challenge would have been premature since the agency was reevaluating proposals during that time.

[14] Northrop also asserts that DOJ improperly failed to reevaluate proposals consistent with Perspecta, supra.

[15] It should be noted that although we identified errors in DOJ’s evaluation of Perspecta’s proposal, we denied the protest because Perspecta failed to demonstrate prejudice, an essential element of every viable protest.  Perspecta, supra, at 12‑13.  Consequently, our Office made no recommendations to the agency for corrective action in connection with Perspecta’s proposal.

[16] DOJ argued that Perspecta’s challenge to the agency’s failure to consider price in its tradeoff analysis was untimely because the agency provided documentation in connection with Perspecta’s earlier protest that placed Perspecta on notice that the agency had not considered price in its best-value analysis.  We are not persuaded that Perspecta should be foreclosed from objecting to the agency’s failure to consider price now, however, given that prejudice to Perspecta from the agency’s failure to consider price was not apparent until the agency selected for award the proposal of MetroStar, which--in contrast to the proposals of the five original awardees--was higher-priced than Perspecta’s.  Moreover, the agency conducted a new tradeoff analysis after the earlier protest and made a new award decision that includes new awardees; this fundamentally changes the underlying facts from Perspecta’s earlier challenges.  Perspecta’s challenge to this new decision is timely.

[17] The SSA found that Perspecta’s rating for sample task order 1 remained satisfactory despite removing a minor weakness from Perspecta’s subfactor evaluation.  AR, Tab 11, Award Decision at 5.  DOJ also removed a weakness from Perspecta’s staffing and key personnel subfactor evaluation under sample task order 2, which did not change Perspecta’s overall satisfactory rating for that subfactor.  Id. at 6.

[18] Both the technical evaluation panel recommendation and the award decision are devoid of any reference to Qbase’s and Northrop’s overall technical ratings and phase two subfactor ratings, outside of their inclusion in a table that presents the results of the phase two technical reevaluation, the phase one price evaluation results, and the price rank of the phase two offerors.  AR, Tab 10, Recommendation Report at 3, 5 & 6.  In fact, the only mention of Qbase’s and Northrop’s technical proposals in the award decision is the SSA’s conclusion that even though they were lower‑priced than MetroStar (which had the highest‑priced proposal), “none presents a technical advantage that would justify trading‑away the technical merit of any of the top seven [offerors] (all of which earned at least one [v]ery [g]ood in [p]hase 2’s factors).”  AR, Tab 11, Award Decision at 7.

[19] Although the agency might have elected to employ a “highest technically rated offerors with fair and reasonable prices” source selection methodology, see Sevatec, Inc., et al., B‑413559.3 et al., Jan. 11, 2017, 2017 CPD ¶ 3 at 8‑9, it did not.  Instead, the RFP here provided that the agency intended to make awards using a best‑value tradeoff evaluation scheme, which does not permit the agency to eliminate a technically acceptable proposal from consideration for award without taking into account the relative cost of the proposals.  See id. at 8.  To the extent that DOJ decides the RFP’s stated evaluation criteria no longer reflect the proper approach for evaluating proposals, the agency is not permitted to change evaluation criteria mid‑procurement without first amending the solicitation.  See Computer World Servs. Corp., B‑418287.3, June 29, 2020, 2020 CPD ¶ 204 at 6‑7 (sustaining protest where the agency’s intended corrective action changed the price evaluation without amending the solicitation and affording competing firms an opportunity to submit revised quotations).  To do otherwise would deprive offerors of a reasonable opportunity to compete intelligently and on a comparatively equal basis.

[20] A “qualifying offeror” is defined as a responsible source that has submitted a proposal conforming to the solicitation requirements, meets all technical requirements, and is otherwise eligible for award.  41 U.S.C. § 3306(c)(4).
