Systems Implementers, Inc.; Transcend Technological Systems, LLC

B-418963.5, B-418963.6, B-418963.7, B-418963.8, B-418963.9  Jun 01, 2022
Highlights

Systems Implementers, Inc. (SI), a small business of Clearfield, Utah (the incumbent contractor), and Transcend Technological Systems, LLC (TTS), a small business of Prattville, Alabama, protest the award of a contract to OM Group, Inc. (OMG), a small business of Piscataway, New Jersey, under request for proposals (RFP) No. FA8201-20-R-0005. The Air Force issued the RFP for sustainment, modernization, and consolidation services for the Hill Enterprise Data Center (HEDC) located at Hill Air Force Base, Utah. The protesters challenge the agency's evaluation of the awardee's technical proposal, the agency's evaluation of the protesters' own technical proposals, the agency's evaluation of offerors' proposed professional employee compensation plans, and the agency's best-value tradeoff.

We deny the protests.

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: Systems Implementers, Inc.; Transcend Technological Systems, LLC

File: B-418963.5; B-418963.6; B-418963.7; B-418963.8; B-418963.9

Date: June 1, 2022

David S. Black, Esq., Gregory R. Hallmark, Esq., Amy L. Fuentes, Esq., and Danielle R. Rich, Esq., Holland & Knight LLP, for Systems Implementers, Inc.; J. Scott Hommer, III, Esq., Rebecca E. Pearson, Esq., Christopher G. Griesedieck, Esq., Taylor A. Hillman, Esq., Marcos R. Gonzalez, Esq., and Lindsay M. Reed, Esq., Venable LLP, for Transcend Technological Systems, LLC, the protesters.
Katherine B. Burrows, Esq., Matthew E. Feinberg, Esq., Jacqueline K. Unger, Esq., and Eric A. Valle, Esq., Piliero Mazza, for OM Group, Inc., the intervenor.
Colonel Frank Yoon, Major Alissa J. K. Schrider, and Heidi M. Fischer, Esq., Department of the Air Force, for the agency.
Heather Self, Esq., and Peter H. Tran, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1. Protests challenging agency’s technical evaluation of the awardee’s proposal are denied because the evaluation was reasonable and consistent with the terms of the solicitation.

2. Protests that the agency should have assigned protesters’ proposals multiple strengths and higher technical ratings reflect the protesters’ disagreement with the agency’s evaluation judgment, and provide no basis for sustaining the protests.

3. Protests alleging that the agency evaluated proposals in a disparate manner are denied because the record reflects that the differences in the evaluations resulted from differences in the proposals.

4. Protest challenging agency’s evaluation of offerors’ professional employee compensation plans is denied because the record demonstrates that the agency’s evaluation was reasonable and consistent with the solicitation and the requirements of Federal Acquisition Regulation provision 52.222-46.

5. Protests challenging the agency’s best-value tradeoff source selection decision are denied. With respect to the first protester, the agency reasonably found that firm’s proposal to be lower-rated and higher-priced than the awardee’s proposal; thus, no tradeoff between the two was required. With respect to the second protester, the agency reasonably found that firm’s proposal and the awardee’s proposal to be essentially equal under four of the five technical subfactors, and reasonably relied on a discriminator under the fifth subfactor to justify payment of the awardee’s price premium of less than half a percent.

DECISION

Systems Implementers, Inc. (SI), a small business of Clearfield, Utah (the incumbent contractor), and Transcend Technological Systems, LLC (TTS), a small business of Prattville, Alabama, protest the award of a contract to OM Group, Inc. (OMG), a small business of Piscataway, New Jersey, under request for proposals (RFP) No. FA8201-20-R-0005. The Air Force issued the RFP for sustainment, modernization, and consolidation services for the Hill Enterprise Data Center (HEDC) located at Hill Air Force Base, Utah. The protesters challenge the agency’s evaluation of the awardee’s technical proposal, the agency’s evaluation of the protesters’ own technical proposals, the agency’s evaluation of offerors’ proposed professional employee compensation plans, and the agency’s best-value tradeoff.

We deny the protests.

BACKGROUND

The agency issued the solicitation on January 23, 2020, as a small business set-aside using the procedures of Federal Acquisition Regulation (FAR) part 15. Agency Report (AR), Tabs 1, Contracting Officer’s Statement in SI Protest (SI-COS) at 3-4; Contracting Officer’s Statement in TTS Protest (TTS-COS) at 3-4; Tab 3, RFP at 1, 29.[1] The agency sought proposals for information technology (IT) professional subject matter expertise in enterprise architecture, engineering, and services to support the sustainment, modernization, and consolidation of the HEDC. RFP at 7; AR, Tab 3a, RFP attach. 1, Performance Work Statement (PWS) at 6. The HEDC is a “world-class data center” that hosts “over 2000 physical and virtual servers servicing 250+ applications.” PWS at 6.

The solicitation contemplated award of a single indefinite-delivery, indefinite-quantity (IDIQ) contract with “a Firm Fixed Price (FFP) Contract Line Item Number (CLIN) structure, with one CLIN for travel, as a Cost Reimbursable No Fee (CRNF)” item. SI‑COS at 3-4; TTS-COS at 3-4; RFP at 36, 62-63; AR, Tab 3e, RFP attach. 5, RFP § M--Evaluation Factors for Award at 3. The IDIQ contract would have a 5-year base ordering period with a 6-year period of performance, and include an option to extend both the ordering period and period of performance for one year. PWS at 7. The agency’s estimated maximum value over the contract’s possible 7‑year life is $485 million. SI-COS at 4; TTS-COS at 4.

The solicitation established that award would be made on a best-value tradeoff basis considering technical and price, and provided that the technical evaluation factor was significantly more important than price. RFP § M at 3-4. The technical factor consisted of the following five equal subfactors: (1) scenario 1--onboarding and technology capabilities; (2) scenario 2--program and configuration management; (3) corporate experience; (4) transition plan; and (5) cybersecurity. Id. at 4. For the first three subfactors, the evaluators would assign both a technical rating and a technical risk rating, which were of equal importance. Id. at 4-5. The possible technical ratings were: outstanding, good, acceptable, marginal, or unacceptable. Id. at 5. The possible technical risk ratings were: low, moderate, high, or unacceptable. Id. at 5-6. For the fourth and fifth subfactors, the agency would evaluate proposals on an acceptable/unacceptable basis, and would not assign a technical risk rating. Id. at 6.

With respect to price, the solicitation provided that the agency would evaluate for completeness, reasonableness, and balance. RFP § M at 9. The solicitation required offerors to submit a completed total evaluated price matrix, and explained that offerors’ total evaluated prices would be considered by the source selection authority (SSA) as part of the award decision. Id. In addition to their total evaluated price matrices, offerors were required to submit a professional employee compensation plan meeting the requirements of FAR provision 52.222-46 (Evaluation of Compensation for Professional Employees), which the solicitation incorporated by reference. Id. at 10; AR, Tab 3d, RFP attach. 4, RFP § L‑‑Instructions, Conditions, and Notices to Offerors or Respondents at 13-14; RFP at 65.

The agency received eight timely proposals, including those submitted by SI, TTS, and OMG. SI-COS at 6; TTS-COS at 6. After evaluating initial proposals, the agency established a competitive range, which included SI, TTS, and OMG, and requested final proposal revisions. Id. at 6-7. Based on the evaluation of final proposals, the agency selected TTS’s proposal for award. Id. at 7. After being notified of the award decision, both SI and OMG filed protests with our Office. Id. In response to the protests, the agency notified us of its intent to take corrective action by terminating the award to TTS, reevaluating proposals, and making a new source selection decision, resulting in our dismissal of the protests as academic. OM Group, Inc., B-418963, Aug. 25, 2020; Systems Implementers, Inc., B-418963.2, B‑418963.3, Aug. 27, 2020 (unpublished decisions). During the agency’s implementation of its corrective action, SI filed a second protest with our Office challenging the adequacy of discussions and limitation of initial proposal revisions in connection with the agency’s ongoing source selection process, which we dismissed as premature. Systems Implementers, Inc., B‑418963.4, Apr. 19, 2021, 2021 CPD ¶ 174.

As part of its corrective action, the agency reopened discussions with, and requested new final proposal revisions from, the offerors in the competitive range. SI-COS at 7; TTS-COS at 7; SI-AR, Tab 8, Source Selection Evaluation Board Final Report (SSEB Rpt.) at 1; TTS-AR, Tab 8, SSEB Rpt. at 1. The evaluators assigned the final proposals submitted by SI, TTS, and OMG the following ratings:

 

                                        Systems          Transcend
                                        Implementers     Technological      OM Group
                                        (SI)             Systems (TTS)      (OMG)

Scenario 1--Onboarding and              Acceptable       Acceptable         Acceptable
Technology Capabilities                 Low Risk         Low Risk           Low Risk

Scenario 2--Program and                 Acceptable       Good               Good
Configuration Management                Low Risk         Low Risk           Low Risk

Corporate Experience                    Good             Good               Good
                                        Low Risk         Low Risk           Low Risk

Transition Plan                         Acceptable       Acceptable         Acceptable

Cybersecurity                           Acceptable       Acceptable         Acceptable

Total Evaluated Price                   $224,274,134     $194,776,490       $195,171,764

 

SI-AR, Tab 9, Comparative Analysis Report (Comp. Rpt.) at 16, 18; TTS-AR, Tab 9, Comp. Rpt. at 16, 18.

Based on the evaluators’ findings and a comparative assessment of proposals, the SSA “determined that the proposal submitted by [OMG] offers the best overall value” for the agency’s HEDC requirement. SI-AR, Tab 10, Source Selection Decision Document (SSDD) at 1; TTS-AR, Tab 10, SSDD at 1. Specifically, the SSA found that the proposals submitted by SI and a fourth offeror, while “awardable, affordable and executable,” were higher priced and not as highly rated as the proposals submitted by OMG and TTS, and, thus, “it was not in the best interest of the Government to select” SI’s or the fourth offeror’s proposals for award. Id. at 20.

As between the two more highly rated proposals submitted by OMG and TTS, the SSA acknowledged that OMG’s proposal was assessed two weaknesses under technical subfactor 1, scenario 1--onboarding and technology capabilities, while TTS’s proposal was not assessed any weaknesses, but noted that OMG’s weaknesses were considered minor and easily correctable. Id. at 20-21. The SSA concluded that the two proposals offered different, but relatively equal, strengths under technical subfactor 2, scenario 2--program and configuration management; and were both rated acceptable under subfactors 4, transition plan; and 5, cybersecurity. Id. at 21. Under technical subfactor 3, corporate experience, however, the SSA found that OMG’s “strength in corporate experience in research and development with data center management provide[d] additional value” that set OMG’s proposal apart and warranted payment of OMG’s approximately 0.2 percent price premium over TTS’s lower-priced proposal. Id. at 21-22.
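For context, the approximately 0.2 percent figure follows directly from the total evaluated prices shown above: ($195,171,764 − $194,776,490) ÷ $194,776,490 ≈ 0.2 percent.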

After being notified of the agency’s selection of OMG for award and receiving a debriefing, SI and TTS filed these protests with our Office. SI-COS at 9; TTS-COS at 9; see also AR, Tab 14, SI Unsuccessful Offeror Letter; Tab 15, TTS Unsuccessful Offeror Letter; Tab 16, SI Debriefing; Tab 17, SI Debriefing Q&A; Tab 18, TTS Debriefing; Tab 19, TTS Debriefing Q&A.

DISCUSSION

SI and TTS challenge various aspects of the agency’s technical and price evaluations, as well as the resulting source selection decision. Although we do not address each of the protesters’ many arguments, we have considered all of the firms’ contentions and find that none provides a basis to sustain the protests. Below, we address a sampling of SI’s and TTS’s allegations related to: (1) evaluation of the awardee’s proposal; (2) non‑assessment of strengths in the protesters’ proposals; (3) disparate evaluation; (4) consideration of offerors’ professional employee compensation plans; and (5) the best-value tradeoff.

Awardee’s Evaluation

Both protesters challenge the agency’s evaluation of the awardee’s proposal. SI and TTS argue that the agency unreasonably ignored weaknesses assessed in the awardee’s proposal under technical subfactor 1, scenario 1--onboarding and technology capabilities, which, they maintain, should have resulted in the assignment of a higher risk rating. SI Protest at 30-32; SI Comments & 2nd Supp. Protest at 15-22; SI Supp. Comments at 12-17; TTS Protest at 26-28; TTS Comments & Supp. Protest at 6-8; TTS Supp. Comments at 5‑7.[2] In addition, SI contends that the agency assessed strengths under technical subfactor 2, scenario 2--program and configuration management, and subfactor 3, corporate experience, that are not supported by the content of the awardee’s proposal. SI Protest at 33‑39, 41-52; SI Comments & 2nd Supp. Protest at 25-30, 34-42, 44-45; SI Supp. Comments at 20‑26, 28-40. The agency maintains that it evaluated the awardee’s proposal reasonably and in a manner consistent with the solicitation. SI-COS at 24-34, 36-45, 47-48; TTS-COS at 11-22; SI-Supp. COS at 17‑21, 23-34, 36-45, 47-48; TTS‑Supp. COS at 4-7; SI‑Memorandum of Law (MOL) at 19-22, 23-27, 30-34; TTS-MOL at 9-15; SI-Supp. MOL at 14-34; TTS-Supp. MOL at 5-7. Based on the record before us, we agree. As a representative example of the protesters’ challenges, we discuss the agency’s evaluation of the awardee’s proposal under technical subfactor 1--scenario 1, onboarding and technology capabilities.[3]

Under technical subfactor 1 the solicitation required offerors to “propose a detailed solution to Scenario 1” using “the technologies and skills identified in PWS sections 3.7 through 3.19.” RFP § L at 10. Scenario 1 set forth a situation where a customer wanted “to relocate the hosting of a system [SystemX] into the [HEDC] while modernizing and consolidating the operating system, database and application interface components while conforming to Air Force security protocols.” SI-COS at 11; RFP § L at 16. The scenario parameters indicated that SystemX comprised 15 terabytes of data. Id. Further, the customer in scenario 1 required “a Continuity of Operations (COOP) solution in a Portable Operating Datacenter (POD) design to support SystemX with fail‑over procedures and a Disaster Recovery (DR) plan.” Id. The solicitation provided that the agency’s evaluation would “focus on the degree to which the offeror demonstrates a comprehensive and in-depth approach to the scenario requirements.” RFP § M at 6. As relevant here, the solicitation defined a technical risk rating of low as indicating that a “[p]roposal may contain weakness(es) which have little potential to cause disruption of schedule, increased cost or degradation of performance,” and that “[n]ormal contractor effort and normal Government monitoring [would] likely be able to overcome any difficulties.” Id. at 5.

The record reflects that the evaluators assessed two weaknesses in OMG’s final proposal. SI-AR, Tab 8, SSEB Rpt. at 3; TTS-AR, Tab 8, SSEB Rpt. at 3. The first weakness was initially assigned as a deficiency because OMG’s initial proposal “did not define any hardware or software specifications and quantities required for the COOP POD design for Scenario 1.” SI‑COS at 21; TTS-COS at 11. The agency raised this deficiency with OMG during discussions, and OMG revised its proposal to include virtualization specifications and hardware listings for the COOP/POD element of scenario 1. SI-AR, Tab 8, SSEB Rpt. at 3; TTS-AR, Tab 8, SSEB Rpt. at 3. The evaluators considered OMG’s revised hardware specification to be compliant with the PWS and found that it provided “storage, network, and compute capacity” sufficient to satisfy the requirements of scenario 1. Id. The evaluators noted, however, that OMG’s revised hardware listing did not include ancillary items such as “server rack(s), cabling, and power components.” Id. Overall, the evaluators found that OMG’s final proposal demonstrated “an in-depth approach, as the Hardware Bill of Material supports the proposed solution overview areas of hardware, software, network, storage, and virtualization requirements in terms of storage, network and compute capabilities for each environment and tier, but lacks in demonstrating a comprehensive approach by missing the foundational aspects of housing, cabling, and power for the COOP/POD.” Id. at 3-4. Accordingly, the evaluators revised the initial assessment of OMG’s proposal from a deficiency to a weakness, concluding that this area of OMG’s final proposal “remain[ed] a weakness.” Id. at 4.

The record shows that OMG’s second weakness was initially assigned as a significant weakness because OMG’s initially proposed solution for migrating the SystemX database in scenario 1 “lacked information necessary to assess the comprehension of the approach and determine if it could successfully be completed.” SI-COS at 24; TTS‑COS at 14; AR, Tab 38, Interim SSEB Rpt. at 2. Additionally, the evaluators were concerned “that OMG’s approach to transferring 15 terabytes of data [DELETED] failed to consider network latency” or otherwise propose alternate solutions.[4] SI-COS at 24; TTS-COS at 14. The agency raised these issues with OMG during discussions, and OMG revised its proposal to include a discussion of some of the challenges with transferring such a large amount of data, and to include multiple methods of accomplishing the transfer. SI-AR, Tab 8, SSEB Rpt. at 4; TTS-AR, Tab 8, SSEB Rpt. at 4. The evaluators found that OMG’s final proposal still “was not as in‑depth as the Government would have liked” with respect to addressing risk factors. Id. As a result, they revised the initial assessment of OMG’s proposal from a significant weakness to a weakness, concluding that this aspect of OMG’s final proposal “remain[ed] a weakness.” Id.

The record reflects that the evaluators considered both of the remaining weaknesses in OMG’s final proposal to be “minor,” and “to have little potential to cause disruption of schedule, increase cost or degradation of performance.” SI-AR, Tab 8, SSEB Rpt. at 5; TTS-AR, Tab 8, SSEB Rpt. at 5. Specifically, the evaluators concluded that both weaknesses involved “items that can easily be corrected.” Id. The evaluators also found that both items related to “non‑critical aspects of contract performance.” Id. Further, they noted that there would be “sufficient time to make adjustments between the Preliminary Design Review (when the items would be identified) and the Critical Design Review processes (when the items need to be corrected),” such that “normal contractor effort and normal Government monitoring will likely be able to overcome any potential difficulties.” Id. Accordingly, the evaluators assigned OMG’s final proposal a technical risk rating of low. Id. at 2.

Both protesters contend that the agency disregarded the two weaknesses assessed in OMG’s final proposal, and unreasonably assigned it a technical risk rating of low. SI Protest at 30-32; SI Comments & 2nd Supp. Protest at 15-22; SI Supp. Comments at 12‑17; TTS Protest at 26-28; TTS Comments & Supp. Protest at 6-8; TTS Supp. Comments at 6-7. For example, TTS contends that the evaluators failed to adhere to the solicitation’s evaluation criteria in assigning OMG’s proposal a technical risk rating of low. TTS Protest at 26. Specifically, TTS maintains that “[i]nstead of assessing the likelihood that weaknesses assigned to OMG’s proposal under [subfactor 1] would degrade performance--as required by the RFP--the Agency assessed the relative importance of the performance threatened by those weaknesses.” Id. at 27. The firm contends that “[t]he RFP did not permit the Agency to discount OMG’s weaknesses based on the [agency’s] perception of whether the performance to which those weaknesses pertains is critical,” and argues that because part of the definition of a technical risk rating of low was that “any difficulties” could be overcome, the solicitation “required a determination as to whether OMG’s weaknesses threatened performance, not whether the performance threatened is more or less important.” Id.

In response, the agency contends that TTS’s “argument is belied by the plain language in the solicitation, which require[d] the Agency to consider the potential of any weaknesses to cause disruption of schedule, increased cost, or degradation of service.” TTS-MOL at 11-12. The agency maintains that inherent in the required evaluation was “a consideration of the criticality of contract aspects that may be impacted; [as] impacting a more critical contract component logically will result in a higher risk of disruption, increased cost, or degradation of service on the contract overall.” Id. at 12.

When a protester and agency disagree over the meaning of solicitation language, we will resolve the matter by reading the solicitation as a whole and in a manner that gives effect to all of its provisions. Patronus Systems, Inc., B-418784, B-418784.2, Sept. 3, 2020, 2020 CPD ¶ 291 at 5. To be reasonable, and therefore valid, an interpretation must be consistent with the solicitation when read as a whole and in a reasonable manner. Id. Here, the interpretation of the solicitation advanced by TTS is unreasonable because it fails to read as a whole the solicitation’s definition of a technical risk rating of low.

TTS’s focus on the solicitation’s use of the phrase “any difficulties” to argue that the assessment of two weaknesses in OMG’s proposal necessarily warranted assignment of a higher risk rating ignores almost the entire definition of a risk rating of low. First, TTS disregards that the definition specifically provides for the assignment of a risk rating of low to a proposal that contains weaknesses. Second, TTS ignores that under the definition, the agency was required to assess the potential of the weaknesses to impact performance as well as the ability of the weaknesses to be overcome by normal monitoring. Third, TTS discounts that the evaluators specifically concluded that the issues presented by OMG’s two weaknesses could be easily corrected with normal monitoring prior to assigning the firm’s proposal a risk rating of low, precisely as required by the solicitation. Accordingly, we find that the agency’s consideration of the likelihood of OMG’s weaknesses to impact performance, based on the criticality of the aspects of contract performance to which the weaknesses related, was a rational application of the plain language of the solicitation.

As another example, SI contends that the agency’s evaluation was “self‑contradictory” because, SI maintains, the evaluators “identified a substantive, significant, and unresolved flaw in [OMG’s] approach to Scenario 1, having to do with a tricky aspect of the scenario--the sheer quantity of data that must be transferred,” yet the flaw was then “downplayed.” SI Comments & 2nd Supp. Protest at 15-16. SI argues that there is no reasonable basis for the evaluators’ conclusion that OMG’s weakness related to transferring the 15 terabytes of data for SystemX was a minor weakness with little risk of impacting performance because transferring the SystemX data was “central to Scenario 1.” Id. at 18.

The agency explains that OMG’s final proposal included different options for the transfer of SystemX’s 15 terabytes of data, one of which could be used as “a [DELETED]” “if the transfer of the data [DELETED].” SI-COS at 25, 27. The evaluators considered OMG’s [DELETED] and [DELETED] solutions to meet the requirements of scenario 1, but assessed a remaining weakness in OMG’s final proposal because its [DELETED] solution still “could be impacted by Air Force network latency” and the explanation for the firm’s [DELETED] solution “still lacked details.” Id. at 28. The evaluators recognized, however, “that in a real world-application of the proposed solution . . . [t]hese details will be obtained during the mandatory design review stages, which occur as part of the normal contract process.” Id. Thus, the evaluators reasonably concluded that any risk associated with this weakness was low “because the proposal provided enough information, and adequate alternative methods, to demonstrate to the evaluators that OMG’s solution can address latency issues in the network if required.” Id. at 29.

In reviewing a protest challenging an agency’s evaluation, our Office will not reevaluate proposals nor substitute our judgment for that of the agency, as the evaluation of proposals is a matter within the agency’s discretion. Sterling Med. Assocs., B-418674, B-418674.2, July 23, 2020, 2020 CPD ¶ 255 at 4. Rather, we will review the record to determine whether the agency’s evaluation was reasonable and consistent with the stated evaluation criteria and with applicable procurement statutes and regulations. Id.; Arctic Slope Mission Servs. LLC, B-417244, Apr. 8, 2019, 2019 CPD ¶ 140 at 8. A protester’s disagreement with the agency’s evaluation judgments, without more, does not render those judgments unreasonable. Id.; Serco Inc., B-407797.3, B-407797.4, Nov. 8, 2013, 2013 CPD ¶ 264 at 8.

Here, SI’s contention that the agency ignored or discounted the weakness in OMG’s proposal related to the transfer of SystemX’s 15 terabytes of data is not supported by the record. Rather, the record shows that the agency considered the risk of impact to performance from the weakness, and reasonably concluded that the risk was low. The protester’s arguments to the contrary represent nothing more than its disagreement with the evaluators’ judgments, which, without more, is insufficient to establish that the agency acted unreasonably. See e.g., Serco Inc., supra at 9 (denying protest that agency failed to assess performance risk in awardee’s proposal where it was clear from the record that the agency had considered the risk associated with the awardee’s approach and concluded it was very low).

Unacknowledged Strengths in Protesters’ Evaluations

Both protesters maintain that the agency unreasonably failed to assess numerous strengths in each firm’s proposal by evaluating in a manner inconsistent with the solicitation. For its part, SI suggests that its proposal should have been assessed numerous additional strengths under technical subfactors 1, 2, and 3. SI Protest at 14‑28, 39-40; SI Comments & 2nd Supp. Protest at 3-15, 22-24, 30-31, 47-49; SI Supp. Comments at 4-12, 17-19, 27-28, 40-41. TTS focuses its arguments on subfactor 3. TTS Protest at 31-34; TTS Comments & Supp. Protest at 12-17; TTS Supp. Comments at 8‑17. The agency argues that it reasonably concluded the allegedly advantageous elements of the protesters’ proposals did not merit the assessment of strengths. SI‑COS at 10-17, 32-34, 37-38; TTS-COS at 22‑27; SI-Supp. COS at 4-16, 21-23, 34-35, 45-46, 48‑49; TTS-Supp. COS at 7-10; SI-MOL at 11-18, 27-29; TTS‑MOL at 18-21; SI-Supp. MOL at 7-13, 25-26, 34-35; TTS-Supp. MOL at 7‑16. Based on the record before us, we agree. As representative examples of the protesters’ challenges, we discuss some of SI’s contentions under technical subfactor 1, scenario 1--onboarding and technology capabilities, and some of TTS’s arguments under technical subfactor 3, corporate experience.[5]

As noted above, when reviewing a protest challenging an agency’s evaluation, we do not reevaluate proposals, nor do we substitute our judgment for that of the agency. Similarly, a protester’s disagreement with an agency’s evaluation, without more, does not provide a basis to sustain a protest. Additionally, while both protesters contend that the agency’s arguments responding to the protest constitute post hoc rationalizations not documented in the evaluation record, we note that an agency is not required to document every single aspect of its evaluation or explain why a proposal did not receive a strength for a particular feature. Sterling Med. Assocs., Inc., supra at 8; ICON Govt. and Public Health Solutions, Inc., B-419751, July 2, 2021, 2021 CPD ¶ 238 at 8. Here, we find the agency’s post-protest explanations to be credible and consistent with the contemporaneous evaluation record.

As relevant to both protesters’ arguments, we note that the solicitation defined a strength as “[a]n aspect of an offeror’s proposal that has merit or exceeds specified performance or capability requirements in a way that will be advantageous to the Government during contract performance.” RFP § M at 4.

SI Evaluation for Technical Subfactor 1--Scenario 1, Onboarding and Technology Capabilities

As noted above, under technical subfactor 1 the solicitation required offerors to “propose a detailed solution” to an onboarding and technology capabilities scenario, and provided the agency would evaluate “the degree to which the offeror demonstrates a comprehensive and in-depth approach to the scenario requirements.” RFP § L at 10; RFP § M at 6.

The record reflects that the evaluators assessed one weakness and zero strengths in SI’s proposal, and assigned a technical rating of acceptable and a risk rating of low to SI’s proposal for subfactor 1. SI-AR, Tab 8, SSEB Rpt. at 47. In describing key highlights of SI’s approach, the evaluators twice noted the firm’s “detailed” response. Id. at 47-48. The evaluators found that SI’s proposal overall included “the most detailed technical approach to accomplish the onboarding and technology capabilities,” but concluded that while it was more detailed “the approach itself [was] not more advantageous to the Government.” SI-AR, Tab 9, Comp. Rpt. at 4. The evaluators noted that “the requirements and constraints within Scenario 1 guided the offerors to the same or similar end states,” and that the other offerors’ proposals while less detailed “still provided an adequate approach to meet the scenario requirements for Subfactor 1.” Id. The SSA found that “[a]lthough the technical approaches of [OMG, SI, and TTS] for this subfactor are all very different, they all satisfactorily met the scenario requirements,” and presented “low risk” of schedule disruptions, cost increases, or “degradation of performance.” SI-AR, Tab 10, SSDD at 8. The SSA concluded that “each proposal provide[d] a similar overall value to the Government,” and that there was “little differentiation between the three offerors for this subfactor.” Id.

SI represents that “[a]s the long-time contractor for the HEDC, it holds a distinct advantage in terms of its understanding of the requirement,” and that its “superior understanding came through” in its proposal.[6] SI Protest at 14. SI contends that the agency failed to recognize SI’s superior approach to subfactor 1, and unreasonably considered all the offerors’ approaches equal because they each arrived at a similar end state. Id. at 14-15. SI argues that the agency’s assessment of whether an offeror arrived at the correct end state deviated from the solicitation’s stated evaluation criteria, which contemplated consideration of the degree of comprehensiveness of an offeror’s approach. Id. at 15. SI maintains that had the agency evaluated in a manner consistent with the solicitation, the evaluators would have assessed a strength for SI’s more detailed technical approach, which SI contends necessarily equates to a more comprehensive and in-depth approach. SI Comments & 2nd Supp. Protest at 6. Instead, SI contends that the agency “simply unreasonably wiped away an acknowledged advantage of SI’s approach” and essentially converted the evaluation under subfactor 1 from a qualitative assessment to a pass/fail review. SI Protest at 16.

The agency explains that the scenario set out in the solicitation essentially “told offerors the desired end-state, technologies, and skills,” and, thus, an offeror’s proposal was “unlikely to be rated higher simply because it propose[d] an approach that [met] the end state.” SI-COS at 12. Instead, the evaluators assessed the comprehensiveness of offerors’ approaches, and while they acknowledged that SI’s approach was more detailed, the evaluators did not view the additional detail as meriting assessment of a strength. Id. at 13; see also SI-AR, Tab 9, Comp. Rpt. at 4. The agency concluded that although “SI demonstrated a comprehensive and in depth approach sufficiently meeting the scenario requirements,” the greater level of detail provided by the firm constituted “[m]ore details about an adequate approach” but did “not make the overall approach more thorough.” SI-COS at 13‑14.

While SI may view the evaluators’ conclusion as wiping away an advantage to SI’s proposal, the record reflects that the evaluators acknowledged that SI’s proposal included a greater level of detail than other offerors, but concluded that no advantage was conferred by the extra details. SI’s disagreement with this conclusion, without more, is insufficient to demonstrate that the agency’s judgment was unreasonable. Based on the record here, we find unobjectionable the agency’s decision not to assign a strength to SI’s proposal for including more detail about an approach that the agency reasonably concluded met, but did not exceed, the solicitation’s requirements. See e.g., ICON Govt. and Public Health Solutions, Inc., supra at 8-9 (denying protester’s contention that it was “illogical and highly suspect” that the firm’s offer of “additional dedicated employees” was not viewed by the agency as advantageous and meriting assessment of a strength).

In addition to challenging the evaluation of its overall approach for subfactor 1, SI contends that the evaluators unreasonably failed to assess multiple strengths in the firm’s proposal for specific aspects of its approach. SI Protest at 16-28. For example, with respect to scenario 1’s provision that the customer would like to move “all or part [of] the SystemX into a Hybrid Cloud model,” SI argues that its proposed approach “provides significant value to the HEDC,” yet the agency “inexplicably did not assign SI a strength and did not recognize” SI’s advantage. SI Protest at 13, 17; see also RFP § L at 16. SI represents that its proposed approach is advantageous because it is “the Agency’s actual approach that is in the process of being implemented,” as “SI introduced the approach to the Agency under the incumbent contract as a Course of Action (COA), which the Agency approved.” SI Protest at 17. SI maintains that its approach “had a clear advantage over [OMG’s approach] in that it reflects the Agency’s actual cloud strategy and not an alternative strategy that would waste the Agency’s efforts to date and require essentially restarting the Agency’s cloud transformation.” Id. at 19. SI further argues that its approach should have been recognized as advantageous because, unlike other possible approaches, it does not involve “[DELETED]” and has “[DELETED].”[7] Id. at 19, 21.

The agency responds that “SI has introduced a potential hybrid cloud course of action (COA) on the current contract,” but that “this COA has, to-date not been approved for implementation.” SI-COS at 14. The agency explains that because it “is not committed to the approach introduced by SI” under its incumbent contract “[a] solution that purports to build on that approach will not necessarily reduce risk and cost to the Government, improve schedule performance, or be more advantageous or compatible than another acceptable solution.” Id. at 15. The Air Force also notes that, because the agency’s “cloud strategies continue to evolve,” the solicitation specifically did not limit offerors to a certain cloud approach. Id. Instead, the solicitation provided that the successful offeror “will be required to support ‘hosted cloud environments, designated cloud environments, or a hybrid of both hosted and designated cloud environments when [a COA] is approved.’” Id. at 14; citing PWS at 6.

SI’s view that its cloud approach is necessarily the best approach because the agency is considering it as a path forward under the incumbent contract represents no more than SI’s disagreement with the agency’s evaluation judgments, which, without more, provides no basis to sustain the protest. Additionally, to the extent that SI argues that it should have received more credit due to its incumbency, we have consistently found, “a protester’s apparent belief that its incumbency status entitled it to higher ratings or dispositive consideration provides no basis for finding an evaluation unreasonable.” ICON Govt. and Public Health Solutions, Inc., supra at 8 n.4; CACI-WGI, Inc., B‑408520.2, Dec. 16, 2013, 2013 CPD ¶ 293 at 12.

TTS Evaluation for Technical Subfactor 3--Corporate Experience

Under technical subfactor 3 the solicitation required offerors to “provide examples of their overall corporate experience within the last 5 years” in the following areas: (1) management of a data center; (2) onboarding of legacy applications using various methodologies; (3) the use of a variety of technologies; (4) risk management; and (5) cybersecurity. RFP § L at 11-12. The solicitation provided that the agency’s evaluation would “focus on the degree to which the offeror’s examples of corporate experience within the last 5 years demonstrates depth and breadth of experience” in each of the areas. RFP § M at 8. As relevant here, for experience in the area of management of a data center, the solicitation: (1) required offerors to indicate the tier of the data center managed; and (2) established a minimum acceptability level of a tier I data center. RFP § L at 11; RFP § M at 8. Also for data center management, offerors were required to indicate whether the managed center had “multi-tenancy characteristics of applications and infrastructure.” Id.

The agency explains that a data center’s tier level “is a standard that specifically identifies the buildings’ dedicated space for IT systems, an uninterrupted power supply (to filter power spikes, sags, and momentary outages), dedicated cooling equipment and a generator (to protect IT functions from extended power outages).” TTS-COS at 24. A “certified Tier I data center will provide all of these characteristics/abilities,” while “[a] Tier II data center will include additional redundant power and cooling components that provide a greater level of safety from power and cooling disruptions and provide a higher percent of uptime than a Tier I” data center. Id. Similarly, data centers at the tier III and IV levels include even more redundancies in power and cooling components that further insulate the data centers’ operations from disruption. Id. at 24-25. Additionally, the agency explains that “[m]ulti-tenancy characteristics of applications and infrastructure (hardware, software, storage, virtualization) refers to architecture that allows a single instance of software running on a server to serve multiple tenants, or group of users sharing common access and designated privileges to the software instance, at the same time.” Id. at 25.

The record reflects that the evaluators assigned TTS’s proposal a technical rating of good and a risk rating of low for subfactor 3. TTS-AR, Tab 8, SSEB Rpt. at 74. The evaluators assessed zero weaknesses and two strengths in TTS’s proposal--one strength was assessed under the technologies element of the subfactor for TTS’s experience with cloud migrations and one was assessed under the cybersecurity element of the subfactor for TTS’s experience in that area. Id. at 75-76. Additionally, the record shows that in noting “key highlights” of TTS’s proposal, the evaluators acknowledged that the firm demonstrated experience managing tier III and IV data centers as well as data centers with multi-tenancy characteristics. Id. at 74.

TTS contends that the evaluators should have assessed multiple additional strengths in the firm’s proposal, and that it should have been assigned the highest technical rating of outstanding. TTS Protest at 31-33; TTS Comments & Supp. Protest at 12-17, 27-31; TTS Supp. Comments at 8-17, 19-20. For example, TTS maintains that its proposal “merited the assignment of two strengths for significantly exceeding the [agency’s] minimal requirements for experience in management of a data center.” TTS Protest at 32. Specifically, TTS argues that its proposal merited assessment of a strength because it demonstrated experience with tier III and IV data centers that exceeded the solicitation’s minimum acceptability requirement of experience with a tier I data center. Id. at 33. Similarly, TTS represents that its proposal “demonstrated the depth and breadth of its experience managing complex, highly secure, multi-tenant data center workloads” that exceeded the solicitation’s minimum requirements. Id. TTS maintains that the solicitation’s provision that the agency’s evaluation would focus on offerors’ “depth and breadth of experience” meant that experience beyond the minimum acceptable levels required assessment of a strength without the agency having to find the experience would provide any particular advantage or benefit because the additional experience “was itself to be considered advantageous to the Government.” TTS Comments & Supp. Protest at 15.

In response, the agency points out that the solicitation defined a strength not just as an aspect of a proposal that exceeded the solicitation requirements, but as an aspect that exceeded requirements “in a way that will be advantageous to the Government during contract performance.” TTS-Supp. COS at 7, citing RFP § M at 4. As reflected in the record, the agency acknowledged that TTS had experience with data centers above the tier I level. The agency explains that because tier level primarily describes the physical infrastructure of a data center “the tier level is not indicative of the complexity of the requirement,” and “[e]xperience in a higher tiered data center alone is not advantageous to the Government.” TTS-Supp. COS at 7-8; TTS COS at 25. The agency further explains that it included in the solicitation a minimum acceptability level of a tier I data center “to ensure offerors had experience working within an actual data center” rather than experience managing “just a few servers.” TTS-COS at 25. As set forth in the solicitation, once the minimum requirement of managing a tier I or higher data center was met, the agency’s evaluation focused on the depth and breadth of an offeror’s experience managing the data center, not on the tier level of the data center being managed. Id. at 24. With respect to TTS’s experience, the record shows the evaluators noted the firm’s experience included managing tier III and IV data centers, but the evaluators did not find that the actual management aspects of TTS’s experience exceeded the requirements in an advantageous way. Id. at 26.

Similarly, the agency contends that the evaluators acknowledged TTS’s experience with multi-tenancy in the data centers it managed, but “[s]imply having ‘depth and breadth’ of experience does not warrant a strength, as that term is defined in the solicitation.” TTS‑Supp. COS at 9. The agency maintains there was nothing about TTS’s experience with multi-tenancy that was “indicative of any specialized or innovative approaches in the management of data centers that would be advantageous to the Government during contract performance.” TTS-COS at 26. Based on the record before us, we have no basis to object to the evaluators’ judgment that while TTS had experience beyond the minimum level designated as acceptable by the solicitation, it did not exceed the requirements in an advantageous way.

While TTS may view its experience as an advantage, such disagreement with the agency’s assessment, without more, is insufficient to demonstrate that the agency’s subjective judgment is unreasonable. See e.g., Arctic Slope Mission Servs. LLC, supra at 10 (denying protester’s contention that its recruitment and retention programs provided an advantage when the evaluators concluded that “[w]hile the [protester’s] proposal contains positive attributes, no distinct aspects of the proposal rose to the level of a strength as defined in Section M” of the solicitation).

Disparate Treatment

Both SI and TTS argue that the agency evaluated proposals disparately by assessing various strengths in the awardee’s proposal but not in the protesters’ proposals for the same or similar elements under technical subfactor 2, scenario 2--program and configuration management, and subfactor 3, corporate experience. SI Protest at 52-60; SI Comments & 2nd Supp. Protest at 42-44, 46; SI Supp. Comments at 26, 37-38, 39-40; TTS Protest at 34‑39; TTS Comments & Supp. Protest at 17-31; TTS Supp. Comments at 17‑22. The agency contends that it evaluated proposals in an equal manner, and that any differences in evaluations stemmed from differences in the proposals. SI-COS at 38-44; TTS-COS at 28-35; TTS-Supp. COS at 10-33; SI‑MOL at 34-39; TTS-MOL at 21-27; TTS‑Supp. MOL at 16‑18. Based on the record before us, we agree. As representative examples, we discuss TTS’s challenge related to evaluation of the program management plan element of technical subfactor 2 and SI’s challenge related to evaluation of the data center management element of technical subfactor 3.[8]

It is a fundamental principle of federal procurement law that a contracting agency must even-handedly evaluate proposals against common requirements and evaluation criteria. Battelle Memorial Inst., B-418047.5, B-418047.6, Nov. 18, 2020, 2020 CPD ¶ 369 at 6. Agencies properly may assign dissimilar proposals different evaluation ratings, however. IndraSoft, Inc., B-414026, B-414026.2, Jan. 23, 2017, 2017 CPD ¶ 30 at 10; Paragon Sys., Inc.; SecTek, Inc., B-409066.2, B-409066.3, June 4, 2014, 2014 CPD ¶ 169 at 8-9. Accordingly, to prevail on an allegation of disparate treatment, a protester must show that the agency unreasonably evaluated its proposal in a different manner than another proposal that was substantively indistinguishable or nearly identical. Battelle Memorial Inst., supra at 6.

Technical Subfactor 2--Scenario 2, Program and Configuration Management

Under technical subfactor 2 the solicitation required offerors to “propose a detailed solution to Scenario 2” that provided a program management plan, “configuration baseline recommendations that detail[ed] how modernization” would affect various aspects of system operations, and a surge planning staffing approach. RFP § L at 11. Scenario 2 posited a program team that “just received an unexpected 90-day corporate‑level, high-priority effort in the modernization area,” which the contractor was required to accomplish “without adverse impacts to existing schedules and service levels in the areas of sustainment, modernization, onboarding and cybersecurity.” Id. at 17. The solicitation provided that the agency’s evaluation would “focus on the degree to which the offeror demonstrates a comprehensive and in-depth approach to the scenario requirements.” RFP § M at 7.

The record reflects that the evaluators assessed a strength in OMG’s proposal because it demonstrated “a very in-depth and comprehensive approach and exceeded the requirements in key areas of Element 1 Subfactor 2 (Program Management Plan).” TTS-AR, Tab 8, SSEB Rpt. at 6. Specifically, the evaluators found that OMG’s program management plan included: (1) an “in-depth integrated approach defined for on-time delivery of surge modernization with no impact to the current workload”; (2) the use of “[DELETED]”; (3) “a good description of [DELETED] and [DELETED]”; (4) “[DELETED]”; (5) a good understanding of a well demonstrated [DELETED]; (6) “demonstrated service delivery to meet Service Level Agreements”; and (7) “an [DELETED] on all areas of Scenario 2.” Id. The evaluators concluded that OMG’s program management plan exceeded the solicitation requirements in an advantageous way because the plan demonstrated that OMG would “shorten timelines for onboarding, sustainment and consolidation of customer systems and applications in the HEDC with their comprehensive process and achieve program compliancy within a shorter timeframe.” Id. at 6-7.

TTS argues that the agency “unreasonably assigned OMG a strength for its program management plan, while failing to assign TTS an equivalent strength, despite the fact that TTS’s program management plan, which the Agency itself credited, offered an equivalent, if not superior approach to OMG.” TTS Protest at 34. In support of its argument, TTS points to the SSA’s finding that “[b]oth [OMG and TTS] demonstrated a thorough approach and understanding in this subfactor, demonstrating a comprehensive and in-depth approach to the scenario [2] requirements (in accordance with Section M, 2.6.2) and each providing strengths in Program and Configuration Management.” Id. at 35, citing TTS-AR, Tab 10, SSDD at 8. TTS maintains that “[t]he Agency’s decision to assign only OMG a strength for proposing a ‘very in-depth and comprehensive’ program management approach, despite expressly acknowledging that TTS also offered a ‘comprehensive and in-depth approach,’ demonstrates that the Agency applied a less demanding standard to OMG’s proposal in its evaluation than it applied to TTS’s proposal.” TTS Protest at 36.

TTS’s contention is not supported by the record. Rather, the record reflects that the statements relied upon by TTS are excerpts taken out of context from the record as a whole. Specifically, the record shows that the evaluators assessed two strengths under subfactor 2 in the proposals of both TTS and OMG and assigned both proposals technical ratings of good and technical risk ratings of low. TTS-AR, Tab 9, Comp. Rpt. at 5. In comparing the two proposals, the evaluators noted that both offerors demonstrated a thorough, comprehensive, and in-depth approach under subfactor 2. Id. at 7; TTS-AR, Tab 10, SSDD at 8. Contrary to TTS’s assertions, however, OMG’s proposal was not assessed a strength for the general thoroughness of its approach for the entire subfactor. Rather, OMG’s proposal was assessed a strength for its “very in‑depth and comprehensive” program management plan, which was a specific element of the subfactor. TTS-AR, Tab 8, SSEB Rpt. at 6; Tab 9, Comp. Rpt. at 6. Reading the contemporaneous evaluation and source selection record as a whole, we find wholly unavailing TTS’s protestation that its proposal was held to a higher standard than OMG’s proposal.

TTS further contends that its proposal included specific aspects that were the same as those credited in OMG’s proposal. Specifically, TTS maintains that its proposal also demonstrated how its program management plan would “streamline and shorten the key HEDC timelines using an agile program management framework,” yet only OMG’s proposal was assessed a strength for shortening timelines. TTS Protest at 36. Additionally, TTS represents that it “proposed each of the seven elements of OMG’s program management plan and approach that earned OMG, but not TTS, a strength.” TTS Comments & Supp. Protest at 18-27.

The agency explains that the evaluators determined OMG’s proposal exceeded the solicitation requirements in an advantageous way because the firm’s “program management plan will shorten the timeframe for onboarding, sustainment and consolidation of customer systems and applications in the HEDC, and achieve program complianc[e] within a shorter timeframe.” TTS-COS at 29, citing TTS-AR, Tab 8, SSEB Rpt. at 6-7. Conversely, the agency maintains, “TTS’s program management plan simply included sufficient information to meet the program management plan requirements in accordance with the solicitation.” TTS-COS at 29. The agency also notes that some of the aspects of its proposal that TTS alleges the agency failed to credit were recognized by the evaluators in the two strengths assessed in TTS’s proposal under subfactor 2--one for configuration management baseline recommendations related to cybersecurity, and one for baseline recommendations related to service levels. Id. at 30, 32.

In support of their respective arguments, both TTS and the agency included in their submissions to our Office lengthy comparisons of different aspects of TTS’s and OMG’s program management plans. For instance, TTS maintains that its proposal, like OMG’s proposal, “demonstrated using [DELETED].” TTS Comments & Supp. Protest at 21-23, citing TTS-AR, Tab 8, SSEB Rpt. at 7; comparing Tab 6f, TTS Final Technical Proposal at 45-46 to Tab 7l, OMG Final Technical Proposal at 40-41. The agency, in response, highlights a number of differences between the proposals of OMG and TTS that contributed to the evaluators’ assessments. One of the differences was OMG’s identification of “a unique [DELETED]” which OMG proposed to use “in conjunction with its [DELETED] to better manage risk, utilize resources effectively with minimal waste of time and effort, and iterate constant improvement through [DELETED].” TTS-Supp. COS at 14-18, comparing TTS-AR, Tab 6f, TTS Final Technical Proposal at 43-47 to Tab 7l, OMG Final Technical Proposal at 16, 37, 39-40, 45, 58-59. The agency explains that TTS’s proposal included “little information to quantify and substantiate their claims,” while OMG’s proposal provided “a measurable example showing how it successfully used its [DELETED] in conjunction with its [DELETED] to handle surge requirements without increasing the timeframe.” TTS-Supp. COS at 16, 18.

Based on our review of the record, we find no basis to object to the agency’s evaluation. While the differences between TTS’s and OMG’s explanations of their use of [DELETED] processes are not stark, the proposals were neither nearly identical nor substantively indistinguishable. Similarly, while the evaluators recognized that both TTS and OMG proposed comprehensive approaches to subfactor 2, we have no basis to question the evaluators’ conclusion that the two proposals merited assessment of strengths for different aspects of the firms’ approaches, and that the areas in which OMG received strengths and TTS did not were attributable to differences in the proposals. See, e.g., American Systems Corp., B-420132 et al., Dec. 13, 2021, 2021 CPD ¶ 387 at 11 (denying protest alleging disparate evaluation where the agency’s explanation of the different evaluation results was nuanced but not unreasonable); Candor Solutions, LLC, B-417950.5, B-417950.6, May 10, 2021, 2021 CPD ¶ 199 at 7 (denying protest alleging disparate evaluation of offerors’ proposed approaches to candidate pipelines when the differences in the firms’ proposals were not stark, but the proposals were not exactly the same or substantially similar).

Technical Subfactor 3--Corporate Experience

As noted above, under technical subfactor 3 the solicitation required offerors to “provide examples of their overall corporate experience within the last 5 years in” the following areas: (1) management of a data center; (2) onboarding of legacy applications using various methodologies; (3) the use of a variety of technologies; (4) risk management; and (5) cybersecurity. RFP § L at 11-12. The solicitation provided that the agency’s evaluation would “focus on the degree to which the offeror’s examples of corporate experience within the last 5 years demonstrates depth and breadth of experience” in each of the areas. RFP § M at 8.

The record reflects that the evaluators assessed three strengths in OMG’s proposal under subfactor 3, assigned it a technical rating of good, and a technical risk rating of low. SI-AR, Tab 8, SSEB Rpt. at 8-9. Specifically, the evaluators assessed strengths in OMG’s proposal for its corporate experience with: (1) research and development related to management of a data center; (2) use of technologies related to cloud migration; and (3) cyber risk management. Id. The evaluators assessed two strengths in SI’s proposal for its experience with management of a data center and onboarding. Id. at 53. SI maintains that its proposal also merited assessment of strengths related to cloud migration and risk management. SI Protest at 52-59; SI Comments & 2nd Supp. Protest at 42-44, 46; SI Supp. Comments at 37-38, 39-40.

For instance, SI argues that “it was unreasonable and unequal” for the evaluators to assign a strength for experience with cloud migration technology to OMG and not SI because “SI’s proposal described its experience with cloud migration within the context of its hybrid cloud course of action implanted at the HEDC.” SI Comments & 2nd Supp. Protest at 42-43, citing SI-AR, Tab 5j, SI Final Technical Proposal at 128-129, 134, 136. As discussed above, the hybrid cloud solution SI included in its proposal is an approach that SI proposed under the incumbent contract--an approach which has not received final approval by the agency and therefore has not been implemented yet by SI. Thus, the agency maintains, none of the examples in SI’s proposal “actually demonstrate cloud migration experience” meriting assessment of a strength. SI-Supp. MOL at 32. In contrast, the evaluators found advantageous OMG’s demonstrated experience “for its innovative use of technology to [DELETED].” Id. at 31, citing AR, Tab 7l, OMG Final Technical Proposal at 58; SI-AR Tab 8, SSEB Rpt. at 10. Based on this record, we find no merit in SI’s contention that the agency’s assessment of a strength related to cloud migration in OMG’s proposal but not in SI’s proposal was the result of disparate evaluation.

As a further example, SI contends that it was unreasonable for the agency to assess a strength related to cyber risk management in OMG’s proposal but not in SI’s proposal. SI Comments & 2nd Supp. Protest at 46. SI maintains that the agency assessed the strength on the basis of OMG’s Capability Maturity Model Integration (CMMI) certification, which SI also demonstrated in its proposal. Id., citing AR, Tab 5j, SI’s Final Technical Proposal at 135. In response, the agency notes that the strength assessed by the evaluators in OMG’s proposal was specific to the firm’s proposal “showing CMMI Level 3 certification,” while SI’s proposal references being “a CMMI certified org” without indicating the level of certification. SI-Supp. MOL at 34, citing SI-AR, Tab 8, SSEB Rpt. at 10; Tab 5j, SI’s Final Technical Proposal at 135. Contrary to SI’s contentions, based on the record here, we find that the difference in evaluations reasonably resulted from differences in the proposals. See e.g., Battelle Memorial Inst., supra at 7 (denying allegations of disparate treatment where the record showed that the differences in evaluations reasonably resulted from differences in the proposals).

Price Evaluation--Professional Employee Compensation Plans

In addition to challenging the agency’s technical evaluation, SI contends that the agency conducted an unreasonable evaluation of offerors’ professional employee compensation plans, and unreasonably concluded that the awardee’s plan will enable OMG to recruit and retain qualified employees. Specifically, SI argues that the agency unreasonably compared offerors’ proposed labor rates to an unrealistic agency‑created baseline, rather than solely to the compensation rates paid on SI’s incumbent contract.[9] SI Protest at 60, 64-67; SI Comments & 2nd Supp. Protest at 50-55; SI Supp. Comments at 41-45. The agency represents that it did compare offerors’ rates to the incumbent contract rates, but out of concern about the reliability of the information about the incumbent rates available to the agency, the Air Force also compared offerors’ rates to a baseline rate created by using market research. SI‑COS at 44‑51; SI‑Supp. COS at 49-54; SI‑MOL at 40‑47; SI-Supp. MOL at 36-39. In light of the record before us, we find the agency’s evaluation reasonable.

As noted above, the solicitation included FAR provision 52.222-46, Evaluation of Compensation for Professional Employees. RFP at 65. The purpose of FAR provision 52.222-46 is to evaluate whether offerors will obtain and keep the quality of professional services needed for adequate contract performance, and to evaluate whether offerors understand the nature of the work to be performed. MicroTechnologies, LLC, B‑413091.4, Feb. 3, 2017, 2017 CPD ¶ 48 at 8. In the context of a fixed-price contract, such as the one contemplated by the solicitation here, our Office has noted that this FAR provision anticipates an evaluation of whether an offeror understands the contract’s requirements, and has offered a compensation plan appropriate for those requirements--in effect, a price realism evaluation regarding an offeror’s proposed compensation. Obsidian Solutions Group, LLC, B-416343, B-416343.3, Aug. 8, 2018, 2018 CPD ¶ 274 at 7. The depth of an agency’s price realism analysis is a matter within the sound exercise of the agency’s discretion. Id. In reviewing protests challenging price realism evaluations, our focus is on whether the agency acted reasonably and in a manner consistent with the solicitation’s requirements. Id.; MicroTechnologies, LLC, supra at 7.

Here, the solicitation required offerors to submit professional employee compensation plans, and provided that, “[i]n accordance with FAR [provision] 52.222-46,” the agency would evaluate an offeror’s plan to ensure “it reflect[ed] a sound management approach and understanding of the contract requirements and allows the Offeror to obtain and keep suitably qualified personnel to meet mission objectives.” RFP § M at 10; see also RFP § L at 13-14. Further, the agency would evaluate and compare “against current standards” an offeror’s “[b]ase labor rates, fringe, and overall benefits.” RFP § M at 10.

The record reflects that the agency first compared each offeror’s direct labor rates to those on the current IDIQ contract (i.e., SI’s incumbent contract) for 13 of the 15 labor categories included in the solicitation. SI-AR, Tab 8, SSEB Rpt. at 16; Tab 8a, SSEB Rpt. attach. A at 6‑10. The remaining two labor categories are new positions that do not have corresponding labor rates for comparison under the incumbent contract. SI-AR, Tab 8, SSEB Rpt. at 16. The incumbent contract rates used by the agency for comparison were the direct labor rates proposed by SI in September 2018, prior to its receipt of the contract on a sole-source basis. Id.; SI-COS at 45. SI’s 2018 proposed direct labor rates were not incorporated into the incumbent contract, however; rather, only SI’s fully burdened labor rates were made a part of the contract. Id. During the course of performance of task orders under its incumbent contract, SI has invoiced at the fully burdened rates incorporated in the contract, and its invoices have not included a breakdown of the fully burdened rates or any indication of what direct labor or fringe rates SI is paying its incumbent employees. Id.

After comparing offerors’ direct labor rates to those proposed by SI in 2018, the agency found “large variations between the offerors’ rates and the rates on the current contract,” with every offeror having proposed some rates below the current contract rates. SI-AR, Tab 8, SSEB Rpt. at 16. Additionally, the agency “discovered there were large differences in labor rates between the incumbent’s rates on the current contract and the incumbent’s proposed rates.” Id. Specifically, in responding to the solicitation at issue here, SI proposed direct labor rates that ranged from 5 percent above to 32 percent below the direct labor rates it proposed prior to receipt of the incumbent contract, with 12 of the 13 comparable rates being below SI’s 2018 proposed rates. Id. SI’s proposed total employee compensation rates also varied from its 2018 proposed rates, with 7 of its 13 currently proposed rates ranging from 13 percent above to 27 percent below the 2018 rates. Id. Based on the comparison results, the agency “became concerned that the current contract rates did not reflect the amount actually being paid to the incumbent contractor’s employees.” Id.

During discussions, the agency considered its concern confirmed by SI’s explanation that the firm’s proposed rates for the solicitation at issue here “would not require its [DELETED]” and that the rates were “based off the [DELETED].” SI-AR, Tab 8, SSEB Rpt. at 16, citing AR, Tab 5d, Evaluation Notice (EN) EN‑SI-CP-0004 at 5; Tab 5h, EN‑SI-CP-006 at 4. Having learned that the direct labor rates proposed by SI in 2018 for the incumbent contract were not reflective of the compensation currently being paid to the incumbent employees, the agency “determined that the incumbent contract rates would not, alone, be a reliable measure against which to compare offerors’ proposed rates.” SI-AR, Tab 8, SSEB Rpt. at 16.

The record shows that the agency then conducted a second comparative analysis of offerors’ proposed rates using an agency-created professional employee compensation plan baseline (PECP baseline). SI-AR, Tab 8, SSEB Rpt. at 17; Tab 8a, SSEB Rpt. attach. A at 10-11. The agency created its PECP baseline using market survey data of median base salaries and total employee compensation received by employees with the requisite experience working in the solicitation’s 15 labor categories in the geographic area where the HEDC is located. SI-AR, Tab 8, SSEB Rpt. at 17; see also AR, Tab 8a, SSEB Rpt. attach. A at 1-5. For its market survey data, the agency used the website salary.com, which is a commercial service that tracks salary data for various labor positions across U.S. and international markets. SI-AR, Tab 8, SSEB Rpt. at 17. To help choose which survey data to use, the agency mapped the solicitation’s labor categories into an “experience level matrix,” with the majority of the positions “requiring ‘experienced’ to ‘highly experienced’ personnel representing seven or more years of experience.” SI‑COS at 48. For some of the solicitation’s labor categories, however, there were no directly comparable positions available in the market survey data. SI‑AR, Tab 8, SSEB Rpt. at 17. For these positions, the agency “used the average of all offerors’ proposed compensation along with the survey data for the position most closely aligned to the HEDC requirement.” Id.

For both SI and the awardee, OMG, the agency found some of the offerors’ initially proposed rates to be below the agency’s PECP baseline. SI-COS at 49. During discussions, the agency issued evaluation notices to both offerors requesting additional rationale for these rates. Id. In the firms’ final proposal revisions, SI made no changes to its proposed rates, while OMG increased some of its rates. Id. All of OMG’s final direct labor rates, and all but one of OMG’s final total employee compensation rates, were above the agency’s PECP baseline, and the single total employee compensation rate that fell below the baseline did so by [DELETED] per hour. Id.; SI-AR, Tab 8, SSEB Rpt. at 22.

Finally, the agency also compared offerors’ proposed rates to the incumbent’s (SI’s) proposed rates. SI-COS at 50, citing SI-AR, Tab 8a, SSEB Rpt. attach. A at 15. With respect to the 13 labor categories that have corresponding categories under the incumbent contract, OMG’s direct labor rates were higher than SI’s for [DELETED] of the 13 and were lower than SI’s for [DELETED] of the 13. SI-AR, Tab 8a, SSEB Rpt. attach. A at 15. For the [DELETED] labor categories in which OMG proposed higher rates than SI, OMG did so in amounts ranging from [DELETED] to [DELETED] percent, while for the [DELETED] labor categories in which OMG proposed rates lower than SI, OMG did so in amounts ranging from [DELETED] to [DELETED] percent. Id. Similarly, OMG’s proposed total employee compensation hourly rates were above SI’s rates for [DELETED] of the 13 labor categories (in amounts ranging from [DELETED] to [DELETED] percent), and OMG’s proposed rates were below those of SI for [DELETED] of the 13 labor categories (in amounts ranging from [DELETED] to [DELETED] percent). Id.

The evaluators acknowledged that some of OMG’s proposed rates were below those proposed by SI, but concluded it was “not clear that these lower rates would require incumbent employees currently filling the positions to take a cut in compensation.” SI‑AR, Tab 8, SSEB Rpt. at 23. The evaluators noted that even if some of OMG’s rates “would require an incumbent employee to take a pay cut, the rates are on par for what employees with the corresponding skill sets would make in the local market,” making it likely that most incumbent employees would “agree to stay on the contract at least for some amount of time.” Id. Further, based on [DELETED] of OMG’s proposed rates being above the agency’s PECP baseline--which was created with market survey data for experienced and highly-experienced personnel--the evaluators concluded that “OMG’s proposed rates represent highly competitive top of the market rates for the geographic area,” which would enable OMG to “quickly hire comparable personnel” even if it was unable to retain all the incumbent staff. Id. Based on the totality of its three comparative analyses, the evaluators concluded that OMG’s proposed rates were “representative of the competitive nature of the labor market and” were rates that would “allow the Offeror to recruit and retain professional employees that will meet the technical requirements outlined in the PWS.” Id. at 21.

The record further reflects that the agency evaluated OMG’s benefits plan and found that it provided benefits comparable to those “currently provided on the incumbent’s contract,” such as [DELETED]. SI-COS at 50. Based on these comparisons, the evaluators concluded that OMG’s professional employee compensation plan reflected an ability to maintain program continuity and provide uninterrupted, high quality work. Id. at 49; SI-AR, Tab 8, SSEB Rpt. at 24.

SI contends that the agency unreasonably concluded that OMG’s professional employee compensation plan proposed realistic rates sufficient to enable the awardee to retain and recruit qualified employees. SI Protest at 60, 64-67; SI Comments & 2nd Supp. Protest at 51-55; SI Supp. Comments at 41-45. In support of this contention, SI maintains that the salary.com survey data the agency used “did not appropriately map the skills needed or failed to account for the conditions in the relevant labor market,” and that for at least some of the positions salary.com was not a credible source. Id. Further, SI argues that it was unreasonable for the agency not to consider why some of the agency’s PECP baseline rates were “so far below” SI’s own proposed rates. SI Comments & 2nd Supp. Protest at 53.

In essence, SI is arguing that, because it is the incumbent contractor, the agency was obligated to use SI’s proposed rates as the definitive benchmark for incumbent employee compensation. As an initial matter, we note that SI’s arguments disregard some aspects of the record discussed above. First, SI ignores that the agency did compare other offerors’ proposed rates to both the rates proposed by SI under the incumbent contract and the rates proposed by SI in response to the current solicitation. Second, SI focuses on only the OMG rates that are below those of SI while overlooking the fact that the [DELETED] of the rates proposed by OMG are above those proposed by SI. Third, SI’s argument fails to account for its own rates that are below the agency’s allegedly unrealistically low PECP baseline. See SI-AR, Tab 8, SSEB Rpt. at 68 (showing that for one labor category both SI’s hourly direct labor rate and its hourly total employee compensation rate were below the agency’s PECP baseline).

Our decisions have found that where requirements are performed under an existing contract and then recompeted, such as here, FAR provision 52.222‑46(b) requires an agency to determine whether a proposal “envision[s] compensation levels lower than those of predecessor contractors” by comparing proposed compensation rates to those of the incumbent. See e.g., SURVICE Eng’g Co., LLC, B-414519, July 5, 2017, 2017 CPD ¶ 237 at 6-7. Here, the record shows that after comparing offerors’ proposed rates to the 2018 direct-labor rates proposed by SI prior to its receipt of the incumbent contract, the agency concluded that both this 2018 information and the current fully burdened labor rate information to which it had access for the incumbent contract were insufficient to conduct a meaningful comparison of offerors’ proposed compensation to the compensation of the incumbent contractor. In light of the unreliability of the information available to the agency about the incumbent contract’s compensation rates, we find reasonable the agency’s comparison of offerors’ rates both to a salary baseline created using market survey data from salary.com and to the incumbent’s proposed rates in the recompetition. See Obsidian Solutions Group, LLC, supra at 9 (“In light of the unavailability of the incumbent’s salary and fringe benefit information, we find no basis to conclude that the agency’s evaluation [was] unreasonable based upon the protester’s allegation that the agency failed to compare [the awardee’s] compensation to incumbent rates.”); Jacobs Polar Servs.--CH2M Facility Support Servs., B-418390.2 et al., June 12, 2020, 2020 CPD ¶ 195 at 9‑10 (denying protest challenging agency evaluation of professional employee compensation plans that utilized several evaluation tools, including labor rate information available on salary.com).

In sum, we find that the agency conducted a thorough evaluation of offerors’ professional employee compensation plans, and reasonably concluded that the awardee’s plan reflects a clear understanding of the work to be performed; demonstrates the ability to recruit and retain qualified personnel; and includes realistic rates for professional compensation overall.

Best-Value Tradeoff

Finally, both protesters contend that the agency’s best-value tradeoff was necessarily flawed because the underlying technical and price evaluations upon which the tradeoff decision relied were flawed. SI Protest at 70; SI Comments & 2nd Supp. Protest at 55‑56; SI Supp. Comments at 46; TTS Protest at 39; TTS Comments & Supp. Protest at 31-32, 35; TTS Supp. Comments at 20-22. The agency responds that its source selection decision was based on a reasonable underlying evaluation. SI-COS at 51-52; TTS-COS at 35‑37; SI‑Supp. COS at 54; TTS-Supp. COS at 33; SI-MOL at 47-49; TTS‑MOL at 27-31; SI‑Supp. MOL at 40-41; TTS-Supp. MOL at 19-22.

In a competitive negotiated procurement, a source selection decision must be based upon a comparative assessment of proposals against all of the solicitation’s evaluation criteria. FAR 15.308; ICON Govt. and Public Health Solutions, Inc., supra at 10. Our review of an agency’s price/technical tradeoff decision is limited to a determination of whether the tradeoff was reasonable and consistent with the solicitation’s evaluation criteria. Hyperbaric Techs., Inc., B-293047.4, Mar. 29, 2004, 2004 CPD ¶ 89 at 10.

As discussed above, we find no reason to object to the agency’s technical and price evaluations. Thus, there is no basis to question the SSA’s reliance upon those evaluations in making the source selection decision. Further, our review of the record shows that the SSA conducted and documented a comparative assessment of proposals under each of the technical subfactors, to which we have no basis to object. Moreover, with respect to SI’s challenge, we note that because the agency reasonably evaluated OMG’s proposal as higher‑rated and lower-priced than SI’s proposal, a tradeoff between the proposals of SI and OMG was not necessary. ICON Govt. and Public Health Solutions, Inc., supra at 11; Arctic Slope Mission Servs. LLC, supra at 11.

For its part, TTS additionally challenges the best-value tradeoff by arguing that the SSA unreasonably discounted the weaknesses in OMG’s proposal when comparing offerors’ risk ratings under technical subfactor 1, scenario 1--onboarding and technology capabilities. TTS Protest at 29-31, 41-42; TTS Comments & Supp. Protest at 9-12, 32‑34; TTS Supp. Comments at 7-8, 20-21. Specifically, TTS argues that the SSA unreasonably concluded that the proposals of TTS and OMG presented equivalent levels of risk based on the adjectival risk ratings of low assigned to both proposals without looking behind the ratings to compare TTS’s zero weaknesses to OMG’s two weaknesses under technical subfactor 1. TTS Protest at 29. TTS contends that “a proposal with fewer weaknesses is necessarily less risky,” and that “[t]he Agency’s refusal to adequately acknowledge, much less account for OMG’s increased risk of unsuccessful performance stemming from its two assessed weaknesses in its best value tradeoff renders the Agency’s best value determination and resultant source selection decision unreasonable.” Id. at 29, 42.

The agency contends that “[i]n putting forth this argument, TTS wholly ignores the SSA’s comparative assessment” of proposals. TTS-MOL at 28. Indeed, the record reflects that the SSA considered the two weaknesses in OMG’s proposal under subfactor 1, and concluded that both were minor weaknesses that: (1) did not impact critical aspects of performance; (2) would be “easily correctable in the Preliminary Design Review and Critical Design Review processes”; and (3) had “little potential to cause disruption of schedule, increased cost, or degradation of performance.” TTS-AR, Tab 10, SSDD at 6-7. The record also shows that the SSA recognized that TTS’s proposal did not have any weaknesses under subfactor 1. Id. at 7. In comparing the two proposals under subfactor 1, the SSA acknowledged that OMG’s proposal had two weaknesses compared to TTS’s proposal having none. Id. at 18. The SSA concluded, however, that “the two minor weaknesses identified for [OMG] provide[d] minimal differentiation between the two offerors for this subfactor and [were] considered insignificant in comparison to the strengths provided in Subfactors 2[, scenario 2--program and configuration management] and 3[, corporate experience].” Id. On this record, we find no support for TTS’s contention that the SSA failed to look behind the adjectival risk ratings assigned to TTS and OMG.

TTS further argues that the agency placed undue emphasis on a strength in OMG’s proposal under technical subfactor 3. TTS Comments & Supp. Protest at 34-35; TTS Supp. Comments at 21-22. Specifically, TTS argues that, despite the solicitation establishing all the technical subfactors as equal, the agency “downplayed the importance” of subfactors 1, 2, 4 (transition plan) and 5 (cybersecurity) as compared to subfactor 3, “essentially awarding to OMG based on a single strength” under this subfactor. TTS Comments & Supp. Protest at 34.

Here, the record shows that the SSA undertook a comparative assessment of proposals that considered the strengths and weaknesses assessed in proposals under each technical subfactor. In comparing the proposals of the two most highly-rated offerors, TTS and OMG, the SSA: (1) acknowledged the two weaknesses assessed in OMG’s proposal under subfactor 1, as compared to TTS’s proposal having no weaknesses, but considered the two weaknesses minor; (2) considered that OMG’s two strengths and TTS’s two strengths under subfactor 2, “although different,” “provide[d] relatively equal value”; (3) noted that both offerors were rated acceptable for subfactors 4 and 5; and (4) concluded that “[t]he differentiation in value to the Government is identified in Subfactor 3.” TTS-AR, Tab 10, SSDD at 20-21.

Specifically, the SSA noted that both TTS and OMG were assessed strengths under subfactor 3 for experience with cloud migration technologies. TTS-AR, Tab 10, SSDD at 21. The SSA further found that while OMG’s strength for its experience with cyber risk management was “different than” TTS’s strength for its experience with cybersecurity, “each provide[d] similar overall value in ensuring strong compliance with [Department of Defense] cybersecurity objectives and policies ultimately decreasing cyber risk.” Id. Additionally, the SSA stated that OMG’s third strength under subfactor 3 related to its experience “in research and development with data center management provide[d] additional value that [was] not included in [TTS’s] proposal.” Id. The SSA concluded that this strength set OMG apart because its experience demonstrated “strong forward-thinking capabilities in cloud migration, and in the consolidation of large portfolio systems with varied complexities and user communities” and that OMG’s ability to leverage this experience would be advantageous to the agency. Id. Ultimately, the SSA considered this additional benefit of OMG’s proposal to be worth its approximately 0.2 percent price premium over TTS, and selected OMG’s proposal as representing the best value to the government. Id. at 21-22.

Source selection officials have broad discretion to determine the manner and extent to which they will make use of evaluation results, and must use their own judgment in deciding what the underlying differences between proposals might mean to successful performance of the contract. ERC Inc., B-407297, B-407297.2, Nov. 19, 2012, 2012 CPD ¶ 321 at 6; Applied Physical Scis. Corp., B-406167, Feb. 23, 2012, 2012 CPD ¶ 102 at 6. A protester’s disagreement with an agency’s judgments about the relative merit of competing proposals, without more, does not establish that the judgments were unreasonable. Battelle Memorial Inst., supra at 13.

Here, we find the SSA reasonably focused on OMG’s additional strength under subfactor 3 as a key discriminator between the proposals of OMG and TTS. Although the solicitation provided that all technical subfactors were to be given equal weight, there is nothing objectionable in the SSA’s reliance on a key discriminator under a single subfactor for purposes of a best‑value tradeoff. See e.g., General Dynamics Land Sys., B-412525, B-412525.2, Mar. 15, 2016, 2016 CPD ¶ 89 at 11 (noting that it is well settled that a single evaluation factor may properly be relied upon as a key discriminator for purposes of a source selection decision, and an agency’s tradeoff analysis may focus on a particular discriminator where there is a reasonable basis to do so); TriWest Healthcare Alliance Corp., B-401652.12, B-401652.13, July 2, 2012, 2012 CPD ¶ 197 at 37 (same). TTS’s arguments to the contrary reflect nothing more than its disagreement with the agency’s tradeoff decision.

The protests are denied.

Edda Emmanuelli Perez
General Counsel

 

[1] The agency provided individual reports responding to each protest using a uniform system of identifying documents and numbering agency report tabs. We cite to the two reports generally as a singular “AR,” except where necessary to differentiate between different versions of the same document included in the two reports. For example, each report includes the source selection decision at Tab 10, but the version of Tab 10 in the SI-AR has different redactions than the version of Tab 10 in the TTS-AR. We further note that our citations to the record are to documents’ Adobe PDF pagination, even in instances where the parties’ filings refer to a document’s internal pagination.

[2] On February 22, 2022, SI filed its initial protest with our Office, then on February 28 it filed a consolidated initial and first supplemental protest. Our citations refer to SI’s later submitted consolidated filing as the “SI Protest.”

[3] In its second supplemental protest, SI also challenged the agency’s evaluation of TTS’s proposal. For example, SI argued that because TTS “is an inexperienced Federal contractor with limited contract history in FPDS [the Federal Procurement Data System],” it necessarily “lacks the technical expertise to provide a proposal response warranting two Strengths in response to Subfactor 2.” SI Comments & 2nd Supp. Protest at 32. Prior to submission of its report responding to the supplemental protest, the agency requested dismissal of this argument. We agreed that dismissal was appropriate and dismissed the argument. Notice of Partial Dismissal, Apr. 13, 2022.

We noted that SI did not cite to any portion of the solicitation that limited the evaluation of offerors’ technical expertise or experience to only that gained while performing on federal contracts. Notice of Partial Dismissal, Apr. 13, 2022, at 2. Thus, even if uncontradicted, SI’s allegation that TTS has limited experience as a federal contractor does not demonstrate that it necessarily lacks sufficient expertise for its proposed approach to have reasonably merited the assessment of strengths. Id. Rather, we found that SI’s assertion was conclusory and did not provide any specific allegations regarding TTS’s proposal or the agency’s evaluation thereof that would establish a reasonable potential that the protest ground had merit. Id. We concluded that such speculation, without more, failed to set forth a legally sufficient basis of protest, and dismissed this argument accordingly. Id.; see e.g., Perspecta Enter. Solutions, LLC, B‑418533.2, B-418533.3, June 17, 2020, 2020 CPD ¶ 213 at 6 n.11, 24 (dismissing protest grounds that were based on mere speculation as to the contents of the awardee’s proposal); CALIBRE Systems, Inc., B-414301.3, Sept. 20, 2017, 2017 CPD ¶ 305 at 6 n.3 (dismissing protest because protest did not provide any specific allegations regarding the awardee’s proposal or the agency’s evaluation).

SI also challenged the evaluators’ assessment of a strength in both OMG’s proposal and TTS’s proposal under technical subfactor 3, corporate experience, for the firms’ experience with cloud migration, arguing that the agency applied an unstated evaluation criterion. SI Comments & 2nd Supp. Protest at 33-34, 40-42. The agency requested dismissal of this argument as it related to TTS’s proposal. We declined to dismiss this argument. Notice of Partial Dismissal, Apr. 13, 2022, at 3. While we also do not discuss this argument herein as it relates to either OMG’s or TTS’s proposals, we have considered the argument, and find it provides no basis to sustain the protest as the agency’s assessment was reasonably based on considerations logically encompassed in the solicitation’s evaluation criteria for technical subfactor 3.

[4] The agency explains that:

Network latency describes the amount of delay on a network or internet connection--i.e. the amount of time it takes for data to travel from one point to another. Network latency is affected by network bandwidth, which is the rate of data transfer for a fixed period of time. The Air Force Network has more limited bandwidth than a commercial network, and its architecture introduces restrictive delays in the data traveling through the communication path. This more limited bandwidth increases the risk of network latency.

SI-COS at 26.

[5] In addition to contending that the agency unreasonably failed to assign multiple strengths to the firm’s proposal, SI initially challenged the agency’s assessment of a weakness in its proposal under technical subfactor 1. SI Protest at 29-31; SI Comments & 2nd Supp. Protest at 4 n.2. In its comments on the agency’s report responding to the protest, SI withdrew this challenge. SI Comments & 2nd Supp. Protest at 4 n.2. SI also withdrew some of its unacknowledged strengths challenges. SI Comments & 2nd Supp. Protest at 4 n.2.

[6] SI has been performing the HEDC requirement since 2006. SI Protest at 4.

[7] SI explains that [DELETED] is the process of [DELETED] the particular cloud service” to which it will be migrated. SI Protest at 19-20.

[8] SI also raised allegations of disparate evaluation with respect to the agency’s evaluation of TTS’s proposal as compared to SI’s proposal under technical subfactors 2 and 3. SI Comments & 2nd Supp. Protest at 32, 42-44. Prior to submission of its report responding to the supplemental protest, the agency requested dismissal of these arguments. We declined to dismiss the protest grounds, but also did not require further briefing or record production from the agency regarding these arguments. Notice of Partial Dismissal, Apr. 13, 2022, at 3. In light of our conclusion herein that the agency’s evaluation and resulting source selection decision were reasonable and consistent with the terms of the solicitation, we need not reach these additional allegations of disparate treatment vis-à-vis SI and TTS. Notwithstanding SI’s allegations, our Office has recognized that generally no competitive prejudice can flow from alleged disparate treatment with respect to other unsuccessful offerors. Operations Servs., Inc., B‑420226, Jan. 4, 2022, 2022 CPD ¶ 21 at 5 n.4; Smiths Detection, Inc., B-420110, B‑420111, Nov. 5, 2021, 2021 CPD ¶ 359 at 6-7 n.4; Environmental Chemical Corp., B‑416166.3 et al., June 12, 2019, 2019 CPD ¶ 217 at 6 n.5. Competitive prejudice is an essential element of any viable protest, and where none is shown or otherwise evident, we will not sustain a protest, even if a protester may have shown that an agency’s actions arguably were improper. Id.

[9] SI also initially argued that the “agency unreasonably concluded that [the awardee] could provide program continuity given its unrealistically low rates.” SI Protest at 67. Similarly, SI maintained that the awardee’s lower total proposed price “suggests that [the awardee] failed to propose escalation of rates throughout the contract, rendering its pricing unbalanced ‘at face value’.” Id. at 69. Prior to submission of its report responding to the protest, the agency requested dismissal of these arguments. We agreed that dismissal was appropriate. Notice of Partial Dismissal, Mar. 16, 2022.

We noted that the only support SI provided for its contention that the awardee could not provide program continuity at its “unrealistically low rates,” was SI’s assumption that the awardee must have proposed low labor rates because its total proposed price was lower than SI’s total proposed price. Notice of Partial Dismissal, Mar. 16, 2022, at 2. Similarly, the only support SI provided for its contention that the awardee failed to propose escalation rates was SI’s speculation that this must have been the case given the awardee’s lower proposed price. Id. We found that SI’s unsupported speculation regarding the labor rates and year-to-year price balancing proposed by the awardee failed to set forth a legally sufficient basis of protest, and dismissed these arguments accordingly. Id.; see e.g., Sayres & Assocs. Corp., B-418382, Mar. 31, 2020, 2020 CPD ¶ 134 at 4 n.6 (“[E]vidence that the awardee proposed a somewhat lower cost than the protester, alone, is not generally enough to establish a legally sufficient challenge to an agency’s cost realism assessment. This is because such an argument, by itself, does not address the possibility that an awardee simply proposed a different technical approach or composed their indirect labor rates differently such that the somewhat lower cost is realistic for the awardee’s proposed approach.”); Computer Tech. Assocs., Inc., B‑403798, B-403798.2, Dec. 2, 2010, 2010 CPD ¶ 280 at 4 n.1 (dismissing challenge to awardee’s price proposal based on only protester’s unsupported speculation that awardee proposed low labor rates).

SI also contended that TTS’s proposed professional employee compensation was unrealistically low, and presented as support for its challenge only TTS’s lower total proposed price, which was lower than both SI’s and the awardee’s prices. SI Comments & 2nd Supp. Protest at 50. Accordingly, we also dismissed this allegation as failing to set forth a legally sufficient basis of protest. Notice of Partial Dismissal, Apr. 13, 2022, at 4.
