Metric 8 LLC; M6-VETS, LLC; RCH Partners, LLC; Stratera Fulcrum Technologies, LLC; MERPTech, LLC

B-419759.2, B-419759.3, B-419759.4, B-419759.5, B-419759.7, B-419759.9, B-419759.10, B-419759.11, B-419759.12, B-419759.13  Jul 29, 2021

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  Metric 8 LLC; M6-VETS, LLC; RCH Partners, LLC; Stratera Fulcrum Technologies, LLC; MERPTech, LLC

File:  B-419759.2; B-419759.3; B-419759.4; B-419759.5; B-419759.7; B‑419759.9; B-419759.10; B-419759.11; B-419759.12; B-419759.13

Date:   July 29, 2021

Damien C. Specht, Esq., James A. Tucker, Esq., and David Allman, Esq., Morrison & Foerster LLP, for Metric 8 LLC; Ryan C. Bradel, Esq., P. Tyson Marx, Esq., Stephen G. Darby, Esq., and Chelsea A. Padgett, Esq., Ward & Berry PLLC, for M6-VETS, LLC; Jon Davidson Levin, Esq., W. Brad English, Esq., and Emily J. Chancey, Esq., Maynard Cooper Gale, for RCH Partners, LLC; Jonathan T. Williams, Esq., Meghan F. Leemon, Esq., Eric A. Valle, Esq., and Christine C. Fries, Esq., Piliero Mazza, PLLC, for Stratera Fulcrum Technologies, LLC; J. Alex Ward, Esq., Rachel K. Plymale, Esq., and Caitlin A. Crujido, Esq., Morrison & Foerster LLP, for MERPTech, LLC, the protesters.
Alexander J. Brittin, Esq., Brittin Law Group, PLLC, and Mary Pat Buckenmeyer, Esq., Dunlap Bennett & Ludwig PLLC, for Halvik, Inc.; Elizabeth N. Jochum, Esq., Todd M. Garland, Esq., and Nora K. Brent, Esq., Smith Pachter McWhorter PLC, for RIVA Solutions, Inc.; Gary J. Campbell, Esq., G. Matthew Koehl, Esq., and Lidiya Kurin, Esq., Womble Bond Dickinson (US) LLP, for Booz Allen Hamilton Inc.; James J. McCullough, Esq., Michael J. Anstett, Esq., Anayansi Rodriguez, Esq., and Christopher H. Bell, Esq., Fried, Frank, Harris, Shriver & Jacobson LLP for Science Applications International Corporation; David S. Black, Esq., Gregory R. Hallmark, Esq., Amy L. Fuentes, Esq., Kelsey M. Hayes, Esq., and Hillary J. Freund, Esq., Holland & Knight LLP, for Steampunk, Inc., the intervenors.
Andrew Squire, Esq., and Chieko M. Clarke, Esq., Department of Commerce, for the agency.
Christopher Alwood, Esq., and Christina Sklarew, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1.  Protests challenging the agency’s evaluation of proposals under all of the solicitation’s evaluation factors are denied where either the evaluation was reasonable and consistent with the solicitation’s criteria or the protesters could not establish that they were competitively prejudiced by the agency’s actions.

2.  Protests challenging the agency’s comparative analysis and source selection decisions under the solicitation’s source selection scheme, which based award on the highest technically rated proposals with fair and reasonable prices, are denied where the agency’s comparative analysis and source selection decisions were reasonable, adequately documented, and consistent with the terms of the solicitation.

DECISION

Metric 8 LLC, a small business of Atlanta, Georgia; M6-VETS, LLC, a small business of Charleston, South Carolina; RCH Partners, LLC, a small business of Leesburg, Virginia; Stratera Fulcrum Technologies, LLC, a small business joint venture of Alexandria, Virginia; and MERPTech, LLC, a small business joint venture of Herndon, Virginia, protest the award of multiple indefinite-delivery/indefinite-quantity (IDIQ) contracts under request for proposals (RFP) No. 1333BJ20R00280001, issued by the Department of Commerce, U.S. Patent and Trademark Office (PTO).  The RFP anticipated the award of contracts for information technology (IT) development, modernization, enhancement, operations, and maintenance services in support of both legacy and modernized PTO software products, referred to by the agency as the business oriented software solutions (BOSS) procurement.  The protesters primarily challenge the agency’s evaluation of proposals and resulting source selection decisions.

We deny the protests.

BACKGROUND

On May 29, 2020, the agency issued the RFP under the commercial item procedures of Federal Acquisition Regulation (FAR) part 12, using the negotiated procurement policies and procedures established under FAR part 15, seeking proposals to provide services, primarily in the form of agile teams, in support of development, modernization, enhancement, operations, and maintenance of PTO IT products.  Agency Report (AR), Tab A17, RFP at 7;[1] Contracting Officer’s Statement (COS), B-419759.2 at 2.  The RFP contemplated the award of multiple IDIQ contracts with 10-year ordering periods.  Id.  The solicitation advised offerors that the PTO intended to award at least five IDIQ contracts at a ratio of 3:2 for small to large businesses, representing a 60 percent small business set-aside goal for prime contract awardees.  RFP at 78.  The solicitation specified that the exact number of awardees had not been pre-determined and that the agency could award more or fewer than five IDIQ contracts.  Id.

The RFP provided for award to the highest technically rated proposals with fair and reasonable prices, considering four non-price evaluation factors, in descending order of importance:  (1) small business participation, (2) technical approach, (3) past performance, and (4) program management and staffing approach.  Id. at 78-79.

To evaluate the small business participation factor, the agency would assess all large business offerors’ small business participation plans and small business subcontracting plans.  Id. at 79-80.  The agency would evaluate the small business participation plans to determine the extent of an offeror’s proposed participation and commitment to use small businesses in the performance of the BOSS procurement.  Id. at 80.  The solicitation specified that small business offerors would not be evaluated under this factor.  Id.

The agency was to evaluate proposals under the technical approach factor considering the offeror’s proposed approaches to agile development, system and software development, and system tests and delivery.  Id. at 81.  As relevant here, the RFP provided that the agency would evaluate “how well the proposed system and software development and architecture, engineering, and design processes will perform as part of a holistic [development, security, and operations] and Agile development approach.”  Id.  The performance work statement (PWS) noted that, as part of its development, security, and operations objective, PTO “is driving a comprehensive architectural move to micro-services.”  AR, Tab A12, PWS at 6.

The RFP provided that the agency would assign each offeror’s technical approach an adjectival rating of superior, satisfactory, or unsatisfactory.  RFP at 79.  The RFP did not define the adjectival ratings; however, the source selection plan (SSP)[2] contains the standards the agency used to evaluate proposals.

A “superior” rating under the technical approach factor[3] was defined as:

The proposal significantly exceeds the solicitation requirements in a manner that benefits the government. The proposal is comprehensive and demonstrates a thorough approach to and understanding of the solicitation requirements.  The proposal contains strength(s) and may contain weaknesses, but contains no significant weaknesses or deficiencies.[4] The combined impact of the strengths considerably outweighs the combined impact of the weaknesses and as such, the chance of unsuccessful performance is very low. 

AR, Tab B01, SSP at 8. 

A “satisfactory” rating under the technical approach factor was defined as:[5]

The proposal meets the solicitation requirements.  The proposal demonstrates an adequate approach to and understanding of the solicitation requirements.  The proposal may contain strength(s), weaknesses, or significant weaknesses but does not contain any deficiencies.  The combined impact of strengths, weaknesses, and significant weaknesses results in an overall low chance of unsuccessful performance. 

Id. at 8-9.

The agency was to evaluate proposals under the past performance factor by considering past performance information provided by the offeror and the offeror’s references to determine the likelihood that the offeror would successfully perform the contract.  RFP at 81-82.  While the RFP provided that the agency could consider past performance information from other sources, it did not require the agency to do so.  Id. at 81. 

As relevant here, the RFP required offerors to provide:

contract summaries for a minimum of three (maximum of five) contracts and/or orders for services that are recent and relevant to the solicitation’s requirements.  At least two (2) contract summaries shall be in reference to the prime offeror’s own past performance as either a prime or a 1st tier sub-contractor under a previous contract.  If the offeror is submitting as a Joint Venture, then the two (2) prime offeror contract summaries may be in reference to work performed by either or both of the partnering companies.

Id. at 72.

The RFP allowed offerors to submit information from work performed by subcontractors, work performed as part of a team or joint venture, and work performed by “other previous incarnation[s] of its current organization.”  Id. at 73.  However, the RFP specified that “the offeror shall clearly define what entity performed the work if other than the prime offeror’s past performance is submitted.”  Id.   

The RFP specified that the agency would only evaluate past performance that it deemed recent and relevant.  Id.  The RFP defined recent past performance as work ongoing or completed during the three years prior to the solicitation’s due date for proposals.  Id.  The RFP specified that the agency would consider an offeror’s past performance to be relevant if it met certain size, scope, and complexity requirements.[6]  Id.  The RFP provided that the agency would assign each offeror’s past performance an adjectival rating of superior, satisfactory, neutral, or unsatisfactory.  Id. at 79.

The SSP defined a “superior” rating under the past performance factor as:

The past performance response gives [the agency] a high degree of confidence that the solicitation requirements will be met in a timely and cost-effective manner.  The combined impact of the increases confidence[7] findings considerably outweighs the combined impact of the decreases confidence findings and as such, the chance of unsuccessful performance is very low.

AR, Tab B01, SSP at 9. 

The SSP defined a “satisfactory” rating under the past performance factor as:

The past performance response gives the [the agency] confidence that the solicitation requirements will be met in a timely and cost-effective manner.  The combined impact of the increases confidence findings offsets the combined impact of the decreases confidence findings and as such, the chance of unsuccessful performance is low.

Id.

The SSP defined a neutral past performance rating as “[n]o relevant past performance record is identifiable upon which to base a meaningful past performance rating.  This is neither a negative or positive assessment.”  Id.  The SSP defined an “unsatisfactory” rating as “[t]he past performance response gives [the agency] low confidence of performance in a timely and cost-effective manner . . . the chance of unsuccessful performance is moderate to high.”  Id.

With regard to the program management and staffing approach factor, the agency would evaluate proposals to determine whether they met or exceeded the contract requirements from the PWS.  RFP at 82.  In its evaluation, the agency was to specifically consider the proposed program management team, the offeror’s approach to identified types of risk, how well the offeror’s approach promotes collaboration and manages interdependencies with agency staff and other contractors, the offeror’s approach to task order transitions, and the proposed staffing approach.[8]  Id. at 82-83.  The RFP provided that the agency would assign each offeror’s program management and staffing approach an adjectival rating of superior, satisfactory, or unsatisfactory.  Id. at 79.  As noted above, the SSP used the same definitions for the adjectival ratings under the technical approach and program management and staffing approach factors.  AR, Tab B01, SSP at 8. 

The agency was to evaluate proposed labor rates and their associated build-up elements for each labor category to determine if they were fair and reasonable.  RFP at 83.  The RFP did not specify how the agency would determine the highest technically rated proposals.  However, the SSP provided that the agency would utilize the mathematical transitive property[9] when conducting the comparative analysis of proposals.  AR, Tab B04, SSP at 19.  In this regard, the SSP specified that if the evaluation team determined that “offeror A represents a better value than offeror B and offeror B represents a better value than offeror C, then the team can reasonably conclude that offeror A represents a better value than offeror C without conducting a head-to-head comparative analysis.”  Id.

The deadline for the submission of proposals was July 30, 2020.  RFP at 1.  On or before the July 30 due date, the agency received 24 timely proposals.  COS, B‑419759.2 at 2.  After the initial evaluation of proposals, the agency conducted a best-value comparative analysis to identify the three highest technically rated small business proposals.  Id. at 11.  The contracting officer (CO) explained how the agency performed this analysis as follows:

The ‘small businesses’ best value analysis began with the CO conducting a cursory assessment of all the consensus evaluation summaries (i.e. ratings and types and number of findings) to identify a ‘control’ offeror.  The ideal ‘control’ offeror would allow the evaluation team to identify the three highest technically rated small businesses with the fewest possible number of vendor-to-vendor comparisons.  With this in mind, the CO identified [Offeror A], Halvik, and Steampunk as candidates to be used as the ‘control’ offeror.  Ultimately, from the three candidates identified, the CO selected [Offeror A], as it was the first offeror listed, as the ‘control’ offeror for comparison purposes. The evaluation team then conducted the vendor-to-vendor comparisons by comparing [Offeror A] as the ‘control’ offeror to all other small business offerors.

Id. at 11. 

After identifying the three highest technically rated small business proposals with a fair and reasonable price, the agency conducted another best-value comparative analysis with all remaining offerors, both small and large.  Id.  The contracting officer identified SAIC’s proposal as the large business control proposal.  Id.  As part of its second best-value comparative analysis, the evaluators found that SAIC’s proposal was higher technically rated than the proposals submitted by the protesters.  AR, Tab B10, Best-Value Comparative Analysis, tabs “01v19”, “19v03”.  As a result of the second comparative analysis, the evaluation team identified Booz Allen and SAIC as having the highest technically rated non-small business proposals with fair and reasonable prices.  COS, B‑419759.2 at 12. 

The agency evaluated the awardees’, the control offerors’, and the protesters’ proposals as follows:

| Offeror | Small Business Participation | Technical Approach | Past Performance | Program Management and Staffing Approach | Price |
|---|---|---|---|---|---|
| Booz Allen | Satisfactory | Superior | Superior | Superior | Fair and Reasonable |
| SAIC | Superior | Satisfactory | Superior | Superior | Fair and Reasonable |
| Halvik | Not Applicable | Satisfactory | Superior | Superior | Fair and Reasonable |
| RIVA | Not Applicable | Superior | Superior | Satisfactory | Fair and Reasonable |
| Steampunk | Not Applicable | Satisfactory | Superior | Superior | Fair and Reasonable |
| [Offeror A] | Not Applicable | Superior | Satisfactory | Satisfactory | Fair and Reasonable |
| Metric 8 | Not Applicable | Satisfactory | Superior | Satisfactory | Fair and Reasonable |
| M6-VETS | Not Applicable | Satisfactory | Superior | Satisfactory | Fair and Reasonable |
| RCH | Not Applicable | Satisfactory | Superior | Satisfactory | Fair and Reasonable |
| Stratera | Not Applicable | Satisfactory | Satisfactory | Satisfactory | Fair and Reasonable |
| MERPTech | Not Applicable | Superior | Satisfactory | Satisfactory | Fair and Reasonable |

AR, Tab B10, Best-Value Comparative Analysis, tab “Evaluation Summary”.

The source selection authority (SSA) independently evaluated the proposals and reviewed the evaluation team’s findings, including the consensus evaluation report, the price evaluation report, the best value comparative analysis document, and the award recommendation memorandum.  AR, Tab B12, Source Selection Decision Document (SSDD) at 4.  The SSA concurred with the findings of the evaluation team and selected the proposals submitted by Booz Allen, SAIC, RIVA, Halvik, and Steampunk for award.  Id. at 6. 

On April 2, 2021, the PTO notified the protesters that they had not been selected for award.  COS, B-419759.2 at 14.  After the agency provided debriefings, these protests followed.[10]

DISCUSSION


The protesters generally challenge the agency’s evaluation of proposals and resulting source selection decisions.  We note that the protesters raise many collateral arguments.  While our decision does not specifically address every argument, we have reviewed all the arguments and conclude that none provides a basis to sustain the protests.  We discuss several representative issues below. 

As an initial matter, we dismiss several protest grounds that were not suitable for consideration on the merits.  For example, Metric 8’s initial protest challenged the agency’s assignment of a “satisfactory” rating under the technical approach factor, objected to the agency’s alleged failure to assess 11 additional strengths to its proposal, challenged the reasonableness of the lone weakness the agency assigned its proposal, and generally argued that the agency’s past performance evaluation of the small business awardees was unreasonable.  Metric 8 Protest at 5-14.  The agency provided a detailed response to these protest allegations.  See Memorandum of Law (MOL), B‑419759.2.  In response, Metric 8 did not rebut or address many of the agency’s arguments, instead raising supplemental protest grounds that only address two of the strengths it alleges it should have received and limiting its past performance challenge to the agency’s evaluation of Halvik’s proposal.  See Metric 8 Comments & Supp. Protest at 2-9.  Accordingly, we dismiss the protest grounds on which Metric 8 did not comment as abandoned.  See Tec-Masters, Inc., B‑416235, July 12, 2018, 2018 CPD ¶ 241 at 6.

Similarly, M6-VETS challenged the agency’s failure to disqualify awardee Halvik’s proposal for allegedly using an improperly small font size.  M6-VETS Comments & Supp. Protest at 21.  The agency provided a detailed response that M6-VETS did not rebut or address in its supplemental comments.  Supp. MOL, B-419759.3, B-419759.9 at 40‑42; see M6-VETS Supp. Comments.  We also dismiss this ground of protest as abandoned.  M6-VETS also raises several challenges to the adequacy of its debriefing, arguing that the agency failed to provide an overall ranking of offerors and required awardee price information.  M6-VETS Protest at 9-12; M6-VETS Comments & Supp. Protest at 12, 16-17.  We dismiss these protest grounds because our Office does not review protests challenging the adequacy of debriefings.  American Native Veterans of Louisiana, B-414555.2, July 11, 2017, 2017 CPD ¶ 219 at 5-6 n.3, citing A1 Procurement, JVG, B-404618, Mar. 14, 2011, 2011 CPD ¶ 53 at 5 n.5 (debriefings are procedural matters that do not affect the validity of an award). 

General Evaluation Challenges

First, several of the protesters generally challenge the agency’s assignment of adjectival ratings to the proposals as evaluated under the technical approach, program management and staffing approach, or past performance factors, arguing that certain proposals warranted higher or lower ratings.[11]  For example, M6-VETS contends that the agency’s assignment of adjectival ratings under the technical approach factor was improper because the ratings did not correspond with the number of strengths and weaknesses assessed to each proposal.  M6-VETS Comments & Supp. Protest at 18‑19.  In this regard, M6-VETS argues that the agency’s rating of M6-VETS’s technical approach--which had been evaluated to have four strengths and no weaknesses--as “satisfactory” instead of “superior” was “patently unreasonable” because three other offerors had been assessed the same “satisfactory” rating despite having fewer strengths, more weaknesses, or both.[12]  Id. at 19. 

The agency responds that it reasonably assessed adjectival ratings in accordance with the RFP.  Supp. MOL, B-419759.3, B-419759.9, at 33-35.  The agency argues that, instead of M6-VETS’s “simplistic number counting of strengths and weaknesses,” PTO assigned adjectival ratings by considering the underlying character and quality of the strengths and weaknesses.  Id. at 34-35.  We agree with the agency. 

In reviewing a protest challenging an agency’s evaluation, our Office will not reevaluate proposals, nor substitute our judgment for that of the agency, as the evaluation of proposals is a matter within the agency’s discretion.  Rather, we will review the record to determine whether the agency’s evaluation was reasonable and consistent with the stated evaluation criteria and with applicable procurement statutes and regulations.  AECOM Mgmt. Servs., Inc., B‑417639.2, B‑417639.3, Sept. 16, 2019, 2019 CPD ¶ 322 at 9.  A protester’s disagreement with the agency’s judgment, without more, is insufficient to establish that the agency acted unreasonably.  Vertex Aerospace, LLC, B‑417065, B‑417065.2, Feb. 5, 2019, 2019 CPD ¶ 75 at 8. 

M6-VETS’s disagreement with its rating based on the number of assessed strengths and weaknesses is misplaced.  There is no legal requirement that an agency award the highest possible rating under an evaluation factor simply because the proposal contains strengths and/or is not evaluated as having any weaknesses.  See Applied Tech. Sys., Inc., B‑404267, B‑404267.2, Jan. 25, 2011, 2011 CPD ¶ 36 at 9.  Evaluation ratings and the number of strengths and weaknesses assessed are merely a guide to, and not a substitute for, intelligent decision making in the procurement process.  Affolter Contracting Co., Inc., B‑410878, B‑410878.2, Mar. 4, 2015, 2015 CPD ¶ 101 at 11 n.10. 

As noted above, the SSP defined a “superior” rating under the technical approach factor as “significantly exceeds the solicitation requirements in a manner that benefits the government . . . is comprehensive and demonstrates a thorough approach” and allowed that the proposal “may contain weaknesses.”  AR, Tab B01, SSP at 8.  On the other hand, a “satisfactory” rating merely meets the solicitation requirements and “may contain strength(s), weaknesses, or significant weaknesses.”  Id. at 8-9.  M6-VETS has not explained why the various proposals it objected to being rated “satisfactory” under the technical approach factor do not meet that definition based on their assessed strengths and weaknesses.  With regard to the rating assigned to M6-VETS’s proposal, the contemporaneous record shows that while the agency found its proposal to be “comprehensive” and to demonstrate a thorough approach, it found that the proposal only met, but did not “significantly exceed” the solicitation requirements, as required for a superior rating.  AR, Tab B06, Technical Approach Consensus Evaluation, tab “12F2”.  Accordingly, we find that M6-VETS’s arguments here--like the other protesters’ similar challenges to the assessment of adjectival ratings to the proposals as evaluated--simply constitute disagreement with the agency’s judgments and provide no basis for sustaining M6-VETS’s protest.

RCH generally challenges the agency’s evaluation of proposals for failing to consider the “uniqueness” of an aspect of a proposal when assessing strengths under the technical approach and program management and staffing approach factors.  RCH Protest at 10‑11; RCH Comments & Supp. Protest at 5.  RCH complains that because other offerors proposed unique approaches that were assessed as strengths by the agency, it was unreasonable for the agency not to assess strengths for the unique approaches RCH proposed.  RCH Comments & Supp. Protest at 5.

On this record, we see no basis to sustain this protest ground.  RCH does not point to, and the record does not reveal, anything in the RFP that would require the agency to evaluate proposals for “uniqueness.”  Further, RCH has not explained why a unique aspect of its proposal would automatically increase the potential for successful contract performance.  As noted above, the RFP did not specify what constituted a strength, but the SSP provided that a strength was an “aspect of an offeror’s proposal that increases the potential for successful contract performance and/or has a positive impact for the government.”  AR, Tab B01, SSP at 9.  Further, while RCH points to several unique aspects of other proposals that the agency considered strengths, it has not demonstrated that any of those strengths were assessed simply because they were unique.  Accordingly, RCH has not demonstrated that the agency’s failure to evaluate “uniqueness” violated the terms of the RFP or resulted in unequal treatment of offerors.  As such, we deny this ground of protest.

Technical Approach Evaluation

The protesters challenge several aspects of the agency’s evaluation under the technical approach factor.[13]  First, Stratera, RCH, and Metric 8 challenge the agency’s assessment of strengths under the technical approach factor, arguing both that the agency treated offerors disparately in its assessment of certain strengths and that it failed to recognize other, additional strengths in the proposals.  We have reviewed the evaluation record and find no basis to question the agency’s assignment of strengths under the technical approach factor.[14]

For example, both RCH and Stratera allege that their technical approaches were evaluated unequally when compared to RIVA’s.  RCH Comments & Supp. Protest at 12‑13; Stratera Comments & Supp. Protest at 36-37.  Specifically, the protesters complain that while RIVA was assessed a strength for its proposed use of the scrum and Kanban development methodologies,[15] neither protester was given a strength for what they contend are virtually identical, or superior, proposal features.  Id.; RCH Supp. Comments at 2-3; Stratera Supp. Comments at 15‑17. 

In response, the agency explains that RIVA’s proposal contained a “significantly more detailed and more substantive” discussion of scrum and Kanban than RCH’s and Stratera’s, including clearly describing when to use each method.  Supp. MOL, B‑419759.4, B-419759.11 at 6-10; Supp. MOL, B-419759.5, B-419759.12 at 10-17.  In support of its argument, the agency points to a workflow diagram in RIVA’s proposal that outlines both its scrum and Kanban methodologies and compares the level of detail in RIVA’s discussion of scrum and Kanban to RCH’s and Stratera’s proposals.  Supp. COS, B‑419759.4, B-419759.11 at 10-16; Supp. COS, B-419759.5, B‑419759.12 at 37‑41.

It is a fundamental principle of federal procurement law that a contracting agency must treat all offerors equally and evaluate their proposals evenhandedly against the solicitation’s requirements and evaluation criteria.  Abacus Tech. Corp.; SMS Data Prods. Grp., Inc., B‑413421 et al., Oct. 28, 2016, 2016 CPD ¶ 317 at 11.  Where a protester alleges unequal treatment in a technical evaluation, it must show that the differences in the evaluation did not stem from differences between the proposals.  Nexant Inc., B-417421, B-417421.2, June 26, 2019, 2019 CPD ¶ 242 at 10. RCH and Stratera have not made such a showing here.

The contemporaneous record shows that the agency assessed a strength to RIVA’s technical approach for its proposed “use of both scrum & Kanban and its clear and detailed description of when to use each method.”  AR, Tab B06, Technical Approach Consensus Evaluation, tab “18F2.”  RIVA’s proposal contains a workflow diagram and short narrative broadly outlining its process of when it would utilize scrum or Kanban as its Agile development method, followed by pages of narrative explanation of RIVA’s process for using each method.  AR, Tab C24, RIVA Proposal Technical Approach Volume at 5‑7.  While both RCH and Stratera, like RIVA, propose to use the scrum methodology for larger and less time sensitive development efforts and the Kanban methodology for unplanned and/or time sensitive development efforts, neither provides the same level of narrative detail as RIVA about their framework for implementing the methodologies.  AR, Tab D38, RCH Proposal Technical Approach Volume at 9; AR, Tab D51, Stratera Proposal Technical Approach Volume at 5‑6. 

In its final comments, Stratera essentially concedes that RIVA’s proposal contained more detail on this aspect of the proposals, but argues that it still should have been similarly awarded a strength for proposing “the same mechanisms.”  Stratera Supp. Comments at 16.  RCH points back to language from its proposal that describes the elements of its “Lean-Agile Methodology,” but RCH does not explain, and our review of the record does not reveal, a detailed description of its scrum and Kanban frameworks and when it would use each in the language it references.  See AR, Tab D38, RCH Proposal Technical Approach Volume at 6-7, 9 (explaining RCH’s proposed “Lean-Agile Methodology” could use either “the scrum or Kanban workflow”).  We find unobjectionable the agency’s conclusion that RIVA’s proposal included a clearer and more detailed description of its proposed use of the scrum and Kanban methods.  The record here demonstrates that the differences in the assessments of strengths stem from the differences in the details found in the proposals.  Accordingly, we deny these grounds of protest.

Stratera also challenges the agency’s assessment of two weaknesses in its evaluation of Stratera’s technical approach.  We have reviewed the evaluation record and find no basis to question the agency’s assignment of weaknesses to Stratera’s proposal under the technical approach factor.  By way of example, Stratera alleges that the agency unreasonably assessed its proposal a weakness for lacking detail on its proposed microservices architecture.  Stratera Protest at 10-11; Stratera Comments & Supp. Protest at 18-21.  Stratera argues that it provided all the information required by the solicitation, including details regarding its microservices architecture.  Stratera Protest at 10.

The agency responds that Stratera’s proposal “provides little more than a mention of the requirements” when discussing its proposed microservices architecture.  COS, B‑419759.5 at 21.  The agency argues that Stratera did not meet the RFP’s standard that its proposal should be “clear, coherent, and prepared in sufficient detail for effective evaluation” and that Stratera’s proposal did not sufficiently explain how it intended to accomplish the work with microservices architecture.  Id. (citing RFP at 68); MOL, B‑419759.5 at 40-41. 

Again, our review of an evaluation challenge is to determine whether the agency’s evaluation was reasonable and consistent with the stated evaluation criteria and with applicable procurement statutes and regulations.  AECOM Mgmt. Servs., Inc., supra.  A protester’s disagreement with the agency’s judgment, without more, is insufficient to establish that the agency acted unreasonably.  Vertex Aerospace, LLC, supra.  Further, it is an offeror’s responsibility to submit a well-written proposal with adequately detailed information that clearly demonstrates compliance with the solicitation requirements and allows a meaningful review by the procuring agency.  CACI Techs., Inc., B‑296946, Oct. 27, 2005, 2005 CPD ¶ 198 at 5.

As noted above, the RFP provided that the agency would evaluate the technical approach factor for “how well the proposed system and software development and architecture, engineering, and design processes will perform as part of a holistic [development, security, and operations] and Agile development approach.”  RFP at 81.  The PWS noted that, as part of its development, security, and operations objective, the agency “is driving a comprehensive architectural move to micro-services.”  AR, Tab A12, PWS at 6.  The agency assessed the weakness at issue under the above evaluation criteria, noting “[t]he vendor’s proposal lacks details on their proposed microservices architecture.  As the government plans to migrate to a microservices based architecture, lack of details on the same in the proposal poses a potential risk to the government.”  AR, Tab B06, Technical Approach Consensus Evaluation, tab “23F2.”

Here, we have reviewed the evaluation record and find no basis to question the agency’s assessments regarding Stratera’s proposed microservices architecture.  In this regard, we note that while Stratera cites to several references to microservices in the technical approach portion of its proposal, see Stratera Comments & Supp. Protest at 19-20, it does not point to, nor does our review of the record reveal, any significant explanation of how it intended to use or implement its microservices architecture, only that it would.  AR, Tab D51, Stratera Proposal Technical Approach Volume at 3, 11-12 (“[products are designed] using API-first development, Microservices, and Domain Driven Design . . . teams utilize a set of microservices related to a unique business domain”). 

Based on our review of the record, we find nothing unreasonable in the agency’s conclusion that the information contained in Stratera’s proposal did not provide sufficient detail as to how the offeror planned to perform contract tasks utilizing its microservices architecture.  While the RFP did not specifically require discussion of microservices architecture, the PWS listed it as one of the agency’s development, security, and operations objectives.  Stratera, having proposed the use of such a microservices architecture, was required to provide sufficient detail so the agency could evaluate how it would use microservices architecture “as part of a holistic” development, security, and operations approach.  Stratera’s disagreements with the agency’s judgments do not provide a basis to sustain its protest.

Program Management and Staffing Approach Evaluation

The protesters challenge several aspects of the agency’s evaluation under the program management and staffing approach factor.  Stratera, RCH,[16] MERPTech, and Metric 8 challenge the agency’s assessment of strengths under the program management and staffing approach factor, arguing that the agency treated offerors disparately in its assessment of certain strengths and/or failed to recognize other, additional strengths in the proposals.  We have reviewed the evaluation record and find no basis to question the agency’s assignment of strengths under this factor. 

For example, RCH, MERPTech, and Metric 8 allege that the agency unreasonably assessed a strength to Halvik’s proposal for its employee retention plan and did not assess the three protesters a similar strength for “virtually identical” features.  See, e.g., RCH Supp. Comments at 5. 

The RFP provided that the agency would evaluate each offeror’s “approach to motivate and retain qualified personnel.”  RFP at 83.  The agency assessed Halvik’s proposal a strength, stating:

[Halvik] employs a tailored program to retain highly skilled Agile team members including - open communication regarding innovation and new technologies; celebration of employee accomplishments; bonus pools; generous benefit package; training, etc. The vendor has key activities related to technical competency and innovation - employees technical challenges, ‘hack-a-thons,[’] lunches and quarterly group events, and a strong certification culture, where collectively they claim thousands of Team employees with certifications in dozens of technical areas aligned to [PTOs] technology stack. The cumulative effect of these offerings is they have in place tools to increase the potential for successful contract performance by recruiting and retaining highly-skilled, highly-motivated talent to support the [PTO]'s product needs.

AR, Tab B08, Program Management and Staffing Approach Consensus Evaluation, tab “08F4.”

The agency responds by pointing to statements from Halvik’s proposal that it argues collectively justify each part of the assessed strength, and then compares that language to that of the protesters’ proposals, arguing that the proposed retention plans, while sometimes containing similar elements, are not “nearly identical.”  Supp. COS, B‑419759.4, B-419759.11 at 19-21; Supp. COS, B-419759.2, B-419759.10 at 11‑12; Supp. COS, B-419759.7, B-419759.13 at 3-6.  In sum, the agency argues that the three protesters’ proposals did not provide a similar level of detail or an identical set of advantages in their proposed employee retention plans when compared to Halvik’s, and that the agency reasonably assessed or did not assess strengths as a result of those differences.  See, e.g., Supp. MOL, B‑419759.4, B‑419759.11 at 17.

We first note that only Metric 8 argues in its final comments that its underlying proposal is “substantively indistinguishable” when it comes to employee retention.  Metric 8 Supp. Comments at 12.  However, the record demonstrates the retention plans were not identical.  Despite its arguments to the contrary, Metric 8 does not show where it proposed an equivalent to the “IT culture founded on open and frequent communication with staff and focused on innovation and infusion of new technologies” that the agency directly cited in assessing a strength in Halvik’s proposal.[17]  See AR, Tab C15, Halvik Proposal Program Management and Staffing Approach Volume at 14.  Further, while the agency concedes there are some aspects of Metric 8’s proposal that discuss the celebration of employee accomplishments, the proposal language Metric 8 cites does not propose identical aspects when compared to the language in Halvik’s proposal.  Specifically, the agency points to features of Halvik’s retention plan, including “multiple collaboration touchpoints between team members and management,” a continuous performance management system, and a “promote from within” culture, which Metric 8 does not propose.  See Metric 8 Supp. Comments at 12‑13 (comparing the Halvik and Metric 8 proposal language relating to the celebration of employee accomplishments). 

RCH and MERPTech, on the other hand, do not generally dispute that Halvik’s retention plan contained aspects that their proposals did not; instead, they argue that the aspects their proposals do not match were not explicitly named in the narrative of the strength in the contemporaneous record and should be ignored as post hoc rationalizations by the agency.[18]  RCH Supp. Comments at 5; MERPTech Supp. Comments at 3.  However, post‑protest explanations that provide a detailed rationale for contemporaneous conclusions and simply fill in previously unrecorded details will generally be considered in our review of evaluations and award determinations, so long as those explanations are credible and consistent with the contemporaneous record.  AdvanceMed Corp.; TrustSolutions, LLC, B‑404910.4 et al., Jan. 17, 2012, 2012 CPD ¶ 25 at 21 n.14.  Here, in response to the protesters’ allegation, the agency has simply explained why certain features of the awardees’ proposals were viewed as a strength while allegedly similar elements of the protesters’ proposals were not.  In these circumstances, where the protesters have not shown the agency’s explanation to be inconsistent with the contemporaneous evaluation record or unreasonable, we have no basis to question the evaluation.

In sum, based on our review of the record, we find no basis to question the agency’s assessments regarding the relative merits of the retention plans.  RCH, MERPTech, and Metric 8 have not shown that the differences in the evaluation here did not stem from differences in the proposals.  See Nexant Inc., supra.

M6-VETS, Stratera, RCH, and MERPTech also challenge the agency’s assessment of weaknesses and significant weaknesses under the program management and staffing approach factor, arguing that the agency treated offerors disparately in its assessment of certain weaknesses and/or unreasonably assessed weaknesses to their proposals.  We have reviewed the evaluation record and find no basis to question the agency’s assignment of weaknesses and significant weaknesses under the program management and staffing approach factor.

For example, M6-VETS and Stratera challenge the agency’s evaluation of their proposals’ approaches to task order transitions out.  We discuss each in turn below.  As we noted above, the RFP required the agency to evaluate offerors’ program management and staffing approaches for “how well the offeror’s approach manages task orders transition (in and out), including significant actions and notional timeframes.”  RFP at 83. 

First, M6-VETS disagrees with the agency’s assessment of a significant weakness to its proposal for a lack of detail supporting its transition out plan.  M6-VETS Protest at 7‑8.  In this regard, M6-VETS argues that, because this procurement is for an IDIQ contract rather than one of the resulting task orders, “it would be difficult to establish” a single “transition‑out plan for individual task orders.”[19]  Id. at 8.  The agency responds that it evaluated M6-VETS’s proposal in accordance with the solicitation’s evaluation criteria and properly assessed a significant weakness where M6-VETS’s proposed transition out actions were generic and lacked detail, and where M6-VETS failed to propose any notional timeframes for transition out.  MOL, B-419759.3 at 55.

The record shows that M6-VETS’s entire transition out approach was as follows:

Our team achieves a successful transition by establishing a close‑out team, allowing personnel to continue work uninterrupted.  Our close‑out team formulates a transition‑out plan and a knowledge transfer strategy which they then work with the incoming contractor and current personnel to implement without disruption to normal work activities.  The key to a successful transition is an effective knowledge transfer, which prevents loss of accumulated project knowledge.  By collaborating with the incoming team, we achieve a smooth transition and set them up for continuous success. 

AR, Tab D29, M6-VETS Proposal Program Management and Staffing Approach Volume at 12. 

The plain language of M6-VETS’s proposal does not describe the significant actions in its approach in any detail and fails to propose any notional transition out timeframes.  We find reasonable the agency’s conclusion that M6-VETS’s minimal transition out approach was a flaw in the proposal that appreciably increased the risk of unsuccessful contract performance and therefore warranted the assessment of a significant weakness.  M6-VETS’s disagreement with the agency’s evaluation judgment does not provide a basis to sustain its protest.  See Vertex Aerospace, LLC, supra.

M6-VETS alternatively contends that the solicitation contained a latent ambiguity with respect to what an offeror needed to include in its proposal with regard to a transition out approach.  M6-VETS Comments & Supp. Protest at 8.  M6-VETS argues that it reasonably interpreted the statement in the PWS that “[t]he Government plans for Contractor transition-out at the task order level” to mean that the “onus for transition-out activities and schedules” was with the agency, not the offeror.[20]  Id.; M6-VETS Protest at 8.  The agency responds that the PWS language unambiguously meant that the agency intended vendors to perform transition out activities at the task order, and not just the IDIQ, level.  MOL, B-419759.3 at 49.

An ambiguity exists where two or more reasonable interpretations of the terms or specifications of the solicitation are possible.  FEI Systems, B‑414852.2, Nov. 17, 2017, 2017 CPD ¶ 349 at 4.  A patent ambiguity exists where the solicitation contains an obvious, gross, or glaring error, while a latent ambiguity is more subtle.  Id.  Where a patent ambiguity in a solicitation is not challenged prior to the submission of proposals, we will dismiss as untimely any subsequent challenge to the meaning of the solicitation term.  4 C.F.R. § 21.2(a)(1); Simont S.p.A., B‑400481, Oct. 1, 2008, 2008 CPD ¶ 179 at 4.

On July 7, 2020, the agency issued amendment 0001 to the solicitation, which included a list of questions received in response to the solicitation and the agency’s answers.  AR, Tab A10, RFP Amendment 0001 at 1.  One potential offeror asked the agency if it could provide “an idea” of what the duration of the notional timeframes would be for the transition plan.  AR, Tab A16, RFP Attachment 8, tab “Contractual.”  The agency responded that the offeror was required to describe its approach to manage task order transitions in and transitions out, and specified that notional timeframes were “to be provided by the offerors.”  Id.  When combined with the plain solicitation language describing the evaluation of the program management and staffing approach factor, it is sufficiently clear that the agency expected offerors to describe their significant transition out actions and propose notional timeframes.  Assuming, for the sake of argument, that M6-VETS’s interpretation of the PWS language--that the agency would be responsible for transition out plans--is reasonable, we do not agree with M6-VETS’s assertion that this aspect of the solicitation was latently ambiguous; at most, any ambiguity was patent and was required to be challenged prior to the submission of proposals.  We therefore dismiss this protest ground as untimely.

Next, Stratera challenges the agency’s assessment of a weakness to its transition out approach for its proposed 14-day transition out timeframe and insufficient detail regarding engagement and communication with other vendors.  Stratera Protest at 13‑14.  Stratera argues that the agency’s evaluation is unreasonable because it proposed a notional timeframe and described all of its significant transition out activities, including communication with the incoming vendor and agency, as required by the solicitation.  Stratera Comments & Supp. Protest at 24-25. 

In response, the agency argues that its evaluation was reasonable and in accordance with the solicitation’s evaluation criteria.  MOL, B-419759.5 at 42-43.  The agency maintains that it reasonably concluded that a transition out timeframe of 14 calendar days or less was too short and posed a risk to the agency, and that Stratera had failed to focus on communication in this aspect of its proposal.[21]  Id. at 43. 

Here, the record is clear that while Stratera did propose that it would “proactively collaborate with the incoming contractor” and the agency “to assure all operations are transitioned smoothly and professionally,” the remainder of its discussion of significant transition out actions does not provide detail regarding communicating with the new contractor or agency, but rather, relates to documentation.  AR, Tab D53, Stratera Proposal Program Management and Staffing Approach Volume at 11-12.  Further, Stratera does not dispute that it proposed a 14-day notional transition out timeframe.

The evaluators stated in the text of the weakness assessment that a 14-day transition “may not provide sufficient overlap with the incoming vendor.”  AR, Tab B08, Program Management and Staffing Approach Consensus Evaluation, tab “23F4.”  They also explained that Stratera’s focus on documentation and lack of detail on communication pose risk because “the documentation that [Stratera] create may not provide sufficient detail to allow the next vendor to meet government expectations without the opportunity for direct communication with the incoming vendor.”  Id.

On this record, we find no basis to question the agency’s assessment of a weakness with regard to Stratera’s transition out approach.  We note again that the RFP required the agency to evaluate the significant activities and notional timeframes proposed by the offerors as part of their transition out approaches.  We find it reasonable for the agency to have perceived risk in Stratera’s lack of a detailed plan to communicate with other parties during the transition out process.  Further, without more from Stratera,[22] we see no basis to object to the agency’s conclusion that a 14-day transition out timeframe poses risk.  While Stratera may disagree with the agency’s judgments, it has failed to establish that those judgments were unreasonable.  This protest ground is denied. 

Past Performance Evaluation

Metric 8, RCH, Stratera, and MERPTech challenge several aspects of the agency’s evaluation under the past performance factor.  We have reviewed the protesters’ arguments and the evaluation record and find that none of the protesters’ arguments provides a basis to sustain a protest.  As discussed below in a few representative examples, we find the agency’s evaluation of past performance was either reasonable or that the protester’s arguments failed to demonstrate competitive prejudice.

First, Stratera and RCH challenge the agency’s evaluation of the recency and relevance of their past performance.  While the agency found that all past performance references submitted by Stratera and RCH were relevant, the protesters argue that the agency should have more positively considered their past performance that was more recent and/or more relevant than the past performance of other offerors.  In this regard, Stratera points to its very recent work performed for the PTO, arguing that it is unreasonable for the agency to have assigned Stratera a “satisfactory” rating and Steampunk a “superior” rating when Steampunk has no past performance with the PTO.  Stratera Protest at 21.  RCH notes that its submitted past performance references involved large contracts that “covered the entire scope and complexity” of the current solicitation’s requirements, and argues that the agency should have more positively considered this past performance compared to Halvik, Steampunk, and RIVA’s because their submitted references involved contracts that were comparatively smaller in scope and complexity.  RCH Comments & Supp. Protest at 9.

The agency responds that it evaluated past performance in accordance with the terms of the solicitation.  The agency argues that its consideration of relevancy under the past performance factor reasonably did not include an assessment of the degree of relevancy because the solicitation did not provide for extra credit for more relevant references.  MOL, B-419759.5 at 56-57.  The agency explains that the solicitation’s terms provided for a binary evaluation of past performance relevancy:  the agency would find past performance either relevant or not relevant, and would evaluate the significance of all relevant past performance.  MOL, B‑419759.4 at 60‑62.

An agency’s evaluation of past performance, including its consideration of the relevance, scope, and significance of an offeror’s performance history, is a matter of discretion which we will not disturb unless the agency’s assessments are unreasonable or inconsistent with the solicitation criteria.  Metropolitan Interpreters & Translators, Inc., B‑415080.7, B‑415080.8, May 14, 2019, 2019 CPD ¶ 181 at 10; see also SIMMEC Training Sols., B‑406819, Aug. 20, 2012, 2012 CPD ¶ 238 at 4.  A protester’s disagreement with the agency’s judgment does not establish that an evaluation was unreasonable.  FN Mfg., LLC, B‑402059.4, B‑402059.5, Mar. 22, 2010, 2010 CPD ¶ 104 at 7.

As noted above, the RFP provided that past performance would be considered relevant if it met certain size, scope, and complexity requirements.  RFP at 82.  With regard to size, relevant past performance was required to meet a size standard of contracts with a value of no less than $1 million for a 12-month period of performance.  Id.  With regard to scope, relevant past performance would include a contract summary similar in scope “to one or more [of] the services and related items listed in the BOSS PWS Section 3.”  Id.  With regard to complexity, the RFP required that an offeror’s past performance collectively reference contracts and orders that involve Agile development, automated testing, development and operations or development, security, and operations services, and operation and maintenance support services.  Id.  Additionally, the RFP does not appear to have contemplated that the agency would assess relevancy across a qualitative spectrum; rather, the RFP only provides for a binary relevancy determination (i.e., relevant or not relevant). 

In light of the RFP’s evaluation criteria and the broad discretion afforded to the agency, we find no basis to disagree with the agency’s relevancy assessment.  In this regard, the protesters do not meaningfully allege, and our review of the record does not reveal, that any of the challenged awardees’ past performance did not meet the relevancy standards laid out in the RFP.  While the protesters may be correct that the scopes of work, complexity of work, or performance history with the procuring agency are not identical, that is not the standard for relevance established by the RFP.  As such, we deny this ground of protest. 

Stratera and MERPTech allege that the agency’s past performance evaluation overly relies on the contract reference ratings contained in the submitted past performance questionnaires (PPQs).  In this regard, the protesters argue that the agency’s evaluation was essentially a number-counting exercise.  The protesters claim that the agency failed to qualitatively assess the offerors’ past performance or consider their submitted contract summaries, and instead simply found the ratio of “exceptional” past performance indicators across an offeror’s submitted PPQs, and assigned “superior” ratings to offerors that met an undisclosed benchmark.  Stratera Comments & Supp. Protest at 12-13; MERPTech Comments & Supp. Protest at 14‑15. 

The agency responds that its past performance evaluation was reasonable and in accordance with the RFP.  The agency argues that its consideration of PPQ responses was proper, pointing to language in the RFP that required offerors to submit PPQs from contract point-of-contact references and advised offerors that the agency would evaluate information provided by the offeror’s references.  Supp. MOL, B-419759.5, B‑419759.12 at 33-34 (citing RFP at 73, 81).  The agency also argues that the evaluation record showed it considered the qualitative and narrative information provided in the past performance submissions.  Id.; Supp. MOL, B-419759.7, B‑419759.13 at 22-23. 

The RFP instructed offerors to send a PPQ to one of the points of contact identified in each contract summary.  RFP at 73.  Further, the RFP notified offerors that the agency would assess past performance information provided by the offeror and the offeror’s references.  Id. at 83.  The past performance evaluation criteria specified that the agency reserved the right to, but was not required to, consider past performance information from other sources. 

As an initial matter, in light of the evaluation criteria and the broad discretion afforded to the agency, we see no basis to object to the agency’s focus on the information found in PPQs in its evaluation of past performance.  The record shows that the agency did consider the contract summaries submitted by the offerors when evaluating the recency and relevance of past performance.  See AR, Tab B07, Past Performance Consensus Evaluation.  While the protesters object to the agency not relying on self-reported information in the offerors’ past performance narratives when making their confidence assessment, the protesters have not demonstrated that the RFP required the agency to do so.

Further, we see no basis to object to the agency’s consideration and reliance on “exceptional” past performance indicators in the PPQs.  The relative merits of offerors’ past performance information are generally within the broad discretion of the contracting agency.  See Paragon Tech. Group, Inc., B-407331, Dec. 18, 2012, 2013 CPD ¶ 11 at 5.  The RFP did not establish specific standards for how the agency would value the past performance information submitted by offerors and their references.  Accordingly, without more, the protesters’ objections to the value placed by the agency on different types of past performance information amount to nothing more than disagreement with the agency’s judgment and discretion.

Moreover, we are unpersuaded by the protesters’ arguments that the agency’s past performance evaluation was essentially a number-counting exercise.  The record demonstrates that, in addition to the number and ratio of “exceptional” past performance indicators from offerors’ submitted PPQs, the agency also considered the qualitative information provided by contract references in the PPQs.  See, e.g., AR, Tab B07, Past Performance Consensus Evaluation, tab “23F3” (incorporating narrative input from Stratera’s PPQ references that the agency found noteworthy into the agency’s explanation of why the past performance reference increased the agency’s confidence in Stratera).  While the protesters complain that only offerors who reached an undisclosed ratio of “exceptional” past performance indicators were assigned a superior rating, they do not point to any information from their PPQs that the agency failed to consider.  We find it unsurprising that the offerors with the highest ratio of “exceptional” past performance indicators in their submitted PPQs would inspire the most confidence from the agency evaluators.  Here, the protesters’ disagreements with the agency’s judgments regarding the relative merits of the offerors’ past performance do not provide a basis to sustain the protests. 

Metric 8, RCH, and Stratera specifically challenge the agency’s evaluation of Halvik’s past performance.  They argue that Halvik is ineligible for award because it failed to comply with the solicitation’s requirement for each offeror to submit at least two past performance references for work the offeror itself had performed.  Metric 8 Comments & Supp. Protest at 2‑5; RCH Comments & Supp. Protest at 17-19; Stratera Supp. Protest at 16‑17.  In this regard, the protesters argue that the contract summaries submitted by Halvik show that the past performance it is attempting to claim as its own was performed by a wholly-owned subsidiary of Halvik, referred to as SSB, Inc.  Id.  The protesters argue that the RFP did not allow offerors to substitute the past performance of subsidiary companies for their own. 

The agency responds that Halvik was entitled to rely on the prior contracts at issue because SSB is actually a predecessor company of Halvik’s.  Supp. MOL, B-419759.4, B-419759.11 at 20-21.  The agency argues that Halvik acquired and absorbed SSB and the agency therefore properly credited the past performance to Halvik in accordance with FAR section 15.305(a)(2)(iii) and the terms of the solicitation.  Halvik agrees with the agency, and has submitted documents demonstrating that it did in fact acquire and absorb SSB.[23]  Halvik Supp. Comments, B-419759.4, B-419759.11 at 3.

While we afford an agency great discretion in the evaluation of past performance, we will question an agency’s evaluation conclusions when they are unreasonable or undocumented.  OSI Collection Servs., Inc., B–286597, B–286597.2, Jan. 17, 2001, 2001 CPD ¶ 18 at 6.  An agency properly may attribute the experience or past performance of a parent, subsidiary, or affiliated company to an offeror where the firm’s proposal demonstrates that the resources of the affiliate will affect the performance of the offeror.  See GM-Bulltrack, JV, B‑414591.6, B‑414591.7, Oct. 30, 2018, 2018 CPD ¶ 378 at 4.  The relevant consideration is whether the resources of the parent or subsidiary company--its workforce, management, facilities or other resources--will be provided or relied upon for contract performance such that the parent or affiliate will have meaningful involvement in contract performance.  See GM-Bulltrack, JV, supra.  

The RFP required offerors to provide at least two contract summaries that describe “the prime offeror’s own past performance.” RFP at 72.  The RFP provided that offerors could submit past performance information on work performed by “other previous incarnation[s] of its current organization.”  Id. at 73.  However, the RFP required offerors to “clearly define what entity performed the work if other than the prime offeror’s past performance is submitted.”  Id.  The RFP also required offerors to submit the stated amount of past performance information or “affirmatively state in its proposal that it possesses no relevant, directly related, or similar past performance.”  Id.

Here, the record shows that the two PPQs at issue list the contractor as “Halvik Corp (subsidiary SSB, INC)” and “Halvik Corp (SSB, Inc., wholly owned subsidiary).”  AR, Tab C18, Halvik PPQ 1; AR, Tab C19, Halvik PPQ 2.  Halvik’s past performance volume also describes SSB as “a wholly owned subsidiary of Halvik Corp” without further comment.  AR, Tab C14, Halvik Proposal Past Performance Volume at 1.  The agency does not point to, and our review of the contemporaneous record does not reveal, anything in either Halvik’s proposal or the agency’s evaluation record demonstrating that the agency was aware of SSB’s predecessor status during its past performance evaluation.  Nor does anything in Halvik’s proposal describe what resources would be provided by SSB and relied upon for performance under this contract.[24] 

On this record, we agree with the protesters that the agency’s evaluation of Halvik’s “own” past performance submission was either unreasonable or undocumented.  The RFP required offerors to clearly define which entity performed the work, and Halvik’s proposal defined SSB as having performed the work as a subsidiary.  To the extent the agency evaluated SSB as a subsidiary of Halvik’s, it has not shown that it considered what SSB resources would be used on this contract such that it reasonably considered SSB to have meaningful involvement in contract performance.  To the extent the agency considered SSB a predecessor to Halvik, the agency failed to document how it reasonably reached such a conclusion, given the language in Halvik’s proposal.[25]  In short, the agency should have concluded that Halvik failed to provide two of its own past performance references as required by the RFP.  As a result, Halvik’s past performance should have been evaluated as no better than “neutral” under the RFP’s past performance evaluation scheme.

However, given our conclusions above and below that the agency’s evaluation, comparative analysis of proposals, and award decisions were otherwise unobjectionable, we find that the agency’s unreasonable past performance evaluation of Halvik had no impact on Metric 8’s, RCH’s, or Stratera’s competitive standing, and that the protesters therefore were not prejudiced.  Competitive prejudice is an essential element of a viable protest.  Where a protester fails to demonstrate that, but for the agency’s actions, it would have had a substantial chance of receiving the award, our Office will not sustain the protest.  See, e.g., Access Interpreting, Inc., B‑413990, Jan. 17, 2017, 2017 CPD ¶ 24 at 5.  As discussed below, we find no basis to object to the agency’s conclusion that [Offeror A]’s proposal, which was the fourth-highest ranked small business proposal, was higher technically rated than all five protesters’ proposals here.  Accordingly, even if the agency had properly disqualified or downgraded Halvik’s proposal, none of the protesters would be in line for award before [Offeror A]. 

Agency’s Comparative Analysis and Source Selection Decisions 

The protesters challenge several aspects of the agency’s comparative analysis and source selection decisions.  We have reviewed the protesters’ arguments and the contemporaneous record and find that none of the protesters’ arguments provides a basis to sustain a protest.[26]  As discussed below in a few representative examples, we find that the agency’s comparative analysis and source selection decisions were reasonable, well documented, and in accordance with the terms of the solicitation.

As an initial matter, RCH argues that the agency’s award decision gave price no meaningful consideration because it did not compare the proposed prices of the offerors in its comparative analysis or otherwise trade off non-price proposals against proposed prices.  RCH Comments & Supp. Protest at 22-23; RCH Supp. Comments at 9‑10. 

Our Bid Protest Regulations contain strict rules for the timely submission of protests. These rules reflect the dual requirements of giving parties a fair opportunity to present their cases and resolving protests expeditiously without unduly disrupting or delaying the procurement process.  Verizon Wireless, B‑406854, B‑406854.2, Sept. 17, 2012, 2012 CPD ¶ 260 at 4.  Our timeliness rules specifically require that a protest based upon alleged improprieties in a solicitation that are apparent prior to the closing time for receipt of initial proposals or quotations be filed before that time.  4 C.F.R. § 21.2(a)(1). 

Here, the solicitation was clear that the agency would make awards using a “Highest Technically Rated with a Fair and Reasonable Price” basis.  RFP at 78-79.  The RFP specified that, after the initial evaluation of proposals, the agency would conduct an analysis of the non-price factors separate from its evaluation of price.  Id. at 79 (“[the agency] will conduct an analysis of Factors 1 through 4 to determine which offerors are the highest technically rated”).  At no point in the RFP does the agency state it intends to trade off the evaluated benefits of the non-price proposals against proposed prices. 

Accordingly, RCH’s protest that the agency should have selected awardees using a best‑value tradeoff of price and non-price proposals is a challenge to the terms of the solicitation, which were apparent prior to the closing time for receipt of proposals.  RCH filed its initial protest on April 26, 2021, more than eight months after the deadline for the receipt of proposals.  Therefore, we dismiss this protest ground as untimely.

MERPTech challenges the agency’s comparative analysis and source selection decisions as inconsistent with the terms of the solicitation, arguing that the agency gave undue weight to the program management and staffing approach factor.  MERPTech Comments & Supp. Protest at 19‑21.  In this regard, the protester contends that the agency unreasonably found [Offeror A]’s proposal to be higher technically rated than its own due to improper inflation of the importance of the program management and staffing approach factor when compared to the technical approach factor.[27]  Id.

The protester points to language in the solicitation establishing that the evaluation factors were listed in descending order of importance.  On this basis, MERPTech argues that, because the solicitation’s evaluation scheme established that the technical approach factor was the most important factor for small business offerors, MERPTech’s advantages under that one factor should have outweighed other offerors’ advantages under less important factors.  Protest at 19‑20. 

The agency responds that its evaluation and source selection decisions reasonably applied the correct relative weights of the evaluation factors in accordance with the RFP.  MOL, B-419759.7 at 65.  The agency contends that it properly found [Offeror A]’s proposal to be higher technically rated than MERPTech’s because the relative benefits identified in [Offeror A]’s program management and staffing approach outweighed the relative benefits identified in MERPTech’s technical approach.  Supp. MOL, B‑419759.7, B-419759.13 at 29-33. 

As noted above, our Office will not reevaluate proposals, nor substitute our judgment for that of the agency; rather, we will review the record to determine whether the agency’s evaluation was reasonable and consistent with the stated evaluation criteria and with applicable procurement statutes and regulations.  AECOM Mgmt. Servs., Inc., supra.  A protester’s disagreement with the agency’s judgment, without more, is insufficient to establish that the agency acted unreasonably.  Vertex Aerospace, LLC, supra.

Here, the agency concluded that “the combined quality of [] MERPTech’s strengths provide an overall marginally greater benefit than [Offeror A]’s strengths” under the technical approach factor.  AR, Tab B10, Best-Value Comparative Analysis, tab “01v13.”  The agency explained that it found “both vendors provide relatively equal benefit to the government in terms of their proposed Agile Development Process and System Test and Delivery approach.”  Id.  The agency noted that the discriminators under the technical approach factor were MERPTech’s proposed use of a “strangler pattern” and its approach to microservices architecture, which provided “marginally greater benefit to the government when compared to [Offeror A]’s” approach for system and software development, and architecture, engineering, and design.  Id.

The agency evaluated the offerors’ past performance records as providing “a relatively equal degree of confidence.”  Id.  Under the program management and staffing approach factor, the agency concluded that [Offeror A]’s proposal “provides greater benefit and poses a lower risk” than MERPTech’s.  The agency specified that [Offeror A]’s proposed technical [DELETED] and [DELETED] were positive discriminators in the area of program management when compared to MERPTech’s proposal.  Id.  The evaluators also stated that “MERPTech’s limited description of its on-boarding process poses a higher level of risk” when compared to [Offeror A]’s staffing approach.  Id.

The agency’s contemporaneous evaluation record reflects the agency’s in-depth consideration of the relative merits of [Offeror A]’s and MERPTech’s proposals under the non-price factors, including the various discriminators identified by the agency in its comparative analysis.  See AR, Tab B06, Technical Approach Consensus Evaluation, tabs “01F2”, “13F2”; see also AR, Tab B08, Program Management and Staffing Approach Consensus Evaluation, tabs “01F4”, “13F4.”  On this record, we find the agency’s comparative analysis unobjectionable. 

We note that the mere fact that an agency’s source selection decision turns on an evaluation consideration that is designated as less important is not inherently objectionable, since there is no requirement that the key award discriminator be found under the most heavily weighted factor.  See KIRA Inc., B‑287573.4, B‑287573.5, Aug. 29, 2001, 2001 CPD ¶ 153 at 6.  As discussed above, we did not find any merit to the protesters’ challenges to the agency’s evaluation of [Offeror A]’s and MERPTech’s proposals.  We find it consistent with the evaluation criteria that a proposal with a significantly more advantageous approach to program management and staffing could be higher technically rated than a proposal with more marginal advantages in technical approach.  MERPTech has not demonstrated that the agency’s judgments were inconsistent with the stated evaluation criteria or otherwise unreasonable.  We deny this ground of protest. 

Stratera, M6-VETS, and RCH also challenge the agency’s comparative analysis of proposals, including the use of the transitive property to avoid comparing all proposals head-to-head.  In this regard, the protesters allege that the agency’s failure to compare the protesters’ proposals head-to-head with the awardees’ as part of its comparative analysis is unreasonable.  In addition, these protesters argue that the agency cannot demonstrate that it considered the combined benefits of the specific features of a given awardee’s proposal to exceed the benefits found in a protester’s proposal.  See Stratera Comments & Supp. Protest at 10; see also RCH Comments & Supp. Protest at 21 (“the [a]gency’s approach masked the qualitative differences the comparative assessment is designed to identify and value”); see also M6-VETS Comments & Supp. Protest at 16 (“the transitive property of inequality has no basis in procurement law and raises concerns of false equivalency”).

While source selection officials are required to evaluate submitted proposals and make a reasoned source selection decision, our Office has found the “indirect” comparison of proposals to be unobjectionable.  DMS Int’l, B-409933, Sept. 19, 2014, 2014 CPD ¶ 278 at 5.  Specifically, we have found transitive analysis of evaluated proposals to be reasonable where the record shows the agency took into account all the advantages offered by the proposals.  See Client Network Servs., Inc., B-297994, Apr. 28, 2006, 2006 CPD ¶ 79 at 9 (“Since the SSA determined that QSS’s proposal was a better value than CNSI’s, and that CSC’s was a better value than QSS’s, we think it follows that the agency effectively found that CSC’s proposal was a better value than CNSI’s, even without a direct comparison of the two.”).

Given our conclusions above--that the agency’s evaluation and comparative analysis of proposals were generally reasonable, with the exception of the past performance evaluation of Halvik--we find that the protesters’ objections to the agency’s use of an indirect or transitive comparison of proposals do not provide a basis to sustain a protest.  Our review of the record shows that the agency clearly documented why it considered [Offeror A]’s proposal to provide greater benefit than the proposals submitted by the protesters.  See AR, Tab B10, Best-Value Comparative Analysis, tabs “01v12”, “01v13”, “01v14”, “01v16”, and “01v23.”  The agency also clearly documented why it considered Steampunk’s, RIVA’s, Halvik’s, and SAIC’s proposals each to provide greater benefit than [Offeror A]’s proposal.  Id., tabs “01v08”, “01v18”, “01v19”, and “01v22.”  Finally, the agency clearly documented why it considered Booz Allen’s proposal to provide greater benefit than SAIC’s proposal.  Id., tab “19v03.” 

In short, we see no basis to question the agency’s transitive determination that Booz Allen, SAIC, Steampunk, and RIVA each submitted higher technically rated proposals than the protesters.  While the agency unreasonably evaluated Halvik’s past performance, we explained above that the protesters were not prejudiced by the agency’s actions because, even if the agency had properly disqualified or downgraded Halvik’s proposal, none of the protesters would be in line for award before [Offeror A].

The protests are denied.

Thomas H. Armstrong
General Counsel


[1] The agency amended the RFP twice.  All citations to the RFP in this decision are to the conformed version issued as part of amendment 0002. 

[2] The agency amended the SSP once.  Citations in this decision are to the amended SSP dated September 29, 2020.

[3] The SSP used the same definitions for the adjectival ratings under the technical approach, and program management and staffing approach factors.  AR, Tab B01, SSP at 8.

[4] The SSP defined a strength as an “aspect of an offeror’s proposal that increases the potential for successful contract performance and/or has a positive impact for the government.”  AR, Tab B01, SSP at 9.  The SSP defined a weakness as a “flaw in the proposal that increases the risk of unsuccessful contract performance and has a negative impact for the government.”  Id.  The SSP defined a significant weakness as a “flaw in the proposal that appreciably increase the risk of unsuccessful contract performance and has a negative impact for the government.”  Id.  The SSP defined a deficiency as a “material failure of a proposal to meet a Government requirement or a combination of related weaknesses that increase the risk of unsuccessful contract performance to an unacceptable level.”  Id.

[5] The definition of an “unsatisfactory” rating under the technical approach, and program management and staffing approach factors is not relevant to the resolution of these protests.

[6] With regard to size, relevant past performance was defined as a minimum value of $1 million for a 12-month period of performance.  RFP at 82.  With regard to scope, relevant past performance would include a contract summary similar in scope “to one or more [of] the services and related items listed in the BOSS PWS Section 3.”  Id.  With regard to complexity, the RFP required that an offeror’s past performance collectively reference contracts and orders that involve Agile development, automated testing, [development and operations or development, security, and operations services], and operation and maintenance support services.  Id.

[7] The SSP defined an “increases confidence” finding as a “note-worthy past performance finding” that would increase the agency’s confidence that the solicitation requirements would be met in a timely and cost effective manner while a “decreases confidence” finding would “decrease the [agency’s] confidence” in the same.  AR, Tab B01, SSP at 9. 

[8] As relevant here, the agency’s evaluation of the offerors’ approaches to task order transitions was to consider the proposed significant actions and notional timeframes.  Also, the agency’s evaluation of the staffing approach was to consider the proposed approach to motivate and retain qualified personnel.  RFP at 83. 

[9] The transitive property of inequality can be expressed as:  if a is greater than b and b is greater than c, then a must be greater than c. 

[10] Our Office received two other protests of the awards under this solicitation, both from large businesses.  These protests were addressed in other decisions. 

[11] These arguments are distinct from those discussed below, alleging that the agency unreasonably evaluated the underlying proposals and accordingly failed to properly consider the value of the various proposals when making its source selection decisions.

[12] M6-VETS notes that the agency rated each of Steampunk’s, Halvik’s, and Stratera’s technical approaches “satisfactory” despite assessing these offerors three strengths and one weakness, two strengths and zero weaknesses, and one strength and two weaknesses, respectively, under that factor.  M6-VETS Comments & Supp. Protest at 19. 

[13] In addition to the issues discussed below, Stratera also generally challenges the agency’s evaluation of RIVA’s proposal under the technical approach factor, alleging that RIVA does not possess the relevant experience to have proposed a “superior” technical approach.  Stratera Protest at 21-22.  Stratera’s protest does not point to, and our review of the solicitation does not reveal, any requirement for the agency to evaluate a protester’s experience under the technical approach factor.  Accordingly, we deny this ground of protest. 

[14] Stratera initially contended that the agency disparately evaluated its and M6-VETS’s technical approaches to key performance indicators where M6-VETS was assessed a strength and Stratera was not despite both allegedly proposing “identical proposal features.”  Stratera Comments & Supp. Protest at 33.  Stratera also contended that the agency similarly disparately evaluated its and MERPTech’s technical approaches with regard to their proposed uses of microservices architecture.  Id. at 35.  Stratera later withdrew these grounds of protest.  Stratera Supp. Comments at 18 n.7.

[15] Scrum and Kanban are two different development methodologies that the PWS stated could “[l]ead to success” when used in line with the agency’s “newly enhanced Agile practices.”  PWS at 3. 

[16] RCH initially argued that the agency unreasonably evaluated its approach to recruiting what the RFP called “T-shaped resources,” but later withdrew this ground of protest.  RCH Comments & Supp. Protest at 14-15; RCH Supp. Comments at 2 n.2. 

[17] Metric 8 points to the following language from its proposal:  “team Metric 8 uses a multitude of retention strategies to ensure that our combined staff know our mutual cultures, our focus on customer service, and our commitment to employee growth and development.”  Metric 8 Supp. Comments at 12 (citing AR, Tab D16, Metric 8 Proposal Program Management and Staffing Approach Volume at 15).  Metric 8 argues, without explanation, that this language is essentially identical to Halvik’s language, quoted above, describing its culture based on open and frequent communication, innovation, and new technologies.  Id.

[18] Metric 8 also argues that the agency’s arguments in response to this protest ground are post hoc rationalizations.  See Metric 8 Supp. Protest at 15-16.  However, as we discussed above, it also argued that its proposal warranted a strength even considering the agency’s post-protest explanation. 

[19] M6-VETS also argues that the agency unequally evaluated its and Steampunk’s transition out approaches where it did not assess the same significant weakness to Steampunk for a “not that different” proposal.  M6-VETS Comments & Supp. Protest at 8.  The record demonstrates that, unlike M6-VETS’s approach described below, Steampunk proposed notional timeframes and provided some concrete steps for its significant transition out actions.  AR, Tab C37, Steampunk Proposal Program Management and Staffing Approach Volume at 10.  We find that M6-VETS has not established that the differences in the evaluation did not stem from differences between the offerors’ proposals.  Nexant Inc., supra.

[20] RCH initially raised a substantially similar argument challenging the agency’s evaluation of its transition out approach.  RCH Protest at 10.  However, after receipt of the agency report, RCH withdrew this ground of protest.  RCH Comments & Supp. Protest at 1 n.2.  RCH nominally states in its comments and supplemental protest that the agency failed to reasonably evaluate its transition out approach, but then proceeds to raise an argument challenging the agency’s evaluation of its and Halvik’s staffing approaches.  Id. at 1-4.

[21] The agency notes that Stratera could have provided more detail regarding the timeframes of its proposed significant transition out actions, which may have resolved the risk of a short transition out timeframe, but did not do so.  Id. at 43.

[22] Stratera alternatively argues that the agency evaluated this weakness disparately, contending that the agency failed to assess a similar weakness to SAIC when it proposed a 2- to 4-week transition out timeframe.  Stratera Comments & Supp. Protest at 25.  However, the record is clear that SAIC proposed a timeframe that could be twice as long as Stratera’s and described specific meetings and trainings where it would communicate with the incoming vendor.  AR, Tab C48, SAIC Proposal Program Management and Staffing Approach Volume at 10.  On this record, we find that Stratera has not demonstrated that the differences in its and SAIC’s evaluations did not stem from differences between their proposals.  Accordingly, we deny this ground of protest.  See Nexant Inc., supra.

Stratera also argues the agency evaluated this weakness disparately where other offerors received allegedly similar weaknesses for failing to propose any notional timeframes at all.  Stratera Comments & Supp. Protest at 25-26.  Despite its objection, Stratera has not shown that the agency’s evaluation was not in accordance with the terms of the RFP.  We find it reasonable that the agency could assess separate weaknesses for either failing to propose any notional timeframe or for proposing too short a notional timeframe.  We deny this ground of protest. 

[23] In response to the multiple protests, Halvik attached a declaration from its president, certain board of directors documents, press releases, and a copy of an interim contract awarded directly to Halvik.  The interim contract was a follow-on contract to one addressed by one of the past performance references at issue.  Halvik Supp. Comments, B-419759.4, B-419759.11, Attachs.

[24] Notably, SSB was not listed as a team member in Halvik’s proposal.  See AR, Tab C12, Halvik Proposal Small Business Participation Volume at 1-4.

[25] Based on the documents submitted by Halvik in response to these protests, we have no reason to doubt that SSB was in fact acquired and absorbed by Halvik and that several former SSB resources would be utilized on this contract.  However, we see nothing in the contemporaneous record showing that Halvik included this information in its proposal or that the agency was otherwise aware of it. 

[26] Stratera initially argued that the comparative analysis of proposals relied solely on the adjectival ratings of proposals instead of considering each proposal’s underlying individual merits.  Stratera Protest at 8.  Stratera later withdrew this ground of protest.  Stratera Comments & Supp. Protest at 3 n.1. 

[27] M6-VETS similarly contends that the agency ascribed undue weight to the program management and staffing approach factor compared to the technical approach factor.  M6-VETS argues that, because the technical approach factor was identified as the most important evaluation factor for small business offerors, it was unreasonable for the agency to make awards to small business offerors that received a “satisfactory” technical approach rating when other non-awardees received a technical approach rating of “superior.”  M6-VETS Comments & Supp. Protest at 11.  However, we find that M6-VETS cannot demonstrate that it was prejudiced here.  The record shows that M6‑VETS was rated “satisfactory” under the technical approach factor and would therefore not be in line for award even if the agency had conducted its comparative analysis as M6‑VETS argues it should have.  See AR, Tab B06, Technical Approach Consensus Evaluation, tab “12F2.” 
