
Solers, Inc.

B-414672.3, B-414672.8, Oct 09, 2018

Highlights

Solers, Inc., of Arlington, Virginia, protests its failure to receive a contract award under request for proposals (RFP) No. HC1047-17-R-0001, issued by the Department of Defense (DoD), Defense Information Systems Agency (DISA), for information technology (IT) engineering services. The protester challenges the agency's evaluation of its proposal, as well as the agency's evaluation of the proposals of several awardees. The protester also contends that the agency abused its discretion in failing to conduct discussions with offerors and that the agency failed to meaningfully consider price in its best-value tradeoff analysis.

We sustain the protest in part, deny it in part, and dismiss it in part.


DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  Solers, Inc.

File:  B-414672.3; B-414672.8

Date:  October 9, 2018

Michael Gardner, Esq., Shomari B. Wade, Esq., and Brett Castellat, Esq., Greenberg Traurig, LLP, for the protester.
Daniel R. Forman, Esq., and Laura J. Mitchell Baker, Esq., Crowell & Moring LLP, for Vencore, Inc.; and Deneen J. Melander, Esq., Richard A. Sauber, Esq., and Lanora C. Pettit, Esq., Robbins, Russell, Englert, Orseck, Untereiner & Sauber LLP, and Kenneth M. Reiss, Esq., Northrop Grumman Corporation, for Northrop Grumman Systems Corporation, the intervenors.
Sarah L. Carroll, Esq., and Aubri Dubose, Esq., Defense Information Systems Agency, for the agency.
Elizabeth Witwer, Esq., and Jennifer D. Westfall-McGrail, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1.  Protest challenging the agency's evaluation of the protester's proposal under the most important non-price factor is sustained where the record demonstrates that the agency's conclusion regarding the impact of an assigned strength is inconsistent with the underlying evaluation.

2.  Protest alleging that agency identified strengths in other offerors' proposals, but unreasonably failed to recognize similar strengths in the protester's proposal is sustained where the agency did not provide a meaningful explanation for differences in its assignment of strengths to the proposals.

3.  Protest alleging that the protester's proposal warranted additional strengths under the problem statements factor is dismissed as untimely where the agency disclosed the lack of strengths in the debriefing and the protester first challenged the lack of strengths in its comments on the agency's report.

4.  Protest alleging that the agency improperly made award to an offeror who required execution of binding arbitration agreements as a condition of employment is denied where there is no evidence that the protester was prejudiced by any alleged error.

5.  Protest challenging the agency's decision not to conduct discussions is denied where the protester erroneously contends that certain factors must be present in Department of Defense procurements valued over $100 million in order for an agency to reasonably forego discussions.

6.  Protest challenging the agency's best-value tradeoff is sustained where the record shows that the agency performed a mechanical analysis that failed to meaningfully consider price and resulted in the exclusion of technically acceptable proposals.

DECISION

Solers, Inc., of Arlington, Virginia, protests its failure to receive a contract award under request for proposals (RFP) No. HC1047-17-R-0001, issued by the Department of Defense (DoD), Defense Information Systems Agency (DISA), for information technology (IT) engineering services.[1]  The protester challenges the agency's evaluation of its proposal, as well as the agency's evaluation of the proposals of several awardees.  The protester also contends that the agency abused its discretion in failing to conduct discussions with offerors and that the agency failed to meaningfully consider price in its best-value tradeoff analysis. 

We sustain the protest in part, deny it in part, and dismiss it in part.

BACKGROUND

DISA issued the RFP on February 22, 2017, using full and open competition, with the intent to establish a Multiple Award Task Order Contract (MATOC) referred to as the Systems Engineering, Technology, and Innovation (SETI) contract.  Agency Report (AR), Tab 1, RFP at 1, 11, 102.  The primary objective of the SETI contract is to provide engineering and technical support, services, and products globally to DoD, DISA, and DISA mission partners.  Id. at 12.  In this regard, the SETI contract "provides an overarching streamlined process for ordering a wide variety of critical IT engineering performance-based services while ensuring consistency and maximum opportunity for competition."  Id.

The scope of the SETI contract includes a broad array of research and development, as well as "critical technical disciplines core to engineering, delivering, and maintaining DoD and DISA IT products and capabilities."  Id.  The performance work statement (PWS) identified the following eight task areas:  (1) system engineering; (2) design analysis engineering; (3) systems architecture; (4) software systems design and development; (5) systems integration; (6) systems test and evaluation; (7) systems deployment and life-cycle engineering; and (8) special systems engineering requirements.  Id. at 12-13.

The RFP anticipated the award of multiple indefinite-delivery, indefinite-quantity (IDIQ) contracts in two pools:  an unrestricted pool and a restricted pool.[2]  AR, Tab 5, RFP Amend. 4, at 30.  The protester submitted a proposal for consideration in the unrestricted pool, and the awards at issue here relate to that pool.  The RFP indicated that DISA intended to award approximately 10 contracts on an unrestricted basis and approximately 20 contracts on a restricted basis.  Id.  However, the agency expressly reserved the right to award more, fewer, or no contracts at all.  Id.

The RFP provided that the ordering period would consist of a 5-year base period and a 5-year option period.  AR, Tab 1, RFP at 41.  Orders issued under the contract would be performed on a fixed-price, cost reimbursement, and/or time-and-materials basis, or a combination thereof, and might also include incentives.  AR, Tab 5, RFP Amend. 4, at 30.  The maximum dollar value for all contracts, including the base and option periods, is $7.5 billion.  AR, Tab 1, RFP, at 8. 

Evaluation Criteria

The solicitation provided for award on a best-value tradeoff basis consisting of price and the following four non-price factors, listed in descending order of importance:  (1) innovation, (2) past performance, (3) problem statements, and (4) utilization of small business.  AR, Tab 5, RFP Amend. 4, at 50-51.  When combined, the non-price factors were significantly more important than price.  Id. at 51.  The RFP indicated that DISA intended to award without discussions, but also reserved the right to conduct discussions if in the agency's best interest.  Id. at 33, 50, 58.

With respect to the most important factor, innovation, section L of the RFP provided a lengthy explanation of how DISA viewed innovation, id. at 39-42, including the following definition of the term:

Definition of Innovation as it relates to the evaluation of this factor:

To DISA, and the DoD, fostering a creative culture and driving Innovation in defense of the country are paramount success criteria in executing the SETI Contract.  In the SETI procurement, the Government is looking for innovative companies that accelerate attainment of new information system capabilities.  In this context and for the evaluation of this factor, "innovative" means -

(1) any new technology, process, or method, including research and development; or

(2) any new application of an existing technology, process, or method.

Id. at 39.  In explaining how DISA viewed innovation, the RFP also provided three categories of innovation.  "Level 1" included incremental improvements, defined as "[t]ypically representative of smaller tweaks that advance the core mission and/or a focused effort on continuous improvements to current processes, customer experiences, and mission services."  Id.  "Level 2" included major advancements, defined as "[t]ypically representative of creating a new and enhanced way of doing business and/or a new and enhanced way for the customer to interact with the system."  Id.  Finally, "Level 3" included disruptive innovation, defined as "[t]ypically representative of an innovation that transforms an existing market or sector by introducing simplicity, convenience, accessibility, and affordability where complication and high cost are the status quo."  Id.

As relevant here, section L of the RFP instructed offerors to address the following five criteria in their proposals in order to demonstrate their capabilities and proposed efforts to achieve and provide innovation: 

  • Section L.4.2.3.1, Corporate Philosophy/Culture on Innovation
  • Section L.4.2.3.2, Investment in Innovation
  • Section L.4.2.3.3, History of Engineering and Deploying Innovative Solutions
  • Section L.4.2.3.4, Outreach and Participation
  • Section L.4.2.3.5, Certifications, Accreditations, Awards, Achievements, and Patents. 

Id. at 40-42.  Under each criterion, the RFP included several bullets that offerors were to address.  Id.  As an example, under section L.4.2.3.5, Certifications, Accreditations, Awards, Achievements, and Patents, offerors were to address the following:

The Offeror shall:

  • List and describe Awards and Achievements received that were awarded because of Innovation.
  • List and describe the company's patents owned and applied for and how they relate to the SETI PWS.
  • List of published papers regarding Innovation and successful implementation of an innovative process and solutions.
  • Any other related information regarding the company's achievements related to Innovation.
  • List Certifications and Accreditations that your company has that relate to Innovation.

Id. at 42.

In section M, the RFP provided that DISA would evaluate offerors' responses to the criteria described in section L.  Id. at 52.  The RFP further provided that the agency would consider strengths, weaknesses, significant weaknesses, uncertainties, and deficiencies in an offeror's proposal and would assign the proposal one of the following color/adjectival ratings reflecting an overall risk of "failure to be innovative:" blue/outstanding, purple/good, green/acceptable, yellow/marginal, or red/unacceptable.  Id. at 51-52.

Under the past performance factor (factor 2), offerors were instructed to submit up to three references that the agency would evaluate for recency, relevancy, and quality of effort.  Id. at 42-43, 52-54.  The agency also reserved the right to obtain past performance information from any sources available to the government.  Id. at 52.  In evaluating past performance, the RFP provided that the agency would assign one of the following performance confidence ratings:  substantial confidence, satisfactory confidence, neutral confidence, limited confidence, or no confidence.  Id. at 54.

Under the problem statements factor (factor 3), offerors were asked to demonstrate their technical skills and ingenuity by solving two hypothetical problems.  The purpose of these notional problems was to provide DISA insight into an offeror's ability to meet the requirements and the offeror's problem solving methodologies.  Id. at 44.  Offerors in the unrestricted pool were to respond to problem statements 1 and 2.  AR, Tab 4, RFP Amend. 3, Attach. 7, Problem Statements.  The problem statements were weighted equally and were each assigned one of the following ratings:  blue/outstanding, purple/good, green/acceptable, yellow/marginal, or red/unacceptable.  AR, Tab 5, RFP Amend. 4, at 55.

Under the utilization of small business factor (factor 4), offerors were instructed to submit a small business participation plan outlining how offerors intended to maximize the utilization of small businesses.  Id. at 46.  The RFP stated that the agency would evaluate the offerors' submissions for adequacy of the proposed small business participation plan and proposed goals, and would assign one of the following ratings to the proposal:  blue/outstanding, purple/good, green/acceptable, yellow/marginal, or red/unacceptable.[3]  Id. at 56.

Finally, with respect to price, offerors were instructed to input direct labor rates and indirect rates, to include profit/fee, into a spreadsheet provided by DISA in which the agency listed the applicable labor categories with an estimate of the labor hours for each category.  Id. at 48; AR, Tab 4, RFP Amend. 3, Attach. 9, Pricing Spreadsheet.  The RFP indicated that the spreadsheet would calculate the fully burdened labor rate for each category and the total proposed price for the offeror.[4]  AR, Tab 5, RFP Amend. 4, at 48, 57.  The RFP informed offerors that the agency would review the fully burdened fixed price labor rates for reasonableness and completeness using one of the techniques defined in section 15.404 of the FAR.[5]  Id. at 57.  The RFP provided that DISA would also evaluate proposals for unreasonably high prices and unbalanced pricing, and that the existence of such prices "may be grounds for eliminating a proposal from competition."  Id. at 49, 57.  Only the total proposed price would be used for tradeoffs between price and non-price factors.  Id. at 48.

Evaluation of Proposals

The agency received 35 timely proposals in response to the solicitation.  AR, Tab 65, SSDD, at 1; Memorandum of Law (MOL)/Contracting Officer's Statement (COS) at 23.  As set forth in the RFP, proposals were evaluated using a multi-step process.  AR, Tab 5, RFP Amend. 4, at 57-58.  First, proposals were evaluated by five separate and distinct evaluation boards--one for each factor--and consensus reports were prepared.  AR, Tab 26, Technical Evaluation Board (TEB) Report (Innovation); Tab 27, TEB Report (Past Performance); Tab 28, TEB Report (Problem Statements); Tab 29, TEB Report (Utilization of Small Business); Tab 60, Price/Cost Evaluation Report. 

Next, the contracting officer executed a memorandum for record (MFR) determining that the proposed prices of all offerors were fair and reasonable.  AR, Tab 61, Pricing MFR, at 1.  In reaching this conclusion, the contracting officer relied upon the Price/Cost Evaluation Report, id., which found that price reasonableness "is normally established by adequate competition" and, therefore, because 35 offerors submitted proposals, "it is implicit that price reasonableness has been determined at the macro level."[6]  See e.g., AR, Tab 60, Price/Cost Evaluation Report, at 4.  The pricing evaluation board did not compare offerors' prices; the contracting officer likewise noted the 35 offers and concluded that "[t]he presumption is that all proposed prices are fair and reasonable if there is adequate competition."  AR, Tab 61, Pricing MFR, at 1 (citing FAR § 15.404-1(b)(2)(i)).

Around the same time, the Source Selection Evaluation Board (SSEB) reviewed the evaluation boards' consensus reports, verified the evaluation results, and prepared an SSEB report.  AR, Tab 62, SSEB Report.  In some cases, the SSEB made changes to the assessments recommended by the various TEBs.  The SSEB did not conduct a comparative analysis of proposals.  Id. at 2. 

Next, the Source Selection Advisory Council (SSAC) reviewed the evaluation record and ratings, and prepared a report for the Source Selection Authority (SSA).  AR, Tab 63, SSAC Report.  Additionally, the SSAC performed a comparative analysis of the proposals, recommending award to the 14 offerors the SSAC deemed to be the "highest rated proposals" under the non-price factors.  The SSAC concluded that "the technical merit of those proposals justifies paying a price premium over lower-rated, lower-priced proposals."  Id. at 13.  Finally, the SSA reviewed and analyzed the SSEB and SSAC reports prior to executing a Source Selection Decision Document (SSDD).  AR, Tab 65, SSDD.

With respect to the agency's evaluation of Solers' proposal under the innovation factor, the agency identified one strength and no weaknesses and assigned the proposal a rating of green/acceptable.  AR, Tab 62, SSEB Report, at 172.  Under the past performance factor, the agency assigned Solers' proposal the highest rating of substantial confidence.  Id. at 173.  Under the problem statements factor, the agency identified no strengths and no weaknesses for either problem, and assigned both problem statements ratings of green/acceptable.  Id.  Under the utilization of small business factor, the agency identified two strengths and assigned the proposal a rating of purple/good.  Id. at 173-74.  Solers' total proposed price was [DELETED].  AR, Tab 60, Price/Cost Report, at 2. 

Below is a summary of the agency's ratings of the proposals of the 14 awardees and Solers:

Offeror | Factor 1: Innovation | Factor 2: Past Performance | Factor 3: Problem 1 | Factor 3: Problem 2 | Factor 4: Small Business | Factor 5: Price
IBM | Outstanding | Substantial | Acceptable | Good | Good | $234,935,046
Accenture | Outstanding | Neutral | Good | Good | Good | $179,623,424
Northrop Grumman | Good | Substantial | Good | Good | Good | $269,623,861
Vencore | Good | Substantial | Acceptable | Acceptable | Outstanding | $156,662,569
BAH | Good | Substantial | Acceptable | Acceptable | Good | $134,266,455
Leidos | Good | Substantial | Acceptable | Acceptable | Good | $144,662,993
Harris | Good | Satisfactory | Outstanding | Outstanding | Good | $184,752,341
BAE | Acceptable | Substantial | Outstanding | Good | Good | $156,962,323
NES | Acceptable | Substantial | Good | Good | Good | $137,217,707
LinQuest | Acceptable | Substantial | Good | Good | Good | $175,049,125
Deloitte | Acceptable | Substantial | Acceptable | Good | Outstanding | $126,180,137
Parsons | Acceptable | Substantial | Good | Acceptable | Outstanding | $178,181,282
KeyW | Acceptable | Substantial | Good | Acceptable | Good | $123,192,620
AASKI | Acceptable | Substantial | Good | Acceptable | Good | $181,683,274
Solers | Acceptable | Substantial | Acceptable | Acceptable | Good | $157,193,400

AR, Tab 65, SSDD, at 3, 16.

Best-Value Tradeoff Analysis

In conducting the source selection, the SSA adopted a multi-step methodology to narrow the pool of proposals under consideration.  First, noting the importance of the innovation factor, the SSA looked to the ratings under this factor. The SSA identified seven proposals that were rated either outstanding or good under the innovation factor:  IBM, Accenture, Northrop Grumman, Vencore, BAH, Leidos, and Harris.  Id. at 5.

For each of the seven offerors included in this first pool, the SSA noted where the offeror's price fell among all offerors.[7]  Id. at 5-9.  Next, the SSA listed, at a high level, the strengths (and any weaknesses) assigned to the proposal under the non-price factors.  After summarizing the underlying evaluation, the SSA concluded with regard to each of the seven proposals that the proposal's strengths under the non-price factors merited "its selection over lower-rated, lower-priced proposals."  See e.g., id. at 5.  Despite the SSA's stated conclusion, however, these awards were made without any consideration of whether associated price premiums were justified by increased technical merit, as set forth in greater detail later in the decision.

In order to identify a second pool of proposals for consideration, the SSA next looked to those proposals that received a rating of acceptable on the innovation factor.  Id. at 9.  Noting that there were 26 such proposals, the SSA further narrowed the pool to those proposals with a rating of acceptable on the innovation factor in combination with a rating of substantial confidence on the past performance factor.  Id.  Noting that there were still a large number of proposals (18), the SSA further narrowed the pool to those proposals with a rating of acceptable on the innovation factor, a rating of substantial confidence on the past performance factor, and a rating of good or better on both problem statements.  Id.  This resulted in a pool of three proposals:  BAE, NES, and LinQuest.  Id. at 9-11. 

For each of the three offerors included in this second pool, the SSA performed essentially the same analysis as used for the first pool.  That is, the SSA noted where the offeror's price fell among all offerors; listed, at a high level, the strengths assigned to the proposal under the non-price factors; and concluded that strengths under the non-price factors merited "selection over lower-rated, lower-priced proposals."  See e.g., id. at 10.

At this point, the SSA considered concluding the tradeoff analysis in light of the fact that the agency had selected 10 proposals for award, which was the approximate number of anticipated awards set forth in the RFP.  Id. at 11; AR, Tab 5, RFP Amendment 4, at 30.  The SSA noted, however, that four additional proposals--those of Deloitte, Parsons, KeyW, and AASKI--were assigned essentially the same ratings as the proposals of BAE, NES, and LinQuest with the exception that the four additional proposals received one good rating and one acceptable rating for the problem statements.  Id. at 11.  The SSA decided to include these four proposals because more offerors could "potentially driv[e] prices down in future task order competitions."  Id.  The SSA further observed that "[t]wo of the next four most highly rated proposals also have comparatively low total proposed prices (ranked [DELETED] and [DELETED] by price), which could benefit future price competition."[8]  Id.  Like the proposals in the first two pools, the SSA recommended each of these four proposals for award on the basis that its strengths under the non-price factors merited its "selection over lower-rated, lower-priced proposals."  See id. at 11-13.

With respect to the remaining technically acceptable proposals that were not included in the above pools, the SSA considered the proposals of the two lowest-priced offerors, which we refer to herein as Offeror A (the lowest-priced offeror) and Offeror B (the second lowest-priced offeror).  Id. at 3, 13-14.  In both instances, the SSA confirmed the ratings on the non-price factors and concluded that "considering the selection methodology where all non-price factors are significantly more important than price, the SSAC does not believe [the proposal] merits an award as compared to the proposals with higher prices and higher technical merit, and I agree."  See, e.g., id. at 14. 

Finally, with respect to the 19 remaining technically acceptable proposals, including Solers' proposal, the SSA noted that the SSAC "took a final look at all of the ratings for all of the Offerors to see if there were any other proposals with significant technical merit."[9]  Id. at 14.  The SSA further noted that the SSAC "did not identify sufficient technical merit in any other proposals to justify their recommendation for award, and I agree."[10]  Id. at 13. 

In summarizing the SSAC's recommendation, with which the SSA agreed, the SSA stated that the SSAC "recommended award to the 14 highest rated proposals in the non-price factors, as identified above, because the technical merit of those proposals justifies paying a price premium over lower-rated, lower-priced proposals."  Id. at 16.  With respect to making any "additional awards," the SSA concluded that the remaining 21 proposals "do not have sufficient technical merit."  Id.

Notice of Award, Debriefing, and Protest

On June 14, the agency provided notice to all unsuccessful offerors, including Solers.  AR, Tab 66, Unsuccessful Offeror Letter (Solers).  Solers timely requested a debriefing, which the agency provided on June 15.  AR, Tab 98, Solers Debriefing.  In the debriefing letter, the agency advised Solers that it could request an enhanced debriefing pursuant to DoD Class Deviation 2018-0011, id. at 3, which Solers did by submitting additional questions.  AR, Tab 102, Enhanced Debriefing Questions from Solers.  The agency responded to the questions on June 25.  AR, Tab 106, Agency Response to Enhanced Debriefing Questions.

This protest followed on June 29.  On August 9, Solers filed its comments and two supplemental protests.  These filings were submitted as three separate documents in the same docket entry.  See Electronic Protest Docketing System, Docket Entry No. 37.  For ease of reference, in our decision, we refer to the comments as "Solers' Comments" and to its supplemental protests using the nomenclature employed by Solers in the docket entry, i.e., "Supplemental Protest--Unequal Treatment" and "Supplemental Protest--Employment Agreement."

DISCUSSION

Solers raises six primary grounds of protest:  (1) the agency unreasonably assigned Solers' proposal a rating of acceptable under the innovation factor; (2) the agency treated offerors unequally under the innovation factor by identifying strengths in the proposals of other offerors but ignoring similar strengths in Solers' proposal; (3) the agency unreasonably evaluated Solers' proposal under the problem statements factor; (4) the agency improperly evaluated the proposal of [DELETED], one of the awardees, regarding that offeror's compliance with a DFARS clause prohibiting entering into or enforcing binding arbitration agreements with employees; (5) the agency abused its discretion in making an award without conducting discussions; and (6) the agency's best-value tradeoff was flawed.  For the reasons discussed below, we sustain protest grounds 1, 2, and 6; we dismiss protest ground 3; and we deny protest grounds 4 and 5.[11]

Innovation Factor Rating

Apart from its arguments of disparate treatment, which we address in the subsequent section, Solers challenges the agency's rating of its proposal as merely acceptable under the innovation factor.  The protester argues that, according to the definition set forth in the solicitation, a rating of acceptable is applicable only where strengths and weaknesses are offsetting or will have little or no impact on contract performance.[12]  Solers argues that there was "no offset of strengths and weaknesses [in the evaluation of its proposal under the innovation factor] since Solers was not given any weaknesses."  Protest at 9.  Solers further argues that, although the SSEB found that the one strength identified in the protester's proposal would have little or no impact on contract performance, this finding was arbitrary and inconsistent with other findings of the SSEB.  As explained below, we conclude the protester's argument has merit.

It is well-established that the evaluation of proposals is a matter within the discretion of the contracting agency.  Vectrus Sys. Corp., B-412581.3 et al., Dec. 21, 2016, 2017 CPD ¶ 10 at 3.  An offeror's disagreement with an agency's judgment, without more, is insufficient to establish that the agency acted unreasonably.  Id.  In reviewing an agency's evaluation, we will not substitute our judgment for that of the agency, but instead will examine the agency's evaluation to ensure that it was reasonable and consistent with the solicitation's evaluation criteria and with procurement statutes and regulations.  MicroTechnologies, LLC, B-413091, B-413091.2, Aug. 11, 2016, 2016 CPD ¶ 219 at 4-5.

The record reflects that Solers' proposal was assigned one strength under the innovation factor for its history of engineering and deploying innovative solutions, and was rated as acceptable.  Specifically, the SSEB report provides, in relevant part:

1)L.4.2.3.3 History of Engineering and Deploying Innovative Solutions

(C-12-15, section C.3.1, 3-2)  The Offeror demonstrated exceptional experience developing many different types of technologies that directly align to DISA mission areas [DELETED].  The Offeror provided multiple examples of innovative research result[ing] in operational capabilities.  This will benefit the Government as their exceptionally strong history of innovation increases the probability of successful performance on future SETI task orders.

After review and discussion, the SSEB made no changes to the above strength.

With the final strength as listed above, the rating of Acceptable is confirmed.  The proposal addresses all Innovation elements and indicates an adequate approach and understanding of Innovation.  The one identified strength will have little or no impact on Contract performance.  Risk of unsuccessful performance is no worse than moderate.

AR, Tab 62, SSEB Report, at 172 (emphasis added).

As Solers correctly points out, the agency's conclusion that the identified strength would have little or no impact on contract performance is inconsistent with its preceding finding that Solers' proposal showed an "exceptionally strong history of innovation" that "increases the probability of successful performance on future SETI task orders."  Protest at 9 (citing AR, Tab 62, SSEB Report, at 172).  The same inconsistency appears in the evaluation of a number of other offerors--that is, the SSEB concluded that a strength increased the probability of successful performance only to assign a rating of acceptable on the basis that the strength will have little or no impact on contract performance.[13]  AR, Tab 62, SSEB Report, at 6, 32-33, 81, 85-86, 90, 96, 116, 142, 146, 187-89.

Source selection officers are responsible for reconciling contradictory evaluation findings that are potentially of significance to the agency's source selection decision, and where the record fails to demonstrate that the findings were reasonably reconciled, we will sustain a protest.  See CIGNA Gov't Servs., LLC, B-401062.2, B-401062.3, May 6, 2009, 2010 CPD ¶ 283 at 13-14.  Because the record here fails to explain how the agency could reasonably have concluded both that Solers' exceptionally strong history of innovation increased the probability of successful performance and that this strength would have little or no impact on contract performance, we sustain this basis of protest.[14]

Unequal Treatment

Solers also contends that DISA evaluated offerors unequally under the innovation factor.  Specifically, Solers claims that its proposal should have received additional strengths because it proposed capabilities comparable to those for which various other offerors received strengths.  Solers' Comments, Aug. 9, 2018, at 5-12; Solers' Supp. Protest--Unequal Treatment, Aug. 9, 2018, at 2-16.  We agree.

It is a fundamental principle of government procurement that agencies must treat offerors equally, which means, among other things, that they must evaluate proposals in an even-handed manner.  SRA Int'l, Inc., B-408624, B-408624.2, Nov. 25, 2013, 2013 CPD ¶ 275 at 10.  Where a protester alleges unequal treatment in an evaluation, we will review the record to determine whether the differences in ratings reasonably stem from differences in the proposals.  See SURVICE Eng'g Co., LLC, B-414519, July 5, 2017, 2017 CPD ¶ 237 at 9; Exelis Sys. Corp., B-407111 et al., Nov. 13, 2012, 2012 CPD ¶ 340 at 20-21.  For the majority of the assigned strengths, DISA has provided reasonable explanations demonstrating that differences in the evaluators' findings were based on meaningful differences between the proposals.  Discussed below are those areas in which we find that the agency has not provided a reasonable explanation for assigning a strength to another offeror's proposal, but failing to assign a strength to Solers' proposal for proposing similar capabilities.

  Relationships with Universities

First, Solers contends that it should have received a strength under bullet 1 of section L.4.2.3.4 of the RFP for its outreach, participation, and partnerships with universities.  In this regard, bullet 1 required offerors to describe the following:

Relationships, partnerships, and/or interactions with fundamental research and commercial/academic sector entities to work together to innovate, create or collaboratively solve problems or share information.  This may include hosting lab time/space to assist in solving problems, creating new services, certifying products and/or sharing material resources (funds, equipment, facilities) and human capital (students, faculty, staff, industry researchers, industry representatives).

AR, Tab 5, RFP Amend. 4, at 41-42.  The record reflects that the agency assigned a strength to the proposal of one of the awardees for its ties with a number of universities.  Solers contends that its proposal contained similar ties.  Solers' Supp. Protest--Unequal Treatment, Aug. 9, 2018, at 11-12. 

Specifically, with respect to the awardee's proposal, the SSEB found:

The Offeror [DELETED] at a number of universities.  These [DELETED] allow the Offeror to be exposed to new and innovative methods to solve current challenges.  Further, it allows the Offeror to determine which students to try to actively recruit.  This infusion of ideas makes the Offeror more creative and raises the probability of successful innovations during future SETI performance.

AR, Tab 62, SSEB Report, at 106 (emphasis added).  In responding to the protest, DISA emphasized that the awardee was assigned the strength for its [DELETED], "which exposes [the awardee] to 'new and innovative methods' and also provides an opportunity for [the awardee] 'to determine which students to try to actively recruit.'"  Supp. MOL/COS at 11.

Solers argues that it possesses similar ties with a number of universities that afford the same benefits touted by the agency above, namely exposure to new and innovative methods and an opportunity to recruit.  Solers' Supp. Protest--Unequal Treatment, Aug. 9, 2018, at 12.  Specifically, Solers' proposal provides, in relevant part:

Solers maintains strong ties with a number of universities [DELETED] because we recognize that having a pipeline of potential entry-level new hires is vital for the longevity of our company.  We have [DELETED] that allow us to [DELETED], and to allow us to gain fresh ideas and insights from those who are still in academia.  Maintaining ties to a number of universities not only increases our potential hiring pool, but it ensures that we get a diversity of skills, approaches, ideas, and training disciplines as a further way to ensure a broad base from which to get innovative ideas. . . .  In addition to our relationship with [DELETED], we are also connected to [DELETED].  This [DELETED] aids our recruiting efforts while our connection with [DELETED] helps us stay tied into the latest academic research in a number of technology areas pertinent to our business base.

AR, Tab 21, Solers Proposal, Vol. 2, Tab C, at C-17 (emphasis added).

In response, the agency concedes that Solers' [DELETED] offers the same benefits as the awardee's [DELETED], i.e., exposure to new and innovative methods and an opportunity to recruit.  Supp. MOL/COS at 11.  Nevertheless, the agency asserts that Solers' proposal did not merit a strength because "it had no equivalent of the awardee's [DELETED] in its proposal."  Id.  We find the agency's explanation to be unreasonable.

The contemporaneous record confirms that the agency assigned the strength to the awardee's proposal due to the benefits resulting from the awardee's [DELETED], not for the [DELETED] themselves.  In this respect, we agree with the protester's contention that, although Solers and the awardee employ different means by which to garner new and innovative methods or to recruit potential new hires, the resulting benefits are the same.  Accordingly, we conclude that the agency has failed to provide a meaningful explanation for its unequal treatment of the two offerors' proposals.

  Developing and Sustaining Solutions from Infancy to Full Maturity

Next, Solers contends that it should have received a strength under bullet 2 of section L.4.2.3.3 of the RFP for its track record of developing and sustaining solutions from infancy to full maturity.  Solers' Supp. Protest--Unequal Treatment, Aug. 9, 2018, at 3-4.  In this respect, bullet 2 required offerors to:

Describe your track record of developing solutions and successfully sustaining the solutions from infancy to full maturity.  It is desirable for the projects/programs/solutions to map to one of the SETI Task Areas.  Expressly list these.  Show proof of implementation on a live program, to include quantitative and qualitative measured benefits. Direct correlation to supporting the Warfighter or other DISA missions is desirable as well.

AR, Tab 5, RFP Amend. 4, at 41.

The record reflects that the agency assigned a strength to the proposal of one of the awardees for providing three examples of developing solutions from infancy to full development.  AR, Tab 62, SSEB Report, at 11.  Specifically, with respect to the awardee's proposal, the SSEB report provides:

L.4.2.3.3 History of Engineering and Deploying Innovative Solutions

The Offeror's response on page C-12 provided multiple, strong examples of taking a wide range of technologies from infancy to full deployment.  This level of experience of delivering solid innovative results raises the probability of success on future SETI task orders.

AR, Tab 62, SSEB Report, at 11. 

Solers argues that it too provided three examples of solutions that it developed from infancy to maturity across a wide range of technologies.  Solers' Supp. Protest--Unequal Treatment, Aug. 9, 2018, at 3 (citing AR, Tab 21, Solers Proposal, Vol. 2, Tab C, at C-14, C-15).  Moreover, Solers correctly notes that its examples mapped to more SETI task areas, meaning that they were more relevant to the requirement.  Id.  Despite this, the agency did not assign a strength to Solers' proposal for its examples.

In response, the agency acknowledges the benefits of the three examples described in Solers' proposal.  Supp. MOL/COS at 8.  In fact, the agency concedes that they merited a strength.  Id.  DISA explains, however, that the single strength assigned to Solers' proposal under bullet 1 of section L.4.2.3.3, which was assigned for the company's "history of engineering and deploying Innovative Solutions," also includes the three examples submitted in response to bullet 2 of section L.4.2.3.3.[15]  Id.  In other words, the agency assigned a single overarching strength that applied to aspects of Solers' proposal addressing two separate requirements of the RFP.  We find this argument to be unsupported and unreasonable. 

The plain language of the SSEB report discussing the strength assigned to Solers' proposal under bullet 1 does not expressly indicate that it also includes aspects of Solers' proposal that address bullet 2.[16]  See AR, Tab 62, SSEB Report, at 172.  Nor does it discuss Solers' ability to sustain "solutions from infancy to maturity."  Furthermore, the SSEB report does not reflect that two strengths were being combined, or provide a rationale for doing so.[17]  Id.  More importantly, however, the agency's position presupposes that bullet 1 and bullet 2 of section L.4.2.3.3 were indistinguishable or that it was impossible for offerors to receive a strength under different bullets of the same section L criterion.  The record reflects, however, that other offerors received multiple strengths under different bullets of the same section L criterion.[18]  See e.g., AR, Tab 26, TEB Report (Innovation), IBM's Evaluation (strengths assigned under bullets 1 and 3 of section L.4.2.3.3).

For the foregoing reasons, we conclude that, with respect to the examples discussed above, the agency has not provided reasonable explanations demonstrating that the unequal treatment was based on meaningful differences in proposals.  Accordingly, we sustain this protest ground.

Evaluation of Problem Statements

Solers also alleges that its proposal should have been assigned strengths for its response to the two problem statements.  Solers' Comments, Aug. 9, 2018, at 12-13.  We dismiss this allegation as untimely raised.  Our Bid Protest Regulations contain strict rules for the timely submission of protests.  Interactive Tech. Solutions, LLC, B-413665.2, B-413665.3, Mar. 1, 2017, 2017 CPD ¶ 82 at 7.  Pursuant to these rules, a protest based upon other than alleged improprieties in a solicitation must be filed not later than 10 calendar days after the protester knew, or should have known, of the basis for the protest, whichever is earlier.  4 C.F.R. § 21.2(a)(2).  Where a protester initially files a timely protest, and later supplements it with new grounds of protest, the later-raised allegations must independently satisfy our timeliness requirements, since our Regulations do not contemplate the piecemeal presentation or development of protest issues.  Vigor Shipyards, Inc., B-409635, June 5, 2014, 2014 CPD ¶ 170 at 5.

Here, Solers knew, or should have known, of this basis of protest on June 15 when it received its initial debriefing.  In the written debriefing, DISA informed Solers of the following:  (a) its adjectival ratings for each factor; (b) assigned strengths (if any) assigned to the proposal under each factor; and (c) a description of the basis for any assigned strengths.[19]  AR, Tab 98, Solers Debriefing.  Accordingly, Solers knew on that date that DISA had not assigned any strengths to its responses.  If Solers believed that its proposal warranted strengths under this factor, it should have identified those strengths in its initial protest.  Because Solers waited until its comments to raise this ground, we dismiss it as untimely.  Interactive Tech. Solutions, LLC, supra, at 7-8 (dismissing as untimely allegation that proposal warranted additional strengths where agency disclosed lack of assigned strengths in debriefing).

In response, Solers argues that its allegation should be characterized as one of disparate treatment, i.e., that Solers' proposal merited strengths for its problem statement responses because DISA assigned strengths to AASKI's proposal for similar responses.  Solers' Supp. Comments, Aug. 23, 2018, at 14.  Solers further argues that it could not have raised this allegation until after the agency produced the record.  Id.  We disagree. 

Although Solers claimed in its comments that the two offerors' proposals were "similar technically," Solers made this factual claim in the context of challenging the agency's tradeoff analysis.  Solers' Comments, Aug. 9, 2018, at 13 ("In summary, the [SSA] failed to fully justify the $24 million price premium between AASKI and Solers' proposals, even though both were similar technically.") (emphasis added).  See also id. at 12 ("The Agency evaluation failed by not assigning strengths to Solers' Problem 1 response, while also not fully considering the $24 million price premium between AASKI and Solers' proposals.").  Solers' comments do not reveal any substantive comparison of the problem statement responses of Solers and AASKI, nor an explanation of why the two responses might be considered similar.  See id. at 12-13.  As a result, we disagree with Solers' characterization of this allegation as one of disparate treatment and dismiss this protest ground.[20]

[DELETED]'s Employment Agreement

In its supplemental protest labeled "Employment Agreement," Solers argues that DISA unreasonably evaluated [DELETED]'s compliance with DFARS clause 252.222-7006.  Solers' Supp. Protest--Employment Agreement, Aug. 9, 2018, at 2-5.  This clause requires the contractor to agree not to enter into or enforce any employee agreement that requires mandatory arbitration to resolve "[a]ny claim under title VII of the Civil Rights Act of 1964" or "[a]ny tort related to or arising out of sexual assault or harassment, including assault and battery, intentional infliction of emotional distress, false imprisonment, or negligent hiring, supervision, or retention."  DFARS clause 252.222-7006(b)(1).[21]   

During its evaluation of proposals, the agency discovered that it had inadvertently omitted the DFARS clause from the solicitation.  AR, Tab 59, MFR on DFARS 252.222-7006, at 1.  Nevertheless, DISA concluded that it was still bound by the clause, id. (citing DFARS 222.7402), and proceeded to evaluate the sample employment agreements submitted by offerors.[22]  Solers contends that the sample agreement submitted by [DELETED] violates the DFARS clause and that the agency's conclusion to the contrary is unreasonable.  Solers' Supp. Protest--Employment Agreement, Aug. 9, 2018, at 2-5.

The record shows that [DELETED] submitted an employment agreement with its proposal and represented that the firm requires the agreement "to be signed by all incoming employees as a condition of employment."  AR, Tab 7, [DELETED] Proposal, Vol. 1, Tab I, at 13.  The submitted agreement mandates that employees "agree that any dispute or controversy arising out of or relating to any interpretation, construction, performance, or breach of this Agreement, shall be settled by arbitration[.]"  AR, Tab 7, [DELETED] Proposal, Vol. 1, Tab I, App. B-4, para. 8(a).  As Solers points out, unlike the agreements submitted by some of the other offerors, [DELETED]'s employment agreement does not expressly exclude claims stemming from Title VII and/or torts related to sexual assault or harassment.  Solers' Supp. Protest--Employment Agreement, Aug. 9, 2018, at 4. 

During its evaluation, the agency remarked that the mandatory arbitration clause in [DELETED]'s agreement "could run afoul" of the DFARS prohibition.  AR, Tab 59, MFR on DFARS 252.222-7006, at 1.  However, the agency concluded that the agreement "has a more narrow focus" and does not include Title VII claims or claims related to sexual assault or harassment.  Id.  In reaching this conclusion, the agency relies on the title of the agreement, "Employee Confidential Information & Invention Assignment Agreement," and other language in the agreement to conclude that the agreement is "limited" to issues related to "the protection of confidential information and allocating rights to intellectual property developed by the employee."  Id.

Although the agency is correct that the agreement relates primarily to the protection of confidential information and the parties' rights involving intellectual property, the agreement also governs a number of other employment-related issues, such as a requirement not to engage in conflicting employment or to solicit company employees to leave their employment.  AR, Tab 7, [DELETED] Proposal, Vol. 1, Tab I, App. B, paras. 3, 10.  Of particular note, the agreement requires employees to affirm that they will "diligently adhere to the Conflict of Interest Guidelines" included as an exhibit to the agreement.  Id., para. 6.  These guidelines are best described as principles of business ethics with which employees are required to comply.  AR, Tab 7, [DELETED] Proposal, Vol. 1, Tab I, App. B, Exh. C.  The agreement is clear that violation of the guidelines "may result in discharge without warning."  Id. at 2. 

Solers argues that two guidelines are noteworthy.  First, employees may not initiate or approve "personnel actions involving reward or punishment of employees or applicants where there . . . is or appears to be a personal or social involvement."  Id., para. 4.  Second, employees may not initiate or approve "any form of personal or social harassment of employees."  Id., para. 5.  Solers argues that the former could include actions related to "negligent hiring, supervision or retention" of employees as contemplated under the DFARS clause and that the latter is a sexual harassment policy also within the scope of the DFARS clause.  Solers' Supp. Protest--Employment Agreement, Aug. 9, 2018, at 4.  The agency responds--with no details--that the actions prohibited by the guidelines do not fall within the ambit of the DFARS provision.

Without further legal argument by the parties, it is not clear whether the actions described in the guidelines could fall within the type of claims described in the DFARS clause.[23]  In any event, even were we to find that the agreement did not comply with the DFARS clause, we conclude that the protester has not been prejudiced by any errors in this regard.  Competitive prejudice is an essential element of every viable protest; where, as here, the record establishes no reasonable possibility of prejudice, we will not sustain a protest even if a defect in the procurement is found.  See Procentrix, Inc., B-414629, B-414629.2, Aug. 4, 2017, 2017 CPD ¶ 255 at 11-12.

As explained above, the agency represents that it was required to include this clause in the resulting contracts, but inadvertently failed to do so.  See AR, Tab 59, MFR on DFARS clause 252.222-7006, at 1.  As a result, the agency represents that it intends to bilaterally modify the contracts to add the clause.  Id.; Supp. MOL/COS at 18.  The agency further represents that, if [DELETED] agrees to modify its contract to add this clause, it would be agreeing not to "[t]ake any action to enforce any provision of an existing agreement with an employee" that requires mandatory arbitration to resolve the types of claims set forth in the DFARS clause.  See DFARS clause 252.222-7006(b)(1)(ii).  If [DELETED] refuses to modify its contract, the agency represents that it will terminate [DELETED]'s contract.  Supp. MOL/COS at 19.  Either way, any action taken by the agency to correct this omission would not result in an award of a contract to Solers.  Accordingly, we deny this ground of protest because we conclude that Solers has not been prejudiced by any alleged errors.

Award Without Discussions

Solers also alleges that the agency abused its discretion by making the award without conducting discussions.  Protest at 17.  Solers relies upon section 215.306(c) of the DFARS, which provides that "[f]or acquisitions with an estimated value of $100 million or more, contracting officers should conduct discussions."  Based on this regulation, and the fact that this is a DoD procurement valued over $100 million, Solers argues that our Office should review the agency's decision not to conduct discussions. 

We recently had occasion to review the cited DFARS section.  In Science Applications Int'l Corp., B-413501, B-413501.2, Nov. 9, 2016, 2016 CPD ¶ 328 (hereinafter "SAIC"), we concluded that the regulation "is reasonably read to mean that discussions are the expected course of action in DoD procurements valued over $100 million, but that agencies retain the discretion not to conduct discussions if the particular circumstances of the procurement dictate that making an award without discussions is appropriate."  Id. at 9-10.  See also McCann-Erickson USA, Inc., B-414787, Sept. 18, 2017, 2017 CPD ¶ 300 at 9 n.10.  In SAIC, we further stated that "we see the inquiry here to be whether the record shows, given the particular circumstances of this procurement, that there was a reasonable basis for the agency's decision not to conduct discussions."  SAIC, supra, at 10.  Applying this reasonableness standard, our Office determined that, under the particular circumstances of the procurement in SAIC, the agency's decision to forego discussions was reasonable where:  (1) the awardee had received a higher technical score under virtually all evaluation factors; (2) the awardee's price was reasonable; and (3) the protester's proposal contained deficiencies.  Id. at 10-11. 

Solers contends that our Office adopted a three-part test based on the factors above and that such a test must be met before an agency may dispense with discussions in a DoD procurement valued over $100 million.  Although we concluded in SAIC that these factors were sufficient to dispense with discussions, we did not find that these factors were necessary to award without discussions.  An agency's decision to award without discussions could be reasonable under other fact patterns as well.  As a result, our decision in SAIC is not properly read to mean that those three factors are the only factors a DoD agency can consider when deciding to award without discussions within the meaning of this DFARS clause.  We therefore reject Solers' argument that the factors discussed by our Office in SAIC are necessary in order for DISA to award without discussions here.

The agency explains that it elected to award without discussions because:  (1) offerors were on notice that the agency intended to make awards without discussions; (2) the initial proposals demonstrated clear technical advantages and disadvantages to differentiate among the offerors; and (3) the initial proposals demonstrated significant technical merit at fair and reasonable prices.  AR, Tab 64, Discussions MFR, at 1-3.  We find that Solers has not provided a sufficient basis to warrant sustaining its protest of the agency's decision not to conduct discussions.

Best-Value Tradeoff Analysis

Finally, Solers challenges the agency's tradeoff analysis and, in particular, argues that the agency failed to meaningfully consider price.  Protest at 10, 16.  Solers also focuses on the similarity in the non-price ratings of its proposal and that of AASKI, one of the awardees, arguing that the slightly higher rating for one of the problem statements in AASKI's proposal cannot reasonably justify the $24 million price premium associated with AASKI's proposal.  Id. at 16.  We sustain Solers' challenge to the tradeoff analysis.  The record shows that DISA performed a mechanical tradeoff that relied exclusively on adjectival ratings, excluded technically acceptable proposals without any consideration of the price of those proposals, and, in general, did not meaningfully consider price.

Source selection officials have considerable discretion in determining the manner and extent to which they will make use of technical and cost evaluation results, and their judgments are governed only by the tests of rationality and consistency with the stated evaluation criteria.  The SI Org., Inc., B-410496, B-410496.2, Jan. 7, 2015, 2015 CPD ¶ 29 at 14.  Where, as here, a solicitation provides that technical factors are more important than price in source selection, selecting a technically superior, higher-priced proposal is proper where the agency reasonably concludes that the price premium is justified in light of the proposal's technical superiority.  The MIL Corp., B-294836, Dec. 30, 2004, 2005 CPD ¶ 29 at 8.  

Any such conclusion, however, must be adequately documented and supported by a rational explanation as to why the higher-rated proposal is, in fact, superior, and why its technical superiority warrants paying a price premium.  Arcadis U.S., Inc., B-412828, June 16, 2016, 2016 CPD ¶ 198 at 10; Cyberdata Techs., Inc., B-406692, Aug. 8, 2012, 2012 CPD ¶ 230 at 5.  In sum, the documentation must show, not merely the tradeoff decision or business judgment made, but the rationale for that decision or judgment.  FAR § 15.308; Blue Rock Structures, Inc., B-293134, Feb. 6, 2004, 2004 CPD ¶ 63 at 5.

Here, instead of documenting a reasonable basis for the tradeoffs made, the record indicates that the agency mechanically made award to the 14 offerors whose proposals exhibited, in descending order, the best combination of adjectival ratings under the non-price factors.  In this regard, DISA grouped similarly rated proposals into pools and made awards to all offerors in those pools--regardless of price--until DISA reached what it referred to as a "clear break" in the proposals.  AR, Tab 65, SSDD, at 16.  The record reflects, however, that this "clear break" in the proposals was based exclusively on the adjectival ratings assigned to proposals under the non-price factors. 

For instance, in declining to make an award to Offeror B--the second lowest-priced offeror--the SSA observed that if the proposal had achieved a rating of substantial confidence under the past performance factor, rather than a rating of satisfactory confidence, the proposal would have been "in line for an award."  Id. at 14.  In this respect, the SSA noted, "Offerors with similar ratings in Factors 1 and 3 were only recommended for award if they achieved the highest rating in Factor 2," which Offeror B did not.  Id. at 14.  The SSA reviewed Offeror B's past performance and determined that the satisfactory confidence rating was reasonable.  The SSA's analysis, however, did not examine whether a proposal with a significant price advantage might be among those proposals offering the best value to the government notwithstanding a slightly lower rating in past performance.[24]  Accordingly, without any weighing of its low price, Offeror B was not recommended for award because its proposal was not assigned the necessary combination of adjectival ratings.

Likewise, in considering the proposal of [DELETED], the SSA concluded that, but for its rating of marginal under the innovation factor, it could have been "in line for an award" based upon the combination of ratings its proposal received under the other factors.  Id. at 15.  Finally, and perhaps most tellingly, in considering the proposals of those offerors whose combination of ratings did not place them in one of the pools, the SSA stated that the SSAC "took a final look at all the ratings for all of the Offerors to see if there were any other proposals with significant technical merit."  Id. at 14 (emphasis added).  Based on their ratings, the SSA concluded that those proposals did not have sufficient technical merit to warrant an award.  Id. at 13.

We have long recognized that an agency's source selection decision cannot be based on a mechanical comparison of the offerors' technical scores or ratings, but must rest upon a qualitative assessment of the underlying technical differences among competing offerors.  See The MIL Corp., supra, at 8 (sustaining protest where agency mechanically made award to all proposals that received "blue" ratings on the two non-price factors, and declined to make award to any proposal that did not receive a "blue" rating for the non-price factors).  See also One Largo Metro LLC, et al., B-404896 et al., June 20, 2011, 2011 CPD ¶ 128 at 14-15.  Here, in adopting such a mechanical approach, DISA failed to make a qualitative assessment of the technical differences among the competing proposals in order to determine whether the perceived technical superiority of those proposals receiving the best combination of ratings justified paying the price premium associated with those proposals. 

For example, as Solers points out, Protest at 16, the agency assigned the following similar ratings to the proposals of AASKI and Solers:

Offeror | Factor 1 (Innovation) | Factor 2 (Past Performance) | Factor 3 (Problem 1) | Factor 4 (Problem 2) | Factor 5 (Small Business) | Price
AASKI | Acceptable | Substantial | Good | Acceptable | Good | $181,683,274
Solers | Acceptable | Substantial | Acceptable | Acceptable | Good | $157,193,400

Despite this similarity in the ratings, there is no comparison of the underlying technical differences between the proposals or any explanation of why AASKI's alleged technical superiority merited a $24 million price premium.  The record here shows that Solers' proposal was essentially excluded from consideration for award--without any consideration of its price--because it did not receive the necessary combination of ratings under the non-price factors and, thus, did not fall before the "clear break" in proposals. 

In a tradeoff source selection process, however, an agency cannot eliminate a technically acceptable proposal from consideration for award without taking into account the relative cost of that proposal to the government.[25]  See e.g., Cyberdata Techs., Inc., supra, at 5 (protest sustained where technically acceptable proposal excluded from consideration for award without consideration of its price); System Eng'g Int'l, Inc., B-402754, July 20, 2010, 2010 CPD ¶ 167 at 5 (protest sustained where record shows that agency in best-value procurement performed tradeoff between two higher-rated, higher-priced quotations but did not consider the lower prices submitted by other lower-rated, technically acceptable vendors); Coastal Environments, Inc., B-401889, Dec. 18, 2009, 2009 CPD ¶ 261 at 4 (protest sustained where agency conducted a tradeoff between the two highest-rated, highest-priced proposals, but did not consider the lower prices offered by other lower-rated, technically acceptable offerors); Checchi and Co. Consulting, Inc., B-285777, Oct. 10, 2000, 2001 CPD ¶ 132 at 4 n.4 (protest sustained where agency failed to consider awardee's proposed costs at any time prior to award or to consider the proposed costs of offerors other than the awardee); Kathpal Techs., Inc.; Computer & Hi-Tech Mgmt., Inc., B-283137.3 et al., Dec. 30, 1999, 2000 CPD ¶ 6 at 9 (agency cannot eliminate a technically acceptable proposal from consideration for award without taking into account the relative cost of that proposal to the government). 

When using a tradeoff source selection process, if the agency excludes acceptable offerors without considering price, the agency has failed to conduct the essence of a tradeoff, which requires the agency to consider and trade off offerors' higher (or lower) prices in relation to the perceived benefits of the proposals.  Sevatec, Inc., supra, at 9; A&D Fire Protection, Inc., B-288852, Dec. 12, 2001, 2001 CPD ¶ 201 at 3.  Here, we find the agency's elimination of technically acceptable proposals, including Solers', without meaningful consideration of price, to be inconsistent with the agency's obligation to evaluate proposals under all of the solicitation's criteria, including price.

Finally, this brings us to the weight the agency afforded to price in the source selection process.  The Competition in Contracting Act of 1984 (CICA) requires agencies to include cost or price as an evaluation factor in every solicitation, and agencies must consider cost or price to the government in evaluating competitive proposals.[26]  10 U.S.C. § 2305(a)(3)(A)(ii); FAR § 15.304(c)(1); Lockheed Missiles & Space Co., Inc. v. Bentsen, 4 F.3d 955, 959 (Fed. Cir. 1993); Sevatec, Inc., supra, at 7; I.M. Sys. Grp., B-404583 et al., Feb. 25, 2011, 2011 CPD ¶ 64 at 7; Electronic Design, Inc., B-279662.2 et al., Aug. 31, 1998, 98-2 CPD ¶ 69 at 8.  Even where, as here, price is stated to be of significantly less importance than the non-price factors, an agency must meaningfully consider cost to the government in making its selection decision.  See Lockheed Missiles & Space Co., Inc. v. Bentsen, supra ("Moreover, the importance of price in a price/technical tradeoff must not be discounted to such a degree that it effectively renders the price factor meaningless."); Arcadis U.S., Inc., supra, at 10; Coastal Int'l Sec., Inc., B-411756, B-411756.2, Oct. 19, 2015, 2015 CPD ¶ 340 at 14.  In this respect, an evaluation and source selection that fails to give significant consideration to cost or price cannot serve as a reasonable basis for award.  I.M. Sys. Grp., supra, at 7; The MIL Corp., supra, at 9; Electronic Design, Inc., supra, at 8.  In our view, the record in this case demonstrates that the agency has failed to comply with the statutory requirement that agencies give cost or price meaningful consideration in source selection. 

Here, price was not considered in any meaningful way in the source selection decision.  In this respect, the record shows that price had no material impact on an offeror's ability to be selected for award.  Once the higher-rated proposals were identified, the agency did not perform a price/technical tradeoff; rather, award was based strictly on technical merit.[27]  In a tradeoff source selection process, however, an agency may not minimize the impact of price to such an extent that it becomes merely a nominal evaluation factor, because the essence of the tradeoff process is an evaluation of price in relation to the perceived benefits of an offeror's proposal.  Sevatec, Inc., supra, at 8 (citing FAR § 15.101-1(c)); Electronic Design, Inc., supra.

Although we acknowledge that the SSA repeatedly documented a nearly verbatim, one-sentence conclusion that, due to strengths on the non-price factors, the 14 awardees merited selection over lower-rated, lower-priced proposals, see e.g., AR, Tab 65, SSDD, at 9, we find such consideration of price to be nominal.  See Cyberdata Techs., Inc., supra, at 5 n.1 (sustaining protest where agency emphasized the importance of technical superiority and concluded that selection of the lower-priced proposals "would be at the reduction of technical quality and not worth a trade-off to that extent.").  Indeed, anything less would be to ignore price completely.

In sum, we find that the agency performed a mechanical tradeoff analysis that failed to meaningfully consider price and resulted in the exclusion of technically acceptable proposals.

Competitive Prejudice

Prejudice is an essential element of a viable protest.  AdvanceMed Corp., B-414373, May 25, 2017, 2017 CPD ¶ 160 at 16.  Here, but for the errors discussed above, the agency might have rated the protester's proposal more favorably under the innovation factor, the most important factor.  An increase in the standing of Solers' proposal under this factor could have resulted in a different best-value tradeoff determination, particularly had the agency meaningfully considered offerors' prices given that Solers' proposal was significantly lower priced than some of the awardees' proposals.  In these circumstances, we resolve doubts regarding competitive prejudice in favor of the protester.  Id.  Where, as here, the protester has shown a reasonable possibility that it was prejudiced by the agency's actions, we will sustain its protest.  Id. at 16-17.

RECOMMENDATION

For the reasons discussed above, we conclude that DISA's evaluation under the innovation factor and its best-value tradeoff source selection decision were unreasonable.  We further conclude that Solers was prejudiced by the agency's evaluation.  We recommend that DISA reevaluate Solers' proposal under the innovation factor and prepare a new source selection decision with appropriate consideration given to all evaluation factors.  We also recommend that the agency reimburse the protester's reasonable costs associated with filing and pursuing its protest, including attorneys' fees.  4 C.F.R. § 21.8(d).  The protester's certified claim for costs, detailing the time expended and costs incurred, must be submitted to the agency within 60 days after the receipt of this decision.  4 C.F.R. § 21.8(f).

The protest is sustained in part, denied in part, and dismissed in part.

Thomas H. Armstrong
General Counsel



[1] DISA awarded contracts to the following 14 firms:  AASKI Technology, Inc. (AASKI); Accenture Federal Services (Accenture); BAE Systems Technology Solutions & Services (BAE); Booz Allen Hamilton, Inc. (BAH); Deloitte Consulting LLP (Deloitte); Harris Corporation (Harris); International Business Machines Corporation (IBM); KeyW Corporation (KeyW); Leidos Innovations Corporation (Leidos); LinQuest Corporation (LinQuest); NES Associates LLC (NES); Northrop Grumman Systems Corporation (Northrop Grumman); Parsons Government Services, Inc. (Parsons); and Vencore, Inc.

[2] The restricted pool was set aside for small business concerns under North American Industry Classification System (NAICS) Code 541412.  AR, Tab 5, RFP Amend. 4, at 30.

[3] Under this factor, firms other than small businesses were also required to submit a subcontracting plan meeting the requirements of Federal Acquisition Regulation (FAR) clause 52.219-9 and Defense Federal Acquisition Regulation Supplement (DFARS) clause 252.219-7003.  AR, Tab 5, RFP Amend. 4, at 48. 

[4] The labor rates would be the "capped ceiling" for that labor category for any fixed-price or time-and-materials task orders.  AR, Tab 5, RFP Amend. 4, at 48.  Offerors would be permitted to propose less than the capped rates.  Id.

[5] The RFP provided that no cost or price realism analysis would be conducted at the IDIQ contract level, but that costs for cost-reimbursement work would be evaluated at the task order level and would be subject to cost realism analysis at that time.  AR, Tab 5, RFP Amend. 4, at 48.

[6] The agency repeated this same conclusion verbatim in its review of each offeror's proposal.

[7] For example, with respect to IBM's price, the SSA noted that "[t]his Offeror's total proposed price was ranked [DELETED] of 35."  Id. at 5. 

[8] Of course, the SSA's comment regarding price does not explain why the other two proposals were included.  Their inclusion appears to be based solely on the adjectival ratings assigned to the proposals.

[9]  In considering the ratings of the other proposals, the SSAC observed that one of the proposals appears to be an "anomaly" because, but for its rating of marginal under the innovation factor, it "could have been in line for award" based on the combination of ratings its proposal received under the other non-price factors.  AR, Tab 63, SSAC Report, at 12; Tab 65, SSDD, at 14-15.  Other than this one proposal, there is no discussion of any other proposal.

[10] The SSAC report does not contain any additional analysis, concluding likewise that it "did not identify sufficient technical merit in any other proposals to justify their recommendation for award."  AR, Tab 63, SSAC Report, at 10.

[11] Solers raises other collateral arguments.  We have reviewed these arguments and find that none provides a basis to sustain the protest.  For example, Solers claims that numerous provisions of the RFP were latently ambiguous, including the definition of the term "innovation."  Protest at 19-21; Solers' Comments, Aug. 9, 2018, at 18-24, 25.  Solers, however, fails to make the threshold showing required to prevail on this allegation.  For an ambiguity to exist, there must be two or more reasonable interpretations of a term.  ACADEMI Training Ctr., LLC dba Constellis, B-415416, Dec. 18, 2017, 2018 CPD ¶ 3 at 5.  Solers does not identify any alternative interpretation of the RFP language, let alone a reasonable interpretation.  Absent such a threshold showing, we conclude that this protest ground fails to state a legally sufficient basis of protest.  4 C.F.R. §§ 21.1(c)(4), (f); 21.5(f).  See Team People LLC, B-414434, B-414434.2, June 14, 2017, 2017 CPD ¶ 190 at 5 n.6 (dismissing protest as legally insufficient where protester does not identify, with any specificity, what solicitation term it believes to be ambiguous).

[12] The RFP defined a rating of "acceptable" under the innovation factor as follows:

Proposal addresses all Innovation elements and indicates an adequate approach and understanding of Innovation.  Strengths and weaknesses, if any, are offsetting or will have little or no impact on Contract performance.  Risk of unsuccessful performance is no worse than moderate.

AR, Tab 5, RFP Amend. 4, at 52. 

[13]  In at least four instances, the SSEB concluded that a weakness "increases the risk of unsuccessful SETI contract performance" only to assign a rating of acceptable on the basis that the weakness "will have little or no impact on contract performance."  AR, Tab 62, SSEB Report, at 111, 167, 177, 183.

[14]  It appears from the record that the SSEB's finding of little or no impact on contract performance (and the accompanying rating of acceptable) was based on the number of strengths/weaknesses assigned to the proposal, as opposed to an assessment of the qualitative information underlying the strengths/weaknesses.  For example, although the RFP explained that innovation fell into three levels, there is no indication in the record that the agency considered whether the "exceptional" examples of innovation in Solers' proposal fell within Level 1, 2, or 3.  Instead, in each instance in which a proposal was assigned three or fewer strengths in combination with two or fewer weaknesses and no significant weaknesses, the SSEB concluded that the strength(s) and/or weakness(es) would have little or no impact on contract performance.

[15] Bullet 1 of L.4.2.3.3 provided: 

Describe the company's history of engineering and deploying Innovative Solutions.  Innovative Solutions can be both successes and failures in an Offeror's history of driving Innovation.  DISA realizes that there is value in failure, and as such, historical citations can be those that were successfully implemented, or those that were engineered and tested up to Milestone C or Milestone C equivalent, but were never actualized in an operating environment.  These citations should date no further back than 10 years prior to proposal submission.

AR, Tab 5, Amend. 4, at 41.

[16] Likewise, the TEB's consensus report also does not indicate that the strength assigned to Solers' proposal under bullet 1 encompasses aspects of Solers' proposal responding to bullet 2.  See AR, Tab 26, TEB Report (Innovation), Solers' Evaluation.

[17] For this reason, the agency's reliance on its evaluation of another offeror's proposal to justify its treatment of Solers' proposal is inapposite.  DISA explains that it initially assigned two separate strengths to an offeror's proposal under bullets 1 and 2, but later combined them into one strength.  Supp. MOL/COS at 8 (citing AR, Tab 62, SSEB Report, at 142).  In doing so, the SSEB stated that the two strengths were assigned for examples that the agency did not consider to be "significantly different enough to warrant two separate strengths."  AR, Tab 62, SSEB Report, at 142.  In contrast, there is no explanation as to why two aspects of Solers' proposal were combined into a single strength.

[18] The agency points out that no offeror received separate strengths under bullets 1 and 2 of section L.4.2.3.3.  Supp. MOL/COS at 8.  This fact alone, without more, demonstrates very little.  The agency fails to show that these proposals contained aspects meriting strengths under both bullets, as Solers' proposal did.  In fact, the record seems to demonstrate the opposite with respect to the majority of the examples highlighted by the agency.  Regarding two examples ([DELETED] and [DELETED]), however, it appears at first glance that the agency may have combined strengths under bullets 1 and 2 into an overarching strength without any explanation, as it did for Solers.  Assuming for the sake of argument that those strengths are comparable, this does not prove that the agency's actions were proper.  Rather, it could demonstrate that the agency took the same unexplained and unsupported action in three instances.

[19] Solers' proposal was not assigned any weaknesses, significant weaknesses, uncertainties, or deficiencies.  Hence, none were discussed in the debriefing letter.

[20] Although not dispositive, we also point out that Solers did not include this alleged claim of "disparate treatment" in the supplemental protest filing in which it set forth its arguments of unequal treatment vis-à-vis numerous offerors.  See Solers' Supp. Protest--Unequal Treatment, Aug. 9, 2018.

[21] We recently concluded that compliance with this requirement relates to the propriety of the award decision and is not a matter of contract administration.  L3 Unidyne, Inc., B-414902 et al., Oct. 16, 2017, 2017 CPD ¶ 317 at 4 n.5.

[22] The RFP required offerors to "[d]escribe if your company requires any kind of agreements to be signed as a condition of employment and provide samples."  AR, Tab 5, Amend. 4, at 38.

[23] For instance, it is not clear to us whether a wrongful termination claim brought by an employee who is alleged to have initiated or approved of the personal or social harassment of employees could fall within the definition of a "tort related to or arising out of sexual assault or harassment," as set forth in the DFARS clause.

[24] It is worth noting that, like Offeror B, Harris received a rating of satisfactory for its past performance.  Unlike Offeror B, however, Harris' rating of satisfactory under this factor did not preclude it from receiving an award despite being priced $75 million higher than Offeror B.  This is because Harris' proposal was included in the first pool of proposals considered.  In this pool, the record reflects that awards were made strictly based upon the ratings under the innovation factor.  To the extent an offeror did not possess sufficient technical merit under another factor, the SSA simply omitted from the one-sentence award recommendation any reliance on that factor.  See e.g., AR, Tab 65, SSDD, at 6, 9 (omitting reliance on factor 2, Accenture & Harris); id. at 7, 8, 9 (omitting reliance on factor 3, Vencore, BAH, Leidos).

[25]  Although the agency might have elected to employ a "highest technically rated offerors with fair and reasonable prices" source selection methodology, see Sevatec, Inc., et al., B-413559.3 et al., Jan. 11, 2017, 2017 CPD ¶ 3 at 8-9, it did not.  Indeed, the RFP here provided that the agency intended to make awards using a best-value tradeoff evaluation scheme, which does not permit the agency to eliminate a technically acceptable proposal from consideration for award without taking into account the relative cost of the proposals.

[26]  There is no exception to the requirement set forth in CICA that cost or price to the government be considered in selecting proposals for award simply because the selected awardees will be provided the opportunity to compete for task orders under the awarded contracts.  The MIL Corp., supra, at 9; C.W. Gov't Travel, Inc., B-295530.2, et al., July 25, 2005, 2005 CPD ¶ 139 at 6.

[27] In this regard, there is no indication that price played a role in determining the "clear break" in the proposals. 
