OGSystems, LLC

B-414672.6, B-414672.9: Oct 10, 2018


DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  OGSystems, LLC

File:  B-414672.6; B-414672.9

Date:  October 10, 2018

C. Peter Dungan, Esq., and Jason A. Blindauer, Esq., Miles & Stockbridge P.C., for the protester.
Daniel R. Forman, Esq., and Laura J. Mitchell Baker, Esq., Crowell & Moring LLP, for Vencore, Inc.; and Deneen J. Melander, Esq., Richard A. Sauber, Esq., and Lanora C. Pettit, Esq., Robbins, Russell, Englert, Orseck, Untereiner & Sauber LLP, and Kenneth M. Reiss, Esq., Northrop Grumman Corporation, for Northrop Grumman Systems Corporation, the intervenors.
Sarah L. Carroll, Esq., and Aubri Dubose, Esq., Defense Information Systems Agency, for the agency.
Kenneth Kilgour, Esq., and Jennifer D. Westfall-McGrail, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1.  Protest challenging the Source Selection Evaluation Board's (SSEB) rationale for removing a strength assigned by lower-level evaluators is sustained where the SSEB's rationale is not meaningfully explained in the contemporaneous record.

2.  Protest that the agency abused its discretion in failing to conduct discussions under a Department of Defense procurement valued at over $100 million is denied where the record shows that the agency's decision not to conduct discussions was reasonable.

3.  Protest challenging the agency's best-value tradeoff is sustained where the record shows that the agency performed a mechanical analysis that failed to meaningfully consider price and resulted in the exclusion of technically acceptable proposals from consideration for award.

DECISION

OGSystems, LLC, of Chantilly, Virginia, protests its failure to receive a contract award under request for proposals (RFP) No. HC1047-17-R-0001, issued by the Department of Defense (DoD), Defense Information Systems Agency (DISA), for information technology (IT) engineering services.[1]  The protester challenges the reasonableness of the agency's evaluation of proposals under the non-price factors. The protester also contends that the agency abused its discretion in failing to conduct discussions with offerors and that the agency's best-value tradeoff analysis was unreasonable.

We sustain the protest in part and deny it in part.

BACKGROUND

DISA issued the RFP on February 22, 2017, using full and open competition, with the intent to establish a Multiple Award Task Order Contract (MATOC) referred to as the Systems Engineering, Technology, and Innovation (SETI) contract.  Agency Report (AR), Tab 1, RFP at 1, 11, 102.  The primary objective of the SETI contract is to provide engineering and technical support, services, and products globally to DoD, DISA, and DISA mission partners.  Id. at 12.  In this regard, the SETI contract "provides an overarching streamlined process for ordering a wide variety of critical IT engineering performance-based services while ensuring consistency and maximum opportunity for competition."  Id.

The scope of the SETI contract includes a broad array of research and development, as well as "critical technical disciplines core to engineering, delivering, and maintaining DoD and DISA IT products and capabilities."  Id.  The performance work statement (PWS) identified the following eight task areas:  (1) system engineering; (2) design analysis engineering; (3) systems architecture; (4) software systems design and development; (5) systems integration; (6) systems test and evaluation; (7) systems deployment and life-cycle engineering; and (8) special systems engineering requirements.  Id. at 12-13.

The RFP anticipated the award of multiple indefinite-delivery, indefinite-quantity (IDIQ) contracts in two pools:  an unrestricted pool and a restricted pool.  AR, Tab 5, RFP Amend. 4, at 30.  The protester submitted a proposal for consideration in the unrestricted pool, and the awards at issue here relate to that pool.  The RFP indicated that DISA intended to award approximately 10 contracts on an unrestricted basis and approximately 20 contracts on a restricted basis.  Id.  However, the agency expressly reserved the right to award more, fewer, or no contracts at all.  Id.

The RFP provided that the ordering period would consist of a 5-year base period and a 5-year option period.  AR, Tab 1, RFP at 41.  Orders issued under the contract would be performed on a fixed-price, cost reimbursement, and/or time-and-materials basis, or a combination thereof, and might also include incentives.  AR, Tab 5, RFP Amend. 4, at 30.  The maximum dollar value for all contracts, including the base and option periods, is $7.5 billion.  AR, Tab 1, RFP, at 8. 

Evaluation Criteria

The solicitation provided for award on a best-value tradeoff basis consisting of price and the following four non-price factors, listed in descending order of importance:  (1) innovation, (2) past performance, (3) problem statements, and (4) utilization of small business.  AR, Tab 5, RFP Amend. 4, at 50-51.  When combined, the non-price factors were significantly more important than price.  Id. at 51.  The RFP indicated that DISA intended to award without discussions, but also reserved the right to conduct discussions if in the agency's best interest.  Id. at 33, 50, 58.

As relevant here, section L of the RFP instructed offerors to address the following five criteria in their proposals in order to demonstrate their capabilities and proposed efforts to achieve and provide innovation: 

  • Section L.4.2.3.1, Corporate Philosophy/Culture on Innovation
  • Section L.4.2.3.2, Investment in Innovation
  • Section L.4.2.3.3, History of Engineering and Deploying Innovative Solutions
  • Section L.4.2.3.4, Outreach and Participation
  • Section L.4.2.3.5, Certifications, Accreditations, Awards, Achievements, and Patents. 

Id. at 40-42.  Under each criterion, the RFP included several bullets that offerors were to address.  Id.  For example, under section L.4.2.3.1, Corporate Philosophy/Culture on Innovation, three of the bullets that offerors were to address were as follows:

The Offeror shall:

  • Provide Company definition of Innovation.
  • Describe how the company's core competency of Innovation significantly aligns with DISA's mission needs and requirements.
  • Approach to Risk.  How does the company manage risk in an innovative environment?  How does the company determine the level of acceptable risk?  How should the company and the DoD share risk?  What is the cost of failure?  And who should pay for it?  How much "failure" should the Government accept?

Id. at 40.

In section M, the RFP provided that DISA would evaluate offerors' responses to the criteria described in section L.  Id. at 52.  The RFP further provided that the agency would consider strengths, weaknesses, significant weaknesses, uncertainties, and deficiencies in an offeror's proposal and would assign the proposal one of the following color/adjectival ratings reflecting an overall risk of "failure to be innovative:" blue/outstanding, purple/good, green/acceptable, yellow/marginal, or red/unacceptable.  Id. at 51-52.  The RFP stated that proposals would be evaluated more favorably for demonstrating a number of innovation-related achievements, including, inter alia, a demonstrated long term corporate philosophy regarding innovation, extensive publications on the topic of innovation, and a mature definition of innovation.  Id. at 52.

Under the past performance factor (factor 2), offerors were instructed to submit up to three references that the agency would evaluate for recency, relevancy, and quality of effort.  Id. at 42-43, 52-54.  The agency expressed a desire that the references include examples of deployed innovative technologies.  Id. at 42-43.  The agency also reserved the right to obtain past performance information from any sources available to the government.  Id. at 52.  In evaluating past performance, the RFP provided that the agency would assign one of the following performance confidence ratings:  substantial confidence, satisfactory confidence, neutral confidence, limited confidence, or no confidence.[2]  Id.

Under the problem statements factor (factor 3), offerors were asked to demonstrate their technical skills and ingenuity by solving two hypothetical problems.  The purpose of these notional problems was to provide DISA insight into an offeror's ability to meet the requirements and the offeror's problem solving methodologies.  Id. at 44.  Offerors in the unrestricted pool were to respond to problem statements 1 and 2.  AR, Tab 4, RFP Amend. 3, Attach. 7, Problem Statements.  Once again, the agency expressed a desire that offerors' responses highlight innovation and innovative processes.  The problem statements were weighted equally and were each assigned one of the following ratings:  blue/outstanding, purple/good, green/acceptable, yellow/marginal, or red/unacceptable.  AR, Tab 5, RFP Amend. 4, at 55.

Under the utilization of small business factor (factor 4), offerors were instructed to submit a small business participation plan outlining how offerors intended to maximize the utilization of small businesses.  Id. at 46.  The RFP stated that the agency would evaluate the offerors' submissions for adequacy of the proposed small business participation plan and proposed goals, and would assign one of the following ratings to the proposal:  blue/outstanding, purple/good, green/acceptable, yellow/marginal, or red/unacceptable.  Id. at 56.  Firms other than small businesses were also required to submit a subcontracting plan meeting the requirements of Federal Acquisition Regulation (FAR) clause 52.219-9 and Defense Federal Acquisition Regulation Supplement (DFARS) clause 252.219-7003.  Id. at 48.  The RFP advised offerors that the subcontracting plan should be consistent with the commitments offered in the small business participation plan.  The RFP further advised that the agency would review only the subcontracting plans submitted by those offerors deemed to be apparent awardees.  Id.

Finally, with respect to price, offerors were instructed to input direct labor rates and indirect rates, to include profit/fee, into a spreadsheet provided by DISA in which the agency listed the applicable labor categories with an estimate of the labor hours for each category.  Id. at 48; AR, Tab 4, RFP Amend. 3, Attach. 9, Pricing Spreadsheet.  The RFP indicated that the spreadsheet would calculate the fully burdened labor rate for each category and the total proposed price for the offeror.[3]  AR, Tab 5, RFP Amend. 4, at 48, 57.  The RFP informed offerors that the agency would review the fully burdened fixed-price labor rates for reasonableness and completeness using one of the techniques defined in section 15.404 of the FAR.[4]  Id. at 57.  The RFP provided that DISA would also evaluate proposals for unreasonably high prices and unbalanced pricing, and that the existence of such prices "may be grounds for eliminating a proposal from competition."  Id. at 49, 57.  Only the total proposed price would be used for tradeoffs between price and non-price factors.  Id. at 48.

Evaluation of Proposals

The agency received 35 timely proposals in response to the solicitation.  AR, Tab 65, SSDD, at 1; Memorandum of Law (MOL)/Contracting Officer's Statement (COS) at 22.  As set forth in the RFP, proposals were evaluated using a multi-step process.  AR, Tab 5, RFP Amend. 4, at 57-58.  First, proposals were evaluated by five separate and distinct evaluation boards--one for each factor--and consensus reports were prepared.  AR, Tab 26, Technical Evaluation Board (TEB) Report (Innovation); Tab 27, TEB Report (Past Performance); Tab 28, TEB Report (Problem Statements); Tab 29, TEB Report (Utilization of Small Business); Tab 60, Price/Cost Evaluation Report. 

Next, the contracting officer executed a memorandum for record (MFR) determining that the proposed prices of all offerors were fair and reasonable.  AR, Tab 61, Pricing MFR, at 1.  In reaching this conclusion, the contracting officer relied upon the Price/Cost Evaluation Report, which found that price reasonableness "is normally established by adequate competition" and, therefore, because 35 offerors submitted proposals, "it is implicit that price reasonableness has been determined at the macro level."[5]  See e.g., AR, Tab 60, Price/Cost Evaluation Report, at 4.  The pricing evaluation board did not compare offerors' prices; the contracting officer likewise noted the 35 offers and concluded that "[t]he presumption is that all proposed prices are fair and reasonable if there is adequate competition."  AR, Tab 61, Pricing MFR, at 1 (citing FAR § 15.404-1(b)(2)(i)).

Around the same time, the Source Selection Evaluation Board (SSEB) reviewed the evaluation boards' consensus reports, verified the evaluation results, and prepared an SSEB report.  AR, Tab 62, SSEB Report.  The SSEB made some changes to the assessments of the various TEBs, including removing one strength assigned to OGSystems' proposal under the innovation factor and lowering the proposal's rating under the past performance factor from substantial to satisfactory confidence.  Id. at 146-47.  The SSEB identified one remaining strength and two weaknesses in OGSystems' proposal under the innovation factor and assigned the proposal a rating of green/acceptable for the factor.  Id. at 146.  Under the problem statements factor, the agency identified two strengths in problem 1 and assigned it a rating of purple/good.  Id. at 148.  The agency identified one strength in problem 2 and assigned it a rating of green/acceptable.  Id.  Under the utilization of small business factor, the agency identified two strengths and assigned the proposal a rating of purple/good.  Id. at 148-49.  OGSystems' total proposed price was the [DELETED].  AR, Tab 60, Price Report, at 3.

The Source Selection Advisory Council (SSAC) reviewed the evaluation record and ratings, and prepared a report for the Source Selection Authority (SSA).  AR, Tab 63, SSAC Report.  Additionally, the SSAC performed a comparative analysis of the proposals, recommending award to the 14 offerors the SSAC deemed to be the "highest rated proposals" under the non-price factors.  The SSAC concluded that "the technical merit of those proposals justifies paying a price premium over lower-rated, lower-priced proposals."  Id. at 13.  Finally, the SSA reviewed and analyzed the SSEB and SSAC reports prior to executing a Source Selection Decision Document (SSDD).  AR, Tab 65, SSDD.

Below is a summary of the agency's ratings of the proposals of the 14 awardees and OGSystems:

Offeror           Factor 1      Factor 2          Factor 3      Factor 3      Factor 4
                  Innovation    Past Performance  Problem 1     Problem 2     Small Business  Price
IBM               Outstanding   Substantial       Acceptable    Good          Good            $234,935,046
Accenture         Outstanding   Neutral           Good          Good          Good            $179,623,424
Northrop Grumman  Good          Substantial       Good          Good          Good            $269,623,861
Vencore           Good          Substantial       Acceptable    Acceptable    Outstanding     $156,662,569
BAH               Good          Substantial       Acceptable    Acceptable    Good            $134,266,455
Leidos            Good          Substantial       Acceptable    Acceptable    Good            $144,662,993
Harris            Good          Satisfactory      Outstanding   Outstanding   Good            $184,752,341
BAE               Acceptable    Substantial       Outstanding   Good          Good            $156,962,323
NES               Acceptable    Substantial       Good          Good          Good            $137,217,707
LinQuest          Acceptable    Substantial       Good          Good          Good            $175,049,125
Deloitte          Acceptable    Substantial       Acceptable    Good          Outstanding     $126,180,137
Parsons           Acceptable    Substantial       Good          Acceptable    Outstanding     $178,181,282
KeyW              Acceptable    Substantial       Good          Acceptable    Good            $123,192,620
AASKI             Acceptable    Substantial       Good          Acceptable    Good            $181,683,274
OGSystems         Acceptable    Satisfactory      Good          Acceptable    Good            $182,541,402

AR, Tab 65, SSDD, at 3, 16.

Best-Value Tradeoff Analysis

In conducting the source selection, the SSA adopted a multi-step methodology to narrow the pool of proposals under consideration.  First, noting the importance of the innovation factor, the SSA looked to the ratings under this factor. The SSA identified seven proposals that were rated either outstanding or good under the innovation factor:  IBM, Accenture, Northrop Grumman, Vencore, BAH, Leidos, and Harris.  Id. at 5.

For each of the seven offerors included in this first pool, the SSA noted where the offeror's price fell among all offerors.[6]  Id. at 5-9.  Next, the SSA listed, at a high level, the strengths (and any weaknesses) assigned to the proposal under the non-price factors.  After summarizing the underlying evaluation, the SSA concluded with regard to each of the seven proposals that the proposal's strengths under the non-price factors merited "its selection over lower-rated, lower-priced proposals."  See e.g., id. at 5.  Despite the SSA's stated conclusion, these awards were made without any consideration of whether associated price premiums were justified by increased technical merits, as set forth in greater detail later in the decision.

In order to identify a second pool of proposals for consideration, the SSA next looked to those proposals that received a rating of acceptable on the innovation factor.  Id. at 9.  Noting that there were 26 such proposals, the SSA further narrowed the pool to those proposals with a rating of acceptable on the innovation factor in combination with a rating of substantial confidence on the past performance factor.  Id.  Noting that there were still a large number of proposals (18), the SSA further narrowed the pool to those proposals with a rating of acceptable on the innovation factor, a rating of substantial confidence on the past performance factor, and a rating of good or better on both problem statements.  Id.  This resulted in a pool of three proposals:  BAE, NES, and LinQuest.  Id. at 9-11. 

For each of the three offerors included in this second pool, the SSA performed essentially the same analysis as the agency used for the first pool.  That is, the SSA noted where the offeror's price fell among all offerors; listed, at a high level, the strengths assigned to the proposal under the non-price factors; and concluded that strengths under the non-price factors merited selection of the proposal over "lower-rated, lower-priced proposals."  See e.g., id. at 10.

At this point, the SSA considered concluding the tradeoff analysis in light of the fact that the agency had selected 10 proposals for award, which was the approximate number of anticipated awards set forth in the RFP.  Id. at 11; AR, Tab 5, RFP Amend. 4, at 30.  The SSA noted, however, that four additional proposals--those of Deloitte, Parsons, KeyW, and AASKI--were assigned essentially the same ratings as the proposals of BAE, NES, and LinQuest with the exception that the four additional proposals received one good rating and one acceptable rating for the problem statements.  AR, Tab 65, SSDD at 11.  The SSA decided to include these four proposals because more offerors could "potentially driv[e] prices down in future task order competitions."  Id.  The SSA further observed that "[t]wo of the next four most highly rated proposals also have comparatively low total proposed prices (ranked [DELETED] and [DELETED] by price), which could benefit future price competition."[7]  Id.  Like the proposals in the first two pools, the SSA recommended each of these four proposals for award on the basis that its strengths under the non-price factors merited its selection over "lower-rated, lower-priced proposals."  See id. at 11-13.

With respect to the remaining technically acceptable proposals that were not included in the above pools, the SSA considered the two lowest-priced proposals.  Id. at 3, 13-14.  In both instances, the SSA confirmed the ratings on the non-price factors and concluded that "considering the selection methodology where all non-price factors are significantly more important than price, the SSAC does not believe [the proposal] merits an award as compared to the proposals with higher prices and higher technical merit, and I agree."  See e.g., id. at 14. 

Finally, with respect to the 19 remaining technically acceptable proposals, including OGSystems' proposal, the SSA noted that the SSAC "took a final look at all of the ratings for all of the Offerors to see if there were any other proposals with significant technical merit."[8]  Id. at 14.  The SSA further noted that the SSAC "did not identify sufficient technical merit in any other proposals to justify their recommendation for award, and I agree."[9]  Id. at 13. 

In summarizing the SSAC's recommendation, with which the SSA agreed, the SSA stated that the SSAC "recommended award to the 14 highest rated proposals in the non-price factors, as identified above, because the technical merit of those proposals justifies paying a price premium over lower-rated, lower-priced proposals."  Id. at 16.  With respect to making any "additional awards," the SSA concluded that the remaining 21 proposals "do not have sufficient technical merit."  Id.

Notice of Award, Debriefing, and Protest

On June 14, the agency provided notice to all unsuccessful offerors, including OGSystems.  AR, Tab 69, Unsuccessful Offeror Letter.  OGSystems timely requested a debriefing, which the agency provided on June 15.  AR, Tab 101, Debriefing Letter.  In its letter, the agency advised OGSystems that it could request an enhanced debriefing pursuant to DoD Class Deviation 2018-0011, id. at 3, which the protester did by submitting additional questions.  AR, Tab 105, Enhanced Debriefing Questions.  The agency responded to the questions on June 25.  AR, Tab 109, Agency Response to Enhanced Debriefing Questions.  This protest followed on July 2. 

DISCUSSION

The protester challenges the agency's evaluation, asserting that the SSEB's changes to the TEBs' evaluation findings were unreasonable.  The protester also argues that the evaluation of proposals under the innovation factor was conclusory.  OGSystems further asserts that the agency unreasonably and unequally failed to conduct discussions with all offerors.  Lastly, the protester argues that the agency's best-value tradeoff was unreasonable.  As explained below, we sustain the protest in part and deny it in part.[10]

Change in Evaluation Ratings for the Innovation and Past Performance Factors

The protester asserts that the SSEB failed to provide an adequate rationale for removing from the protester's proposal a strength identified by the TEB under the innovation factor and lowering the protester's past performance confidence rating from substantial to satisfactory confidence.  Protest at 13-18; Comments at 20-23.

While source selection officials and higher-level evaluators are free to disagree with the evaluation findings of lower-level evaluators, they are nonetheless bound by the fundamental requirements that their independent judgments be reasonable, consistent with the stated evaluation factors, and adequately documented.  Prism Maritime, LLC, B-409267.2, B-409267.3, Apr. 7, 2014, 2014 CPD ¶ 124 at 5. 

The strength removed by the SSEB was as follows:

L.4.2.3.1 Corporate Philosophy/Culture on Innovation

. . .  The Offeror displayed an exceptional alignment of competencies that can be directly applied to the SETI program.  This is a benefit to the Government as the Offeror has demonstrated that the company has the experience and resources that are needed to successfully address the PWS task areas.  Their demonstration of experience in [DELETED] of the 21 DISA mission areas increases the probability of success on future SETI task performance.

AR, Tab 62, SSEB Report at 145.  In removing the strength, the SSEB stated that it did not agree that this was an aspect of an offeror's proposal that had merit or exceeded specified performance or capability requirements in a way that will be advantageous to the Government during contract performance.  Id. at 146.  In its agency report, DISA noted that the SSEB removed a similar strength from 11 other proposals.  MOL/COS at 32.

In contrast, the TEB described the protester's "alignment of competencies" as exceptional and of benefit to the government.  The TEB further found that the protester's demonstration of experience in several mission areas increased the probability of success on future SETI task performance. AR, Tab 62, SSEB Report, at 145.

In disagreeing with the TEB, the SSEB did not explain how it reached its conclusion to remove the strength.  Rather, it simply stated the definition of a strength and concluded that this aspect of the proposal did not meet that definition.  See id. at 146. 

Because the SSEB failed to document any rationale for its removal of this strength, we cannot conclude that the agency's evaluation in this regard was reasonable.  The Arcanum Grp., Inc., B-413682.2, B-413682.3, Mar. 29, 2017, 2017 CPD ¶ 270 at 8 (removal of weakness unreasonable where the SSA concluded that the findings did not meet the definition of weakness but did not explain how she reached her conclusions); Immersion Consulting, LLC, B-415155, B-415155.2, Dec. 4, 2017, 2017 CPD ¶ 373 at 6 (removal of strength unreasonable where SSA's rationale was not meaningfully explained in the record and where underlying evaluation was specific, identified the impact of the approach, and described how the government would benefit from the approach); IBM U.S. Fed., a division of IBM Corp.; Presidio Networked Solutions, Inc., B-409806 et al., Aug. 15, 2014, 2014 CPD ¶ 241 at 14-16 (SSA's decision to overrule technical board unreasonable where contemporaneous documentation evidences the SSA's conclusion but fails to explain why he disagreed with the substance of the board's findings).  We therefore sustain OGSystems' challenge to this aspect of the evaluation.

The protester also challenges the SSEB's reduction in OGSystems' past performance rating from substantial to satisfactory confidence.  The SSEB explained that, while performance was exceptional on all references, with only one relevant and two somewhat relevant contract ratings, there was a reasonable expectation--not a high expectation--that OGSystems would successfully perform the required effort.  AR, Tab 62, SSEB Report at 147.  The agency argues that the SSEB reasonably lowered the protester's past performance confidence rating because every offeror with a substantial confidence rating had submitted at least one reference demonstrating successful performance of a very relevant contract--a standard that the protester's past performance did not meet.  MOL at 35.  While the protester argues that the agency placed too much emphasis on the relevance of the contracts and gave short shrift to the quality of past performance, Comments at 23, it has not demonstrated that it was unreasonable for the agency to require successful performance on a very relevant contract for a rating of substantial confidence.  Thus, we find this argument to be without merit.  Landoll Corp., B-291381 et al., Dec. 23, 2002, 2003 CPD ¶ 40 at 8 (noting that a protester's mere disagreement with an agency's judgment, without more, does not render an evaluation unreasonable).

Evaluation of Innovation Factor

The protester asserts that the agency's evaluation of the innovation factor, which employed a spreadsheet to document evaluators' findings, "appears conclusory and lacks adequate documentation."[11]  Comments at 17. 

In reviewing protests of an agency's evaluation and source selection decision, our Office will not reevaluate proposals; rather, we review the record to determine whether the evaluation and source selection decision are reasonable and consistent with the solicitation's evaluation criteria, and applicable procurement laws and regulations.  Velos, Inc., B-400500.8, B-400500.9, Dec. 14, 2009, 2010 CPD ¶ 13 at 11.

The innovation factor evaluation panel created a spreadsheet to track responses to the solicitation's numerous requirements.  See AR, Tab 26, Innovations Evaluation Spreadsheet.  The spreadsheet contained six columns:  the bullet number under which the requests were grouped; the specific request for information; two columns noting whether the information was complete and whether it was responsive; and two columns noting whether the proposal was evaluated as having a strength or weakness in that area.  See id.  The spreadsheet contained narrative comments only if a proposal was evaluated as having a strength or weakness. 

Here, the agency was evaluating whether 35 proposals contained complete information responding to over 40 criteria.  We see nothing unreasonable in the agency's use of a spreadsheet as a means to record whether requested information was included in proposals, and whether the included information met the solicitation criteria.  The spreadsheet also included a narrative assessment of the strengths and weaknesses of each response under the factor, and these narratives demonstrate the qualitative nature of the agency's evaluation.  In sum, the protester has not demonstrated that the evaluators' findings were unreasonable.  We thus find this allegation to be without merit.

Discussions

OGSystems asserts that the agency has failed to provide a reasonable rationale for failing to hold discussions.  The protester further argues that the agency did, in fact, conduct discussions with some, but not all, offerors, which was unequal.  As discussed below, we find no merit to either argument.

The protester asserts that the agency failed to provide a reasonable rationale for its decision not to conduct discussions.  OGSystems argues that, for a defense procurement over $100 million, the DFARS states that "contracting officers should conduct discussions with offerors in the competitive range."  Comments at 19, quoting DFARS § 215.306.  The protester asserts that had the agency conducted discussions, it would have submitted a revised proposal that addressed the weaknesses identified by the agency.  Id. at 20.

Where a protester challenges an agency decision not to conduct discussions in a DoD procurement valued at over $100 million, we will review the record to determine whether, in the particular circumstances of the procurement in question, there was a reasonable basis for the agency's decision not to conduct discussions.  Science Applications Int'l Corp., B-413501, B-413501.2, Nov. 9, 2016, 2016 CPD ¶ 328 at 9-10.  In this regard, we have read DFARS § 215.306 to mean that discussions are the expected course of action in DoD procurements valued over $100 million, but that agencies retain the discretion not to conduct discussions if the particular circumstances of the procurement make award without discussions appropriate.  Id.  Here, the agency explains that it elected to award without discussions because:  (1) offerors were on notice that the agency intended to make awards without discussions; (2) the initial proposals demonstrated clear technical advantages and disadvantages to differentiate among the offerors; and (3) the initial proposals demonstrated significant technical merit at fair and reasonable prices.  AR, Tab 64, Discussions MFR, at 1-3.  Given that the solicitation here advised offerors that the agency intended to make award without discussions, that the DFARS provision discussed above does not require discussions, and that the agency's rationale for not holding discussions is otherwise reasonable, we see no basis to sustain this challenge.

The protester also asserts that the agency engaged in unequal discussions by permitting "some of the presumptive awardees to change their subcontracting goals."  Comments, Sept. 10, 2018, at 12.  Because the agency required the subcontracting plan to be consistent with offerors' proposals under the utilization of small businesses factor, the protester asserts that the agency must have permitted changes to proposals, which constituted discussions. 

The RFP advised offerors of the following:

Subcontracting plans will only be reviewed for acceptability for those "other than small business" Offerors deemed to be apparent awardees.  If a subcontracting plan cannot be determined to be acceptable, the entire proposal will be rendered unawardable.  Compliance and acceptability of Subcontracting Plans are a matter of responsibility and are separate from the evaluation of the Utilization of Small Business Factor.

RFP at 51.  Offerors were on notice that the agency intended to review the subcontracting plans for the apparent awardees, that attempts would be made to make those plans acceptable, and that the acceptability of the plans was a matter of responsibility separate from the evaluation of proposals.  The allegation that changes to unacceptable subcontracting plans necessarily constituted unequal discussions is an untimely challenge to the terms of the solicitation, where the solicitation stated that the acceptability of subcontracting plans was a responsibility matter separate from the evaluation of proposals.[12]  4 C.F.R. § 21.2(a)(1) (challenges to the terms of a solicitation that are apparent prior to the time set for receipt of proposals must be filed prior to that deadline).

Best-value Tradeoff

The protester challenges the agency's best-value tradeoff, asserting that the SSA failed to give appropriate consideration to price in relation to the non-price factors.  The protester also contends that the agency improperly failed to look beneath the adjectival ratings and compare the underlying strengths and weaknesses of the competing proposals.  As explained below, we sustain this ground of protest.

In a best-value procurement, such as the one here, it is the function of the source selection authority to perform a tradeoff between price and non-price factors to determine whether one proposal's superiority under the non-price factors is worth a higher price.  Arcadis U.S., Inc., B-412828, June 16, 2016, 2016 CPD ¶ 198 at 10.  Even where price is of less importance than the non-price factors, an agency must meaningfully consider cost or price to the government in making its selection decision.  Id.  Specifically, before an agency may select a higher-priced proposal that has been rated technically superior to a lower-priced but acceptable proposal, the award decision must be adequately documented and supported by a rational explanation of why the higher-rated proposal is, in fact, superior, and why its technical superiority warrants paying the price premium.  Id.  In a tradeoff source selection process, an agency may not "so minimize the impact of price to make it merely a 'nominal evaluation factor' because the essence of the tradeoff process is an evaluation of price in relation to the perceived benefits of an offeror's proposal."  Sevatec, Inc., et al., B-413559.3 et al., Jan. 11, 2017, 2017 CPD ¶ 3 at 8, quoting FAR § 15.101-1(c).

Here, instead of documenting a reasonable basis for the tradeoffs made, the record indicates that the agency mechanically made award to the 14 offerors whose proposals exhibited, in descending order, the best combination of adjectival ratings under the non-price factors.  In this regard, DISA grouped similarly rated proposals into pools and made awards to all offerors in those pools--regardless of price--until DISA reached what it referred to as a "clear break" in the proposals.  AR, Tab 65, SSDD, at 16.  The record reflects, however, that this "clear break" in the proposals was based exclusively on the adjectival ratings assigned to proposals under the non-price factors. 

For instance, in declining to make an award to Offeror B--the second lowest-priced proposal--the SSA observed that if the proposal had achieved a rating of substantial confidence under the past performance factor, rather than a rating of satisfactory confidence, the proposal would have been "in line for an award."  Id. at 14.  In this respect, the SSA notes, "Offerors with similar ratings in Factors 1 and 3 were only recommended for award if they achieved the highest rating in Factor 2," which Offeror B did not.  Id. at 14.  The SSA reviewed Offeror B's past performance and determined that the satisfactory confidence rating was reasonable.  The SSA's analysis, however, did not examine whether a proposal with a significant price advantage might be among those proposals offering the best value to the government notwithstanding a slightly lower rating in past performance.[13]  Accordingly, without any weighing of its low price, Offeror B was not recommended for award because its proposal was not assigned the necessary combination of adjectival ratings.

Likewise, in considering the proposal of Offeror C, the SSA concluded that, but for its rating of marginal under the innovation factor, it could have been "in line for an award" based upon the combination of ratings its proposal received under other factors.  Id. at 15.  Finally, and perhaps most tellingly, in considering the proposals of those offerors whose combination of ratings did not place them in one of the pools, the SSA stated that the SSAC "took a final look at all the ratings for all of the Offerors to see if there were any other proposals with significant technical merit."  Id. at 14 (emphasis added).  Based on their ratings, the SSA concluded that they did not have sufficient technical merit to warrant an award.  Id. at 13.

We have long recognized that an agency's source selection decision cannot be based on a mechanical comparison of the offerors' technical scores or ratings, but must rest upon a qualitative assessment of the underlying technical differences among competing offerors.  See The MIL Corp., B-294836, Dec. 30, 2004, 2005 CPD ¶ 29 at 8 (sustaining protest where agency mechanically made award to all proposals that received "blue" ratings on the two non-price factors, and declined to make award to any proposal that did not receive a "blue" rating for the non-price factors).  See also One Largo Metro LLC, et al., B-404896 et al., June 20, 2011, 2011 CPD ¶ 128 at 14-15.  Here, in adopting such a mechanical approach, DISA failed to make a qualitative assessment of the technical differences among the competing proposals in order to determine whether the perceived technical superiority of those proposals receiving the best combination of ratings justified paying the price premium associated with those proposals. 

Further, price was not considered in any meaningful way in the source selection decision.  In this respect, the record shows that price had no material impact on an offeror's ability to be selected for award.  Once the higher-rated proposals were identified, the agency did not perform a price/technical tradeoff; rather, award was based strictly on technical merit.[14]  In a tradeoff source selection process, however, an agency may not so minimize the impact of price to make it merely a nominal evaluation factor because the essence of the tradeoff process is an evaluation of price in relation to the perceived benefits of an offeror's proposal.  Sevatec, Inc., supra, at 8 (citing FAR § 15.101-1(c)).

Although we acknowledge that the SSA repeatedly documented a nearly verbatim, one-sentence conclusion that, due to strengths on the non-price factors, the 14 awardees merited selection over lower-rated, lower-priced proposals, see, e.g., AR, Tab 65, SSDD, at 9, we find such consideration of price to be nominal.  See Cyberdata Techs., Inc., B-406692, Aug. 8, 2012, 2012 CPD ¶ 230 at 5 n.1 (sustaining protest where agency emphasized the importance of technical superiority and concluded that selection of the lower-priced proposals "would be at the reduction of technical quality and not worth a trade-off to that extent.").  Indeed, anything less would be to ignore price completely.

In sum, we find that the agency's best-value tradeoff methodology was flawed.  As a result, we also sustain the protest on this basis.

Prejudice

The protester's proposal was lower-rated and lower-priced than proposals from IBM, Northrop Grumman, and Harris.  In addition, OGSystems' proposed price was considerably less than the price proposed by IBM and Northrop Grumman.  Based on the record here, we cannot conclude that the SSA would have reached the same source selection decision had the agency accorded price more than nominal weight and properly considered the proposals' underlying strengths and weaknesses in the best-value tradeoff.  As a result, we conclude that there was a reasonable possibility of prejudice sufficient to sustain the protest.  Arcadis U.S., Inc., supra, at 11.

RECOMMENDATION

We sustain OGSystems' protest challenging the agency's evaluation of its proposal under the innovation factor and best-value tradeoff decision.  We recommend that the agency reevaluate OGSystems' proposal under the innovation factor and that the agency make a new best-value tradeoff decision.  In addition, we recommend that DISA reimburse OGSystems the costs associated with filing and pursuing its protest, including reasonable attorneys' fees.  4 C.F.R. § 21.8(d).  OGSystems' certified claim for costs, detailing the time expended and costs incurred, must be submitted to the agency within 60 days after receipt of this decision. 4 C.F.R. § 21.8(f).

The protest is sustained in part and denied in part.[15]

Thomas H. Armstrong
General Counsel



[1] DISA awarded contracts to the following 14 firms:  AASKI Technology, Inc. (AASKI); Accenture Federal Services (Accenture); BAE Systems Technology Solutions & Services (BAE); Booz Allen Hamilton, Inc. (BAH); Deloitte Consulting LLP (Deloitte); Harris Corporation (Harris); International Business Machines Corporation (IBM); KeyW Corporation (KeyW); Leidos Innovations Corporation (Leidos); LinQuest Corporation (LinQuest); NES Associates LLC (NES); Northrop Grumman Systems Corporation (Northrop Grumman); Parsons Government Services, Inc. (Parsons); and Vencore, Inc.

[2] The solicitation defined ratings of substantial and satisfactory confidence as follows:

Substantial Confidence:  Based on the Offeror's recent/relevant performance record, the Government has a high expectation that the Offeror will successfully perform the required effort.

Satisfactory Confidence:  Based on the Offeror's recent/relevant performance record, the Government has a reasonable expectation that the Offeror will successfully perform the required effort.

Id. at 45.

[3] The labor rates would be the "capped ceiling" for that labor category for any fixed-price or time-and-materials task orders.  AR, Tab 5, RFP Amend. 4, at 48.  Offerors would be permitted to propose less than the capped rates.  Id.

[4] The RFP provided that no cost or price realism analysis would be conducted at the IDIQ contract level, but that costs for cost-reimbursement work would be evaluated at the task order level and would be subject to cost realism analysis at that time.  AR, Tab 5, RFP Amend. 4, at 48.  The RFP stated that "[f]ixed price direct labor rates may be used in determining the reasonableness and realism of direct labor rates for future cost-reimbursement work at the Task Order level."  Id. at 49.

[5] The agency repeated this same conclusion verbatim in its review of each offeror's proposal.

[6] For example, with respect to IBM's price, the SSA noted that "[t]his Offeror's total proposed price was ranked [DELETED] of 35."  Id. at 5. 

[7] The SSA's comment regarding price does not explain why the other two proposals were included.  Their inclusion appears to be based solely on the adjectival ratings assigned to the proposals.

[8]  The SSAC observed that one proposal appeared to be an "anomaly" because, but for its rating of marginal under the innovation factor, it "could have been in line for award" based on the combination of ratings its proposal received under the other non-price factors.  AR, Tab 63, SSAC Report, at 12; Tab 65, SSDD, at 14-15.  Other than this one proposal, there is no discussion of any other proposal.

[9] The SSAC report does not contain any additional analysis, concluding likewise that it "did not identify sufficient technical merit in any other proposals to justify their recommendation for award."  AR, Tab 63, SSAC Report, at 10.

[10] While we do not address each protest ground, we have considered them all; except for the two allegations on which we sustain this protest, we find the protest grounds to be without merit.

[11] The protester also argued that the agency disparately evaluated proposals under the innovation factor by failing to assess in other proposals a weakness that the agency assigned to the protester's proposal.  Comments at 16-19.  OGSystems withdrew that allegation.  Supp. Comments at 10.

[12] The protester also asserts that, at the time the agency sought to make the subcontracting plans of the apparent awardees acceptable, the agency had not yet identified those firms.  Comments, Sept. 10, 2018, at 11-12.  The protester bases its contention on the dates on which final procurement documents were electronically signed.  The agency has explained that the dates reflect when final review of the documents was complete; the agency asserts, however, that it had selected the apparent awardees prior to a review of their subcontracting plans.  Any suggestion that the agency is falsely characterizing the discrepancy between when procurement milestones were achieved and when the final documents were signed is, essentially, an allegation of bad faith, for which there is no support in the record.  Trailboss Enters., Inc., B-415812.2 et al., May 7, 2018, 2018 CPD ¶ 171 at 12 (noting that to establish bad faith, a protester must present convincing evidence that agency officials had a specific and malicious intent to harm the firm and noting that the burden of establishing bad faith is a heavy one).

[13] It is worth noting that, like Offeror B, Harris received a rating of satisfactory for its past performance.  Unlike Offeror B, however, Harris' rating of satisfactory under this factor did not preclude it from receiving an award despite being priced $75 million higher than Offeror B.  This is because Harris' proposal was included in the first pool of proposals considered.  In this pool, the record reflects that awards were made strictly based upon the ratings under the innovation factor.  To the extent an offeror did not possess sufficient technical merit under another factor, the SSA simply omitted from the one-sentence award recommendation any reliance on that factor.  See, e.g., AR, Tab 65, SSDD, at 6, 9 (omitting reliance on factor 2, Accenture & Harris); id. at 7, 8, 9 (omitting reliance on factor 3, Vencore, BAH, Leidos).

[14] In this regard, there is no indication that price played a role in determining the "clear break" in the proposals. 

[15] The protester also asserts that had the agency properly considered the effect of Harris Corporation's sale of information technology services to Veritas Capital on Harris' evaluation and its manner of performance, it would not have selected Harris for award.  Comments at 4-10.  OGSystems argues that it was prejudiced because the agency might have selected its proposal instead.  The agency asserts that had it found Harris' proposal ineligible, it would simply have reduced the number of awards by one.  Even if that were not the case, there is no reasonable basis on which to conclude that the agency would have replaced Harris as an awardee with OGSystems.  We therefore see no prejudice to the protester from the agency's failure to consider the effect of this corporate transaction, even assuming the truth of its assertion.