
CGI Federal Inc.

B-410330.2 Dec 10, 2014

Highlights

CGI Federal Inc., of Fairfax, Virginia, protests the award of contracts to five other offerors under request for proposals (RFP) No. N00039-13-R-0013, issued by the Department of the Navy, Space and Naval Warfare Research Center (SPAWAR) for the production of build-to-print network systems to be installed on Navy ships in support of the Consolidated Afloat Networks & Enterprise Services (CANES) program. The protester argues that the agency failed to amend its price evaluation scheme notwithstanding the fact that it knew, prior to award, that it did not reasonably reflect the agency's changed ordering needs. The protester also challenges the agency's price and past performance evaluations, and argues that the agency engaged in unequal discussions.

We sustain the protest in part and deny it in part.


DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: CGI Federal Inc.

File: B-410330.2

Date: December 10, 2014

Neil H. O'Donnell, Esq., Jeffery M. Chiow, Esq., Lauren B. Kramer, Esq., and Lucas T. Hanback, Esq., Rogers Joseph O'Donnell, PC, for the protester.
Michael R. Charness, Esq., David R. Johnson, Esq., Jamie F. Tabb, Esq., Erin N. Rankin, Esq., and Elizabeth Krabill McIntyre, Esq., Vinson & Elkins LLP, and Catherine K. Ronis, Esq., BAE Systems, Inc., for BAE Systems Technology Solutions and Services Inc.; David A. Churchill, Esq., Kevin P. Mullen, Esq., and James A. Tucker, Esq., Jenner & Block LLP, for General Dynamics C4 Systems; Jonathan J. Frankel, Esq., John P. Janecek, Esq., Brett J. Sander, Esq., and Nichole A. Best, Esq., Frankel PLLC, Anne B. Perry, Esq., and Townsend L. Bourne, Esq., Sheppard Mullin Richter & Hampton LLP, and William J. Colwell, Esq., and Linda T. Maramba, Esq., Northrop Grumman Corp., for Northrop Grumman Systems Corp.; and Kelly E. Buroker, Esq., Kevin P. Connelly, Esq., Kirsten W. Konar, Esq., Jacob W. Scott, Esq., Caroline A. Keller, Esq., and Kyle E. Gilbertson, Esq., Vedder Price, P.C., for Serco, Inc., the intervenors.
Marian Ciborski, Esq., Susanna Torke, Esq., and Laura Biddle, Esq., Department of the Navy, Space and Naval Warfare Systems Command, for the agency.
Jennifer D. Westfall-McGrail, Esq., and Edward Goldstein, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1. Protest against solicitation’s price evaluation methodology is sustained where agency failed to amend price evaluation scheme notwithstanding the fact that it knew, prior to award, that the scheme did not reasonably reflect the agency’s changed ordering strategy.

2. Protest arguing that agency failed to perform required price realism analysis is denied where agency compared offerors’ pricing to historical pricing for purposes of determining whether offerors’ prices were too low.

3. Protest that awardee’s prices are unbalanced is denied where record fails to show that any of awardee’s line item prices are overstated.

4. Agency did not conduct unequal discussions by notifying other offerors, but not the protester, that their prices were noncompetitive, where protester’s high price was not recognized as an obstacle to award during the evaluation process.

5. Protest of past performance ratings is denied where record shows reasonable basis for ratings assigned.

DECISION

CGI Federal Inc., of Fairfax, Virginia, protests the award of contracts to five other offerors under request for proposals (RFP) No. N00039-13-R-0013, issued by the Department of the Navy, Space and Naval Warfare Research Center (SPAWAR) for the production of build-to-print network systems to be installed on Navy ships in support of the Consolidated Afloat Networks & Enterprise Services (CANES) program.[1] The protester argues that the agency failed to amend its price evaluation scheme notwithstanding the fact that it knew, prior to award, that it did not reasonably reflect the agency’s changed ordering needs. The protester also challenges the agency’s price and past performance evaluations, and argues that the agency engaged in unequal discussions.

We sustain the protest in part and deny it in part.

BACKGROUND

The agency explains that CANES is the Navy’s designated program to modernize information technology at sea; a CANES system consolidates legacy standalone networks into a single system. The RFP here, which is a follow-on to previously awarded contracts for system design and development and the production of limited deployment units, contemplates the award of contracts pursuant to which production units for unit, force, and submarine platforms will be ordered on a build-to-print basis.[2] The contractors are to source and assemble components, load and configure software, perform factory acceptance testing, and deliver CANES units and sub-assemblies as directed by individual delivery orders, for which the firms will compete.[3]

The RFP, issued on May 30, 2013, provided for the award of up to three indefinite-delivery/indefinite-quantity (ID/IQ) contracts with firm-fixed-price (FFP) and cost-plus-fixed-fee (CPFF) contract line items (CLINs). RFP at 56. CLIN 0001, for CANES production units and sub-assemblies, has a ceiling amount of over $2.4 billion.[4] The solicitation included a chart identifying how many units in each of 19 platform classes, to include DDGs (Navy destroyers), the agency expected to acquire over the life of the contract. The chart indicated that in fiscal year (FY) 2014, the agency anticipated the acquisition of 46 total CANES units; in FY 2015, 48 CANES units; in FY 2016, 62 CANES units; and in FY 2017, 61 CANES units.[5] Id. at 67.

The solicitation provided for award to the offerors whose proposals represented the best value to the government, with technical factors significantly more important than price in the determination. Offerors were cautioned that “[p]roposals which are unrealistic in terms of technical or schedule commitments, or unrealistically high or low in terms of price, may be deemed to be reflective of an inherent lack of technical competence, or indicative of a failure to comprehend the complexity and risks of the proposed work and may be grounds for rejection of the proposal.” Id. at 85.

The RFP included five non-price evaluation factors. Under factor 1, which was most important, offerors were to describe their technical approaches to producing one CANES DDG unit.[6] Under factor 2, which was equal in weight to factor 3, offerors were to describe their CANES production processes and capacity. Under factor 3, offerors’ past performance was to be evaluated. Factor 4 (Small Business Utilization and Commitment) was less important than factors 1-3. Factor 5 (Brand Name or Equal) was to be evaluated on a pass/fail basis, with a rating of fail rendering the proposal ineligible for award.[7] Proposals were to be rated under factors 1, 2, and 4 as outstanding, good, acceptable, marginal, or unacceptable.[8] Under the past performance factor, proposals were to be rated as substantial, satisfactory, limited, no, or unknown confidence.

For purposes of the price evaluation, offerors were to provide a price for a single CANES DDG unit to be ordered at the time of contract award. Supra at fn 6. In addition, the RFP instructed offerors to provide not-to-exceed (NTE) FFP unit price ceilings for CLIN 0001 CANES DDG production units for quantities 1/each through 15/each for calendar year (CY) 14, CY15, CY16, and CY17.[9] The RFP advised that the unit price ceilings proposed for CYs 15-17 were for evaluation purposes only and did not apply to the resulting awarded contract. Id. at 70. Offerors were further advised that the government might apply the NTE unit prices proposed for CY14 “in any quantity on a single Delivery Order, or multiple Delivery Orders, up to maximum quantity of fifteen (15) per Delivery Order.”[10] Id. at 6. An offeror’s total evaluated price was to be calculated by adding together its price for the single delivery order 0001 unit; its price for 15 CANES DDG units for CY14,[11] 15 CANES DDG units for CY15, 15 CANES DDG units for CY16, and 15 CANES DDG units for CY17; and the evaluated CPFF amount for the CANES Analysis and Assessment work in CLIN 0005.

The solicitation provided for a price analysis of the CLIN 0001 prices in accordance with Federal Acquisition Regulation (FAR) § 15.404-1(b). The RFP also provided for evaluation of the “extent to which evidence of unbalanced pricing exists, between CLINs 0001 calendar year . . . pricing, and between different quantities within one CY.” Id. at 87. A cost realism analysis of the CLIN 0005 prices was also to be performed, with proposed costs to be adjusted based on the results.

Seven offerors submitted proposals by the August 21, 2013 closing date. The agency evaluated the proposals, established a competitive range consisting of all seven, and conducted discussions with each offeror. Each offeror submitted a final proposal revision (FPR) by the May 21, 2014 due date. After reviewing the FPRs, agency technical and price evaluators rated the proposals as follows:

 

Offeror   Factor 1     Factor 2     Factor 3       Factor 4     Factor 5   Evaluated Price[12]
BAE       Acceptable   Good         Satisfactory   Acceptable   Pass       $108 M.
CGI       Good         Acceptable   Satisfactory   Acceptable   Pass       $129 M.
DRS       Good         Acceptable   Substantial    Acceptable   Pass       $128 M.
GD        Good         Acceptable   Substantial    Acceptable   Pass       $106 M.
GTS       Acceptable   Acceptable   Satisfactory   Neutral      Pass       $94 M.
NG        Acceptable   Acceptable   Satisfactory   Good         Pass       $85 M.
Serco     Good         Good         Substantial    Acceptable   Pass       $97 M.


Source Selection Evaluation Board (SSEB) Report, June 30, 2014, at 8; Price Evaluation Board (PEB) Report, July 21, 2014, at 19.

The SSEB identified [deleted] and no weaknesses in the protester’s proposal under factor 1, leading to the rating of good. The [deleted] strengths pertained to CGI’s [deleted]. Under factor 2, the evaluators identified no particular strengths or weaknesses, leading to a rating of acceptable for the factor. Under factor 3, the SSEB assigned the protester a performance confidence rating of satisfactory based on its finding that while the contracts cited by the protester were “somewhat recent to recent and relevant” and reflected “Very Good to Exceptional quality,” the majority of the performance on two of the protester’s three contracts occurred more than five years ago, “reducing the relevancy of the high quality ratings and leading to a reasonable rather than high expectation of successful performance.” SSEB Report at 24. The SSEB assigned the protester’s proposal a rating of acceptable under factor 4, finding that because CGI’s proposal “did not clearly demonstrate” that the protester [deleted], a rating higher than acceptable could not be supported. Id. at 27.

A source selection advisory council (SSAC) reviewed the findings of the technical and price evaluation teams. Taking into account the ratings under the five non-price factors only, the SSAC ranked the proposals in the following order:

Technical Ranking

Rank   Offeror
1      Serco
2      DRS
3      GD
4      CGI
5      BAE
6      NG
7      GTS


SSAC Report, Aug. 4, 2014, at 36. The SSAC then conducted a best value price/technical tradeoff taking into account the findings of both the SSEB and the PEB, which resulted in the following ranking of proposals:

Best Value Ranking

Rank   Offeror
1      Serco
2      GD
3      NG
4      GTS
5      BAE
6      DRS
7      CGI


Id. at 43.

With regard to its decision that the proposals of the other six offerors represented a better value than the protester’s, the SSAC explained as follows:

The SSAC ranked CGI as the lowest value proposal. CGI’s evaluated price was the highest of all Offerors, with an evaluated premium of 50.7 percent over the lowest priced Offeror, NG, and on par with DRS at 49.8 percent over NG. However, where DRS’ proposal was assessed with significant discriminators in non-price factors over all but Serco’s proposal, the non-price comparison showed CGI’s proposal was less superior than Serco, DRS, and GD’s proposals. As these three Offerors evaluated prices were also evaluated as lower than CGI’s, the SSAC considered these proposals of greater value to the Government than CGI’s.

 

In comparison to BAE, both Offerors were found to have equivalent past performance in Factor 3, Satisfactory Confidence with a reasonable expectation of success. CGI was rated slightly superior in technical non-price evaluation based on a Good rating with [deleted] among the five elements of Factor 1, the most important factor, compared to BAE’s Acceptable rating with [deleted] in Factor 1 and [deleted] in the less important Factor 2 for [deleted]. Despite the difference in the number of Factor 1 strengths identified between CGI and BAE, BAE met requirements in all areas of Factor 1 with a logical, detailed proposal, and CGI’s Factor 1 superiority is slightly offset by BAE’s Factor 2 Good rating with [deleted] compared to CGI’s Acceptable rating with an [deleted]. The SSAC determined that the evaluated price premium between the proposals is so high that it diminished the value of CGI’s superiority in Factor 1. The SSAC did not consider the value of the slight non-price discriminator in favor of CGI to warrant CGI’s evaluated price premium of 19.2 percent over BAE.

 

Finally, as stated with more detail above, with an evaluated price premium of 50.7 percent over NG, the SSAC did not consider CGI’s [deleted] in Factor 1 to warrant the evaluated price premium when compared to NG’s [deleted] in Factor 1. Similarly, the SSAC did not consider CGI’s discriminators in Factor 1 against GTS’ [deleted] in Factor 1 to merit the evaluated price premium of 37.6 percent. The SSAC determined that CGI’s price premium significantly diminished the value of its technical superiority on the non-price factors. The SSAC therefore concluded that CGI’s proposal was of lesser value to the Government than those of NG or GTS.

Id. at 49-50.

The SSAC recommended that despite the solicitation language indicating that the agency intended to award up to three contracts, award be made to the five top-ranked offerors. In conjunction with its decision to increase the number of awards, the agency decided to change its strategy for placing delivery orders with the selected contractors. Specifically, the agency decided that it would not place 3-4 orders of larger quantities of CANES units annually as originally planned, but rather it would issue more orders for smaller quantities of CANES units in order to achieve more competition on a per order basis. In the foregoing connection, the SSAC explained as follows:

Although the RFP indicated intent to award 3 contracts, the SSAC believes that given the best value outcome of the source selections, award to more than three is more advantageous to eliminate risk to delivery order competition that would result if all three Offerors do not bid on every order.

 

The SSAC believes that executing awards to five (5) Offerors provides best value for the Government, given the result of the evaluation. The SSAC believes that the amount of CANES work to be procured over the life of this contract, as estimated in Factor 2 of the RFP, will support five (5) awardees. The Government will change its DO [deliver order] strategy to provide continuous opportunity, for example by breaking up requirements into 10-12 DOs annually vice 3-4 annually as originally conceived. This should ensure competition even if one or two awardees decide not to bid on a particular order.

Id. at 51.

The source selection authority (SSA) essentially adopted the SSAC’s findings. In agreeing with the SSAC’s recommendation that CGI not receive an award due to its higher price as evaluated at the 15-unit ordering level, the SSA noted in relevant part as follows:

The best value tradeoff analysis and resulting recommendation from the SSAC leads to my decision not to award to DRS and CGI based on the extent of the price premium associated with their proposals; compared to the other offerors, and considering the particular technical discriminators I have found that DRS’ and CGI’s prices are so significantly high as to diminish the technical merit of their proposals.

Source Selection Decision Document, Aug. 6, 2014, at 8.

The agency awarded contracts to the five selected firms on August 20. CGI timely requested a debriefing, which the agency furnished on August 27. CGI protested to our Office on September 2.[13]

DISCUSSION

CGI raises a number of challenges to the agency’s evaluation of proposals. As a threshold matter, the protester argues the agency used a flawed price evaluation methodology, which produced a misleading result. In this regard, CGI contends that the agency knew prior to award that the solicitation’s stated approach to evaluating offerors’ prices, based solely on the highest order level of 15 units, departed from how the agency actually intended to order the units, which was to place orders for much lower levels. Accordingly, CGI maintains that the agency should have amended the solicitation’s price evaluation scheme to comport with the agency’s actual ordering needs. CGI further contends that the agency failed to conduct a price realism analysis, as required by the terms of the solicitation, and that it failed to recognize that the prices of one of the awardees were unbalanced. CGI also argues that the agency engaged in unequal discussions, assigned its proposal too low a rating for past performance, and unreasonably assigned NG’s proposal a performance confidence rating of satisfactory. As discussed below, we find that the protester’s first argument has merit and sustain the protest on this ground. We deny the protester’s remaining arguments.[14]

Misleading Price Evaluation Methodology

CGI contends that the agency’s price evaluation methodology, which provided for comparing offerors’ prices at the maximum order level of 15 units, did not match the agency’s planned ordering needs as determined by the agency prior to award. Given this disconnect, the protester argues that the agency was required to amend the solicitation’s evaluation scheme to provide a reasonable basis for comparing offerors’ prices, one which matched the agency’s ordering needs. CGI further maintains that if the price analysis had been based on offerors’ NTE unit prices for quantities of 5 per delivery order, which is far more in line with the agency’s revised acquisition strategy,[15] its evaluated price would have been [deleted], rather than highest, which would clearly have had an impact on the best value tradeoff decision.[16] The protester supports its argument with computations, which do indeed show that when offerors’ prices are evaluated using the NTE prices for quantities of 5, as opposed to 15, per delivery order, its total evaluated price is lower than the total evaluated prices of several other offerors. Protester’s Comments, Oct. 30, 2014, Encl. 2 and Exh. E. We agree with the protester.

This case turns on two fundamental principles. One is that, while it is up to the agency to decide on some appropriate and reasonable method for evaluating offerors’ prices, an agency may not use an evaluation method that produces a misleading result. Raymond Express Int’l, B-409872.2, Nov. 6, 2014, 2014 CPD ¶ 317 at 6; Air Trak Travel et al., B-292101 et al., June 30, 2003, 2003 CPD ¶ 117 at 22. That is, the method chosen must include some reasonable basis for evaluating or comparing the relative costs of proposals, so as to establish whether one offeror’s proposal would be more or less costly than another’s. Id.

The other is that where an agency’s requirements materially change after a solicitation has been issued, it must issue an amendment to notify offerors of the changed requirements and afford them an opportunity to respond. FAR § 15.206(a); Murray-Benjamin Elec. Co., L.P., B-400255, Aug. 7, 2008, 2008 CPD ¶ 155 at 3-4. For example, where an agency’s estimate for the amount of work to be ordered under an ID/IQ contract changes significantly, prior to award, the agency must amend the solicitation and provide offerors an opportunity to submit revised proposals. See Symetrics Indus., Inc., B-274246.3 et al., Aug. 20, 1997, 97-2 CPD ¶ 59 at 6. In Symetrics, our Office concluded that the agency should have amended a solicitation for an ID/IQ contract because although the solicitation initially estimated the agency would require 3,755 sequencers, the agency subsequently learned--prior to award--that it no longer had a requirement for 3,219 of the sequencers. Id. Similarly, in Northrop Grumman Info. Tech., Inc., et al., B-295526 et al., Mar. 16, 2005, 2005 CPD ¶ 45 at 13, our Office sustained a protest where the Department of the Treasury, prior to award, negotiated a memorandum of understanding with OMB and the General Services Administration that significantly changed the approach set forth in the solicitation and the FAR for determining whether to exercise contract options, making it significantly less likely that the options, which were part of the evaluation, would be exercised.

The circumstances here are unusual in that the meaningfulness of the price evaluation scheme set forth in the RFP changed between the closing date for receipt of FPRs and the date of award. That is, the record reflects that it made perfect sense to evaluate on the basis of 15/each unit pricing when the agency intended to award three contracts and issue 3-4 delivery orders per year. Given the agency’s projection that it would acquire approximately 45-60 units annually in FYs 2014-2017, each order would necessarily need to be placed at the 15/each unit level based on the intended number of orders issued. Evaluating on the basis of 15/each unit pricing no longer provided a rational basis for comparison, however, when, during the source selection process, the agency decided to increase the number of awardees to five, and to alter its ordering strategy to significantly increase the number of delivery orders annually, thereby decreasing the number of units to be acquired per delivery order. Given this fundamental shift in the agency’s anticipated ordering plans, it was unreasonable for the agency to proceed with a price evaluation methodology that was divorced from these plans. Rather, the appropriate course of action was for the agency to amend the solicitation in a manner that would enable it to evaluate, and make a tradeoff decision based on, the offerors’ relative relevant prices.

The agency defends its actions on the basis that it followed the terms of the solicitation and that all offerors competed on an equal basis because the solicitation did not establish that the agency would, in fact, place orders at the 15/unit level. Regarding the latter point, the agency notes that offerors were to submit prices at each level and should have understood that orders could be placed at any of the 15 price levels. The agency’s arguments, however, miss the point. We agree that the agency followed the terms of the solicitation, and that the offerors submitted prices on an equal basis. The problem is that the price evaluation, and the resulting selection decision under which CGI did not receive an award due to its high price, were based on comparing prices for quantities of units that the agency now knows it does not intend to order. We recognize that price evaluation in the context of an ID/IQ contract may be representative, and therefore something of a fiction; nevertheless, the fiction employed must bear some rational relationship to the agency’s needs. See CW Gov’t Travel, Inc.--Recon.; CW Gov’t Travel, Inc., et al., B-295330.2 et al., July 25, 2005, 2005 CPD ¶ 139 at 4-5. Where the agency’s intended ordering strategy does not anticipate placing orders at the 15-unit-per-order level, we fail to see how comparing prices at this level, and using such prices as the basis for a tradeoff decision, can be understood to be reasonable.[17] As explained above, where the disconnect between the terms of the solicitation and the agency’s ordering needs became apparent prior to award, it was incumbent on the agency to amend the solicitation to correct the flaw.

Price Realism

CGI next argues that the solicitation required a price realism analysis, but that the agency failed to perform one. The protester contends that since pricing for the out-years, which comprises a substantial percentage of each offeror’s evaluated price, is nonbinding, only an effective price realism analysis will ensure that the Navy will not be “taking on unknown price risk as a result of unrealistically low out-year pricing to which no offeror could be held.” Protest at 8. In response, the agency maintains that the RFP neither contemplated, nor required, a price realism analysis, and that it did in fact perform a price analysis that established the realism of offerors’ prices.

As previously noted, the solicitation provided that proposals that were “unrealistically high or low in terms of price” might be “deemed to be reflective of an inherent lack of technical competence, or indicative of a failure to comprehend the complexity and risks of the proposed work” and might “be grounds for rejection of the proposal.” RFP at 85. We have previously held that where a solicitation advises offerors that unrealistically low prices may serve as a basis for rejection of a proposal, it is implicit that the agency will consider whether offerors’ prices are in fact unrealistic. Esegur-Empresa de Segurança, SA, B-407947, B-407947.2, Apr. 26, 2013, 2013 CPD ¶ 109 at 4. In other words, where a solicitation advises that unrealistically low prices may serve as a basis for rejection of a proposal, the agency must perform a price realism analysis. Logistics 2020, Inc., B-408543, B‑408543.3, Nov. 6, 2013, 2013 CPD ¶ 258 at 8.

The nature of the analysis required to assess whether an offeror’s price is so unrealistically low as to reflect a lack of technical competence or understanding is within the agency’s discretion, however. AMEC Earth & Envtl., Inc., B-404959.2, July 12, 2011, 2011 CPD ¶ 168 at 8. Agencies may use a variety of price evaluation methods to assess realism, including a comparison of prices received to one another, to previously proposed or historically paid prices, or to an independent government estimate. General Dynamics-Ordnance & Tactical Sys., B-401658, B‑401658.2, Oct. 26, 2009, 2009 CPD ¶ 217 at 3.

Here, the record shows that while the agency did not refer to price realism in its price analysis, it did perform an analysis to assess whether offeror prices were too low; an analysis that focuses on whether offeror prices are too low, as opposed to unreasonably high, is in essence a realism analysis. The Matthews Group, Inc. t/a TMG Constr. Corp., B-408003.3, B-408004.3, Mar. 21, 2014, 2014 CPD ¶ 104 at 8-9. In this connection, the Business Clearance Memorandum’s (BCM) Supporting Price Analysis noted that while the PEB Report of July 21, 2014, found offerors’ FPR CLIN 0001 prices to be fair and reasonable, “given the degree of unit price decreases from previous prices paid for CANES DDG production units,” the agency had conducted additional price analysis. BCM, Att. 9 (Supporting Price Analysis, July 21, 2014), at 1. The protester maintains that this analysis, which involved comparison of the proposed CLIN 0001 CANES DDG unit prices to historical prices paid, focused on whether the prices offered were reasonable, as opposed to realistic. The introductory language, however, acknowledges that the pricing had already been determined fair and reasonable and indicates that the additional analysis was performed in light of the decreases from previous prices paid; it is therefore clear that the additional analysis was focused on whether the pricing was too low.

The supporting price analysis noted that the historical prices had been proposed “under significantly different contractor risks and materially differing terms and conditions than those in the current CANES build-to-print effort.” Id. Among the differences were that the preceding contractor was responsible for developing and controlling the CANES Production Baseline and meeting the CANES Functional Specification requirements; for end-of-life product replacements and updates to the bill of materials; and for completing first article testing while simultaneously building CANES production units. Id. at 1-2. According to the agency, “[t]hese risks do not exist under [the current RFP], which is a build-to-print production effort in accordance with a Government provided, Government controlled PBL, shifting much of the overall risk to the Government.” Id. at 2. Another difference impacting pricing noted by the agency was that the unit prices under the preceding contract included the price of all network software necessary to meet the CANES functional specification requirements, whereas under the instant RFP, some of the software is to be provided by the government.

After adjusting the historical unit price for a single DDG unit downward to take into account the cost of the software noted above, the agency compared the adjusted price to the prices proposed by the offerors here for a single DDG unit. The agency found that six of the seven offerors had proposed prices less than the historical price as adjusted, and that the proposed prices were “within 14 percentage points of each other in a range from 18.5% to 32.5% below” the historical price. Id. The agency concluded that this “range of proposed pricing” was “reasonable” given the different contract risks. The agency also compared the average adjusted historical price for quantities 1 to 15 to the average proposed prices of the offerors here, and concluded that “[n]one of the differences from the adjusted historical average are so large as to be considered unreasonable given the difference in risk to the contractor under the current RFP requirements compared to those risks relevant to the FY 13 historical prices.” Id. at 5. Again, while we recognize that the agency refers to the reasonableness, as opposed to the realism, of offeror prices in the above findings, it is clear from the context that the findings in fact pertain to the realism of the proposed pricing. In our view, the comparisons performed by the agency provided it with a reasonable basis for concluding that none of the offerors here proposed prices that were so unrealistically low as to reflect a lack of technical competence or understanding.

Unbalanced Pricing

CGI further argues that the Navy unreasonably failed to recognize that GD’s pricing was unbalanced. In this regard, the protester contends that the drop in GD’s evaluated price for the best estimated quantity (BEQ) of 15 between CY14 and CY17 was much greater than the drop in other offerors’ evaluated prices. According to CGI, this “should have given the SSA great pause, as it was a strong indicator that GD was frontloading its price.” Protester’s Comments, Oct. 14, 2014, at 35.

The protester’s argument lacks merit. Unbalanced pricing exists where the prices of one or more line items are significantly overstated, despite an acceptable total evaluated price (typically achieved through underpricing of one or more other line items). Inchcape Shipping Servs. Holding, Ltd., B-403399.3, B-403399.4, Feb. 6, 2012, 2012 CPD ¶ 65 at 4. The protester has not demonstrated--indeed, it has not even alleged--that GD’s evaluated price for the CY14 BEQ is overstated, and given that GD’s evaluated price for the CY14 BEQ is lower than the protester’s, the protester could not reasonably make such an allegation. Moreover, there is little, if any, risk that award to GD will result in the government paying unreasonably high prices for contract performance (which is the risk inherent in overstated prices) since in order to prevail in the delivery order competitions, GD will have to propose prices lower than its competitors’ prices.

Unequal Discussions

CGI argues that the agency failed to conduct equal discussions by notifying BAE and GTS that their initial proposed prices were so high as to render award unlikely (and by confirming certain pricing assumptions for NG), but failing to notify it that its price was noncompetitive. CGI contends in this connection that while the Navy was under no obligation to notify any of the offerors here that its price was considered too high (since an agency need not advise an offeror that its prices are not competitive unless the offeror’s prices are so high as to be considered unreasonable and unacceptable for award, which was not the case with regard to any offeror here),[18] DeTekion Security Sys., Inc., B-298235, B-298235.2, July 31, 2006, 2006 CPD ¶ 130 at 15, once the agency elected to raise the issue of noncompetitively high prices with some offerors, it was obligated to raise the issue with all offerors whose prices were considered high. See AMEC Earth & Envtl., Inc., B-401961, B‑401961.2, Dec. 22, 2009, 2010 CPD ¶ 151 at 6 (where agency decides to hold discussions with firms that go beyond the FAR’s minimum requirements, it is incumbent upon the agency to do so with all offerors equally).

In response, the agency argues that BAE and GTS’s initial proposed prices were so significantly out of line with other offerors’ prices as to render them noncompetitive and to make contract award to either unlikely, but that neither the protester’s initial price, nor its final price, was so out of line with its competitors’ prices as to be considered noncompetitive. In the foregoing connection, the initial evaluated prices of BAE and GTS were [deleted] and [deleted] million, respectively, whereas the initial prices of the other five offerors were [deleted] (Serco), [deleted] (GD), [deleted] (NG), [deleted] (DRS), and [deleted] million (CGI).[19] PEB Report, Feb. 26, 2014, at 22. After the agency notified BAE and GTS during discussions that their pricing was so high as to render their proposals not competitive and make contract award to them unlikely, both BAE and GTS dramatically reduced their proposed prices--BAE from [deleted] to $108 million, and GTS from [deleted] to $94 million. NG also reduced its final price significantly (from [deleted] to $85 million). PEB Report, July 21, 2014, at 19. As a result of the significant decreases in these offerors’ prices, the protester’s evaluated price became high (despite the fact that CGI made little change to its price). According to the agency, however, the protester’s final price was not so out of line that it was apparent to the evaluators that the protester would have little chance of award. Rather, the agency maintains, the protester’s price was recognized as an obstacle to award only after the source selection officials weighed the technical advantages associated with the protester’s proposal against the price advantages associated with other proposals as part of their price/technical tradeoff assessment.

We find the agency’s actions unobjectionable. While the protester’s initial pricing was the third highest, it was not so high as to be clearly noncompetitive (in contrast to BAE’s and GTS’s prices); thus, we fail to see that the agency treated the protester unequally by failing to notify it during discussions that its price was too high. Moreover, we do not think that the agency was obligated to reopen discussions with CGI after FPRs were received and the protester’s price, as a result of the drastic reductions in other offerors’ prices, became high, given that, according to the agency, the evaluators still did not regard the protester’s overall evaluated price as so out of line with competitors’ prices as to render it noncompetitive. The fact that the price premium associated with the protester’s proposal ultimately became a determinative issue in the agency’s best value tradeoff decision does not call into question the reasonableness of that determination. Booz, Allen & Hamilton, Inc., B‑249236.4, B-249236.5, Mar. 5, 1993, 93-1 CPD ¶ 209 at 7. Accordingly, we deny this basis of protest.

Past Performance

The protester challenges the agency’s evaluation of its past performance, arguing that its proposal should have received a performance rating of substantial, as opposed to satisfactory, confidence. CGI also argues that it was unreasonable for the evaluators to assign NG’s proposal a performance confidence rating of satisfactory.

The evaluation of past performance is a matter within the discretion of the contracting agency. In reviewing an agency’s evaluation of past performance, we will not reevaluate proposals, but instead will examine the agency’s evaluation to ensure that it was reasonable and consistent with the solicitation. Maywood Closure Co., LLC, B-408343 et al., Aug. 23, 2013, 2013 CPD ¶ 199 at 5.

The RFP instructed offerors to provide past performance information on a maximum of three previous government contracts relevant to the effort here. The solicitation explained that “relevant and recent past performance” was defined as experience in the previous five years that demonstrated “production experience equivalent to assembly and integration of COTS equipment into a multiple rack system that has been fielded on US Navy ships and/or submarines.”[20] RFP at 68, 86. The contracts identified by the offerors were to be rated as very relevant, relevant, somewhat relevant, or not relevant.[21] Considering both the relevance of the identified contracts and the quality of the offerors’ performance on them, performance confidence ratings of substantial, satisfactory, limited, no, or unknown (neutral) confidence were to be assigned. As relevant to the protest here, ratings of substantial and satisfactory confidence were defined as follows:

· Substantial Confidence: Based on the Offeror’s recent/relevant performance record, the Government has a high expectation that the Offeror will successfully perform the required effort.

· Satisfactory Confidence: Based on the Offeror’s recent/relevant performance record, the Government has a reasonable expectation that the Offeror will successfully perform the required effort.

Id.

In its proposal, CGI identified, and submitted past performance information pertaining to, three prior (or ongoing) contracts. The SSEB rated each of the three for recency, relevance, and quality of performance as follows:

 

             Contract 1                     Contract 2                             Contract 3
Recency      Apr. 2008-Apr. 2016, Ongoing   Aug. 2003-Aug. 2009, Somewhat Recent   Oct. 2002-Sept. 2008, Somewhat Recent
Relevance    Relevant                       Relevant                               Relevant
Quality      [deleted]                      [deleted]                              [deleted]


SSEB Report, June 30, 2014, at 24-25.

As previously noted, based on the above information, the SSEB assigned the protester’s proposal a rating of satisfactory confidence, noting that “[f]or two of the three contracts, the majority of the performance occurred more than five years ago, reducing the relevancy of the high quality ratings and leading to a reasonable rather than high expectation of successful performance.” Id. at 24.

CGI argues that it was unreasonable for the evaluators to discount the high quality ratings of its performance on two of the three contracts on the basis that the majority of the contract performance occurred more than five years ago. The protester contends that the pertinent issue is not the percentage of performance that occurred outside the 5-year window, but rather the extent of the performance that occurred within the 5-year window. The protester maintains that it had 15 months of performance on contract 2 and 4 months of performance on contract 3 within the 5‑year window.

In response, the agency argues that the protester’s argument ignores the fact that the past performance information furnished by CGI in its proposal pertained almost entirely to performance outside the 5-year window. That is, the latest CPARS [Contractor Performance Assessment Reporting System] record provided by the protester for contract 2 was from August 2008, meaning that it covered only 3 months of performance within the 5-year window, and the only CPARS record for contract 3 was from 2007, meaning that none of the performance rated was within the window.

Again, we find the agency’s evaluation to be unobjectionable. The RFP placed offerors on notice that experience within the past 5 years would be viewed more favorably than experience outside that window; in addition, the solicitation provided offerors with notice that “[t]he burden of providing thorough and complete past performance information” was with the offeror. RFP at 68. Accordingly, where the past performance information provided by the protester covered only a 3-month period of performance on contract 2, and no performance on contract 3, within the relevant window,[22] we think that it was both reasonable and consistent with the terms of the RFP for the evaluators to give less weight to the high quality ratings of the protester’s performance on those contracts. Moreover, given the lesser weight of that performance, we think that the evaluators could reasonably have concluded that there was a reasonable, as opposed to a high, expectation of successful performance associated with the protester’s proposal.

As noted above, the protester also argues that NG’s performance should not have been rated as satisfactory confidence. CGI contends in this regard that one of the three contracts cited by NG as relevant past performance was the predecessor contract to the RFP here, and the most recent CPARS for the contract rated NG’s performance as [deleted] in every evaluated area. CGI further argues that on another of NG’s cited contracts, NG received [deleted] ratings in several areas.[23]

The agency responds that while negative information pertaining to NG’s performance on the predecessor contract contributed to an initial past performance rating of [deleted] for NG’s proposal, the weakness was raised with NG during discussions, and in its FPR, NG provided additional information pertaining to performance improvements subsequent to the CPARS rating period. In the foregoing connection, the SSEB explained as follows:

The SSEB noted [NG] had [deleted] performance on quality and schedule based on CPARS. [NG] provided sufficient detail in its FPR articulating problems encountered and steps taken to remedy CANES production quality and schedule issues. The overall condition and timeliness of units at delivery has improved. [NG] has improved previous errors in wiring interconnection and cabling in comparison to the Technical Data Package (TDP). [NG’s] performance indicates significant improvement in meeting schedule requirements.

 

SSEB Report, June 30, 2014, at 60. The SSEB further explained that it had given NG’s performance on the second contract less weight than its other relevant past performance because the performance assessments/surveys were provided by a prime contractor, rather than the government. The agency also noted that while NG had received several element ratings of [deleted], it had received more element ratings of [deleted].

Taking into account the foregoing information, the evaluators could reasonably have assigned NG’s proposal a past performance rating of satisfactory confidence. Accordingly, we find the protester’s argument pertaining to NG’s past performance rating to be without merit.

RECOMMENDATION

We recommend that the agency amend the solicitation’s price evaluation methodology to reflect its actual ordering needs, invite the offerors to submit revised prices, and make a new source selection decision. If, as a result, a different group of offerors is selected for award, the agency should terminate the contracts awarded to non-selected offerors and make awards to any newly selected offerors. We also recommend that the agency reimburse the protester for the reasonable costs of filing and pursuing the protest, including reasonable attorneys’ fees. 4 C.F.R. § 21.8(d)(1) (2013). The protester’s certified claim for costs, detailing the time spent and the costs incurred, must be submitted to the agency within 60 days after receipt of this decision.

The protest is sustained in part and denied in part.

Susan A. Poling
General Counsel



[1] The five awardees are BAE Systems Technology Solutions & Services, Inc. (BAE), of Rockville, Maryland; General Dynamics C4 Systems (GD), of Taunton, Massachusetts; Global Technical Systems (GTS), of Virginia Beach, Virginia; Northrop Grumman Systems Corp. (NG), of Herndon, Virginia; and Serco Inc., of Reston, Virginia.

[2] According to the solicitation, CANES architecture on a unit level platform may have between 15 and 18 server, switch, media, and data storage racks; support approximately 80 applications on server hosted and connected systems; and service between 106 and 126 wireless and hard-wired workstations, plus peripheral devices. Force level platforms may have between 38 and 49 server, switch, media, and data storage racks; support approximately 120 software applications; and include anywhere from 450 to 960 workstations, plus peripheral devices. RFP (Rev. 11) at 66-67.

[3] The solicitation indicated that fair opportunity competitions for the delivery orders would “employ low-price-technically-acceptable selection criteria for fixed priced orders, unless criteria based on a cost technical trade-off is determined to be in the Government’s best interest.” Id. at 17.

[4] Other CLINs and their associated price ceilings were as follows:

CLIN   DESCRIPTION                               CEILING AMOUNT
0002   Contract Data Requirements List (CDRL)    Not Separately Priced (NSP)
0003   CANES Spares/System Components            $96,000,000
0004   CDRL                                      NSP
0005   CANES Analysis & Assessment (CPFF)        $4,000,000
0006   CANES Analysis & Assessment (FFP)         $4,000,000
0007   CANES Analysis & Assessment--ODCs         $2,000,000
0008   CDRL                                      NSP


AR at 6.

[5] The chart further indicated that for FYs 2018, 2019, 2020, 2021, and 2022, anticipated acquisitions of CANES units were 46, 40, 20, 7, and 5 respectively.

[6] A delivery order for one DDG CANES production unit was to be issued to each awardee at the time of contract award. Id. at 65.

[7] The solicitation explained that an offeror would receive a rating of pass under the brand name or equal factor if it submitted “an explicit statement indicating intent to produce the DO [delivery order] 0001 CANES DDG unit using the brand name components listed in the CANES PBL [production baseline],” or if “the Government determines that proposed ‘equal’ component(s) meet the requirements of ‘equal’ as set forth in Section L, Attachment L-4.” Id. at 87.

[8] Under factor 4, a rating of neutral was also possible.

[9] In other words, offerors were to provide an NTE price per unit for a quantity of 1, an NTE price per unit for a quantity of 2, an NTE price per unit for a quantity of 3, and so on up to a quantity of 15.

[10] In responding to a subsequently abandoned argument raised by the protester pertaining to the meaning of this language, the agency explained that price quantities would be determined on an individual delivery order basis, as opposed to on a cumulative, annualized basis. That is, for example, if the agency issued a CY 14 delivery order for 7 DDGs, and then issued a second CY14 delivery order for 3 DDGs, contractors would compete for the second delivery order using their NTE unit prices for a quantity of 3, as opposed to their NTE unit prices for a quantity of 10. Supplemental Agency Report, Oct. 23, 2014, at 12.

[11] The RFP indicated that 15 units was the agency’s best estimate of the CANES DDG quantities expected to be procured during each of the four calendar years.

[12] Evaluated prices are rounded to the nearest million dollars. The evaluated cost for the CLIN 0005 analysis and assessment work constituted a minuscule portion (i.e., less than 0.5 percent) of the overall evaluated prices; that is, overall evaluated prices were based almost entirely on DDG unit prices.

[13] The other offeror not selected for award, DRS, also protested to our Office. We address its arguments in a separate decision.

[14] CGI raised multiple arguments in addition to those that we discuss in this decision. While we do not address every argument raised, instead focusing on what we regard as the protester’s strongest arguments, we did consider all of the arguments, and find that none of the other arguments provides a basis for sustaining the protest.

[15] The protester maintains that a quantity of 5 per delivery order is in line with the agency’s revised acquisition strategy. This contention is supported by the agency’s status update briefing document of August 5, 2014, which indicates that the agency will change its delivery order strategy “to provide continuous opportunity (i.e. 10-12 DOs annually, smaller vs. larger procurement QTYs),” and which describes the smaller number of units per delivery order as “[n]otionally, 1-3 Unit Level (U/L) and 1-2 Force Level (F/L) CANES per DO.” CANES Production Contract Status Update, August 5, 2014, at 15.

[16] As noted in the background section of this decision, offerors were to provide not to exceed unit prices for quantities of 1/each to 15/each. The various offerors proposed differing rates of discount on the incremental quantities. For example, the NTE unit prices (rounded to the nearest million) for each of the seven offerors for quantities of 3, 5, 7, 10, 12, and 15 for CY14 were as follows:

 

Offeror   Qty 3       Qty 5       Qty 7       Qty 10      Qty 12      Qty 15
BAE       [deleted]   [deleted]   [deleted]   [deleted]   [deleted]   [deleted]
CGI       [deleted]   [deleted]   [deleted]   [deleted]   [deleted]   [deleted]
DRS       [deleted]   [deleted]   [deleted]   [deleted]   [deleted]   [deleted]
GD        [deleted]   [deleted]   [deleted]   [deleted]   [deleted]   [deleted]
GTS       [deleted]   [deleted]   [deleted]   [deleted]   [deleted]   [deleted]
NG        [deleted]   [deleted]   [deleted]   [deleted]   [deleted]   [deleted]
Serco     [deleted]   [deleted]   [deleted]   [deleted]   [deleted]   [deleted]


PEB Report at 12-18.

[17] As noted above, CGI appears to have been prejudiced by the agency’s error where, had the agency considered offerors’ prices at a level consistent with the agency’s intended ordering strategy, CGI may have been in line for an award.

[18] In its initial price evaluation report, the PEB found the overall evaluated prices of all seven offerors to be reasonable. PEB Report, Feb. 26, 2014, at 27, 41, 52, 78, 88, 100, and 118.

[19] The prices are rounded to the nearest million.

[20] According to the agency, since the RFP was released on May 30, 2013, “the previous 5 years” is June 2008 to May 2013. Agency Report, Oct. 2, 2014, at 23 n.10.

[21] A rating of very relevant was to be assigned if the present/past effort involved “essentially the same scope and magnitude of effort and complexities” as the effort here, and ratings of relevant, somewhat relevant, and not relevant if the present/past effort involved similar, some of, or little or none of the scope and magnitude of the effort here, respectively. RFP at 90.

[22] We recognize that the protester maintains that it should have received credit for performance within the 5-year window on contract 3 because the Navy contacted the government point of contact for the contract and confirmed that “contract performance represented significant experience with rack production in a Build‑To‑Pricing environment, and that the Offeror has ‘a lot of hard experience in C2 tactical communication and C41 systems.’” Protester’s Comments, Oct. 30, 2014, at 22, citing SSEB Report, June 30, 2014, at 25. We do not agree with the protester’s contention that these comments confirm that it had superior performance on the contract within the 5-year period. While the comments confirm that the protester has relevant experience, they do not address the quality of that experience.

[23] The quality of NG’s performance on its third contract was [deleted], but the evaluators gave the contract lesser weight in their evaluation because it was found to be only somewhat relevant.
