DRS Laurel Technologies

B-410330: Dec 10, 2014

Contact:

Ralph O. White
(202) 512-8278
WhiteRO@gao.gov

Kenneth E. Patton
(202) 512-8205
PattonK@gao.gov

 

Office of Public Affairs
(202) 512-4800
youngc1@gao.gov

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: DRS Laurel Technologies

File: B-410330

Date: December 10, 2014

Mark D. Colley, Esq., Craig A. Holman, Esq., Stuart W. Turner, Esq., Dominique L. Casimir, Esq., Steffen G. Jacobsen, Esq., and Thomas D. McSorley, Esq., Arnold & Porter LLP, for the protester.
Michael R. Charness, Esq., David R. Johnson, Esq., Jamie F. Tabb, Esq., Erin N. Rankin, Esq., and Elizabeth Krabill McIntyre, Esq., Vinson & Elkins LLP, and Catherine K. Ronis, Esq., BAE Systems, Inc., for BAE Systems Technology Solutions and Services Inc.; David A. Churchill, Esq., Kevin P. Mullen, Esq., and James A. Tucker, Esq., Jenner & Block LLP, for General Dynamics C4 Systems;
Jonathan J. Frankel, Esq., John P. Janecek, Esq., Brett J. Sander, Esq., and Nichole A. Best, Esq., Frankel PLLC, Anne B. Perry, Esq., and Townsend L. Bourne, Esq., Sheppard Mullin Richter & Hampton LLP, and William J. Colwell, Esq., and Linda T. Maramba, Esq., Northrop Grumman Corp., for Northrop Grumman Systems Corp.; Kelly E. Buroker, Esq., Kevin P. Connelly, Esq., Kirsten W. Konar, Esq., Jacob W. Scott, Esq., Caroline A. Keller, Esq., and Kyle E. Gilbertson, Esq., Vedder Price, P.C., for Serco, Inc., the intervenors.
Marian Ciborski, Esq., Susanna Torke, Esq., and Laura Biddle, Esq., Department of the Navy, Space and Naval Warfare Systems Command, for the agency.
Jennifer D. Westfall-McGrail, Esq., and Edward Goldstein, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1. Protest arguing that agency failed to perform required price realism analysis is denied where agency compared offerors’ pricing to historical pricing for purposes of determining whether offerors’ prices were too low.

2. Protest that awardees’ prices are unbalanced is denied where record fails to show that any of awardees’ line item prices are overstated.

3. Agency did not conduct unequal discussions by notifying other offerors, but not the protester, that their prices were noncompetitive where protester’s prices were not viewed as noncompetitive until the agency conducted its tradeoff analysis.

4. Protest that agency did not give protester's proposal sufficient credit for proposed 2-year workmanship warranty is denied where (1) protester failed to furnish adequate detail regarding the warranty's terms, and (2) agency reasonably concluded that the warranty did not, in any event, exceed the solicitation's requirements.

5. Protest that agency misevaluated protester’s proposal under small business utilization and commitment factor is denied where agency reasonably concluded that protester’s proposal did not establish that protester had met or exceeded its small business subcontracting goals under prior contracts.

DECISION

DRS Laurel Technologies, of Jonestown, Pennsylvania, protests the award of contracts to five other offerors under request for proposals (RFP) No. N00039‑13‑R‑0013, issued by the Department of the Navy, Space and Naval Warfare Systems Command (SPAWAR) for the production of build-to-print network systems to be installed on Navy ships in support of the Consolidated Afloat Networks & Enterprise Services (CANES) program.[1] The protester challenges the agency's price and technical evaluations and argues that the agency conducted unequal discussions.

We deny the protest.

BACKGROUND

The agency explains that CANES is the Navy's designated program to modernize information technology at sea; a CANES system consolidates legacy standalone networks into a single system. The RFP here, which is a follow-on to previously awarded contracts for system design and development and the production of limited deployment units, contemplates the award of contracts pursuant to which production units for unit, force, and submarine platforms will be ordered on a build‑to-print basis.[2] The contractors are to source components; assemble units; load and configure software; perform factory acceptance testing; and deliver CANES units and sub-assemblies as directed by individual delivery orders, for which the firms will compete.[3]

The RFP, issued on May 30, 2013, provided for the award of up to three indefinite‑delivery/indefinite-quantity (IDIQ) contracts with firm-fixed-price (FFP) and cost-plus-fixed-fee (CPFF) contract line items (CLINs). RFP at 56. CLIN 0001, for CANES production units and sub-assemblies, has a ceiling amount of over $2.4 billion.[4] The solicitation included a chart identifying how many units the agency expected to acquire over the life of the contract in each of 19 platform classes, including DDGs (Navy destroyers). The chart indicated that the agency anticipated the acquisition of 46 total CANES units in fiscal year (FY) 2014; 48 CANES units in FY 2015; 62 CANES units in FY 2016; and 61 CANES units in FY 2017.[5] Id. at 67.

The solicitation provided for award to the offerors whose proposals represented the best value to the government, with technical factors significantly more important than price in the determination. Offerors were cautioned that “[p]roposals which are unrealistic in terms of technical or schedule commitments, or unrealistically high or low in terms of price, may be deemed to be reflective of an inherent lack of technical competence, or indicative of a failure to comprehend the complexity and risks of the proposed work and may be grounds for rejection of the proposal.” Id. at 85.

The RFP included five non-price evaluation factors. Under factor 1, which was most important, offerors were to describe their technical approaches to producing one CANES DDG unit.[6] Under factor 2, which was equal in weight to factor 3, offerors were to describe their CANES production processes and capacity. Under factor 3, offerors' past performance was to be evaluated. Factor 4 (Small Business Utilization and Commitment) was less important than factors 1-3. Factor 5 (Brand Name or Equal) was to be evaluated on a pass/fail basis, with a rating of fail rendering the proposal ineligible for award.[7] Proposals were to be rated under factors 1, 2, and 4 as outstanding, good, acceptable, marginal, or unacceptable.[8] Under the past performance factor, proposals were to be rated as substantial, satisfactory, limited, no, or unknown confidence.

For purposes of the price evaluation, offerors were to provide a price for a single CANES DDG unit to be ordered at the time of contract award. Supra at fn. 6. In addition, offerors were to provide not-to-exceed (NTE) FFP unit price ceilings for CLIN 0001 CANES DDG production units for quantities 1/each through 15/each for calendar year (CY) 14, CY15, CY16, and CY17.[9] The RFP advised that the unit price ceilings proposed for CYs 15-17 were for evaluation purposes only and did not apply to the resulting awarded contract. Id. at 70. Offerors were further advised that the government might apply the NTE unit prices proposed for CY14 “in any quantity on a single Delivery Order, or multiple Delivery Orders, up to maximum quantity of fifteen (15) per Delivery Order.” Id. at 6. An offeror’s total evaluated price was to be calculated by adding together its price for the single delivery order 0001 unit; its price for 15 CANES DDG units for CY14,[10] 15 CANES DDG units for CY15, 15 CANES DDG units for CY16, and 15 CANES DDG units for CY17; and the evaluated CPFF amount for the CANES Analysis and Assessment work in CLIN 0005.

The solicitation provided for a price analysis of the CLIN 0001 prices in accordance with Federal Acquisition Regulation (FAR) §15.404-1(b). The RFP also provided for evaluation of the “extent to which evidence of unbalanced pricing exists, between CLINs 0001 calendar year . . . pricing, and between different quantities within one CY.” Id. at 87. A cost realism analysis of the CLIN 0005 prices was also to be performed, with proposed costs to be adjusted based on the results.

Seven offerors submitted proposals by the August 21, 2013 closing date. The agency evaluated the proposals, established a competitive range consisting of all seven, and conducted discussions with each offeror. Each offeror submitted a final proposal revision (FPR) by the May 21, 2014 due date. After reviewing the FPRs, agency technical and price evaluators rated the proposals as follows:

 

Offeror | Factor 1   | Factor 2   | Factor 3     | Factor 4   | Factor 5 | Evaluated Price[11]
BAE     | Acceptable | Good       | Satisfactory | Acceptable | Pass     | $108 M.
CGI     | Good       | Acceptable | Satisfactory | Acceptable | Pass     | $129 M.
DRS     | Good       | Acceptable | Substantial  | Acceptable | Pass     | $128 M.
GD      | Good       | Acceptable | Substantial  | Acceptable | Pass     | $106 M.
GTS     | Acceptable | Acceptable | Satisfactory | Neutral    | Pass     | $94 M.
NG      | Acceptable | Acceptable | Satisfactory | Good       | Pass     | $85 M.
Serco   | Good       | Good       | Substantial  | Acceptable | Pass     | $97 M.


Source Selection Evaluation Board (SSEB) Report, June 30, 2014, at 8; Price Evaluation Board (PEB) Report, July 21, 2014, at 19.

The SSEB identified one major strength, three minor strengths, and no weaknesses in the protester’s proposal under factor 1, leading to the rating of good. The major strength pertained to DRS’s approach to risk identification and management, and the three minor strengths pertained to the protester’s Integrated Master Schedule, its manufacturing plan, and its approach to Factory Acceptance Testing. Under factor 2, the evaluators identified no particular strengths or weaknesses, leading to a rating of acceptable. Under factor 3, the SSEB assigned the protester a performance rating of substantial confidence, finding that its past performance references were recent and relevant, and that they reflected very good to exceptional quality. The SSEB assigned the protester’s proposal a rating of acceptable under factor 4, finding that a rating higher than acceptable was unsupported due to DRS’s failure to furnish information demonstrating the extent to which it had met or exceeded small business subcontracting goals on its prior contracts.

A source selection advisory council (SSAC) reviewed the findings of the technical and price evaluation teams. Taking into account the ratings under the five non-price factors only, the SSAC ranked the proposals in the following order:

Technical Ranking

Rank | Offeror
1    | Serco
2    | DRS
3    | GD
4    | CGI
5    | BAE
6    | NG
7    | GTS

SSAC Report, Aug. 4, 2014, at 36. The SSAC then conducted a best value price/technical tradeoff taking into account the findings of both the SSEB and the PEB, which resulted in the following ranking of proposals:

Best Value Ranking

Rank | Offeror
1    | Serco
2    | GD
3    | NG
4    | GTS
5    | BAE
6    | DRS
7    | CGI

Id. at 43.

With regard to its decision that five proposals represented a better value than the protester’s, the SSAC explained as follows:

In non-price factor comparison, Serco was the only Offeror to exceed contract requirements in [deleted], and thus ranked superior to DRS based on the assessment of [deleted] in Factors 1 and 2 and high expectation of successful performance in Factor 3, compared to the absence of strengths in DRS’s Factor 2 proposal. As Serco’s evaluated price was also lower than DRS, the SSAC determined that Serco represented a better value to the Government than DRS.

 

In the non-price comparison to GD, both Offerors were rated as good in Factor 1. The SSAC found a discriminating advantage for DRS in Factor 1, where DRS’ major strength and three minor strengths compared to GD’s [deleted]. Both Offerors exceeded requirements; both Offerors had [deleted] in the manufacturing plan and factory acceptance test areas. However, DRS was assessed with a major strength in risk management and a minor strength in its IMS, while GD was assessed with [deleted] in risk management and its IMS was determined to be [deleted]. The SSAC determined GD and DRS to be essentially equal in Factors 2 and 3 where neither had [deleted] in Factor 2, and the Factor 3 past performance referenced by both Offerors was assessed by the SSAC as giving a high expectation of success.

 

Despite the slight non-price superiority of DRS's proposal in Factor 1, DRS's evaluated price is 20.8 percent higher than GD's evaluated price, and the SSAC considers GD's lower evaluated price to be of greater value to the Government than the difference between the proposals' elements in Factor 1 where GD's Factor 1 proposal exceeded requirements to a lesser extent than DRS'. The SSAC determined that the evaluated price premium is so significantly high that it diminished the value of DRS' superiority in Factor 1. The SSAC therefore considered DRS's proposal to be of lesser value overall than GD's proposal.

 

As documented above in the non-price comparison, DRS’ proposal was superior to NG’s in every factor except Factor 4, the least important. In comparison to NG, DRS’ Good Factor 1 was assessed with strengths across four areas and Substantial Confidence Factor 3 had a high expectation of successful performance; NG’s Acceptable Factor 1 had [deleted] and Satisfactory Confidence in Factor 3 had a reasonable expectation of successful performance. However NG’s proposal contains the lowest evaluated cost and DRS contains the second highest with a premium of 49.8 percent over NG’s. The SSAC did not consider DRS’ non-price benefits of sufficient value to warrant the 49.8 percent evaluated price premium over NG. The SSAC determined that the evaluated price premium significantly diminished the value of DRS superiority in the non-price factors. The SSAC therefore considers DRS’s proposal to be of lesser value to the Government than NG’s.

 

Similarly, in comparison to GTS, DRS' Good Factor 1 was assessed with strengths across four areas and Substantial Confidence in Factor 3 had a high expectation of successful performance; GTS' Acceptable Factor 1 had [deleted], and Satisfactory Confidence in Factor 3 had a reasonable expectation of successful performance. However, GTS's proposal contains the second lowest evaluated cost, and DRS' contains the second highest with a premium of 36.8 percent over GTS. Based on the non-price comparison detailed above, the SSAC did not consider DRS' non-price benefits of sufficient value to warrant the proposed premium between the proposals. The SSAC determined that the evaluated price premium significantly diminished the value of DRS's technical superiority in the non-price factors. The SSAC therefore considers DRS' proposal to be of lesser value to the Government than GTS'.

 

While DRS was ranked superior in non-price technical Factors, its evaluated price presented a premium of 18.5 percent over BAE. DRS' single major and three minor strengths across multiple elements in Factor 1, and high expectation of successful performance in Factor 3 compares to BAE's [deleted] in Factor 2 and reasonable expectation of successful performance in Factor 3. Considering the equal relative importance of Factors 2 and 3, where both Offerors exceed contract requirements in one factor, the technical discriminator between BAE and DRS arises in Factor 1, where BAE's proposal did not rise above an acceptable level despite its [deleted]. However, the SSAC determined that the evaluated price premium between the proposals was so high as to diminish the value of DRS' superiority in Factor 1 and therefore the discriminators of DRS' proposal did not warrant an 18.5 percent price premium. The SSAC concluded that DRS' proposal was of lesser value to the Government than BAE's.

Id. at 48-49.
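The price premiums the SSAC cites can be roughly reproduced from the rounded evaluated prices shown in the ratings table above. The sketch below (figures in $ millions) is illustrative only; the SSAC evidently worked from unrounded evaluated prices, so two of the results differ slightly from its stated percentages:

```python
# Rough check of the SSAC's price-premium arithmetic using the rounded
# evaluated prices from the ratings table (in $ millions). Because the
# table prices are rounded, the NG and GTS results differ slightly from
# the SSAC's stated 49.8 and 36.8 percent figures.
evaluated_price = {"DRS": 128, "GD": 106, "NG": 85, "GTS": 94, "BAE": 108}

def premium(base: str, other: str) -> float:
    """Percentage by which `base`'s evaluated price exceeds `other`'s."""
    return round((evaluated_price[base] / evaluated_price[other] - 1) * 100, 1)

print(premium("DRS", "GD"))   # 20.8 -- matches the SSAC's 20.8 percent
print(premium("DRS", "BAE"))  # 18.5 -- matches the SSAC's 18.5 percent
print(premium("DRS", "NG"))   # 50.6 -- vs. the SSAC's 49.8 percent (rounding)
print(premium("DRS", "GTS"))  # 36.2 -- vs. the SSAC's 36.8 percent (rounding)
```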

The SSAC recommended that despite the solicitation language indicating that the agency intended to award up to three contracts, award be made to the five top‑ranked offerors. In conjunction with its decision to increase the number of awards, the agency decided to change its strategy for placing delivery orders with the selected contractors. Specifically, the agency decided that it would not place 3‑4 orders of larger quantities of CANES units annually as originally planned, but rather it would issue more orders for smaller quantities of CANES units in order to achieve more competition on a per order basis. In the foregoing connection, the SSAC explained as follows:

Although the RFP indicated intent to award 3 contracts, the SSAC believes that given the best value outcome of the source selections, award to more than three is more advantageous to eliminate risk to delivery order competition that would result if all three Offerors do not bid on every order.

 

The SSAC believes that executing awards to five (5) Offerors provides best value for the Government, given the result of the evaluation. The SSAC believes that the amount of CANES work to be procured over the life of this contract, as estimated in Factor 2 of the RFP, will support five (5) awardees. The Government will change its DO [delivery order] strategy to provide continuous opportunity, for example by breaking up requirements into 10-12 DOs annually vice 3-4 annually as originally conceived. This should ensure competition even if one or two awardees decide not to bid on a particular order.

Id. at 51.

The source selection authority (SSA) essentially adopted the SSAC's findings. In agreeing with the SSAC's recommendation that DRS not receive an award due to its higher price as evaluated at the 15 unit ordering level, the SSA noted in relevant part as follows:

The best value tradeoff analysis and resulting recommendation from the SSAC leads to my decision not to award to DRS and CGI based on the extent of the price premium associated with their proposals; compared to the other offerors, and considering the particular technical discriminators I have found that DRS’ and CGI’s prices are so significantly high as to diminish the technical merit of their proposals.

Source Selection Decision Document, Aug. 6, 2014, at 8.

The agency awarded contracts to the five selected firms on August 20. DRS timely requested a debriefing, which the agency furnished on August 28. DRS protested to our Office on September 2.[12]

DISCUSSION

DRS raises a number of challenges to the agency’s evaluation of proposals. The protester contends that the agency failed to conduct a price realism analysis and to recognize that three offerors had proposed unbalanced prices. DRS also argues that the agency engaged in unequal discussions and failed to acknowledge strengths in its proposal pertaining to the technical approach and small business utilization/commitment factors. We discuss these allegations below.

Price Realism

DRS argues that the agency failed to perform a meaningful price realism analysis, as required by the terms of the RFP. The protester contends that the agency’s failure to perform a meaningful realism analysis is unfair to offerors with realistic pricing and exposes the agency to significant risk. In response, the agency maintains that the RFP neither contemplated, nor required, a price realism analysis, and that it did in fact perform a price analysis that established the realism of offerors’ prices.

As previously noted, the solicitation provided that proposals that were “unrealistically high or low in terms of price” might be “deemed to be reflective of an inherent lack of technical competence, or indicative of a failure to comprehend the complexity and risks of the proposed work” and might “be grounds for rejection of the proposal.” RFP at 85. We have previously held that where a solicitation advises offerors that unrealistically low prices may serve as a basis for rejection of a proposal, it is implicit that the agency will consider whether offerors’ prices are in fact unrealistic. Esegur-Empresa de Segurança, SA, B-407947, B-407947.2, Apr. 26, 2013, 2013 CPD ¶ 109 at 4. In other words, where a solicitation advises that unrealistically low prices may serve as a basis for rejection of a proposal, the agency must perform a price realism analysis. Logistics 2020, Inc., B-408543, B‑408543.3, Nov. 6, 2013, 2013 CPD ¶ 258 at 8.

The nature of the analysis required to assess whether an offeror’s price is so unrealistically low as to reflect a lack of technical competence or understanding is within the agency’s discretion, however. AMEC Earth & Envtl., Inc., B-404959.2, July 12, 2011, 2011 CPD ¶ 168 at 8. Agencies may use a variety of price evaluation methods to assess realism, including a comparison of prices received to one another, to previously proposed or historically paid prices, or to an independent government estimate. General Dynamics--Ordnance & Tactical Sys., B-401658, B‑401658.2, Oct. 26, 2009, 2009 CPD ¶ 217 at 3.

In an attachment to its Business Clearance Memorandum (BCM), the agency assessed the realism of the prices proposed here through comparison with the prices paid under the predecessor contract for limited deployment units.[13] The supporting price analysis noted that the historical prices had been proposed "under significantly different contractor risks and materially differing terms and conditions tha[n] those in the current CANES build-to-print effort." BCM, Att. 9 (Supporting Price Analysis, July 21, 2014), at 1. Among the differences were that the preceding contractor was responsible for developing and controlling the CANES Production Baseline and meeting the CANES Functional Specification requirements; for end-of-life product replacements and updates to the bill of materials; and for completing first article testing while simultaneously building CANES production units. Id. at 1-2. According to the agency, "[t]hese risks do not exist under [the current RFP], which is a build-to-print production effort in accordance with a Government provided, Government controlled PBL, shifting much of the overall risk to the Government." Id. at 2. Another difference impacting pricing noted by the agency was that the unit prices under the preceding contract included the price of all network software necessary to meet the CANES functional specification requirements, whereas under the instant RFP, some of the software is to be provided by the government.

After adjusting the historical unit price for a single DDG unit downward to take into account the cost of the software noted above, the agency compared the adjusted price to the prices proposed by the offerors here for a single DDG unit. The agency found that six of the seven offerors had proposed prices less than the historical price as adjusted, and that the proposed prices were “within 14 percentage points of each other in a range from 18.5% to 32.5% below” the historical price. Id. The agency concluded that this “range of proposed pricing” was “reasonable” given the different contract risks. The agency also compared the average adjusted historical price for quantities 1 to 15 to the average proposed prices of the offerors here, and concluded that “[n]one of the differences from the adjusted historical average are so large as to be considered unreasonable given the difference in risk to the contractor under the current RFP requirements compared to those risks relevant to the FY 13 historical prices.” Id. at 5. While the protester disputes several aspects of the agency’s analysis,[14] we think that the comparisons performed provided the agency with a reasonable basis for concluding that none of the offerors here proposed prices that were so unrealistically low as to reflect a lack of technical competence or understanding.

Unbalanced Pricing

DRS further argues that the Navy unreasonably failed to recognize that [deleted] had proposed unbalanced prices. In support of its argument that the pricing of the preceding three offerors was unbalanced, the protester points to the significant disparity between their unit prices for an order quantity of 15 versus their unit prices for an order quantity of 1.

The protester’s argument lacks merit. Unbalanced pricing exists where the prices of one or more line items are significantly overstated, despite an acceptable total evaluated price (typically achieved through under pricing of one or more other line items). Inchcape Shipping Servs. Holding, Ltd., B-403399.3, B-403399.4, Feb. 6, 2012, 2012 CPD ¶ 65 at 4. The protester has not demonstrated--indeed, it has not even alleged--that any of [deleted] prices are overstated. Moreover, there is little, if any, risk that award to any of three will result in the government paying unreasonably high prices for contract performance (which is the risk inherent in overstated prices) since in order to prevail in the delivery order competitions, the contractors will have to propose prices lower than their competitors’ prices.

Unequal Discussions

DRS argues that the agency conducted non-meaningful and unequal discussions by advising BAE and GTS that their initial prices were noncompetitive, but failing to notify it that its pricing was “significantly high.” Protester’s Comments and Supplemental Protest, Oct. 14, 2014, at 3.

The protester’s allegation that the agency conducted non-meaningful discussions by failing to advise it that its pricing was high is without merit. While it is true that where an agency conducts discussions, those discussions must be meaningful, Unisys Corp., B-406326 et al., Apr. 18, 2012, 2012 CPD ¶ 153 at 5, and for discussions to be meaningful, an agency must advise an offeror if its proposed price is considered to be unreasonably high, DeTekion Security Sys., Inc., B-298235, B‑298235.2, July 31, 2006, 2006 CPD ¶ 130 at 15, where an offeror’s price is not viewed as unreasonable, an agency need not advise it that its price is higher than those of its competitors for discussions to be meaningful. Lyon Shipyard, Inc., B‑407771.2, July 15, 2013, 2013 CPD ¶ 173 at 4; L-3 Sys. Co., B-404671.2, B‑404671.4, Apr. 8, 2011, 2011 CPD ¶ 93 at 15. Here, the evaluators found both the protester’s initial and final proposed prices to be reasonable. PEB Report, Feb. 26, 2014, at 118; PEB Report, July 21, 2014, at 43. Accordingly, there was no requirement that the agency advise DRS that its price was higher than the prices of a number of its competitors.

Regarding the protester’s further argument that the agency engaged in unequal discussions by notifying BAE and GTS that their initial prices were noncompetitive, but failing to advise it that its price was out of line with the competition, the agency explains that while it considered BAE and GTS’s initial proposed prices to be so high as to make contract award unlikely, it did not find DRS’s price to be “‘so significantly high’ as to make it noncompetitive [without regard to [the protester’s] technical proposal.” Agency Report, Oct. 23, 2014, at 4. In the foregoing connection, the initial evaluated prices of BAE and GTS were [deleted] and [deleted] million, respectively, whereas the initial prices of other five offerors were [deleted] (Serco), [deleted] (GD), [deleted] (NG), [deleted] (DRS), and [deleted] million (CGI).[15] PEB Report, Feb. 26, 2014, at 22. After the agency notified BAE and GTS during discussions that their pricing was so high as to render their proposals not competitive and make contract award to them unlikely, both BAE and GTS dramatically reduced their proposed prices--BAE from [deleted] to $108 million, and GTS from [deleted] to $94 million. NG also reduced its final price significantly (from [deleted] to $85 million). PEB Report, July 21, 2014, at 19. As a result of the significant decreases in these offerors’ prices (as well as an increase in the protester’s proposed price from [deleted] to $128 million), DRS’s final evaluated price became second high.

The agency’s failure to advise DRS that its price was significantly higher than the prices of several of its competitors did not constitute unequal discussions. While the protester’s initial price was higher than the prices of three of its competitors, it was not so high as to be clearly noncompetitive (in contrast to BAE and GTS’s prices); thus, we fail to see that the agency treated the protester unequally by failing to notify it during discussions that its price was too high. We recognize that the protester contends in the foregoing connection that its price was noncompetitive by virtue of being higher than the prices of three of its competitors given that at the time the agency conducted discussions, it intended to make only up to three awards, but we are not persuaded by this argument. The mere fact that the protester’s initial proposed price was not among the three lowest does not show that it was noncompetitive such that contract award was unlikely, particularly where, as here, the price differential between the protester’s proposal and the next lowest-priced proposal--which was also lower-rated technically than the protester’s proposal--was only approximately 6 percent.

Further, the agency was not required to reopen discussions with DRS after FPRs were received due to the drastic reductions in other offerors’ prices, which resulted in the protester’s price becoming second highest. According to the agency, it was only after it weighed the technical merits of DRS’s proposal against the proposal’s price as part of the price/technical tradeoff process that it considered DRS’s price to be high; that is, the protester’s price was not viewed as noncompetitive until the agency conducted its tradeoff analysis. Under such circumstances, reopening of discussions was not required. See Booz, Allen & Hamilton, Inc., B-249236.4, B‑249236.5, Mar. 5, 1993, 93-1 CPD ¶ 209 at 7.

Technical Evaluation

DRS also challenges the agency’s evaluation of its proposal under the technical approach factor, arguing that the Navy unreasonably failed to acknowledge as a strength under the factor its offering of a 2-year workmanship warranty. The protester also argues that the agency misevaluated its proposal under the small business utilization and commitment factor.

In reviewing protests of an agency’s evaluation of offerors’ technical proposals, our Office does not reevaluate proposals; rather, we review the evaluation to determine if it was reasonable, and consistent with the solicitation’s evaluation scheme, as well as applicable procurement statutes and regulations. Athena Tech. Group, Inc., B‑409984, Sept. 11, 2014, 2014 CPD ¶ 273 at 4.

With regard to the protester’s warranty argument, the RFP instructed offerors that in describing their technical approaches to producing a CANES DDG unit under factor 1, they should furnish detail regarding their “tools, techniques and procedures for ensuring the CANES DDG unit is delivered on schedule and without defect.” RFP at 66. In its technical proposal, DRS stated it would provide “a 2-year workmanship warranty to ensure [the customer’s] complete satisfaction with the quality of the DRS Team’s work on the CANES DDG production unit.” Protest, Ex. A, at I-1-22. The protester contends that its offering of this warranty exceeded the RFP’s requirements in a way advantageous to the agency, and should thus have been acknowledged as a strength.[16] DRS argues that the identification of another strength in its proposal under factor 1 would have had an impact on the agency’s price/technical tradeoff analysis.

In response, the agency explains that it did not regard DRS’s offering of a 2-year workmanship warranty as a strength because the protester’s proposal did not furnish any detail regarding the warranty’s terms and conditions, leaving the SSEB without the information it needed to assess the benefit of the proposed warranty to the government. The agency further explained that because the warranty merely ensured corrective measures in the event the contractor failed to meet the government’s requirement for defect-free delivery, it did not represent a surpassing of the government’s requirements. The protester disputes the Navy’s explanations, arguing that the agency should have regarded the proposed warranty as a strength despite the lack of detail as to its terms because the guarantee of defect-free delivery was “patently beneficial” to the government. Protester’s Comments, Oct. 14, 2014, at 35. We find the agency’s explanation reasonable. It is the responsibility of the offeror to submit an adequately written proposal that establishes its capability and the merits of its proposed technical approach in accordance with the evaluation terms of the solicitation, TPMC EnergySolutions Envtl. Servs., LLC, B-406183, Mar. 2, 2012, 2012 CPD ¶ 135 at 6, and an agency may reasonably decline to credit a proposal where, due to a lack of detail, the benefit of a proposed feature cannot be ascertained. Moreover, we find persuasive the agency’s argument that, in any event, the warranty did not represent a surpassing of the agency’s requirements.

As noted above, the protester also disputes the evaluation of its proposal under the small business utilization and commitment factor, arguing that the proposal deserved a rating of good or better for the factor. DRS contends that the agency unreasonably concluded that its risk of unsuccessful performance under the factor was moderate because “[a]lthough the Offeror’s Small Business Subcontracting Plan met or exceeded the [Navy] goals as stated in the RFP, the proposal did not clearly demonstrate that the Offeror has met subcontracting goals on previous contracts since the statistics provided were for the Prime on the relevant contracts, not the Offeror.”[17] DRS Debriefing, Aug. 28, 2014, at 33; SSEB Report, June 30, 2014, at 35. The protester argues that the finding that it provided statistics regarding the prime contractor’s compliance with small business subcontracting goals, as opposed to its own compliance, was incorrect.[18]

The protester’s argument is misplaced since it is premised on a misunderstanding as to the statistics to which the agency was referring. It was not information regarding the small business utilization percentages achieved by the protester under the contracts in question that the agency found to be lacking; rather, it was information regarding the small business utilization targets themselves. That is, the protester furnished information regarding the prime contractor’s small business utilization targets for the contracts, without establishing that these goals had been relevant to it as a subcontractor. As explained by the agency, “[s]ince the subcontracting goals are at the prime contract level (. . .), and the Offeror did not explain how these were relevant to DRS’ small business subcontracting effort, the SSEB could not validate the extent to which DRS met or exceed[ed] its small business subcontracting goals . . .”. Id. at 36.

In seeking to rebut the agency’s explanation, the protester contends that the agency should have assumed that the prime contractor’s small business utilization targets flowed down to it as a subcontractor; the evaluators, however, clearly did not regard such an assumption as warranted. We have no basis upon which to question the reasonableness of the evaluators’ judgment in this regard.

The protest is denied.

Susan A. Poling
General Counsel



[1] The five awardees are BAE Systems Technology Solutions & Services, Inc., of Rockville, Maryland; General Dynamics C4 Systems, of Taunton, Massachusetts; Global Technical Systems, of Virginia Beach, Virginia; Northrop Grumman Systems Corp., of Herndon, Virginia; and Serco Inc., of Reston, Virginia.

[2] According to the solicitation, CANES architecture on a unit level platform may have between 15 and 18 server, switch, media, and data storage racks; support approximately 80 applications on server hosted and connected systems; and service between 106 and 126 wireless and hard-wired workstations, plus peripheral devices. Force level platforms may have between 38 and 49 server, switch, media, and data storage racks; support approximately 120 software applications; and include anywhere from 450 to 960 workstations, plus peripheral devices. RFP (Rev. 11) at 66-67.

[3] The solicitation indicated that fair opportunity competitions for the delivery orders would “employ low-price-technically-acceptable selection criteria for fixed‑priced orders, unless criteria based on a cost technical trade-off is determined to be in the Government’s best interest.” Id. at 17.

[4] Other CLINs and their associated price ceilings were as follows:

CLIN    DESCRIPTION                                CEILING AMOUNT
0002    Contract Data Requirements List (CDRL)     Not Separately Priced (NSP)
0003    CANES Spares/System Components             $96,000,000
0004    CDRL                                       NSP
0005    CANES Analysis & Assessment (CPFF)         $4,000,000
0006    CANES Analysis & Assessment (FFP)          $4,000,000
0007    CANES Analysis & Assessment--ODCs          $2,000,000
0008    CDRL                                       NSP


AR at 6.

[5] The chart further indicated that for FYs 2018, 2019, 2020, 2021, and 2022, anticipated acquisitions of CANES units were 46, 40, 20, 7, and 5 respectively.

[6] A delivery order for one DDG CANES production unit was to be issued to each awardee at the time of contract award. Id. at 65.

[7] The solicitation explained that an offeror would receive a rating of pass under the brand name or equal factor if it submitted “an explicit statement indicating intent to produce the DO [delivery order] 0001 CANES DDG unit using the brand name components listed in the CANES PBL [production baseline],” or if “the Government determines that proposed ‘equal’ component(s) meet the requirements of ‘equal’ as set forth in Section L, Attachment L-4.” Id. at 87.

[8] Under factor 4, a rating of neutral was also possible.

[9] In other words, offerors were to provide an NTE price per unit for a quantity of 1, an NTE price per unit for a quantity of 2, an NTE price per unit for a quantity of 3, and so on up to a quantity of 15.

[10] The RFP indicated that 15 units was the agency’s best estimate of the CANES DDG quantities expected to be procured during each of the four calendar years.

[11] Evaluated price is rounded to nearest million dollars. The evaluated cost for the CLIN 0005 analysis and assessment work constituted a minuscule portion (i.e., less than 0.5 percent) of the overall evaluated prices; that is, overall evaluated prices were based almost entirely on DDG unit prices.

[12] The other offeror not selected for award, CGI, also protested to our Office. We address its arguments in a separate decision.

[13] While the agency did not characterize its comparison of current prices to historical prices as a “realism” analysis, the analysis focused on whether offeror prices were too low; as acknowledged by the protester, see Protester’s Comments, Oct. 14, 2014, at 15, an analysis that focuses on whether offeror prices are too low, as opposed to unreasonably high, is in essence a realism analysis. The Matthews Group, Inc. t/a TMG Constr. Corp., B-408003.3, B‑408004.3, Mar. 21, 2014, 2014 CPD ¶ 104 at 8‑9.

[14] DRS contends that the agency should have performed the comparison using offerors’ unit prices for an order quantity of 15 rather than their average unit price. The protester also argues that as part of its realism analysis, the agency should have considered whether offeror pricing reflected consistency with the solicitation instruction that the agency would provide certain government-furnished-material (GFM) for inclusion in the delivery order 001 DDG unit, but that offerors should account for the cost of procuring comparable material in their NTE pricing. While we do not address the foregoing arguments in detail in this decision, we considered them and find that neither argument provides a basis for sustaining the protest.

[15] The prices are rounded to the nearest million.

[16] The RFP defined a proposal “strength” as “an aspect of the Offeror’s proposal that has merit or exceeds specified performance or capability requirements in a way that will be advantageous to the Government during contract performance.” RFP at 91.

[17] The RFP provided that under the small business utilization and commitment factor, the agency would evaluate the extent to which the offeror demonstrated a commitment to meet the small business subcontracting target percentages set forth in the solicitation; the extent to which the offeror’s management approach enhanced its small business subcontractors, demonstrated that the tasks assigned to them were meaningful, and resulted in broadening the subcontractors’ technical capabilities; and the extent to which the offeror had met small business subcontracting goals on the relevant contracts submitted under the past performance factor. RFP at 87.

[18] DRS was a subcontractor on each of the three contracts that it cited under the past performance factor.