DynCorp International, LLC

B-412451; B-412451.2: Feb 16, 2016



DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  DynCorp International, LLC

File:  B-412451; B-412451.2

Date:  February 16, 2016

Paul Hurst, Esq., Patrick F. Linehan, Esq., Michael J. Navarre, Esq., and Nicholas Petts, Esq., Steptoe & Johnson LLP, for the protester.
Richard B. O’Keeffe, Jr., Esq., William A. Roberts, III, Esq., and Gary S. Ward, Esq., Wiley Rein LLP, for URS Federal Services, Inc., the intervenor.
Debra J. Talley, Esq., and Patrick G. Nelson, Esq., Department of the Army, for the agency.
Heather Weiner, Esq., and Jonathan L. Kang, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1.  Protest challenging the agency’s evaluation of the protester’s and awardee’s technical proposals is denied where the evaluation was reasonable, consistent with the stated evaluation criteria, reflected equal treatment of the offerors, and was adequately documented.

2.  Protest challenging the agency’s evaluation of the protester’s past performance is denied where the evaluation was reasonable and consistent with the stated evaluation criteria.

3.  Protest challenging the agency’s price realism analysis is denied where, even if the evaluation was flawed, the protester cannot demonstrate that it would have been prejudiced by the agency’s actions.

4.  Agency’s selection of a lower-rated, lower-priced proposal for award is unobjectionable where the agency’s tradeoff decision was reasonable, and where the agency adequately documented its tradeoff rationale.

DECISION

DynCorp International, LLC, of Fort Worth, Texas, protests the award of a contract to URS Federal Services, Inc., under request for proposals (RFP) No. W58RGZ-14-R-0270, which was issued by the Department of the Army, Army Contracting Command, Redstone, for aviation field maintenance support services.  DynCorp challenges the Army’s evaluation of the offerors’ technical proposals, the protester’s past performance, and URS’s price/cost proposal, and also argues that the best value tradeoff and source selection was unreasonable.

We deny the protest.

BACKGROUND

On February 4, 2015, the Army issued the RFP, which sought aviation maintenance support for aviation reset (i.e., restoring aircraft for redeployment), installation of modification work orders, and auxiliary maintenance support, for the geographical region known as Regional Aviation Sustainment Area Management-Central (RASM-C).[1]  RFP at 2.  The solicitation anticipated the award of a hybrid time-and-materials, cost-reimbursement, and fixed-price contract, for a base year and a 60‑day transition-in period.  Id.

The solicitation provided for award on a best-value basis, considering the following factors:  (1) technical, (2) past performance, (3) small business plan, and (4) price/cost.  RFP at 105-06.  The technical factor included the evaluation of the following subfactors, in descending order of importance:  (a) quality, (b) performance management, (c) regional/site management, and (d) transition‑in/transition-out.  Id. at 105.  The quality subfactor included the evaluation of the following seven quality elements:  (i) certifications, (ii) quality management reviews, (iii) Aviation Field Maintenance (AFM) site AS9110 Rev B certification,[2] (iv) ground and flight risk procedures, (v) flight and ground operating procedures, (vi) safety management program implementation and sustainment plan, and (vii) environmental management system.  Id. at 106.  For purposes of evaluating the quality subfactor, the solicitation included a quality evaluation matrix, as Exhibit B to the RFP, which defined the adjectival ratings for each of the seven elements under this subfactor.  See RFP, exh. B, Quality Evaluation Matrix, at 1-5. 

For purposes of award, the technical factor was more important than the past performance factor.  RFP at 105.  The past performance factor was more important than the price/cost factor.  Id.  The price/cost factor was significantly more important than the small business factor.  Id.  When combined, the three non-price/cost factors were more important than the price/cost factor.  Id.  The solicitation also advised that proposals must receive a rating of at least acceptable for the technical factor and all technical subfactors to be considered for award.  Id.

As relevant here, the solicitation required that offerors provide current and approved flight operating procedures (FOPS) and ground operating procedures (GOPS) in accordance with Defense Contract Management Agency (DCMA) Instruction 8210.1C, also referred to as Army Regulation (Reg.) 95-20.  Id. at 111.  This instruction establishes requirements for flight and ground operations involving contracted work performed on aircraft, as well as policy and procedures to be followed by government flight representatives.  Army Reg. 95-20.  It also describes the content of the contractor’s ground and flight operations procedures and approval for the procedures.  Id.

The Army received proposals from six offerors, including DynCorp and URS.  Agency Report (AR) at 2.  Following the evaluation of proposals, the contracting officer established a competitive range of all six offerors.  Id.  The agency then conducted discussions with the offerors and requested final proposal revisions (FPRs) by August 11.  Id. at 3. 

The agency received FPRs from all six offerors.  Id.  The agency evaluated DynCorp’s and URS’s FPRs as follows: [3]

 

                                          DYNCORP                 URS

TECHNICAL OVERALL                         GOOD                    ACCEPTABLE
  Quality                                 Outstanding             Good
    Certifications                        Outstanding             Outstanding
    Quality Management Reviews            Outstanding             Outstanding
    AS9110 Rev B Certification            Outstanding             Outstanding
    Ground & Flight Risk Procedures       Outstanding             Acceptable
    Flight & Ground Operating Procedures  Outstanding             Acceptable
    Safety Management Program             Outstanding             Outstanding
    Environmental Mgmt System             Outstanding             Outstanding
  Performance Mgmt, Reg/Site Mgmt,
    Transition                            Acceptable              Acceptable
PAST PERFORMANCE                          RELEVANT/SATISFACTORY   RELEVANT/SATISFACTORY
SMALL BUSINESS PARTICIPATION              ACCEPTABLE              ACCEPTABLE
COST/PRICE                                $49,434,057             $43,936,903


AR, Tab 63, Source Selection Decision Document (SSDD), at 5.

Based on the evaluation of the offerors’ proposals, the source selection authority (SSA) concluded that URS’s FPR provided the best value under the terms of the solicitation.  Id. at 10-11.  In comparing DynCorp’s and URS’s proposals, the SSA acknowledged that DynCorp’s proposal was more highly rated under the non-price factors.  The SSA noted that the protester’s proposal “received a Good rating in the Technical Factor and a rating of Outstanding in the Quality Sub-Factor as compared to [URS] which received an Acceptable rating in the Technical Factor and a Good rating in the Quality Sub-factor.”  Id. at 6.  In addition, the SSA noted that DynCorp “has ratings of Outstanding in all seven elements under the Quality Sub‑Factor whereas [URS] has ratings of Outstanding in only five of the seven elements.”  Id.  With regard to the remaining two elements, the SSA explained that URS “received Acceptable ratings . . . as a result of having 2007 FOPs/GOPs (versus current 2013 procedures).”  Id.

In addition, the SSA noted that for the quality subfactor of the technical evaluation factor, DynCorp’s proposal received “an additional strength . . . for current and approved FOPS/GOPS in an area that is identical to the PWS.”  Id.  The SSA stated, “[i]n comparison, [URS] has an additional strength in the Quality Sub-Factor for the AS9110B certification plan which was [DELETED].”  Id.  The SSA noted that both offerors received equal past performance ratings and an acceptable rating in the small business plan factor.  Id. at 7. 

Overall, the SSA concluded that, while DynCorp received “a better rating in the Technical Factor (as a result of having FOPS/GOP[S] written to the current AR 95‑20 2013 version versus [URS’] FOP/GOPS written to the 2007 version) . . . the more favorable Technical rating does not warrant a $5.5 [million] (12.5%) price premium.”  Id. at 6-7.  The SSA explained that “[t]he primary difference between AR 95-20 (2007) and AR 95-20 (2013) is the numbering of procedural paragraphs and minor content change.”  Id. at 7.  Accordingly, the Army concluded that URS’s FPR offered the best value to the government, and awarded the contract to that firm.  Id. at 10‑11.  On November 4, DynCorp received a debriefing.  This protest followed.

DISCUSSION

DynCorp argues that the Army’s evaluation of its and URS’s proposals was flawed, and that the award decision was therefore unreasonable.  The protester raises six main arguments:  (1) the agency failed to credit DynCorp’s proposal with strengths under three of the subfactors of the technical evaluation factor; (2) the agency unreasonably evaluated URS’s proposal as outstanding under the first element of the quality subfactor; (3) the agency evaluated DynCorp’s and URS’s technical proposals in an unequal manner; (4) the agency unreasonably evaluated DynCorp’s past performance; (5) the agency unreasonably evaluated the realism of URS’s proposed price; and (6) the agency’s best value tradeoff failed to adequately compare the benefits of each proposal.  Although our decision does not address all of DynCorp’s arguments in detail, we have fully considered each of them and find that none provides a basis to sustain the protest.

In reviewing protests challenging an agency’s evaluation of proposals, our Office will not reevaluate proposals nor substitute our judgment for that of the agency, as the evaluation of proposals is generally a matter within the agency’s discretion.  Del‑Jen Educ. & Training Grp./Fluor Fed. Sols. LLC, B-406897.3, May 28, 2014, 2014 CPD ¶ 166 at 8.  Rather, we will review the record to determine whether the agency’s evaluation was reasonable; consistent with the stated evaluation criteria, applicable procurement statutes, and regulations; and adequately documented.  Shumaker Trucking & Excavating Contractors, Inc., B‑290732, Sept. 25, 2002, 2002 CPD ¶ 169 at 3.  An offeror’s disagreement with an agency’s judgment, without more, is insufficient to establish that the agency acted unreasonably.  Birdwell Bros. Painting & Refinishing, B-285035, July 5, 2000, 2000 CPD ¶ 129 at 5.  In addition, it is an offeror’s responsibility to submit an adequately written proposal that establishes its capability and the merits of its proposed technical approach in accordance with the evaluation terms of the solicitation.  Carolina Satellite Networks, LLC; Nexagen Networks, Inc., B-405558 et al., Nov. 22, 2011, 2011 CPD ¶ 257 at 4.

Evaluation of DynCorp’s Technical Proposal

DynCorp challenges the Army’s evaluation of its proposal under the technical factor, arguing that the agency improperly failed to recognize additional strengths under three of the subfactors of the technical evaluation factor:  (1) performance management, (2) regional site management, and (3) transition-in/transition-out.  Specifically, the protester contends that, in evaluating DynCorp’s proposal, “the TET identified a number of advantages and benefits in DynCorp[’s] technical plans demonstrating their effectiveness and efficiency, such as ‘sustained savings’ and ‘reduced risk,’” but unreasonably concluded that none of these benefits merited the assignment of strengths to DynCorp’s proposal.  Protester’s Comments (Dec. 18, 2015) at 4.  The protester asserts that, had the agency properly acknowledged these strengths, its proposal would have received higher than acceptable ratings under these three subfactors.[4]  The agency responds that it reasonably evaluated the information in DynCorp’s proposal and concluded that these aspects of the protester’s proposal did not exceed the requirements of the RFP such that they merited strengths.  We address one representative example of the protester’s arguments below, concerning the performance management subfactor, and conclude that neither this example nor any of the remaining arguments has merit.

As relevant here, a rating of “acceptable” was defined as follows: 

The proposal meets requirements and indicates an adequate approach and understanding of the requirements.  Strengths and weaknesses are offsetting or will have little or no impact on contract performance.  Risk of unsuccessful performance is no worse than moderate.[5]

AR, Tab 54, SSA Final Evaluation Briefing, at 59.  In addition, the agency defined “strength” as “[a]n aspect of an offeror’s proposal that has merit or exceeds specified performance or capability requirements in a way that will be advantageous to the Government during contract performance.”  Id. at 60. 

As relevant here, under the performance management subfactor, the RFP stated that the agency would evaluate an offeror’s “technical methodology for accomplishing the specific requirements of the Performance Work Statement (PWS) . . . .”  RFP at 107.  In addition, the solicitation stated that an offeror’s performance management plan would be evaluated based on whether the technical methodology “demonstrates an understanding of the processes and functions to perform the requirements of the individual missions . . . .”  Id.

In evaluating DynCorp’s proposal under this subfactor, the technical evaluation team (TET) found that DynCorp’s proposed [DELETED] to aviation maintenance demonstrated “an effective and efficient plan for performing the mission tasks relating to this sub‑element.”  AR, Tab 45, DynCorp Performance Management Evaluation Report, at 2.  Specifically, the TET stated:

The Offeror demonstrates an understanding of [DELETED] based on [DELETED].  The Offeror has [DELETED].  The Offeror’s MWO [modification work orders] representative staffing [DELETED].  The Offeror presents a technical methodology which demonstrates an understanding of how to efficiently conduct operations as outlined in the PWS.

Id.  With regard to DynCorp’s proposed approach to system controls, the TET explained the following: 

[DynCorp]’s technical methodology demonstrates an understanding of how to efficiently and effectively conduct supply management operations.  [DynCorp] addresses [DELETED].  [DynCorp] implements [DELETED].  The [DELETED] proposed support [DELETED].  [DynCorp’s] [DELETED] is unique to the AFMD/RASM-C program and addresses [DELETED].

Id. at 2.  The TET did not assign any strengths or weaknesses under this subfactor, stating:  “No aspect of this sub‑element, in [DynCorp’s] proposal, exceeds the specified requirements.”  Id. at 3.

DynCorp argues that its proposal deserved a strength for its proposed “[DELETED] to aviation maintenance,” as well as for its proposed system controls to account for [DELETED].  In support of this argument, the protester points to statements made by the technical evaluators, quoted above, that DynCorp’s proposed approach was “effective and efficient” and “unique to the AFMD/RASM‑C program.”  Protester’s Comments (Dec. 18, 2015), at 7.  The protester asserts that the evaluators’ statements demonstrate that the agency identified aspects of DynCorp’s proposal that benefit the Government, but failed to assess strengths to DynCorp’s proposal based on these benefits.  Id.

Based on this record, we find nothing unreasonable regarding the Army’s evaluation.  As detailed above, and as the protester acknowledges, the evaluation record reflects that the agency specifically acknowledged the aspects of DynCorp’s proposal challenged by the protester.  In this regard, the agency contends, and we agree, that although the evaluators found that DynCorp’s proposed approach demonstrates an “effective and efficient plan for performing” certain required tasks, and also described DynCorp’s [DELETED] plan as “unique to the AFMD/RASM‑C program,” these findings did not obligate the agency to assess strengths to DynCorp’s proposal.  AR, Tab 45, DynCorp Performance Management Evaluation Report, at 2.  Although the protester points to the evaluators’ statements regarding these aspects of DynCorp’s proposal as an indication that DynCorp’s proposal merited additional strengths, the protester has not demonstrated or otherwise explained how any of the noted aspects of DynCorp’s proposed technical approach in fact exceeded the solicitation’s requirements or otherwise met the definition of “strength.”  See AR, Tab 54, SSA Final Evaluation Briefing, at 60 (defining “strength” as “[a]n aspect of an offeror’s proposal that has merit or exceeds specified performance or capability requirements in a way that will be advantageous to the Government during contract performance”).  While DynCorp disagrees with the agency’s evaluation, such disagreement, without more, does not render the evaluation unreasonable or provide a basis to sustain the protest.  See Ben-Mar Enters., Inc., supra.

Evaluation of URS’s Technical Proposal

DynCorp next challenges the Army’s evaluation of URS’s technical proposal under the quality subfactor of the technical evaluation factor.  Specifically, the protester contends that the agency unreasonably assessed an outstanding rating to URS’s proposal under the first element of the quality subfactor (certifications).  DynCorp also asserts that the agency failed to evaluate DynCorp’s and URS’s technical proposals equally under the third element of the quality subfactor (AS9110 Rev B certification).  For the reasons discussed below, we find that the agency’s evaluation under the quality subfactor was reasonable.[6]

DynCorp first argues that the Army unreasonably assessed an outstanding rating to URS’s proposal under the first element of the quality subfactor (certifications), which required offerors to demonstrate actual and verifiable experience in obtaining “one or more of the following current certifications:  Aerospace Standard (AS) 9110 or AS9100.”  RFP at 106.  As relevant here, the RFP stated that the Army would assess an outstanding rating for an approved certification under AS9110B at a facility “performing rotary craft maintenance work nearly identical to the PWS.”  Id., exh. B, Technical (Quality) Evaluation Matrix, at 1.  The RFP stated that the agency would assess a good rating at a facility “performing work similar to the PWS.”  Id.

In response to this requirement, URS’s proposal stated the following:

At present, URS Army activity at Fresno, California is registered to both AS9110B Rev B Aerospace Quality Management System (AQMS) and ISO 9001:2008 Quality Management System (QMS).  This site supports the 1106th AVCRAD-TASMG as one of four depot-level aviation maintenance units within the Army National Guard with a responsibility of performing for limited depot aircraft maintenance, component repair, aviation intermediate maintenance (AVIM), and operation of a supply support activity (SSA), supporting various U.S. Army aircraft such as UH-60A/L, HH-60M/UH-60M and CH-47 aircraft. 

AR, Tab 66, URS Technical Proposal, at I-7.

In evaluating URS’s proposal under this element of the quality subfactor, the TET assessed an outstanding rating.  Specifically, in completing the Quality Evaluation Matrix, the TET noted that URS’s proposal met the criteria for an outstanding rating because the offeror demonstrated a “[f]acility performing rotary craft maintenance work nearly identical to the PWS.”  AR, Tab 65, URS Quality Evaluation Matrix, at 1.  In addition, in the TET’s Quality Management Evaluation Report, the evaluators explained their rationale for the outstanding rating as follows:

The Offeror provided current certifications:  (1) AS9110B, (1) AS9100C.  The Offeror also provided audit agency/approving office point of contact information for verification.  The provided certifications encompass 2 individual locations.  The one facility mentioned (operating under AS9110B) performs rotary craft maintenance work which is nearly identical to the requirements of the Performance Work Statement (PWS) . . . .  The Offeror indicates current and previous experience implementing the requirements of this section of the solicitation.  The Risk of unsuccessful performance is very low. 

AR, Tab 64, URS Quality Management Evaluation Report, at 2.

DynCorp argues that URS should have received a rating of good, rather than outstanding, under this quality subfactor element because the SSA Final Evaluation Briefing & Report, which was prepared by the source selection evaluation board (SSEB) based on the TET’s evaluation, states that the work URS is performing is only “similar” to the requirements of the RFP.  See AR, Tab 54, SSA Final Evaluation Briefing & Report, at 54 (“[URS] is AS9110 Rev B certified and is performing work under that certification that is similar at the 1106th AVCRAD in California.”).  DynCorp asserts that this statement reflects that the TET concluded that URS’s facility performed work only “similar to the PWS,” rather than “nearly identical to the PWS,” and therefore, per the definitions in the RFP, URS merited, at most, a good rating under this element.

The Army responds that the use of the word “similar” in the SSA briefing document was an inadvertent clerical mistake, and that, consistent with the TET’s underlying evaluation findings quoted above, it should have read “nearly identical.”  Supp. AR at 7.  In support of this position, the agency points to the SSDD, in which the SSA both acknowledged the outstanding rating for URS’s proposal and stated that “[URS] has performed work ‘nearly identical to the PWS’ for this requirement.”  AR, Tab 63, SSDD, at 10.  In this regard, the agency asserts that the SSA’s reference to URS’s facility as “nearly identical” demonstrates that, despite the statement in the SSA briefing document, the SSA correctly understood and agreed with the TET’s underlying evaluation, which found the work at URS’s facility “nearly identical” to the PWS. 

Based on our review of the record, we find the agency’s explanation persuasive.  Although the protester asserts that the TET actually intended to assign a good rating to URS’s proposal because the SSA briefing document stated that the awardee was performing work under a certification that was “similar to” the PWS, which was part of the evaluation criteria’s definition of “good,” the contemporaneous record does not support the protester’s position.  Instead, the record reflects, as discussed above, that the TET’s evaluation of URS under this element concluded that the URS facility “(operating under AS9110B) performs rotary craft maintenance work which is nearly identical to the requirements of the Performance Work Statement (PWS),” which the agency found “indicates current and previous experience implementing the requirements of this section of the solicitation.”  AR, Tab 64, URS Quality Management Evaluation Report, at 2.  In addition, we note that the SSA briefing document, despite its reference to “similar,” reflected the same outstanding rating for URS’s proposal under this element that the TET evaluators assigned--which adds further support to the agency’s explanation that both the TET and SSEB evaluators intended to evaluate URS’s proposal as outstanding.  See AR, Tab 54, SSA Final Evaluation Briefing & Report at 53.  Accordingly, we find nothing unreasonable regarding the agency’s evaluation.[7]

Next, DynCorp asserts that the agency’s evaluation of its and URS’s proposals reflects disparate treatment under the quality subfactor’s third element (AFM Site AS9110 Rev B Certification).  As discussed above, the solicitation stated that the agency would evaluate each offeror’s plan for obtaining certification under the AS9110 Revision B standard.  RFP at 106.  Specifically, the solicitation stated the following: 

The offeror will be evaluated on a milestone chart and accompanying discussion for implementation of a Quality Management System and subsequent site certification to AS9110 Revision B within six months of contract full performance start date for the AFM Ft. Campbell Site.

RFP, exh. B, Quality Evaluation Matrix, at 2.  In addition, the RFP stated that an outstanding rating would be assessed under this element for “detailed information as to how to achieve [the certification] within 6 months with timeline and resources identified.”  Id.

In responding to this requirement, URS’s proposal explained how it would use its existing certifications to timely achieve the required certification for the AFM site:

[DELETED].

AR, Tab 66, URS Technical Proposal, at I-15.  URS’s proposal also included a two‑page timeline detailing the 50 steps to achieving certification within six months, as well as the resources it would use to obtain the certification.  Id. at I-17-I-19.  In addition, URS’s proposal included what it called “pre-work” with its audit agency.  Id.  As part of the pre-work, URS provided its [DELETED] AS9110B registration approach for RASM-C to its auditor, [DELETED], which reviewed and endorsed URS’s approach as “reasonable and realistic” with “sufficient resources.”  Id.  URS included [DELETED] endorsement in its proposal.  Id.  This endorsement also stated that “[DELETED] has not been compensated in any way for [its] review or endorsement.”  Id. at I-20.

The Army rated URS’s proposal outstanding under this element, and assessed a strength to URS’s proposal for its pre-work with the auditor.  Specifically, the evaluators explained the strength as follows:

The Offeror has had their AS9110B plan independently reviewed by a third party registrar who in return has provided that the Offeror’s AS9110B implementation plan, schedule and resources are reasonable and realistic.  [Strength QUAC-0001].  The Offeror having this independent review completed prior to award has merit and is advantageous to the Government.

AR, Tab 64, URS Quality Evaluation Matrix, at 3. 

DynCorp argues that the agency’s evaluation reflects unequal treatment because its proposal offered what the protester contends was a benefit similar to that offered by the awardee, but that DynCorp’s proposal did not similarly receive a strength under this element.  Specifically, DynCorp states that its proposal demonstrated that DynCorp had, under contracts identical to the instant contract, “obtained AS9110 Revision B certifications for rotary-wing maintenance on [DELETED] programs within [DELETED].”  AR, Tab 31, DynCorp Technical Proposal, at 4.  DynCorp asserts that its proposal also was entitled to a strength because “each offeror presented essentially the same benefits and likely confidence to the Army--URS’s untested but ‘endorsed’ plan and DynCorp[’s] tested and proven plan based on successfully obtaining this certification for the same work in multiple locations.”  Protester’s Supp. Comments (Jan. 7, 2016), at 9.

Based on our review of the record, we find that the agency’s evaluation was reasonable.  The protester, in essence, argues that its demonstrated experience obtaining required certifications on other contracts “presented essentially the same benefits and likely confidence” to the Army as URS’s independently reviewed and endorsed plan.  Id.  As discussed above, however, the solicitation stated that the agency would evaluate each offeror’s plan for obtaining certification under the AS9110 Revision B standard within six months of the contract start date, to include review of a milestone chart and accompanying discussion for implementation of a Quality Management System.  The RFP thus clearly advised offerors that the agency was seeking “detailed information” as to how the offeror would achieve the certification “within 6 months with timeline and resources identified.”  RFP, exh. B, Quality Evaluation Matrix, at 2.  While the protester’s proposal presented a plan based on its experience with other contracts, the awardee’s proposal presented a plan that had been reviewed and endorsed by an independent auditor.  Considering that the TET was evaluating the detailed information in each offeror’s plan, including the resources that would be used for this contract, we do not find it unreasonable that the agency assessed a strength for the awardee’s plan, but not the protester’s.  Accordingly, we find no basis to sustain the protest.

Past Performance Evaluation

DynCorp next challenges the Army’s evaluation of its past performance, arguing that it should have received a higher past performance rating than URS based on what the protester contends was its more relevant performance record.[8]  Protester’s Comments (Dec. 18, 2015), at 17.  For the reasons discussed below, we find that the agency’s evaluation under this factor was reasonable.

An agency’s evaluation of past performance, including its consideration of the relevance, scope, and significance of an offeror’s performance history, is a matter of agency discretion which we will not disturb unless the agency’s assessments are unreasonable or inconsistent with the solicitation criteria.  SIMMEC Training Solutions, B-406819, Aug. 20, 2012, 2012 CPD ¶ 238 at 4.  Where a protester challenges an agency’s past performance evaluation and source selection, we will review the evaluation and award decision to determine if they were reasonable and consistent with the solicitation’s evaluation criteria and procurement statutes and regulations, and to ensure that the agency’s rationale is adequately documented. DynCorp Int’l LLC, B-406523.2, B-406523.3, Dec. 16, 2013, 2014 CPD ¶ 7 at 6; Falcon Envtl. Servs., Inc., B-402670, B-402670.2, July 6, 2010, 2010 CPD ¶ 160 at 7. 

As discussed above, the solicitation stated that the agency would evaluate “the offerors[’] demonstrated record of past and current performance to ascertain the probability of successfully meeting the solicitation requirements.”  RFP at 107.  Specifically, the RFP stated:  “There are two aspects to the past performance evaluation:  relevancy and performance confidence assessment.”  Id.  The solicitation further stated that “[t]he evaluation will consider the recency and relevancy of the current/past performance and assess a single performance confidence rating.”  Id.  In this regard, the solicitation advised that “[o]nly recent/relevant data will be utilized in assessing the performance confidence rating,” and that “definitions of relevant/not relevant will be used.”  Id.

With regard to the relevance of past performance information, the RFP stated that “[t]he criteria to establish what is relevant may include similarity of service/support; the level of complexity; performance under AR 95-20, AS9100 or AS9110 (any revision); dollar value; contract type; location and degree of subcontract/teaming.”  Id.  As for the recency of the past performance information, the solicitation stated that “[d]ata used in conducting performance confidence assessments shall not extend past three years prior to the issue date of the solicitation,” but may include “performance data generated during the past three years without regard to the contract award date.”  Id.

For assessing the degree of confidence, the solicitation stated that “the Government will focus its inquiries on the offerors (and major subcontractors) record of performance as it relates to all solicitation requirements, including cost, schedule, performance, and management of subcontractors.”  Id. at 108.  It also stated that “[a] significant achievement, problem, or lack of relevant data in any element of the work can become an important consideration in the evaluation process.”  Id.

DynCorp’s FPR identified three contracts in its past performance volume.  AR, Tab 32, DynCorp Past Performance Proposal, at 3-4.  The agency concluded that the past performance reflected in these three contracts was “relevant” because “[t]he contracts submitted for review holistically cover the full range of efforts outlined in the [PWS].”  AR, Tab 48, DynCorp Past Performance Report, at 8.  Based on the agency’s assessment of the quality of the responses provided in past performance questionnaires, as well as in Contractor Performance Assessment Reporting System (CPARS) ratings, for these relevant contracts, the agency assessed DynCorp’s past performance a rating of “Satisfactory Confidence.”  Id.

DynCorp argues that the agency should have considered the relevance of past performance as part of its performance confidence assessment, and asserts that, had the agency done so, it would have found that DynCorp’s past performance was “significantly more relevant” than URS’s.  In this regard, the protester contends that, “after [the agency’s] initial determination of relevance,” the RFP’s evaluation criteria obligated the Army to “undertake a comparative assessment of the relevance of references submitted by offerors.”  Protester’s Supp. Comments (Jan. 7, 2016), at 12.  The agency disagrees with the protester’s interpretation of the solicitation, and maintains that the RFP’s stated evaluation criteria established relevance as a threshold criterion (e.g., relevant or not relevant) for further evaluation of an offeror’s past performance information.  AR at 6-7; Supp. AR at 12-13.  The agency argues, therefore, that the comparative ratings for the offerors were to be based on the performance quality responses provided for offerors’ relevant contract references.  Id.

Where a protester and agency disagree over the meaning of solicitation language, we will resolve the matter by reading the solicitation as a whole and in a manner that gives effect to all of its provisions; to be reasonable, and therefore valid, an interpretation must be consistent with the solicitation when read as a whole and in a reasonable manner.  Assist Consultants Inc., B-408365.2, Aug. 2, 2013, 2013 CPD ¶ 181 at 5; Alluviam LLC, B-297280, Dec. 15, 2005, 2005 CPD ¶ 223 at 2.

In support of its interpretation, the protester relies on the following provision in the RFP: 

In assessing the degree of confidence, the Government will focus its inquiries on the offeror[’s] (and major subcontractors) record of performance as it relates to all solicitation requirements, including cost, schedule, performance, and management of subcontractors. . . .  A significant achievement, problem, or lack of relevant data in any element of the work can become an important consideration in the evaluation process. 

RFP at 108.  DynCorp argues that the above-quoted statement that “the Government will focus its inquiries on [an offeror’s] record of performance as it relates to all solicitation requirements,” indicated that the confidence assessment must consider relevancy.  Protester’s Supp. Comments (Jan. 7, 2016), at 13.  In addition, the protester contends that, by stating that a “lack of relevant data in any element of the work can become an important consideration in the evaluation process,” the solicitation reflected that relevancy would remain a significant consideration in assessing an offeror’s confidence rating.

We agree with the Army that the solicitation provided for consideration of relevance as a matter separate from the confidence assessment, and conclude that the protester’s differing interpretation is not reasonable.  As discussed above, the solicitation stated:  “There are two aspects to the past performance evaluation:  relevancy and performance confidence assessment.”  RFP at 107.  The solicitation also stated that the evaluation “will consider the recency and relevancy of the current/past performance and assess a single performance confidence rating,” and that “definitions of relevant/not relevant will be used.”  Id. at 106-107.  In addition, the solicitation advised that “[o]nly recent/relevant data will be utilized in assessing the performance confidence rating.”  Id.  In essence, we think that the RFP clearly advised that past performance information would be evaluated as either relevant or not relevant, and if found relevant, then would be evaluated to assess a performance confidence rating based on the quality of performance.  As our Office has explained, where, as here, both offerors have relevant past performance, an agency is not required to further differentiate between the past performance ratings based on a more refined assessment of the relative relevance of the offeror’s prior contracts, unless specifically required by the RFP.  See University Research Co., LLC, B‑294358.6, B-294358.7, Apr. 20, 2005, 2005 CPD ¶ 83 at 18.  We therefore find no basis to sustain the protest.

Price/Cost Evaluation

DynCorp raises numerous arguments challenging the agency’s evaluation of URS’s price/cost proposal.  While we do not discuss all of these arguments in detail, we have reviewed them all and conclude that none provides a basis to sustain the protest.  We discuss two representative examples below. 

DynCorp first argues that the cost/price evaluation team (CPET) found a moderate risk concerning URS’s overall proposed price, which the protester asserts was never mitigated.  As discussed above, the solicitation required offerors to propose fixed labor rates, which consisted of time-and-materials labor, in an Excel spreadsheet pricing template attached to the RFP.  RFP at 100.  The template included plug numbers for labor hours, and offerors were asked to allocate those hours among productive, non-productive, and fringe time.  Id. at 109.  The template included two separate sections for proposed rates:  (1) rates for labor categories subject to the Service Contract Act (SCA), and (2) rates for labor categories not subject to the SCA. 

The solicitation also stated that the agency would evaluate proposed costs/prices for reasonableness, and that the agency would conduct a cost/price realism analysis “to determine that the proposed cost elements are realistic for the work to be performed; reflect a clear understanding of contract requirements; and are consistent with the unique methods of performance described in the Offeror’s technical proposal.”  Id.  In addition, the RFP stated that “[t]he results of the realism analysis will be utilized in a performance risk assessment as it relates to the technical approach proposed by each Offeror,” and that “[n]o adjustments will be made to the offered prices as a result of the realism analysis.”  Id.

During the evaluation of URS’s initial proposal, the CPET compared URS’s proposed prices to the independent government estimate (IGE).  The CPET found that the IGE for the base year was [DELETED] percent higher than URS’s proposed price, and that “the overall delta of $[DELETED] between the IGE and proposed value indicates a moderate risk.”  AR, Tab 50, URS Cost/Price Evaluation Report, at 5.  As a result, the CPET conducted additional analysis “to provide confidence in and substantiation of the risk assessment and reasonableness determination.”  Id.  The agency’s additional analysis consisted of a detailed evaluation, by CLIN, of the realism and reasonableness of URS’s proposed prices.  Id.

As relevant here, in comparing URS’s proposed fixed hourly rates, the agency found that URS “proposed a total of $[DELETED] for the T&M labor CLIN 3000, which is [DELETED]% less than the IGE.”  Id. at 6.  The evaluators concluded that “this delta [indicates] moderate risk,” and therefore “additional analysis was completed” by the CPET.  Id.  This additional analysis consisted of a detailed evaluation of the ratio of URS’s proposed hours (e.g., productive, non-productive, and fringe), as well as reasonableness and realism assessments of URS’s proposed direct rates, indirect rates, and profit.  During the final evaluation, the CPET explained the results of its additional analysis as follows:

[URS] proposed a total of $[DELETED] for CLIN 3000, which is $[DELETED] less than the IGE.  The CPET found that all previously identified price concerns (CSTC-0001 – [DELETED], CSTC-0004 [DELETED], and CSTC-0006 – [DELETED]) were negated in the FPR.  One new price concern was identified . . . due to [DELETED].  The Government Evaluated/Probable Cost equals the Proposed Cost [in accordance with] 15.404-1(d)(3); no other exceptions were noted.

Id. at 8.  In the final evaluation, the CPET concluded that the awardee’s proposed costs were reasonable and realistic, and that “[p]rice Concerns (risks) identified during the initial Cost and Price Analysis were negated.”  Id. at 5-13. 

DynCorp first asserts that the initial “moderate risk” identified by the CPET based on the delta between URS’s total evaluated price and the IGE was never resolved by the CPET.  Specifically, the protester argues:  “[W]hile the CPET Report for URS summarized additional analysis the CPET undertook, nothing in the record indicates that the CPET downgraded or eliminated this Moderate risk rating based on URS’s overall FPR price.”  Protester’s Supp. Comments (Jan. 7, 2016), at 15.  Based on our review of the record, however, we find no merit to the protester’s argument.

As mentioned previously, after identifying during the initial evaluation that the overall delta of $[DELETED] between the IGE and URS’s proposed value “indicates a moderate risk,” the CPET conducted additional analysis, which consisted of a detailed evaluation, by CLIN, of the realism and reasonableness of URS’s proposed prices.  AR, Tab 50, URS Cost/Price Evaluation Report, at 5.  While we do not discuss in detail the agency’s additional analysis of URS’s proposed prices/costs for every CLIN, we conclude that the Final Cost/Price Evaluation Report reflects a detailed reasonableness and realism assessment of URS’s proposed prices/costs on a CLIN by CLIN basis.  Id. at 5-13.  It also reflects, for each CLIN, that the CPET concluded that URS’s proposed prices/costs were reasonable and realistic.  Id. at 5‑13.  For example, with regard to CLIN 3070 (Transition-In), which was a fixed‑priced CLIN, the agency explained its analysis as follows:

The CPET verified the labor rates used were consistent with [Wage Determination] rates and with the rates proposed in the T&M CLIN. . . .  The CPET compared the indirect rates proposed to those provided by the DCMA[; n]o discrepancies were found in the application of rates. . . .  As this is a fixed price CLIN, the Government evaluated value is equal to the proposed value . . . .  The Offeror initially proposed [DELETED] fee on CLIN 3070, which prompted [a concern].  The Offeror’s FPR revised its approach and [the] CPET noted [DELETED]% fee proposed in this CLIN, therefore negating [the] Price Concern . . . .  The CPET verified the application of fee and found no issues.  The CPET compared skill mix and hours proposed in the Cost Volume III to the proposed Transition-In plan in the Technical Volume.  This comparison found proposed hours to match to both volumes.  No discrepancies noted.  In summary, the CPET made no adjustments to the evaluated price . . . .  [T]he Government evaluated price equals the Offeror’s proposed price for CLIN 3070.  Adequate price competition establishes reasonableness, and the realism of [URS’s] Transition-in plan is no longer questioned as all Price Concerns have been negated. . . .

Id. at 10-11.

Accordingly, although the protester asserts that the CPET never “downgraded or eliminated the Moderate risk rating based on URS’s overall FPR price,” we find that the agency’s additional analysis sufficiently addressed this concern.  As such, we find the protester’s argument provides no basis to sustain the protest.

Next, DynCorp argues that the Army improperly failed to evaluate risks associated with some of URS’s proposed labor rates.[9]  For example, the protester argues that the agency’s price realism analysis of URS’s proposed fixed hourly rates was unreasonable because the agency failed to consider the risk associated with URS’s use of [DELETED], instead of the national wage determination included in the solicitation, for establishing its proposed rates for three labor categories. 

As discussed above, the solicitation included certain labor categories for which the labor rates were subject to the SCA, referred to as “non-exempt rates.”  RFP at 101.  With regard to the SCA, the RFP incorporated the applicable national wage determination as an attachment to the solicitation, and stated that “[t]he offeror shall provide a crosswalk of the proposed labor categories against the National Wage Determination.”  Id.  As relevant here, the pricing template identified six labor categories that did not have corresponding wage determination codes, which the template referred to as “conformed” labor categories.[10]  See RFP, attach. 18, Pricing Template.  For these six labor categories, offerors were required to classify new positions “so as to provide a reasonable relationship (i.e., appropriate level of skill comparison) between such unlisted classifications and the classifications listed in the wage determination.”  RFP, attach. 5, Wage Determination Guide, at 5.  In this regard, the national wage determination explained the conformance process as follows:

[C]onforming procedures shall be initiated by the contractor prior to the performance of contract work by such unlisted class(es) of employees (See 29 C.F.R. 4.6 (b)(2)(ii)).  The [Department of Labor,] Wage and Hour Division shall make a final determination of conformed classification, wage rate, and/or fringe benefits which shall be paid to all employees performing in the classification from the first day of work on which contract work is performed by them in the classification.  Failure to pay such unlisted employees the compensation agreed upon by the interested parties and/or fully determined by the Wage and Hour Division retroactive to the date such class of employees commenced contract work shall be a violation of the Act and this contract.  (See 29 C.F.R. 4.6 (b)(2)(iv)(C)(vi)).

Id.  The Department of Labor’s regulations further explain that establishing a rate for a conformed classification may require more than simply selecting an existing wage rate:

The process of establishing wage and fringe benefit rates that bear a reasonable relationship to those listed in a wage determination cannot be reduced to any single formula. The approach used may vary from wage determination to wage determination depending on the circumstances.  Standard wage and salary administration practices which rank various job classifications by pay grade pursuant to point schemes or other job factors may, for example, be relied upon. Guidance may also be obtained from the way different jobs are rated under Federal pay systems (Federal Wage Board Pay System and the General Schedule) or from other wage determinations issued in the same locality.  Basic to the establishment of any conformable wage rate(s) is the concept that a pay relationship should be maintained between job classifications based on the skill required and the duties performed.

29 C.F.R. § 4.6(b)(2)(iv)(A).

For two of the six conformed labor categories, URS’s proposal used [DELETED], rather than the national wage determination attached to the RFP, for determining the minimum applicable rates.  AR, Tab 35, URS Price/Cost Proposal, at 86.  For one of the other conformed labor categories, URS relied on [DELETED], as the basis for its rate.  Id. at 13.  URS explained in its proposal that it looked outside of the national wage determination to find the appropriate similar job description and compensation level because the national wage determination schedule “did not provide suitable labor categories to map [the] positions.”  Id.  Instead, URS’s proposal stated that it based its proposed rates for the two conformed labor categories mentioned above on the [DELETED].  See id. at 51/52; Supp. AR at 27.  For the conformed labor category where URS relied on information from [DELETED], URS’s proposal stated that “[t]he duties and qualifications of [this] Position listed in [DELETED] correspond well with those of [this position in] the RFP.”  Id. at 55/56. 

DynCorp argues that the awardee’s use of rates based on [DELETED] and [DELETED] for the conformed labor categories was prohibited by the solicitation.  The agency disagrees, and points to the conformance process, discussed previously, in support of its position that offerors were permitted to use other information, such as [DELETED] or [DELETED], to classify the conformed labor categories.  RFP, attach. 5, Wage Determination Guide, at 5.  The protester also asserts that the agency failed to consider the potential performance risk associated with URS’s lower proposed rates. 

Based on our review of the record, we do not see any provision in the solicitation or national wage determination that precludes an offeror from using information from [DELETED] or [DELETED] to support its proposed rates for a conformed labor category.  RFP, attach. 5, Wage Determination Guide, at 5.  The record also reflects that the agency compared the rates proposed by URS for the conformed labor categories to corresponding labor rates in the national wage determination, and acknowledged that some of the rates proposed by URS were lower.  See id., attach. 1, Wage Determination Rate Comparison, at 1.  In addition, as discussed in detail above, the record demonstrates that the agency conducted a detailed realism assessment of URS’s proposed rates and concluded that they were realistic.  AR, Tab 50, URS Cost/Price Evaluation Report, at 5-15.  To the extent DynCorp asserts that it was prejudiced by URS’s use of [DELETED] rates because it would have similarly reduced its rates, we find that any harm to the protester was de minimis, and therefore, not prejudicial to the protester.[11]  Accordingly, we find no basis to sustain the protest.

Trade-off Analysis and Source Selection Decision

Finally, DynCorp argues that the Army’s award to URS, a lower-rated, lower-priced offeror, was unreasonable because the SSA failed to look behind the ratings to make a comparative assessment of the qualitative merits of the proposals.  As discussed below, we find no merit to this argument.

Generally, in a negotiated procurement, an agency may properly select a lower‑rated, lower-priced proposal where it reasonably concludes that the price premium involved in selecting a higher-rated proposal is not justified in light of the acceptable level of technical competence available at a lower price.  Bella Vista Landscaping, Inc., B-291310, Dec. 16, 2002, 2002 CPD ¶ 217 at 4.  The extent of such tradeoffs is governed only by the test of rationality and consistency with the evaluation criteria.  Best Temporaries, Inc., B-255677.3, May 13, 1994, 94-1 CPD ¶ 308 at 3.  While an agency has broad discretion in making a tradeoff between price and non-price factors, an award decision in favor of a lower-rated, lower‑priced proposal must acknowledge and document any significant advantages of the higher-priced, higher-rated proposal, and explain why they are not worth the price premium.  See Research & Dev. Sols., Inc., B-410581, B‑410581.2, Jan. 14, 2015, 2015 CPD ¶ 38 at 9.  Our Office has found that when SSAs have performed this analysis, it is within their discretion to choose a lower‑rated, lower-priced proposal in a best value procurement.  See MD Helicopters, Inc.; AgustaWestland, Inc., B-298502 et al., Oct. 23, 2006, 2006 CPD ¶ 164.  A protester’s disagreement, without more, with the agency’s determinations does not establish that the evaluation or source selection was unreasonable.  Weber Cafeteria Servs., Inc., B‑290085.2, June 17, 2002, 2002 CPD ¶ 99 at 4. 

Here, as discussed in detail above, the record shows that the SSA considered the respective merits of the proposals in accordance with the RFP criteria, and concluded that DynCorp’s more favorable technical rating was not worth the price premium.  AR, Tab 63, SSDD, at 10-11.  Specifically, the SSA acknowledged that DynCorp’s proposal received “ratings of Outstanding in all seven elements under the Quality Sub‑Factor,” and in contrast, URS’s proposal received “ratings of Outstanding in only five of the seven elements.”  Id. at 6.  The SSA also noted, however, that URS “received Acceptable ratings in the remaining two elements as a result of having 2007 FOPS/GOPS (versus current 2013 procedures).”  Id.  The SSA explained that, while DynCorp received “an additional strength in the Quality Sub‑Factor for current and approved FOPS/GOPS,” the “primary difference between AR 95-20 (2007) and AR 95‑20 (2013) is the numbering of procedural paragraphs and minor content change.”  Id. at 6-7.  The SSA also noted that URS “has an additional strength in the Quality Sub-Factor for the AS9110B certification plan which was independently reviewed by a third party registrar.”  Id.  The SSA concluded that “[DynCorp] did receive a better rating in the Technical Factor (as a result of having FOPS/GOPS written to the current AR 95-20 2013 version versus [URS’s] FOP/GOPS written to the 2007 version),” but that “the more favorable Technical rating does not warrant a $5.5 [million] (12.5%) price premium.”  Id.  On this record, where the SSA clearly acknowledged the benefits associated with the protester’s higher-rated, higher-priced proposal, but concluded that the benefits did not merit paying the price premium, we find no basis to sustain the protest.

The protest is denied.

Susan A. Poling
General Counsel



[1] This geographic region consists of Wisconsin, Michigan, Ohio, Indiana, Illinois, Kentucky, Tennessee, Alabama, Mississippi, and Louisiana.  RFP at 2.

[2] The AS9110 certification consists of quality management system standards applicable to companies within the aerospace industry.

[3] The technical evaluation team (TET) assessed the technical factors as either outstanding, good, acceptable, marginal, or unacceptable.  AR, Tab 54, SSA Final Evaluation Briefing, at 59.  The evaluators assessed past performance as either substantial confidence, satisfactory confidence, limited confidence, no confidence, or unknown (neutral) confidence.  Id. at 63.

[4] As discussed above, the agency rated DynCorp’s proposal as good under the technical approach factor.  AR, Tab 63, SSDD, at 5.  Under the technical subfactors, DynCorp’s proposal was rated as outstanding for the most important subfactor, quality.  Id.  DynCorp’s proposal received ratings of acceptable for the other three subfactors.  Id.

[5] A good rating was defined as a proposal that “meets requirements and indicates a thorough approach and understanding of the requirements,” contains “strengths which outweigh any weaknesses,” and “[r]isk of unsuccessful performance is low.”  AR, Tab 54, SSA Final Evaluation Briefing, at 59.  An outstanding rating was defined as a proposal indicating “an exceptional approach” and understanding of the requirements, “[s]trengths far outweigh any weaknesses,” and “[r]isk of unsuccessful performance is very low.”  Id.

[6] We previously dismissed one of DynCorp’s protest arguments in response to the agency’s and intervenor’s dismissal requests because we found that the protester failed to demonstrate a reasonable possibility of prejudice.  DynCorp argued that URS’s proposal should have been rejected as unacceptable because it did not comply with the RFP requirement to identify current and approved FOPS/GOPS procedures that comply with the 2013 version of Army Regulation 95-20.  As both discussed above and addressed further below, however, with regard to the agency’s trade-off decision, the Army considered this issue as comparative risk.  In this regard, the record reflected that the agency waived the requirement, finding that “[t]he primary difference between AR 95-20 (2007) and AR 95-20 (2013) in terms of FOP/GOPs is the numbering of procedural paragraphs and minor content changes,” and therefore concluded that, “[w]hile current implementation of AR 95-20 2013 indicates less risk, the difference in risk between successful implementation of the 2013 and 2007 versions of AR 95-20 was not so significant as to warrant payment of a $5.5M (12.5%) price premium.”  AR, Tab 52, DynCorp Debriefing, at 26.  In response to the dismissal request, the protester failed to demonstrate how the agency’s waiver affected the submission of its proposal, and therefore, we dismissed this portion of the protest, concluding that DynCorp had failed to demonstrate a reasonable possibility of prejudice.  See Orion Tech., Inc.; Chenega Integrated Mission Support, LLC, B-406769 et al., Aug. 22, 2012, 2012 CPD ¶ 268 at 11 (explaining that in cases where an agency waives a solicitation requirement, we will sustain a protest only if the protester is prejudiced, and that in such circumstances, the pertinent question is whether the protester would have submitted a different offer that would have had a reasonable possibility of being selected for award had it known that the requirement would be waived).

[7] We note that, although DynCorp also asserts that the work being performed by URS “does not involve performing work ‘nearly identical to the PWS,’” the only support for this assertion cited by the protester in its comments responding to the agency report is the statement in the SSA Briefing Report, which referred to URS’s facility as “similar.”  Protester’s Comments (Dec. 18, 2015), at 9 (citing AR, Tab 54, SSA Final Evaluation Briefing, at 54).  Because we conclude, as discussed above, that the record reflects that the SSA relied on the TET’s underlying evaluation, rather than the statement in the SSA Briefing Report, we find no merit to this argument.  The protester, however, in its comments responding to the agency’s supplemental report, raises additional allegations in support of this argument.  See, e.g., Protester’s Supp. Comments (Jan. 7, 2016), at 7 (asserting that URS’s proposal “omits any mention of work related to Reset or MWOs,” which the protester contends demonstrates that the URS facility is not “nearly identical” to the PWS requirements).  The protester’s new arguments on this issue, however, could have been made in its comments responding to the initial agency report.  Because the protester failed to raise these issues at that time, they are untimely.  4 C.F.R. § 21.2(a)(2) (requiring protest issues be filed within 10 days after the basis is known or should have been known); see Lanmark Tech., Inc., B‑410214.3, Mar. 20, 2015, 2015 CPD ¶ 139 at 5 n.2 (piecemeal presentation of protest grounds, raised for the first time in comments, is untimely).

[8] In its comments responding to the agency report, the protester raised two new challenges to the agency’s past performance evaluation:  (1) that the Army ignored “close at hand” information contained in the cover letter of DynCorp’s FPR submission; and (2) that the Army improperly discounted superior ratings that DynCorp received from its past performance references because the agency referred to them as “mixed.”  Protester’s Comments (Dec. 18, 2015), at 14-16.  The Army’s supplemental report specifically addressed these two challenges.  See Supp. AR (Dec. 31, 2015), at 14-16.  In its supplemental comments, however, the protester merely argued, in a footnote, that the supplemental agency report “fails to raise any factual issues with DynCorp’s arguments.”  Protester’s Supp. Comments (Jan. 7, 2016), at 14 n.4.  The comments then “refer[red] GAO to DynCorp’s prior arguments setting out the legal and factual basis for these protest grounds.”  Id.  Where, as here, an agency provides a detailed response to a protester’s assertions and the protester fails to rebut or otherwise substantively address the agency’s arguments in its comments, the protester provides us with no basis to conclude that the agency’s position with respect to the issue in question is unreasonable or improper.  IntegriGuard, LLC d/b/a HMS Fed.--Protest & Recon., B-407691.3, B‑407691.4, Sept. 30, 2013, 2013 CPD ¶ 241 at 5; Israel Aircraft Indus., Ltd.--TAMAM Div., B‑297691, Mar. 13, 2006, 2006 CPD ¶ 62 at 6-7.

[9] DynCorp also contends that the Army’s price realism analysis failed to consider what the protester contends was URS’s unrealistically low overhead rates because the agency did not specifically consider the potential performance risk associated with URS’s inclusion of the costs of [DELETED] in its overhead cost pool.  As discussed above, for the T&M labor CLIN, offerors were required to propose fixed hourly rates.  See RFP at 10.  Specifically, the RFP stated:  “All offerors are to propose fully burdened straight time and overtime rates through profit for the Labor categories identified in the AFM Pricing Template.”  Id.  For purposes of evaluation, the solicitation stated that an offeror’s “proposed rates, factors, and expenses will be examined to substantiate utilization of consistent forward-pricing procedures, i.e., negotiated forward-pricing rates, if applicable, or rates and factors contractors ordinarily utilize in proposals if no negotiated forward-pricing agreement exists,” such as, “indirect expense rates, projected rates, and projected expense pools.”  Id. at 109.  The solicitation further stated that “[t]he results of the realism analysis will be utilized in a performance risk assessment as it relates to the technical approach proposed by each offeror,” but that “[n]o adjustments will be made to the offered prices as a result of the realism analysis.”  Id.  DynCorp contends that URS’s inclusion of the costs of [DELETED] in its overhead pool artificially inflated URS’s overhead rates, which posed a performance risk that was not evaluated by the agency.  Although the protester complains that the agency did not assess the effect of URS’s inclusion of these costs in its overhead cost pool, there is no obligation in a price realism analysis to verify each and every element of an offeror’s costs.  See AMEC Programs, Inc.; Bechtel Nat’l, Inc., B‑408708, B‑408708.2, Dec. 4, 2013, 2014 CPD ¶ 50 at 9.  
In this regard, our Office has repeatedly held that the depth of an agency’s price realism analysis is a matter within the sound exercise of the agency’s discretion.  HBC Mgmt. Servs., Inc., B‑408885.2, May 9, 2014, 2014 CPD ¶ 149 at 5; Smiths Detection, Inc.; Am. Sci. & Eng’g, Inc., B-402168.4 et al., Feb. 9, 2011, 2011 CPD ¶ 39 at 17.  As described above, the record reflects that the agency conducted a detailed realism assessment of URS’s proposed rates, including its indirect rates.  AR, Tab 50, URS Cost/Price Evaluation Report, at 5.  It also reflects that the agency compared the labor rates proposed by URS to corresponding labor rates in the national wage determination.  See id., attach. 1, Wage Determination Rate Comparison, at 1.  Accordingly, we find no basis to sustain the protest.

[10] These six categories were:  aircraft inspector, aviation life support equipment technician II, certified ECO/Hazmat inspector, fuel distribution systems operator/fuel handler, safety/OSHA inspector, and ground inspector.  RFP, attach. 18, Pricing Template.

[11] The protester asserts in its comments that if DynCorp had been permitted to use the [DELETED] rates for the three labor categories disputed by the protester, DynCorp would have been able to reduce its proposed price by approximately $1.5 million.  See Protester’s Comments (Dec. 18, 2015) at 24.  The agency challenges the protester’s calculation, arguing that, for the labor category with the greatest number of full-time equivalents (FTEs), the protester used the wrong labor rate (i.e., the protester used $[DELETED], although URS’s labor rate for that category was $[DELETED]).  Supp. AR at 32-33.  Correcting this mathematical error reduces the amount by which DynCorp would have been able to reduce its proposed price to approximately $284,000.  See id.; Protester’s Comments (Dec. 18, 2015) at 24.  In response, the protester does not dispute that its calculation used the higher labor rate (which was not used by URS) rather than the actual labor rate in URS’s proposal.  See Protester’s Supp. Comments (Jan. 7, 2016), at 20 n.11.  Considering that DynCorp’s total evaluated price was $49,434,057, we do not find that this potential difference is sufficient to establish prejudice--that is, that DynCorp would have reduced its proposed labor rates such that its proposal would have had a substantial chance of being selected for award as the best value.  See Triad Logs. Servs. Corp., B-406416.2, June 19, 2012, 2012 CPD ¶ 186 at 2; Online Video Serv., Inc., B-403332, Oct. 15, 2010, 2010 CPD ¶ 244 at 2. 
