Science Applications International Corporation

B-413112, B-413112.2: Aug 17, 2016

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  Science Applications International Corporation

File:  B-413112; B-413112.2

Date:  August 17, 2016

Scott P. Fitzsimmons, Esq., Shelly L. Ewald, Esq., and Heather L. Stangle, Esq., Watt, Tieder, Hoffar & Fitzgerald, L.L.P., for the protester.
Philip J. Davis, Esq., Cara L. Lasley, Esq., and Tracye Winfrey Howard, Esq., Wiley Rein LLP, for BRTRC Federal Solutions, the intervenor.
Stacy G. Wilhite, Esq., and Diane Nelson, Esq., Department of the Army, for the agency.
Pedro E. Briones, Esq., Noah B. Bleicher, Esq., and Nora K. Adkins, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protest of an agency’s technical evaluation is denied where the evaluation was reasonable and consistent with the terms of the solicitation.

DECISION

Science Applications International Corporation (SAIC), of McLean, Virginia, protests the issuance of a task order to BRTRC Federal Solutions, under task order request (TOR) No. W56HZV-15-X-BC01, issued by the Department of the Army, U.S. Army Contracting Command, for services related to platform integration of various systems onto military tactical vehicles.  SAIC challenges the Army’s evaluation of proposals under the solicitation’s non‑price evaluation factors.

We deny the protest.

BACKGROUND

The TOR was issued pursuant to Federal Acquisition Regulation (FAR) subpart 16.5 to firms holding indefinite‑delivery, indefinite‑quantity (IDIQ) contracts under the Equipment Related Services (ERS) family of Strategic Services Solutions contracts.[1]  Agency Report (AR) at 1‑2.  The solicitation sought platform integration services which involve simultaneously networking multiple command, control, communications, computers, intelligence, surveillance, and reconnaissance (C4ISR) systems onto military vehicles.  Id. at 1.  To accomplish this work, the TOR’s performance work statement (PWS) specified 53 required tasks, which were grouped under 22 PWS sections, including five sections relevant here:  (1) Engineering Design; (2) Instrumentation Integration; (3) Logistics; (4) Metal and Cable Fabrication; and (5) Network Integration Evaluation (NIE) Field Support.  Id. at 5; TOR amend. 1 § J, attach. 40, PWS, at 12-26.

The solicitation provided for a best‑value award of a hybrid task order (cost‑plus‑fixed‑fee for labor with a fixed‑price phase‑in period) considering, in descending order of importance:  past performance, phase‑in and management plan, technical approach, and cost/price.[2]  TOR amend. 1 §§ A.5, M.2, M.3.  The TOR stated that the non‑cost evaluation factors, when combined, were significantly more important than the cost/price factor.  Id. § M.3. 

The solicitation instructed offerors to submit proposals under separate volumes for each evaluation factor and stated that, for purposes of the TOR’s instructions and evaluation criteria, “offeror” included the prime offeror’s team.  Id. §§ L.5, 6; Questions & Answers (Q&A) no. 1.

With respect to the past performance evaluation factor, offerors were to submit information for up to three contracts or task orders performed within the past 3 years, and provide a narrative describing how each effort was relevant (similar in scope, magnitude, and complexity) to the requirement.  TOR amend. 1 §§ L.8, L.8.a‑b; see § J, attach. 42, Past Performance Matrix.  Of significance here, the solicitation, as amended in response to questions from offerors, stated that “[o]fferors [sic] Volume 1 [i.e., past performance] responses should include Prime Offeror teaming arrangements.”  TOR Q&A no. 40.  The TOR stated that the agency would evaluate the similarity in scope, magnitude, and complexity of the offeror’s contracts to the requirement, and assign an adjectival relevancy rating accordingly.[3]  TOR amend. 1 § M.2.a.ii.  The agency would also “assess the expectation that the offeror will successfully meet the requirements” based on its record, and assign a past performance confidence assessment rating to each proposal.  Id. § M.2.a.i.

With respect to the phase-in and management approach factor, offerors were to demonstrate their understanding of the task order requirements and their ability to provide uninterrupted high-quality work to assist the agency in achieving its goals and objectives.  Id. § L.10.a.  The TOR stated that the agency would evaluate the offeror’s phase-in plan to assess the risk probability that the offeror will successfully achieve the task order requirements, and would assess the extent to which the offeror demonstrated its ability to recruit and retain a workforce with the requisite qualifications to provide uninterrupted high-quality work.  Id. § M.2.c.

With respect to the technical approach evaluation factor, offerors were to propose an approach, including tasks and labor categories, necessary to successfully perform the five PWS sections listed above.  Id. § L.9.a, citing PWS §§ 3.4 (Engineering Design), 3.9 (Instrumentation Integration), 3.13 (Logistics), 3.14 (Metal & Cable Fabrication), 3.18 (NIE Field Support).  The TOR stated that the agency would evaluate the extent to which the offeror identified the tasks and labor categories necessary to perform the task order, and provided a detailed, reasonable explanation of its proposed technical approach for executing those tasks.  Id. § M.2.b.  The agency would also assess the risk that the offeror’s technical approach posed to the timely completion of the requirement and assign a risk assessment rating to each proposal.  Id.  Also significant here, the TOR informed offerors that the evaluation of technical approach would focus on the five PWS sections stated above.  Id.

The Army received proposals from four offerors, including SAIC (the incumbent) and BRTRC, and after discussions, the final revised proposals of SAIC and BRTRC were evaluated as follows:

 

              Past         Phase-In & Mgmt  Technical    Proposed     Evaluated
              Performance  Approach         Approach     Cost/Price   Cost/Price
SAIC          Substantial  Outstanding      Acceptable   $46,859,133  $46,859,133
BRTRC         Substantial  Outstanding      Outstanding  $51,853,862  $51,853,862

AR, Tab 10, Source Selection Decision Memorandum (SSD), at 1.[4]

Separate past performance and technical approach evaluation teams (PET and TET, respectively) assessed proposals under the evaluation factors.  The evaluators documented their findings, assessed strengths and weaknesses, and assigned ratings in detailed evaluation reports prepared for each offeror.  See AR, Tabs 6, 9, SAIC & BRTRC Evaluations.  Proposals were also evaluated by a source selection evaluation board.  See id., Tab 10, SSD, at 1.

The contracting officer, who was the source selection authority (SSA) for the procurement, reviewed the evaluation reports and conducted a comparative analysis of each proposal under each evaluation factor.  Id. at 1‑7.  With regard to past performance, the SSA found that all four offerors had performed tasks similar to the TOR and demonstrated a high level of quality performance.  Id. at 1.  However, the SSA concluded that BRTRC’s past performance was the most favorable because the number of vehicles that BRTRC had integrated and the overall complexity of its past work indicated that the firm would be able to handle the fluctuations in volume and complexity of the requirement, while performing at a high level of quality with minimal impact on performance and schedule.[5]  Id. at 2.  With regard to technical approach, the SSA compared each offer under the five relevant PWS sections and concluded that BRTRC’s proposal was significantly more advantageous than the proposals of the other offerors.  Id. at 3‑5.  Under engineering design (PWS § 3.4), for example, she found that BRTRC’s proposal exceeded SAIC’s proposal by demonstrating a structured process for tracking design configurations through the design process, which would result in cost savings and reduced schedule impacts.  Id. at 3.  Under NIE field support (PWS § 3.18), she found that BRTRC provided a detailed process for generating unclassified trouble tickets that would reduce schedule and performance risk, and presented a significant advantage over SAIC’s proposal.  See id. at 5.

The SSA performed a cost/technical tradeoff among the four proposals.  Id. at 6‑7.  She concluded that BRTRC’s proposal provided numerous significant technical strengths, reduced schedule and security risks, and potential cost savings that would provide a long term benefit and best meet the Army’s needs.  See id.  She also found that BRTRC’s slight advantage under the past performance factor and distinct advantage in technical approach was more important than SAIC’s lower cost, which did not offset the cumulative benefits of BRTRC’s proposal.  Id. at 7.  The SSA concluded that BRTRC’s proposal provided the best value to the agency.  Id.

The Army issued the task order to BRTRC and this protest followed.[6]

DISCUSSION

SAIC protests the Army’s evaluation of proposals under the TOR’s non‑price evaluation factors.  While our decision here does not specifically discuss each of SAIC’s arguments, we have considered all of the protester’s assertions and find that none furnishes a basis for sustaining the protest.

Past Performance Evaluation

SAIC complains that the agency unreasonably assigned both offerors the same past performance ratings contrary to the TOR’s rating definitions and even though, according to the protester, SAIC had more relevant experience, as the incumbent, and higher customer past performance ratings than BRTRC.  SAIC also claims that BRTRC’s past performance ratings do not reflect certain negative information in BRTRC’s subcontractor’s customer evaluation forms.  SAIC also points out that BRTRC did not identify any contracts actually performed by the firm, but relied solely on the past performance record of one of its proposed subcontractors.  In this respect, SAIC maintains that the Army improperly attributed to BRTRC the past performance of its proposed subcontractor, because the TOR “makes clear that past performance must be attributable to the named offeror.”  Protester’s Comments at 1‑3. 

As an initial matter, SAIC’s disagreement with the assigned past performance ratings and its belief that its incumbency status entitles it to higher ratings or additional assessed strengths, lack merit and do not provide bases for finding the Army’s past performance evaluations unreasonable.[7]  See Belzon, Inc., B‑404416 et al., Feb. 9, 2011, 2011 CPD ¶ 40 at 5-6.  There is no requirement that an incumbent be given extra credit for its status as an incumbent, or that the agency assign or reserve the highest rating for the incumbent offeror.  See FFLPro, LLC, B-411427.2, Sept. 22, 2015, 2015 CPD ¶ 289 at 10.  Moreover, the essence of an agency’s evaluation is reflected in the evaluation record itself, not the adjectival ratings.  Sci. Applications Int’l Corp., B-407105, B‑407105.2, Nov. 1, 2012, 2012 CPD ¶ 310 at 9.  It is well established that ratings, be they numerical, adjectival, or color, are merely guides for intelligent decision making in the procurement process.  Burchick Constr. Co., B‑400342.3, Apr. 20, 2009, 2009 CPD ¶ 102 at 4-5.  Here, besides quibbling with its assigned relevance and confidence ratings, SAIC identifies no aspect of the PET’s extensive evaluation of SAIC’s past performance, or the contracting officer’s source selection decision, that was supposedly unreasonable or inconsistent with the TOR’s terms.  Protester’s Comments at 3‑4, 10; Protester’s Supp. Comments at 3‑5.  Thus, we find no merit to these allegations. 

Additionally, based on our review of the record, we find that the Army’s past performance evaluation of BRTRC was reasonable and consistent with the TOR.  An agency’s evaluation of past performance, which includes its consideration of the relevance, scope, and significance of an offeror’s performance history, is a matter of discretion which we will not disturb unless the agency’s assessments are unreasonable, inconsistent with the solicitation criteria, or undocumented.  Silverback7, Inc., B‑408053.2, B‑408053.3, Aug. 26, 2013, 2013 CPD ¶ 216 at 8.  The evaluation of experience and past performance, by its very nature, is subjective; we will not substitute our judgment for reasonably based evaluation ratings, and an offeror’s disagreement with an agency’s evaluation judgments, without more, does not demonstrate that those judgments are unreasonable.  Id.

First, with respect to SAIC’s contention that the Army was not permitted to consider the past performance record of BRTRC’s subcontractor, we find that the agency’s evaluation was consistent with the TOR.  Indeed, the solicitation, as described above, explicitly stated that for purposes of the TOR’s instructions and evaluation criteria, “offeror” included the prime offeror’s team, and that an offeror’s past performance volume “should include Prime Offeror teaming arrangements.”  TOR Q&A nos. 1, 40.  Where a solicitation, as here, does not expressly prohibit its consideration, we have routinely held that the experience of a subcontractor proposed to do work that is evaluated under a solicitation properly may be considered in determining whether an offeror meets experience or past performance requirements.  See AC Techs., Inc., B‑293013, B‑293013.2, Jan. 14, 2004, 2004 CPD ¶ 26 at 3 (nothing in the solicitation precludes the agency from considering the past performance of a prime contractor’s subcontractor; in the absence of such a prohibition in the solicitation, the agency could properly evaluate and give weight to the past performance of the awardee’s subcontractor).

Next, the protester contends that BRTRC’s substantial confidence rating was unreasonable because BRTRC’s record “indicate[d] certain negative past performance information.”  See Protester’s Comments at 4; Protester’s Supp. Comments at 4‑5.  However, SAIC’s assertions in that regard are based on the protester’s selective quotation of four sentences from the procurement record and do not demonstrate that the agency unreasonably evaluated BRTRC’s proposal.  For example, the PET noted that one of BRTRC’s client questionnaires stated that “[d]uring the integration effort, the required number of integration kits was not met.”  AR, Tab 9‑1, BRTRC PET Report, at 14.  Conveniently, SAIC omits the subsequent sentence from the PET report and the client questionnaire, which noted that “[t]his was not entirely the contractor[’]s fault as the [agency’s] units did not always provide the necessary number of vehicles to integrate in a timely manner.”  Id.  Similarly, SAIC points out that the questionnaire reported that “[t]here was miscommunication between [the agency and the program management] as to the use of [overtime] at the integration sites that caused an overrun in labor costs.”  Protester’s Comments at 4; Protester’s Supp. Comments at 5.  Here, too, SAIC omits the subsequent sentence, which states that the “lack of vehicle availability throughout the effort negatively impacted the contractor[’]s charged costs versus output at no fault of their own.”  AR, Tab 9‑1, BRTRC PET Report, at 15. 

Notwithstanding SAIC’s myriad of objections, the contemporaneous record reflects the PET’s extensive consideration of the relevance of BRTRC’s contracts to the required effort, including the extent of BRTRC’s past performance of C4ISR systems and system integration efforts.  See AR, Tab 9‑1, BRTRC PET Report, at 4‑20.  Notably, SAIC does not challenge the PET’s conclusions that BRTRC’s past performance record involved highly complex systems that exceeded TOR requirements.  See id. at 19‑20.  Also notable, SAIC does not challenge the SSA’s conclusions that the number of vehicles that BRTRC had integrated and the overall complexity of its past work indicated that the firm would be able to handle the fluctuations in volume and complexity of the requirement, while performing at a high level of quality with minimal impact on performance and schedule.

As demonstrated above, the agency fully and properly considered the information regarding BRTRC’s subcontractor’s alleged negative past performance.  While SAIC disagrees with the agency’s conclusions regarding the merits of the awardee’s past performance records, SAIC’s disagreement with those conclusions is insufficient to establish that the agency acted unreasonably.  Ball Aerospace & Techs. Corp., B‑411359, B-411359.2, July 16, 2015, 2015 CPD ¶ 219 at 7; see Gentex Corp.--W. Operations, B‑291793 et al., Mar. 25, 2003, 2003 CPD ¶ 66 at 8 (protest denied where agency properly credited awardee with its subcontractor’s (SAIC) past performance and experience).  Accordingly, SAIC’s protest of the Army’s past performance evaluation is denied.

Phase‑In and Management Approach Evaluation

SAIC argues that the Army improperly evaluated SAIC’s and BRTRC’s proposals under the phase-in and management approach factor.  SAIC contends that BRTRC’s plan to recruit and hire incumbent personnel after award is not consistent with the TOR requirement that offerors must demonstrate an affirmative ability to recruit and retain personnel necessary to meet mission requirements.  SAIC also asserts that “[d]espite the fact that BRTRC’s proposal indicates a clear intent to recruit employees from SAIC to serve as key personnel following award (whereas SAIC was, of course, already engaged as the incumbent with all key personnel already in place),” the Army unreasonably determined that the proposals were essentially equal for this factor and neither has an advantage over the other.  Protester’s Comments at 5. 

The Army contends that it reasonably evaluated the proposals in accordance with the TOR criteria.  The agency argues that the evaluators and SSA properly found numerous areas in which both BRTRC and SAIC demonstrated their ability to recruit and retain a workforce with the requisite qualifications.  For example, with respect to BRTRC, the SSA found a unique strength in the awardee’s plan to [DELETED].  AR, Tab 10, SSD, at 2.  With respect to the protester, the SSA found a strength for SAIC’s plan to [DELETED].  Id.  As another example, the agency provides that the SSA found that the proposals of BRTRC and SAIC presented significant advantages because of their [DELETED].  Id.  Moreover, the SSA found significant advantages insofar as both offerors [DELETED].  Id.  The agency contends that these management strengths, and the offerors’ very detailed transition plans, resulted in the SSA finding that on balance the proposals were approximately equal overall for this factor.  See id.

Here, the agency’s evaluation is unobjectionable.  As the agency asserts, the record demonstrates that the evaluators properly assessed BRTRC’s ability to recruit and retain a workforce with the requisite qualifications, and reasonably assigned various strengths to BRTRC’s approach.  Contrary to SAIC’s allegations, the Army’s evaluation was consistent with the TOR evaluation criteria.  Moreover, we find that SAIC’s allegation with regard to the SSA’s finding that the proposals were approximately equal has no merit and simply reflects the incumbent’s disagreement with the award decision.  The SSA noted the various strengths of both proposals and concluded that overall the proposals were approximately equal in merit based on these evaluated findings.  While the protester contends that the agency could not have concluded that the proposals were approximately equal because SAIC had the incumbent employees on staff while BRTRC had to recruit these individuals, this aspect is but one area the agency considered in making its determination of overall equivalency; we see nothing improper with the agency’s evaluation.

Technical Approach Evaluation

SAIC contends that the Army’s award decision was based on unstated evaluation criteria, namely BRTRC’s technical approach to performing configuration management and trouble tickets.  In this respect, SAIC points out that configuration management and trouble tickets were “discrete, separately numbered sections of the PWS that were not among the five sections of the PWS that the Army directed offerors to address and that would be the focus of the [] evaluation.”  Protester’s Comments at 7.  In SAIC’s view, the solicitation strictly limited the evaluation of technical approach to the five PWS sections identified in the TOR.

The Army argues that it reasonably evaluated BRTRC’s technical approach and contends that many of the PWS tasks are interrelated, regardless of their titles and enumerated sections.  For example, the agency points out that at least seven tasks within four of the TOR’s specified PWS sections implicate configuration management.  Similarly, the Army asserts that trouble tickets support is “clearly and explicitly within three tasks under” the PWS’s NIE field support requirements.  AR at 5.  The agency maintains that SAIC cannot claim that the agency’s evaluation of technical approach deviated from the TOR, because both SAIC’s and BRTRC’s proposals addressed aspects of their configuration management processes and trouble tickets as related to other PWS sections.

We agree and find that the Army reasonably evaluated BRTRC’s technical approach in accordance with the evaluation criteria.  The contemporaneous record of the agency’s evaluation includes lengthy technical evaluation reports that show that the TET assessed BRTRC’s technical approach by comparing virtually every section of the proposal to the corresponding five PWS sections identified as evaluation factors in the TOR.  See AR, Tab 9‑2, BRTRC TET Report, at 1‑50.  In this respect, consistent with the Army’s arguments, the PWS provisions at issue here are cross-referenced throughout the five PWS sections identified as evaluation factors in the TOR.  For example, PWS section 3.18, NIE Field Support, specifically identifies three tasks involving trouble tickets.  See AR at 6; PWS §§ 3.18.1, 3.18.4, 3.18.7.  Notably, SAIC’s proposal addresses trouble tickets under the proposal’s discussion of PWS section 3.18, NIE Field Support.  SAIC Tech. Approach Proposal, at II‑21.  Similarly, PWS section 3.4, Engineering Design, requires the contractor to plan and host periodic reviews for each platform configuration.  See PWS § 3.4.9.  Again, SAIC’s proposal discusses configuration management in this context.  AR at 19; Tab 4‑2, SAIC Tech. Approach Proposal, at II‑8.  On this record, we find nothing unreasonable with the agency’s consideration of configuration management and trouble tickets, as the protester alleges.  Sci. Applications Int’l Corp., B-406899, Sept. 26, 2012, 2012 CPD ¶ 282 at 5, 12 (an agency in its evaluation of proposals is permitted to take into account specific, albeit not expressly identified, matters that are logically encompassed by, or related to, the stated evaluation criteria).

Finally, SAIC also complains that “[d]espite the Army’s identification of five strengths and no weaknesses in SAIC’s proposal, the Army assigned an adjectival rating of ‘Acceptable’ to the proposal and assigned an ‘Outstanding’ rating to BRTRC’s technical proposal.”  Protester’s Comments at 10; see Protest at 12 (“SAIC, at a minimum, should have been given a ‘Good’ if not ‘Outstanding’ technical rating.”).  However, as we discuss above, and as explained in many decisions issued by our Office, the essence of the evaluation is reflected in the evaluation record itself, not the adjectival ratings.  See Sci. Applications Int’l Corp., B-407105 et al., supra (denying SAIC’s protest that each of its strengths under the technical/management factor should have been evaluated as significant strengths and that its proposal should have been rated outstanding).  Moreover, there is no legal requirement that an agency must award the highest possible rating, or the maximum point score, under an evaluation factor simply because the proposal contains strengths and/or is not evaluated as having any weaknesses.  See Applied Tech. Sys., Inc., B-404267, B-404267.2, Jan. 25, 2011, 2011 CPD ¶ 36 at 9.  Here, the record demonstrates that the SSA properly evaluated the proposals and also exercised her discretion in differentiating between the competing proposals and determining which provided the best value to the agency.  AR, Tab 10, SSD, at 1-7.  We have no basis to sustain SAIC’s protest of the Army’s evaluation of the offerors’ technical approaches. 

The protest is denied.

Susan A. Poling
General Counsel



[1] Strategic Services Solutions contracts, also known as the “TS3” family of contracts, are multiple‑award IDIQ contracts awarded by the Army’s Tank‑Automotive and Armaments Command (TACOM).  TOR amend. 1 § A.1.

[2] The period of performance includes a 30‑day phase‑in period, an 8‑month base period, four option years, and a 4‑month option term.  TOR amend. 1 § F.1. 

[3] Magnitude was defined as similar in dollar value, including at least $5 million in labor.  TOR amend. 1 § L.8.b.  Complexity was defined as efforts demonstrating:  (1) integration of multiple C4ISR systems onto military tactical platforms; and (2) system integration efforts performed on at least 200 military tactical vehicles within a 12‑month period.  Id.  The TOR did not define scope.

[4] Under the TOR’s adjectival rating scheme, a substantial confidence rating reflected a high expectation of successful performance based on the offeror’s past performance record.  See TOR § M.2.a.i.  An outstanding technical approach rating reflected an exceptional approach and understanding of the requirements, with strengths that far outweigh any weaknesses, and a very low risk of unsuccessful performance; an acceptable rating reflected an adequate approach and understanding, with strengths and weaknesses that are offsetting or will have little or no impact on performance, and no more than a moderate risk of unsuccessful performance.  Id. at § M.2.b.i.

[5] The SSA concluded that BRTRC’s past performance was “slightly more advantageous” than SAIC.  AR, Tab 10, SSD, at 2.

[6] As reflected in the table above, the value of the task order at issue exceeds $10 million.  Accordingly, this procurement is within our jurisdiction to hear protests related to the issuance of task orders under multiple‑award IDIQ contracts.  10 U.S.C. § 2304c(e)(1)(B).

[7] With respect to SAIC, the PET found that all of the firm’s past performance contracts involved highly complex systems that exceeded TOR requirements, and the evaluators assessed six strengths and no weaknesses in SAIC’s performance record.  See AR, Tab 6‑1, SAIC PET Report, at 22‑23.
