
Innovative Test Asset Solutions, LLC

B-411687; B-411687.2, October 2, 2015

Highlights

Innovative Test Asset Solutions, LLC (ITAS), of Tullahoma, Tennessee, protests the award of a contract to National Aerospace Solutions, LLC (NAS), of Reston, Virginia, under request for proposals (RFP) No. FA9101-13-R-0100, issued by the Department of the Air Force for test operations and sustainment (TOS) services at the Arnold Engineering Development Complex (AEDC), Arnold Air Force Base (AFB), Tullahoma, Tennessee. ITAS argues that the agency's evaluation of offerors' proposals and resulting award decision were improper.

We sustain the protest in part and deny the protest in part.


DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  Innovative Test Asset Solutions, LLC

File:  B-411687; B-411687.2

Date:  October 2, 2015

Robert J. Symon, Esq., Aron C. Beezley, Esq., Jennifer F. Brinkley, Esq., and Lisa A. Markman, Esq., Bradley Arant Boult Cummings LLP, for the protester.
Michael F. Mason, Esq., Thomas L. McGovern III, Esq., C. Peter Dungan, Esq., Michael D. McGill, Esq., and Christine Reynolds, Esq., Hogan Lovells US LLP, for National Aerospace Solutions, LLC, an intervenor.
Lt. Col. Mark E. Allen, Maj. Damund E. Williams, Lt. Col. Aaron E. Woodward, Gregory B. Porter, Esq., Capt. Brett A. Johnson, and Capt. Andrew S. Herzog, Department of the Air Force, for the agency.
Louis A. Chiarella, Esq., and Nora K. Adkins, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1.  Protest challenging the technical evaluation is sustained where the agency failed to reasonably evaluate technical risk in accordance with the terms of the solicitation.

2.  Protest challenging the agency’s evaluation of the awardee’s past performance is denied where the evaluation was reasonable and consistent with the solicitation’s stated evaluation criteria.

3.  Protest challenging agency’s cost realism evaluation is denied where the record demonstrates that the agency’s conclusions were reasonable.

4.  Protest that agency failed to give adequate consideration to awardee’s potential organizational conflict of interest (OCI) is denied, where the record shows that the agency investigated potential OCIs and reasonably concluded that awardee’s plan would adequately avoid, neutralize, or mitigate the potential conflicts of interest.

DECISION

Innovative Test Asset Solutions, LLC (ITAS), of Tullahoma, Tennessee, protests the award of a contract to National Aerospace Solutions, LLC (NAS), of Reston, Virginia, under request for proposals (RFP) No. FA9101-13-R-0100, issued by the Department of the Air Force for test operations and sustainment (TOS) services at the Arnold Engineering Development Complex (AEDC), Arnold Air Force Base (AFB), Tullahoma, Tennessee.  ITAS argues that the agency’s evaluation of offerors’ proposals and resulting award decision were improper.

We sustain the protest in part and deny the protest in part.

BACKGROUND

The Air Force’s AEDC is a national aerospace ground test facility that conducts tests, engineering analyses, and technical evaluations for research, system development, and operational programs of the Air Force, Department of Defense, other government agencies, and industry; AEDC provides the most comprehensive set of aerospace ground test facilities in the world.  Performance Work Statement (PWS) §§ 1.0, 1.3.  The test facilities include complexes of large wind tunnels, propulsion altitude and sea-level test cells, ballistics ranges, space chambers, and associated test instrumentation; the facilities can simulate flight conditions from sea level to 300 miles, and from subsonic velocities to Mach 20.  Contracting Officer’s Statement, July 28, 2015, at 7.

The RFP, issued on August 28, 2014, contemplated the award of a cost-plus-award-fee contract for a base year with seven 1-year options.[1]  In general terms, the PWS required the contractor to provide all personnel, materials, and supplies necessary to perform the specified TOS functions in the areas of test operations, technology development, equipment and facility sustainment, capital improvements, and support services at Arnold AFB as well as AEDC sites in White Oak, Maryland, and the National Aeronautics and Space Administration (NASA) Ames Research Center, Moffett Field, California.  PWS §§ 1.3, 2.0.  The solicitation established that contract award would be made on a best-value basis, based on three evaluation factors:  technical, past performance, and cost/price (hereinafter cost).[2]  RFP § M at 249-50.  The technical factor consisted of four equal subfactors:  technical operations, management approach, qualified personnel, and innovations and efficiencies.  Id. at 250.  The noncost factors, when combined, were significantly more important than cost.  Id.

Four offerors, including NAS[3] and ITAS[4], submitted proposals by the October 14 closing date.  An Air Force source selection evaluation board (SSEB)--comprised of technical, past performance, contract, and cost teams--evaluated offerors’ proposals using various adjectival rating schemes.  For the technical factor, proposals were rated as to their quality as follows:  outstanding, good, acceptable, marginal, or unacceptable.[5]  Past performance was rated using a performance confidence rating scheme as follows:  substantial confidence, satisfactory confidence, limited confidence, no confidence, or unknown confidence (neutral).[6]  The agency’s various evaluation rating schemes, as well as narrative definitions of the ratings themselves, were set forth in the solicitation.  RFP § M at 251-59.

On January 12, 2015, following the evaluation of offerors’ initial proposals, the Air Force established a competitive range that included the NAS and ITAS proposals.  The agency held several rounds of discussions with offerors, and received offerors’ final proposal revisions (FPR) by May 21.  The final evaluation ratings and costs of the NAS and ITAS FPRs were as follows:

 

                                   NAS                       ITAS
Technical
  Technical Operations            Outstanding/Low Risk       Outstanding/Low Risk
  Management Approach             Outstanding/Low Risk       Outstanding/Moderate Risk
  Qualified Personnel             Outstanding/Low Risk       Outstanding/Moderate Risk
  Innovations and Efficiencies    Acceptable/Low Risk        Acceptable/Moderate Risk
Past Performance                  Substantial Confidence     Substantial Confidence
Total Proposed Cost               $1,516,101,909             $1,466,513,329
Total Evaluated Cost              $1,516,101,909             $1,492,673,902


AR, Tab 29, Proposal Analysis Report (PAR), at 437, 513, 679, 709.

The agency evaluators also identified strengths and weaknesses in offerors’ proposals in support of the adjectival ratings assigned.  Specifically, the technical evaluation team (TET) identified a total of 22 strengths and no weaknesses in NAS’s technical proposal, and a total of 17 strengths and 3 weaknesses in ITAS’s technical proposal.[7]  Id. at 266-318, 514-73.  The agency’s evaluation report also detailed the analysis of offerors’ cost proposals and evaluated cost determination.

An Air Force source selection advisory council (SSAC) then conducted a comparative assessment of offerors’ proposals and recommended NAS for award.  AR, Tab 30, Comparative Analysis Report, at 1-26.  The SSAC found NAS’s proposal to be technically superior to those of the other offerors, and found that this technical superiority outweighed in each instance the associated cost premium.  Id. at 25-26.

The agency source selection authority (SSA) subsequently reviewed and accepted the ratings, findings, and recommendations of the agency evaluators.  AR, Tab 31, Source Selection Decision, at 1-23.  The SSA found that NAS’s proposal was superior to ITAS’s under the technical factor and provided the Air Force with the greatest performance confidence (the offerors were considered equal as to past performance).  The SSA also concluded that NAS’s technical advantages outweighed ITAS’s cost advantage, and that NAS’s proposal represented the best value to the government.  Id. at 22-23.  The Air Force awarded the contract to NAS on June 11.

On June 24, after the Air Force provided ITAS with notice of contract award and a debriefing, ITAS filed its protest with our Office.

DISCUSSION

ITAS’s protest raises numerous issues regarding the agency’s evaluation and resulting award decision.  The protester contends that the agency’s evaluation of its technical proposal was improper.  ITAS also alleges that the agency’s evaluation of NAS’s past performance was improper, that the agency’s cost realism evaluation was improper, and that the agency’s evaluation of NAS’s organizational conflict of interest was unreasonable.  Lastly, ITAS alleges that the agency’s best value tradeoff decision was flawed.  As detailed below, we find the Air Force’s evaluation of ITAS’s technical proposal was improper.  Although we do not specifically address all of ITAS’s remaining issues and arguments, we have fully considered all of them and find they provide no basis on which to sustain the protest.

Technical Evaluation of ITAS

ITAS protests the Air Force’s evaluation of its proposal under the technical factor.  Specifically, the protester contends that the three weaknesses identified in ITAS’s proposal, under the management approach, qualified personnel, and innovations and efficiencies subfactors, were improper.

The evaluation of an offeror’s proposal is a matter within the agency’s discretion.  VT Griffin Servs., Inc., B-299869.2, Nov. 10, 2008, 2008 CPD ¶ 219 at 4; IPlus, Inc., B-298020, B-298020.2, June 5, 2006, 2006 CPD ¶ 90 at 7, 13.  In reviewing a protest of an agency’s evaluation of proposals, our Office will examine the record to determine whether the agency’s judgment was reasonable and consistent with the stated evaluation criteria and applicable procurement statutes and regulations. Shumaker Trucking & Excavating Contractors, Inc., B-290732, Sept. 25, 2002, 2002 CPD ¶ 169 at 3.  While we will not substitute our judgment for that of the agency, we will sustain a protest where the agency’s conclusions are inconsistent with the solicitation’s evaluation criteria, undocumented, or not reasonably based.  BAE Sys. Info. & Elec. Sys. Integration Inc., B-408565 et al., Nov. 13, 2013, 2013 CPD ¶ 278 at 5; DRS ICAS, LLC, B-401852.4, B-401852.5, Sept. 8, 2010, 2010 CPD ¶ 261 at 4-5.  As detailed below, we find two of the three weaknesses assigned to ITAS’s technical proposal to be improper.

Management Approach Subfactor

ITAS protests the agency’s evaluation of its proposal under the management approach subfactor.  The protester contends that the weakness assigned for the offeror’s planned use of a standard production unit (SPU) metric, as part of the offeror’s performance measurement approach, was unreasonable.

The RFP established that, under the management approach subfactor, the agency would evaluate an offeror’s ability to implement an effective and efficient management approach.  RFP § M at 253.  The management approach subfactor included six measures of merit, including performance measurement:  “[t]his element . . . is met when the [o]fferor demonstrates a clear understanding of the use of objective metrics in performance measurement [in accordance with] PWS [§] 3.25.”  Id.

ITAS proposed, as part of its management approach proposal, the use of an SPU as the method to track labor efficiencies.[8]  AR, Tab 46, ITAS Initial Proposal, Vol. II, Technical Proposal, at 223.  In its FPR, ITAS set forth how its use of SPUs would support the measurement of productivity, while accounting for a weighted parameter that measured the output (production) value of each of the 19 types of test facility operands[9] utilized at AEDC.  ITAS acknowledged that its SPU was not, however, an entirely objective measurement:  “[e]ven though these SPU production coefficients are based on decades of experience . . ., we acknowledge that the exact and relative values of these coefficients have some degree of subjectivity manifested through independent analysis and engineering judgment.”  AR, Tab 23, ITAS FPR, Vol. II, Technical Proposal, at 245.  ITAS also proposed various mitigation measures that it believed minimized the government risks and concerns regarding objectively measuring productivity changes.  Id.
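For illustration only, the following minimal sketch (in Python) shows how a weighted, SPU-style productivity metric of this general kind can be computed.  The coefficient values, operand names, counts, and labor hours below are hypothetical assumptions made for the sketch; they are not drawn from ITAS’s proposal or the evaluation record.

# Minimal sketch of a weighted "standard production unit" (SPU) productivity
# metric.  All coefficients and figures are hypothetical illustrations.

# Hypothetical production coefficients (SPU value per completed operand) for a
# few of the 19 operand types used at AEDC.
SPU_COEFFICIENTS = {
    "wind_tunnel_run": 1.0,       # baseline output unit
    "altitude_cell_firing": 2.5,  # assumed to represent more output per operand
    "ballistic_range_shot": 0.4,
}

def spu_output(operand_counts):
    """Total weighted output (in SPUs) delivered in a period."""
    return sum(SPU_COEFFICIENTS[op] * count for op, count in operand_counts.items())

def productivity(operand_counts, labor_hours):
    """SPUs produced per labor hour; year-to-year changes in this ratio would be
    read as efficiency gains or losses under a metric of this kind."""
    return spu_output(operand_counts) / labor_hours

# Two notional contract years.  A similar operand mix can consume very different
# labor hours (test duration, non-operational phases), which is the
# separability concern the evaluators raised.
year_1 = {"wind_tunnel_run": 400, "altitude_cell_firing": 120, "ballistic_range_shot": 900}
year_2 = {"wind_tunnel_run": 380, "altitude_cell_firing": 140, "ballistic_range_shot": 850}

print(f"Year 1 productivity: {productivity(year_1, 500_000):.6f} SPU/hour")
print(f"Year 2 productivity: {productivity(year_2, 470_000):.6f} SPU/hour")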

The TET concluded that ITAS’s proposed use of SPUs represented a weakness.  AR, Tab 29, PAR, at 307.  The TET found that ITAS’s approach to establish fixed coefficients, based on the first year of the TOS contract, did not adequately account for the variation in the scope of work that was independent of the operand, thereby making it impossible to separate changes in the scope of work from changes in productivity or efficiency.  Id.  The agency evaluators also identified, as a key issue, that the relationship between operands and labor hours was not fixed; the labor hours for a test program were independent of the number of operands, or SPUs, required to complete the test.[10]  Id.  From this the TET concluded that because of the inability of ITAS’s SPU to separate out variations in test requirements and the influence of test duration, ITAS’s use of SPUs increased the government’s risk of being unable to assess the contractor’s year-to-year performance.  Id. at 308.

We find the agency’s evaluation here to be reasonable.  The record reflects that ITAS attempted to create a common, weighted scale by which to measure the output (production) value of each of the 19 types of tests utilized at AEDC that would apply over the life of the TOS contract.  The agency evaluators, however, reasonably determined that ITAS’s SPU measurement was not sufficiently nuanced to account for all the different tests that would occur across 59 separate test units, and that even tests involving the same measurement could involve different durations and/or amounts of non-operational time that ITAS’s SPU methodology did not adequately take into account.  While ITAS challenges the Air Force’s understanding of its SPU measurement, there is nothing in the record (including the multiple discussion questions and responses on this issue) to support the protester’s assertion.  Accordingly, we find the agency’s evaluation was reasonable and consistent with the stated evaluation criteria.

Qualified Personnel Subfactor

ITAS challenges the agency’s evaluation of its technical proposal under the qualified personnel subfactor.  The protester contends that the weakness assigned to its proposal, for failing to achieve its estimated cost savings, does not represent a technical risk to the agency to be properly taken into account under the qualified personnel subfactor.[11]  ITAS alleges that it was prejudiced by the agency’s evaluation error because it would have otherwise received a low risk rating.  Id.  We agree.

The RFP established that, under the qualified personnel subfactor, the agency would evaluate the offeror’s proposed approach to provide personnel with the experience and capabilities to understand, lead, manage, and execute the PWS requirements.  RFP § M at 254.  The qualified personnel subfactor included two measures of merit:  key personnel and staffing plan.  Id.  For the latter, the agency would evaluate whether an offeror presented an effective plan for ensuring that the workforce provided would meet PWS requirements over the life of the contract, including its proposed staffing, recruitment, retention, and training plans.[12]  Id.

With regard to the agency’s evaluation, as stated above, the RFP set forth a technical evaluation scheme under which the Air Force was required to evaluate proposals for both technical merit and technical risk.  RFP § M at 250.  As relevant here, the technical risk evaluation was to consider “the risk associated with the technical approach in meeting the requirement,” including assessments of “potential to cause disruption of schedule, increased costs or degradation of performance,” and need to increase government oversight.  Id. at 251-52.

ITAS submitted its qualified personnel proposal, which addressed both the offeror’s key personnel and staffing plan.  AR, Tab 23, ITAS FPR, Vol. II, Technical Proposal, at 296-314.  ITAS proposed an initial staffing level of [DELETED] full time equivalents (FTE), of which 99 percent were from the incumbent workforce.  Id. at 307.  Additionally, relevant to the protest here, ITAS’s cost proposal included various “Day One” cost savings measures (e.g., Day One fiscal year (FY) 14/15 productivity gains) that the offeror believed supported its decision to deviate from the government-provided workload estimates;[13] ITAS’s Day One cost savings measures were not part of the offeror’s qualified personnel (technical) proposal.

The TET found that ITAS’s qualified personnel proposal met the RFP’s measures of merit, and that the offeror’s staffing plan provided a clear and comprehensive recruitment, retention, and training strategy.  AR, Tab 29, PAR, at 309-11.  The TET also identified two strengths in ITAS’s proposal--its key personnel, and that the proposal provided for an initial critical skills analysis as well as an initial staffing level that would not reduce critical skills.  Id. at 311-12.  The TET further identified a weakness in ITAS’s staffing plan for “Day One Savings Cost Risk.”  Id. at 312.  In this regard, the TET determined ITAS’s Day One savings cost risk to be a technical risk as a result of the agency cost evaluation team’s (CET) conclusion that not all of ITAS’s Day One estimated cost savings were realistic.  The TET described the technical risk as follows:

Staffing plan approach to achieving Day One savings represents Moderate technical risk of increased cost [], as indicated by the assessed $20.6 [million] [most probable cost] adjustment.  ITAS has proposed a number of mitigations to achieving the Day One savings they have proposed . . . .  While many of their approaches will be successful in reducing the personnel required, the Government [subject matter experts] have assessed that their approach, including their proposed mitigations will not be completely successful in mitigating the risk, and there is some remaining risk.

Id. at 312.

ITAS argues that the assessment of a weakness in its evaluation of the offeror’s staffing plan was unreasonable, insofar as the agency identified no technical risk in the offeror’s staffing plan (e.g., insufficient staffing, inadequate compensation, unreasonable recruitment strategy).  The Air Force argues that it was proper to consider ITAS’s cost risk to also be a technical risk in light of the technical risk rating definitions.  We disagree.

The RFP established that the agency would evaluate both the quality, and the risk, of an offeror’s technical approach.  The technical risk ratings were to be based on risk associated with the offeror’s proposed technical approach to meeting the PWS requirements.  The technical risk ratings were to assess those aspects of an offeror’s technical proposal that had the potential to cause schedule, performance, or cost concerns.

The record reflects that the agency identified no technical shortcomings in either ITAS’s key personnel or staffing plan (the two components of the qualified personnel subfactor).  For example, the agency did not determine that ITAS’s proposed staffing levels were insufficient, such that the offeror’s plan had the potential to result in schedule, performance, or cost concerns.  Rather, the technical weakness assigned to ITAS’s qualified personnel proposal resulted entirely from the fact that ITAS did not adequately support its estimated cost savings--something that was neither part of, nor required to be part of, the qualified personnel proposal.  The agency’s interpretation that an offeror’s failure to demonstrate the achievement of all proposed cost savings, by itself, represents technical risk is not supported by the plain language of the solicitation.  Quite simply, while cost risk to the agency is to be taken into account as part of the cost realism evaluation, we find that lack of realistic cost savings does not by itself represent technical risk to be taken into account under the qualified personnel subfactor.  Thus, we find the evaluators’ application of technical risk here is unreasonable. 

Innovations and Efficiencies Subfactor

Similarly, ITAS also challenges the agency’s evaluation of its technical proposal under the innovations and efficiencies (I&E) subfactor.  The protester contends that the weakness assigned to its proposal, for failing to achieve estimated cost savings, also does not represent a technical risk to the agency to be properly taken into account under this technical subfactor.  Again, we agree.

The RFP established, under the I&E technical subfactor, that the criterion was met when the offeror proposed sound innovations and efficiencies with measurable savings over the government-provided, FY16 workload estimates and/or qualitative improvements to the PWS requirements.  RFP § M at 255.

ITAS submitted its proposal, which included 23 innovations and efficiencies, as well as additional estimated cost savings that would begin “Day One.”  AR, Tab 23, ITAS FPR, Vol. II, Technical Proposal, at 315-72.  ITAS’s proposal detailed each proposed innovation, the PWS area(s) affected, and the estimated cost savings in both FTEs and dollars.  Relevant to the protest here, ITAS proposed I&E #22, “culture of innovation and transformation,” which represented the offeror’s approach to a continuous improvement process.  Id. at 364-66.

The TET found that ITAS proposed sound innovations and efficiencies with measurable savings over the FY16 workload estimates and/or qualitative improvements over the PWS requirements.  AR, Tab 29, PAR, at 313.  Specifically, the evaluators stated as follows:

ITAS proposes a total of 24 I&Es that are closely related to the operation of AEDC, and will improve AEDC’s efficiency as well as save operating cost, or will provide qualitative improvements over the [g]overnment provided PWS.  Each I&E will have an identified owner and will be matched to elements of the [statement of objectives].  Overall, this plan will provide sound innovations and efficiency opportunities with identified measurable savings over the life of the contract. . . .  Their I&Es cover a variety of cost reductions and performance improvements.[14]

Id. at 313.

The TET identified one weakness in ITAS’s technical proposal, specifically that the offeror had overstated the estimated cost savings that would result from I&E #22.  Id. at 318.  In its FPR, ITAS estimated the cost savings for I&E #22 to be $23.9 million; the agency’s cost realism evaluation found the estimated cost savings to be $18.5 million, thereby resulting in an upward adjustment of $5.5 million to ITAS’s evaluated cost.  The TET found that although ITAS had identified the risks to fully achieving the estimated cost savings, and proposed various mitigation strategies, “some technical risk of increased cost remains.”  Id.

We find that the evaluation weakness here was also improper.  The record reflects that the evaluators found no technical risk to the agency in ITAS’s I&E #22 that would have warranted a weakness.  In fact, the record reflects that the agency found that this I&E, like the rest, was a sound innovation that would improve AEDC’s efficiency, save operating costs, and provide qualitative improvements.  The only shortcoming the agency evaluators identified was that ITAS had not fully supported the estimated cost savings from I&E #22, which represented a cost risk to the agency.

The record reflects, once again, that the TET did not identify technical weaknesses that could, among other things, increase cost (such that it reasonably represented technical risk).  Rather, the agency evaluators essentially inverted the evaluation rating scheme, and improperly considered cost risk to be technical risk, even when no technical shortcoming had been identified.  We find this application of technical risk to be unreasonable--the lack of fully-supported cost savings estimates does not, by itself, represent technical risk under the RFP’s stated evaluation criteria.

We also find that ITAS was prejudiced by the errors in the agency’s technical evaluation.  Competitive prejudice is an essential element of a viable protest; where the protester fails to demonstrate that, but for the agency’s actions, it would have had a substantial chance of receiving the award, there is no basis for finding prejudice, and our Office will not sustain the protest.  Swets Info. Servs., B-410078, Oct. 20, 2014, 2014 CPD ¶ 311 at 14; see Statistica, Inc. v. Christopher, 102 F.3d 1577 (Fed. Cir. 1996).  The record reflects that the agency, when performing a comparative assessment of offerors’ proposals, found NAS to be technically superior based on an integrated assessment of both strengths and weaknesses (i.e., the combined value of all strengths and the total impact of all weaknesses).  AR, Tab 30, SSAC Report, at 4, 16-22; Tab 32, Source Selection Decision, at 12-18.  While we recognize that NAS’s proposal was found to have technical strengths that ITAS did not (and which the protester does not challenge), the record does not support the agency’s assertion that it was only NAS’s strengths upon which the SSA based his cost/technical tradeoff decision.  Consequently, we conclude that the agency’s actions here were prejudicial to the protester.  See West Sound Servs. Group, LLC, B-406583.2, B-406583.3, July 3, 2013, 2013 CPD ¶ 276 at 17.

Past Performance Evaluation of NAS

ITAS also protests the agency’s evaluation of NAS’s past performance.  The protester alleges that the Air Force’s relevance and performance quality assessments of NAS’s managing partner, Bechtel, were materially flawed.  ITAS also alleges that the agency placed undue emphasis on the past performance of NAS team member GP Strategies.

Our Office will examine an agency’s evaluation of an offeror’s past performance only to ensure that it was reasonable and consistent with the stated evaluation criteria and applicable statutes and regulations, since determining the relative merit or relative relevance of an offeror’s past performance is primarily a matter within the agency’s discretion.  Richen Mgmt., LLC, B-409697, July 11, 2014, 2014 CPD ¶ 211 at 4.  A protester’s disagreement with the agency’s judgment, without more, does not establish that an evaluation was improper.  AT&T Corp., B-299542.3, B-299542.4, Nov. 16, 2007, 2008 CPD ¶ 65 at 19.

The solicitation instructed offerors to submit up to three recent, relevant references for each joint venture partner, and up to five for each major subcontractor.  RFP § L at 229.  Similarly, the RFP established that the agency’s past performance evaluation would consider both the relevance (i.e., scope, magnitude, complexity, and how closely the work related to the technical subfactors) and the quality (i.e., how well the contractor previously performed) of the offeror’s prior work.[15]  Id., § M at 256-58.

NAS submitted a total of eight past performance references; these included three for Bechtel and one for GP Strategies--covering GP Strategies’ work on the incumbent AEDC contract as an ATA joint-venture member.  AR, Tab 66, NAS FPR, Vol. III, Past Performance, at 12.  The PPET separately identified three additional Bechtel contracts as relevant and considered them as part of its evaluation, including Bechtel’s Department of Energy NNSA Y12 National Security Complex maintenance and operation (DoE Y12) contract.  AR, Tab 29, PAR, at 576-77.  Additionally, the agency evaluators were aware that joint venture members of both ITAS (i.e., Jacobs, PAE) and NAS (i.e., GP Strategies) pointed to the incumbent AEDC contract as evidence of their past performance.  AR, July 28, 2015, at 73-76.

The PPET performed a detailed evaluation of the relevance and performance quality of each NAS reference.  AR, Tab 29, PAR, at 579-676.  Based on the determination that most references were relevant or very relevant, and with very good or exceptional quality, the PPET rated NAS’s performance confidence as substantial.[16]  Id. at 677.

Our review of the record leads us to conclude that the agency’s past performance evaluation of NAS was unobjectionable, insofar as it was reasonable, consistent with the stated evaluation criteria, and adequately documented.  For example, ITAS argues that the Air Force should not have considered Bechtel’s DoE Y12 contract as relevant to the TOS effort, and argues that if the agency did, it should have considered certain major security lapses on that contract to provide evidence of poor past performance.[17]  We find the agency reasonably assessed the relevancy of Bechtel’s DoE Y12 reference in accordance with the RFP, AR, Tab 29, PAR, at 638-43, as well as the quality of the offeror’s performance, including the security lapses to which the protester cites.  Id. at 671-76.  While ITAS argues that the work performed on the DoE Y12 contract is substantially different than the TOS PWS, the record reflects the agency’s rationale for its reasonable conclusion that the two efforts are similar in management approach, qualified personnel, and innovations and efficiencies.  The protester’s disagreement, without more, does not establish that the evaluation was unreasonable.

ITAS also argues that the agency’s past performance evaluation of the awardee was flawed by improperly giving NAS team member, GP Strategies, favorable credit for performance of the incumbent AEDC contract.  As set forth above, because the agency could not determine the role performed by the ATA joint-venture members, it gave each ATA team member (including GP Strategies) credit for the entire AEDC effort.  ITAS alleges that GP Strategies was responsible for only 5 percent of the work performed by the ATA joint venture and had no substantive role in managing any portion of the incumbent AEDC contract; thus, the protester argues, GP Strategies’ sole past performance reference is not relevant in magnitude to the work here.  ITAS fails to establish, however, how the agency evaluators would have been aware of these alleged facts, and has not cited to any documents in the record (other than its own protest) in support thereof.  See ITAS Comments, Aug. 7, 2015, at 31.  Accordingly, we find the agency’s decision to give each ATA joint-venture member credit for the entire incumbent contract to be reasonable, and conclude that this approach does not provide a basis on which to sustain the protest.

Cost Realism Evaluation of ITAS

ITAS challenges the agency’s cost realism evaluation.  Specifically, the protester alleges that:  the agency’s upward adjustments to ITAS’s proposed cost savings were unreasonable; the agency treated offerors unequally regarding proposed cost savings attributable to craft/union workforce flexibility; and the agency’s mechanical reliance upon a statistical modeling tool (“Crystal Ball”) for determining the amount of the cost realism adjustments was arbitrary.[18]  ITAS Comments, Aug. 7, 2015, at 13-26.  ITAS argues that but for the Air Force’s cost realism evaluation errors, the protester’s cost advantage over NAS would have been approximately $99 million (rather than approximately $24 million).  Id. at 14.

When an agency evaluates a proposal for the award of a cost-reimbursement contract, an offeror’s proposed estimated costs are not dispositive because, regardless of the costs proposed, the government is bound to pay the contractor its actual and allowable costs.  Wyle Labs., Inc., B-407784, Feb. 19, 2013, 2013 CPD ¶ 63 at 8; American Tech. Servs., Inc., B-407168, B-407168.2, Nov. 21, 2012, 2012 CPD ¶ 344 at 5; FAR § 15.404-1(d).  Consequently, the agency must perform a cost realism analysis to determine the extent to which an offeror’s proposed costs are realistic for the work to be performed.[19]  An agency is not required to conduct an in-depth cost analysis, or to verify each and every item in assessing cost realism; rather, the evaluation requires the exercise of informed judgment by the contracting agency.  See Cascade Gen., Inc., B-283872, Jan. 18, 2000, 2000 CPD ¶ 14 at 8. Further, an agency’s cost realism analysis need not achieve scientific certainty; rather, the methodology employed must be reasonably adequate and provide some measure of confidence that the proposed costs are reasonable and realistic in view of other cost information reasonably available to the agency as of the time of its evaluation.  See SGT, Inc., B-294722.4, July 28, 2005, 2005 CPD ¶ 151 at 7.  We review an agency’s judgment in this area to see that the agency’s cost realism evaluation was reasonably based and not arbitrary.  Hanford Envtl. Health Found., B-292858.2, B-292858.5, Apr. 7, 2004, 2004 CPD ¶ 164 at 8.  We find no merit to ITAS’s various challenges to the agency’s cost realism evaluation.

For example, ITAS argues that the agency’s cost realism evaluation was improper and disparate regarding craft/union workforce flexibility.  Both ITAS and NAS proposed, as an I&E, a flexible union workforce.[20]  AR, Tab 46, ITAS Initial Proposal, Vol. II, Technical Proposal, at 128 (I&E #1, craft workforce flexibility); Tab 60, NAS Initial Proposal, Vol. II, Technical Proposal, at 235 (I&E #3, flexible union workforce rules).  In their FPRs, ITAS proposed a cost savings of $25.8 million (approximately [DELETED] FTEs) which the agency found to be realistic, while NAS proposed a cost savings of $54.9 million (approximately [DELETED] FTEs) which the agency also found to be realistic.  AR, Tab 57, ITAS FPR, Vol. IV, Cost Proposal, at 247; Tab 67, NAS FPR, Vol. IV, Cost Proposal, at 111; Tab 29, PAR, at 457, 700.

ITAS argues that the agency failed to meaningfully consider the magnitude of each offeror’s proposed cost savings resulting from a flexible workforce I&E when performing its cost realism evaluation.  The protester also contends the Air Force failed to evaluate offerors’ proposed cost savings equally, and had it done so, it would have essentially normalized the cost savings for the two offerors.  We disagree. 

As a preliminary matter, the record reflects that ITAS computed the cost savings for its proposed I&Es after making various “Day One” reductions to the government-estimated labor hours, while NAS computed the savings of its I&Es beginning with the government-estimated hours; thus, it is unclear that the offerors measured the cost savings for a flexible workforce from a common starting point.  Additionally, what the offerors were proposing for their respective workforce flexibility I&E was not exactly the same, such that there were elements in NAS’s I&E that were not in ITAS’s.  See AR, Sept. 25, 2015, at 6.  Lastly, the agency evaluators found that NAS’s cost savings measures, while larger in magnitude than ITAS’s, were fully supported:  “The [offeror’s] ability to provide savings due to relaxing the work rules is clear.  NAS proposed a solid and well thought-out approach to negotiations.  They have extensive experience with many of our local unions at the National level.”  AR, Tab 29, PAR, at 700.  On this record, we find no merit to the protester’s assertion that the agency’s cost realism evaluation failed to consider the size of each offeror’s proposed cost savings amount or was otherwise unequal.

ITAS also alleges, with regard to its proposed “Day One Innovation” savings, that the agency’s discussions were misleading and resulted in the offeror involuntarily raising its proposed cost by $29 million.  ITAS Comments, Aug. 7, 2015, at 17-20; see Protest, June 24, 2015, at 27-29, citing CFS-KBR Marianas Support Servs., LLC; Fluor Fed. Solutions LLC, B-401486 et al., Jan. 2, 2015, 2015 CPD ¶ 22.  We disagree.  In CFS-KBR, we sustained Fluor’s protest because we found that the agency misled Fluor in discussions about its proposed cost.  Id. at 6-7.  Here, by contrast, the record reflects that ITAS proposed approximately 27 separate cost savings measures, and the agency’s discussions with ITAS accurately informed the offeror of those instances, including Day One innovations, where the Air Force did not understand or find fully supported the claimed cost savings.  See AR, Tab 26, ITAS Evaluation Notices, at 1-598.  Quite simply, we find that the agency’s discussions with ITAS were reasonable, and the offeror made its own independent business judgment about how to respond to the agency’s meaningful discussions.  See Serco Inc., B-407797.3, B-407797.4, Nov. 8, 2013, 2013 CPD ¶ 264 at 5.

ITAS also contends that the amount of the agency’s cost realism adjustments was arbitrary and based on mechanical reliance on a software tool.  The record reflects that the CET, when analyzing the I&Es and other cost savings proposed by the offerors, considered the degree to which the cost savings initiatives were adequately supported.  The cost evaluators then assessed the likelihood (i.e., probability) of the offeror achieving its full, estimated cost savings by assigning “minimum,” “most likely,” and “maximum” chances of success to each proposed I&E in percentage terms.[21]  From this the cost evaluators computed an average of the most likely cost savings for each I&E.  Id.

The CET also used a software tool called “Crystal Ball” to assist in its evaluation.[22]  Id. at 454, 691.  Specifically, the Crystal Ball tool took account of the evaluators’ minimum, most likely, and maximum assessed probability determinations, which were applied to all offerors’ proposals.  The agency’s cost realism adjustments were then computed based on the Crystal Ball results.
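For illustration only, the following minimal sketch (in Python) shows how a Monte Carlo simulation of this general kind can translate minimum, most likely, and maximum probability-of-success assessments into a credited savings figure and a corresponding upward cost adjustment.  The triangular distribution, trial count, and function names are assumptions made for the sketch; the example parameters follow a single evaluator’s assessment of ITAS’s I&E #22 (see footnote [21]) and do not reproduce the agency’s actual Crystal Ball model or its final $5.5 million adjustment, which reflected multiple evaluators’ independent inputs.

import random

def expected_credited_savings(claimed_savings, p_min, p_mode, p_max, trials=100_000):
    """Monte Carlo estimate of the savings an evaluator might credit when the
    probability of fully achieving a claimed cost saving is uncertain.  The
    triangular distribution over the probability of success is an assumption of
    this sketch, not the agency's documented configuration."""
    total = 0.0
    for _ in range(trials):
        p_success = random.triangular(p_min, p_max, p_mode)  # draw a probability of success
        total += claimed_savings * p_success
    return total / trials

# Parameters patterned on one evaluator's assessment of ITAS's I&E #22:
# 55 percent minimum, 66 percent most likely, 77 percent maximum probability of success.
claimed = 23.9  # proposed savings, in millions of dollars
credited = expected_credited_savings(claimed, 0.55, 0.66, 0.77)
adjustment = claimed - credited  # upward most-probable-cost adjustment

print(f"Credited savings: ${credited:.1f}M; upward adjustment: ${adjustment:.1f}M")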

We find the agency’s cost realism analysis unobjectionable.  As a preliminary matter, the record reflects that the agency looked at each of the cost savings innovations individually and considered them in light of the information provided in the proposals.  In addition, the probability methodology used by the agency was not unlike the methodology used by ITAS in its proposal.  Specifically, ITAS elected to compute both the expected cost savings for each I&E, and a corresponding probability of success, from which it derived its proposed (i.e., discounted) cost savings amounts.  While the Air Force was not required to employ a mathematical methodology in order to perform an adequate cost realism evaluation, we do not find it unreasonable. 

OCI Evaluation of NAS

Lastly, ITAS alleges that NAS possesses an unmitigated organizational conflict of interest (OCI).  The protester contends that the debriefing it received provided no indication that the Air Force had given meaningful consideration to the corporate relationships between NAS team members and the original equipment manufacturers (OEM) that routinely have their products tested and evaluated at AEDC.[23]  The protester alleges that because of such corporate relationships with OEMs, NAS had an impaired objectivity OCI that would undermine the awardee’s ability to render impartial advice to the government.

The FAR provides that an OCI exists when, because of other activities or relationships with other persons or organizations, a person or organization is unable or potentially unable to render impartial assistance or advice to the government, or the person’s objectivity in performing the contract work is or might be otherwise impaired, or the person has an unfair competitive advantage.  See FAR § 2.101.

The situations in which OCIs arise, as described in FAR subpart 9.5 and decisions of our Office, can be broadly categorized into three groups:  biased ground rules, unequal access to information, and impaired objectivity.  See Organizational Strategies, Inc., B-406155, Feb. 17, 2012, 2012 CPD ¶ 100 at 5.  As relevant here, an impaired objectivity OCI exists where a firm’s ability to render impartial advice to the government will be undermined by the firm’s competing interests, such as a relationship to the product or service being evaluated.  FAR § 9.505-3; Pragmatics Inc., B-407320.2, B-407320.3, Mar. 26, 2013, 2013 CPD ¶ 83 at 3; PURVIS Sys., Inc., B-293807.3, B-293807.4, Aug. 16, 2004, 2004 CPD ¶ 177 at 7.

Contracting officers are required to identify and evaluate potential OCIs as early in the acquisition process as possible, and avoid, neutralize, or mitigate significant potential conflicts of interest before contract award.  FAR §§ 9.504(a), 9.505.  The responsibility for determining whether an actual or apparent conflict of interest will arise, and to what extent the firm should be excluded from the competition, rests with the contracting officer.  Alliant Techsystems, Inc., B-410036, Oct. 14, 2014, 2014 CPD ¶ 324 at 4; PricewaterhouseCoopers LLP; IBM U.S. Fed., B-409885 et al., Sept. 5, 2014, 2014 CPD ¶ 289 at 19.

We review the reasonableness of a contracting officer’s OCI investigation and, where an agency has given meaningful consideration to whether a significant conflict of interest exists, we will not substitute our judgment for the agency’s, absent clear evidence that the agency’s conclusion is unreasonable.  Alliant Techsystems, Inc., supra.  In this regard, the identification of conflicts of interest is a fact-specific inquiry that requires the exercise of considerable discretion.  Guident Techs., Inc., B-405112.3, June 4, 2012, 2012 CPD ¶ 166 at 7; see Axiom Res. Mgmt., Inc. v. United States, 564 F.3d 1374, 1382 (Fed. Cir. 2009).  A protester must identify “hard facts” that indicate the existence or potential existence of a conflict; mere inference or suspicion of an actual or potential conflict is not enough.  TeleCommunication Sys. Inc., B-404496.3, Oct. 26, 2011, 2011 CPD ¶ 229 at 3-4; see Turner Constr. Co., Inc. v. United States, 645 F.3d 1377, 1387 (Fed. Cir. 2011).

Here, the agency’s evaluation of NAS’s OCI was reasonable and the protester’s allegations generally fail for a lack of hard facts.  The record reflects that before the RFP was issued, the agency used a multi-step advisory process to inform potential offerors, from an OCI perspective, whether they were likely to be viable competitors.[24]  AR, Tab 3, Pre-solicitation Notice for OCI Mitigation Plans, May 20, 2014.  NAS thereafter submitted a detailed draft OCI mitigation plan, which the Air Force analyzed, and on which it provided feedback.  AR, Tab 4, NAS OCI Plan (various dates), at 1-211.  Following issuance of the RFP, NAS submitted an updated OCI mitigation plan as part of its initial proposal, AR, Tab 12, NAS Initial Proposal, Vol. V, OCI Plan, at 1-76, and a final mitigation plan with its FPR.  AR, Tab 20, NAS FPR, Vol. V, OCI Plan, at 1-85.  Among other things, NAS’s plan separately analyzed OCI issues for each member of the joint venture, including how impaired objectivity OCIs would be mitigated.  Id.

The contracting officer then evaluated NAS’s OCI mitigation plan and determined that:  NAS demonstrated a thorough understanding of the OCI concerns on the TOS effort and had provided a sufficient mitigation plan that detailed its comprehension of the issues and risks discussed in FAR Subpart 9.5; the NAS mitigation plan was fully compliant with the RFP requirements; the NAS team did not identify any existing unmitigatable OCIs; NAS provided a thorough review of existing contracted work for all joint venture members and major subcontractors; and each NAS team member would maintain a company-specific plan indicating how it would monitor, identify, and mitigate actual or apparent OCIs.[25]  AR, Tab 29, PAR, at 557-58.  While ITAS argues that NAS’s failure to disclose the existing SLI/Orbital ATK teaming relationship indicates that the awardee lacks a thorough understanding of OCI concerns, ITAS Comments, Aug. 28, 2015, at 1, we find the agency’s evaluation to be reasonable.  We conclude that this omission, if it is an omission, does not outweigh the many other ways that NAS’s OCI mitigation plan met the agency’s requirements and withstood the agency’s scrutiny.[26]

RECOMMENDATION

We recommend that the agency reevaluate ITAS’s technical proposal risk and, based on that reevaluation, make a new source selection determination.  If, upon reevaluation, ITAS is determined to offer the best value to the government, the Air Force should terminate NAS’s contract for the convenience of the government and make award to ITAS.  We also recommend that ITAS be reimbursed the costs of filing and pursuing the protest, including reasonable attorneys’ fees.  4 C.F.R. § 21.8(d)(1).  ITAS should submit its certified claim for costs, detailing the time expended and costs incurred, directly to the contracting agency within 60 days after receipt of this decision.  4 C.F.R. § 21.8(f)(1).

The protest is sustained in part and denied in part.

Susan A. Poling
General Counsel



[1] The RFP was subsequently amended four times.  Unless stated otherwise, all references are to the final version of the solicitation.

[2] The relative importance of the evaluation factors individually was unstated.  See RFP § M at 250.  There was one fixed-price contract line item (phase-in) which was not included in the total evaluated cost of offerors’ proposals.  RFP amend. 4 at 2.

[3] NAS is a joint venture consisting of Bechtel National, Inc., Sierra Lobo, Inc., and GP Strategies Corporation, along with teaming subcontractors.  Agency Report (AR), Tab 59, NAS Initial Proposal, Vol. I, Executive Summary, at 1.

[4] ITAS is a joint venture consisting of Jacobs Technology Inc., and PAE Applied Technologies, LLC, along with various teaming subcontractors.  AR, Tab 45, ITAS Initial Proposal, Vol. I, Executive Summary, at 1, 6.  ITAS’s Jacobs and PAE, and NAS’s GP Strategies, were the joint venture members of the incumbent AEDC contractor, Aerospace Testing Alliance (ATA).  Contracting Officer’s Statement, July 28, 2015, at 10-11.

[5] A separate technical risk rating--low, moderate, or high--was also to be assigned to offerors’ technical submissions.

[6] Past performance confidence assessments were based on evaluation of both the relevance (very relevant, relevant, somewhat relevant, or not relevant) and quality (exceptional, very good, satisfactory, marginal, unsatisfactory, or not applicable) of an offeror’s prior efforts.

[7] The past performance evaluation team (PPET) did not assign strengths and weaknesses to offerors’ past performance, but developed narrative findings in support of the adjectival ratings assigned.  Id. at 319-436, 573-678.

[8] The TET initially considered ITAS’s SPU an undefined performance metric, and raised the issue in discussions with the offeror.  AR, Tab 26, ITAS Evaluation Notices, at 192-231.  In response to discussions, ITAS provided additional information describing the SPU methodology.  Id. at 222-24. 

[9] Operand generally refers to a quantity upon which a mathematical operation is performed.  Here, ITAS uses operands to refer to the test units (e.g., shots, firings, runs) employed at different AEDC facilities.  AR, Tab 23, ITAS FPR, Vol. II, Technical Proposal, at 245; see also RFP, attach. L-8, at 244-46.

[10] The TET also found that ITAS’s SPU measurement did not adequately account for the non-operational phases of different tests, or the length of tests.  Id.

[11] ITAS does not dispute that the cost risk should be taken into account as part of the agency’s cost realism evaluation.  See ITAS Comments, Aug. 7, 2015, at 8.

[12] Similarly, the RFP instructed offerors that proposals were to discuss their approach to meeting the qualified personnel requirements.  RFP § L at 227.

[13] The RFP included estimated workloads by AEDC test facility (in operational units) as well as estimated annual labor hours for each PWS work breakdown element.  RFP, attach. L-8, at 243-47.

[14] The TET also found, with regard to ITAS I&E #22, that it too represented measurable savings over the FY16 workload estimates.  Id. at 317.

[15] The solicitation also established that prior efforts of less than $20 million annually or $100 million in total contract value would not be considered relevant.  RFP § M at 256.

[16] The PPET initially attempted, as part of its evaluation, to determine the portion of work previously performed by each ATA joint-venture member, and learned that ATA had performed as a fully-integrated (i.e., fully-populated) joint venture.  Lacking the ability to determine the role played by each ATA team member, the PPET decided that the way to treat offerors equally was to give each ATA team member (Jacobs, PAE, and GP Strategies) credit for favorable past performance of the incumbent AEDC contract, even while recognizing that none of these entities was wholly responsible for the incumbent work.  AR, July 28, 2015, at 73-76; Tab 29, PAR, at 331, 346, 605.

[17] ITAS does not challenge the relevance or the quality ratings assigned to the remaining five Bechtel references which the PPET considered.

[18] ITAS initially alleged that the cost realism evaluation was also flawed because the agency failed to accept the offeror’s entire estimated cost savings for I&E #22.  In response to the protester’s allegation, the Air Force provided a detailed rebuttal in its agency report.  ITAS’s comments on the agency report, however, failed to address the agency’s response.  Thus, we consider the protester to have abandoned this argument.  See Organizational Strategies, Inc., B-406155, Feb. 17, 2012, 2012 CPD ¶ 100 at 4 n.3.

[19] The end product of a cost realism analysis is the total estimated cost (i.e., “most probable cost”) that the agency realistically expects to pay for the offeror’s proposed effort, and it is the estimated cost, and not the offeror’s proposed cost, that must be the basis of the agency’s source selection determination.  Magellan Health Servs., B-298912, Jan. 5, 2007, 2007 CPD ¶ 81 at 13 n.13.

[20] The agency evaluators found the cost savings amounts initially proposed by both ITAS and NAS to be unrealistic, which then became a topic of discussions with each offeror. 

[21] For example, for ITAS’s I&E #22 (as provided in its FPR), one agency evaluator estimated that ITAS had a 55 percent minimum probability of success, 66 percent most likely probability of success, and a 77 percent maximum probability of success.  AR, Tab 29, PAR, at 481.  Each agency evaluator made their own independent determinations.

[22] Oracle Crystal Ball is a spreadsheet-based application for predictive modeling, forecasting, simulation, and optimization.  See http://www.oracle.com/us/products/applications/crystalball/overview/index.html; AR, July 28, 2015, at 97 n.26.  It is a tool that can be used for cost risk analysis, particularly when cost models are developed in Microsoft Excel.  ITAS Comments, Aug. 7, 2015, at 24 n.12.

[23] ITAS alleges, among other things, that NAS team member Bechtel has “numerous problematic relationships” with Boeing, Lockheed Martin, and ATK--all of whom are frequent and prominent users of AEDC test facilities.  Protest, June 24, 2015, at 31.

[24] Prospective offerors were advised that the OCI advisory process would be iterative in nature, with offerors submitting draft OCI mitigation plans that the agency would review and on which it would provide feedback.

[25] Additionally, in response to assertions raised in the ITAS protest regarding relationships between NAS team member Sierra Lobo, Inc. (SLI) and AEDC-user Orbital ATK (SLI and Orbital ATK have teamed for the environmental test and integration services (ETIS II) contract at NASA Goddard Space Flight Center), the contracting officer determined that NAS’s updated OCI plan adequately mitigated potential OCI issues regarding SLI.  AR, Tab 122, Contracting Officer’s Updated Determination, Aug. 24, 2015, at 1-7.

[26] ITAS also challenges the agency’s best value tradeoff determination, and argues the SSA failed to identify with any specificity the benefits in NAS’s proposal that outweighed ITAS’s $24 million cost advantage.  ITAS Comments, Aug. 7, 2015, at 35-40, citing, e.g., Trailboss Enters., Inc., B-407093, Nov. 6, 2012, 2013 CPD ¶ 232 (finding generalized statements insufficient to document the reasonableness of an agency’s best value determination); Si-Nor, Inc., B-282064, B-282064.2, May 25, 1999, 2000 CPD ¶ 159 (sustaining protest where tradeoff decision failed to explain why agency considered awardee’s evaluated advantage outweighed protester’s lower price).  In light of our determination that certain aspects of the evaluation of ITAS’s technical proposal were not reasonable, and our corresponding recommendations, we need not address this aspect of ITAS’s protest.
