
AAR Integrated Technologies

B-416859.4 Jun 11, 2019

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  AAR Integrated Technologies

File:  B-416859.4

Date:  June 11, 2019

Paul R. Hurst, Esq., Michael J. Navarre, Esq., and Caitlin T. Conroy, Esq., Steptoe & Johnson, LLP, for the protester.
W. Jay DeVecchio, Esq., Kevin P. Mullen, Esq., and Rachael K. Plymale, Esq., Morrison & Foerster LLP, for DRS Network & Imaging Systems, LLC, the intervenor.
Jered J. Leo, Esq., and Jonathan A. Hardage, Esq., Department of the Army, for the agency.
Charmaine A. Stevenson, Esq., and Laura Eyester, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protest challenging the agency’s evaluation of proposals and selection decision is denied where the record shows that both were reasonable and in accordance with the terms of the solicitation.

DECISION

AAR Integrated Technologies (AAR), of Huntsville, Alabama, protests the award of a contract to DRS Network & Imaging Systems, LLC (DRS), of Melbourne, Florida, under request for proposals (RFP) No. W15QKN-18-R-0037, issued by the Department of the Army, Army Materiel Command, for multipurpose standard automatic test equipment.  The protester challenges the agency’s evaluation of proposals and the selection decision.

We deny the protest.

BACKGROUND

The agency issued the RFP on March 23, 2018, using Federal Acquisition Regulation subpart 15.3 procedures, for the award of a fixed-price, indefinite-delivery, indefinite-quantity contract with five 12-month ordering periods.  Agency Report (AR), Tab 4, RFP, at 2, 99.[1]  The RFP stated that the agency sought to procure “the first non-rugged, low cost variant of the next generation general-purpose standard At-Platform Automatic Test System (APATS) component of the [Army’s] Integrated Family of Test Equipment (IFTE).”  Id. at 2.  The test equipment devices will be used throughout all levels of maintenance to test and diagnose highly complex communications equipment, other electronic commodity equipment, missiles, aircraft, and ground vehicles.  AR, Tab 4a, Detail Specification, at 2.  The devices will also host interactive electronic technical manuals and/or specific application software and be used to upload/download mission data or software.  Id.  The RFP stated that the contract would include a minimum guarantee for 40 first article test units and a contract ceiling of $111,277,000.  RFP at 2. 

The RFP stated that the following factors would be evaluated:  technical, price, past performance, and small business participation.  RFP at 99.  The technical factor included the following two subfactors, in descending order of importance:  performance and display.  Id. at 100.  The RFP further stated that the technical factor was “significantly more important” than the price factor, which was more important than the past performance factor, which was more important than the small business participation factor.  Id. at 99.  Award was to be made to the offeror whose proposal offered the best value to the government utilizing a tradeoff source selection methodology.  Id. at 2, 99.

In addition to a written technical proposal, the RFP required offerors to submit a sample device with standard accessories for testing.  RFP at 94.  The solicitation provided a specification that detailed threshold and objective requirements as well as a performance test plan for the device.  See generally AR, Tab 4a, Detail Specification; Tab 4b, Performance Test Plan.  The evaluation for the performance subfactor was to be based on the offeror’s sample device “meeting or exceeding the threshold benchmark score utilizing PassMark[®] Software Performance Test requirements as specified” in the detail specification.[2]  RFP at 101.  The evaluation for the display subfactor was to be based on “the ability of the device display to meet or exceed the threshold requirements” of resolution and brightness.  Id.

The agency received six proposals by the due date.  Contracting Officer’s Statement and Memorandum of Law (COS/MOL) at 7.  Following the initial evaluation of proposals, which included testing of the sample devices, the agency established a competitive range that included the protester, DRS, and a third offeror.  Id.  The threshold and objective requirements established by the detail specification, and the test results of the competitive range offerors’ sample devices, were as follows:

Technical Subfactor                                  AAR            DRS            Offeror 3

Performance-PassMark® Score
Threshold: 1400; Objective: 2000                     2065.7         3574.4         2007.3

Display-Resolution
Threshold: 1024 x 768; Objective: 1920 x 1080        1920 x 1200    1920 x 1280    1920 x 1200

Display-Brightness
Threshold: 400 nits[3]; Objective: 800 nits          800 nits       500 nits       800 nits

AR, Tab 4a, Detail Specification, at 9, 27; AR, Tab 10, Source Selection Evaluation Board (SSEB) Report, at 9-11, 37-39, 63-65.

Because the agency did not intend to retest the devices, the discussions held with the competitive range offerors did not relate to the technical factor and were limited to price and other matters.  COS/MOL at 7-8.  After completing its final evaluation, the agency selected DRS for award.  On September 24, 2018, AAR filed a protest with our Office, which we dismissed as academic after the agency advised it would reevaluate proposals and make a new award decision.  AAR Integrated Technologies, B-416859, Oct. 31, 2018 (unpublished decision).

Upon reevaluation, the agency reduced DRS’s technical rating from outstanding to good and increased AAR’s past performance rating from neutral confidence to satisfactory confidence.[4]  COS/MOL at 9.  As a result, all three offerors in the competitive range received an overall technical factor rating of good, with ratings of good in both the performance and display subfactors.  AR, Tab 10, SSEB Report, at 9.  All offerors also received a rating of satisfactory confidence for the past performance factor, and a rating of acceptable for the small business participation factor.  Id.  DRS’s proposed price was $84,236,735; AAR’s proposed price was $80,076,121; and the third offeror’s proposed price was $79,737,098.  Id.  On February 25, 2019, the agency again selected DRS for award.  On February 28, AAR received a debriefing.  This protest followed.

DISCUSSION

The protester challenges the agency’s evaluation under the technical factor, and argues that the agency failed to adhere to the RFP’s stated evaluation criteria.  AAR also argues that the agency’s evaluation of its past performance was arbitrary because it failed to consider relevant contract performance references.  In addition, the protester alleges that the best-value tradeoff was flawed because it deviated from the RFP’s evaluation criteria and failed to consider AAR’s price advantage.  As discussed below, we find the agency’s evaluation and selection decision to be reasonable.[5]

Technical Evaluation

AAR argues that the agency’s evaluation under the performance and display subfactors of the technical factor failed to adhere to the RFP criteria, and inconsistently credited proposals for exceeding objective requirements.  With regard to the performance subfactor, the protester argues that the agency improperly evaluated proposals against the higher objective requirement (2000) established in the RFP, rather than the minimum threshold requirement (1400).  Protest at 18.  According to the protester, this impropriety caused the agency to conclude that DRS’s proposal had a technical advantage under the performance subfactor rather than properly conclude that the offers were technically equivalent.  Id. at 17-20.  With regard to the display subfactor, AAR argues that the agency irrationally found the proposals technically equal although AAR’s proposal met the higher objective requirement (800 nits) for display brightness and DRS’s proposal did not.  Id. at 21-22.  AAR argues that it was inconsistent for the agency to find DRS’s proposal more advantageous for exceeding the objective requirements under the performance subfactor, and yet find the DRS and AAR proposals technically equal under the display subfactor.  Id.

The agency argues that it reasonably evaluated proposals under the technical factor.  Specifically, the agency argues that under the performance subfactor, all three proposals in the competitive range exceeded the objective requirement; however, DRS’s proposal significantly exceeded the objective and was properly assigned additional credit.  COS/MOL at 19-25.  The agency also argues that it reasonably evaluated proposals under the display subfactor and concluded that all proposals met the threshold requirements.  Id. at 25-26.

In reviewing a protest challenging an agency’s evaluation of proposals, our Office will not reevaluate proposals nor substitute our judgment for that of the agency, as the evaluation of proposals is generally a matter within the agency’s discretion.  Del-Jen Educ. & Training Group/Fluor Fed. Solutions LLC, B-406897.3, May 28, 2014, 2014 CPD ¶ 166 at 8.  Rather, we will review the record to determine whether the agency’s evaluation was reasonable; consistent with the stated evaluation criteria, applicable procurement statutes, and regulations; and adequately documented.  Shumaker Trucking & Excavating Contractors, Inc., B-290732, Sept. 25, 2002, 2002 CPD ¶ 169 at 3.  A protester’s disagreement with an agency’s evaluation, without more, does not show that it lacked a reasonable basis.  Jacobs Tech., Inc., B-411784, B-411784.2, Oct. 21, 2015, 2015 CPD ¶ 342 at 6.

Here, as noted, the record shows that all three offerors exceeded the objective requirement (2000) under the performance subfactor.  AAR’s sample device achieved a PassMark® score of 2065.7, which the SSEB identified as a strength.  AR, Tab 10, SSEB Report, at 10.  In pertinent part, the SSEB explained the strength as follows:

This performance exceeds the objective and is advantageous to the [g]overnment because it provides for decreased time to perform user commands resulting in less time needed to support job functions, such as diagnosing faulty Line Replaceable Units (LRUs) or performing software updates.  Higher performance is advantageous to the [g]overnment because future [s]oftware acquisitions generally require greater computational power, but this bid sample has demonstrated an ability to meet the future needs of resource demanding software programs.

Id.  However, DRS’s sample device achieved a PassMark® score of 3574.4, which the SSEB identified as a significant strength, and explained as follows:

This performance far exceeds the objective and is appreciably advantageous to the [g]overnment because it provides for greatly shortened time to perform user commands resulting in less time needed to support job functions, such as diagnosing faulty Line Replaceable Units (LRUs) or performing software updates.  This will decrease Mean Time To Repair (MTTR) as the PassMark[®] performance score is 178% greater than the objective requirement thereby requiring only 56% of the time to perform computing functions as would a bid sample with a PassMark[®] performance score of 2000.  This will lead to increased Army readiness which is an Army Chief of Staff priority.  The exemplary performance shown by the bid sample is appreciably advantageous to the [g]overnment because future [s]oftware acquisitions generally require greater computational power, but this bid sample has demonstrated an ability to meet the future needs of resource demanding software programs.

Id. at 38.
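For reference, the percentages in the SSEB’s narrative can be reproduced from the scores themselves, on the assumption, implicit in the SSEB’s reasoning, that the time needed to perform a computing task scales inversely with the PassMark® score:

\[
\frac{3574.4}{2000} \approx 1.79, \qquad \frac{2000}{3574.4} \approx 0.56
\]

That is, DRS’s score was roughly 179 percent of the 2000 objective (the SSEB’s “178% greater” figure reads as a percent-of comparison), so a device scoring 3574.4 would need about 56 percent of the time that a device scoring exactly 2000 would need to perform the same work.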

Likewise, under the display subfactor, all three offerors exceeded the objective requirement for the resolution criterion and the threshold requirement for the brightness criterion.  The evaluators identified a strength for AAR’s device because its display resolution of 1920 x 1200 pixels “exceed[ed] the objective screen resolution [of 1920 x 1080] and provides additional capability to the user.”  Id. at 12.  The evaluators also found that AAR’s device demonstrated a display brightness of 800 nits, “which exceeds the threshold [of 400 nits] and meets the objective requirement.”  Id. at 11.  Similarly, the evaluators identified a strength for DRS’s device because its display resolution of 1920 x 1280 pixels exceeded the objective requirement.  Id. at 40.  The DRS device was also found to exceed the threshold requirement, but not meet the objective, because it demonstrated a display brightness of 500 nits.  Id. at 39-40.

On this record, we find the agency’s evaluation to be reasonable.  As noted, under the performance subfactor, the offerors exceeded the threshold requirement of 1400 and the objective requirement of 2000.  The agency assigned AAR a strength for exceeding the objective requirement and DRS a significant strength for far exceeding the objective requirement.  We find no basis to question the agency’s conclusion that DRS’s device warranted a significant strength because its higher PassMark® score exceeded the objective requirement to a far greater degree than did AAR’s device, and the agency believed the higher score meant that the computational power of DRS’s device would meet the future needs of demanding software programs. 

Further, under the display subfactor, as noted, both offerors’ devices exceeded the objective requirement for display resolution, which the SSEB identified as a strength for each device.  Similarly, the SSEB found that both offerors’ devices exceeded the threshold requirement for display brightness, and that AAR’s device additionally met, but did not exceed, the objective requirement of 800 nits.  Thus, the record shows that both devices exceeded the threshold requirements, and the agency consistently identified strengths in the offerors’ proposals only when a device exceeded the objective requirements established in the detail specification.  Under these circumstances, we find the agency’s evaluation and assignment of a good rating to both the AAR and DRS proposals to be reasonable.

Past Performance

The protester argues that the agency’s evaluation of its past performance is unreasonable because the agency failed to properly credit AAR for its performance of subcontracts that are very similar to the RFP’s requirements.  Specifically, the protester challenges the agency’s conclusion that two subcontracts performed under a prime contract with the Kingdom of Saudi Arabia were not relevant because they were not government-related.  Protest at 24-26.  The agency argues that it properly did not consider the contracts because the agency “could not independently verify or trace these purchase orders back to a Government contract.”  COS/MOL at 34.  The agency further argues that even if its failure to consider these contracts was in error, AAR can demonstrate no prejudice because the past performance factor had no bearing on the source selection decision.  Id. at 35.

The evaluation of an offeror’s past performance is within the discretion of the contracting agency, and we will not substitute our judgment for reasonably based past performance ratings.  MFM Lamey Group, LLC, B-402377, Mar. 25, 2010, 2010 CPD ¶ 81 at 10.  Where a solicitation calls for the evaluation of past performance, we will examine the record to ensure that the evaluation was reasonable and consistent with the solicitation’s evaluation criteria and procurement statutes and regulations.  Divakar Techs., Inc., B-402026, Dec. 2, 2009, 2009 CPD ¶ 247 at 5.  In addition, the relative merit of an offeror’s past performance information is generally within the broad discretion of the contracting agency.  Lukos, LLC, B-416343.2, Aug. 13, 2018, 2018 CPD ¶ 282 at 8.  A protester’s disagreement with the agency’s judgment does not establish that an evaluation was unreasonable.  FN Mfg., LLC, B-402059.4, B‑402059.5, Mar. 22, 2010, 2010 CPD ¶ 104 at 7.

Here, the RFP stated that the “Offeror and its major/key subcontractors [would] be evaluated on the quality of their relevant and recent past performance, as it relate[d] to the probability of success on this contract.”[6]  RFP at 102.  The RFP stated that the agency would first establish whether a contract reference was recent and relevant, and then assign a performance confidence assessment based on how well the offeror performed on the recent and relevant contracts.  Id.  The record shows that the SSEB considered six of the ten contract references submitted for AAR and its major/key subcontractors to be relevant, and considered these six references when it assigned AAR a satisfactory confidence rating.  AR, Tab 10, SSEB Report, at 26-30.  Specifically, the SSEB stated, in pertinent part, as follows:

AAR had one contract that was determined to be [r]elevant with high remarks on the past performance questionnaire submitted.  In addition, the five subcontractor contracts that were submitted and found to be [r]elevant all had satisfactory or exceptional ratings. . . .  Given that there is one relevant contract for AAR and five for their subcontractors with no negative ratings or remarks, the [g]overnment has a reasonable expectation that the Offeror will successfully perform the required effort and therefore, the Overall Performance Confidence Assessment Rating Recommendation for AAR is “Satisfactory Confidence[.]”

Id. at 30.  In the best-value tradeoff analysis, the source selection official noted that all three competitive range offerors received a satisfactory confidence rating, and stated that the past performance factor “was not a significant differentiator in the source selection decision.”  AR, Tab 6, Source Selection Decision Document (SSDD), at 5.

The record further shows that the past performance questionnaire submitted by the prime contractor for AAR’s performance of the subcontract for the Kingdom of Saudi Arabia did not assign AAR the highest ratings, and indicated that AAR either “[m]eets [c]ontractual [r]equirements” rather than exceeding or failing to meet them, or rated AAR as “[s]atisfactory” rather than exceptional or unsatisfactory.  AR, Tab 17, AAR Past Performance Questionnaires, at 8-9.  Specifically, regarding the timeliness of performance for product deliverables, the questionnaire stated that “some of the required deliverable documentation (i.e. Technical Data Package and Technical Manuals) were delivered late.  It appeared personnel/resources were focused on other efforts instead of completing our contract.”  Id. at 8. 

On this record, we find that AAR has failed to establish that it was prejudiced by the agency’s evaluation of its past performance.  Competitive prejudice is an essential element of a viable protest; where the protester fails to demonstrate that, but for the agency’s actions, it would have had a substantial chance of receiving the award, there is no basis for finding prejudice, and our Office will not sustain the protest.  Swets Info. Servs., B-410078, Oct. 20, 2014, 2014 CPD ¶ 311 at 14.  AAR does not explain how or why the agency’s consideration of the subcontract would have improved its past performance rating such that it would have had a substantial chance of receiving the award.  In this regard, given the ratings assigned in the questionnaire, and the comment regarding the late delivery of required documentation, it is not clear how the agency’s consideration of this performance would have improved AAR’s rating under the past performance factor.  In addition, as discussed further below, past performance was not a discriminating factor in the selection decision.

Best-Value Tradeoff

The protester argues that the best-value tradeoff was fundamentally flawed because it deviated from the evaluation scheme and failed to consider AAR’s “substantially lower price.”  Protest at 26-31.  According to the protester, the agency “ignored the RFP’s direction to balance price and performance and failed to weigh AAR’s $4.16 million cost savings to the [U.S.] taxpayer as compared to DRS’s PassMark® rating and ignored AAR and DRS’s technical equivalency under [the performance subfactor] and AAR’s clear technical advantage under [the display subfactor].”  Id. at 28.  The agency argues that DRS was reasonably selected for award because its proposal provides the best value to the government.  COS/MOL at 36-45.

Where, as here, the RFP provides for a best-value tradeoff, the source selection official retains discretion to select a higher-priced but technically higher-rated submission, if doing so is in the government’s best interest and is consistent with the solicitation’s stated evaluation and source selection scheme.  All Points Logistics, Inc., B-407273.53, June 10, 2014, 2014 CPD ¶ 174 at 13-14.  The source selection official has broad discretion in determining the manner and extent to which he/she will make use of technical, past performance, and cost/price evaluation results, and this judgment is governed only by the tests of rationality and consistency with the stated evaluation criteria.  Id.  A protester’s disagreement with the agency’s determinations as to the relative merits of competing proposals, or disagreement with its judgment as to which proposal offers the best value to the agency, without more, does not establish that the source selection decision was unreasonable.  General Dynamics--Ordnance & Tactical Sys., B-401658, B-401658.2, Oct. 26, 2009, 2009 CPD ¶ 217 at 8. 

Here, the source selection official provided a detailed explanation of the selection of DRS for award.  Specifically, the selection official noted that:

AAR and [the third offeror] received a similar Passmark[®] Score – 2065.7 and 2007.3, respectively.  However, DRS received a Passmark[®] Score of 3574.4, which indicates approximately a 70% higher performance than AAR and [the third offeror] in the [p]erformance [s]ubfactor.  As such, DRS clearly provided the superior technical solution in the most important [s]ubfactor.  All offerors had a single strength in the [d]isplay [s]ubfactor, and DRS’s lower nits score (500 vs. 800) in relation to AAR and [the third offeror] will not negatively impact performance in any discernible way.

Because of the similarities in the AAR and [third offeror’s] proposals (they essentially offered the same tablet with an Intel® Dual Core i5-7Y57 processor running at 1.2 GHz with 8 GigaBytes of Random Access Memory), there is very little [t]echnical differences between them other than [p]rice.  [The third offeror] is the lowest priced offeror.  As such, this [t]radeoff [a]nalysis focuses on whether it is in the [g]overnment’s best interest to award to the technically superior offeror (DRS) or the lowest priced [third offeror].

AR, Tab 6, SSDD, at 5-6.  The selection official further explained that DRS’s technical solution was “the most advantageous to the [g]overnment because a Passmark[®] Score that is approximately 70% higher indicates that the [device] requires significantly less time to perform computing functions. . . the higher the score, the greater the unit’s performance. . . . ”  Id. at 6.  The higher score meant that the DRS solution “has the ability to perform 1.7 times as many computational functions as the other offerors.”  Id.  The selection official believed this was important because the agency “anticipated that future software acquisitions will require ever increasing computational power, but the performance level of the DRS solution reduces the likelihood of system slow-downs and obsolescence issues.”  Id.  Therefore, the selection official concluded that “the level of performance offered by [DRS’s] solution significantly surpasses the solutions offered by [the third offeror] and AAR.  It is in the best interest of the [g]overnment to award to DRS because their solution has significantly higher technical performance. . . .”  Id.
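The selection official’s “approximately 70%” and “1.7 times” figures likewise follow from the ratios of the reported PassMark® scores, again assuming performance scales linearly with the benchmark:

\[
\frac{3574.4}{2065.7} \approx 1.73, \qquad \frac{3574.4}{2007.3} \approx 1.78
\]

On that arithmetic, DRS’s score was roughly 70 to 80 percent higher than the scores achieved by AAR and the third offeror.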

With respect to price, the selection official stated that AAR ($80,076,121) and the third offeror ($79,737,098) were priced very similarly, and their performance was nearly identical.  Id.  With respect to DRS, the selection official explained that:

An award to DRS would cost the [g]overnment an additional $4,449,637, or 5.64%.  When looking closely at the pricing reports, DRS is priced only negligibly higher across various [h]ardware [contract line item numbers], including for the [device] Kits, and is more expensive with regards to [s]ervices.  Although [s]ervices are likely necessary during performance, the level of [s]ervices is unknown and it is likely that a higher performing [device] will require fewer of them.  Services are expected to include technical support, field trouble report analysis, and engineering change proposal support and assistance.  It is anticipated that the majority of these [s]ervices will involve user issues specifically in the realm of application execution.  As is typical with software applications, they will become more complex and require more computer processing power to execute properly, and given the nature of changing technology and unknown future requirements, it can be reasonably anticipated there will be less application and performance issues with DRS’s [device] than the products offered by AAR and [the third offeror].  Furthermore, when looking only at the [h]ardware CLINs, DRS is $3,409,264, or 4.38%, higher than the lowest priced [h]ardware offeror.  Although the costs of future software and performance upgrade [s]ervices are unknown, a 4.38% price premium for [h]ardware spread out over five (5) years appears to be well worth the cost to the [g]overnment for the best technical solution.

Id.
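For context, the competing price comparisons can be reconstructed from the proposed prices quoted above.  AAR’s claimed $4.16 million savings measures its own price against DRS’s, while the SSDD’s 5.64 percent premium corresponds to measuring DRS’s price against the lowest-priced third offeror:

\[
\$84{,}236{,}735 - \$80{,}076{,}121 = \$4{,}160{,}614
\]
\[
\frac{\$84{,}236{,}735 - \$79{,}737{,}098}{\$79{,}737{,}098} \approx 5.64\%
\]

(The numerator of the second calculation works out to $4,499,637; the SSDD’s reported $4,449,637 appears to reflect the same comparison.)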

Despite the protester’s argument to the contrary, the RFP clearly stated that the agency would make award using a tradeoff source selection methodology, and as noted, stated that the technical factor was significantly more important than price and all other factors.  RFP at 99.  Further, the performance subfactor was more heavily weighted than the display subfactor.  Id. at 100.  The source selection official noted that the technical factor included limited criteria to evaluate the performance and display of the offered devices, which resulted in the devices being equally rated, but nevertheless found that there were significant technical differences between the proposals.  Id.  The selection official concluded that “DRS offered a technically superior proposal that, in the [g]overnment’s opinion, is worth the price premium.”  AR, Tab 6, SSDD, at 7.  AAR’s disagreement with the source selection official’s judgment as to which proposal offered the best value to the agency, without more, does not establish that the source selection decision was unreasonable.  See American Corr. Healthcare, Inc., B-415123.3 et al., Jan. 2, 2018, 2018 CPD ¶ 85 at 7-8.

The protest is denied.

Thomas H. Armstrong
General Counsel

 

[1] The RFP was amended seven times.  Citations are to the conformed copy of the RFP provided by the agency.

[2] PassMark® Performance Test software, once downloaded to a computer, allows for an objective benchmark to determine the computer’s performance using a variety of different tests.  See https://www.passmark.com/products/pt.htm (last visited June 6, 2019).  The PassMark® software offers standard tests, summary results, and the overall “PassMark® Rating” result.  Id. 

[3] Per the detail specification, one nit is equal to one candela per square meter.  AR, Tab 4a, Detail Specification, at 77. 

[4] The technical factor and subfactors, and the small business participation factor, were to be assigned the following ratings:  outstanding, good, acceptable, marginal, or unacceptable.  RFP at 100-101, 103.  The past performance factor was to be assigned the following ratings:  substantial confidence, satisfactory confidence, neutral confidence, limited confidence, or no confidence.  Id. at 102-103.

[5] The protester raised multiple allegations.  While our decision here does not specifically discuss each and every argument and/or variation of the arguments, we have considered all of AAR’s arguments and find no basis to sustain the protest. 

[6] The RFP defined a relevant contract as one “demonstrating technical/management capabilities the same as or similar to those required to perform on this item.  Major/key subcontractors are defined as those that will be providing critical hardware or whose subcontract is for more than 25% of the total proposed price.”  RFP at 102.  Relevance was additionally defined as “[p]resent/past performance effort involved similar scope and magnitude of effort and complexities this solicitation requires.”  Id.  Not relevant was defined as “[p]resent/past performance effort involved little or none of the scope and magnitude of effort and complexities this solicitation requires.”  Id.  Recent was defined as “1) occurring within the past three (3) years, or 2) awarded earlier than three (3) years ago, but for which deliveries/performance occurred or were scheduled to occur within the past three (3) years.”  Id. 
