Arktis Detection Systems, Inc.

B-416339; B-416339.2: Aug 10, 2018


DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  Arktis Detection Systems, Inc.

File:  B-416339; B-416339.2

Date:  August 10, 2018

Paul F. Khoury, Esq., Kendra P. Norwood, Esq., and Lindy C. Bathurst, Esq., Wiley Rein LLP, for the protester.
Andrew J. Baker, Esq., and Justin V. Briones, Esq., Department of Homeland Security, for the agency.
Stephanie B. Magnell, Esq., Amy B. Pereira, Esq., and Peter H. Tran, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1.  Protest challenging agency's exclusion of protester's proposal is dismissed, where the record shows that protester abandoned its initial protest allegation, and a similar--but materially different--protest ground was untimely filed.

2.  Protest that the agency's test evaluations were flawed is denied, where the agency admits the error but the record shows that the protester suffered no prejudice.

DECISION

Arktis Detection Systems, Inc. (Arktis), of Arlington, Virginia, protests the exclusion of its proposal from the competitive range for request for proposals (RFP) No. HSHQDN-17-R-0002, which was issued by the Department of Homeland Security (DHS), Customs and Border Protection, for replacement radiation portal monitors to provide passive security at U.S. ports of entry.  The protester alleges that, during the testing phase of the evaluation, the agency misevaluated its proposed monitoring system in several of the tests and that, without these errors, it would have been included in the competitive range.

We dismiss the protest in part and deny the protest in part.

BACKGROUND

On January 19, 2017, DHS released the solicitation for replacement radiation portal monitors used to detect radiological and nuclear threats at U.S. sea- and land-based ports of entry.  The agency sought a commercially available, off-the-shelf solution under Federal Acquisition Regulation (FAR) part 12 to replace and upgrade its existing radiation portal monitors.  Agency Report (AR), Tab 8, RFP, at 2.  The agency intended to award up to three indefinite-delivery, indefinite-quantity (IDIQ) contracts, each with a period of performance consisting of one 5-year base period and two 5-year options, for a total period of 15 years, to the offerors whose monitoring systems provided the best value to the agency, considering technical factors, past performance, and price.[1]  Id. at 2, 38-39, 82. 

The solicitation provided for a two-phase evaluation process, consisting of an evaluation of written proposals in phase I and live testing in phase II.  RFP at 38-39; AR, Tab 7, Test and Evaluation Plan, at 5.  As relevant to this protest, the phase II technical evaluation consisted of three factors:  operational effectiveness, operational suitability, and supportability.  RFP at 85.  The agency intended to award up to three contracts to the offerors providing the best value, considering the three phase II technical factors and the phase I factors of vendor capability, past performance, and price.  Id. at 78.  The phase II evaluation scheme provided that the technical factors were equally important and, together, more important than the past performance factor.  Id. at 39, 79.  The technical and past performance factors were significantly more important than price, although as the technical advantages between proposals decreased, the importance of the price difference would increase.  Id. at 85.

The operational effectiveness factor consisted of seven elements:  nuclear/radiological detection, selectivity, flexibility, noise tolerance, applicability, capacity, and discrimination.  Id. at 86.  There were five possible adjectival evaluation ratings for the operational effectiveness factor:  outstanding, exceptional, good, acceptable, and unacceptable.[2]  AR, Tab 6, Tech. & Past Performance Evaluation, at 11-12.  In addition, monitoring systems were required to meet six key performance parameters (KPPs).  Systems that did not meet the threshold level for one or more KPPs after phase II remained eligible for contract award and further testing, but would not be eligible for full-rate production orders.  RFP at 64. 

After phase I, the agency selected six monitoring systems, including Arktis', to participate in phase II testing.  AR, Tab 4, Competitive Range Determination, at 2;  Memorandum of Law (MOL) at 29.  The offerors provided the agency with complete monitoring systems for phase II testing, as well as software that permitted the agency to rerun--or replay--the test data in modeling and simulation scenarios in order to determine the system's performance against radiation sources and nuclear threats.  RFP at 65.  As part of the phase II test, offerors were "invited to the test site to participate in the initial setup and calibration of their systems" and to provide "operator/user training for members of the Government test team."  Id. at 73.

On June 12, 2017, the Arktis team arrived at the test site and set up its system, departing 4 days later.  AR, Tab 3, Lead Scientist Statement of Facts (SOF) ¶ 9.  The agency tested the monitoring systems over the following months.  Id.

On April 10, 2018, DHS advised Arktis that it had been eliminated from the competitive range because its system was not among the most highly rated proposals.  AR, Tab 5, Arktis Notice of Elimination, at 20.  DHS informed Arktis that the following adjectival ratings had been assigned to its proposal for each phase II non-price evaluation factor, and past performance:

Factor                      Rating
Operational Effectiveness   Unacceptable
Operational Suitability     Acceptable
Supportability              Acceptable
Past Performance            Acceptable

Id. at 1.[3]  The notice also advised Arktis of the weaknesses and deficiencies assigned to its monitoring system under the phase II elements.  See generally id.  As relevant to this protest, the following weaknesses and deficiencies are among the 14 weaknesses, five significant weaknesses, and two deficiencies assigned to Arktis' monitoring system under the operational effectiveness factor:

Element - Detection
FR-58:  Per ANSI [American National Standards Institute] standard, detect a minimum of 177 out of 180 trucks with neutron source Californium-252 (Deficiency)
OR-06a (KPP):  Meet ANSI standard for detection of various materials (Deficiency)

Element - Selectivity
FR-65-3:  Recognize and categorize naturally occurring radioactive materials (NORM) cargo with flux from Thorium-232 (Significant Weakness)
OR-08 (KPP):  Avoid nuisance alarms (Weakness)

Element - Noise Tolerance
OR-05:  Noise tolerance/false alarms; refer no more than 0.1 of tests for further review (Significant Weakness)

Element - Discrimination
OR-06 / OR-06a - FR-83:  Categorize radiation sources when background was suppressed due to shielding (Weakness)

Element - Capacity
FR-82:  Categorize radiation source during vehicle speed change (Weakness)

AR, Tab 8A, Radiation Portal Monitor System Requirements, at 5-6, 10-11, 18; AR, Tab 5, Arktis Notice of Elimination, at 3-8.  The agency provided a requested debriefing on April 23, and held the debriefing open until April 24 in order to respond to Arktis' written questions.  This protest followed on May 4.

DISCUSSION

Arktis protests its exclusion from the competitive range by raising numerous challenges to the agency's technical evaluation under the operational effectiveness factor.  Although our decision does not specifically address all of Arktis' protest grounds, we have fully considered them and have concluded that none provides a basis to sustain the protest.  We discuss below a selection of the protest grounds.[4]

Preset and Calibration

The protester contends that, when testing Arktis' system, the agency used the wrong preset setting because it applied the system's recommended preset rather than the high sensitivity preset.[5]  Comments & Supp. Protest at 4.  Arktis claims that, had the agency initially set the Arktis system to the high sensitivity preset, rather than Arktis' recommended preset, its system would have avoided "two Deficiencies [FR-58 and OR-06a], a Significant Weakness [FR-65-3], and a Weakness [FR-82]."  Id.

The RFP required offerors to "[p]rovide calibration instructions and vendor recommended settings to be used during testing."  RFP at 74.  For nuclear and radiological testing, the solicitation also stated that DHS would use the "settings as initially supplied by the [o]fferor."  Id. at 87.  The Arktis monitoring system had three standard preset sensitivity configurations, including a recommended preset setting, as provided below:

Parameter                                   Preset 1 (Low Alarm Rate)   Preset 2 (Arktis Recommended)   Preset 3 (High Sensitivity)
Alarms - FAR (gamma)                        0.0001                      0.001                           0.001
Alarms - Background Suppression             1.00                        1.00                            0.90
Categorization - Background Auto Adjust     True                        True                            False

Comments & Supp. Protest, Exh. 1, Appendix D, Sensitivity Configuration Presets, at 41.  The Arktis manual recommends using preset 2, and there is nothing in the record that provides guidance as to the specific environmental conditions that would direct a change from the recommended preset.[6]  Nevertheless, the protester argued that "a simple adjustment to the system's sensitivity settings, from preset 2 [(Recommended)] to preset 3 (High Sensitivity) would have resolved the detection problems encountered for OR-06a, FR-58, FR-65-3, and FR-82."  Comments & Supp. Protest at 28.  Arktis asserts that at the high sensitivity preset, "the Arktis system actually met or exceeded the OR-06a, FR-65-3, and FR-82 requirements and significantly improved on FR-58."  Id. at 4 (emphasis removed). 

As part of its response, DHS used Arktis' replay software to evaluate the data under the high sensitivity preset settings.[7]  Supp. MOL at 29 n.66.  The agency results indicate that, under the high sensitivity preset advocated by the protester, Arktis' system would still not achieve the threshold performance for tests OR-05 and OR-08 because the system would generate a false or nuisance alarm for every test run.  Id.  Test OR-08 is a KPP.  For a monitoring system to receive a rating higher than unacceptable under the operational effectiveness factor, it must meet the threshold requirement for all KPPs.  AR, Tab 6, Tech. & Past Performance Evaluation, at 12.  Therefore, even if Arktis' monitoring system had met the threshold requirements for OR-06a, FR-58, FR-65-3, and FR-82 at the high sensitivity preset, the system would not have met the threshold for OR-08, a KPP, and would still have received a rating of unacceptable under the operational effectiveness factor.

Furthermore, where an agency provides a detailed response to a protester's assertions and the protester fails to rebut or otherwise substantively address the agency's arguments in its comments, the protester provides us with no basis to conclude that the agency's position with respect to the issue in question is unreasonable or improper.  IntegriGuard, LLC d/b/a HMS Federal--Protest & Recon., B-407691.3, B-407691.4, Sept. 30, 2013, 2013 CPD ¶ 241 at 5.

In its supplemental comments, Arktis did not respond to the agency's argument that if the high sensitivity preset were used, the system would not meet the threshold requirements for the OR-05 and OR-08 tests.  Instead, Arktis abandoned the argument that DHS should have used the high sensitivity setting by bringing a similar--but materially different--argument, namely, that DHS should have used the subpar test results to reverse-engineer a custom sensitivity setting to be used during the agency's tests.  Supp. Comments at 13-14. 

Specifically, the protester argued that the agency should have obtained "the correct calibration setting," which did not correspond to any of the system's three preset settings, "by starting at the lowest possible setting (0.90) to verify that increasing sensitivity has the expected effect, and then gradually adjusting the level back toward the baseline setting."  Id.  Thus, Arktis argued that by using the subpar test results themselves as the calibration metric, the customized "correct setting" could be derived.  Supp. Comments at 14.  We conclude that, by arguing in its supplemental comments that DHS was required to customize the sensitivity setting in this manner, Arktis abandoned its supplemental protest ground that DHS should have used the high sensitivity preset while testing Arktis' system.  Compare Comments & Supp. Protest at 29 ("[T]he correct sensitivity settings" were those found under the high sensitivity preset (0.90).) with Supp. Comments at 14 ("[T]he correct setting was 0.96" for background suppression).  On this record, we conclude that Arktis abandoned its protest ground related to the use of the high sensitivity setting and dismiss this protest ground.[8]  IntegriGuard, LLC d/b/a HMS Federal--Protest & Recon., supra.

Incorrect OR-05 Setting

Arktis alleges, and the agency acknowledges, that the agency ran the OR-05 test with the system's neutron false alarm rate improperly set to 0.01 rather than to the intended setting of 0.0001.  Comments & Supp. Protest at 3, 8.  The agency reviewed the data with the replay tool and affirmed that, had the OR-05 test been conducted at the proper false alarm setting, the results would have met the threshold requirements and Arktis would not have received the corresponding significant weakness.  Supp. MOL at 6 n.18.  The protester asserts generally that the agency's errors resulted in prejudice to it.  Supp. Comments at 23.  The agency argues that its failure to use the proper setting did not, however, result in prejudice to the protester because the other ratings would still support an adjectival rating of unacceptable under the operational effectiveness factor.  Id. at 6-7. 

Prejudice is an essential element of every viable protest; we will not sustain a protest unless the protester demonstrates a reasonable possibility that it was prejudiced by the agency's actions.  Armorworks Enters., LLC, B-400394.3, Mar. 31, 2009, 2009 CPD ¶ 79 at 3.  Furthermore, agencies are not required to include a proposal in the competitive range where it is not among the most highly rated or where the agency reasonably concludes that the proposal has no realistic prospect of award.  Environmental Restoration, LLC, B-413781, Dec. 30, 2016, 2017 CPD ¶ 15 at 3.  Thus, a technically acceptable proposal may be excluded from the competitive range if it does not stand a real chance of being selected for award.  Id. at 5. 

Of the six monitoring systems tested in phase II, three received technical ratings of unacceptable and were eliminated from the competitive range.  AR, Tab 4, Competitive Range Determination, at 8.  Arktis' system, with a rating of unacceptable for the operational effectiveness factor and ratings of acceptable for the other non-price factors, was the lowest-rated of the six systems.  Id.  The three systems included in the competitive range received operational effectiveness ratings ranging from good to exceptional and a minimum rating of acceptable for all other factors.  Id.  Thus, even if the protester had successfully demonstrated that removal of a significant weakness would have been sufficient to change its adjectival rating under the operational effectiveness factor, from unacceptable to acceptable, it would have nevertheless remained in the bottom half of the six monitoring systems evaluated in phase II.[9]  Id.  Since the agency intended to select for award no more than three monitoring systems, and the evaluation provided that the non-price factors were significantly more important than price, there is no basis to conclude that any improvement in its operational effectiveness adjectival rating would have placed Arktis' system among the top three.  On this record, we find no prejudice to Arktis from the agency's error in using the wrong setting in the OR-05 test.  Priority One Servs., Inc., B-415201.2, B-415201.3, Apr. 13, 2018, 2018 CPD ¶ 182 at 6 (finding no prejudice in the protester's exclusion from the competitive range, where agency's flawed evaluation would not have overcome technical deficiency).

We dismiss the protest in part and deny the protest in part.

Thomas H. Armstrong
General Counsel



[1] The RFP contemplated that systems selected for award would undergo further, post-award optimization and testing prior to full-rate production.

[2] As relevant to this protest, a rating of acceptable would be assigned to a system that:

Meets the threshold level of all KPPs [key performance parameters] in the subordinate elements, and achieves the threshold level for some Operational and Functional requirements.  If the evaluation team determines that a KPP that fails to meet the threshold value can be successfully upgraded to meet the threshold during the Optimization Phase, "Acceptable" may also be used, with an appropriate risk rating.

AR, Tab 6, Tech. & Past Performance Evaluation, at 11-12 (emphasis omitted).  A rating of unacceptable would be assigned to a system that "[f]ails to meet the threshold level of one or more KPPs in any of the subordinate elements, or has a number of significant weaknesses and/or deficiencies."  Id. at 12. 

[3] In fact, Arktis' system had the lowest technical ratings of the six tested systems.  AR, Tab 4, Competitive Range Determination, at 8. 

[4] Arktis withdrew its initial protest relating to the test site conditions.  Comments & Supp. Protest at 22 n.25.  The protester also withdrew its supplemental protest that the agency misinterpreted the OR-08 test results and engaged in unequal treatment through its discussions as compared to another offeror, Rapiscan.  Supp. Comments at 20 n.38.  In addition, in its supplemental comments, Arktis for the first time challenged the agency's evaluation of a different offeror, L-3.  Supp. Comments at 20-21.  Specifically, the protester alleges that L-3 should have received a rating of unacceptable under the operational effectiveness factor, given its test results.  Id.  The supplemental protest ground challenging the agency's evaluation of L-3 was first raised on July 3.  Id.  However, this argument rests on information contained in the agency report, which was provided to the protester on June 6, 2018.  EPDS Dkts. 13-17.  Because the record shows that the protester knew or should have known of the basis of its protest more than 10 days prior to the date it raised this challenge, we dismiss this protest ground as untimely.  4 C.F.R. § 21.2(a)(2). 

[5] Arktis attempted to link its claim that the agency used the incorrect preset setting to its initial protest ground that the agency should have contacted it when its system generated poor test results.  Comments & Supp. Protest at 26.  However, the argument that the agency should have used the high sensitivity setting was not part of the original protest.  Nevertheless, we consider it to be a timely filed supplemental protest as the protester filed it within 10 days of receipt of the data underpinning the argument.  4 C.F.R. § 21.2(a)(2).

[6] Arktis' operating manual provided the following guidance regarding preset choice:

It is possible to set 3 sensitivity configurations for different user requirements.

Preset 1:  The expected false alarm rate will be below prescribed requirements.

Preset 2:  Arktis recommends to use preset 2 during normal operation.

Preset 3:  This preset will allow the system to alarm on sources with lower activity than prescribed by the requirements.

Comments & Supp. Protest, Exh. 1, Operating Manual, Appendix D, Arktis Sensitivity Configuration Presets, at 41. 

[7] According to Arktis, the replay tool "is essentially computer software that reads recorded measurement data and provides virtual output from the RPM [radiation portal monitor] as if a new measurement is being taken. This allows the software's analysis parameters to be easily adjusted, as needed, to see the effect of changing calibration or other settings, without having to re-run tests.  Importantly, the underlying data remains unchanged, and any alteration to the parameters only affects the analysis of the data - not the data itself."  Comments & Supp. Protest at 29 n.36.

[8] Arktis' argument that the optimal, customized system settings should be reverse-engineered from its otherwise-failing test results is also unavailing.  Consistent with the terms of the solicitation, the agency was under no obligation to recalibrate the protester's monitoring system in this manner.  Furthermore, the customized setting argument is untimely.  In this regard, the test data was made available to the protester no later than June 13.  Protester Email to GAO et al., June 13, 2018.  Yet the protester first asserted its custom-setting argument, which is based on this data, on July 3.  Because this protest ground was raised more than 10 days after the protester knew or should have known about it, it is dismissed as untimely.  4 C.F.R. § 21.2(a)(2). 

[9] As a practical matter, the fact that Arktis' system was unable to meet the threshold requirements for KPPs OR-06a and OR-08 meant that its adjectival rating for the operational effectiveness factor would remain at unacceptable.  AR, Tab 6, Tech. & Past Performance Evaluation, at 12.
