DRS ICAS, LLC

B-401852.4, B-401852.5: Sep 8, 2010



B-401852.4; B-401852.5, DRS ICAS, LLC, September 8, 2010

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: DRS ICAS, LLC

File: B-401852.4; B-401852.5

Date: September 8, 2010

Joel Singer, Esq., and Kyle J. Fiet, Esq., Sidley Austin LLP, for the protester.
Thomas J. Madden, Esq., and James Y. Boland, Esq., Venable LLP, for AAI Corporation, the intervenor.
Vera Meza, Esq., Joseph Fratarcangeli, Esq., and Harlan Gottlieb, Esq., Department of the Army, for the agency.
Jonathan L. Kang, Esq., and James A. Spangenberg, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1. Protest challenging evaluation of protester's technical proposal is sustained where the record shows that the agency assessed numerous weaknesses that lacked a reasonable basis.

2. Protest is sustained where the agency incorrectly assumed that it was required to ignore the passage of time between the agency's initial evaluation and its post-corrective action reevaluation with regard to the evaluation of the protester's ongoing work and its relevance to the evaluation of system maturity and schedule risk factors.

DECISION

DRS ICAS, LLC, of Buffalo, New York, protests the issuance of a delivery order to AAI Corporation, of Hunt Valley, Maryland, under request for proposals (RFP) No. STOCII-09-KOL-0002, issued by the Department of the Army, Army Materiel Command, for a man-portable aircraft survivability trainer (MAST) system. DRS challenges the Army's evaluation of its technical proposal, and the evaluation of the offerors' prices.

We sustain the protest.


BACKGROUND

The solicitation sought proposals to develop and provide MAST units, which are intended to allow users to simulate threats posed by man-portable air defense systems (MANPADS), i.e., shoulder-launched surface-to-air antiaircraft missiles. The MAST units will be used to train aircrews to respond to MANPADS threats. The solicitation statement of work (SOW) explained that "[t]he abundance of MANPADS, particularly among terrorist organizations, has emerged as an increasing threat to both civilian and military aircraft operating in support of Overseas Contingency Operations." RFP, SOW, at 1. The SOW identified "Russian-made MANPADS such as the SA-7, SA-14, SA-16, and SA-18" as popular models that pose a threat to aircraft. Id. The proposed MAST units will be required to integrate an existing technology used to simulate MANPADS threats, known as the multiple integrated laser engagement system (MILES), as well as provide additional simulation capabilities.

The RFP was issued on May 5, 2009, and was restricted to vendors under a multiple-award indefinite-delivery/indefinite-quantity contract, known as the Program Executive Office for Simulation, Training and Instrumentation Omnibus Contract II.[1] The solicitation provided for the issuance of a delivery order with a 1-year base period and four 1-year options. RFP at 1. The RFP stated that the delivery order would be based on fixed-unit prices for the MAST units and associated equipment, along with certain time-and-materials contract line items (CLINs), and that the agency could order up to 60 units per year. See RFP sect. B.

The solicitation advised offerors that proposals would be evaluated based on the following five factors: (1) path forward for system performance and schedule, which had two subfactors, system maturity and schedule risk; (2) technical approach, which had three subfactors, technology insertion and open architecture, target sensitivity, and MILES integration; (3) engagements (i.e., live demonstrations), which had two subfactors, daytime engagements and nighttime engagements; (4) recording and playback of data during the engagements, which had two subfactors, daytime digital video recording and playback and nighttime digital video recording and playback; and (5) price. RFP, Proposal Instructions, at 3-4. The non-price factors were of equal importance, and the subfactors within each factor were of equal importance. RFP, Proposal Instructions, at 5. For purposes of award, the non-price factors, when combined, were more important than price. Offerors' technical proposals were limited to 25 pages. Id.

The agency's source selection evaluation board (SSEB) reviewed the offerors' proposals and prepared a proposal evaluation report that documented the evaluation factor ratings and evaluated prices. The source selection authority (SSA) reviewed the evaluation report, and based on the findings of the SSEB, selected AAI's proposal for the issuance of the delivery order.

On August 17, 2009, the Army advised DRS that it had selected AAI's proposal for award. The agency provided a debriefing to the protester on August 25. DRS filed a protest with our Office on August 31, challenging the agency's treatment of the offerors during the engagements/live demonstrations, the evaluation of the engagement data, the evaluation of DRS's technical proposal, and the reasonableness of the selection decision. On September 15, prior to producing its report on the protest, the Army advised our Office that it would take corrective action in response to the protest. Based on the agency's notice, we dismissed the protest.

The Army issued a revised solicitation, and requested new proposals from offerors. The agency conducted a new round of engagements/live demonstrations, and evaluated the offerors' revised proposals. On December 17, the agency advised DRS that it had again selected AAI for issuance of the delivery order. The agency provided DRS a debriefing, and DRS filed a second protest with our Office on January 11, 2010. DRS's second protest challenged the Army's evaluation of its technical proposal, and also argued that the agency had improperly calculated offerors' prices. Following receipt of the agency report, DRS filed a supplemental protest, which argued that the record did not demonstrate that the agency had conducted an analysis for unbalanced pricing. On March 9, the agency advised our Office that it would take corrective action to address the unbalanced pricing issue. Based on the agency's notice, we dismissed the protest.

The Army's corrective action in response to the second protest consisted of preparing a new unbalanced pricing evaluation for both offerors. The agency, however, also reevaluated several areas of DRS's technical proposal to address issues raised in its earlier protests. As discussed in detail below, the revised evaluation report deleted the reference to a weakness concerning the evaluation of DRS's proposal regarding its performance of a similar contract, and added or revised three weaknesses concerning the protester's proposal under the target sensitivity subfactor. Agency Report (AR), Tab 18, Revised Evaluation Report, at 15-18, 20-22. The agency did not reevaluate AAI's technical proposal. The agency did not revise any of the ratings for DRS as a result of the reevaluation, and, ultimately, the final evaluation ratings for AAI and DRS were as follows:

                                                DRS                   AAI

PATH FORWARD FOR SYSTEM
PERFORMANCE AND SCHEDULE                        MARGINAL              HIGHLY SATISFACTORY
  System maturity                               Marginal              Highly satisfactory
  Schedule risk                                 Marginal              Highly satisfactory

TECHNICAL APPROACH                              MARGINAL              SATISFACTORY
  Technology insertion and open architecture    Marginal              Satisfactory
  Target sensitivity                            Marginal              Highly satisfactory
  MILES integration                             Satisfactory          Satisfactory

ENGAGEMENTS                                     HIGHLY SATISFACTORY   OUTSTANDING
  Daytime engagements                           Highly satisfactory   Outstanding
  Nighttime engagements                         Highly satisfactory   Outstanding

RECORDING AND PLAYBACK                          SATISFACTORY          HIGHLY SATISFACTORY
  Daytime recording and playback                Satisfactory          Highly satisfactory
  Nighttime recording and playback              Satisfactory          Highly satisfactory

EVALUATED PRICE                                 $37,502,872           $43,218,239

AR, Tab 18, Revised Evaluation Report, at 3-4, 29.

In the selection decision, the SSA found that "AAI's superior ratings in each factor as a result of a mature system with low technical and schedule risk has been determined to be the best value for the government." AR, Tab 22, Source Selection Decision (SSD), at 11. The SSA concluded that AAI's "clearly superior technical proposal justifies the Government paying the $5.7M price premium over the life of the delivery order." Id.

The Army advised DRS of the new award on May 28. The Army did not provide DRS with a debriefing; instead, the agency provided a summary of the changes that had been made to the evaluation report during the reevaluation. This protest followed.

DISCUSSION

DRS challenges the Army's evaluation of its technical proposal, and the agency's evaluation of offerors' prices. As discussed below, we conclude that the Army's evaluation of DRS's proposal had numerous prejudicial errors under the first two non-price evaluation factors--technical approach, and path forward for system performance and schedule. We conclude that the protester's challenge to the agency's evaluation of offerors' proposed prices is untimely.

The evaluation of an offeror's proposal is a matter within the agency's discretion. IPlus, Inc., B-298020, B-298020.2, June 5, 2006, 2006 CPD para. 90 at 7, 13. A protester's mere disagreement with the agency's judgment in its determination of the relative merit of competing proposals does not establish that the evaluation was unreasonable. VT Griffin Servs., Inc., B-299869.2, Nov. 10, 2008, 2008 CPD para. 219 at 4. In reviewing a protest against an agency's evaluation of proposals, our Office will not reevaluate proposals but instead will examine the record to determine whether the agency's judgment was reasonable and consistent with the stated evaluation criteria and applicable procurement statutes and regulations. See Shumaker Trucking & Excavating Contractors, Inc., B-290732, Sept. 25, 2002, 2002 CPD para. 169 at 3. While we will not substitute our judgment for that of the agency, we will question the agency's conclusions where they are inconsistent with the solicitation criteria, undocumented, or not reasonably based. Public Communications Servs., Inc., B-400058, B-400058.3, July 18, 2008, 2009 CPD para. 154 at 17.

This decision is based, in part, upon testimony that the Army SSEB Chair and SSA provided during a hearing conducted by our Office on August 21, 2010. In reviewing an agency's evaluation of offerors' proposals, we do not limit our consideration to contemporaneously-documented evidence, but instead consider all the information provided, including the parties' arguments, explanations, and any hearing testimony. Navistar Def., LLC; BAE Sys., Tactical Vehicle Sys. LP, B-401865 et al., Dec. 14, 2009, 2009 CPD para. 258 at 6. While we generally give little or no weight to reevaluations and judgments prepared in the heat of the adversarial process, Boeing Sikorsky Aircraft Support, B-277263.2, B-277263.3, Sept. 29, 1997, 97-2 CPD para. 91 at 15, post-protest explanations that provide a detailed rationale for contemporaneous conclusions, and simply fill in previously unrecorded details, will generally be considered in our review of the rationality of selection decisions--so long as those explanations are credible and consistent with the contemporaneous record. NWT, Inc.; PharmChem Labs., Inc., B-280988, B-280988.2, Dec. 17, 1998, 98-2 CPD para. 158 at 16.

A. Technical Approach Factor Evaluation

The protester raises several challenges to the evaluation of its proposal as marginal under the technology insertion and open architecture subfactor and the target sensitivity subfactor of the technical approach factor. As discussed below, we find that there were errors in five of the weaknesses identified by the agency in DRS's proposal under these two subfactors.

1. USB ports

DRS argues that the agency unreasonably assessed a weakness in its proposal under the technology insertion and open architecture subfactor, based on the protester's proposed use of USB ports.[2]

In the RFP performance specification, the "Data Transfer Interface" requirement required offerors to provide an input/output interface for uploading and downloading data. RFP, Performance Specification, sect. 3.7.1.2. The relevant provision is as follows: "Note: Due to Information Assurance requirements all MAST related USB ports, if part of the MAST design, shall preclude the use of commercial standard USB physical connections through keying or some other physical means." Id.

As relevant here, keying is a hardware-based means of encryption, which is considered among the most advanced forms of security for USB devices. See Tr. at 196:19-22. DRS's proposal stated in several places that it would use "keyed External USB ports" for data transfer. See AR, Tab 7A, DRS Technical Proposal, at 6-7, 14, 23.

In the revised evaluation report, the agency identified a weakness in DRS's proposal because "[t]he offeror does not provide a methodology to satisfy the [information assurance] USB requirement or an alternate solution of transferring data." AR, Tab 18, Revised Evaluation Report, at 19. The selection decision also stated that "no methodology is provided to satisfy the [information assurance] USB requirement." AR, Tab 22, SSD, at 6.

The agency contends that the plain language of the solicitation prohibited the use of any USB ports. AR at 63-64. The SSEB Chair testified that the intent of the provision was to prohibit the use of USB connections of any kind. Tr. at 197:5-12. In this regard, the agency argues that the word "preclude" refers to all USB ports, and the phrase "through keying or some other physical means" refers to clarifying examples of what kinds of USB ports are prohibited. AR at 64.

We think that the plain language of the solicitation provision does not support the agency's interpretation that all USB ports are prohibited. The first part of the provision states that "all MAST related USB ports, if part of the MAST design. . ." RFP, Performance Specification, sect. 3.7.1.2. We think this phrase plainly indicates that MAST related USB ports are allowed. The next part of the provision states that the USB ports that are part of the MAST design "shall preclude the use of commercial standard physical USB connections through keying or some other physical means." Id. We think that the second phrase plainly states that USB ports are permitted, provided that they "preclude the use of commercial standard physical USB connections" through the method specified, i.e., "keying or some other physical means." Id.

Based on the plain language of the solicitation, we think that the protester's interpretation of the RFP as permitting keyed USB ports is reasonable, and that the agency's interpretation of the RFP as barring use of all USB ports is unreasonable.[3] Thus, we think the agency's assessment of a weakness regarding DRS's proposed use of USB ports was not reasonable under the technical evaluation factor.[4]

2. Focal plane array scanning

DRS argues that the agency unreasonably assessed a weakness in its proposal under the target sensitivity subfactor, based on DRS's failure to address the use of focal plane array target scanning technology in its proposed simulation of SA-16 and SA-18 threats. In the revised evaluation report, the Army stated that "[t]he offeror provides no detail on how to simulate the target sensitivity of the SA-16 and SA-18 that use focal plane arrays which is outside of the scope of the offeror's proposed solution." AR, Tab 18, Revised Evaluation Report, at 21. During the hearing, the SSEB Chair conceded that this evaluation was in error, because he was misinformed by a technical evaluator as to the technology used by SA-16 and SA-18 missiles. Tr. at 175:5-176:12. Instead, the SSEB Chair stated that he was subsequently advised by the evaluator that these two missiles use [deleted] scan technology. Id. The SSEB Chair states that, had he known the correct information, he would have revised the weakness assessed in DRS's proposal concerning [deleted] scan technology to include references to SA-16 and SA-18 missiles. As discussed below, however, we also think that the agency's evaluation of DRS's proposal regarding [deleted] scan was flawed. To the extent that the agency's assessment of weakness here relied upon its conclusion that DRS's proposal failed to address focal plane technology, we think that the evaluation was unreasonable.

3. [Deleted] scanning

DRS argues that the agency unreasonably assessed a weakness in its proposal under the target sensitivity subfactor, based on its failure to discuss [deleted] scan technology, and the agency's belief that the protester would address this requirement only through a future upgrade.

The RFP required offerors to address target sensitivity requirements, and stated that the agency would evaluate "the Offeror's approach to satisfying the ability to change the target tracking sensitivity and its correlation to threat group parameters . . . within the specifications document." RFP, Proposal Instructions, at 3. [Deleted] scanning is a type of targeting technology used by SA-14, SA-16, and SA-18 missiles. Tr. at 176:10-12. Although [deleted] scan technology is not discussed in the RFP, both the performance specifications and the RFP state that SA-7, SA-14, SA-16, and SA-18 missiles are "popular" threats. RFP, SOW, para. 1.1; Performance Specifications, para. 1.2.

The protester explained that "[d]ifferent seeker lock-on characteristics of various missiles (early generation, current generation, and future generation) are implemented by using [deleted][5]" and that "[t]he flexibility of the DRS design allows the system to be configured as a [deleted]." AR, Tab 7A, DRS Technical Proposal at 21. DRS's proposal also stated in two charts that its proposed MAST unit provided the following feature: "[deleted]--Both Present and Future." Id. at 5, 23. With regard to the reticle, the protester stated that "[t]he [deleted] is [deleted] and configurable to ensure MAST life-cycle supportability not only for today's MANPADS threats but evolving future threats such as the SA-7/14/16/18." Id. at 21.

The Army identified a weakness in DRS's proposal because "[t]he offeror goes into little detail on how to simulate an SA-14 using [deleted] scan technology and offers this as an upgrade; not part of the base delivery." AR, Tab 18, Revised Evaluation Report, at 21. The SSEB Chair also testified that he understood DRS's proposal to simulate the "[deleted]"[6] of an SA-7 missile, but that the simulation of the [deleted] scan technology of an SA-14 would not be addressed in the proposed MAST unit and instead was offered as a future upgrade. Tr. 76:16-77:3, 151:7-152:18 (citing AR, Tab 7A, DRS Technical Proposal, at 21). The SSEB Chair's interpretation of DRS's proposal relied on a reference to a proposed upgrade of the camera integrated into the protester's MAST unit, which records engagements. Tr. at 78:3-16 (citing AR, Tab 7A, DRS Technical Proposal, at 21). The SSEB Chair testified that the agency understood the reference to a potential future upgrade of the camera to indicate that any simulation feature beyond the [deleted] associated with the SA-7 would be a future upgrade. Id.

Based on our review of the plain language of DRS's proposal, we do not think that the Army's interpretation is reasonable. As discussed above, DRS's technical proposal stated that its proposed MAST unit would use [deleted] to simulate the "[deleted]" associated with an SA-7 missile, which does not use [deleted] scan technology. AR, Tab 7A, DRS Technical Proposal, at 21. The protester's proposal also stated, however, that the [deleted] can simulate various threats through [deleted], including those that use [deleted] scan technology. Id. at 5, 21, 23. While the SSEB Chair stated that he understood the description of a camera upgrade option to indicate that DRS's [deleted] would also require upgrading to simulate missiles other than an SA-7, we do not think that a reasonable reading of the proposal supports this conclusion. In this regard, while the discussion of the camera upgrade and the [deleted] are in the same paragraph, nothing in the text indicates that the [deleted] are linked to or dependent on the camera upgrade. See id. at 21. In sum, to the extent that the agency believed that DRS's proposal stated that its [deleted] would simulate only an SA-7 missile, we do not think that the record supports this conclusion.[7]

4. System growth and open architecture

DRS argues that the agency unreasonably assigned its proposal a weakness regarding the system growth and open architecture requirements of the technology insertion and open architecture subfactor. The protester contends that the agency's contemporaneous evaluation unreasonably concluded that its proposal lacked adequate detail, and also argues that the hearing testimony indicates that the agency applied undisclosed evaluation criteria in its evaluation.

The technology insertion and open architecture subfactor required offerors to "describe how their architecture supports system upgrades, technology insertion, ease of software/configuration changes . . . [and] ease of adding new devices to the MAST internal (or external) library." RFP, Proposal Instructions, at 3. The RFP also stated that the agency would "evaluate the Offeror's soundness of approach for meeting the system specifications and the overall design and architecture to support internal and external system growth, reprogrammability, and data security." Id.

DRS's proposal stated that it designed its MAST system to provide a "modular, programmable system which can be readily upgraded," and that would "be upgradeable to provide positive, realistic aircrew training against current MANPADS threats, future evolving MANPADS threats, while providing training for next generation missile warning systems." AR, Tab 7A, DRS Technical Proposal, at 19. The protester's proposal also contained a table titled "MAST Open Architecture Expandability and Upgradeability," which listed features and benefits for categories such as System Upgrades, Technology Insertion, and Software/Configuration Changes. Id. at 23.

The agency assessed a weakness in DRS's proposal because "[t]he offeror does not provide enough detail on their approach for system growth, interface flexibility, and maintainer reprogrammability tasks (to include password protection)." AR, Tab 18, Revised Evaluation Report, at 19. During the hearing, the SSEB Chair testified that the agency's concerns with this area of DRS's proposal were based on a lack of detail regarding the features and benefits listed in the table, and the agency's view that these features and benefits did not constitute an "approach" to the requirements. Tr. at 248:10-17.

In addition to the foregoing concerns regarding the lack of detail in DRS's proposal, the SSEB Chair also testified that this weakness in DRS's proposal was based in significant part on the protester's failure to identify specific technologies to be incorporated into the MAST system in the future, as follows:

[W]e know that there is technology out there that is current technology that is not a requirement under our solicitation. . . . [T]hat is the immediate next defined threat. I would have expected to see that in this proposal. It's the next generation in the family of systems.

Tr. 266:17-267:2. The SSEB Chair further stated that the agency expected offerors to address "[n]ot only the ability to change and grow, but also to look forward and to see what technologies are out there." Tr. at 271:2-4; see also Tr. at 255:4-10 (citing SA-24 missiles as an example of a threat not identified in the solicitation that the agency believed that an offeror should have discussed how to address in the future through its system architecture).

We think that the SSEB Chair's testimony indicates that the agency's concerns with DRS's proposal were beyond those contemplated by the RFP's evaluation factors. Agencies are required to evaluate proposals based solely on the factors identified in the solicitation, and must adequately document the bases for their evaluation conclusions. Intercon Assocs., Inc., B-298282, B-298282.2, Aug. 10, 2006, 2006 CPD para. 121 at 5. While agencies properly may apply evaluation considerations that are not expressly outlined in the RFP where those considerations are reasonably and logically encompassed within the stated evaluation criteria, there must be a clear nexus between the stated criteria and the unstated consideration. Global Analytic Info. Tech. Servs., Inc., B-298840.2, Feb. 6, 2007, 2007 CPD para. 57 at 4.

We note that the solicitation required offerors to address how the "architecture" of their proposed MAST systems accommodates future upgrades and system growth. RFP, Proposal Instructions, at 3. The criticism discussed by the SSEB Chair, however, concerned the protester's failure to identify and address specific threats and technologies for future upgrades; we do not think an offeror would reasonably realize that it should submit this information. See Tr. at 255:4-10, 266:17-267:2, 271:2-4. Based on our review of the record and the hearing testimony, we think the agency applied an unstated evaluation criterion in finding a weakness for the protester's proposal in this area.[8]

In addition to this concern, the SSEB Chair stated for the first time during the hearing that the agency's assessment of this weakness was also based on substantive criticisms concerning certain specific features of the protester's proposal, even though the revised evaluation report stated that the Army's concern here was that DRS did not "provide enough detail on their approach" for this requirement. AR, Tab 18, Revised Evaluation Report, at 19; see also AR at 53-55 (explaining that weakness was based on lack of details). For example, the SSEB Chair testified that while DRS's proposed use of [deleted] was a good approach to future upgradeability, he believed that this approach would require frequent software patches and upgrades to address security concerns arising from the connection of the MAST unit to secure Army networks. Tr. at 276:12-16. To the extent that this concern was first raised in the hearing, and is distinct from any other issues identified in the contemporaneous record or raised in the agency's earlier responses to the protest, we do not think it demonstrates that the agency's assessment of a weakness here was reasonable.[9] Boeing Sikorsky Aircraft Support, supra.

5. Data security

DRS argues that the Army unreasonably assessed a weakness in its proposal under the data security requirement of the technology insertion and open architecture subfactor. Here also, the protester contends that the agency's contemporaneous evaluation unreasonably concluded that its proposal lacked adequate detail and that the hearing testimony indicates that the agency applied undisclosed evaluation criteria in its evaluation.

The technology insertion and open architecture subfactor stated that the agency would "evaluate the Offeror's soundness of approach for meeting . . . data security." RFP, Proposal Instructions, at 3. DRS argues that its proposal identified 12 features in its proposal regarding data security, including items such as solid state hard drives, elimination of vulnerable wireless data interfaces, keyed USB ports, and administrative software controls on changes to the MAST units. See AR, Tab 7A, DRS Technical Proposal, at 5-8, 23. The protester contends that these features demonstrate an adequate approach to the data security requirement in the solicitation.

In the revised evaluation report, the agency identified a weakness in DRS's proposal as follows: "Approach to data security is not addressed. The offeror does not provide a methodology to satisfy the [information assurance] USB requirement or an alternate solution of transferring data."[10] AR, Tab 18, Revised Evaluation Report, at 19. The selection decision also cited this concern as a weakness in DRS's approach. AR, Tab 22, SSD, at 6. In its response to the protest, the agency argues that the protester's proposal merited a weakness because its references to various data security features did not constitute a plan with sufficient detail that addressed the information assurance requirements set forth in the solicitation. AR at 57-59. In this regard, the agency argues that it assessed a weakness to DRS because it "inexplicably ignored large sections of the [SOW concerning information assurance] related to system and data security." AR at 56.

As relevant here, the SOW contained a requirement for compliance with the Army's information assurance regulations. SOW sect. 3.2.1.5. This SOW provision requires the contractor to "develop and maintain an information assurance process" and to assist the government during performance of the delivery order with the development of security documents and protocols. Id. sections 3.2.1.5, 3.2.1.5.2, 3.2.1.5.2.1. With regard to these information assurance requirements, the Army argues that offerors should have understood that the RFP's statement that the agency would consider the "soundness of approach" to "data security" required offerors to address the information assurance requirements set forth in the SOW.

During the hearing, the SSEB Chair conceded that the agency did not consider the RFP to require offerors to address all of the SOW requirements in their proposals.[11] Tr. at 51:13-18. Instead, the SSEB Chair states that he believed that offerors should have understood the term "data security" to mean a broad reference to the Army's information assurance requirements, which any contractor should have understood to be an area of significant concern for the Army that should be addressed in a proposal. Id. at 52:19-53:3.

Although the RFP required offerors to address data security, we do not think that the RFP evaluation criteria reasonably advised offerors as to the agency's view that offerors were required to address in detail the information assurance requirements of the SOW. In this regard, the SOW information assurance requirements address a broad range of information and configuration management process requirements, and require the contractor to assist the government in preparation of compliance and certification documents. RFP, SOW para. 3.2.1.5.[12] Nonetheless, the SSEB Chair testified that he expected offerors to address their understanding of the detailed information assurance requirements in the SOW. Tr. at 36:14-37:16. To the extent that the agency assessed a weakness in DRS's proposal based on a lack of a detailed discussion of data security with regard to the specific information assurance requirements, we think that the evaluation was not reasonable.[13]

In addition, the SSEB Chair argued for the first time at the hearing that DRS's proposed approach to data security was flawed because the features identified in its proposal were related solely to hardware, and did not address any software features concerning data security. This concern was not identified in the revised evaluation report or the agency report. See AR, Tab 18, Revised Evaluation Report, at 19-20; AR at 56-60. To the extent that this concern was first raised in the hearing, is not reflected in the contemporaneous record, and was not raised in the agency's earlier responses to the protest, we do not think this concern reasonably supports the agency's assessment of a weakness here.[14] Boeing Sikorsky Aircraft Support, supra.

6. Other technical factor challenges

In addition to the five issues discussed above, DRS raises a number of additional challenges to the evaluation of its proposal under the technical approach factor. Although our decision does not specifically address all of DRS's remaining arguments, we have reviewed them and conclude that none has merit.

For example, DRS argues that the agency unreasonably assessed a significant weakness regarding the seeker lock requirements of the target sensitivity subfactor. The RFP required offerors to address "how target tracking sensitivity in cluttered environments can be adjusted to replicate current and future threat MANPAD target lock algorithms and implementation of the seeker lock inhibit function." AR, Proposal Instructions, at 3. The MAST unit is required to prohibit a target lock at a distance of greater than 4 kilometers (km), in order to simulate a MANPADS threat. RFP, Performance Specifications para. 3.2.5.1. As part of this requirement, the RFP requires the MAST unit to address the effect of cluttered environments, such as trees and hills, on the ability to achieve a target lock. Id.

The Army assessed a significant weakness for DRS's proposal based on the seeker lock function, in part because "[t]he offeror does not adequately explain how the lock on function is affected by cluttered environments, aircraft orientation, and the seeker lock maximum range of 4 km." AR, Tab 18, Revised Evaluation Report, at 21. In its report on the protest, the agency explains that part of the weakness was based on DRS's failure to adequately address how the 4 km target lock requirement would be achieved through [deleted]. Supp. AR at 17-18. The SSEB Chair testified that he understood DRS's proposed MAST to calculate distance by [deleted] and that this approach was flawed because it could not distinguish a large target at a long distance from a small target at a short distance. Tr. at 180:14-181:17.

Although the protester contends that its proposal did not rely on [deleted] to determine the range of the target, DRS's proposal appears to state that [deleted] are in fact the method used to calculate range as it relates to the seeker lock function:

The simulation identifies the target location [deleted].

AR, Tab 7A, DRS Technical Proposal, at 21. Furthermore, although the protester notes that a different technology for determining range is discussed elsewhere in its proposal ([deleted]), the protester does not clearly explain how this technology relates to the seeker lock function, or that it is unrelated to [deleted]. See id. at 5, 11. On this record, we think that the agency reasonably relied on the plain language of the protester's proposal in assessing this significant weakness.

B. Path Forward For System Performance and Schedule Factor Evaluation

Next, the protester argues that the agency's evaluation was unreasonable with regard to the evaluation of its proposal as marginal under the system maturity and schedule risk subfactors of the path forward for system performance and schedule factor.[15] As discussed below, we conclude that the agency's evaluation of these subfactors was flawed in two areas.

1. DRS's Air National Guard Contract

DRS argues that the agency did not give meaningful consideration to its experience in performing a contract for a similar item. The protester contends that proper consideration of this experience would have addressed some or all of the numerous weaknesses identified in the agency's evaluation of DRS's proposal under the system maturity and schedule risk subfactors.

The system maturity subfactor stated that the agency would evaluate "the effort needed to develop the prototype into a production capable unit," and also "the soundness of approach, product maturity, and suitability of the Offeror's solution" to be used in the anticipated field environments. RFP, Proposal Instructions, at 3. The schedule risk subfactor stated that the agency would evaluate "the offeror's soundness of approach for meeting [the] schedule." Id.

As relevant here, DRS's proposal stated that it had been awarded a contract by the Air Force Air National Guard in September 2009 for the Joint MANPADS (JMANPADS) system, which the protester describes as "virtually identical" to the MAST system proposed for this solicitation. AR, Tab 7A, DRS Technical Proposal, at 3. The proposal identified common components and technical approaches between the JMANPADS and MAST units. Id. at 3-4. The protester stated that its performance of the JMANPADS contract demonstrated that "DRS has already started work on MAST, thereby further reducing technical and schedule risk to the Government." Id. at 4.

The agency's initial evaluation of DRS's proposal identified numerous weaknesses regarding system maturity and schedule risk. The agency also cited a weakness based on the lack of detail concerning the JMANPADS contract. In the March 2010 reevaluation, the agency removed the weakness concerning the JMANPADS contract, but did not address the matter further in its evaluation, that is, the agency did not revise or further discuss any of the weaknesses regarding DRS's proposal under the two subfactors.[16]

In its response to DRS's argument that the agency should have reconsidered the weaknesses in light of the potential benefits provided by its work under the JMANPADS contract, the agency argues that DRS's work provided no benefit under the system maturity and schedule risk subfactors because of the short period of time, approximately 45 days, between the award of the JMANPADS contract in September 2009 and the evaluation of DRS's proposal in October 2009. AR at 46. The SSEB Chair confirmed that, during the agency's reevaluation of DRS's proposal in March 2010, the SSEB limited its consideration of DRS's performance of the JMANPADS contract to the 45-day period between early September 2009 and the mid-October 2009 evaluation. Tr. at 293:4-9; 312:11-20. In this regard, the SSEB Chair explained that he believed that, during the March 2010 reevaluation of DRS's proposal, the agency was required to conduct the evaluation as if it were October 2009, and thus ignore the period of time between October 2009 and March 2010, as it related to DRS's performance of the JMANPADS contract. Tr. at 293:4-9.

We are not aware of any general rule that would require an agency to conduct an evaluation during corrective action as if the evaluation were conducted at the same time as the initial evaluation. Instead, we have held in similar circumstances that an agency should consider the information it knows concerning an offeror, even where the information was made known to the agency because of the passage of time during corrective action. See Futures Group Int'l, B-281274.5 et al., Mar. 10, 2000, 2000 CPD para. 148 at 9-10 (agency could not reasonably limit its cost realism evaluation to information known to the awardee at the time of initial proposal submission, thereby ignoring relevant information disclosed for the first time during corrective action); G. Marine Diesel, B-232619.3, Aug. 3, 1989, 89-2 CPD para. 101 at 6 (agency could not reasonably ignore, in its evaluation of past performance, the awardee's performance problems during the reevaluation period).

Here, the RFP stated that offerors' proposals would be evaluated regarding system maturity and schedule risk. As discussed above, the protester stated that its performance of the JMANPADS contract demonstrated that it had already begun work on a contract with similar requirements, thereby advancing its system maturity and reducing schedule risk. To the extent that additional time had passed as a result of the second protest and the agency's second corrective action--from October 2009 to March 2010--we do not think that the agency could reasonably ignore the possibility that the additional time performing the JMANPADS contract could have affected its evaluation under the two subfactors when it chose to reevaluate these matters as part of its corrective action.

The agency also generally argues that the level of detail provided in DRS's proposal concerning the JMANPADS contract does not allow the agency to determine whether the work on that contract would provide a benefit with regard to system maturity and schedule risk. AR at 46-48. The SSEB Chair, however, acknowledged that the agency did not specifically consider during the reevaluation whether the discussion of the JMANPADS contract in DRS's proposal provided any basis to reconsider the specific weaknesses identified by the agency concerning system maturity and schedule risk. Tr. at 322:5-323:3. On this record, we think that the agency did not reasonably evaluate DRS's proposal.

2. Mean time between failure

Next, the protester argues that the agency unreasonably assessed a weakness under the system maturity subfactor based on the asserted lack of an explanation of its method for calculating the mean time between failure (MTBF) for the MAST units. An MTBF figure refers to the average amount of time between failures of an item, based on the failure rates of its components.

The RFP proposal instructions and evaluation criteria did not require offerors to provide an MTBF calculation. See RFP, Proposal Instructions, at 3. Instead, the RFP performance specifications stated that MAST units must have a minimum MTBF of 2,000 hours, that is, the units must function at least 2,000 hours on average before failure. RFP, Performance Specifications, para. 3.3.1. The performance specifications also stated that this performance level would be verified to determine that the minimum MTBF level is achieved, and stated that "[t]he contractor shall provide a reliability prediction for the MAST." Id. para. 4.2.10.1.
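The MTBF arithmetic described above can be sketched in a short, hedged example. The component names and failure rates below are hypothetical illustrations only; they do not come from DRS's proposal, whose MTBF figure is redacted in the decision.

```python
# Hypothetical sketch of an MTBF (mean time between failure) estimate.
# Component names and failure rates are illustrative assumptions, not
# figures from the record.

failure_rates_per_hour = {
    "seeker_assembly": 1.0e-4,   # hypothetical failures per operating hour
    "display_unit":    5.0e-5,
    "battery_module":  2.5e-4,
}

# For a series system with constant component failure rates, MTBF is
# commonly estimated as the reciprocal of the summed failure rates.
system_failure_rate = sum(failure_rates_per_hour.values())
mtbf_hours = 1.0 / system_failure_rate

print(round(mtbf_hours))  # 2500, exceeding the RFP's 2,000-hour minimum
```

A roll-up of this kind is the sort of "reliability prediction" the performance specification contemplates during contract performance, though, as the decision notes, the RFP did not require offerors to explain the calculation in their proposals.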

DRS stated in its proposal that its MTBF was [deleted] hours, which exceeded the 2,000-hour minimum. AR, Tab 7A, DRS Technical Proposal, at 13. DRS's proposal also contained a chart showing data used to calculate the [deleted] MTBF figure. Id. at 14.

The Army assessed a weakness in DRS's proposal under the system maturity subfactor because "[t]he offeror needs to clearly explain how an MTBF of [deleted] was calculated without a production unit." AR, Revised Evaluation Report, at 17. The agency contends that the weakness was properly assessed because the chart in DRS's proposal did not explain how the MTBF figure had been calculated. AR at 60. In his testimony, the SSEB Chair stated that the RFP performance specification, as part of the performance verification process, requires offerors to provide a "reliability prediction for the MAST system," but conceded that the specification does not require the contractor to explain its methodology for calculating the MTBF figure. Tr. at 223:4-7; 227:19-228:4. Nonetheless, the SSEB Chair stated that he believed that because DRS elected to provide the MTBF figure in its proposal, it should have explained the methodology for calculating the figure. Id. at 223:11-13.

Based on our review of the record, we think that the agency's assignment of a weakness, based on the protester's failure to provide an explanation of the method by which it calculated its MTBF figure in its proposal, was unreasonable. Not only was the MTBF information not required to be included in the offerors' proposals, but the performance specification did not require the contractor to provide information concerning the calculation of MTBF during contract performance.

As the agency notes, the MTBF weakness was identified in the revised evaluation report, but was not specifically cited in the SSD. The agency acknowledges, however, that all strengths and weaknesses identified in the revised evaluation report were reviewed by the SSA. See Supp. AR at 30 n.14. Nevertheless, to the extent the MTBF weakness was part of the agency's evaluation of DRS's proposal under the system maturity subfactor, we think that this evaluation was unreasonable.

C. Price Evaluation

Next, DRS argues that the Army's evaluation of offerors' proposed prices was inconsistent with the terms of the solicitation. We dismiss this argument as untimely.

As relevant here, Section B of the solicitation required offerors to propose "ranged" prices for the CLINs for the MAST units and associated equipment, based on volume for each base and option year CLIN, as follows:

Range Pricing is as follows:
1-5 units: $____________ per unit

6-10 units: $____________ per unit

11-20 units: $____________ per unit

21-40 units: $____________ per unit

41-60 units: $____________ per unit

RFP sect. B at 2. The RFP stated that "[f]or those items that are range priced, the offeror's price proposal will be evaluated for award based on the sum of the Section B CLINs/SubCLINs quantities and unit price for the high end quantity of all priced ranges." RFP, Proposal Instructions, at 4.

After the initial award, the Army provided DRS a notice of the award to AAI, stating that the "[t]otal contract value, inclusive of options, was $48,162,891." Protest, January 11, 2010, Exh. 6, Agency Notice of Award, Aug. 17, 2009. The protester requested a debriefing following the first award, and also requested in writing that the agency explain how it calculated offerors' prices. The agency responded by citing the RFP price evaluation criteria. Id., Exh. 8, Email from Army to DRS, August 24, 2009. In the debriefing, the Army provided a "Final Proposal Analysis" chart that listed the evaluation ratings for AAI and DRS, and which stated that AAI's "Price" was $48,162,891 and DRS's "Price" was $[deleted]. Id., Exh. 5, DRS Debriefing Slides, Aug. 25, 2009, at 9.

On August 31, 2009, DRS filed a protest challenging several aspects of the agency's evaluation of its proposal; this protest did not challenge the calculation of offerors' prices. The agency took corrective action in response to the first protest, and amended the solicitation regarding certain technical evaluation issues; although the agency allowed offerors to revise their proposed prices, the agency did not amend the price evaluation criteria. The Army made a new award to AAI, and provided a new debriefing. During this second debriefing, the protester for the first time asked the agency to clarify whether it had calculated offerors' prices based on all of the range-priced CLINs. The Army advised that, as discussed below, its calculation of offerors' prices did not comport with DRS's stated understanding of the RFP.

On January 12, 2010, DRS filed a second protest challenging the Army's evaluation of its technical proposal and the calculation of the offerors' prices. The agency argued that the challenge to the price calculations was untimely because the protester knew or should have known how the agency had calculated the offerors' prices based on the August 2009 debriefing. The agency, however, took corrective action a second time to address the protester's supplemental allegations that the record did not show that the agency had evaluated whether offerors' prices were unbalanced. On June 2, 2010, DRS filed the instant protest, again arguing that the agency's calculation of the offerors' prices was inconsistent with the solicitation.

The protester argues that the Army's approach to calculating offerors' prices was inconsistent with the range pricing methodology set forth in Section B of the RFP. The protester contends that the solicitation anticipated a cumulative calculation of prices that takes into account the prices for each range, i.e., the cost of 5 units at the price for the 1-5 unit range, plus the cost of 5 units at the price for the 6-10 unit range, and so on through the final range of 41-60 units. Protest, June 2, 2010, at 20-21.

Rather than following a cumulative approach, the agency calculated offerors' prices by multiplying 60 units by each offeror's price for the 41-60 unit range. AR at 32-33. In effect, the agency's calculation ignored the pricing for all ranges other than the highest volume range. The protester argues that it was prejudiced by the agency's failure to use a cumulative calculation, because [deleted].
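The difference between the two readings can be made concrete with a hedged sketch. The per-unit prices below are hypothetical, since both offerors' actual range prices are redacted in the decision.

```python
# Illustrative comparison of the two price-calculation readings.
# The per-unit prices are hypothetical assumptions, not figures from
# the record.

ranges = [  # (low, high, hypothetical price per unit)
    (1,  5,  500_000),
    (6,  10, 450_000),
    (11, 20, 400_000),
    (21, 40, 350_000),
    (41, 60, 300_000),
]

# Agency's method: all 60 units priced at the highest-volume range price.
agency_total = 60 * ranges[-1][2]

# Protester's cumulative reading: each tier's units priced at that
# tier's rate, summed across all tiers.
cumulative_total = sum((high - low + 1) * price for low, high, price in ranges)

print(agency_total)      # 18000000
print(cumulative_total)  # 21750000
```

As the sketch shows, the agency's method ignores the (typically higher) per-unit prices of the lower-volume tiers, which is why the choice of method could affect the offerors' relative standing.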

The Army argues that the protester was on notice of the agency's calculation method as of the August 2009 debriefing. We agree. In this regard, the agency notice of award advised DRS that the "Total contract value" was $48 million, and the agency's debriefing also identified this price on a chart titled "Final Proposal Analysis." The protester argues that the agency did not specifically state that the "Price" identified in the notice of award and debriefing represented the agency's calculation of all the ranged prices, as required by the solicitation. See Protest, June 2, 2010, at 13-14. In this regard, the protester states that, "[i]ndeed, given what DRS knew at the time, it was clear to DRS that these PRICES could not be the [total evaluated prices], which, given the language of the solicitation, would have been at least three times larger." Protester's Response to Request for Dismissal at 7.

We think that the protester knew or should have known that the price identified in the agency's August 2009 emails and debriefing reflected the agency's understanding of the solicitation's requirements for calculation of offerors' prices. In this regard, the notice of award and debriefing stated that the prices identified were those relied upon by the agency in making the award. Furthermore, the emails and debriefing identified a price for DRS which the protester knew did not comport with the protester's interpretation of the RFP as requiring a cumulative approach to calculating offerors' prices. In light of these considerations, we think the protester was on notice that the agency did not agree with its interpretation of the RFP price evaluation criteria.

We find that this argument is untimely under our Bid Protest Regulations, as it was not raised within 10 days of when the basis of protest should have been known. 4 C.F.R. sect. 21.2(a)(2). Because DRS did not file a protest challenging the Army's calculation of offerors' proposed prices within 10 days of the August 2009 debriefing, the protester cannot now timely challenge this issue.[17] Id. In this regard, the fact that the agency made a new selection decision after taking corrective action does not provide a basis for reviving an otherwise untimely issue where, as in this case, the basis of the otherwise untimely protest allegation concerns an aspect of the agency's evaluation which was not affected by the subsequent corrective action. Shaw-Parsons Infrastructure Recovery Consultants, LLC; Vanguard Recovery Assistance, Joint Venture, B-401679.4 et al., Mar. 10, 2010, 2010 CPD para. 77 at 13 n.13.

D. Prejudice

Finally, DRS argues that the agency's selection decision was flawed based on the errors in the evaluation, and that the protester was prejudiced as a result.[18] We think that the errors identified above regarding the evaluation of DRS's proposal under the technical approach evaluation factor and the path forward for system performance and schedule evaluation factor demonstrate that, but for the agency's actions, the protester would have a substantial chance for award. In light of the possibility for improving DRS's evaluation and ratings under these factors, and in light of the protester's lower price, we think that the record shows that the protester was prejudiced by the errors in the agency's evaluation. See McDonald-Bradley, B-270126, Feb. 8, 1996, 96-1 CPD para. 54 at 3; see also Statistica, Inc. v. Christopher, 102 F.3d 1577, 1581 (Fed. Cir. 1996).

RECOMMENDATION

We recommend that the Army conduct a new evaluation of DRS's proposal, consistent with the issues identified above. With regard to the evaluation regarding system growth and open architecture, data security, USB ports, and MTBF data, the agency should examine the solicitation to determine whether it reflects the government's actual requirements. If the agency concludes that the RFP does not reflect its requirements, the agency should amend the solicitation and solicit revised proposals from the offerors. The agency also should make a new award determination, consistent with this decision.[19] If AAI is not found to offer the best value to the government, the agency should terminate AAI's delivery order.

We also recommend that DRS be reimbursed the costs of filing and pursuing its protest, including reasonable attorneys' fees. 4 C.F.R. sect. 21.8(d)(1). DRS should submit its certified claim for costs, detailing the time expended and costs incurred. 4 C.F.R. sect. 21.8(f)(1).

The protest is sustained.

Lynn H. Gibson
Acting General Counsel



[1] Although this solicitation was called an RFP, it sought proposals for the issuance of a delivery order under an ID/IQ contract. We use the term RFP and terms associated with this type of solicitation herein, as those were the terms employed by the agency.

[2] As discussed below, the Army also identified a weakness in DRS's proposal concerning the use of USB ports under the system maturity subfactor of the path forward for system performance and schedule factor.

[3] The agency also argues that two Army policy documents bar use of USB connections, which should have put offerors on notice as to the agency's interpretation of the USB provision. In turn, the agency suggests that this would have given rise to a patent ambiguity. However, the policies cited by the Army both provide for exceptions or waivers of the prohibition. See Agency Hearing Exhibit, Army Directive Concerning Removable Flash Media, Feb. 16, 2010, para. 6-7; Army Regulation 25-2, Information Assurance, para. 4-5(a)(6). On this record, we do not think an offeror could have known how the Army was interpreting this solicitation provision.

[4] We note for the record that AAI's proposal states that its proposed MAST unit will [deleted]. See AR, Tab 8a, AAI Technical Proposal, at 2, 4, 19; app. 1, at 3 (proposal provided to GAO, but not to protester). Although the evaluation report and the SSD cite the use of USB ports as a weakness for DRS's proposal, there is no mention of [deleted] in the agency's evaluation record.

[5] A [deleted] is a [deleted] that overlays the view of a MANPADS sensor. DRS's proposed MAST units contain a [deleted] to simulate the technologies associated with particular missiles. See AR, Tab 7A, DRS Technical Proposal, at 21.

[6] A [deleted] is a [deleted] associated with the scanning technology used in an SA-7 missile.

[7] The agency also argues that the protester inaccurately described its proposed technical approach as addressing "not only today's MANPADS threats but evolving future threats such as the SA-7/14/16/18." AR, Tab 7A, DRS Technical Proposal, at 21. The agency contends that this reference suggests that the protester does not understand that these four missiles are current threats, and that this misunderstanding casts a doubt over the entirety of its technical approach. Supp. AR at 15-16. We do not think that the language cited by the agency reasonably indicates that the protester would not simulate [deleted] scan technology, in light of the proposal's statement that such a feature was in fact provided. In any event, to the extent that the agency believed that the protester was characterizing SA-7/14/16/18 missiles as future threats, DRS's proposal stated that its [deleted] scan simulation addressed "future threats." See AR, Tab 7A, DRS Technical Proposal, at 5, 23.

[8] We note for the record that it appears that the kinds of future threats discussed by the SSEB Chair in the hearing were not addressed by AAI in its proposal. See AR, Tab 8A, AAI Technical Proposal, at 19.

[9] The protester contends that its proposed MAST units use [deleted] in a "closed system" that does not involve access to external sources of security threats that would normally be associated with [deleted] on a computer with access to the Internet, and that the "closed system" does not involve connecting the MAST unit to an Army network. Protester's Hearing Comments at 7-8, citing AR, Tab 7A, DRS Technical Proposal, at 6.

[10] As discussed above, we conclude that the agency's interpretation of the RFP provision concerning USB ports was unreasonable. We therefore conclude that the reference to USB ports as a weakness concerning data security was also unreasonable.

[11] In this regard, we note that the SOW is a 23-page document. We think that the testimony of the SSEB Chair indicates that it would not have been reasonable to expect offerors to substantively address all of the requirements in these documents in a technical proposal limited to 25 pages. Tr. at 51:13-18.

[12] In its report on the protest, the Army also argues that the cover letter to the solicitation states that "the contractor is required to comply with the stated requirements set forth in the MAST Propos[al] Submission Instructions and Evaluation Criteria, SOW, [performance specifications], [contract data requirements list], and Section B CLINs." AR at 56 (citing RFP amend. 4, Cover Letter, at 1). We do not think that this statement, which addressed a list of both the proposal evaluation criteria and performance requirements, reasonably advised offerors that they had to address all requirements of the SOW in their proposals. Further, as discussed above, the SSEB Chair testified that offerors were not expected to address all areas of the SOW in their proposals. Tr. at 51:13-18.

[13] We note for the record that the awardee's proposal under the technology insertion and open architecture subfactor does not mention data security, and makes two cursory references to information assurance in other sections. See AR, Tab 8A, at 13, 16, 19. To the extent that the agency believed that offerors were required to submit an approach to data security that discusses a detailed approach to the SOW information assurance requirements, it does not appear to us that the awardee's proposal addresses this requirement.

[14] In response to the SSEB Chair's testimony, the protester notes that its proposal addressed software security issues. See Protester's Hearing Comments at 10-11, citing AR, Tab 7A, DRS Technical Proposal, at 23.

[15] As discussed above with regard to the technology insertion and open architecture subfactor, we conclude that the agency's evaluation of DRS's proposed use of keyed USB ports was unreasonable. The agency also identified this issue as a weakness in its evaluation of DRS under the system maturity subfactor, which we also find unreasonable for the same reasons.

[16] As discussed above, the Army advised our Office that it would take corrective action in response to DRS's second protest by reevaluating offerors' proposals for unbalanced pricing. The agency also elected at that time to reevaluate certain areas of DRS's technical proposal, including the JMANPADS contract.

[17] DRS also argues that the agency's evaluation of offerors' proposed prices for unbalancing was unreasonable. The protester's argument, however, relies on its interpretation of the RFP as requiring a cumulative calculation of prices. See Protest, June 2, 2010, at 22. Because the agency did not follow the protester's interpretation of the RFP in calculating offerors' prices and because the protester's challenge to the calculation of prices is untimely, we will not consider the protester's allegations concerning the agency's unbalanced pricing evaluation.

[18] DRS raises other collateral issues, in addition to those discussed above. We have reviewed all of the arguments raised by the protester, and find that, aside from those specifically mentioned above, none provides a basis to sustain the protest. For example, the protester argues that the agency unreasonably assessed a weakness in its proposal under the recording and playback factor, based on the lack of overlay information in DRS's recording of the engagements. The protester contends that this weakness was unreasonable because, although DRS's recording did not provide this information, the RFP stated that "[i]n instances where requirements are not fully met or exceeded [in the prototype unit], the Offeror shall address the technical approach and schedule risks of incorporating those system changes." RFP, Proposal Instructions, at 3. This solicitation provision, however, is contained in the system maturity evaluation subfactor. RFP, Proposal Instructions, at 3. In contrast, the two subfactors of the recording and playback factor explicitly state that the agency will "evaluate the aircraft engagement Digital Video Recording and Playback for . . . overlay information presentation." Id. at 4. We conclude that the provision cited by the protester in the system maturity subfactor does not apply to the requirements of the two recording and playback subfactors.

[19] DRS also contends that the Army's March 2010 corrective action was improper because, as indicated in the SSEB Chair's testimony, the agency's reevaluation was not a new review of offerors' technical proposals, but rather a "higher level review" that was intended "to assist [GAO] in making a decision." Tr. at 17:19-18:9. We do not assume, as the protester suggests, that the agency's corrective action was taken merely to enhance its ability to win a subsequent protest. However, the agency should take corrective action that involves a full and meaningful review of the solicitation and the offerors' proposals at least with regard to the issues discussed above. In addition, in view of the apparent discrepancies between the evaluations of DRS and AAI, the Army may wish to revisit its evaluation of the awardee as well.
