
B-293036.5; B-293036.6; B-293036.7, Northrop Grumman Systems Corporation, June 4, 2004

DOCUMENT FOR PUBLIC RELEASE

The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Matter of: Northrop Grumman Systems Corporation

File: B-293036.5; B-293036.6; B-293036.7

Date: June 4, 2004

Richard A. Sauber, Esq., Deneen J. Melander, Esq., Steven A. Alerding, Esq., and Joseph J. LoBue, Esq., Fried, Frank, Harris, Shriver & Jacobson, for the protester.

Mark D. Colley, Esq., David S. Black, Esq., Stuart W. Turner, Esq., Caitlin K. Cloonan, Esq., and Michele M. Brown, Esq., Holland & Knight, for Raytheon Company, the intervenor.

Clarence D. Long, Esq., Richard C. Bean, Esq., Maj. Lawrence Anderson, Capt. Michael J. Askew, and Capt. Laura K. Koepnick, Department of the Air Force, for the agency.

Linda S. Lebowitz, Esq., and Michael R. Golden, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protest is denied where the agency's evaluation of proposals was reasonable and consistent with the terms of the solicitation; where the solicitation provided that the mission capability, proposal risk, and past performance evaluation factors, when combined, were significantly more important than the cost evaluation factor, the agency reasonably selected for award the firm submitting a higher technically rated, higher cost proposal.

DECISION

Northrop Grumman Systems Corporation protests the award of a contract to Raytheon Company under request for proposals (RFP) No. F19628-02-R-0072, issued by the Electronic Systems Center, Air Force Materiel Command, Department of the Air Force, Hanscom Air Force Base, Massachusetts, for the development, delivery, integration, installation, testing, and support of the Block 10.2 Multi-Intelligence (Multi-INT) Core Upgrade for the Distributed Common Ground System (DCGS). Northrop challenges the agency's evaluation of proposals and the source selection decision.[1]

We deny the protest.

BACKGROUND

The Air Force DCGS is a family of fixed and deployable multi-source groundstation processing systems that support a range of intelligence, surveillance, and reconnaissance (ISR) systems. The current DCGS system is composed of stovepiped, proprietary, legacy, platform-based tasking, processing, exploitation, and dissemination (TPED) groundstations and elements, which are expensive, as well as difficult, to upgrade and maintain because, among other things, the Air Force is required to contract with original system developers for upgrades and the equipment has reached the end of its life cycle, thus becoming unsupportable. The current modernization process is via numerous individual projects under multiple contracts. Agency Report (AR), Vol. 131, Tab 16A, Source Selection Evaluation Team (SSET) Briefing to SSA and Key Advisors, Jan. 26, 2004, at 5. The purpose of this procurement is to modernize the Air Force DCGS system through a block upgrade effort that will establish a multi-intelligence TPED environment with a robust, high capacity network connecting geographically separated, fixed, and deployable ISR groundstations, thereby enhancing the operation of the DCGS system. Id. at 4. In addition, under this procurement, the contractor will develop a DCGS Integration Backbone (DIB), a framework for upgrading other intelligence systems, which will facilitate data and product sharing with the other uniformed services while protecting service-unique application programs.

As relevant to this protest, the Block 10.2 Multi-INT Core Upgrade is required to be designed and fielded using a maximum of unmodified commercial-off-the-shelf (COTS)/government-off-the-shelf (GOTS) hardware and software with the objective of not using contractor-developed software. AR, Vol. 8, Tab 5S15, Statement of Objectives (SOO), July 9, 2003, at 2. As described in the SOO, an integral part of this procurement requires the contractor to establish a Distributed Ground System Experimental (DGS-X) capability to reduce the risk associated with integrating Air Force multi-INT core and future upgrades and modifications to operational Air Force DCGS sites. This risk reduction will be done by initially integrating the Air Force multi-INT core upgrade system in a DGS multi-INT site-like environment, known as DGS-X, located at Langley Air Force Base, Virginia, in order to support integrated system verifications, user evaluation, and directed government system implementation updates prior to fielding the systems to operational sites. This effort also includes the delivery, training, and support of the integration and operation of the DIB for other services or agencies to utilize as a framework for upgrading their intelligence systems. Id. According to paragraph 3.2.1.2 of the SOO, "DGS-X site acceptance testing shall be successfully completed within in [sic] 12 months of contract award as an objective requirement, and 15 months as a threshold requirement." Id. at 5.

The RFP (which was amended 15 times) contemplated the award of an indefinite quantity contract with cost-plus-award-fee, fixed-price, labor-hour, and cost-reimbursable line items, for a period of approximately 6 years. The RFP provided that the award would be made to the responsible offeror whose proposal represented the best value to the government, technical evaluation factors, past performance, and cost considered. The RFP contained two technical evaluation factors--mission capability and proposal risk. Each of these technical evaluation factors contained the same two subfactors (listed in descending order of importance)--architecture and design, and integrated processes. The RFP provided that the mission capability, proposal risk, and past performance evaluation factors were equal in importance and that each of these factors was more important than the cost evaluation factor; the RFP also stated that the mission capability, proposal risk, and past performance evaluation factors, when combined, were significantly more important than the cost evaluation factor. Under the RFP, the agency reserved the right to award to an offeror that submitted a higher technically rated, higher cost proposal. AR, Vol. 6, Tab 5K1, RFP amend. 5, § M, at 1-2.

Under the mission capability evaluation factor, the agency would evaluate the offeror's understanding of the RFP requirements (included in the technical requirements document (TRD) and the SOO), as well as whether the proposed architecture and design, and implementation thereof, was sound, consistent, and supported by the offeror's proposed interim accomplishments and dates. More specifically, under the architecture and design subfactor, the agency would evaluate, among other things, whether the offeror's approach provides, "as a minimum, an [Air Force] DCGS Block 10.2 Multi-INT Core open architecture, top-level design" and whether the offeror's approach provides "an architecture, infrastructure, and design, including hardware, software, and interfaces that employs the DIB and is scalable, modular, flexible, and accommodates modifications, growth, system upgrades, and site-unique requirements for the [Air Force] DCGS Block 10 Program." In addition, the agency would evaluate whether the offeror's proposed COTS-based DIB was, among other things, "hardware independent, scalable, modular, flexible, and accommodates growth with commercial standards and best practices"; "[i]mplements open systems/network centric protocols and processes to support DCGS integration"; "[i]s a web-enabled enterprise architecture"; "[i]s internet-protocol based and provides published data services"; "[is] interoperable at the data level and allows service[s] (i.e., Air Force, Navy, Army, Marine Corps) independence yet enables multi-service interoperability"; "[p]romotes data sharing across the services . . . using published interfaces"; and "[a]llows flexible integration of service-unique . . . applications." Id. at 2-3.

Under the integrated processes subfactor, the agency would evaluate an offeror's proposal to ensure that the multi-INT core activities and products would provide an "executable, and integrated solution set consistent with [the offeror's] proposed architecture and design, [its] Integrated Master Plan and Integrated Master Schedule," and with the offeror's statement of work (SOW) that satisfies the TRD and SOO. Id. at 4. For example, the agency would evaluate the soundness of the offeror's proposed approach for planning and implementing technology evolution and/or "fact-of-life" upgrades of COTS and GOTS hardware and software products. Id. at 5. The agency also would evaluate an offeror's program schedule and whether the firm demonstrated a clear understanding of the efforts and activities necessary to develop, integrate, test, build, and sustain the multi-INT core. Id. at 6.

For the architecture and design and integrated processes subfactors under the mission capability evaluation factor, the agency would assign one of four possible color ratings, as follows: red (unacceptable), yellow (marginal), green (acceptable), and blue (exceptional). As relevant here, a green/acceptable rating meant that the offeror met specified minimum performance or capability requirements necessary for acceptable contract performance, while a blue/exceptional rating meant that the offeror exceeded specified minimum performance or capability requirements in a way beneficial to the Air Force. AR, Vol. 131, Tab 16A, SSET Briefing to SSA and Key Advisors, supra, at 52.

Under the proposal risk evaluation factor, the agency would evaluate the risks and weaknesses associated with an offeror's proposed approach, including an assessment of the potential for disruption of schedule, degradation of performance, and the need for increased government oversight, as well as the likelihood of unsuccessful contract performance. In evaluating proposal risk, the agency also would assess the offeror's proposal for mitigating risks and why that approach was or was not manageable. AR, Vol. 6, Tab 5K1, RFP amend. 5, § M, at 6.

For the architecture and design and integrated processes subfactors under the proposal risk evaluation factor, the agency would assign one of three possible risk ratings, as follows: low, moderate, or high. As relevant here, a proposal risk rating of low meant that there was little potential for an offeror's proposal to cause disruption of schedule, increased cost, or degradation of performance, while a proposal risk rating of moderate meant that an offeror's proposal could potentially cause some disruption of schedule, increased cost, or degradation of performance. AR, Vol. 131, Tab 16A, SSET Briefing to SSA and Key Advisors, supra, at 54.

Under the past performance evaluation factor, the agency would evaluate an offeror's (and all key or major subcontractors') demonstrated record of contract compliance in supplying products and services meeting the users' needs, including cost and schedule considerations. The evaluation of an offeror's past performance would result in the agency's assignment of one of six possible overall performance risk confidence assessment ratings, as follows: high confidence, significant confidence, confidence, unknown confidence, little confidence, or no confidence. As relevant here, the significant confidence rating meant that the agency had little doubt that the offeror would successfully perform the contract, while the confidence rating meant that the agency had some doubt that the offeror would successfully perform the contract.

Under the RFP, in order to be considered as part of the past performance evaluation, an offeror's past work efforts were required to have been performed "within the past three years." AR, Vol. 6, Tab 5K1, RFP amend. 5, § M, at 6.[2] In assigning an overall performance risk confidence assessment rating, the agency would evaluate only those efforts considered to be "somewhat relevant," "relevant," or "very relevant"; these relevancy determinations would be based on the agency's consideration of 11 criteria, as listed in the RFP and generally described here as related to a firm's experience with, and understanding of, the contract requirements, including contract types and contract values. Id.

In addition to evaluating the extent to which the offeror's performance of particular contract efforts met the user's mission requirements, the agency would evaluate the offeror's history of forecasting and controlling costs and of adhering to schedules (including the administrative aspects of performance), the offeror's reasonable and cooperative behavior and commitment to customer satisfaction, and the offeror's business-like concern for the interest of the customer. Further, the RFP stated that where relevant performance records indicated performance problems, the agency would consider the number and severity of the problems and the appropriateness and effectiveness of any corrective actions taken. The RFP also stated that more recent and more relevant performance would have a greater role in the agency's past performance confidence assessment than less recent or less relevant performance. Finally, under the RFP, the agency reserved the right to use not only past performance data provided by the offeror, but also data obtained from other sources, including contractor performance assessment reporting systems, past performance questionnaires, interviews with program managers and contracting officers, and other sources known to the government. Id. at 7.[3]

The agency received proposals from Northrop and Raytheon, and conducted multiple rounds of evaluation and discussions. The final revised proposals of Northrop and Raytheon were evaluated as follows:

                                     Northrop                  Raytheon

Mission Capability/Proposal Risk
  Architecture and Design           Acceptable / Moderate     Exceptional / Low
  Integrated Processes              Acceptable / Moderate     Acceptable / Moderate
Past Performance                    Significant Confidence    Significant Confidence
Evaluated Cost                      $240.3 million            $267.5 million

AR, Vol. 131, Tab 16A, SSET Briefing to SSA and Key Advisors, supra, at 79.

After making an integrated assessment of the competing proposals in the areas of mission capability, proposal risk, and past performance, the SSA determined that notwithstanding the approximate 11-percent premium associated with Raytheon's proposal, the proposal submitted by Raytheon represented the best value to the government. The SSA did not use past performance as a discriminator in making his source selection decision. In this respect, the SSA believed, based not only on his own personal knowledge of the performance histories of both firms, but also on the underlying past performance evaluations of each firm's recent and relevant contracts, that the proposals of Raytheon and Northrop each merited an overall significant confidence rating. The SSA determined that Raytheon's exceptional, low risk rating for its proposed architecture and design approach represented significant and important operational and technical advantages to the government and would best enable the user to streamline information flow so that operational decisions could be made logically and rapidly. Again, notwithstanding the premium associated with Raytheon's proposal, the SSA selected Raytheon's higher technically rated proposal as representing the best value to the government. AR, Vol. 131, Tab 16C, Source Selection Decision (SSD), Feb. 23, 2004, at 1-10; see also Tr. at 369-95.
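
For reference, the approximate 11-percent premium follows directly from the evaluated costs shown in the table above:

\[
\frac{\$267.5\ \text{million} - \$240.3\ \text{million}}{\$240.3\ \text{million}} \approx 0.113 \approx 11\%
\]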

Because the reasonableness of the source selection decision, based upon the underlying evaluation of proposals, is at issue in this protest, we quote here a long excerpt from the source selection decision:

III. INTEGRATED ASSESSMENT

. . . .

A. Mission Capability and Proposal Risk Factors

. . . .

1. Subfactor 1, Architecture and Design

. . . .

Raytheon's robust Architecture and Design approach implements a more open architecture accommodating modifications, growth, and system upgrades to a greater extent than Northrop Grumman's Architecture and Design approach. Raytheon's Architecture and Design approach leverages [GOTS] software with JAVA 2 Enterprise Edition (J2EE) framework . . . to provide the required Enterprise Services and integrate Government Furnished Equipment (GFE) components into an open architecture including capabilities to interoperate with external nodes . . . . Raytheon is using [DCGS-Imagery] core components as functionally designed with scripts and [JAVA commercially available software] in its proposed implementation of enterprise Information Manager/Workflow Manager. Subsequently, this results in easier integration when software updates occur with National Imagery and Mapping Agency (NIMA) and other components. . . . The Mission Capability/Proposal Risk Team consensus was that Raytheon's proposed approach maximizes use of COTS to provide an open standards based architecture and provides a high level of flexibility. Raytheon's proposed approach ensures maximum combat capability. This provides an excellent architecture foundation for Block 10.2. This excellent architecture foundation in combination with the nine strengths was assessed as EXCEPTIONAL and LOW risk.

Northrop Grumman's rating for . . . Architecture and Design is ACCEPTABLE, MODERATE risk and was assessed as having three strengths. Northrop Grumman's Architecture and Design approach leverages GOTS software with J2EE framework to provide the required Enterprise Services and [to] integrate GFE components into an open architecture including capabilities to interoperate with external nodes . . . . Three strengths were identified in Northrop Grumman's proposal for Subfactor 1, Architecture and Design. . . . [With respect to] Northrop Grumman's [12] proposed "value added strengths" submitted as part of [its] Final Proposal Revision[,] . . . the Mission Capability/Proposal Risk Team considered them not to be strengths. . . . Northrop's proposed Architecture and Design approach [was assessed] as providing an adequate architecture foundation, i.e., it met requirements. The combination of an adequate architecture foundation and the three strengths was assessed . . . as being acceptable but not exceptional.

The Mission Capability/Proposal Risk Team determined that the reuse of existing Exploitation Workflow process in conjunction with a new Enterprise Workflow manager (Oracle 9iAS) unnecessarily increases complexity associated with developing and maintaining workflow processes across the enterprise. As Northrop proposed, both the Exploitation Workflow manager and the Enterprise Workflow manager would provide enterprise wide services. These two aspects of workflow manager along with mirroring and duplicating functionality provided by GFE raises issues of database synchronization/integrity, software modules with similar functionality potentially competing for system resources, synchronizing or phasing out of duplicated functionality to accommodate modifications, growth, and system upgrades. As GFE components, such as the Imagery Product Library (IPL) and Imagery Exploitation Support System (IESS), are upgraded during the timeframe of DCGS Block 10.2 site installs, RMTS [Record Message Traffic Subsystem] and other middleware are likely to need additional changes to accommodate the updated GFE components. Northrop Grumman's approach to use a private Imagery Server vice NIMA's IPL increases system complexity as NIMA releases version upgrades to its DCGS-I Core Components. The limited flexibility to accommodate modifications, growth, and system upgrades would likely result in required changes to the existing proposed Exploitation Workflow process that could potentially have a rippling effect impacting more than one software module. Periodically updating and/or phasing out GOTS middleware could potentially result in some disruption of schedule leading to increased cost and/or degradation of performance. As part of Northrop's Final Proposal Revision, Vol II page 75 was updated to state[,] "It [Exploitation Workflow Manager] is essentially a GUI [Graphical User Interface] . . . similar to NIMA's Enhanced Analyst Client (EAC), that works with IESS. . . . If the government prefers EAC to the proposed exploitation GUI, it can be integrated with no additional risk." Abbreviating the substantial concerns discussed in the ENs [evaluation notices] to a GUI issue and offering to replace a GUI did not address or mitigate the Mission Capability/Proposal Risk Team's concerns. Northrop Grumman's Architecture and Design approach is modular and flexible but it is not sufficiently flexible to support all planned GFE upgrades in a plug and play fashion. A MODERATE risk therefore exists with Northrop Grumman's Architecture and Design approach because it may not easily accommodate modifications, growth, and system upgrades.

2. Subfactor 2, Integrated Processes

. . . .

Northrop proposes sound processes based on Capability Maturity Model Integration (CMMI) Level 4 for Software and Level 3 for System Engineering. These processes address planning, scheduling, executing, and tracking activities to develop, integrate, test, field, and support the Air Force . . . Multi-INT Core. . . . Northrop Grumman is a Defense Intelligence Agency (DIA) designated Security Agent, which ensures high confidence for successful security certification and accreditation. Northrop Grumman proposes a 12 to 15-month schedule through DGS-X Site Acceptance Testing (SAT). The SSET's assessment of Northrop's proposed 12-month SAT schedule was that it was unlikely to be achieved, however, a 15-month schedule was possible. . . . Northrop Grumman re-uses existing middleware and interfaces. As software modules are pulled apart and then re-integrated with enterprise capabilities, unanticipated changes may be required as understanding of the effort improves.

. . . .

Raytheon proposes sound processes based on CMMI Level 3 for Software and System Engineering. These processes address planning, scheduling, executing, and tracking activities to develop, integrate, test, field, and support the Air Force . . . Multi-INT Core. . . . Raytheon proposes a 14-month schedule through DGS-X SAT. Raytheon also proposes developing new code for middleware and some interfaces, which may involve unanticipated changes as understanding of the effort improves.

The Mission Capability/Proposal Risk Team performed an independent analysis for each Offeror's proposed schedule based on the Offeror's software sizing estimate, estimated productivity rates, software schedule, and the level of effort estimate . . . . While there were differences between the Offerors' proposed productivity rates, these rates were deemed to be within industry standards. The results of the Team's analysis, based on each Offeror's productivity rate and sizing estimate, indicates that the software development efforts proposed by both Offerors are possible within their proposed software efforts. However, Team members were concerned that both proposed DGS-X schedules had a high degree of parallel activity, including parallel development efforts, and integration and test in parallel with software development with little to no time to accommodate any delays, i.e., reworks and regression testing. The Team was also concerned that initial software estimates are often underestimated. For these reasons, both Northrop Grumman's and Raytheon's Integrated Processes were assessed as MODERATE Risk with neither approach better than the other.

. . . .

B. Performance Risk Assessment Group

. . . .

Each of the Offerors' teams demonstrated the skill and experience necessary . . . to receive . . . a final overall rating of "Significant Confidence" based upon their performance on recent and relevant contracts in accordance with the evaluation criteria . . . . I considered all the information presented to me . . . in making my decision regarding performance risk and my own past experience with both offerors truly supports the [Significant Confidence] rating assigned . . . defined [as] "Based on the offeror's performance record, little doubt exists that the offeror will successfully perform the required effort." Since my conclusion is that both offerors have Significant Confidence, Past Performance is not a discriminator in my decision.

C. Cost

. . . .

IV. SUMMARY

. . . .

Although Raytheon had a higher [evaluated cost], [its] EXCEPTIONAL rating in Subfactor 1, Architecture and Design[,] when combined with [its] ACCEPTABLE rating in Subfactor 2, Integrated Processes, and [its] overall Significant Confidence rating in Past Performance provide the best overall benefit to the Government. The cost differential of approximately 11% was outweighed by the benefits of Raytheon's EXCEPTIONAL rating and LOW risk in the Architecture and Design Subfactor in the Mission Capability and Proposal Risk factors. Raytheon's EXCEPTIONAL, LOW risk Architecture and Design approach will best enable my customer to streamline information flow so that operational decisions are made logically and rapidly[,] providing the best value to the Government, and it is well worth the premium. Taken together, all the evaluated technical strengths Raytheon proposes offers significant and important operational and technical advantages to the Government and represents the best value to the Government.

AR, Vol. 131, Tab 16C, SSD, supra, at 5-10.
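
The schedule analysis described in the excerpt above reduces to simple arithmetic: a software sizing estimate divided by a productivity rate gives effort in staff-months, and staffing converts that effort into calendar time, which can then be compared against the proposed schedule. The sketch below illustrates only that cross-check; every name and figure in it is a hypothetical assumption supplied for illustration (the offerors' actual sizing estimates, productivity rates, and staffing levels are not disclosed in this decision).

```python
# Hypothetical sketch of the schedule cross-check described above.
# All input figures are illustrative assumptions, not values from the record.

def schedule_months(sloc: float, sloc_per_staff_month: float, staff: float) -> float:
    """Estimated calendar months: effort in staff-months divided by staffing."""
    staff_months = sloc / sloc_per_staff_month
    return staff_months / staff

proposed_months = 14        # e.g., a proposed schedule through DGS-X SAT
sloc_estimate = 150_000     # assumed new/modified source lines of code
productivity = 400          # assumed SLOC per staff-month
staff = 30                  # assumed average development staffing

estimate = schedule_months(sloc_estimate, productivity, staff)
margin = proposed_months - estimate
print(f"Independent estimate: {estimate:.1f} months; "
      f"margin against the proposed schedule: {margin:.1f} months")
# A small positive margin, as here, tracks the Team's finding: the proposed
# effort is "possible," but parallel development and testing leave little
# slack for rework or regression testing.
```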

TECHNICAL EVALUATION

As a preliminary comment, we note that while Northrop raises a number of issues challenging the reasonableness of the agency's technical evaluation, Northrop provides no meaningful basis for our Office to question the reasonableness of the agency's conclusions regarding the technical merits of its proposal. In fact, in a number of instances, Northrop's arguments amount to no more than mere disagreement with the results of the agency's technical evaluation. Northrop has not shown that the agency's evaluation was unreasonable or otherwise not in accordance with the terms of the RFP. To place the evaluation of Northrop's technical proposal in context, we point out that the agency characterized its concerns with Northrop's proposal as weaknesses, but not significant weaknesses, inadequacies, or deficiencies.[4] The agency also noted strengths in Northrop's proposal, defined as significant, outstanding, or exceptional aspects of an offeror's proposal that have merit and exceed the specified performance or capability requirements in a way beneficial to the Air Force. Id.

The evaluation of technical proposals is primarily the responsibility of the contracting agency; the agency is responsible for defining its needs and the best method of accommodating them and must bear the burden of any difficulties arising out of a defective evaluation. Raytheon Co., B-291449, Jan. 7, 2003, 2003 CPD ¶ 54 at 7. In reviewing an agency's evaluation and source selection decision, we will not reevaluate the proposals; we will only review the evaluation to determine whether the evaluation was reasonable and consistent with the stated evaluation criteria and with applicable statutes and regulations. Id. A protester's disagreement with the agency's judgment is not sufficient to establish that the agency acted unreasonably. Id.

Alleged Value-added Capabilities

Northrop complains that while the agency recognized three strengths for its proposal under the architecture and design subfactor, the agency failed to recognize as strengths 12 other value-added capabilities included in the firm's final revised proposal. Northrop characterizes these capabilities as enhancements that should have been evaluated as strengths because, in Northrop's view, they exceeded specified performance or capability requirements in a way beneficial to the Air Force. Protest, Mar. 2, 2004, at 52-53.[5] The agency, however, did not share Northrop's view. In its report, the agency provided a matrix listing the 12 TRD requirements and the corresponding language from Northrop's proposal which the firm relies upon to argue that these 12 items should have been evaluated as strengths. The agency goes on to explain why these 12 alleged value-added capabilities were not evaluated as strengths. Id. at 23-26. In its comments on the agency report, Northrop responds to the agency's position concerning only two of these items. Protester's Comments, Apr. 19, 2004, at 67-69.[6]

More specifically, the TRD required that the "system shall support six simultaneous MASINT [Measurement and Signature Intelligence] users at each Core Location" and that the "system shall support two simultaneous MASINT users at each Sentinel Falconer Remote location." AR, Vol. 8, Tab 5S16, Air Force TRD, July 9, 2003, at 83. By proposing to support "at least [deleted]" simultaneous MASINT users at each core location "[deleted]" and by proposing to support "at least [deleted]" simultaneous MASINT users at each Sentinel Falconer Remote location "[deleted]," Northrop argues that it proposed to exceed these two TRD requirements in a way beneficial to the Air Force for which two strengths should have been recognized. Protester's Comments, supra, at 68. The agency explains, however, that it did not consider these items to be strengths because Northrop only committed to meet, but not to exceed, the stated requirements by proposing to support, respectively, the six and the two simultaneous MASINT users. AR, Vol. 131, Tab 16B, PAR, supra, at 25-26. In other words, we think it was reasonable for the agency to conclude that Northrop's use of the phrase "at least" guaranteed nothing more than that the agency would receive from Northrop the required minimums in terms of MASINT support. Moreover, the agency points out, and Northrop does not dispute, that Raytheon's design also provided for the referenced MASINT support on any workstation in the local network. AR, Vol. 130A, Tab 2, Contracting Officer's Statement, at 201; Protester's Comments, supra, at 68. The record shows that neither Northrop nor Raytheon received strengths for exceeding these two TRD requirements involving MASINT support. On this record, we have no basis to question the reasonableness of the agency's decision not to consider Northrop's alleged value-added capabilities as strengths since Northrop only committed to meet, but not to exceed, the stated TRD requirements.

Site Acceptance Testing

Northrop next argues that the agency misevaluated as a weakness its schedule for site acceptance testing for the DGS-X prototype.[7]

As described above, the SOO required the contractor to establish a DGS-X capability to reduce the risk associated with integrating Air Force multi-INT core and future upgrades and modifications to operational Air Force DCGS sites. This risk reduction will be done by initially integrating the Air Force multi-INT core upgrade system in a DGS multi-INT site-like environment, known as DGS-X. Again, the SOO stated that "DGS-X site acceptance testing shall be successfully completed within in [sic] 12 months of contract award as an objective requirement, and 15 months as a threshold requirement." AR, Vol. 8, Tab 5S15, SOO, supra, at 5.[8]

As part of its final revised proposal, Northrop submitted a model contract that contained an SOW.[9] Paragraph 3.6.2.1.4 of Northrop's SOW, titled "DGS-X Delivery," stated, in relevant part, as follows: "[deleted]." AR, Vol. 51, Tab 6B26, Northrop's Final Revised Proposal, Model Contract, SOW, at 42. This language in Northrop's final revised proposal is a complete restatement of the SOO requirement addressing the timeframe for site acceptance testing, as quoted above. The agency evaluated Northrop's proposal as containing a commitment to meet, but not to exceed, the 15-month threshold requirement in the SOO for site acceptance testing because, in the agency's view, Northrop did not clearly or unambiguously commit in its proposal to complete this testing in 12 months. AR, Vol. 131, Tab 16B, PAR, supra, at 33. Northrop argued in its protest that it committed to meeting the 12-month objective requirement contained in the SOO. Protest, supra, at 40. Northrop maintains that it did not commit to achieving only the 15-month threshold requirement and that the agency's evaluation in this regard was unreasonable. Protester's Comments, supra, at 37.

Here, based on the language in Northrop's final revised proposal, we believe the agency reasonably concluded that Northrop was committed to meeting the 15-month mandatory threshold requirement for completion of site acceptance testing. In this regard, by merely parroting back the SOO language, Northrop did not expressly and clearly commit in its proposal to complete DGS-X site acceptance testing within the desired 12-month period. Hence, in our view, it was not unreasonable for the agency to read Northrop's proposal as not providing a firm commitment to meet the 12-month requirement. In fact, the agency's evaluation of the timetable commitment by Northrop to complete site acceptance testing is supported by other language in Northrop's proposal where the firm explains, for example, that

[deleted]

AR, Vol. 42, Tab 6B17, Northrop's Cost/Technical Proposal, Aug. 15, 2003, Vol. II, Part 1 of 2, 2.6, Integrated Master Plan, at 14; see also Protester's Comments, supra, at 35-36.

In our view, Northrop's position that it firmly committed to meet the 12-month desired objective requirement as set forth in the SOO is belied by the language in the firm's proposal, quoted above, where it explains how, if there are any changes and delays after contract award, it still would be able to comply with the mandatory 15-month minimum threshold requirement. On this record, we believe the agency reasonably concluded that Northrop met, but did not exceed, the SOO requirement for completion of DGS-X site acceptance testing.[10]


Re-use of Software

Northrop stated in its proposal that "Workflow management capabilities [had to] be distinguished at two levels in [its proposed] architecture." AR, Vol. 10, Tab 6A7f3, Northrop's Response to Evaluation Notice, July 14, 2003, at 2. The two levels involved "Enterprise-level Workflow Management" and "Exploitation Workflow." Id. Northrop challenges the agency's determination that a weakness in its proposal involved the firm's proposed re-use of its existing Exploitation Workflow module. The agency believed that the re-use of this existing software would increase the complexity associated with developing and maintaining workflow processes across the enterprise, which Northrop proposed to manage using an Oracle workflow manager, commercially available software. As explained at the hearing, "Part of the [agency's] concern was, if you have two things, keeping them in sync is going to be a little bit more complex at times. It's the old [']the more moving parts you have, the more complex things can be.[']" Tr. at 264. As a result, the agency concluded that Northrop's proposed architecture would not easily accommodate modifications, growth, and system upgrades in a plug-and-play fashion. AR, Vol. 131, Tab 16B, PAR, supra, at 17; see also Tr. at 244-89.

In its final revised proposal, Northrop explained why, in its view, the agency's concerns regarding this matter were misplaced. Northrop stated as follows:

Use of our existing "Exploitation Workflow" module does not increase complexity associated with developing and maintaining workflow processes across the enterprise . . . .

. . . .

The numerous interchanges on workflow management have apparently created a perception of "complexity associated with developing and maintaining workflow processes across the enterprise" resulting in a "moderate" proposal risk for architecture and design. This perceived complexity relates to our use of the phrase "Exploitation Workflow" to name an existing exploitation application.

In our proposal, the "Exploitation Workflow" application is a misnomer resulting from historical (internal) naming conventions. This application does not do workflow management as defined and performed by today's commercially available tools. It is essentially a GUI . . . similar to NIMA's Enhanced Analyst Client (EAC), that works with IESS.

. . . .

We proposed our existing exploitation GUI because it has value-added capabilities beyond those offered by EAC today. . . . If the government prefers EAC to the proposed exploitation GUI, it can be used with no additional risk or training required. EAC is already available in our solution.

Selection of the exploitation GUI has no impact on enterprise-level workflow, which is provided solely by Oracle Workflow Manager.

AR, Vol. 52, Tab 6B27, Northrop's Final Revised Proposal, Vol. II, Mission Capability, Sept. 25, 2003, at 75; Tr. at 283-89.

Based on the above-quoted language from its final revised proposal, Northrop contends that it "categorically offered" to remove the Exploitation Workflow module from its proposed architecture and to use NIMA's EAC. Protest, supra, at 68; Protester's Comments, supra, at 71-73. We believe, however, that the agency reasonably concluded, based on this quoted language, that there was no such "categorical offer" from Northrop which may have alleviated the agency's concerns with the firm's proposed architecture in terms of accommodating future modifications, growth, and system upgrades in a plug-and-play fashion. In fact, Northrop concedes that it "left this choice to the Air Force. However, Northrop Grumman firmly intended, and firmly committed, to using EAC in place of Exploitation Workflow if the Air Force desired that solution." Protester's Final Comments, Apr. 29, 2004, Declaration of Northrop's Director of Air Force Programs, at 3. Notwithstanding Northrop's disagreement with the agency's position, based on our review of the entire record, we have no basis to question the reasonableness of the agency's continuing concerns with Northrop's proposed architecture and its assignment of a moderate risk rating to Northrop's proposal in this area.

PAST PERFORMANCE EVALUATION

Northrop challenges the overall significant confidence rating assigned to Raytheon's proposal for the past performance evaluation factor. In this respect, Northrop contends that the agency ignored Raytheon's "extraordinarily poor" record of past performance under recent and relevant contracts, primarily focusing on Raytheon's performance of the Tactical Exploitation Group (TEG) delivery order for the Marine Corps. Protest, supra, at 11. Northrop argues that had Raytheon's performance of this delivery order been properly evaluated, Raytheon's proposal should have received no higher than an overall confidence rating for the past performance evaluation factor. Protester's Comments, supra, at 17.

More specifically, the TEG delivery order was issued by the Air Force to Raytheon on April 30, 1997, and required Raytheon to produce a COTS-based system to provide a tactically transportable capability to the Marine Corps for the receipt, processing, exploitation, dissemination, and archiving of primary tactical and theater, as well as secondary national, imagery. Although the Marine Corps was the user activity, the Air Force was the contracting activity and was responsible for determining whether Raytheon's performance of the delivery order was in accordance with the applicable technical requirements.

The record shows, based on past performance questionnaires and follow-up information furnished by Marine Corps personnel, that the Marine Corps, as the user activity, was not satisfied with Raytheon's performance of the TEG delivery order. For example, Marine Corps personnel reported, in great detail, that Raytheon encountered cost overruns and delays in delivery; these individuals also expressed their views that Raytheon's deliverables did not meet the mission needs of the Marine Corps. Marine Corps personnel further reported that the TEG system was removed from Raytheon's contract and ultimately was redesigned by another contractor (which happened to be Northrop). Marine Corps personnel finally reported that, if given the choice, they would not award to Raytheon again. AR, Vol. 133, Tabs 19E5-E7, Marine Corps References and Questionnaires for Raytheon's Performance of the TEG Delivery Order.

In contrast to what was reported by Marine Corps personnel, the record shows that the Air Force, as the contracting activity, was satisfied with Raytheon's performance of the TEG delivery order when the firm's performance was evaluated in light of the applicable technical requirements. In this regard, and consistent with the terms of the RFP here, the record contains two relevant Contractor Performance Assessment Reports (CPAR) for Raytheon, each signed by an Air Force TEG program manager (the same individual signed both CPARs), which address Raytheon's performance of the TEG delivery order in the areas of technical quality of product (product performance, systems engineering, software engineering, logistics support/sustainment, product assurance, and other technical performance); schedule; cost control; and management (management responsiveness, subcontract management, and program and other management).[11] For each of these areas, Raytheon received one of five possible adjectival/color ratings, as follows: unsatisfactory (red), marginal (yellow), satisfactory (green), very good (purple), and exceptional (blue). These adjectival/color ratings were supported by detailed narratives.

Under the CPAR covering the period from April 30, 1999 through April 29, 2000, for which only 57 days fell within the relevant 3-year period, Raytheon received 1 exceptional (blue) rating for management responsiveness; 1 very good (purple) rating for product assurance; and 10 satisfactory (green) ratings for technical quality of product, product performance, systems engineering, software engineering, logistic support/sustainment, schedule, cost control, management, subcontract management, and program and other management. The Air Force TEG program manager reported that he "probably would" award to Raytheon again if given the choice. (In completing the CPAR, the program manager had five choices to select from--definitely would not, probably would not, might or might not, probably would, or definitely would--in expressing the likelihood, if given the choice, of whether he would award another contract to Raytheon.) Completed Raytheon CPAR, Aug. 2000.

Under the CPAR covering the period from April 30, 2000 through December 31, 2000, a period completely within the relevant 3-year period, Raytheon received 2 very good (purple) ratings for software engineering and management responsiveness; 9 satisfactory (green) ratings for technical quality of product, product performance, systems engineering, logistic support/sustainment, product assurance, schedule, management, subcontract management, and program and other management; and 1 marginal (yellow) rating for cost control.[12] The narratives in this CPAR included references to some of Raytheon's performance problems related to the TEG delivery order.[13] The Air Force TEG program manager reported that he "probably would" award to Raytheon again if given the choice. Completed Raytheon CPAR, Mar. 2001.

Thus, to the extent Raytheon's performance of the TEG delivery order contributed to the agency's assignment here of an overall significant confidence rating to Raytheon's proposal for the past performance evaluation factor, the record shows that the agency was aware that the Marine Corps, as the user activity, was not satisfied with Raytheon's performance of this delivery order,[14] while the Air Force, as the contracting activity for the TEG delivery order, concluded that despite the concerns of the Marine Corps, Raytheon's performance was nonetheless in compliance with the applicable technical requirements. In this regard, the two relevant CPARs show that Raytheon received all satisfactory and higher (very good and exceptional) ratings for all evaluated items, with just one exception, that being a marginal rating for cost control. In our view, the single marginal rating did not necessarily require the agency to assign less than an overall significant confidence rating to Raytheon's proposal for the past performance evaluation factor. Furthermore, we think it is significant that the Air Force TEG program manager who completed these two CPARs stated that he probably would award another contract to Raytheon. Therefore, on this record, we believe that the agency's decision to assign an overall significant confidence rating to Raytheon's proposal for the past performance evaluation factor was reasonable. Although Northrop believes that in evaluating Raytheon's past performance, the agency should have accorded more weight to the dissatisfaction expressed by Marine Corps personnel, as described above, we conclude that this belief does not provide a basis to question the reasonableness of the agency's decision to assess Raytheon's performance as favorable overall such that the agency had little doubt that Raytheon would successfully perform the RFP requirements here.[15]

TRADEOFF DECISION

Northrop challenges the SSA's decision to pay a cost premium in selecting Raytheon's higher technically rated proposal for award.

In a negotiated procurement, where the solicitation does not provide for award on the basis of the lowest cost, technically acceptable proposal, an agency has the discretion to make an award to an offeror with a higher technical rating and a higher cost where it reasonably determines that the cost premium is justified and the result is consistent with the evaluation criteria. ACC Constr. Co., Inc., B-288934, Nov. 21, 2001, 2001 CPD ¶ 190 at 5-6.

Here, the RFP stated that the mission capability, proposal risk, and past performance evaluation factors, when combined, were significantly more important than the cost evaluation factor in determining the proposal representing the best value to the government. The RFP also stated that the agency reserved the right to award to an offeror that submitted a higher technically rated, higher cost proposal. The record shows that in making his best value determination, the SSA was aware that Raytheon's evaluated cost was higher (by approximately 11 percent) than Northrop's evaluated cost and that both Raytheon and Northrop received the same significant confidence ratings for past performance (thereby neutralizing past performance in terms of the best value determination). However, and as more completely set forth above, the SSA concluded that Raytheon's proposed technical approach reflected a more open architecture than Northrop's proposed technical approach in terms of accommodating modifications, growth, and system upgrades in a plug-and-play fashion. The SSA believed that Raytheon's exceptional, low risk architecture and design approach would "best enable [his] customer to streamline information flow so that operational decisions are made logically and rapidly[,] providing the best value to the Government, and it is well worth the premium." AR, Vol. 131, Tab 16C, SSD, supra, at 10. In this respect, at the hearing, the SSA testified that by awarding to Raytheon, the agency "was going to get an architecture that the warfighter could fight with, that streamlines information flow so the decision is made logically, quickly and rapidly and, based on [his] lessons [learned] from [Operation Iraqi Freedom], it was an important component. This is why [he] chose [Raytheon]." Tr. at 382. The SSA continued by stating that "the architectural approach that Raytheon was offering was an opportunity to ensure the maximum combat capability." Id. at 394. While Northrop disagrees with the SSA's decision to pay a premium to Raytheon for its technically superior approach, Northrop has failed to show that the SSA's best value determination was unreasonable or otherwise not in accordance with the terms of the RFP.

The protest is denied.[16]

Anthony H. Gamboa

General Counsel



[1] This protest follows the agency's decision to take corrective action in response to an earlier protest filed by Northrop challenging the evaluation of proposals and the award to Raytheon. In the earlier protest, our Office conducted a hearing in which members of the technical, past performance, and cost evaluation teams, as well as the source selection authority (SSA)--an Air Force colonel who serves as the Systems Program Director of the Air Force's Intelligence, Surveillance, Reconnaissance (ISR) Program Office--testified. In this decision, references to a hearing transcript (Tr.) relate to the hearing conducted by our Office prior to the agency's decision to take corrective action. References in this decision to the evaluation and source selection documents, as well as to the written submissions from the parties, are to materials that were filed after corrective action was taken.

[2] Past performance proposals were originally submitted on March 3, 2003. In taking corrective action, the agency considered the 3-year period to run from March 3, 2000 through March 3, 2003.

[3] This decision does not address the reasonableness of the agency's evaluation of the offerors' costs because the record is clear that any alleged errors in the cost evaluation could not have affected the source selection decision here. The record shows that Raytheon's proposed cost was approximately 12 percent higher than Northrop's. As shown below, Raytheon's evaluated cost was approximately 11 percent higher than Northrop's. While Northrop contests the evaluated cost adjustments, we conclude that, on this record, none of Northrop's allegations calls into question the reasonableness of the agency's source selection decision.

[4] A weakness is defined as a flaw in the proposal that increases the risk of unsuccessful contract performance, while a significant weakness is defined as a flaw that appreciably increases the risk of unsuccessful contract performance. An inadequacy is defined as an aspect or omission from an offeror's proposal that may contribute to a failure in meeting specified minimum performance or capability requirements. A deficiency is defined as a material failure of a proposal to meet a government requirement or a combination of significant weaknesses in a proposal that increases the risk of unsuccessful contract performance to an unacceptable level. AR, Vol. 131, Tab 16A, SSET Briefing to SSA and Key Advisors, supra, at 53.

[5] The agency previously did not document its assessment of these alleged value-added capabilities because the agency did not consider them to be strengths. In taking corrective action, the agency documented its previous evaluation of these items. AR, Vol. 131, Tab 16B, Amended Proposal Analysis Report (PAR), Feb. 5, 2004, at 23.

[6] Northrop provides no meaningful response to the remaining items. For example, Northrop does not rebut the agency's position that providing for up to [deleted] months of on-line storage did not represent added value because the operational concept requires products more than 1 month old to be moved to more permanent storage with other Air Force organizations and mission partners. As another example, Northrop does not rebut the agency's position that supporting on-line storage of up to [deleted] navigation plans per ISR platform type did not represent added value because the requirement for 30 navigation plans was generated from expected worldwide usage and, therefore, already anticipated the potential for growth. AR, Vol. 131, Tab 16B, PAR, supra, at 24-25.

[7] The agency also assigned a weakness to Raytheon's proposal for proposing to complete DGS-X site acceptance testing within 14 months.

[8] The 15-month threshold reflected a mandatory minimum that the contractor would be required to satisfy; the 12-month objective reflected a non-mandatory, desired enhancement exceeding the threshold.

[9] The RFP required an offeror to submit an SOW addressing the requirements, tasks, concepts, and objectives defined in the SOO and meeting the RFP requirements. The RFP provided that the successful offeror's SOW would be incorporated into the contract. The RFP advised that the offeror's SOW "should not be a restatement of the SOO," but should reflect the added value of the offeror's proposed efforts and those tasks required to manage the contracted effort, to deliver DCGS systems, and to provide required data and support. AR, Vol. 6, Tab 5K1, RFP amend. 5, § L, at 19. The RFP instructed that if an offeror used the SOO as the baseline document to develop its SOW, the offeror was to provide a redlined version showing changes from the SOO to the offeror's SOW.

[10] In related matters, Northrop also challenges the reasonableness of the agency's evaluation of its proposal as being of moderate risk. For example, the record shows that the agency recognized two risk mitigators in Northrop's proposal--the firm's Head Start Program (generally described as Northrop's completion of [deleted] schedule work [deleted]) and the firm's use of CMMI Level 4 practices and procedures regarding software development and integration. While Northrop disputes the agency's characterization of these two items as risk mitigators, as opposed to proposal strengths, we believe that Northrop's position amounts to no more than disagreement with the agency's evaluation where, under the terms of the RFP, the agency was permitted to assess an offeror's proposal for mitigating risks. Although Northrop may believe that these items are more than simply risk mitigators, this does not provide a basis to question the reasonableness of the agency's evaluation where the record shows that the agency gave meaningful consideration to these two aspects of Northrop's proposal.

As another example, Northrop challenges the agency's assignment of the same moderate risk rating to its proposal and to Raytheon's proposal involving each firm's software development efforts. Northrop essentially argues that its proposal should have been regarded as having a lower risk than Raytheon's proposal because it proposed to develop fewer software lines of code than did Raytheon. The agency report reflects that the development of software is not a precise science (as evidenced by industry articles cited by both the agency and Northrop) and is dependent upon a number of factors and variables, such as the type of software being developed and whether existing software is being modified versus whether software is being developed from scratch. The agency determined that the specific software development approaches, including software productivity rates, proposed by Northrop and Raytheon were consistent with industry standards and each presented risks, for example, involving the underestimation of initial software estimates and the failure to build in sufficient time to accommodate delays. Again, other than disagreeing with the agency's assessment, and providing no expert analysis of its own proposed approach versus Raytheon's proposed approach to software development, Northrop has failed to show that the agency unreasonably evaluated the risks associated with each firm's respective approach to software development. AR, Vol. 131, Tab 16B, PAR, supra, at 35-36; see also Tr. at 379-82.

[11] We note that the record contains two additional CPARs related to Raytheon's performance of the TEG delivery order that pre-date March 3, 2000, the beginning of the relevant 3-year period under the terms of the RFP. We also note that during the period when the agency was implementing corrective action, it discovered that the CPARs for Raytheon's performance of the TEG delivery order were not in its files. Raytheon, therefore, complied with the agency's request to provide replacement copies of the completed CPARs. Contrary to Northrop's suggestion, the agency's limited request that Raytheon provide copies of documents missing from the agency's files did not constitute the reopening of discussions with Raytheon.

[12] For each of the CPARs described above, Raytheon also received a "not applicable" rating for the area addressing other technical performance.

[13] In its comments on the agency report, Northrop acknowledges that this CPAR does describe some of Raytheon's performance problems related to the TEG delivery order. Protester's Comments, supra, at 16.

[14] Without going into detail, we note that there is information in the record that some of the problems Raytheon encountered in performing the TEG delivery order were, for example, attributable to the government and to the need for Raytheon to use an interim solution since a component being developed by Northrop could not be timely furnished to Raytheon as government-furnished equipment.

[15] In assigning an overall significant confidence rating to Raytheon's proposal for the past performance evaluation factor, the agency also considered Raytheon's performance of an on-going contract for NIMA, referred to as the Information Dissemination Services--Direct Delivery (IDS-D) contract. (This contract was not listed by Raytheon in its proposal, but rather (and consistent with the terms of the RFP), was brought to the attention of the past performance evaluators by the chairperson of the source selection evaluation team.) The record shows that for the first three (of eight) performance periods of the IDS-D contract--June 28, 2001 through September 30, 2001, October 1, 2001 through September 30, 2002, and October 1, 2002 through September 30, 2003--Raytheon received award fees of 98.2 percent, 93 percent, and 99 percent, respectively. ([deleted].) While Northrop quibbles with the agency's characterization of the IDS-D contract as very relevant, based on the agency's assessment that this contract satisfied at least 9 of the 11 relevancy criteria listed in the RFP, Northrop's primary complaint is that the agency afforded too much weight to Raytheon's performance of the IDS-D contract where there has not been final system delivery by Raytheon. Northrop asserts that the agency considered Raytheon's IDS-D contract in order to offset Raytheon's alleged poor performance of, for example, the TEG delivery order. Protester's Comments, supra, at 21. We disagree.

First, as discussed above, the agency reasonably considered Raytheon's performance of the TEG delivery order in assigning an overall significant confidence rating to Raytheon's proposal for the past performance evaluation factor. Second, Northrop does not dispute that the RFP did not require final system delivery in order for the agency to consider an offeror's performance under a particular contract, and that Raytheon successfully had passed a number of development milestones under the IDS-D contract, such as design reviews and initial product testing, as reflected by the high award fees. Tr. at 173-87. Accordingly, we find no merit to Northrop's contention that the agency unreasonably relied upon Raytheon's performance of the IDS-D contract to offset other less favorable past performance by Raytheon. We further note that at the hearing, the SSA testified that even if Raytheon's performance under the IDS-D contract had not been evaluated (while still considering Raytheon's performance of the TEG delivery order), and even if Raytheon had received an overall past performance rating of confidence, as opposed to significant confidence, the SSA would have made the same decision to award to Raytheon based on the technical superiority of that firm's proposal. Tr. at 389-91.

[16] This decision has addressed the primary arguments presented by Northrop's protest. In addition, Northrop raised a number of collateral issues that we have considered and find without merit, but which do not warrant detailed analysis or discussion.
