General Dynamics Information Technology, Inc.

B-407057 Oct 12, 2012

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This version, with no redactions, has been approved for public release.

Decision

Matter of: General Dynamics Information Technology, Inc.

File: B-407057

Date: October 12, 2012

Kevin P. Mullen, Esq., Ethan E. Marsh, Esq., and J. Alex Ward, Esq., Jenner & Block LLP, for the protester.
James J. McCullough, Esq., Karen M. Soares, Esq., and Brian M. Stanford, Esq., Fried, Frank, Harris, Shriver & Jacobson LLP, for Science Applications International Corp., an intervenor.
Debra J. Talley, Esq., and Leslie A. Nepper, Esq., U.S. Army Materiel Command, for the agency.
Tania Calhoun, Esq., and Edward Goldstein, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protest challenging evaluation of proposals and price/technical tradeoff decision is denied where the record shows that both were reasonable, consistent with the solicitation’s evaluation criteria, and adequately documented.

DECISION

General Dynamics Information Technology, Inc. (GDIT), of Fairfax, Virginia, protests the issuance of a task order to Science Applications International Corp. (SAIC), of Alexandria, Virginia, under request for proposals (RFP) No. W52P1J-11-R-0084, issued by the Department of the Army to obtain information technology support services for the G-2 Army Military Intelligence Enterprise (GAME) requirement. GDIT argues that the agency unreasonably evaluated proposals in certain respects, and conducted an improper best value determination.

We deny the protest.

BACKGROUND

The solicitation, issued March 22, 2012, anticipated the issuance of a fixed-price task order for information technology (IT) repair, maintenance, operations, logistics, and engineering services. RFP Performance Work Statement (PWS) ¶ 2. The contractor will provide the services for 19 continental United States (CONUS) and outside CONUS (OCONUS) organizations over a 45-day phase-in period, a 1-year base period, and up to two 1-year option periods. PWS ¶ 3.1, RFP § A.3. The competition was conducted under the Defense Intelligence Agency’s multiple-award Solution for the Information Technology Enterprise (SITE) contract. RFP § A.3.

Task order selection was to be made on a best value basis considering four evaluation factors: technical capability, management approach, past performance, and price. RFP § M.1. The technical capability and management approach factors were of equal importance, and each was more important than past performance. The past performance factor was more important than price. The non-price factors, when combined, were significantly more important than price. RFP § M.4. The evaluation of technical capability and price are not at issue in this protest.

The management approach factor comprised five equally important subfactors: (1) staffing, certification, and training plan; (2) program management; (3) transition and quality assurance plan; (4) organizational structure and communication plan; and (5) change management. RFP §§ M.4.1.3, M.6.2. In evaluating management approaches, the RFP provided for technical/risk ratings of blue/outstanding, purple/good, green/acceptable, and so on. RFP § M.6.2. The first and third subfactors are the only ones at issue here.

Under the past performance factor, the RFP provided for performance confidence assessment ratings of substantial confidence, satisfactory confidence, and so on, focusing on performance relevant to the contract requirements. RFP § M.6.3. The agency was to evaluate an offeror’s past performance to determine how relevant its recent efforts were to the requirements, and then to obtain past performance information from other sources to determine the quality and usefulness of that information as it applied to a performance confidence assessment rating. Id.

The Army received two proposals by the April 23 closing date, one from GDIT and one from SAIC. The source selection authority (SSA) received a detailed briefing on all of the evaluation results. On May 6, the agency sent each firm a letter opening discussions and attaching a variety of evaluation notices (EN). The Army evaluated each firm’s responses. The agency sent out additional ENs, one for GDIT and three for SAIC, and evaluated their responses. The SSA received a detailed briefing prior to the close of discussions. On June 5, the agency sent both firms letters closing discussions and requesting their final proposal revisions. The final proposal revisions were evaluated, and the source selection evaluation board (SSEB) prepared a report for the SSA. The final evaluation results were as follows:

                                                  GDIT                    SAIC
Technical Capability                              Good                    Good
  Information Assurance                           Good                    Good
  System Availability                             Good                    Outstanding
  Preventive and Remedial Maintenance             Good                    Outstanding
  Technical Enhancement                           Good                    Good
  Database Administration & Visual Information    Good                    Good
Management Approach                               Good                    Acceptable
  Staffing, Certification, Training Plan          Good                    Acceptable
  Program Management                              Good                    Outstanding
  Transition & Quality Control Plan               Outstanding             Good
  Organizational Structure & Communication Plan   Good                    Acceptable
  Change Management                               Good                    Good
Past Performance                                  Substantial Confidence  Substantial Confidence
Price                                             $165,809,367.20         $150,697,838.06

Agency Report (AR), Exh. 14-2, Source Selection Decision Document, at 6.

The SSA prepared a detailed source selection decision document (SSDD) using the evaluation team findings as a basis for her comparative analysis of the proposals. She adopted the team’s ratings, but looked behind those ratings to the underlying qualitative differences between the proposals.

First, notwithstanding the “good” technical capability ratings received by both firms, the SSA found that the evaluated strengths in SAIC’s proposal with respect to two subfactors--system availability, and preventive and remedial maintenance--made its proposal “far superior” to GDIT’s proposal. Id. at 7. Her conclusion was supported by a detailed discussion of both proposals, under all subfactors. Id. at 7-9. Second, the SSA found that, while GDIT’s management approach proposal was rated “good” and SAIC’s “acceptable,” the differences in their overall management approach proposals were “less distinct than would just appear if one were to look simply” at the adjectival ratings. Id. at 10. The SSA discussed the features of both proposals under each subfactor, including the evaluated performance risks of each offeror’s staffing approach, and the features in SAIC’s program management approach that would reduce overall performance risk. Id. at 10-14. Based on the proposals as a whole, she concluded that GDIT’s management approach was only “slightly stronger” than that of SAIC. Id. at 14.

The SSA concluded that SAIC had the superior technical capability proposal, while GDIT’s management approach proposal was only “slightly better” than that of SAIC. She explained that the past performance factor was not a discriminator because she found that each firm’s record provided a high expectation of successful performance. She stated that she did not find that GDIT’s “slightly better” management approach warranted paying a premium of 10.03 percent, or approximately $15.1 million, over SAIC’s superior technical capability proposal, and found that SAIC’s proposal represented the best value to the government. Id. at 15. Upon learning of the agency’s decision, GDIT filed the subject protest.
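The premium figures the SSA cited can be checked with a quick calculation. This is an editorial sketch, not part of the decision; the inputs are the two final evaluated prices reported in the source selection decision document:

```python
# Verify the SSA's stated price premium (editorial check, not from the record).
# Inputs: the final evaluated prices from the source selection decision document.
gdit_price = 165_809_367.20
saic_price = 150_697_838.06

premium_dollars = gdit_price - saic_price            # dollar premium for GDIT
premium_percent = premium_dollars / saic_price * 100  # premium relative to SAIC's price

print(f"Premium: ${premium_dollars:,.2f} ({premium_percent:.2f}%)")
# → Premium: $15,111,529.14 (10.03%)
```

The result matches the SSA's figures: a premium of approximately $15.1 million, or 10.03 percent over SAIC's price.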

DISCUSSION

GDIT primarily challenges the SSA’s assessment of risk under the management approach factor for SAIC and GDIT; argues that the Army improperly evaluated proposals with respect to past performance; and contends that the SSA’s price/technical tradeoff decision was improper.

Management Approach Factor

GDIT argues that, in making her source selection decision, the SSA minimized the risk presented by SAIC’s proposed staffing approach and inflated the risk of GDIT’s proposed staffing approach. These risks were identified under the management approach factor’s first subfactor--staffing, certification, and training plan. GDIT also challenges the evaluation of SAIC’s proposal under the transition and quality assurance plan subfactor.

In reviewing an agency’s evaluation of proposals and source selection decision, it is not our role to reevaluate submissions; rather, we examine the supporting record to determine whether the decision was reasonable, consistent with the stated evaluation criteria, and adequately documented. Trofholz Techs., Inc., B-404101, Jan. 5, 2011, 2011 CPD ¶ 144 at 3; Johnson Controls World Servs., Inc., B-289942, B-289942.2, May 24, 2002, 2002 CPD ¶ 88 at 6. Source selection officials have broad discretion in determining the manner and extent to which they will make use of, not only adjectival ratings or point scores, but also the written narrative justification underlying the technical results, subject only to the tests of rationality and consistency with the evaluation criteria. Development Alternatives, Inc., B-279920, Aug. 6, 1998, 98-2 CPD ¶ 54 at 9. A protester’s mere disagreement with the agency’s evaluation judgments, or with the agency’s determination as to the relative merits of competing proposals, does not establish that the evaluation or the source selection decision was unreasonable. Smiths Detection, Inc.; Am. Sci. and Eng’g, Inc., B-402168.4 et al., Feb. 9, 2011, 2011 CPD ¶ 39 at 6-7; ITW Military GSE, B-403866.3, Dec. 7, 2010, 2010 CPD ¶ 282 at 5.

GDIT has not challenged the evaluation team’s findings concerning the risks of each offeror’s staffing approach, only the SSA’s consideration of the team’s findings. We begin our discussion with a brief review of those findings.

GDIT’s initial proposal was evaluated as having a significant weakness because it failed to provide a detailed explanation of how it would satisfy certain PWS performance, service, and deliverable requirements with its proposed staffing level, which was well below the current staffing level. AR, Exh. 13-6, Management Approach Evaluation Report for GDIT, at 5. Based on this significant weakness, GDIT’s proposal was rated marginal under the subfactor and the factor overall. Id. at 8, 19.

On May 6, the Army sent an EN to GDIT asking for an explanation that justified its proposed staffing. AR, Exh. 8-1, GDIT EN4. Based on GDIT’s response, the Army changed the significant weakness to a strength, citing such things as the process GDIT used to develop the staffing proposal, its leveraging of staff, its experience as the incumbent, and workforce rebalance. AR, Exh. 13-6, Management Approach Evaluation Report for GDIT at 8-9. GDIT’s ratings for both the subfactor and the factor overall increased from marginal to good, with low performance risk. Id. at 9, 20. The evaluation team stated that, while the risk rating was low, there was risk in GDIT employing a reduced staff in relation to current staffing levels. The evaluators also stated that GDIT would be challenged to use its management plan to provide the required level of service with no staffing flexibility. Id. at 20; AR, Exh. 13-11, SSEB Report, at 15-17, 22-23.

SAIC’s initial proposal was evaluated as having a significant weakness because it did not contain an explanation of how certain PWS performance and service requirements would be satisfied at each organization with reduced staffing, which was also well below the government’s current staffing level. AR, Exh. 13-8, Management Approach Evaluation Report for SAIC at 5-6. The proposal was evaluated as having another significant weakness, additional weaknesses, and a deficiency related to other staffing concerns.[1] As a result, SAIC’s initial proposal was rated unacceptable for both the subfactor and the factor overall. Id. at 9, 24.

On May 6, the Army sent ENs to SAIC concerning staffing issues. One EN asked SAIC to explain how its proposed staffing levels would satisfy the PWS requirements. AR, Exh. 9-1, SAIC EN1. Based on SAIC’s response, the Army changed the significant weakness to a “meets the requirement,” citing such things as the tool SAIC used to develop the staffing proposal, its experience on successful comparable IT service contracts at other organizations, and its knowledge of the Army and the intelligence community. AR, Exh. 13-8, Management Approach Evaluation Report for SAIC at 9. The Army found that SAIC’s explanations were acceptable but raised certain risks. Id. at 9-10.

Another EN associated with the deficiency identified in SAIC’s proposal asked the firm to explain how its proposed staffing in a specific area would meet the PWS requirement. AR, Exh. 9-1, SAIC EN3. Based on SAIC’s response, the Army changed the deficiency to “meets the requirement,” but considered that the response raised concerns. Id. at 10-11.

The Army assigned SAIC’s proposal an interim rating of acceptable, with moderate risk. However, because SAIC’s EN responses raised areas of concern, and the Army believed SAIC’s rationale for reduced staffing was not sufficiently detailed, the Army sent SAIC follow-up requests for additional information; these requests were contained in three staffing-related ENs. Based on SAIC’s responses, the Army found that SAIC’s rationale for reduced staffing had been addressed but did not change the final subfactor and overall ratings of acceptable, with moderate risk. In this regard, the Army concluded that performance risk existed in three service areas, which it described in significant detail. AR, Exh. 13-8, Management Approach Evaluation Report for SAIC, at 13-15; AR, Exh. 13-11, SSEB Report, at 23-30, 36-37.

The record reflects that the SSA adopted the evaluation team’s ratings but stated that she did “not find great variance” in the overall management approach proposals. AR, Exh. 14-2, SSDD at 10. According to the SSA, “[c]ertainly, the differences in the overall Management Approaches presented by both Offerors are less distinct than would just appear if one were to look simply at the proposals as presenting a ‘Good’ versus ‘Acceptable’ adjectival rating comparison.” Id. The SSA then set forth a detailed comparative analysis of the proposals under each management approach subfactor, and for the factor overall. GDIT focuses its arguments on her discussion under the staffing, certification, and training plan subfactor.

Under that subfactor, GDIT received a rating of “good”, while SAIC received an “acceptable” rating. The SSA indicated that SAIC’s acceptable rating was largely driven by a finding of “moderate” risk associated with its staffing approach. Specifically, the SSA noted that the evaluation team identified increased risk associated with SAIC’s approach to staffing select service areas at three organizations, and that for these organizations, there was a moderate risk of unsuccessful performance. The SSA, however, found that “SAIC’s proven ability to resource multiple contracts worldwide will enable it to respond quickly in the event proposed staffing hinders performance which reduces the delta between the two offerors in the overall spectrum of performance risk.” AR, Exh. 14-2, SSDD at 11. In reaching this conclusion, she noted that the evaluation team had identified risk in GDIT’s staffing approach, as well.

Notwithstanding GDIT’s assertions to the contrary, the SSA’s comparative analysis did not improperly minimize the risk in SAIC’s approach or inflate the risk in GDIT’s approach. Instead, the SSA accurately described the risks identified by the evaluation team and concluded that they did not constitute a significant difference between the proposals considered in their entirety.[2] Although GDIT attempts to magnify the risk that the evaluation team found in SAIC’s proposal by citing to the evaluators’ views formed prior to the conclusion of discussions, these efforts are misleading and irrelevant since they did not reflect the agency’s ultimate assessment of SAIC’s final proposal.

GDIT also challenges the SSA’s statement that “SAIC’s proven ability to resource multiple contracts worldwide will enable it to respond more quickly in the event proposed staffing hinders performance,” arguing that the SSA improperly understood SAIC to have proposed more staff than it did. We disagree.

When the record is read as a whole, it is apparent that the SSA was referring to her view that the performance risk represented by SAIC’s staffing approach was reduced because SAIC had experience in resourcing multiple contracts worldwide that would enable it to respond quickly to augment staffing if necessary. In evaluating SAIC’s staffing approach, the evaluation team favorably noted SAIC’s experience gained from successful comparable IT service contracts at other facilities, as well as its ability to draw on all of its personnel and capabilities across the program in order to meet performance demands. AR, Exh. 13-11, SSEB Report, at 26, 28-29. The SSA’s view is echoed in her conclusion that, overall, she found GDIT’s management approach to be only “slightly stronger” than SAIC’s because she also found that SAIC’s program management approach and its ability to resource hundreds of projects worldwide moved the proposals closer together under this factor. AR, Exh. 14-2, SSDD, at 14.

GDIT finally argues that, if its discussion responses did not completely resolve the Army’s concerns about its proposed staffing approach, the Army engaged in unequal discussions. That is, GDIT suggests that the SSA’s statement that the evaluation team identified risk in GDIT’s staffing approach meant that the agency had remaining concerns about that approach and, as it did with SAIC, should have conducted further discussions with the firm.

In negotiated procurements, if an agency conducts discussions, the discussions must be meaningful. The Communities Group, B-283147, Oct. 12, 1999, 99-2 CPD ¶ 101 at 4. That is, agencies must lead offerors into the areas of their proposals that contain significant weaknesses or deficiencies, and may not mislead offerors. Metro Mach. Corp., B-281872, et al., Apr. 22, 1999, 99-1 CPD ¶ 101 at 6-7. Offerors must be given an equal opportunity to revise their proposals, but discussions need not be identical; rather, discussions must be tailored to each offeror’s proposal. Federal Acquisition Regulation §§ 15.306(d)(1), (e)(1); WorldTravelService, B-284155.3, Mar. 26, 2001, 2001 CPD ¶ 68 at 5-6.

SAIC’s initial proposal clearly had more numerous and substantial weaknesses related to its staffing approach than did GDIT’s initial proposal. That SAIC received more extensive discussions in this area than did GDIT does not indicate that discussions were unequal but, rather, that discussions were tailored to the proposals. See Metropolitan Interpreters and Translators, Inc., B-403912.4 et al., May 31, 2011, 2012 CPD ¶ 130 at 7. Both firms were sent discussion items after they submitted their initial discussion responses; GDIT received an item on its price and SAIC received follow-up items on its staffing approach. An agency may not have continuing discussions with only one offeror regarding a certain concern where it has the same concern with other proposals, but may reiterate a technical concern with one offeror where that concern applies only to that offeror, and the agency has no remaining technical concerns for the other offeror. See Int’l Business & Tech. Consultants, Inc., B-310424.2 et al., Sept. 23, 2008, 2008 CPD ¶ 185 at 9-10. The record shows the Army had no remaining concerns about GDIT’s staffing approach, which it considered a strength, and it was not required to further pursue the matter.

We now turn to GDIT’s allegations concerning the transition and quality assurance plan subfactor. In this regard, GDIT first argues that the Army overlooked the fact that SAIC took exception to the solicitation’s transition requirements for locations requiring Technical Expert Status Accreditation (TESA) approval.

The solicitation stated that, upon award, the contractor was to have 45 days to provide all fully qualified and cleared personnel to provide the contracted services, ensuring no break in services from the prior contractor. PWS ¶ 9.1. According to the RFP, contractor employees hired for positions in Germany as technical experts must receive TESA approval from the German government prior to beginning work in Germany. Since the contractor is responsible for creating position descriptions for all positions requiring TESA approval, see PWS ¶ 10.4.5; see also PWS ¶ 11.2.3, the contractor must complete these tasks in time to secure TESA approval and be operational within 45 days of award.

Under the heading, “Technical Expert/Analytical Support (TE/AS) Status Accreditation (TESA),” SAIC’s proposal states:

SAIC’s proposal assumes the granting of Enterprise Approval for the resulting Task Order and the approval of Technical Expert/Analytical Support (TE/AS) Status Accreditation for the individual technical personnel proposed. . . . Performance will begin upon final approval of the Task Order and individual staff members assigned to this Task. . . .

AR, Exh. 7-5a, SAIC Price Proposal at § 3.8.

Citing this language, GDIT argues that SAIC’s proposal conditioned the commencement of its performance in locations requiring TESA approval on the granting of TESA approval for its candidates, and thus failed to meet a material requirement. In our view, however, the agency reasonably understood the statement that SAIC would “assume granting of TESA approval” to mean, in this context, that SAIC viewed this approval as a “given” and that the firm, like any other firm, would begin performance upon final approval. To the extent that GDIT argues approval might not be a “given,” any contractor must meet the same approval requirements.

In a related matter, GDIT argues that SAIC’s proposal fails to set out how it would satisfy the TESA requirements while meeting the transition schedule deadline, and that the Army ignored these significant transition risks.

Under this subfactor, offerors were to provide a transition plan that, among other things, clearly identified specific actions that must occur for CONUS and OCONUS locations. As relevant here, these actions included those associated with entry, exit, and work requirements for U.S. citizens working in all OCONUS locations. RFP § L.3.2. The methodologies proposed in the transition plan were to be evaluated based on the extent to which they ensured a smooth transition while sustaining operations and critical missions at each organization. RFP § M.6.2(b)(3).

In describing its methodologies, SAIC proposed that, within 72 hours of award notification, it would submit its initial round of documents to begin the processes for such things as TESA accreditation. According to SAIC’s proposal, rapid submission, review, and approval of these processes comprised a critical, initial goal for its team to mitigate transition risk for overseas organizations. AR, Exh. 07-3a, SAIC Management Approach Proposal at § 3.1.1.2. This methodology is included in its plan of actions and timelines. Id. at § 3.1.2.1. SAIC also described its extensive experience in the rapid deployment of contractor personnel to overseas locations, and its “intimate familiarity” with deployment issues such as obtaining TESA approvals. Id. SAIC stated that its human resources staff has a team of specialists who exclusively handle international staffing issues and will address such things as host national employment requirements. Id. at § 3.1.2.2. The Army assigned SAIC’s proposal a strength associated with its proposed methodology to move quickly to ensure a successful OCONUS transition, citing the above passages from SAIC’s proposal discussing TESA approval, and found that it reduced transition risk. AR, Exh. 13-11, SSEB Report, at 34.

The Army clearly evaluated SAIC’s transition plan with respect to the need for TESA approvals, and found that it merited a strength. Based on the information in SAIC’s proposal, we have no basis to question the Army’s evaluation. GDIT’s view that SAIC’s proposal should have been more detailed does not show there were significant risks in its proposal that the Army ignored.

Past Performance Factor

GDIT argues that the Army failed to properly assess relevance when evaluating past performance, and this failure undermined the source selection decision. GDIT also argues that the SSA unreasonably relied only on the offerors’ adjectival performance confidence assessment ratings, which led her to conclude that SAIC’s past performance was equal to GDIT’s past performance. According to GDIT, the evaluation record shows that GDIT’s past contracts are more relevant and therefore deserved greater consideration.

An agency’s evaluation of past performance, including its consideration of the relevance, scope, and significance of an offeror’s performance history, is a matter of discretion which we will not disturb unless the agency’s assessments are unreasonable, inconsistent with the solicitation criteria, or undocumented. L-3 Sys. Co., B-404671.2, B-404671.4, Apr. 8, 2011, 2011 CPD ¶ 93 at 4; Family Entm’t Servs., Inc., d/b/a IMC, B-291997.4, June 10, 2004, 2004 CPD ¶ 128 at 5. A protester’s mere disagreement with such judgment does not provide a basis to sustain a protest. Birdwell Bros. Painting & Refinishing, B-285035, July 5, 2000, 2000 CPD ¶ 129 at 5. Our review of the record shows that GDIT’s allegations have no merit.

Offerors could submit up to five recent and relevant contract references for evaluation. RFP § L.3.3. The agency’s evaluation of proposals, and assignment of performance confidence assessment ratings, was to focus on performance that was relevant to the contract requirements. RFP § M.6.3.

Both SAIC and GDIT submitted five references. Consistent with the RFP’s requirements, the past performance evaluation team first evaluated the information to determine the relevance of the past efforts. After this determination, the team reviewed the past performance information it could gather to determine its quality and usefulness in assigning the performance confidence assessment ratings. The agency evaluated the information in the proposals, made follow-on telephone calls to references, and gathered available Contractor Performance Assessment Reporting System (CPARS) and Past Performance Information Retrieval System (PPIRS) data.[3] AR, Exh. 13-9, Past Performance Evaluation Report, at 2-4.

For GDIT, the agency found that three contracts were “very relevant” and two were “relevant.” With that relevance determination as a backdrop, the agency obtained available past performance information and conducted interviews. The CPARS for one “very relevant” contract rated GDIT as satisfactory in one area and very good to exceptional in others; the contracting officer for that procurement expressed a concern about GDIT’s ability to consistently backfill personnel, and rated its performance as good, but not excellent. The contracting officer for another “very relevant” contract rated GDIT’s overall performance as satisfactory. The contracting officer for one task order under the third “very relevant” contract said that GDIT’s overall customer satisfaction was good. For one of GDIT’s “relevant” contracts, the contracting officer stated that the firm’s overall performance was very good; no third party information was available for the fifth contract. Id. at 6-8. The Army concluded that, based on GDIT’s “recent/relevant performance record,” it had a high expectation that GDIT would successfully perform the required effort and assigned it a substantial confidence rating. The team stated that the one satisfactory rating did not outweigh the other good/exceptional findings. Id. at 6, 8.

For SAIC, the agency found that one contract was “very relevant” and four were “relevant.” With that relevance determination as a backdrop, the agency obtained available past performance information and conducted one interview. For SAIC’s “very relevant” contract, SAIC’s proposal stated that its performance was consistently rated as excellent, but the agency was unable to confirm this statement. The CPARS for one of SAIC’s “relevant” contracts confirmed that SAIC was rated exceptional in a number of areas and very good in some others. The CPARS for a second “relevant” contract confirmed that SAIC was rated exceptional in all areas; the contracting officer for this procurement stated that he was very satisfied with SAIC’s overall efforts. For a third “relevant” contract, SAIC stated that it was rated exceptional in all areas, but the agency was unable to confirm this statement. Information was not available on another “relevant” contract. Id. at 11-12. The Army found that, based on SAIC’s “recent/relevant performance record,” it had a high expectation that SAIC would successfully perform the required effort and assigned it a substantial confidence rating. Although it was not able to confirm all the meritorious past performance assessment ratings identified by SAIC, the agency stated that the past performance information it did find indicated that SAIC had received good/excellent ratings, and this gave the Army substantial confidence in SAIC’s ability to perform. Id. at 11-12.

GDIT’s argument that the agency did not consider relevance when assigning its performance confidence assessment ratings is belied by the record. GDIT’s assertion that SAIC should have been rated satisfactory confidence because its past efforts are less relevant than those of GDIT ignores the RFP’s requirement for an integrated assessment of both the relevance and quality of an offeror’s past performance. The Army conducted this integrated assessment, placing the quality of the offerors’ performance in the context of the relevance of its efforts, and GDIT has given us no basis to find it unreasonable.

GDIT argues that the SSA did not consider the evaluation information underlying the adjectival performance confidence assessment ratings and that, if she had, she would have discovered that GDIT’s past performance was far more relevant than that of SAIC and deserving of greater weight. The record, however, reflects that the SSA received the underlying evaluation information in her initial briefing. AR, Exh. 15-1, SSA Initial Briefing, at Slides 36-41, 77-81. Based on this information, the SSA properly assessed whether there were qualitative differences in the offerors’ past performance records that could be factored into her source selection decision, and concluded that the differences did not provide a basis for discrimination. We have no basis to find her conclusion in this regard unreasonable. See Sigmatech, Inc., B-406288.2, July 20, 2012, 2012 CPD ¶ 222 at 6.

Price/Technical Tradeoff Decision

GDIT argues that the SSA’s characterization of SAIC’s technical capability proposal as “far superior” to GDIT’s is inconsistent with the fact that, even though SAIC’s proposal received higher ratings than GDIT’s under two subfactors, the subfactors were equally weighted. As a result, the protester contends, SAIC’s proposal should be considered only slightly better than GDIT’s proposal. GDIT further argues that the SSA’s characterization of the difference between the firms’ management approach proposals as “minimal” is inconsistent with her conclusion concerning the technical capability proposals. In this regard, GDIT asserts that, since the SSA found SAIC’s technical capability proposal--with higher adjectival ratings under two subfactors--“far superior” to GDIT’s, to be consistent the SSA should have found GDIT’s management approach proposal--with higher adjectival ratings under three subfactors and a higher overall adjectival rating--to merit at least a “far superior” finding. GDIT Comments at 14-15.

In a best value procurement, it is the function of the SSA to perform a price/technical tradeoff, that is, to determine whether one proposal’s technical superiority is worth the higher price. ITW Military GSE, supra. Ratings, whether numerical, color, or adjectival, are merely guides to assist agencies in evaluating proposals; information regarding strengths and weaknesses of proposals is the type of information that source selection officials should consider, in addition to ratings, to enable them to determine whether and to what extent meaningful differences exist between proposals. Pemco Aeroplex, Inc., B-310372, Dec. 27, 2007, 2008 CPD ¶ 2 at 6; ACCESS Sys., Inc., B-400623.3, Mar. 4, 2009, 2009 CPD ¶ 56 at 7. Proposals with the same adjectival ratings are not necessarily of equal quality, and an agency may properly consider specific advantages that make one proposal of higher quality than another. Pemco Aeroplex, Inc., supra, at 6-7. A single evaluation factor--even a lower-weighted factor--may properly be relied upon as a key discriminator for purposes of a source selection decision. Smiths Detection, Inc.; Am. Sci. and Eng’g, Inc., supra, at 16.

Here, the SSA properly did not engage in the purely mathematical or mechanical price/technical tradeoff methodology advocated by GDIT. Rather, in making her price/technical tradeoff decision, the SSA looked behind the adjectival ratings to identify any qualitative differences that existed between the proposals. Recognizing the wide discretion afforded agencies in making their tradeoff decisions, we have no basis to find the SSA’s decision in this case unreasonable or otherwise improper.

The protest is denied.

Lynn H. Gibson
General Counsel



[1] These other concerns were addressed during discussions.

[2] There is no evidence to support GDIT’s argument that the SSA disagreed with the evaluation team’s findings concerning the risks of both proposals.

[3] The PPIRS is a web-enabled, government-wide application that collects quantifiable delivery and quality past performance information. FAR § 42.1503; General Dynamics, American Overseas Marine, B-401874.14, B-401874.15, Nov. 1, 2011, 2012 CPD ¶ 85 at 7 n.4.
