General Dynamics Information Technology, Inc.
Highlights
General Dynamics Information Technology, Inc. (GDIT), of Fairfax, Virginia, protests the issuance of a task order to Science Applications International Corp. (SAIC), of Alexandria, Virginia, under request for proposals (RFP) No. W52P1J-11-R-0084, issued by the Department of the Army to obtain information technology support services for the G-2 Army Military Intelligence Enterprise (GAME) requirement. GDIT argues that the agency unreasonably evaluated proposals in certain respects, and conducted an improper best value determination.
DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This version, with no redactions, has been approved for public release.
Decision
Matter of: General Dynamics Information Technology, Inc.
File: B-407057
Date: October 12, 2012
Kevin P. Mullen, Esq., Ethan E. Marsh, Esq., and J. Alex Ward, Esq., Jenner & Block LLP, for the protester.
James J. McCullough, Esq., Karen M. Soares, Esq., and Brian M. Stanford, Esq., Fried, Frank, Harris, Shriver & Jacobson LLP, for Science Applications International Corp., an intervenor.
Debra J. Talley, Esq., and Leslie A. Nepper, Esq., U.S. Army Materiel Command, for the agency.
Tania Calhoun, Esq., and Edward Goldstein, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.
DIGEST
Protest challenging evaluation of proposals and price/technical tradeoff decision is denied where the record shows that both were reasonable, consistent with the solicitation's evaluation criteria, and adequately documented.
DECISION
General Dynamics Information Technology, Inc. (GDIT), of Fairfax, Virginia, protests the issuance of a task order to Science Applications International Corp. (SAIC), of Alexandria, Virginia, under request for proposals (RFP) No. W52P1J-11-R-0084, issued by the Department of the Army to obtain information technology support services for the G-2 Army Military Intelligence Enterprise (GAME) requirement. GDIT argues that the agency unreasonably evaluated proposals in certain respects, and conducted an improper best value determination.
We deny the protest.
BACKGROUND
The solicitation, issued March 22, 2012, anticipated the issuance of a fixed-price task order for information technology (IT) repair, maintenance, operations, logistics, and engineering services. RFP Performance Work Statement (PWS) ¶ 2. The contractor will provide the services for 19 continental United States (CONUS) and outside CONUS (OCONUS) organizations over a 45-day phase-in period, a 1-year base period, and up to two 1-year option periods. PWS ¶ 3.1, RFP § A.3. The competition was conducted under the Defense Intelligence Agency's multiple-award Solution for the Information Technology Enterprise (SITE) contract. RFP § A.3.
Task order selection was to be made on a best value basis considering four evaluation factors: technical capability, management approach, past performance, and price. RFP § M.1. The technical capability and management approach factors were of equal importance, and each was more important than past performance. The past performance factor was more important than price. The non-price factors, when combined, were significantly more important than price. RFP § M.4. The evaluation of technical capability and price are not at issue in this protest.
The management approach factor comprised five equally important subfactors: (1) staffing, certification, and training plan; (2) program management; (3) transition and quality assurance plan; (4) organizational structure and communication plan; and (5) change management. RFP §§ M.4.1.3, M.6.2. In evaluating management approaches, the RFP provided for technical/risk ratings of blue/outstanding, purple/good, green/acceptable, and so on. RFP § M.6.2. The first and third subfactors are the only ones at issue here.
Under the past performance factor, the RFP provided for performance confidence assessment ratings of substantial confidence, satisfactory confidence, and so on, focusing on performance relevant to the contract requirements. RFP § M.6.3. The agency was to evaluate an offeror's past performance to determine how relevant its recent efforts were to the requirements, and then to obtain past performance information from other sources to determine the quality and usefulness of that information as it applied to a performance confidence assessment rating. Id.
The Army received two proposals by the April 23 closing date, one from GDIT and one from SAIC. The source selection authority (SSA) received a detailed briefing on all of the evaluation results. On May 6, the agency sent each firm a letter opening discussions and attaching a variety of evaluation notices (EN). The Army evaluated each firm's responses. The agency sent out additional ENs, one for GDIT and three for SAIC, and evaluated their responses. The SSA received a detailed briefing prior to the close of discussions. On June 5, the agency sent both firms letters closing discussions and requesting their final proposal revisions. The final proposal revisions were evaluated, and the source selection evaluation board (SSEB) prepared a report for the SSA. The final evaluation results were as follows:
Factor | GDIT | SAIC |
Technical Capability | Good | Good |
  Information Assurance | Good | Good |
  System Availability | Good | Outstanding |
  Preventive and Remedial Maintenance | Good | Outstanding |
  Technical Enhancement | Good | Good |
  Database Administration & Visual Information | Good | Good |
Management Approach | Good | Acceptable |
  Staffing, Certification, Training Plan | Good | Acceptable |
  Program Management | Good | Outstanding |
  Transition & Quality Assurance Plan | Outstanding | Good |
  Organizational Structure & Communication Plan | Good | Acceptable |
  Change Management | Good | Good |
Past Performance | Substantial Confidence | Substantial Confidence |
Price | $165,809,367.20 | $150,697,838.06 |
Agency Report (AR), Exh. 14-2, Source Selection Decision Document, at 6.
The SSA prepared a detailed source selection decision document (SSDD) using the evaluation team's findings as a basis for her comparative analysis of the proposals. She adopted the team's ratings, but looked behind those ratings to the underlying qualitative differences between the proposals.
First, notwithstanding the good technical capability ratings received by both firms, the SSA found that the evaluated strengths in SAIC's proposal with respect to two subfactors--system availability, and preventive and remedial maintenance--made its proposal far superior to GDIT's proposal. Id. at 7. Her conclusion was supported by a detailed discussion of both proposals, under all subfactors. Id. at 7-9. Second, the SSA found that, while GDIT's management approach proposal was rated good and SAIC's acceptable, the differences in their overall management approach proposals were less distinct than would appear from a simple comparison of the adjectival ratings. Id. at 10. The SSA discussed the features of both proposals under each subfactor, including the evaluated performance risks of each offeror's staffing approach, and the features in SAIC's program management approach that would reduce overall performance risk. Id. at 10-14. Based on the proposals as a whole, she concluded that GDIT's management approach was only slightly stronger than that of SAIC. Id. at 14.
The SSA concluded that SAIC had the superior technical capability proposal, while GDIT's management approach proposal was only slightly better than that of SAIC. She explained that the past performance factor was not a discriminator because she found that each firm's record provided a high expectation of successful performance. She stated that she did not find that GDIT's slightly better management approach warranted paying a premium of 10.03 percent, or approximately $15.1 million, over SAIC's superior technical capability proposal, and found that SAIC's proposal represented the best value to the government. Id. at 15. Upon learning of the agency's decision, GDIT filed the subject protest.
DISCUSSION
GDIT primarily challenges the SSA's assessment of risk under the management approach factor for SAIC and GDIT; argues that the Army improperly evaluated proposals with respect to past performance; and contends that the SSA's price/technical tradeoff decision was improper.
Management Approach Factor
GDIT argues that, in making her source selection decision, the SSA minimized the risk presented by SAIC's proposed staffing approach and inflated the risk of GDIT's proposed staffing approach. These risks were identified under the management approach factor's first subfactor--staffing, certification, and training plan. GDIT also challenges the evaluation of SAIC's proposal under the transition and quality assurance plan subfactor.
In reviewing an agency's evaluation of proposals and source selection decision, it is not our role to reevaluate submissions; rather, we examine the supporting record to determine whether the decision was reasonable, consistent with the stated evaluation criteria, and adequately documented. Trofholz Techs., Inc., B-404101, Jan. 5, 2011, 2011 CPD ¶ 144 at 3; Johnson Controls World Servs., Inc., B-289942, B-289942.2, May 24, 2002, 2002 CPD ¶ 88 at 6. Source selection officials have broad discretion in determining the manner and extent to which they will make use of, not only adjectival ratings or point scores, but also the written narrative justification underlying the technical results, subject only to the tests of rationality and consistency with the evaluation criteria. Development Alternatives, Inc., B-279920, Aug. 6, 1998, 98-2 CPD ¶ 54 at 9. A protester's mere disagreement with the agency's evaluation judgments, or with the agency's determination as to the relative merits of competing proposals, does not establish that the evaluation or the source selection decision was unreasonable. Smiths Detection, Inc.; Am. Sci. and Eng'g, Inc., B-402168.4 et al., Feb. 9, 2011, 2011 CPD ¶ 39 at 6-7; ITW Military GSE, B-403866.3, Dec. 7, 2010, 2010 CPD ¶ 282 at 5.
GDIT has not challenged the evaluation team's findings concerning the risks of each offeror's staffing approach, only the SSA's consideration of the team's findings. We begin our discussion with a brief review of those findings.
GDIT's initial proposal was evaluated as having a significant weakness because it failed to provide a detailed explanation of how it would satisfy certain PWS performance, service, and deliverable requirements with its proposed staffing level, which was well below the current staffing level. AR, Exh. 13-6, Management Approach Evaluation Report for GDIT, at 5. Based on this significant weakness, GDIT's proposal was rated marginal under the subfactor and the factor overall. Id. at 8, 19.
On May 6, the Army sent an EN to GDIT asking for an explanation that justified its proposed staffing. AR, Exh. 8-1, GDIT EN4. Based on GDIT's response, the Army changed the significant weakness to a strength, citing such things as the process GDIT used to develop the staffing proposal, its leveraging of staff, its experience as the incumbent, and workforce rebalance. AR, Exh. 13-6, Management Approach Evaluation Report for GDIT, at 8-9. GDIT's ratings for both the subfactor and the factor overall increased from marginal to good, with low performance risk. Id. at 9, 20. The evaluation team stated that, while the risk rating was low, there was risk in GDIT's employing a reduced staff in relation to current staffing levels. The evaluators also stated that GDIT would be challenged to use its management plan to provide the required level of service with no staffing flexibility. Id. at 20; AR, Exh. 13-11, SSEB Report, at 15-17, 22-23.
SAIC's initial proposal was evaluated as having a significant weakness because it did not contain an explanation of how certain PWS performance and service requirements would be satisfied at each organization with reduced staffing, which was also well below the government's current staffing level. AR, Exh. 13-8, Management Approach Evaluation Report for SAIC, at 5-6. The proposal was evaluated as having another significant weakness, weaknesses, and a deficiency related to other staffing concerns.[1] As a result, SAIC's initial proposal was rated unacceptable for both the subfactor and the factor overall. Id. at 9, 24.
On May 6, the Army sent ENs to SAIC concerning staffing issues. One EN asked SAIC to explain how its proposed staffing levels would satisfy the PWS requirements. AR, Exh. 9-1, SAIC EN1. Based on SAIC's response, the Army changed the significant weakness to a "meets the requirement" finding, citing such things as the tool SAIC used to develop the staffing proposal, its experience on successful comparable IT service contracts at other organizations, and its knowledge of the Army and the intelligence community. AR, Exh. 13-8, Management Approach Evaluation Report for SAIC, at 9. The Army found that SAIC's explanations were acceptable but raised certain risks. Id. at 9-10.
Another EN associated with the deficiency identified in SAIC's proposal asked the firm to explain how its proposed staffing in a specific area would meet the PWS requirement. AR, Exh. 9-1, SAIC EN3. Based on SAIC's response, the Army changed the deficiency to "meets the requirement," but considered that the response raised concerns. Id. at 10-11.
The Army assigned SAIC's proposal an interim rating of acceptable, with moderate risk. However, because SAIC's EN responses raised areas of concern, and the Army believed SAIC's rationale for reduced staffing was not sufficiently detailed, the Army sent SAIC follow-up requests for additional information; these requests were contained in three staffing-related ENs. Based on SAIC's responses, the Army found that SAIC's rationale for reduced staffing had been addressed but did not change the final subfactor and overall ratings of acceptable, with moderate risk. In this regard, the Army concluded that performance risk existed in three service areas, which it described in significant detail. AR, Exh. 13-8, Management Approach Evaluation Report for SAIC, at 13-15; AR, Exh. 13-11, SSEB Report, at 23-30, 36-37.
The record reflects that the SSA adopted the evaluation team's ratings but stated that she did not find great variance in the overall management approach proposals. AR, Exh. 14-2, SSDD at 10. According to the SSA, "[c]ertainly, the differences in the overall Management Approaches presented by both Offerors are less distinct than would just appear if one were to look simply at the proposals as presenting a Good versus Acceptable adjectival rating comparison." Id. The SSA then set forth a detailed comparative analysis of the proposals under each management approach subfactor, and for the factor overall. GDIT focuses its arguments on her discussion under the staffing, certification, and training plan subfactor.
Under that subfactor, GDIT received a rating of good, while SAIC received an acceptable rating. The SSA indicated that SAIC's acceptable rating was largely driven by a finding of moderate risk associated with its staffing approach. Specifically, the SSA noted that the evaluation team identified increased risk associated with SAIC's approach to staffing select service areas at three organizations, and that for these organizations, there was a moderate risk of unsuccessful performance. The SSA, however, found that SAIC's proven ability to resource multiple contracts worldwide will enable it to respond quickly in the event proposed staffing hinders performance, which reduces the delta between the two offerors in the overall spectrum of performance risk. AR, Exh. 14-2, SSDD at 11. In reaching this conclusion, she noted that the evaluation team had identified risk in GDIT's staffing approach, as well.
Notwithstanding GDIT's assertions to the contrary, the SSA's comparative analysis did not improperly minimize the risk in SAIC's approach or inflate the risk in GDIT's approach. Instead, the SSA accurately described the risks identified by the evaluation team and concluded that they did not constitute a significant difference between the proposals considered in their entirety.[2] Although GDIT attempts to magnify the risk that the evaluation team found in SAIC's proposal by citing to the evaluators' views formed prior to the conclusion of discussions, these efforts are misleading and irrelevant since they did not reflect the agency's ultimate assessment of SAIC's final proposal.
GDIT also challenges the SSA's statement that SAIC's proven ability to resource multiple contracts worldwide will enable it to respond more quickly in the event proposed staffing hinders performance, arguing that the SSA improperly understood SAIC to have proposed more staff than it did. We disagree.
When the record is read as a whole, it is apparent that the SSA was referring to her view that the performance risk represented by SAIC's staffing approach was reduced because SAIC had experience in resourcing multiple contracts worldwide that would enable it to respond quickly to augment staffing if necessary. In evaluating SAIC's staffing approach, the evaluation team favorably noted SAIC's experience gained from successful comparable IT service contracts at other facilities, as well as its ability to draw on all of its personnel and capabilities across the program in order to meet performance demands. AR, Exh. 14-2, SSEB Report at 26, 28-29. The SSA's view is echoed in her conclusion that, overall, she found GDIT's management approach to be only slightly stronger than SAIC's because she also found that SAIC's program management approach and its ability to resource hundreds of projects worldwide moved the proposals closer together under this factor. AR, Exh. 14-2, SSDD, at 14.
GDIT finally argues that, if its discussion responses did not completely resolve the Army's concerns about its proposed staffing approach, the Army engaged in unequal discussions. That is, GDIT suggests that the SSA's statement that the evaluation team identified risk in GDIT's staffing approach meant that the agency had remaining concerns about that approach and, as it did with SAIC, should have conducted further discussions with the firm.
In negotiated procurements, if an agency conducts discussions, the discussions must be meaningful. The Communities Group, B-283147, Oct. 12, 1999, 99-2 CPD ¶ 101 at 4. That is, agencies must lead offerors into the areas of their proposals that contain significant weaknesses or deficiencies, and may not mislead offerors. Metro Mach. Corp., B-281872, et al., Apr. 22, 1999, 99-1 CPD ¶ 101 at 6-7. Offerors must be given an equal opportunity to revise their proposals, but discussions need not be identical; rather, discussions must be tailored to each offeror's proposal. Federal Acquisition Regulation §§ 15.306(d)(1), (e)(1); WorldTravelService, B-284155.3, Mar. 26, 2001, 2001 CPD ¶ 68 at 5-6.
SAIC's initial proposal clearly had more numerous and substantial weaknesses related to its staffing approach than did GDIT's initial proposal. That SAIC received more extensive discussions in this area than did GDIT does not indicate that discussions were unequal but, rather, that discussions were tailored to the proposals. See Metropolitan Interpreters and Translators, Inc., B-403912.4 et al., May 31, 2011, 2012 CPD ¶ 130 at 7. Both firms were sent discussion items after they submitted their initial discussion responses; GDIT received an item on its price and SAIC received follow-up items on its staffing approach. An agency may not have continuing discussions with only one offeror regarding a certain concern where it has the same concern with other proposals, but may reiterate a technical concern with one offeror where that concern applies only to that offeror, and the agency has no remaining technical concerns for the other offeror. See Int'l Business & Tech. Consultants, Inc., B-310424.2 et al., Sept. 23, 2008, 2008 CPD ¶ 185 at 9-10. The record shows the Army had no remaining concerns about GDIT's staffing approach, which it considered a strength, and it was not required to further pursue the matter.
We now turn to GDIT's allegations concerning the transition and quality assurance plan subfactor. In this regard, GDIT first argues that the Army overlooked the fact that SAIC took exception to the solicitation's transition requirements for locations requiring Technical Expert Status Accreditation (TESA) approval.
The solicitation stated that, upon award, the contractor was to have 45 days to provide all fully qualified and cleared personnel to provide the contracted services, ensuring no break in services from the prior contractor. PWS ¶ 9.1. According to the RFP, contractor employees hired for positions in Germany as technical experts must receive TESA approval from the German government prior to beginning work in Germany. Since the contractor is responsible for creating position descriptions for all positions requiring TESA approval, see PWS ¶ 10.4.5; see also PWS ¶ 11.2.3, the contractor is required to undertake these tasks to ensure TESA approval in order to be operational within 45 days of award.
Under the heading "Technical Expert/Analytical Support (TE/AS) Status Accreditation (TESA)," SAIC's proposal states:
SAICs proposal assumes the granting of Enterprise Approval for the resulting Task Order and the approval of Technical Expert/Analytical Support (TE/AS) Status Accreditation for the individual technical personnel proposed. . . . Performance will begin upon final approval of the Task Order and individual staff members assigned to this Task. . . .
AR, Exh. 7-5a, SAIC Price Proposal at § 3.8.
Citing this language, GDIT argues that SAIC's proposal conditioned the commencement of its performance in locations requiring TESA approval on the granting of TESA approval for its candidates, and thus failed to meet a material requirement. In our view, however, the agency reasonably understood the statement that SAIC would assume the granting of TESA approval to mean, in this context, that SAIC viewed this approval as a given and that the firm, like any other firm, would begin performance upon final approval. To the extent that GDIT argues approval might not be a given, any contractor must meet the same approval requirements.
In a related matter, GDIT argues that SAIC's proposal fails to set out how it would satisfy the TESA requirements while meeting the transition schedule deadline, and that the Army ignored significant transition risks.
Under this subfactor, offerors were to provide a transition plan that, among other things, clearly identified specific actions that must occur for CONUS and OCONUS locations. As relevant here, these actions included those associated with entry, exit, and work requirements for U.S. citizens working in all OCONUS locations. RFP § L.3.2. The methodologies proposed in the transition plan were to be evaluated based on the extent to which they ensured a smooth transition while sustaining operations and critical missions at each organization. RFP § M.6.2(b)(3).
In describing its methodologies, SAIC proposed that, within 72 hours of award notification, it would submit its initial round of documents to begin the processes for such things as TESA accreditation. According to SAIC's proposal, rapid submission, review, and approval of these processes comprised a critical, initial goal for its team to mitigate transition risk for overseas organizations. AR, Exh. 07-3a, SAIC Management Approach Proposal at § 3.1.1.2. This methodology is included in its plan of actions and timelines. Id. at § 3.1.2.1. SAIC also described its extensive experience in the rapid deployment of contractor personnel to overseas locations, and its intimate familiarity with deployment issues such as obtaining TESA approvals. Id. SAIC stated that its human resources staff has a team of specialists who exclusively handle international staffing issues and will address such things as host national employment requirements. Id. at § 3.1.2.2. The Army assigned SAIC's proposal a strength associated with its proposed methodology to move quickly to ensure a successful OCONUS transition, citing the above passages from SAIC's proposal discussing TESA approval, and found that it reduced transition risk. AR, Exh. 13-11, SSEB Report, at 34.
The Army clearly evaluated SAIC's transition plan with respect to the need for TESA approvals, and found that it merited a strength. Based on the information in SAIC's proposal, we have no basis to question the Army's evaluation. GDIT's view that SAIC's proposal should have been more detailed does not show there were significant risks in its proposal that the Army ignored.
Past Performance Factor
GDIT argues that the Army failed to properly assess relevance when evaluating past performance, and this failure undermined the source selection decision. GDIT also argues that the SSA unreasonably relied only on the offerors' adjectival performance confidence assessment ratings, which led her to conclude that SAIC's past performance was equal to GDIT's past performance. According to GDIT, the evaluation record shows that GDIT's past contracts are more relevant and therefore deserved greater consideration.
An agency's evaluation of past performance, including its consideration of the relevance, scope, and significance of an offeror's performance history, is a matter of discretion which we will not disturb unless the agency's assessments are unreasonable, inconsistent with the solicitation criteria, or undocumented. L-3 Sys. Co., B-404671.2, B-404671.4, Apr. 8, 2011, 2011 CPD ¶ 93 at 4; Family Entm't Servs., Inc., d/b/a IMC, B-291997.4, June 10, 2004, 2004 CPD ¶ 128 at 5. A protester's mere disagreement with such judgment does not provide a basis to sustain a protest. Birdwell Bros. Painting & Refinishing, B-285035, July 5, 2000, 2000 CPD ¶ 129 at 5. Our review of the record shows that GDIT's allegations have no merit.
Offerors could submit up to five recent and relevant contract references for evaluation. RFP § L.3.3. The agency's evaluation of proposals, and assignment of performance confidence assessment ratings, was to focus on performance that was relevant to the contract requirements. RFP § M.6.3.
Both SAIC and GDIT submitted five references. Consistent with the RFP's requirements, the past performance evaluation team first evaluated the information to determine the relevance of the past efforts. After this determination, the team reviewed the past performance information it could gather to determine its quality and usefulness in assigning the performance confidence assessment ratings. The agency evaluated the information in the proposals, made follow-on telephone calls to references, and gathered available Contractor Performance Assessment Reporting System (CPARS) and Past Performance Information Retrieval System (PPIRS) data.[3] AR, Exh. 13-9, Past Performance Evaluation Report, at 2-4.
For GDIT, the agency found that three contracts were very relevant and two were relevant. With that relevance determination as a backdrop, the agency obtained available past performance information and conducted interviews. The CPARS for one very relevant contract rated GDIT as satisfactory in one area and very good to exceptional in others; the contracting officer for that procurement expressed a concern about GDIT's ability to consistently backfill personnel, and rated its performance as good, but not excellent. The contracting officer for another very relevant contract rated GDIT's overall performance as satisfactory. The contracting officer for one task order under the third very relevant contract said that GDIT's overall customer satisfaction was good. For one of GDIT's relevant contracts, the contracting officer stated that the firm's overall performance was very good; no third party information was available for the fifth contract. Id. at 6-8. The Army concluded that, based on GDIT's recent/relevant performance record, it had a high expectation that GDIT would successfully perform the required effort and assigned it a substantial confidence rating. The team stated that the one satisfactory rating did not outweigh the other good/exceptional findings. Id. at 6, 8.
For SAIC, the agency found that one contract was very relevant and four were relevant. With that relevance determination as a backdrop, the agency obtained available past performance information and conducted one interview. For SAIC's very relevant contract, SAIC's proposal stated that its performance was consistently rated as excellent, but the agency was unable to confirm this statement. The CPARS for one of SAIC's relevant contracts confirmed that SAIC was rated exceptional in a number of areas and very good in some others. The CPARS for a second relevant contract confirmed that SAIC was rated exceptional in all areas; the contracting officer for this procurement stated that he was very satisfied with SAIC's overall efforts. For a third relevant contract, SAIC stated that it was rated exceptional in all areas, but the agency was unable to confirm this statement. Information was not available on another relevant contract. Id. at 11-12. The Army found that, based on SAIC's recent/relevant performance record, it had a high expectation that SAIC would successfully perform the required effort and assigned it a substantial confidence rating. Although it was not able to confirm all the meritorious past performance assessment ratings identified by SAIC, the agency stated that the past performance information it did find indicated that SAIC had received good/excellent ratings, and this gave the Army substantial confidence in SAIC's ability to perform. Id. at 11-12.
GDIT's argument that the agency did not consider relevance when assigning its performance confidence assessment ratings is belied by the record. GDIT's assertion that SAIC should have been rated satisfactory confidence because its past efforts are less relevant than those of GDIT ignores the RFP's requirement for an integrated assessment of both the relevance and quality of an offeror's past performance. The Army conducted this integrated assessment, placing the quality of the offeror's performance in the context of the relevance of its efforts, and GDIT has given us no basis to find it unreasonable.
GDIT argues that the SSA did not consider the evaluation information underlying the adjectival performance confidence assessment ratings and that, if she had, she would have discovered that GDIT's past performance was far more relevant than that of SAIC and deserving of greater weight. The record, however, reflects that the SSA received the underlying evaluation information in her initial briefing. AR, Exh. 15-1, SSA Initial Briefing, at Slides 36-41, 77-81. Based on this information, the SSA properly assessed whether there were qualitative differences in the offerors' past performance records that could be factored into her source selection decision, and concluded that the differences did not provide a basis for discrimination. We have no basis to find her conclusion in this regard unreasonable. See Sigmatech, Inc., B-406288.2, July 20, 2012, 2012 CPD ¶ 222 at 6.
Price/Technical Tradeoff Decision
GDIT argues that the SSA's characterization of SAIC's technical capability proposal as far superior to GDIT's is inconsistent with the fact that, even though SAIC's proposal received higher ratings than GDIT's under two subfactors, the subfactors were equally weighted. As a result, the protester contends, SAIC's proposal should be considered only slightly better than GDIT's proposal. GDIT further argues that the SSA's characterization of the difference between the firms' management approach proposals as minimal is inconsistent with her conclusion concerning the technical capability proposals. In this regard, GDIT asserts that, since the SSA found SAIC's technical capability proposal--with higher adjectival ratings under two subfactors--far superior to GDIT's, then, to be consistent, the SSA should have found GDIT's management approach proposal--with higher adjectival ratings under three subfactors and a higher overall adjectival rating--to merit at least a far superior finding. GDIT Comments at 14-15.
In a best value procurement, it is the function of the SSA to perform a price/technical tradeoff, that is, to determine whether one proposal's technical superiority is worth the higher price. ITW Military GSE, supra. Ratings, whether numerical, color, or adjectival, are merely guides to assist agencies in evaluating proposals; information regarding strengths and weaknesses of proposals is the type of information that source selection officials should consider, in addition to ratings, to enable them to determine whether and to what extent meaningful differences exist between proposals. Pemco Aeroplex, Inc., B-310372, Dec. 27, 2007, 2008 CPD ¶ 2 at 6; ACCESS Sys., Inc., B-400623.3, Mar. 4, 2009, 2009 CPD ¶ 56 at 7. Proposals with the same adjectival ratings are not necessarily of equal quality, and an agency may properly consider specific advantages that make one proposal of higher quality than another. Pemco Aeroplex, Inc., supra, at 6-7. A single evaluation factor--even a lower-weighted factor--may properly be relied upon as a key discriminator for purposes of a source selection decision. Smiths Detection, Inc.; Am. Sci. and Eng'g, Inc., supra, at 16.
Here, the SSA properly did not engage in the purely mathematical or mechanical price/technical tradeoff methodology advocated by GDIT. Rather, in making her price/technical tradeoff decision, the SSA looked behind the adjectival ratings to identify any qualitative differences that existed between the proposals. Recognizing the wide discretion afforded agencies in making their tradeoff decisions, we have no basis to find the SSAs decision in this case unreasonable or otherwise improper.
The protest is denied.
Lynn H. Gibson
General Counsel
[1] These other concerns were addressed during discussions.
[2] There is no evidence to support GDIT's argument that the SSA disagreed with the evaluation team's findings concerning the risks of both proposals.
[3] The PPIRS is a web-enabled, government-wide application that collects quantifiable delivery and quality past performance information. FAR § 42.1503; General Dynamics, American Overseas Marine, B-401874.14, B-401874.15, Nov. 1, 2011, 2012 CPD ¶ 85 at 7 n.4.