
Science Applications International Corporation

B-423532; B-423532.2, Jul 28, 2025

Decision

Matter of: Science Applications International Corporation

File: B-423532; B-423532.2

Date: July 28, 2025

Luke W. Meier, Esq., Robyn N. Burrows, Esq., David L. Bodner, Esq., and Shane M. Hannon, Esq., Blank Rome LLP, for the protester.
Colonel Nina R. Padalino, Major Ryan P. Payne, Hector M. Rivera-Hernandez, Esq., and Gretchen Bundy-Ladowicz, Esq., Department of the Air Force, for the agency.
Todd C. Culliton, Esq., and Tania Calhoun, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protest that the agency unreasonably established a competitive range is denied where the record shows the agency established the competitive range in accordance with the terms of the solicitation and consistent with the results of the offeror’s self-scores.

DECISION

Science Applications International Corporation (SAIC), of Reston, Virginia, protests its exclusion from the competitive range under request for proposals (RFP) No. FA7014‑25‑R‑0001, issued by the Department of the Air Force for modeling and support simulation services. SAIC argues that the agency unreasonably failed to evaluate its proposal before establishing the competitive range.

We deny the protest.

BACKGROUND

On December 8, 2024, the Air Force issued the RFP to procure modeling and support simulation (M&S) services. Agency Report (AR), Tab 3, RFP at 1; Contracting Officer’s Statement (COS) at 2; Protest, exh. 2, RFP, Performance Work Statement (PWS) at 4. The selected contractor will develop modeling and simulation architectures for mission operability and efficiency, implement robust data standards, and enhance cybersecurity measures and engineering capabilities. COS at 2; RFP, PWS at 4. The RFP contemplated the award of a single indefinite-delivery, indefinite-quantity (IDIQ) contract with a ceiling value of $972 million. COS at 3; RFP at 3. Task orders would be issued on fixed-price, cost-plus-fixed-fee, time-and-materials, fixed-price-plus-incentive, and cost‑reimbursable bases. RFP at 3-5.

The RFP advised that the agency would evaluate proposals by selecting the highest technically rated offeror at a fair and reasonable price. To aid the agency’s evaluation, the RFP advised offerors to submit their proposals using a seven‑volume format, which included, as relevant here, technical past experience and cost/price volumes. AR, Tab 4, RFP, Section L, Instructions at 7-9. As part of the technical past experience volume, each offeror was instructed to reference a maximum of 10 prior contracts. Id. at 15. Each referenced contract had to meet minimum qualifications, including that it must have been for modeling and simulation support services, it had to have a value of at least $15 million, and its performance either had to be ongoing or had to have been completed within the past five years. Id. at 16.

The RFP instructed each offeror to complete a self-scoring matrix, which was divided into two tables. AR, Tab 4, RFP, Section L, Instructions at 18. One table required offerors to demonstrate their experience performing similar M&S functions. AR, Tab 6, RFP, Self‑Scoring Matrix. This “past experience” table was divided into eight elements. Id. As an example, under task one (program management) the offeror was required to identify the number of contracts referenced in its past experience volume with more than 400 full-time equivalent employees, the number of contracts with a value greater than $750 million, the number of single award IDIQ contracts with more than 20 independently funded task orders, and the years of experience as a prime or subcontractor in managing M&S programs. Id.; AR, Tab 4, RFP, Section L, Instructions at 18-19. The matrix would auto-calculate the score based on the information provided. AR, Tab 6, RFP, Self-Scoring Matrix.

The self-scoring matrix also had a “past performance” table for each offeror to report its ratings for the referenced contracts in the past experience volume as recorded in the contractor performance assessment reporting system (CPARS). AR, Tab 6, RFP, Self‑Scoring Matrix; AR, Tab 4, RFP, Section L, Instructions at 29. This table required each offeror to provide their CPAR adjectival category ratings. AR, Tab 4, RFP, Section L, Instructions at 30-31. As an example, the RFP provided that if an offeror had a CPAR with ratings of “Exceptional” for “Quality,” “Very Good” for “Schedule,” “Satisfactory” for “Cost Control,” and “Exceptional” for “Management,” then the offeror would self-score that record as having two “Exceptional,” one “Very Good,” and one “Satisfactory” rating. Id. Based on the total number of reported category ratings, the matrix would calculate a score. Id. at 31; AR, Tab 6, RFP, Self-Scoring Matrix.

The agency would evaluate proposals by validating the technical scores. AR, Tab 5, RFP, Section M, Evaluation Criteria at 5. To do so, the agency would initially sort all proposals in order of the highest to the lowest self-score. Id. Starting with the highest self-scored proposal, the agency would review the past experience and CPARS ratings to confirm that the self-score was correct. Id. If the score was inaccurate and fell below the score of the next highest-ranked proposal, then the agency would proceed to validate the score of the next proposal. Id. Significantly, the solicitation advised that an offeror’s score could only be decreased, not increased, through the evaluation process. Id. at 12.

After identifying the proposal with the highest technical score, the agency would evaluate the firm’s price proposal to determine whether the proposed price was complete, balanced, fair, and reasonable. AR, Tab 5, RFP, Section M, Evaluation Criteria at 6. The agency would also determine whether the offeror’s professional employee compensation plan (PECP) was realistic. Id. If the agency determined that the proposed pricing was complete and reasonable, and that the PECP was realistic, then award would be made to that offeror. Id. Otherwise, the RFP advised that the agency would reject the proposal, enter discussions with the offeror, or evaluate the next highest self‑scored proposal. Id. at 15.

Prior to the February 14, 2025, close of the solicitation period, [DELETED], including SAIC, submitted proposals. AR, Tab 7, Initial Evaluation Briefing at 44. The agency reviewed the self-scores and determined that another offeror had a higher self-score than SAIC. Id.; COS at 5. The agency validated the offeror’s self-score but determined that the offeror’s price proposal was incomplete. COS at 5. Rather than evaluate the next highest-ranked proposal, the agency elected to establish a competitive range consisting of just the highest-ranked offeror’s proposal and conduct discussions. Id. The agency then notified SAIC and the other offerors that they were excluded from the competitive range.

This protest followed.

DISCUSSION

SAIC argues that the agency improperly established the competitive range without having evaluated its proposal. Protest at 9; Comments and Supp. Protest at 4-5. As support, SAIC argues that Federal Acquisition Regulation (FAR) section 15.306(c)(1) permits an agency to establish a competitive range only after evaluating all proposals. Comments and Supp. Protest at 4‑5. SAIC also argues that the agency unreasonably excluded its proposal from the competitive range because, under the selection methodology, the highest‑ranked offeror’s proposal is unawardable due to having an incomplete cost/price proposal. Id. at 7-8.

The agency responds that it properly established the competitive range because the RFP allowed the agency to hold discussions at its discretion, permitted the agency to conduct discussions with a single offeror, and expressly permitted the agency to enter discussions to remedy pricing issues. Memorandum of Law (MOL) at 8-15. The Air Force also argues that the protester misinterprets FAR section 15.306(c)(1) because agencies may limit the scope of competition, particularly where the goal is to make award to the highest technically rated offeror, and need not retain proposals in a competitive range when they have no realistic chance of being selected for award. Id. at 13-15; Supp. COS at 8. The Air Force also responds that the selection methodology does not preclude the agency from entering discussions to remedy the pricing issue. Supp. COS/MOL at 6-8.

As additional background, the RFP contains several important provisions in section M, evaluation criteria. First, section 1.1 provides that the acquisition will be conducted using the procedures set forth under FAR subpart 15.3, and section 1.1.1 defines the “best-value” offeror as the highest technically rated offeror (HTRO) with a complete price proposal. AR, Tab 5, RFP, Section M, Evaluation Criteria at 4. Second, section 1.6 provides as follows:

1.6. Discussions. The Government intends to award without discussions, but reserves the right to conduct discussions with one, some, or all Offerors if necessary. Evaluation Notices (ENs) may be used for clarification or communication purposes as well as discussions. Responses to the ENs for cost and technical past experience will be evaluated in accordance with the evaluation criteria stated herein. Failure to respond to ENs will eliminate the Offeror from further evaluation and consideration for award.

Id.

Third, the RFP sets forth a 10-step evaluation methodology. In steps one through five, the agency would identify the apparent HTRO based on the self-score, review the firm’s proposal for administrative compliance, and evaluate the firm’s top secret facility clearance level and small business participation plan on an acceptable or unacceptable basis. AR, Tab 5, RFP, Section M, Evaluation Criteria at 5-7. In step six, the agency would review the offeror’s past experience to determine whether the referenced contracts satisfied the minimum qualifications. Id. at 5. In step seven, the agency would validate the apparent HTRO’s self-score by comparing the scores against the past experience. Id.

For step eight, the RFP provides as follows:

Step 8: The Government Cost team will evaluate the cost of the HTRO to ensure cost is complete, balanced, fair and reasonable, with a realistic PECP. If the pricing proposal is determined complete, balanced, fair and reasonable, with a realistic PECP, the Government will proceed to Step 9. If Offeror’s cost is determined to be complete, balanced, fair and reasonable, with a realistic PECP[,] an award is made to this Offeror. However, if the Offeror’s price is not determined complete, balanced, fair and reasonable, with a realistic PECP, then the Government will revert back to Step 1. An award will be made if this Offeror successfully meets all evaluation criteria.

AR, Tab 5, RFP, Section M, Evaluation Criteria at 6. In steps nine and ten, the agency would calculate the HTRO’s total evaluated price and issue the initial task order against the awarded IDIQ contract. Id. at 7.

Fourth, the RFP set forth specific procedures for evaluating the HTRO’s cost/price proposal. Importantly, section 3.6.3, Completeness, provides as follows:

3.6.3 Completeness. Cost proposals will be evaluated for completeness by assessing the level of detail of the Offeror-provided Cost data for all requirements in the PWS and assessing the traceability of estimates. . . . If the Government determines a price proposal to be incomplete, the Government may not be able to further evaluate the proposal for reasonableness and balance and may not be able to make award to that Offeror. In this case, the Government will determine whether to enter into discussions [in accordance with] section 1.6 or proceed to the Offeror with the next highest Government-validated self-rated score.

AR, Tab 5, RFP, Section M, Evaluation Criteria at 15.

Finally, the RFP provides that the Government “will only adjust the Offeror’s self-rated score downward, not upward,” when reviewing each offeror’s self‑score. AR, Tab 5, RFP, Section M, Evaluation Criteria at 12.

When evaluating proposals, the agency ranked all offerors’ self-scores from highest‑to‑lowest and validated the highest-ranked offeror’s score. AR, Tab 7, Initial Evaluation Briefing at 44. The agency, however, found that the offeror with the highest score submitted an incomplete cost/price proposal. COS at 5. It then elected to conduct discussions with the offeror to remedy the pricing issue. Id. The agency’s evaluation documents record SAIC’s and the other remaining offerors’ proposals as “Not Evaluated.” AR, Tab 7, Initial Evaluation Briefing at 46.

Turning to our discussion, in reviewing an agency’s evaluation of proposals and subsequent competitive range determination, we will not reevaluate proposals but will examine the record to ensure that the evaluation was reasonable and in accordance with the solicitation’s evaluation criteria and applicable statutes and regulations. Environmental Restoration, LLC, B‑413781, Dec. 30, 2016, 2017 CPD ¶ 15 at 3. Further, we are guided by FAR section 15.306(c)(1) which provides in relevant part, as follows:

Agencies shall evaluate all proposals in accordance with 15.305(a), and, if discussions are to be conducted, establish the competitive range. Based on the ratings of each proposal against all evaluation criteria, the contracting officer shall establish a competitive range comprised of all of the most highly rated proposals, unless the range is further reduced for purposes of efficiency pursuant to paragraph (c)(2) of this section.

FAR 15.306(c)(1).

Additionally, agencies are not required to retain in the competitive range a proposal that is not among the most highly rated or that the agency otherwise reasonably concludes has no realistic prospect of award. Environmental Restoration, LLC, supra.

Addressing the protester’s principal argument (i.e., that the agency could not reasonably establish a competitive range without having first evaluated all proposals), we are unpersuaded given the unique circumstances of this acquisition. As noted above, the RFP outlined an evaluation process that considered offerors’ self-scores based exclusively on objective characteristics of their experience and past performance. Significantly, under the terms of the solicitation, each offeror’s self-score could only decrease through the verification process. Thus, when the agency ranked all self‑scores from highest-to-lowest and validated the HTRO’s self-score, the agency effectively evaluated and compared the technical capability of every proposal. See MOL at 10 (“[T]he Air Force could know, and would know, the most highly rated proposal by following its own methodology.”). In this way, even though the agency may not have validated SAIC’s score, it had reasonably and objectively determined that the HTRO’s proposal was technically superior and thereby established a competitive range of the most highly rated proposal consistent with FAR section 15.306(c)(1). Under these circumstances, any formal evaluation of SAIC’s past experience and validation thereof would have been superfluous because SAIC’s technical score would always be lower than the HTRO’s.

Further, as outlined above, section 1.6. of the RFP permitted the agency to conduct discussions with a single offeror and section 3.6.3 of the RFP allowed the agency to conduct discussions when it determined that an offeror’s cost/price proposal was incomplete. Thus, we see no basis to object to the agency’s establishment of the competitive range because the self-scores reasonably informed the agency as to the relative merit of all proposals, and the RFP informed offerors that the agency could conduct discussions with a single offeror for the precise issue of addressing the HTRO’s incomplete price. Accordingly, we deny the protest allegation.

Turning to SAIC’s second argument, we are likewise unpersuaded. As referenced above, the protester argues that the agency “violated the Section M ground rules for Best Value by prematurely removing all competition in favor of an unfinished, uncertain, and as-yet unawardable proposal.” Comments and Supp. Protest at 7. SAIC asserts that the agency’s conduct was unfair because its proposal, which paired a slightly lower rated technical proposal with a complete cost/price proposal, represents the best value. Id. at 8. The agency responds that it reasonably established the competitive range in accordance with the terms of the solicitation. Supp. COS/MOL at 8.

On this record, we find no basis to object to the agency’s conduct. While we recognize that the agency can only make award to the proposal representing the best value (i.e., the HTRO with a fair, reasonable, complete, and balanced price), the agency is not making the award at this juncture; rather, it is simply entering into discussions. As discussed above, the apparent HTRO’s proposal represents a superior technical approach, and in its discretion and in accordance with the terms of the solicitation, the agency has elected to conduct discussions with that firm regarding its cost/price proposal. COS at 11 (“In determining the competitive range, I ranked all proposals based on self-rated scores, from the highest self-rated scored Offeror to the lowest self-rated scored Offeror . . . Based on SAIC not having the highest validated offer, I determined SAIC did not have a realistic chance of receiving award[.]”). Accordingly, we deny the protest allegation.

As a final matter, even if our Office were to find that the agency unreasonably established the competitive range, we would not find that the protester suffered any competitive prejudice from its exclusion. Competitive prejudice is an essential element of a viable protest, and there is no basis for finding prejudice and sustaining a protest where the protester fails to demonstrate that, but for the agency’s actions, it would have had a substantial chance of receiving the award. Diverco, Inc., B‑259734, Apr. 21, 1995, 95-1 CPD ¶ 209 at 3. Where an impropriety in the conduct of discussions is found, it must be clear from the record that the protester was not prejudiced in order to deny the protest. Id. at 3-4.

Here, the RFP advised that an offeror’s score could only be adjusted downward, not upward, during the evaluation process, and SAIC’s past experience was self-scored lower than the apparent HTRO. Additionally, SAIC’s protest did not allege that the firm would have revised its referenced past experience to attain a higher score through the discussions process; instead, the firm only argued that it was prejudiced because the agency has not yet determined whether the apparent HTRO’s proposal is awardable, and as a result, SAIC could still receive the award. Protest at 13; see also Comments and Supp. Protest at 11-12. While it is true that SAIC can still potentially receive the award under the solicitation’s selection process, it cannot change its competitive standing since its maximum self-score remains fixed. Thus, the record is clear that the protester did not suffer any competitive prejudice because, even if the agency had included the firm as part of discussions, there is no possibility that the firm could improve its technical score to surpass the apparent HTRO.[1] See MOL at 17-18. Accordingly, we deny the protest.

The protest is denied.

Edda Emmanuelli Perez
General Counsel


[1] The Air Force explains that, if the apparent HTRO’s cost/price proposal is incomplete after discussions, then it will move to the next highest self-scored proposal. MOL at 16‑17.
