Karna, LLC
Decision
DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.
Matter of: Karna, LLC
File: B-424039; B-424039.2; B-424039.3
Date: February 20, 2026
Craig A. Holman, Esq., Kara L. Daniels, Esq., Amanda J. Sherwood, Esq., Roee Talmor, Esq., Sarah Belmont, Esq., and Dustin Vesey, Esq., Arnold & Porter Kaye Scholer LLP, for the protester.
Noah B. Bleicher, Esq., Moshe B. Broder, Esq., Elizabeth M.D. Pullin, Esq., Jennifer Eve Retener, Esq., Aime JH. Joo, Esq., Sierra A. Paskins, Esq., and Megan C. Bodenhamer, Esq., Jenner & Block LLP, for General Dynamics Information Technology, Inc., the intervenor.
Joon K. Hong, Esq., and Brandon Dell'Aglio, Esq., Department of Health and Human Services, for the agency.
Heather Self, Esq., and Peter H. Tran, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.
DIGEST
1. Protest arguing the agency conducted discussions unequally is denied where solicitation provided for separate discussions specific to proposed betterments, and in accordance with this provision the agency conducted an extra round of discussions with the only offeror in the competitive range that proposed betterments.
2. Protest challenging evaluation of technical proposals and conduct of discussions is denied where the evaluation was reasonable and consistent with the solicitation, and concerns not raised during discussions were not significant weaknesses.
3. Protest that agency failed to conduct a comparative assessment of offerors' past performance and recognize protester's superiority under this factor is denied because protester was not competitively prejudiced by any errors that may have occurred.
4. Protest alleging the agency failed to evaluate realism of offerors' levels of effort, as required by the solicitation, is denied where contention is not supported by the record.
DECISION
Karna, LLC, of Atlanta, Georgia, protests the award of a contract to General Dynamics Information Technology, Inc. (GDIT), of Falls Church, Virginia, under request for proposals (RFP) No. 75D301-25-R-73216, issued by the Department of Health and Human Services, Centers for Disease Control (CDC) for a contractor to administer a health benefits program. The protester challenges the agency's conduct of discussions, evaluation of proposals, and resulting source selection decision.
We deny the protest.
BACKGROUND
In 2011, Congress established the World Trade Center Health Program (WTCHP or “the Program”) to provide medical monitoring and treatment for people suffering adverse health conditions resulting from responding to the September 11, 2001 terrorist attacks. Agency Report (AR), Exh. 1.10, RFP Amend. 9 at 98 (citing the James Zadroga 9/11 Health and Compensation Act of 2010, Pub. L. No. 111-347, signed into law on Jan. 2, 2011).[1] The CDC administers the Program through services performed under multiple interconnected contracts, two of which are the Health Program Support (HPS) contract and the Nationwide Provider Network (NPN) contract. Contracting Officer's Statement (COS) at 1.
The agency explains the solicitation at issue here--which seeks to award a contract for third party administrator (TPA) services for the Program--is intended to replace the current HPS contract and incorporate aspects of the NPN contract to create greater efficiencies. COS at 1. In this regard, in 2021, the Program conducted an efficiency and effectiveness review. This review concluded that the HPS and NPN contractors performed duplicative functions for two different Program member populations--with the HPS contractor performing services for members inside the New York Metropolitan Area (NYMA) and the NPN contractor performing services for members outside the NYMA. Id. Following the efficiency review, the CDC undertook a consolidation effort to combine aspects of the HPS and NPN contracts into a single, new TPA contract vehicle. Id.
In 2022, the agency issued solicitation No. 75D301-22-Q-74251 seeking proposals for the new consolidated TPA requirements, and on May 31 awarded the TPA contract to Cahaba Safeguard Administrators, LLC, of Birmingham, Alabama. COS at 2. Two unsuccessful offerors--Karna and GDIT--protested the initial award to Cahaba. Karna, LLC; General Dynamics Info. Tech., Inc., B-420899 et al., Aug. 17, 2022 (unpublished decision). In response, the CDC notified our Office of its intent to take corrective action by terminating the award and issuing a new solicitation for the TPA requirement. Id. As a result, we dismissed Karna's and GDIT's 2022 protests as academic. Id. Throughout 2023 and 2024, the agency continued to receive separate HPS and NPN services via short-term interim “bridge” contracts, with the HPS bridge contract being performed by Karna as the incumbent prime and GDIT as an incumbent subcontractor. COS at 2. During this time, the CDC performed market research and other acquisition planning tasks leading up to issuance of the new solicitation (RFP No. 75D301-25-R-73216), under which the award to GDIT is presently challenged.
On January 16, 2025, using the procedures of Federal Acquisition Regulation (FAR) parts 12 and 15, the agency issued the current solicitation seeking to award a new consolidated TPA contract for the Program. RFP at 1, 99, 101. The solicitation required the new TPA contractor to be responsible for providing contract management, member services, provider network management, medical benefits administration, claims processing, and data and business information management necessary to support the operation of the Program. Id. at 101. Specifically, the RFP explains the “TPA Contractor receives and processes enrollment and certification applications for [agency] decision-making, maintains member demographic information, manages a call center, manages and maintains provider network(s) on behalf of the Program, administers Program benefits, and processes medical (to include dental) claims.” Id. at 99. As part of providing these services, the TPA contractor will be required to engage with several other contractors providing services related to execution of the Program (e.g., comprehensive cost avoidance contractor, pharmacy benefits manager, etc.). Id. at 99-101.
The solicitation contemplated award of a single contract with time-and-materials, cost‑only, firm-fixed-price, and firm-fixed-unit-price line items and a potential 10-year period of performance (comprising a 1‑year base period and nine 1-year option periods). RFP at 3-10, 15-20, 25-30, 35-40, 45-50, 55-60, 65-70, 75-80, 85-91, 96-98. Award would be made on a best-value tradeoff basis considering three factors: (1) technical; (2) past performance; and (3) price. Id. at 267, 271-273. The technical factor consisted of ten subfactors: (i) transition-in; (ii) contract management, regulatory compliance, and transition-out; (iii) member services; (iv) provider network management; (v) medical benefits administration and policy; (vi) claims processing; (vii) data and business information management; (viii) key personnel; (ix) small business subcontracting; and (x) corporate experience. Id. at 267-271.
For nine of the ten technical subfactors, the agency would “assess its level of confidence that the offeror will successfully perform the requirements,” and assign a confidence rating of “substantial confidence,” “high confidence,” “moderate confidence,” “some confidence,” or “low confidence.” AR, Exh. 1.11, RFP Attach. 1.14--Technical Ratings at 1. The tenth technical subfactor (small business subcontracting) would be evaluated on a pass/fail basis. Id. The solicitation stated the agency would not assign an overall technical rating. RFP at 267. In addition to written proposal submissions, offerors were required to participate in a two-hour “virtual session” (i.e., an oral presentation) that would be considered as part of the evaluation for some of the technical subfactors. Id. at 244, 246, 264, 267, 269-270.
For past performance, the evaluators would assign proposals a rating of substantial confidence, satisfactory confidence, neutral confidence, or limited confidence, based on consideration of the recency, relevancy, and quality of an offeror's past performance. AR, Exh. 1.11, RFP Attach. 1.15--Past Performance Ratings. With respect to price, the agency would evaluate for reasonableness, balance, and realism. RFP at 272.
The solicitation established that the technical and past performance factors, combined, were significantly more important than price. RFP at 273. Regarding the relative importance of the technical subfactors and past performance, the RFP provided the following levels of importance: (A) four of the ten technical subfactors--corporate experience, transition-in, provider network management, and claims processing--were the most important and were equally important to one another; (B) past performance was the second most important; and (C) the remaining six technical subfactors were third most important and were equally important to one another. Id.
Relevant here, the solicitation stated the agency intended to establish a competitive range and conduct discussions. RFP at 265. Additionally, the solicitation permitted, but did not require, offerors to include “betterments” in their proposals, and advised the CDC would conduct specific discussions related to any proposed betterments. Id. at 266. The agency received five timely proposals, including those submitted by the protester, Karna, and the awardee, GDIT. AR, Exh. 5.9, Award Decision at 5. The technical evaluation panel (TEP) evaluated all proposals, and the agency established a competitive range consisting of Karna, GDIT, and a third offeror. Id. at 13.
The contracting officer, TEP chair, and TEP subject matter expert then “reviewed the technical evaluation report and business report to determine the scope of discussion[s].” AR, Exh. 4.2, Discussion Item Development Memo at 1. The group reviewed the multitude of “Factors that increase confidence” and “Factors that decrease confidence” assessed in proposals and identified which of the factors that decreased confidence were significant weaknesses, all of which were then “included in the discussion items for each of the offerors.” Id.; see also generally AR, Exh. 5.1, Technical Evaluation (Tech. Eval.)-- Original Proposals at 18-30, 32-43 (for GDIT and Karna, respectively). The group did not identify any deficiencies in the proposals of any offerors in the competitive range nor were there any adverse past performance issues to which offerors had not had an opportunity to respond. AR, Exh. 4.2, Discussion Item Development Memo at 1. In addition to the confidence decrease factors identified as significant weaknesses, the group identified “issues from the business review” to add to discussions and other proposal aspects that “could, in the opinion of the contracting officer, be altered or explained to enhance materially the proposal's potential for award.” Id.
The agency conducted a first round of discussions related to significant weaknesses and other issues of concern with all three offerors in the competitive range, each of which then submitted proposal revisions. AR, Exh. 5.9, Award Decision at 16. Based on the revised proposals, the agency had no additional questions or negotiation points for Karna and the third offeror, and no additional questions for GDIT about its proposed base solution. Id. The agency, however, did have questions for GDIT about the firm's proposed betterments.[2] Id. The agency conducted a second round of discussions with only GDIT specific to the firm's proposed betterments. Id. Following the second round of discussions with GDIT, and evaluation of the firm's proposal revisions related to betterments, the agency conducted a third, and final, round of discussions with all offerors in the competitive range related to the timing of travel under the PWS; the third round also included additional questions for GDIT related to its proposed betterments. Id. at 17.
After concluding discussions and receiving final proposal revisions, the evaluators assessed Karna's and GDIT's final proposals as set forth in the table below with the factors and subfactors listed in order of importance.
| | Karna | GDIT |
|---|---|---|
| MOST IMPORTANT FACTORS | | |
| Technical: Transition-In | Moderate Confidence | Moderate Confidence |
| Technical: Provider Network Management | Some Confidence | High Confidence |
| Technical: Claims Processing | Moderate Confidence | High Confidence |
| Technical: Corporate Experience | Moderate Confidence | High Confidence |
| SECOND MOST IMPORTANT FACTOR | | |
| Past Performance | Substantial Confidence | Substantial Confidence |
| THIRD MOST IMPORTANT FACTORS | | |
| Technical: Contract Management, Regulatory Compliance, Transition-Out | Moderate Confidence | Moderate Confidence |
| Technical: Member Services | Some Confidence | High Confidence |
| Technical: Medical Benefits Administration and Policy | Moderate Confidence | Moderate Confidence |
| Technical: Data and Business Information Management | High Confidence | High Confidence |
| Technical: Key Personnel | Some Confidence | Moderate Confidence |
| Technical: Small Business Subcontracting | Pass | Pass |
| LEAST IMPORTANT FACTOR | | |
| Price[3] | $341,136,398 | $325,382,299 |
AR, Exh. 5.9, Award Decision at 12-13, 17-19; Exh. 5.8, Memorandum to File--Typographical Error in Award Decision at 1.
In comparing final proposal evaluations, the contracting officer, who also served as the source selection authority (SSA), noted GDIT's proposal received a rating of high confidence under five of the ten technical subfactors, while Karna's proposal received a rating of high confidence under only one technical subfactor. AR, Exh. 5.9, Award Decision at 20. Additionally, GDIT's proposal did not receive a rating of some confidence under any factor, while Karna's proposal received a rating of some confidence under three technical subfactors. Id. The SSA also compared proposals “looking at the top four sub-factors,” and summarized her finding as “GDIT again received the highest ratings: GDIT--three high [confidence] and one moderate [confidence]; Karna--four moderate [confidence].” Id. The SSA noted, overall, GDIT had “the highest rating, or tie[d] for highest, for every sub-factor with the exception of 1(v): Medical benefits administration and policy,” under which GDIT received a rating of moderate confidence and the third offeror in the competitive range received a rating of high confidence. Id. at 21. After reviewing prices for reasonableness, realism, and balance, the SSA concluded “GDIT had the lowest price and the highest technical ratings of the offerors in the competitive range,” and selected GDIT's proposal as offering the best value without conducting a tradeoff. Id.
On October 17, CDC notified unsuccessful offerors of the award, and, on October 23, provided Karna with a debriefing. COS at 6. Karna filed this timely protest challenging the award decision on the first day our Office reopened following a lapse in appropriations.
DISCUSSION
Across several hundred pages, the parties dissect and disagree over nearly every aspect of the CDC's source selection decision. See Protest at 1-94; Supp. Protest at 1-77; 2nd Supp. Protest at 1-23; COS at 1-51; Memorandum of Law (MOL) at 1-80; Comments at 1‑95; Intervenor Comments at 1-50. After reviewing the parties' many challenges and responses, we find no basis to question the agency's conduct of the procurement or its overall assessment that Karna's proposal was too reliant on the firm's incumbent experience and failed to adequately recognize the change in requirements moving forward under the expanded contract. AR, Exh. 5.1, Tech. Eval.--Original Proposals at 43.
For example, the protester contends the agency conducted discussions in an unequal manner by giving GDIT an additional round of discussions not afforded to Karna or the third offeror in the competitive range. The protester also challenges the conduct of discussions and evaluation of proposals under the past performance factor, as well as under seven of the ten technical subfactors.[4] Further, the protester asserts the agency's price realism evaluation and best‑value source selection decision were flawed. While we do not address in detail every argument, or permutation thereof, raised by Karna, we have considered them all and find none provides a basis to sustain the protest.[5]
At the outset, we note, as a general matter, discussions with offerors in the competitive range must identify “[a]t a minimum . . . deficiencies, significant weaknesses, and adverse past performance information to which the offeror has not yet had an opportunity to respond.” FAR 15.306(d)(3). Additionally, when an agency engages in discussions with an offeror, the discussions must be meaningful--that is, sufficiently detailed so as to lead an offeror into the areas of its proposal requiring amplification or revision in a manner to materially enhance the offeror's potential for receiving award. FAR 15.306(d); Miltope Corp., B-422799, B-422799.2, Nov. 7, 2024, at 6. The actual content and extent of discussions are matters of judgment primarily for determination by the agency involved, however, and we generally limit our review of the agency's judgments to a determination of whether they are reasonable. Id. at 6-7; Creative Info. Tech., Inc., B‑293073.10, Mar. 16, 2005, at 7. To satisfy the requirement to provide meaningful discussions, agencies are not required to afford offerors all-encompassing discussions or to discuss every aspect of a proposal identified as a weakness, if it is not considered significant. Apptis Inc., B-403249, B-403249.3, Sept. 30, 2010, at 4. Further, an agency may not treat offerors unequally when conducting discussions; that is, offerors must be afforded equal opportunities to address the portions of their proposals that require revision, explanation, or amplification. CSC Gov't Sols. LLC, B-413064, B‑413064.2, Aug. 10, 2016, at 11. The requirement for equal treatment does not mean, however, that discussions with offerors must, or should, be identical; to the contrary, discussions must be tailored to each offeror's own proposal. Id.; FAR 15.306(d)(1), (e)(1).
Additionally, in reviewing a protest challenging an agency's evaluation, our Office will not reevaluate proposals nor substitute our judgment for that of the agency, as the evaluation of proposals is a matter within the agency's discretion. Systems Implementers, Inc.; Transcend Tech. Sys., LLC, B-418963.5 et al., June 1, 2022, at 10. Rather, we will review the record to determine whether the agency's evaluation was reasonable and consistent with the stated evaluation criteria and with applicable procurement statutes and regulations. Id.; SeaTech Security Sols.; Apogee Grp., LLC, B-419969.6, B‑419969.7, Apr. 21, 2023, at 11. A protester's disagreement with the agency's evaluation judgments, without more, does not render those judgments unreasonable. Id.
Allegation of Unequal Discussions
The protester argues the agency conducted discussions in an unequal manner by affording only GDIT an additional round of discussions relating to the firm's proposed betterments. Supp. Protest at 16. The protester contends the CDC's conduct prejudiced Karna because it “did not receive the multiple, iterative rounds of discussions to improve its proposal as GDIT did.” Id. at 17.
The agency responds that it conducted discussions related to betterments in accordance with the solicitation. COS at 36. The CDC maintains, based on the solicitation: (1) Karna “understood, or should have understood . . . that betterments would be addressed during discussions”; (2) Karna “had the opportunity to propose betterments”; and (3) Karna's exclusion “from a round of discussions that concerned betterments was simply a result of its business decision to not propose betterments.” Id. The protester acknowledges “[t]he Solicitation indicates any proposed betterments would be the subject of discussions,” but insists the RFP said “nothing about betterments--uniquely, unlike the other listed topics of discussions--receiving multiple rounds of discussions, each targeted to help improve an offeror's standing in the Agency's eyes.” Supp. Protest at 18.
When a protester and agency disagree over the meaning of solicitation language, we will resolve the matter by reading the solicitation as a whole and in a manner that gives effect to all its provisions. Patronus Systems, Inc., B-418784, B-418784.2, Sept. 3, 2020, at 5. To be valid, an interpretation must be consistent with the solicitation when read as a whole and in a reasonable manner. Id.
As noted above, the solicitation permitted, but did not require, offerors to propose betterments. RFP at 266. With respect to discussions, the instructions to offerors section of the solicitation provided:
(j)(1) Following its evaluation of proposals, the Government intends to establish a competitive range and conduct discussions.
* * * * *
(j)(4) The primary objective of discussions is to maximize the Government's ability to obtain best value, based on the requirement and the evaluation factors set forth in this solicitation. The Government will address with each offeror still being considered for award deficiencies, significant weaknesses, and adverse past performance information to which the offeror has not yet had an opportunity to respond. The Government may also discuss other aspects of the offeror's proposal that could, in the opinion of the Government, be altered or explained to enhance materially the proposal's potential for award. However, in accordance with FAR 15.306(d)(3), the Government is not required to discuss every area where the proposal could be improved.
(j)(5) As described in the evaluation criteria in Section E4, the Government seeks out and will give evaluation credit for technical solutions exceeding mandatory minimums in the PWS [performance work statement] that are concrete, specific, meaningful to the Government, would positively impact Program members, and for which the offeror agrees to incorporate such technical solutions into any resulting contract. This is referred to here as “betterment.” If an offeror makes a betterment promise in its proposal, the offeror shall clearly and explicitly identify the betterment promise as such in its proposal. . . . To the extent that an offeror makes any betterment promises in its proposal, the Government intends to address those betterment promises during discussions and, for betterment promises evaluated favorably by the Government, negotiate those betterment promises into any resulting contract during the discussions process. . . . The result of any such discussions will factor into the Government's final evaluation of the offeror's proposal.
Id. at 243, 265-266, ¶¶ E3. Addendum to FAR 52.212-1 Instructions to Offerors--Commercial Products and Commercial Services, (j) Discussions.
Here, we find the protester's interpretation of the solicitation unreasonable because it fails to read the solicitation as a whole and in a manner that gives effect to all its provisions. Specifically, the protester's reading of the solicitation does not give effect to the formatting of the discussions section content. The discussions content was divided into two distinct paragraphs that segregated discussions related to deficiencies and significant weaknesses from discussions related to betterments, with each paragraph explaining the different purposes for discussing the two different types of subject matter. We read this paragraph structure as clearly indicating the agency's intent to conduct discussions related to betterments separately from other discussions topics.
Accordingly, we conclude the agency's conduct of an extra round of discussions with only GDIT--specific to the firm's proposed betterments--was in accordance with the solicitation, and did not constitute unequal discussions where neither Karna nor the third offeror in the competitive range proposed any betterments. This allegation is denied. See, e.g., Unico Mechanical Corp., B-419250, Oct. 29, 2020, at 5-6 (denying evaluation challenge based on an unreasonable solicitation interpretation that failed to take into account the RFP's format and structure); Second Street Holdings, LLC et al., B‑417006.4 et al., Jan. 13, 2022, at 29-30 (denying allegation of unequal discussions where, after submission of final proposal revisions, agency had additional communications with only the awardee about historical preservation mitigation efforts because the solicitation specifically provided for such communications).
Allegations of Non-Meaningful Discussions and Unreasonable Evaluation
The protester raises a myriad of challenges to the agency's evaluation of proposals and conduct of discussions under seven of the ten technical subfactors. As representative examples, we address Karna's challenges under two of the four technical subfactors identified in the solicitation as the most important evaluation factors: provider network management and claims processing.[6]
Provider Network Management Subfactor
The protester challenges the conduct of discussions related to the provider network management subfactor because, among other things, CDC's discussions with Karna were not meaningful. The agency defends its conduct of discussions as consistent with the solicitation and applicable regulations.
Under the provider network management subfactor, the evaluators assessed 12 confidence increases and 10 confidence decreases in Karna's initial proposal, and assigned it a rating of some confidence. AR, Exh. 5.1, Tech. Eval.--Original Proposals at 34-35. The solicitation defined a rating of some confidence for the technical subfactors as:
The Government has some confidence that the offeror understands the requirement, proposes a sound approach, has addressed potential risks, and will be successful in performing the contract with more than normal Government intervention. Risk of unsuccessful performance is moderate to high.
AR, Exh. 1.11, RFP Attach. 1.14--Technical Ratings at 1 (emphasis omitted).
In developing the discussion items, the evaluators considered 1 of the 10 confidence decreases to rise to the level of a significant weakness, and CDC raised it with Karna during discussions. AR, Exh. 4.4, Karna 1st Round Discussions Items at 2. After evaluating Karna's response to the discussion questions and its proposal revisions, the evaluators concluded the significant weakness was not adequately addressed, and the assigned rating of some confidence remained unchanged for Karna's final proposal. AR, Exh. 5.2, Tech. Eval.--1st Round Proposal Revisions at 9-10.[7]
In challenging the rating of some confidence under this subfactor, the protester, first, generally contends that because the solicitation defined a rating of some confidence as indicating a “moderate to high” risk of unsuccessful performance, the assignment of this rating necessarily means the evaluators viewed aspects of Karna's proposal as substantially increasing the risk of unsuccessful performance, and that concerns about such an increased risk to performance are tantamount to significant weaknesses that must be raised in discussions.[8] Protest at 23, 25-26; Comments at 25-26.
The agency responds that “Karna's assumption that a rating of ‘Some Confidence' can only be assigned if there was an identified significant weakness is incorrect.”[9] COS at 7. Rather, it was “[t]he combined effect of all of the factors that increased confidence and factors that decreased confidence [that] resulted in the rating assigned to a subfactor.” Id. The CDC explains it identified which of the confidence decreases rose to the level of significant weaknesses, and needed to be raised during discussions, because those particular decreases “appreciably increased the risk of unsuccessful contract performance in the opinion of the [contracting officer] and the TEP.” Id. at 7-8.
Relevant here, agencies are not required to raise non-significant weaknesses as part of discussions. FAR 15.306(d)(3). Our decisions have explained, however, the fact that an agency does not expressly label or characterize a particular evaluation finding as a significant weakness is not controlling; rather, we also look to the context of the evaluation. AT&T Corp., B‑299542.3, B-299542.4, Nov. 16, 2007, at 11. For example, we have found an agency was required to raise, in discussions, a weakness not characterized as significant because the evaluators concluded the risk at issue could “jeopardize the overall success of the project.” Id.; Raytheon Co., B‑404998, July 25, 2011 at 7 (similarly finding so-called “category 4 weaknesses” tantamount to significant weaknesses in part because they “were said to jeopardize program success”). Conversely, we have found weaknesses that fall short of jeopardizing program success did not rise to the level of significant weaknesses, and were not required to be raised during discussions. Education Dev. Ctr., Inc., B-418217, B-418217.2, Jan. 27, 2020, at 6.
In those decisions, however, the protesters specifically identified weaknesses that they alleged were miscategorized as non-significant, and, thus, resulted in a lack of meaningful discussions. Except as discussed below, Karna has not done that here. Instead, the protester infers that because the agency assigned its proposal a rating of “some confidence” (indicating a “moderate to high” level of risk of unsuccessful performance), the assignment of such a rating alone necessarily means the agency must have assessed multiple significant weaknesses in the proposal. There is no basis for such an inference, however, where the solicitation did not establish that a rating of some confidence would result only where the agency identified multiple significant weaknesses.[10] Further, we see no reason why the protester's single significant weakness is incompatible with a finding of “moderate to high” risk, and the associated rating of “some confidence,” where, as noted above, the FAR defines a significant weakness as a flaw in a proposal that “appreciably increases the risk of unsuccessful performance.” FAR 15.001. Accordingly, this line of argument provides no basis to sustain the protest.
In later filings, the protester more specifically argues that two additional confidence decreases assessed in the firm's proposal under the provider network management subfactor rose to the level of significant weaknesses, and, thus, were required to be raised during discussions.[11] Supp. Protest at 26. In support of this argument, the protester notes that preceding the list of confidence increases and decreases for each technical subfactor, the evaluators included an introductory paragraph. Id. at 25 (citing AR, Exh. 5.1, Tech. Eval.--Original Proposals at 34). With regard to the provider network management subfactor, the introductory paragraph reads:
The Offeror received an overall rating of Some Confidence for 1(iv) Provider Network Management. Offeror's provider network management approach included a plan to [DELETED] and to [DELETED]. However, some of Offeror's statements indicate a lack of understanding of the TPA role. Further, Offeror's metrics when discussing adequacy do not directly correlate with QASP [Quality Assurance Surveillance Plan] standards, and Offeror's proposed methodology for monitoring and assessing network adequacy is unclear.
AR, Exh. 5.1, Tech. Eval.--Original Proposals at 34. Each of the items referenced in the introductory paragraph--e.g., plan to [DELETED], unclear monitoring and assessing of network adequacy--was later discussed in greater detail as part of the full list of confidence increases and decreases assessed under this subfactor. Id. at 34-35.
The protester characterizes this introductory paragraph as demonstrating “the Agency emphasized three supposed flaws as central to downgrading Karna's otherwise compliant approach to Some Confidence:” (1) alleged confusion regarding the TPA role; (2) compliance with QASP standards; and (3) methodology for monitoring and assessing network adequacy. Supp. Protest at 26. Thus, Karna contends all three “supposed flaws” constituted significant weaknesses required to be raised during discussions, yet the CDC raised only one. Id.
The agency responds that the introductory paragraphs “were not intended to capture only significant weaknesses, as suggested by Karna,” but were simply summaries of the evaluation. COS at 37. With respect to the two additional confidence decreases noted in the provider network management introductory paragraph, the CDC explains they “were not identified as significant weaknesses, since neither of these weaknesses appreciably increased the risk of unsuccessful contract performance.” Id. at 37-38. With respect to the assignment of a final rating of some confidence for this subfactor, the agency maintains the rating was a result of the cumulative effect of all “the weaknesses which were not countered by enough strengths to elevate the confidence rating.” Id. at 38.
As an initial matter, we note there is some internal inconsistency within the protester's argument here. Karna insists the structure of the evaluation report indicates all three of the confidence decreases referenced in the provider network management introductory paragraph rose to the level of significant weaknesses by virtue of being mentioned in the introductory paragraph. Yet, the protester does not similarly insist that both of the confidence decreases under the key personnel subfactor--for which Karna also received a rating of some confidence--rose to the level of significant weaknesses despite both being referenced in the introductory paragraph for that subfactor. Instead, the protester claims one of the two decreases should have been considered a significant weakness. Supp. Protest at 65‑66, 68; AR, Exh. 5.1, Tech. Eval.--Original Proposals at 43.
More importantly, based on our review of the record, we find no basis to question the agency's explanation that only confidence decreases that appreciably increased the risk of unsuccessful performance were identified as significant weaknesses and raised during discussions. The protester has not pointed to--and our review of the record does not find--any indication that the evaluators were concerned the two additional confidence decreases created a possibility of Karna's proposed approach jeopardizing the overall success of the contract. Rather, the record shows that where the evaluators had such an elevated level of concern related to a specific confidence decrease, they directly expressed this in the contemporaneous record.
For example, the TEP noted certain of Karna's proposal “statements significantly decrease confidence in the Offeror's understanding of the requirement,” and this particular confidence decrease was then deemed a significant weakness and raised with Karna during discussions. AR, Exh. 5.1, Tech. Eval.--Original Proposal at 34 (emphasis added); Exh. 4.4, Karna 1st Round Discussions Items at 2. In contrast, for the two additional confidence decreases the protester characterizes as significant weaknesses, the evaluators did not use language expressing an elevated level of concern. Instead, the TEP specifically noted for one that “this reference decreases confidence in the Offeror's understanding of the requirement,” and for the second, that a “lack of clarity” in Karna's proposal “makes it difficult to assess (and therefore decreases confidence in)” the proposed approach. AR, Exh. 5.1, Tech. Eval.--Original Proposal at 34.
Further, our review of the record provides no basis for us to question the evaluators' judgment that the combined weight of the 10 confidence decreases assessed in Karna's proposal under the provider network management subfactor was sufficient to merit a rating of some confidence, notwithstanding that only 1 of the 10 confidence decreases was deemed a significant weakness; especially where, as here, the TEP concluded Karna's proposal revisions failed to fully mitigate the significant weakness raised during discussions.[12] While Karna disagrees with the evaluators' judgments about the level of risk associated with some of the confidence decreases and with their judgments about the overall merits of the firm's proposal, such disagreement, without more, is insufficient to render those judgments unreasonable. See e.g., Education Dev. Ctr., Inc., supra at 6; TriCenturion, Inc.; SafeGuard Servs., LLC, B‑406032 et al., Jan. 25, 2012, at 21 (denying allegation that concerns labeled as non-significant weaknesses reflected more serious concerns which were required to be raised during discussions).
Claims Processing Subfactor
Under the claims processing subfactor, the protester contends the TEP unreasonably assessed a confidence decrease and failed to assign a confidence increase. The record shows the evaluators assessed 13 confidence increases and 4 confidence decreases in Karna's proposal and assigned a rating of moderate confidence for the claims processing subfactor. AR, Exh. 5.1, Tech. Eval.--Original Proposals at 36-37. The protester challenges one of the four confidence decreases as based on a misreading of Karna's proposal. Supp. Protest at 40-41; Comments at 66-68. The protester also describes Karna's proposed use of a “[DELETED]” tool as offering attributes the solicitation deemed beneficial for which the agency unfairly failed to assess a confidence increase.[13] Protest at 51; Comments at 68-69.
Relevant here, the solicitation required offerors to “address their approach to receive, process, and accurately adjudicate all medical (to include dental) claims for all services provided to any member.” RFP at 251. In part, offerors were instructed to “address how their claims processing system and process auto-adjudicates claims and provides for high-volume data capture and routing in accordance with the Government's need for continuous delivery.” Id. For evaluation purposes, the solicitation set forth several ways in which “[t]he Government's confidence will be increased,” one of which was by “claims processing systems and processes that ensure accurate, rapid payment of claims to providers to support the delivery of healthcare to Program members and the retention of quality providers in the Program.” Id. at 270.
The evaluators noted multiple confidence increases related to Karna's proposed claims processing system. AR, Exh. 5.1, Tech. Eval.--Original Proposals at 36. For example, the TEP found Karna's proposed processing system was “agile and easily configurable,” “a [DELETED] that will facilitate intelligent decision-making,” and that it “is a proven high-volume claims processing system,” which “annually processes more than [DELETED]” and “[DELETED].” Id. The evaluators, however, also assessed a confidence decrease, finding that Karna “did not provide any quantitative data to demonstrate successful use of its claims processing adjudication system.” Id.
The protester maintains this confidence decrease “is incorrect,” because “Karna's proposal provides ample quantitative information to substantiate the processing capabilities of its [proposed] processing system.” Supp. Protest at 40. As an example, the protester points to the sections in Karna's proposal describing the system's processing of [DELETED] annually. Id. The agency responds the TEP noted these metrics related to the volume of claims as a confidence increase in Karna's proposal, but these volume-related metrics did not speak to the accuracy and timeliness of the claims processing system, and it is this latter type of information to which the confidence decrease for missing quantitative data refers. MOL at 34 (citing AR, Exh. 5.1, Tech. Eval.--Original Proposal at 43 “[o]fferor does not provide metrics on claims turnaround time, accuracy, etc.”). The protester replies by referencing two items in Karna's proposal: (1) a statement that Karna's proposed claims processing system has “current [DELETED] rates consistently [DELETED]”; and (2) a table illustrating Karna's “anticipated [DELETED].” Comments at 66-67 (citing AR, Exh. 3.1f, Karna Initial Proposal--Claims Processing Vol. at 14-15). Thus, in the protester's view, Karna's proposal “demonstrates, with quantitative support, that Karna's claim intake is more than [DELETED] by [the system's] [DELETED], pursuant to timetables that exceed Solicitation requirements.” Comments at 67.
While Karna's proposal does state the firm's proposed system has an auto-adjudication rate “consistently above [DELETED] [percent],” the protester does not point to, nor did our review of the record find, any supporting data relative to this statement. Similarly, the table of [DELETED] timelines noted by Karna sets forth the firm's projected timelines based on its proposed approach, not metrics of past timelines previously achieved by the offered system. In sum, the agency's explanation that the metrics information provided in Karna's proposal did not speak to the accuracy and timeliness of the firm's proposed claims processing system is confirmed by our review of the contemporaneous record. Accordingly, we find unavailing the protester's contention that the evaluators ignored information in Karna's proposal when they assessed this confidence decrease, and we deny this ground of protest. See Network Runners, Inc., B‑418268, B‑418268.2, Feb. 14, 2020, at 7-10.
Another of the ways in which offerors could increase “[t]he Government's confidence” under the claims processing subfactor was by proposing “approaches that leverage an agile and flexible reporting system to provide the Program visibility into claims data and trends.” RFP at 270. The protester argues the TEP should have assessed a confidence increase in Karna's proposal because the firm proposed a “[DELETED]” that “offers comprehensive [DELETED], and [DELETED].” Protest at 51; Comments at 68. The CDC responds the evaluation did credit Karna for “its proposed [DELETED] platform.” COS at 23. The protester replies the agency is wrong because the TEP “did not even mention Karna's [DELETED] in its Subfactor 6 [claims processing] evaluation.” Comments at 69. We find the protester's argument to be without merit.
The protester's argument focuses on the TEP not using the specific term “[DELETED],” but ignores the content of the evaluation, which assessed a confidence increase for the particular aspects of the “[DELETED]” tool for which Karna seeks credit. Specifically, the TEP assessed the following confidence increase:
Offeror describes its proposed reporting to include a combination of quantitative and qualitative data that can be [DELETED], which increases confidence in the usability of the reports to provide visibility and actionable insights [into the] Program.
AR, Exh. 5.1, Tech. Eval.--Original Proposals at 37. As the basis for this confidence increase, the TEP cited to pages of Karna's proposal on which the firm discussed its proposed “[DELETED]” tool. Id.; AR, Exh. 3.1f, Karna Initial Proposal--Claims Processing Vol. at 17-18 (“Team Karna will report [DELETED] available through the [DELETED]”; “Team Karna's [DELETED] . . . presents [DELETED] . . . via the [DELETED]”). Accordingly, we find the record contradicts the protester's contention of a missed strength and deny this ground of protest. See General Dynamics Info. Tech., Inc., B-420589, B-420589.2, June 15, 2022, at 14.
Past Performance Evaluation
The protester takes issue with the CDC's failure to comparatively assess past performance. The solicitation required offerors to submit at least three, and no more than five, “recent and relevant examples of past performance,” and permitted offerors to submit examples of their performance as prime contractors or subcontractors as well as to submit examples of performance for their proposed teaming partners and subcontractors. RFP at 262-263. The agency would “evaluate the offeror's past performance information for recency, relevancy, and quality.” Id. at 271.
The record shows both Karna and GDIT submitted five past performance references; two of Karna's five references were for its performance as the prime contractor on the predecessor HPS contract and the current HPS “bridge” contract; and one of GDIT's five references was for its performance as a subcontractor to Karna on the predecessor HPS contract.[14] AR, Exh. 5.5, Past Performance Eval. at 7-11. The TEP assessed multiple confidence increases and no confidence decreases in either Karna's or GDIT's past performance, and assigned both proposals a rating of substantial confidence for this factor. Id. at 9, 12. For both Karna and GDIT, the source selection decision provides the following conclusion:
[Offeror] received an overall rating of Substantial Confidence for Past Performance. [Offeror] offered relevant past performance under all tasks and functional areas of the TPA. [Offeror] presented and the government collected evidence of positive past customer experiences and CPARS evaluations.
AR, Exh. 5.9, Award Decision at 13, 17.
The protester argues the agency failed to conduct a comparative assessment of offerors' past performance as required by the FAR and the solicitation. Protest at 81‑83. The protester contends “Karna submitted past performance references that were more recent, more relevant, and of higher quality than did GDIT,” and the agency's failure to conduct a proper evaluation “diminished the comparative strength of Karna's past performance and stripped Karna of a critical differentiator.”[15] Protest at 84; Supp. Protest at 71-72.
In response to this argument, the agency first contends the CDC was not required to conduct a comparative assessment of offerors' past performance. MOL at 53. Next, the agency characterizes the differences between Karna's and GDIT's past performance records as “relatively minor in their totality,” and asserts that while Karna's past performance was superior it was only “by a hair.” COS at 34. The CDC further notes the past performance factor was less important than four of the technical subfactors, and maintains Karna's superior past performance was “not enough to outweigh GDIT's superiority in all other technical factors.” Id.
The protester disputes the agency's contention that no comparative assessment of offerors' past performance was required. Comments at 45. Also, contrary to the CDC's characterization of the difference between Karna's and GDIT's past performance as minor, the protester insists “offerors' past performance records in this procurement differ materially.” Id. at 46.
We need not resolve these disagreements between the parties about whether a comparative assessment of past performance was required, or the degree of difference between proposals such an assessment might have reflected, because Karna cannot show it was competitively prejudiced by any errors that may have occurred here. Competitive prejudice is an essential element of a viable protest; where the protester fails to demonstrate that, but for the agency's actions, it would have had a substantial chance of receiving the award, there is no basis for finding prejudice, and our Office will not sustain the protest even if deficiencies in the procurement are found. SeaTech Security Sols.; Apogee Grp., LLC, supra at 17.
Here, even if the SSA had recognized Karna's past performance as superior to the awardee's, GDIT still received a rating of substantial confidence for this evaluation factor. Moreover, GDIT's proposal was higher-rated than Karna's under three of the four technical subfactors the solicitation identified as more important than past performance, and equally-rated under the fourth such subfactor. AR, Exh. 5.9, Award Decision at 17-18. Additionally, GDIT's proposal was higher-rated than Karna's for three of the six technical subfactors the solicitation identified as less important than past performance, and equally-rated for the remaining three of those subfactors. Id. Further, GDIT's proposal was lower-priced than Karna's. Id. at 19.
In sum, however much greater the SSA's confidence might have been in Karna's past performance than in GDIT's--both having been rated substantial confidence under the past performance factor--we cannot conclude Karna's advantage in this regard would have been sufficient to overcome GDIT's technical superiority and lower price given the relative importance of the evaluation factors and subfactors established by the solicitation. Accordingly, we find no basis to conclude the protester was prejudiced by any error that may have occurred in the agency's past performance evaluation. See e.g., Perspecta Enter. Sols., LLC, B-418533.2, B-418533.3, June 17, 2020, at 30 (finding no competitive prejudice where even if protester had been evaluated as superior under past performance factor awardee would have remained higher-rated under other more important factors and was lower-priced).
Price Realism Evaluation
The protester contends the price evaluation was flawed because the CDC failed to assess offerors' proposed levels of effort for realism. As noted above, the solicitation established the agency would evaluate proposed prices for realism, and explained the purpose of this evaluation would be “to determine whether proposed prices are realistic for the work to be performed, reflect a clear understanding of the requirements, and are consistent with the offeror's unique method of performance.” RFP at 272-273. The record reflects the agency did evaluate proposed prices for reasonableness, realism, and balance. AR, Exh. 5.9, Award Decision at 19‑20; see also AR, Exh. 5.11, Technical Review of Business Proposals; Exh. 5.12, Price Realism GDIT Other Direct Costs Analysis; Exh. 5.13, Price Realism Karna Other Direct Costs Analysis; Exh. 5.14, TPA Price Realism and Unbalanced Pricing Analysis Labor Rates at 1. Relevant here, with respect to labor hours, the SSA found “[t]he offerors all proposed different labor mixes and hours, but the composition of labor hours was found to be realistic based upon the proposed approach presented by each offeror.” AR, Exh. 5.9, Award Decision at 20. The SSA determined the level of effort proposed by each offeror, based on its unique approach, to be realistic. Id.
The protester asserts that “[d]espite the Solicitation's express requirements, the Agency failed to consider critical aspects of the offerors' proposals for realism (including proposed levels of effort).” 2nd Supp. Protest at 5. The protester characterizes the record as revealing “that the CDC analyzed only the realism of proposed labor rates and ODCs and not the levels of effort,” and contends “no contemporaneous record exists of the CDC's realism assessment of offerors' proposed levels of effort.” Comments at 5. In support of this characterization, the protester cites to agency report exhibits 5.12, 5.13, and 5.14. Id. Additionally, the protester raises concerns with the CDC's purported failure to evaluate the realism of offerors' proposed labor hours in relation to two specific contract line items (CLINs) under which Karna proposed a higher level of effort than GDIT. See generally 2nd Supp. Protest at 6-9. The protester argues “[t]he Agency merely accepted that GDIT proposed a cheaper solution without analyzing the risks entailed in markedly lower levels of effort on important CLINs or whether the additional levels of effort Karna offered merited a price premium.” Id. at 9‑10.
The agency responds that as part of its price evaluation, the CDC compared each offeror's technical approach to the basis of estimate and proposed staffing each offeror provided in its business proposal “to determine whether the labor categories and level of effort were realistic for the approach as described, reflected a clear understanding of the requirements, and were consistent with the Offeror's unique method of performance.” COS at 45.
The manner and depth of an agency's price realism analysis is a matter within the sound exercise of an agency's discretion. Criterion Corp., B-422309, Apr. 16, 2024, at 4. In considering a protest against the propriety of such an evaluation, we will review to ensure the evaluation was reasonable and consistent with the solicitation and applicable procurement statutes and regulations. MPZA, LLC, B-421568.3, Dec. 14, 2023, at 8. When, as here, a solicitation does not prescribe a method of performance but permits offerors to propose their own technical solutions, a price realism analysis must include consideration of offerors' proposed approaches in order to be reasonable. Criterion Corp., supra at 4.
Here, our review of the record finds three primary problems with the protester's arguments. First, Karna's comparison of its own level of effort and that proposed by GDIT under the two identified CLINs ignores the fact that the solicitation did not mandate any particular level of effort or technical solution, but instead permitted offerors to propose their own solutions with corresponding levels of effort and required offerors to submit basis of estimate explanations for their proposed levels of effort as part of their price proposals. RFP at 256; COS at 45-46 (protester's “attempt to make an apples-to-apples comparison of Karna's proposed approach to GDIT's proposed approach demonstrates a lack of understanding of significant variations across the two proposals”).
Second, had the agency been required to compare offerors' levels of effort as the protester insists, such a comparison would have had to occur under every CLIN--not just the two CLINs focused on by the protester (seemingly because the CLINs happen to be ones for which Karna proposed a higher level of effort than GDIT). Such a comparison also would have had to look at other CLINs where GDIT, not Karna, proposed the higher level of effort--a fact the protester's argument ignores. For example, in the base year period of performance under CLIN 0001 (Transition‑In Related Services), Karna proposed approximately [DELETED] percent of the total labor hours proposed by GDIT. Compare AR, Exh. 2.5d, GDIT Final Proposal, Business Vol., Basis of Estimate at 9, 12 (proposing [DELETED] labor hours for CLIN 0001) with Exh. 3.2i, Karna Final Proposal, Pricing Workbook at worksheet “Base Transition-In Period,” cells G4-G86 (proposing [DELETED] labor hours for CLIN 0001); Exh. 3.2j, Karna Final Proposal, Business Vol., Basis of Estimate at 12-16 (internal page numbers); see also COS at 46. Thus, applying the protester's own logic that a lower level of effort necessarily equates to a riskier proposal, GDIT, not Karna, offered a less‑risky approach under this CLIN.
Third, while the protester references several documents in the evaluation record (e.g., AR exhibits 5.12, 5.13, and 5.14) to support its arguments, Karna fails to acknowledge a fourth price evaluation document in the record: exhibit 5.11, the agency's price proposal technical review. This contemporaneous evaluation document shows that the CDC conducted a technical assessment of each offeror's business volume, which, in part, specifically focused on whether each offeror's proposed level of effort was reasonable for the work to be performed in the manner proposed by that offeror. AR, Exh. 5.11, Technical Review of Business Proposals at 1, 15-17, 31-33. Based on this assessment, the evaluators concluded both GDIT's and Karna's unique proposed approaches, “including labor hours, categories, rates, and other direct costs,” were “appropriate for this specific effort.” Id. at 26, 41.
Further, our review of the contemporaneous assessment documented in exhibit 5.11 and the corresponding evaluation of offerors' unique technical approaches provides no basis for us to question the agency's conclusion that GDIT's proposed level of effort was realistic. As an example, for CLIN X002--one of the two CLINs under which Karna contends GDIT's level of effort is too low--the evaluators assessed GDIT's proposed “use of [DELETED] . . . to perform data entry [DELETED] for enrollments and certifications” as a factor that increased confidence because it would enable the firm to “perform front-end data entry” in a manner “that leverages [DELETED] to achieve more efficient and streamlined operations.” AR, Exh. 5.1, Tech. Eval.--Original Proposals at 25. In other words, the evaluators assessed a strength in GDIT's proposed technical approach because the firm's use of [DELETED] tools created efficiencies that correlated to a reduction in required data entry labor. See id.; AR, Exh. 2.5d, GDIT Final Proposal, Business Vol., Basis of Estimate at 21 (describing basis of estimate for efficiencies from use of [DELETED]). The agency explains that “[i]n contrast, Karna's approach relies [DELETED].” COS at 47; compare AR, Exh. 2.5d, GDIT Final Proposal, Business Vol., Basis of Estimate at 21 (proposing [DELETED] labor hours for processors as part of the CLIN X002 labor mix in option period 1) with AR, Exh. 3.2j, Karna Final Proposal, Business Vol., Basis of Estimate at 26 (internal page number) (proposing [DELETED] labor hours for processors as part of the CLIN X002 labor mix in option period 1).
For the foregoing reasons, we deny the protester's price evaluation challenges.[16] See e.g., MPZA, LLC, supra at 12 (denying protest that agency failed to assess realism and appropriateness of awardee's custom level of effort and labor mix where “contentions [were] belied by the record”); ValidaTek, Inc., B-407623, Jan. 17, 2013, at 4 (denying protest where RFP provided for performance-based solutions rather than specifying set level of effort, and agency reasonably found level of effort and related pricing realistic).
Best-Value Determination
As a final matter, the protester challenges the agency's source selection decision, arguing the multiple alleged evaluation errors resulted in a flawed best-value determination.[17] Protest at 89; Supp. Protest at 75; Comments at 92. This allegation is derivative of Karna's various challenges to the agency's evaluation of proposals and conduct of discussions, which we denied above. Accordingly, Karna's remaining challenge to the source selection decision is without a basis. MPZA, LLC, supra at 12.
The protest is denied.
Edda Emmanuelli Perez
General Counsel
[1] Unless otherwise noted, reference to the solicitation is to the RFP at AR, Exh. 1.10 and citations to documents in the record use Adobe PDF pagination.
[2] The solicitation explained: “The Government seeks out and will give evaluation credit for technical solutions exceeding mandatory minimums in the PWS that are concrete, specific, meaningful to the Government, would positively impact Program members, and for which the offeror agrees to incorporate such technical solutions into any resulting contract. This is referred to here as ‘betterment.'” RFP at 266.
[3] Prices are rounded to the nearest dollar.
[4] Karna does not challenge the agency's evaluation of proposals under the technical subfactors: (1) small business subcontracting, which was assessed on a pass/fail basis; and (2) data and business information management, for which Karna received a rating of high confidence. AR, Exh. 5.9, Award Decision at 13, 18; see generally Protest. For a third technical subfactor--contract management, regulatory compliance, and transition‑out--Karna initially challenged the agency's evaluation of proposals, but subsequently withdrew its challenge. See Protest at 59-65; Resp. to Req. for Partial Dismissal at 7 n.2.
[5] Karna also initially protested the agency's: (i) treatment of betterments as part of proposal evaluation; (ii) assessment of GDIT's price with respect to alleged unfair leveraging of an incumbent information technology system; (iii) purported failure to document the evaluation of offerors' virtual sessions; and (iv) unreasonable evaluation of and failure to conduct discussions about other direct costs. See generally Protest at 30-32, 85-88; Supp. Protest at 5-13; 2nd Supp. Protest at 15-23. Karna later withdrew these arguments; accordingly, we do not address them further. Comments at 4 n.1.
[6] While we do not address in detail the corporate experience technical subfactor (another of the four most important evaluation factors), we provide here examples of why we find no basis to sustain the protester's challenges under this subfactor. First, we find no merit in the protester's evaluation challenges based on arguments related to the quality of Karna's and GDIT's past work. See e.g., Protest at 56-58 (alleging agency ignored contractor performance assessment reporting system (CPARS) information for Karna's and GDIT's performance as incumbent prime and incumbent subcontractor, respectively, and ignored other publicly available information about GDIT's purported low quality performance of a different contract). Here, the solicitation advised offerors corporate experience would be evaluated “using the Corporate Experience Questionnaire” provided as an RFP attachment, and that “[n]o additional material” would be considered. RFP at 264, 271. Moreover, the protester's arguments are not germane to the agency's evaluation of offerors' corporate experience, which focuses on whether a firm does or does not have experience performing similar work and does not focus on the quality of such work, because the quality of prior work is the focus of a past performance evaluation. Cydecor Inc., B-422942, B-422942.2, Dec. 23, 2024, at 13 n.7. Second, we find no merit in the protester's evaluation challenges based essentially on Karna's incumbency. The protester argues, among other things, that it was unreasonable for the firm's proposal to not be assigned the highest possible rating of substantial confidence because “no other offeror can match Karna's corporate experience given that Karna for the past nine years has provided the Agency with TPA services in many ways identical” to those solicited. Protest at 53; see also id. at 55. 
The record shows the evaluators did consider Karna's incumbent experience to be a strength, assessing a confidence increase because “[t]he offeror has experience with and working knowledge of WTCHP . . .”. AR, Exh. 5.1, Tech. Eval.--Original Proposals at 37. As noted above, however, the solicited requirement is not a direct follow-on to Karna's incumbent HPS contract but, rather, is a combination of the incumbent effort with a different contract related to the WTCHP (i.e., the Program). Further, as our decisions have explained, a protester's apparent belief that its incumbency status entitles it to a higher rating provides no basis for finding an evaluation unreasonable. Systems Implementers, Inc.; Transcend Tech. Sys., LLC, supra at 16; Cydecor Inc., supra at 13. We similarly find no merit in Karna's incumbency-based challenges to the CDC's evaluation under the transition-in technical subfactor (the fourth of the most important evaluation factors). Protest at 43, 46-48; Comments at 58-59. Finally, we also find the differences in Karna's and GDIT's evaluations under the corporate experience and key personnel subfactors stemmed from differences in the offerors' proposals, and did not result from unequal or disparate evaluation, as claimed by the protester. See generally Comments at 74-76, 89-91. To prevail on an allegation of disparate treatment, a protester must show that the agency unreasonably evaluated its proposal in a different manner than another proposal that was substantively indistinguishable or nearly identical. Systems Implementers, Inc.; Transcend Tech. Sys., LLC, B-418963.5 et al., June 1, 2022, at 17. Having failed to meet its burden, we find no merit to Karna's arguments in this regard.
[7] The agency prepared its evaluation documents in an iterative fashion, with each successive review addressing only elements of offerors' proposals that were revised as a result of a particular round of discussions. See AR, Exh. 5.1, Tech. Eval.--Original Proposals; Exh. 5.2, Tech. Eval.--1st Round Proposal Revisions; Exh. 5.3, Tech. Eval.--2nd Round Betterments Revisions; Exh. 5.4, Tech. Eval.--3rd Round Final Proposal Revisions. Consequently, there is no single document in the record encompassing the entirety of the agency's evaluation of offerors' final proposals. Rather, for aspects of proposals that remained unchanged throughout discussions, the final evaluation is reflected in agency report exhibit 5.1 setting forth the evaluation of initial proposals. For other aspects of proposals that changed in response to the first round of discussions but did not change again thereafter, the final evaluation is reflected in agency report exhibit 5.2, setting forth the evaluation of offerors' first set of proposal revisions, and so on.
[8] The protester makes this same argument for the other two technical subfactors under which Karna's proposal received a rating of some confidence: member services (4 confidence increases, 4 confidence decreases, 1 decrease considered a significant weakness) and key personnel (2 confidence increases, 2 confidence decreases, 0 significant weaknesses). Id.; AR, Exh. 5.1, Tech. Eval.--Original Proposals at 39-40, 43; Exh. 4.4, Karna 1st Round Discussions Items at 1-2.
[9] In the context of discussions, the FAR defines a “significant weakness” as a proposal “flaw that appreciably increases the risk of unsuccessful contract performance.” FAR 15.001.
[10] The protester also raises this general argument--that the assignment of a certain rating necessarily indicates the assessment of significant weaknesses--with respect to the five factors under which Karna's proposal received a rating of moderate confidence; a rating which indicated the risk of unsuccessful performance was “no worse than moderate.” Protest at 27-29; Comments at 26-27; AR, Exh. 1.11, RFP Attach. 1.14--Technical Ratings at 1. For the same reasons we decline to make this finding in relation to a rating of some confidence, we also decline to make this finding in relation to a rating of moderate confidence.
[11] The protester additionally argues one of the two confidence decreases under the key personnel subfactor was required to be raised during discussions; the protester does not identify any of the four confidence decreases under the member services subfactor that should have been considered significant weaknesses. Supp. Protest at 65-66, 68, and generally at 50-56. As explained below, Karna's characterization of two additional confidence decreases under the provider network management subfactor as significant weaknesses expresses nothing more than the protester's disagreement with the evaluators' judgment; we reach the same conclusion with respect to Karna's similar characterization of one of the two confidence decreases under the key personnel subfactor.
[12] The protester contends both the initial assessment of the significant weakness and the evaluators' continued concern following proposal revisions are based on a misreading of Karna's proposal. Protest at 34-35; Supp. Protest at 20-21; Comments at 52. The CDC maintains the evaluators properly found fault with [Karna's] proposal, and that the negative evaluation is reflective of the protester's failure to submit a well-written proposal. MOL at 25. Having reviewed the solicitation, Karna's various proposal submissions, the contemporaneous record of evaluation and discussions, and the contracting officer's explanation provided in response to the protest (COS at 14-17), we find the TEP reasonably assessed the significant weakness, and that the evaluators reasonably continued to have concerns following proposal revisions.
[13] The protester initially claimed three missed confidence increases, or strengths--one for the [DELETED], one for Karna's claims processing staffing approach, and one for Karna's proposed use of an [DELETED] methodology; in its comments on the agency report, Karna continues to claim credit for only one of the three, however. Compare Protest at 51-52 with Comments at 68-69. The protester also initially argued the agency conducted discussions related to the claim processing subfactor in a manner that was unequal and not meaningful. Protest at 27‑28; Supp. Protest at 37-40. While Karna continues to pursue its contention that discussions for the claims processing subfactor were not meaningful (Comments at 41‑43), the protester's arguments in this regard largely mirror those we addressed above with respect to the conduct of discussions under the provider network management subfactor. As we found no merit in those arguments, we similarly find no merit in Karna's contention the agency failed to engage in meaningful discussions for the claims processing subfactor.
[14] As noted above, HPS refers to the Health Program Support contract, which is being combined with aspects of another contract to create the currently solicited effort for a TPA (third party administrator) contractor. Karna is the incumbent HPS prime contractor and GDIT is a subcontractor to Karna under the incumbent HPS effort.
[15] The protester also initially challenged the TEP's assignment of a rating of substantial confidence to GDIT's proposal, arguing that alleged issues with the quality of the firm's past performance should have resulted in GDIT being assigned a lower rating. Protest at 78-80. Karna did not continue to pursue this line of argument in its comments on the agency report. See generally Comments at 43-48. Accordingly, we address here only the protester's argument that Karna's rating of substantial confidence was superior to GDIT's rating of substantial confidence under the past performance factor.
[16] The protester also takes issue with the agency's conduct of discussions related to offerors' proposed level of effort. See 2nd Supp. Protest at 10. The protester's argument has two primary prongs. First, that “the Agency could not have engaged in meaningful price discussions” because it “failed to properly evaluate and analyze the offerors' proposed levels of effort.” Comments at 15. Second, that the discussions the agency did conduct related to level of effort “failed to get to the bottom of the issue: that the offerors proposed vastly different levels of effort under certain CLINs.” 2nd Supp. Protest at 10; see also id. at 12-15 (e.g., the agency's discussions “deprived Karna of the opportunity to either explain to the CDC the need for (and benefits of) the levels of effort on multiple important CLINs (and correspondingly how any lower level of effort would present risk that the CDC should not accept to the Program)”). With respect to the first argument, as explained above, the protester simply is incorrect that the agency did not evaluate proposed levels of effort. Regarding the second argument, again, as noted above, the solicitation permitted offerors to propose unique technical approaches with customized labor mixes and levels of effort, which the record shows the agency accounted for in assessing whether the supposedly “vastly different levels of effort” proposed by each offeror were appropriate for each firm's unique approach. Accordingly, we find unavailing the protester's challenge to the agency's conduct of discussions related to price.
[17] Karna also initially maintained that the materials made available during the firm's debriefing indicated “the Agency conducted a mechanical comparison of the unreasonably assigned technical subfactor confidence ratings when making [the] award decision.” Protest at 89. The protester did not further pursue this line of argument in its later filings responding to the record and agency report. See Supp. Protest at 75‑76; Comments at 92-94. To the extent this argument has not been abandoned, we note the record provides no support for the protester's characterization of the SSA's decision as “mechanical.” Rather, the record shows the SSA considered the merits of Karna's and GDIT's proposals, and based on this consideration found GDIT's proposal offered “the best value as the highest‑technically-rated offeror with the lowest proposed price.” See generally AR, Exh. 5.9, Award Decision at 7-8, 11-13, 18-21.