Palmetto GBA, LLC; CGS Administrators, LLC
B-407668; B-407668.2; B-407668.3; B-407668.4: Jan 18, 2013
Palmetto GBA, LLC, of Columbia, South Carolina, and CGS Administrators, LLC, of Nashville, Tennessee, protest the award of a contract to Noridian Administrative Services, LLC, of Fargo, North Dakota, under request for proposals (RFP) No. RFP-CMS-2011-0026, issued by the Department of Health and Human Services, Centers for Medicare and Medicaid Services (CMS), to obtain a Medicare Administrative Contractor (MAC) to provide services for the administration of Medicare Part A and Medicare Part B (A/B) fee-for-service benefit claims. The protesters argue that CMS's evaluation of proposals, and selection of Noridian's proposal for award, were unreasonable.
We deny the protests.
DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.
Matter of: Palmetto GBA, LLC; CGS Administrators, LLC
File: B-407668; B-407668.2; B-407668.3; B-407668.4
Date: January 18, 2013
1. Protests challenging the agency's evaluation of proposals under the solicitation's past performance, implementation, and technical understanding factors are denied where the record shows that the evaluation was reasonable and consistent with the solicitation.
2. Protests challenging the agency's evaluation of the protesters' and awardee's proposed costs for realism, and upward adjustment of those costs, are denied where the evaluation and upward adjustments to the proposed costs were reasonable.
The RFP provided for the award of a cost-plus-award-fee contract for an implementation period of up to six months, a base period of one year with four 1-year option periods, and an optional outgoing contractor transition period of up to six months, for A/B MAC services in Jurisdiction E. RFP at 2, 13, 80, 91. Jurisdiction E includes California, Hawaii, and Nevada, as well as American Samoa, Guam, and the Northern Mariana Islands. Id. at 2.
The RFP's statement of work (SOW) requires the contractor to provide all necessary personnel, material, equipment, and facilities to perform the specified A/B MAC services. In this regard, the A/B MAC will receive and control Medicare claims from institutional and professional providers, suppliers, and beneficiaries within its jurisdiction, and will perform standard or required editing on these claims to determine whether the claims are complete and should be paid. SOW at 2. The A/B MAC will also calculate Medicare payment amounts and arrange for remittance of these payments to the appropriate party, enroll new providers, conduct redeterminations on appeals of claims, operate a "Provider Customer Service Program . . . that educates providers about the Medicare program and responds to provider telephone and written inquiries," respond to complex inquiries from Beneficiary Contact Centers, and make coverage decisions for new procedures and devices in local areas. Id. The SOW also identified two requirements specific to Jurisdiction E, related to the Rural Community Hospital Demonstration Program and the designation of California as a high-risk fraud and abuse area. Id. at 156-159.
The RFP stated that award would be made to the offeror submitting the proposal found to provide the best value to the agency, based upon cost, and the following non-cost evaluation factors: past performance (40% relative weight), technical understanding (40% relative weight), and implementation (20% relative weight). RFP at 138, 142. The RFP further provided that as part of the evaluation of each of the non-cost factors, the agency would consider four common elements: customer service, financial management, operational excellence, and innovations and technology. RFP at 143. Offerors were informed that the non-cost factors, when combined, were significantly more important than cost. RFP at 138.
Noridian, CGS, and Palmetto submitted proposals by the solicitation's closing date. The offerors also delivered oral presentations to the agency, as required by the RFP.
A CMS technical evaluation panel (TEP) evaluated the offerors' non-cost proposals. The TEP employed a rather convoluted evaluation scheme that resulted in numeric scores for proposals under each of the non-cost evaluation factors. With regard to the past performance factor, the TEP first created a "Baseline Score . . . by converting the adjectival ratings present in the National Institute[s] of Health (NIH)/Contractor Performance Assessment [Report] (CPAR), and the Award Fee percentage earned, into a total point value." Agency Report (AR) at 4. For example, an adjectival CPAR rating of "exceptional" for a particular aspect of a contractor's performance would be assigned a numeric rating of 0.9, while a CPAR rating of "satisfactory" would be assigned a numeric rating of 0.5. AR at 4; Tab 9a, TEP Report, at 8. The agency used a somewhat similar scheme with regard to NIH reports, assigning, for example, a numeric rating of 0.9 to a rating of "outstanding," and a numeric rating of 0.5 to a rating of "good." AR at 5; Tab 9a, TEP Report, at 9-10. Finally, with regard to award fee determinations, the agency assigned, for example, a numeric rating of 0.8 if the contractor had earned 91 percent to 100 percent of the available award fee, and a numeric rating of 0.5 if the contractor had earned from 1 percent to 50 percent of the available award fee. AR at 4; Tab 9a, TEP Report, at 8.
CMS explains that the TEP then determined "the average NIH/CPARS scores for each evaluation period, and the average rating for all Award Fees earned." AR at 6; see AR, Tab 9a, TEP Report, at 10. The TEP next "averaged the single Award Fee score . . . with each of the NIH/CPARS evaluation period scores" to determine the initial baseline score for that offeror. Id. The agency explains that "[b]y doing it this way, the TEP placed more weight on the NIH and CPARS scores and less weight on the Award Fee scores in the calculation." Id.
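As an illustration only, the baseline-score arithmetic described above can be sketched as follows. The point values are the examples given in the record; the averaging rule is one plausible reading of the TEP's description, and the offeror scores used in the example are hypothetical:

```python
# Illustrative sketch of the TEP's baseline-score arithmetic as described in
# the Agency Report. The point values below are the examples given in the
# record; the averaging rule is one plausible reading of the TEP's method.

CPAR_POINTS = {"exceptional": 0.9, "satisfactory": 0.5}  # examples from AR at 4
NIH_POINTS = {"outstanding": 0.9, "good": 0.5}           # examples from AR at 5

def baseline_score(period_scores, award_fee_score):
    """Average the single award-fee score together with each NIH/CPARS
    evaluation period score. Because the award-fee score enters the
    average only once, the NIH/CPARS period scores carry more weight."""
    scores = list(period_scores) + [award_fee_score]
    return sum(scores) / len(scores)

# Hypothetical offeror: three evaluation period scores plus one award-fee score.
print(round(baseline_score([0.9, 0.7, 0.5], 0.8), 3))  # 0.725
```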
The agency explains, and the record reflects, that the baseline numeric scores produced "were not the deciding factor for determining an Offeror's overall Past Performance numeric score," but rather were used by the TEP "as one tool in determining an overall Past Performance rating." AR, Tab 9A, TEP Report, at 10. In this regard, the TEP also weighed the narrative in each of the above reports considered, as well as the source documents for each formal performance report. Id. at 11. This included the TEP's review of source materials, such as contractor monthly status reports, Medicare contractor provider satisfaction survey results, comprehensive error rate testing (CERT) results, and contractor rebuttals to NIH reports and CPARs. Id. The record reflects that the TEP "deliberated and reached agreement with respect to the findings for each Offeror," and then deliberated as to "whether the baseline numeric score (score assigned to Award Fees and NIH/CPARS evaluations) was a valid indicator of each Offeror's past performance given the findings." Id. at 12. Finally, depending on the outcome of the deliberations, and the mix of relevant strengths, weaknesses, and significant weaknesses, the TEP adjusted the baseline score to determine an overall numeric score for each Offeror's Past Performance. Id.
The TEP's evaluation of proposals under the technical understanding and implementation factors was less convoluted. In this regard, the TEP first identified strengths, weaknesses, significant weaknesses, and deficiencies in the proposals. After the TEP reached consensus on all findings, the TEP members assigned an individual numeric rating to each evaluation factor that assessed "the value of a proposal's assertions and promises in light of the Offeror's capability as determined by the TEP through review of all pertinent findings for a given evaluation factor." Id. at 6-7. The numeric ratings, which under the scheme here ranged from 0.1 to 0.9, indicated whether the proposal was more likely to succeed than fail (0.6-0.9), equally likely to succeed or fail (0.5), or more likely to fail than succeed (0.1-0.4). Id.
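The rating bands described above amount to a simple classification of the numeric scores. A minimal sketch, using the band boundaries stated in the TEP report:

```python
# Sketch of the 0.1-0.9 rating scale used for the technical understanding
# and implementation factors, per the bands described in the TEP report.

def rating_band(score: float) -> str:
    """Map a TEP numeric rating to its descriptive band."""
    if not 0.1 <= score <= 0.9:
        raise ValueError("rating outside the 0.1-0.9 scale used here")
    if score >= 0.6:
        return "more likely to succeed than fail"
    if score == 0.5:
        return "equally likely to succeed or fail"
    return "more likely to fail than succeed"

print(rating_band(0.87))  # more likely to succeed than fail
```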
A separate agency business evaluation panel (BEP) evaluated the offerors' cost proposals for reasonableness and realism. CMS explains that the TEP assisted in this review by confirming whether the proposals accurately reflected a clear understanding of the RFP requirements, and by confirming whether the business proposals were consistent with the relevant technical proposals. AR at 6.
The final numeric scores assigned by the agency to the proposals submitted by Noridian, CGS, and Palmetto; the scores as multiplied by their relative weights; the offerors' proposed costs; and the agency-calculated probable costs were as follows:
                                Noridian            CGS                 Palmetto
Implementation (20%)            .70 x 20% = .14     .80 x 20% = .16     .90 x 20% = .18
Technical Understanding (40%)   .90 x 40% = .36     .60 x 40% = .24     .70 x 40% = .28
Past Performance (40%)          .87 x 40% = .35     .70 x 40% = .28     .40 x 40% = .16
Total Weighted Score            .85                 .68                 .62
Proposed Costs                  [DELETED]           [DELETED]           [DELETED]
Probable Costs                  [DELETED]           [DELETED]           [DELETED]
AR, Tab 20A, Source Selection Decision, at 7, 29.
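As an arithmetic check only, the weighted totals follow directly from the ratings and weights. The past performance ratings (.87, .70, and .40 for Noridian, CGS, and Palmetto) are confirmed elsewhere in the record; the pairing of the remaining ratings with the technical understanding and implementation factors is an inference from the table data:

```python
# Sketch of the weighted-score arithmetic. The past performance ratings are
# confirmed in the TEP report; the assignment of the other ratings to the
# technical understanding and implementation factors is inferred.

WEIGHTS = {"past performance": 0.40, "technical understanding": 0.40,
           "implementation": 0.20}

RATINGS = {
    "Noridian": {"past performance": 0.87, "technical understanding": 0.90,
                 "implementation": 0.70},
    "CGS":      {"past performance": 0.70, "technical understanding": 0.60,
                 "implementation": 0.80},
    "Palmetto": {"past performance": 0.40, "technical understanding": 0.70,
                 "implementation": 0.90},
}

def total_weighted_score(offeror: str) -> float:
    """Sum each factor rating multiplied by its relative weight."""
    return sum(RATINGS[offeror][f] * w for f, w in WEIGHTS.items())

for name in RATINGS:
    print(name, round(total_weighted_score(name), 2))
# Noridian 0.85 / CGS 0.68 / Palmetto 0.62
```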
The contracting officer (CO) served as the source selection authority (SSA). In her role as the SSA, the CO performed an independent review of the proposals, and prepared a lengthy source selection decision addressing the strengths, weaknesses, and significant weaknesses of the offerors' proposals as evaluated under the RFP's past performance, implementation, and technical understanding factors. CO's Statement at 7; see AR, Tab 20A, Source Selection Decision, at 1-61. The CO's source selection decision also discussed the offerors' proposed costs, as well as the cost realism adjustments and probable costs as evaluated by the agency. The CO ultimately concluded that Noridian's proposal, as evaluated by the agency under the non-cost factors and considering the proposals' probable costs, represented the best overall value to the government. In doing so, the CO also noted as follows:
I have also reviewed the proposals in light of CGS's and Palmetto's proposed costs--that is, the costs proposed by CGS and Palmetto prior to the cost realism analysis--as compared to [Noridian's] adjusted costs. Based on this review (where CGS and Palmetto were not cost adjusted, but [Noridian] was), I can state with certainty that I would nonetheless recommend an award to [Noridian] based on the past performance and technical approach discriminators detailed above in the trade-off analysis.
AR, Tab 20A, Source Selection Decision, at 60. After requesting and receiving debriefings, Palmetto and CGS filed these protests.
Palmetto and CGS raise numerous challenges to CMS's evaluation of the offerors' non-cost and cost proposals, and to the agency's conclusion that Noridian's proposal represented the best value to the agency. Although we do not specifically address all of Palmetto's and CGS's arguments, we have fully considered all of them and find that they afford no basis on which to sustain the protests.
Past Performance Evaluation
Baseline Numeric Scores
The protesters challenge CMS's past performance evaluation. In doing so, the protesters first argue at length, and in numerous different ways, that the process by which the agency calculated the offerors' baseline numeric scores under the past performance factor was flawed, and consequently rendered the final past performance evaluation scores unreasonable.
The protesters contend, for example, that the agency's evaluation process resulted in an unreasonable assessment of past performance because it assigned a score of 0.5, defined as "[e]qually likely to succeed or fail," to a "satisfactory" CPAR rating or a "good" NIH report rating. The protesters continue this argument by pointing out that under the scoring system adopted by the agency, such a rating would be assigned the same numeric value of 0.5 as an offeror who had received "a dismal 2%" of the available Award Fee under another contract. CGS Comments at 59; see AR at 4; Tab 9a, TEP Report, at 8. The protesters also argue that the process used by the agency in calculating the offerors' baseline past performance scores failed to account for trends in the offerors' past performance. Additionally, the protesters argue that the agency's process for calculating the offerors' baseline numeric scores placed too much emphasis on certain contracts, or certain performance periods of those contracts, and too little emphasis on others, and improperly excluded from consideration two categories of past performance evaluated under the CPAR system (management of key personnel and utilization of small business).
Although we are mindful of the protesters' concerns here, we do not find the agency's evaluation of the offerors' past performance unreasonable. In this regard, our Office has consistently recognized that ratings, be they numerical, adjectival, or color, are merely guides for intelligent decision-making in the procurement process. Where the evaluation and source selection decision reasonably consider the underlying basis for the ratings, including the advantages and disadvantages associated with the specific content of competing proposals, in a manner that is fair, equitable, and consistent with the terms of the solicitation, the protester's disagreement over the actual numerical, adjectival, or color ratings is essentially inconsequential in that it does not affect the reasonableness of the judgments made in the source selection decision. Similarly, the evaluation of proposals and consideration of their relative merit should be based upon a qualitative assessment of proposals consistent with the solicitation's evaluation scheme, and should not be the result of a simple count of the relative strengths and weaknesses assigned to the proposals during the evaluation process. Highmark Medicare Servs., Inc., et al., B-401062.5 et al., Oct. 29, 2010, 2010 CPD ¶ 285 at 19.
As mentioned previously, the record reflects that the TEP considered the baseline numeric scores produced as simply one tool used by the agency in evaluating the merits of the offerors' past performance. See AR, Tab 9A, TEP Report, at 10. As an initial matter, we note that the summaries of the CPAR and NIH reports prepared by the agency during its evaluation of the offerors' proposals include narratives discussing in detail the reasons for the relevant CPAR or NIH report ratings, evidencing that the TEP engaged in far more than the simple mathematical exercise of determining the numeric scores to be assigned to the offerors under the past performance factor. See AR, Tab 9G, Past Performance Findings-Palmetto; Tab 9H, Past Performance Findings-CGS; Tab 9I, Past Performance Findings-Noridian.
Additionally, and as will be discussed in more detail later with respect to the evaluation of each offeror, the TEP report includes lengthy narratives setting forth the TEP's views as to the relative strengths, weaknesses, or significant weaknesses associated with the offerors' past performance. AR, Tab 9A, TEP Report, at 19-24, 30-33, 37-40. The narrative for each offeror concludes with a section detailing the considerations of the TEP in reaching a consensus as to the merits of a particular offeror's past performance, and, where the TEP felt necessary, an adjustment to the baseline numeric score initially calculated by the TEP. Id. at 23-24, 32, 39-40. In this regard, as will be discussed in more detail below, the TEP, after focusing on the strengths, weaknesses, and significant weaknesses in each offeror's performance, made adjustments to each offeror's baseline past performance score. For Palmetto, the TEP revised the baseline score of 0.58 to a final score of 0.4; for CGS, the TEP revised the baseline score from 0.66 to a final score of 0.70; and for Noridian, the TEP revised the baseline score from 0.82 to a final score of 0.87. Id. at 23-24, 33, 40.
The record further reflects, again as mentioned previously, that in performing the source selection the CO did not simply rely on the numeric scores assigned to the offerors' proposals by the TEP under any of the non-cost factors. Rather, the CO, while considering the TEP's evaluation, engaged in her own review of the offerors' proposals. AR, Tab 20A, Source Selection Decision, at 39-61. In this regard, the CO's summary of her views of the offerors' respective past performance makes no mention of numeric past performance scores. Id. at 49-58. As such, while we understand the protesters' concerns regarding the process by which the TEP calculated the baseline numeric past performance scores, the calculations did not drive the decision here. Instead, the record shows that the decision was driven by a consideration of the relative strengths and weaknesses underlying the numerical scores. As a result, we find no basis to conclude that the agency's evaluation and source selection were unreasonable.
Next, the protesters argue that CMS's evaluation of their respective past performance, and that of Noridian, was unreasonable. Here, the protesters focus on the strengths, weaknesses, and significant weaknesses identified by the agency under the past performance factor.
Where a protester challenges an agency's past performance evaluation, we will examine the record to ensure that the evaluation was reasonable and consistent with the solicitation's stated evaluation factors and applicable statutes and regulations. Although an agency is not required to identify and consider each and every piece of past performance information, it must consider information that is reasonably available and relevant as contemplated by the terms of the solicitation. Where an agency has considered reasonably available and relevant past performance information, its judgments regarding the relative merits of competing offerors' past performance are primarily matters within the contracting agency's discretion, and the protester's mere disagreement with such judgments does not establish a basis for our Office to sustain a protest. Highmark Medicare Servs., Inc., et al., supra, at 16-17.
The solicitation informed offerors that "[t]he evaluation of past performance will be based on the Offeror's demonstrated ability (considering its performance of the requirements of other contracts of a similar nature, scope, and complexity as the MAC contract) to successfully meet the requirements of the SOW in this solicitation." RFP at 142. To facilitate this evaluation, the RFP provided that offerors could submit past performance information pertaining to themselves or their proposed subcontractors, but stated that the information need not relate to any MAC contract, Medicare fiscal intermediary (FI) contract, or Medicare carrier contract, as CMS would review information maintained in its internal records and/or with its personnel pertaining to such contracts. Id. at 111. The RFP noted here that the agency may consider information from a range of CMS internal and external sources to complete its past performance evaluation, including Section 912 Evaluations, Medicare Contractor Provider Satisfaction Survey (MCPSS) Results, Quality Assurance and Surveillance Plans, Reports of Contractor Performance, and Previous MAC Implementations. Id. at 112. We address the evaluation of each offeror's past performance in turn.
Palmetto argues that CMS's assignment of a final numeric rating of 0.4 to its proposal under the past performance factor was unreasonable. See AR, Tab 9A, TEP Report, at 6. Palmetto contends, for example, that the agency, in assigning Palmetto's past performance score of 0.4, placed far more emphasis than was warranted on Palmetto's Section 912 evaluations and its difficulties with certain aspects of its medical review strategy. In Palmetto's view, the agency should have given greater weight to its overall performance record as evidenced by its CPAR ratings. Palmetto Suppl. Comments at 18-20.
The TEP evaluated Palmetto's past performance as evidencing a number of strengths, noting, for example, that during the base year of Palmetto's A/B MAC Jurisdiction 11 contract, it had "met 100% of the standards reviewed" in financial management, claim processing, and provider customer service, which exceeded the national averages. AR, Tab 9A, TEP Report, at 19. As another example, the TEP noted that Palmetto's Jurisdiction 1 Audit & Reimbursement results "reflect strong performance," with the TEP specifically commenting on the upward trend in Palmetto's performance as evidenced by the improvement in this area reflected between Option Years 1 and 2. Id.
The TEP also evaluated certain aspects of Palmetto's past performance as weaknesses and significant weaknesses. For example, the TEP found that Palmetto's performance as the Jurisdiction 1 A/B MAC was "marred by poor performance in the area of Provider Customer Service" during all of the evaluated periods of performance. Id. at 20. Accordingly, the TEP noted this aspect of Palmetto's past performance as a weakness. Id. The TEP also found that certain of Palmetto's CERT scores for its Jurisdiction 1 contract were above the national averages--indicating a higher than average rate of improper payments--and noted this as a weakness in Palmetto's past performance. Id. at 20-21.
The TEP also noted that certain aspects of Palmetto's performance as the Jurisdiction 1 A/B MAC were substandard, as evidenced by the results of the agency's Section 912 evaluation, wherein Palmetto had a higher number of risk findings than the national average. Id. at 21; Tab 9G, Past Performance Findings-Palmetto, at 15. These findings included, among other things, that Palmetto's "[s]ecurity policies . . . did not address/enforce platform security configurations or patch management," that the system security plan was not kept current, and that "[v]ulnerabilities were identified during the network penetration test." AR, Tab 9G, Past Performance Findings-Palmetto, at 15. The TEP concluded that these findings exhibited non-conformance with the contract requirements for information systems security, and posed "substantial risk . . . because protected health information (PHI) and private financial data could be accessed inappropriately, exposing the Government to potential fraud, waste, and abuse." AR, Tab 9A, TEP Report, at 21; see Tab 9G, Past Performance Findings-Palmetto, at 15. The TEP added here that "[b]y continually failing to properly safeguard PHI and other private health information, the Offeror has risked the availability, confidentiality, and security of the Medicare [fee-for-service] systems and data." Id. This aspect of Palmetto's past performance was considered a significant weakness by the TEP. Id.
The TEP also assessed two other aspects of Palmetto's past performance as significant weaknesses. In this regard, the TEP first noted, with regard to Palmetto's performance as the Jurisdiction 1 A/B MAC, that Palmetto had failed to meet the "[Quality Assurance Surveillance Plan] Medical Review metric that assesses the MAC's [medical review strategy] to ensure an appropriate and complete plan is in place to address vulnerabilities and the high CERT error rate in the jurisdiction." AR, Tab 9A, TEP Report, at 22; Tab 9G, Past Performance Findings-Palmetto, at 7. The TEP continued here by pointing out that the performance issues in this regard were sufficiently grievous to warrant a request from CMS senior management for corrective action, and that even after "extensive oversight and direction from CMS beginning in April 2011 . . . that included weekly meetings and multiple on-site follow-up reviews," Palmetto "continued to struggle with developing an effective strategy to correct the problems identified in the [Quality Assurance Surveillance Plan] review." AR, Tab 9A, TEP Report, at 22. Although the TEP added that Palmetto had made some progress in this area, "the MAC program office has determined that weekly monitoring meetings with Palmetto remain necessary and continued intense oversight is still warranted." Id.
The last aspect of Palmetto's past performance assessed by the TEP as a significant weakness pertains to Palmetto's performance as the Jurisdiction 11 A/B MAC. Here, the TEP found that certain aspects of Palmetto's performance were substandard, and that the cause was Palmetto's failure to provide an adequate number of personnel to perform the required services. AR, Tab 9A, TEP Report, at 23; Tab 9G, Past Performance Findings-Palmetto, at 7; see Tab 13A(05), CMS Letter to Palmetto's President (Dec. 5, 2011). Specifically, the record reflects that during the first option year of Palmetto's Jurisdiction 11 A/B MAC contract, the protester, for a number of months, provided less than 50 percent of the agreed-upon full-time equivalent personnel for the functions at issue, and, for a number of other months, provided considerably fewer staff than it had proposed. AR at 25; AR, Tab 9A, TEP Report, at 23; Tab 9G, Past Performance Findings-Palmetto, at 7; Tab 13A, CMS Letter to Palmetto's President (Dec. 5, 2011). The TEP noted, among other things, that CMS had "determined [that] this performance was sufficiently egregious to question whether to exercise the next [Jurisdiction] 11 contract option year." AR, Tab 9A, TEP Report, at 23.
The record shows that the TEP reasonably considered its findings regarding Palmetto's past performance, and concluded that although it had calculated a baseline numeric score of 0.58 for Palmetto's past performance, a lower numeric score of 0.4 was warranted in light of the strengths, weaknesses, and significant weaknesses highlighted above. Id. The TEP specifically noted its concerns with Palmetto's "anemic improvement in Jurisdiction 1 Medical Review program after more than a year of CMS stringent oversight," and Palmetto's failure in Jurisdiction 11 "to hire the appropriate number of personnel to meet the contract requirements." Id. at 24. In her source selection decision, the CO agreed with the TEP's concerns, noting with regard to the results of Palmetto's Section 912 evaluations that Palmetto's performance "demonstrates its difficulty performing portions of the work at an acceptable level, and a failure to establish internal controls and manage its IT systems in a way which clearly maintains the confidentiality, integrity and availability of Medicare systems operations." AR, Tab 20A, Source Selection Decision, at 21. The CO further commented in her source selection decision that Palmetto's provision of less staff than agreed upon for certain of the services to be provided by Palmetto as the Jurisdiction 11 A/B MAC "had a major negative impact on the quality of service in these areas . . . and put the Medicare program at unacceptable levels of risk." Id. at 22.
Although Palmetto clearly disagrees with the agency's evaluation of its past performance, we cannot find this aspect of the agency's evaluation to be unreasonable. The record of the evaluation evidences the agency's clear concerns with Palmetto's past performance in the areas discussed above, and the impact that these significant weaknesses had, and could have, on the Medicare program and its beneficiaries.
CGS contends that CMS's evaluation of its past performance, and assignment of a numeric rating of 0.7 to its proposal under this factor, were unreasonable. CGS argues that the agency's evaluation failed to recognize numerous strengths demonstrated by its past performance record. CGS continues by arguing that the agency's evaluation of its past performance gave undue consideration to the past performance of Palmetto, which was proposed by CGS as its [DELETED] subcontractor [DELETED]. CGS Protest at 31; see AR, Tab 9A, TEP Report, at 38.
The TEP evaluated CGS's past performance as presenting a number of strengths, noting, for example, that CGS has "met 100% of the metrics for medical review in Option Years 2-4," and "100% [of] its Claims Processing metrics for Option Years 2 and 3," with regard to its Durable Medical Equipment (DME) MAC. AR, Tab 9A, TEP Report, at 37. The TEP noted as another strength that CGS had implemented a project to consolidate certain Medicare Summary Notices into one envelope, and that this consolidation had "saved the program approximately $500,000 throughout the year and was subsequently adopted by the [fee-for-service] contractor community." Id. By way of another example, the TEP assigned a strength to CGS's past performance based on "[CGS's] ability to maintain low CERT rates in its legacy Part B workload," and an additional strength for CGS's past performance with respect to its legacy contract for its work as an outgoing contractor. Id. at 37-38. The TEP noted here that CGS worked closely with a number of entities to transition the Tennessee Part B workload to the incoming MAC for Jurisdiction 10, thereby "eliminating disruption to the Tennessee providers." Id. at 38. The TEP continued by commenting here that CGS "was able to decrease its workload inventories to historical lows allowing the incoming MAC an advantage during its implementation," which in the TEP's view demonstrated "[CGS's] willingness to focus on the Agency's goals for the implementation even while losing part of its own workload." Id.; see Tab 9H, Past Performance Findings-CGS, at 8.
The weaknesses and significant weaknesses noted by the TEP with regard to CGS's past performance concerned the Section 912 evaluations for CGS itself and Palmetto, which, as mentioned above, was proposed as CGS's [DELETED] subcontractor. CGS Protest at 31; see AR, Tab 9A, TEP Report, at 38. The TEP noted here that the agency's Section 912 evaluations conducted in connection with CGS's performance of its legacy carrier contract noted more high risk findings than the national average for fiscal year 2010, and a number of high risk findings and medium risk findings for fiscal year 2011. AR, Tab 9A, TEP Report, at 39. The TEP concluded that CGS, as well as any offeror with high risk findings in this area, warranted a weakness because, among other things, the failure "to perform efficiently and protect the confidentiality, integrity and availability of Medicare [fee-for-service] data could result in the need for increased Government oversight and monitoring." Id.
With regard to Palmetto, the record reflects that the TEP carefully considered Palmetto's role and responsibilities as CGS's subcontractor, where Palmetto would be providing the [DELETED] in performing this contract. The TEP assessed this as a significant weakness in CGS's proposal under the past performance factor, given the issues with Palmetto's performance as identified during the Section 912 evaluations discussed earlier in this decision. Id.
The TEP ultimately assigned a numeric score of 0.7 to CGS under the past performance factor, rather than the calculated baseline score of 0.67. Id. at 40. The TEP noted here that despite CGS's "strong past performance on its legacy [c]arrier contract," as well as its performance as the DME MAC for Jurisdiction C as evidenced by its CPAR report, NIH Reports, and award fees earned, the weaknesses revolving around the Section 912 findings for both CGS and its proposed subcontractor, Palmetto, led the TEP to assign a 0.7 numeric rating for this factor. Id.
We find nothing unreasonable in the agency's evaluation here. As indicated above, the agency's evaluation of CGS's past performance was comprehensive and, contrary to CGS's assertions and based upon our review of the record, well documented. With regard to the impact that Palmetto, as CGS's subcontractor, had on CGS's past performance score, we cannot find the agency's actions unreasonable. As explained by the agency, given Palmetto's role as CGS's subcontractor, [DELETED]. AR at 36. As such, Palmetto's ability to secure its IT systems, which is the subject of the Section 912 evaluations, clearly was relevant to the agency's consideration of the past performance of CGS and its proposed subcontractor Palmetto.
Both protesters argue that CMS's evaluation of Noridian's past performance, and assignment of a 0.87 rating under this factor, was unreasonable and evidenced unequal treatment. The protesters point out, for example, that the agency's Section 912 evaluations of Noridian for fiscal years 2010 and 2011 included a number of high risk findings--which in the case of the protesters, led to the assessment of weaknesses and contributed to their lower ratings.
With regard to Noridian's past performance, the TEP noted a number of "operational excellence indicators from past internal Quality Assurance Surveillance Program . . . reviews where [Noridian] clearly exceeded the national norms." AR, Tab 9A, TEP Report, at 30. For example, the agency noted that Noridian had met 100% of the claims processing metrics, and 100% of the Provider Customer Service Plan metrics, in each of the three years reviewed by CMS with regard to Noridian's performance as the Jurisdiction 3 A/B MAC. Id. The TEP added that Noridian, as the Jurisdiction 3 A/B MAC, had performed at a level well above the national average in a number of areas, including the area of Provider Enrollment. Id. at 31.
The TEP further identified a "very high quality" strength based on Noridian's performance on its legacy Fiscal Intermediary and Carrier contracts. Id. at 31. In doing so, the TEP described Noridian's actions in responding to a devastating flood near Noridian's North Dakota headquarters in March 2009, and noted that "[t]hroughout this period, [Noridian] monitored its performance and staffing levels to ensure CMS requirements were met." Id.
While noting that there were no findings in CMS's fiscal year 2009 Section 912 evaluation of Noridian, the TEP assigned a weakness to Noridian under the past performance factor because the fiscal year 2010 Section 912 evaluation of Noridian revealed high and medium risk findings and, as such, Noridian's performance during this period had been only slightly better than the national average. Id. at 31-32. The agency concluded in this regard that the fiscal year 2010 Section 912 evaluation findings "detract from [Noridian's] ability to perform efficiently and protect the confidentiality, integrity and availability of Medicare [fee-for-service] data and could result in the need for increased Government oversight and monitoring." Id. at 32. The record reflects that the TEP, in assigning a final numeric score of 0.87 to Noridian under the past performance factor, noted that while the nature of the Section 912 weaknesses did reduce the final rating to some extent, the strengths cited greatly outweighed the weaknesses. Id. Here, the TEP pointed to, among other things, the "excellence in many of the . . . metrics measured," where Noridian had achieved 100 percent ratings. Id.
The record further reflects that the CO, in comparing Noridian's past performance to that of CGS and Palmetto, considered the findings of the TEP as set forth above, and conducted "an independent review of each Offeror's past performance." AR, Tab 20A, Source Selection Decision, at 58. The CO's source selection decision discusses, in considerable detail, many of the same points cited by the TEP, and certain additional information, such as CGS's performance as the Jurisdiction 15 A/B MAC, as reflected in the CPARs. Id. at 57. The CO ultimately concluded that "[n]either Palmetto nor CGS has achieved the level of successful past performance as [Noridian]," and that Noridian's past performance "provides the SSA with a high level of confidence that it could successfully perform the requirements of [Jurisdiction] E." Id. at 58.
We find nothing unreasonable in the agency's evaluation here. The record demonstrates that the agency reasonably considered Noridian's past performance as demonstrating numerous strengths. The record further reflects that the agency reasonably considered the results of the Section 912 evaluations, and did not engage in unequal treatment with regard to its ultimate consideration of the relative merits of the offerors' past performance. In this regard, and as noted above, the record establishes that the agency was aware of and considered the Section 912 evaluation results for Noridian during its assessment of Noridian's past performance, and reasonably considered that while they represented a weakness, they were offset in part by the numerous strengths evidenced by Noridian's past performance. The record also establishes that, as compared to Noridian, CGS and Palmetto both had a greater number of high and medium risk findings in fiscal years 2010 and 2011 as evaluated under Section 912, and as such, the agency's assessments in this regard were not unreasonable. AR, Tab 9A, TEP Report, at 21, 31-32, 39. In sum, we see nothing unreasonable about the agency's evaluation of the offerors' competing proposals under the past performance factor, and its conclusion, as set forth in the source selection decision, that "[n]either Palmetto nor CGS has achieved the level of successful past performance as [Noridian]." See AR, Tab 20A, Source Selection Decision, at 58.
CGS argues that CMS's evaluation of its proposal and the proposal submitted by Noridian under the implementation factor was unreasonable.
The evaluation of offerors' technical proposals, including the determination of the relative merits of proposals, is primarily a matter within the contracting agency's discretion, since the agency is responsible for defining its needs and the best method of accommodating them. Highmark Medicare Servs., Inc., et al., supra, at 12. In reviewing an agency's evaluation, we will not reevaluate the proposals, but will examine the record of the evaluation to ensure that it was reasonable and consistent with the stated evaluation criteria as well as with procurement law and regulation. Id. A protester's mere disagreement with a procuring agency's judgment is insufficient to establish that the agency acted unreasonably. Id.
The RFP required each offeror to provide "a clear and concise description of its understanding of implementation activities and its ability to resolve any issues that may arise." RFP at 119. The RFP specifically requested that proposals address "[i]ssues, concerns, and key assumptions of the implementation," and "[a]ctions that will be taken to communicate with Medicare providers, practitioners, suppliers, and other affected partners and stakeholders in order to minimize the impact of the implementation." Id.
The TEP assessed a number of strengths in CGS's proposal in evaluating it under the implementation approach factor. The TEP found, for example, that CGS's continued use of [DELETED] and that because of CGS's proposed use of the incumbent contractor, Palmetto, for [DELETED]. AR, Tab 9A, TEP Report, at 34. The TEP also pointed out that CGS's proposed use of [DELETED]. Id.
The TEP also evaluated CGS's proposed approach to implementation as posing a weakness in the area of provider outreach and education. Id. at 34. The TEP found that certain aspects of this area of CGS's implementation plan, as set forth in its proposal and presented during its oral presentation, were vague, in that the plan "did not specify how [CGS] will provide the outreach necessary for such a large provider community." Id. The TEP concluded that, overall, CGS's approach was "solid," and that the strengths associated with CGS's proposed approach to implementation "all benefit the Government substantially and set the stage for a low risk implementation." Id. at 35. The CO, as reflected in her source selection decision, concurred with the TEP's views. AR, Tab 20A, Source Selection Decision, at 41-42.
In arguing that the agency's evaluation of its proposal under the implementation factor was unreasonable, CGS simply repeats many of the features of its approach as set forth in its proposal and its oral presentation, and asserts that the agency's assignment of a numeric rating of 0.8 to CGS's proposal under the implementation factor, rather than the maximum available rating of 0.9, was unreasonable. See CGS Protest at 39-43; CGS Comments at 79-82; AR, Tab 7A, CGS Proposal, Implementation Approach at 1-10; Tab 7G, CGS Oral Presentation, at 64-69, 111-19. Based upon our review of the record, CGS's arguments here evidence nothing more than its disagreement with this aspect of the agency's evaluation, and as such, provide no basis to find the agency's evaluation unreasonable.
CGS next argues that CMS failed to properly consider risks in Noridian's proposal under the implementation factor. CGS contends, for example, that in contrast to its own implementation approach, Noridian's approach includes personnel new to Jurisdiction 1 [DELETED], and proposed to transition from the incumbent contractor's EDI to Noridian's own EDI [DELETED]. CGS also complains that the agency erroneously concluded that Noridian's staffing plan was "well thought out," and unreasonably credited Noridian for including a risk mitigation plan as part of its implementation approach. See Tab 9A, TEP Report, at 26.
As pointed out by the agency, Noridian's proposal and oral presentation included detailed explanations as to how it intends to retain, or recruit, hire, and train, the personnel necessary to perform the contract. Agency Suppl. Report at 27, citing AR, Tab 6aC, Noridian Proposal, § C.1.2 at 1-11; Tab 6F, Noridian Oral Presentation, at 16-32. The agency also points out that Noridian's proposal includes a description of a specific technical innovation to assist in transitioning the EDI, and that contrary to CGS's assertion, Noridian's oral presentation specifically mentions its provision of a risk mitigation plan as a deliverable under the contract. Agency Suppl. Report at 23-24, citing AR, Tab 6aD, Noridian Proposal, at D3. Based upon our review of the record, CGS's arguments here again evidence nothing more than its mere disagreement with this aspect of the agency's evaluation, and as such, we have no basis on which to find the agency's evaluation of Noridian's proposal, and assignment of a numeric score of 0.7 to that proposal under the implementation factor, unreasonable.
Technical Understanding Evaluation
The protesters argue that CMS's evaluation of proposals under the technical understanding factor was unreasonable, with each protester making multiple challenges to the evaluation of its own and Noridian's proposals.
The RFP provided that the agency would consider customer service, financial management, operational excellence, and innovations and technology, in evaluating proposals and determining the overall rating for the technical understanding evaluation factor. RFP at 112. The solicitation advised offerors that in addressing the technical understanding factor their proposals were to include sections addressing personnel (to include key personnel and a staffing plan), innovations, their medical review strategy, and their customer service strategy. RFP at 113-119. The RFP included detailed explanations as to what was meant by each of these areas, as well as the type of information offerors were to include in their proposals.
In challenging the propriety of CMS's evaluation under the technical understanding factor, Palmetto and CGS focus on the agency's evaluation of Noridian's proposal and its conclusion that an innovation proposed by Noridian applicable to the provider enrollment process, termed "RapidApp," constituted a strength.
The TEP noted that Noridian's RapidApp automates the provider enrollment process, which is currently "very much a manual system." AR, Tab 9A, TEP Report, at 28. The TEP found that Noridian proposes to streamline the enrollment process through RapidApp "by interviewing the enrollee via an online application and electronically capturing the information, which was previously recorded on paper forms." Id. The TEP also noted that, through the use of RapidApp, the enrollee would be validated against external sanction and licensure databases, and that RapidApp as proposed "allows for a paperless process through the use of electronic signatures and attachments." Id. Additionally, the TEP cited RapidApp's achievement of an "[DELETED] reduction in the time it takes [Noridian] to process similar paper provider enrollment applications" in the pilot of this system currently being conducted in [Noridian's] Jurisdiction F contract with physician assistants. Id. The TEP concluded that Noridian's proposed use of RapidApp merited a strength, based on the following:
[T]he continued development and roll out of the RapidApp application . . . will foster efficiencies in the administration of the Medicare [fee-for-service] program, serve to enhance customer service by effectively responding to providers' needs and inquiries, and also promote the fiscal integrity of the Medicare [fee-for-service] program and prompt claims payment . . . by processing enrollment applications more quickly and accurately.
In her source selection decision, the CO recognized that while RapidApp "is a very progressive innovation, it has not been in place for a sufficient period of time to demonstrate that it will achieve everything that is proposed." AR, Tab 20A, Source Selection Decision, at 43. The CO also specifically noted in her consideration of Noridian's proposal that subsequent to the completion of the TEP Report, it was learned that CMS may disallow the front-end on-line interview feature of RapidApp, since CMS now has on-line data entry screens available for provider applications. Id. After discussing this matter with the TEP, the CO concluded that despite this new information, RapidApp "remains a strength within [Noridian's] proposal," because the back-end automation features of RapidApp "may be retained and may serve to foster efficiencies in the administration of the Medicare [fee-for-service] program." Id. at 43-44; see CO's Suppl. Statement at 2. As such, the CO continued to consider Noridian's RapidApp as a strength, and cited to it as one of the seven non-cost strengths supporting her determination that Noridian's proposal represented the best value to the agency. AR, Tab 20A, Source Selection Decision, at 60.
The protesters argue at length that the CO's continued consideration of Noridian's RapidApp as a strength was unreasonable, contending that CMS "either would not support or was likely to discontinue support for the [Rapid App] technology," as evidenced by the new information referenced by the CO in her source selection decision. CGS Comments at 49; CGS Suppl. Comments at 24; see Palmetto Comments at 23; Palmetto Suppl. Comments at 12-13. The protesters also argue here that the record lacks support for the CO's conclusion that the back-end of RapidApp may be retained and may be of value. See id.
As an initial matter, we note that, in a number of instances, the protesters mischaracterize the record as establishing that CMS had decided to withdraw its support for Noridian's RapidApp pilot program. Rather, as set forth in the CO's source selection decision, the record provides only that CMS may withdraw such support. AR, Tab 20A, Source Selection Decision, at 43; Tab 19g, CMS emails, at 18; Tab 19h, CMS emails, at 128.
In any event, the record shows that the CO was aware of and considered the available information regarding Noridian's RapidApp innovation, including the above-referenced new information set forth in emails between CMS personnel stating that CMS may withdraw its support of Noridian's RapidApp pilot project, and discussing the effect of that withdrawal of support on the viability of Noridian's RapidApp and its claimed efficiencies. Id. Although the protesters clearly disagree with the CO's conclusion that Noridian's RapidApp remained a strength, they have not shown it to be unreasonable, and their arguments reflect nothing more than their disagreement with the agency's ultimate evaluation.
Palmetto and CGS
The protesters raise a number of other issues challenging the propriety of CMS's evaluation of proposals under the technical understanding factor, arguing that the agency's evaluation of proposals was unreasonable or evidenced unequal treatment. For example, both CGS and Palmetto argue that the agency unreasonably assigned a weakness to their proposals under the technical understanding factor based on the agency's determination that the proposals included certain erroneous assumptions regarding the work to be performed under the audit and reimbursement function. See AR, Tab 9A, TEP Report, at 18, 36; Tab 20A, Source Selection Decision, at 46, 48.
In this regard, Palmetto and CGS both proposed the same subcontractor, First Coast Service Options (FCSO), to perform the audit and reimbursement tasks required by the RFP, and FCSO submitted identical proposals to CMS for both companies. AR at 67 n.42. The agency found that FCSO's proposal (and thus Palmetto and CGS) made a "number of assumptions concerning the workloads involved in Audit and Reimbursement activities that are not reasonable and underestimate the work required." AR, Tab 9D, Technical Understanding Findings-Palmetto, at 7. Specifically, the CMS subject matter expert (SME) for the Audit and Reimbursement function found that FCSO's proposed performance of this function was erroneously predicated on the assumption that [DELETED]. Id. The SME added here that FCSO's proposal was also predicated on its assumption that [DELETED]. Id. The CMS SME also noted that FCSO's proposal was predicated on the assumption that [DELETED], and that this was contrary to CMS experience. Id.
The TEP considered and adopted the findings of the CMS SME, and concluded that a weakness was merited for both Palmetto's and CGS's proposals because FCSO's assumptions "pose the risk that a large portion of the institutional provider community may not be adequately audited resulting in the Agency reimbursing these providers more than they are entitled to receive." AR, Tab 9A, TEP Report, at 18, 36. This weakness was noted by the CO in her source selection decision, with the CO specifically commenting that the incorrect audit and reimbursement assumptions also operate to "qualitatively weaken [Palmetto's and CGS's proposals] in comparison to the [Noridian] proposal." AR, Tab 20A, Source Selection Decision, at 48.
The protesters raise a number of arguments in challenging the reasonableness of the agency's conclusions here. For example, the protesters argue that FCSO's proposal was based upon the workload estimates set forth in the RFP, and point out that the agency did not make any adjustment to FCSO's proposed costs (and thus, those proposed by Palmetto and CGS) based upon the agency's determination that the above-described assumptions were not reasonable. The protesters also assert that the assumptions set forth in FCSO's proposal were based upon FCSO's experience gained from performing these services in Jurisdiction 1 as a subcontractor to Palmetto (the incumbent contractor), with CGS specifically objecting to the agency's determinations on the basis that "[a] review of each of the assumptions that CMS criticized shows that the SME rejected FCSO's assumptions without data to offer any more reasonable or realistic alternative." Palmetto Comments at 21; CGS Comments at 32-35.
As explained by the agency and evidenced by the record, the agency's criticisms of FCSO's proposal here did not involve the number of hours proposed for performance of the work, or any concern that the hours proposed were not based upon the RFP's workload estimates. Rather, the agency's criticism and assignment of a weakness to the protesters' proposals under the technical understanding factor reflected the agency's concerns that the assumptions set forth in FCSO's proposal--which were unique to FCSO and thus based upon FCSO's view of the work to be performed--were inconsistent with the CMS SME's understanding of the agency's requirements regarding the audit and reimbursement function. AR, Tab 9A, TEP Report, at 18, 36. In this regard, although the protesters repeatedly assert that the assumptions set forth in FCSO's proposal should have been accepted by the agency because they were based upon FCSO's experience in performing the audit and reimbursement function under the incumbent contract, the agency notes that FCSO failed to substantively explain this in its proposal, or include any data or other information of any kind in support of its views. Agency Suppl. Report at 28-29. An offeror's technical evaluation is dependent upon the information furnished; there is no legal basis for favoring a firm with presumptions on the basis of its incumbent status. HealthStar VA, PLLC, B-299737, June 22, 2007, 2007 CPD ¶ 114 at 2. It is the offeror's burden to submit an adequately written proposal; an offeror, including an incumbent contractor (or here, subcontractor), must furnish, within its proposal, all information that was requested or necessary to demonstrate its capabilities in response to a solicitation. Id. In sum, although the protesters clearly disagree with the agency's determinations here, they have not shown them to be unreasonable.
CGS also argues that CMS's evaluation under the technical understanding factor was flawed because CGS's "proposal and oral presentation offered the same features proposed by [Noridian], typically with significantly more detail than [Noridian] offered, yet CGS received no credit." CGS Suppl. Comments at 30; see CGS Comments at 27. For example, CGS contends that the agency credited Noridian, but not CGS, for proposing a law enforcement liaison, even though CGS noted in its oral presentation that it will [DELETED]. CGS Comments at 27; see CGS Suppl. Comments at 28-29.
The TEP assigned several strengths to Noridian's proposal under the technical understanding factor based upon Noridian's proposed staffing plan and key personnel. In doing so, the agency noted that Noridian's proposal "goes beyond the requirements in the RFP" through its inclusion of three additional positions as key personnel/managers, including the position of Provider Education Manager and Law Enforcement Liaison. AR, Tab 9A, TEP Report, at 27. The TEP also noted that establishing the Provider Education Manager and Law Enforcement Liaison as a key personnel position "indicates [that Noridian] understands the impact of medical review and provider enrollment on the success of the [Jurisdiction] E contract given California is one of the highest impact fraud states in the country." AR, Tab 9F, Technical Findings-Noridian, at 1; see Tab 9A, TEP Report, at 27. The CO noted Noridian's proposed key personnel position of Provider Education Manager and Law Enforcement Liaison in her source selection decision, stating that this aspect of Noridian's proposal demonstrates that Noridian is "pro-active in its desire and intent to meet the agency['s] needs to reduce the risk of fraud and abuse." AR, Tab 20A, Source Selection Decision, at 42-43.
CMS also notes that Noridian's proposal included the curriculum vitae of the proposed Provider Education Manager and Law Enforcement Liaison, which provides in part that, in that position, the Provider Education Manager and Law Enforcement Liaison "collaborates with the Federal Bureau of Investigation (FBI), Office of Inspector General (OIG), and Assistant United States Attorneys (AUSA) in pursuing actions against suspected fraudulent providers and suppliers." AR, Tab 6A, Noridian Technical Proposal, Technical Understanding, Key Personnel, Provider Education Manager and Law Enforcement Liaison. This section of Noridian's proposal further explains that the Provider Education Manager and Law Enforcement Liaison "provides expert assistance to law enforcement agencies on issues affecting Medicare guidelines, policies, and other claims related issues." Id.
In contrast, CGS merely referred to [DELETED] one time in its oral presentation, and that reference was in the context of its awareness of the uniqueness of Jurisdiction E. AR, Tab 7G, CGS Oral Presentation, at 90. While recognizing that it is a fundamental principle of government procurement that the contracting agency must treat all offerors equally, and in so doing must evaluate proposals evenhandedly against common requirements, we see no evidence of unequal treatment. See Contingency Mgmt. Group, LLC; IAP Worldwide Servs., Inc., B-309752 et al., Oct. 5, 2007, 2008 CPD ¶ 83 at 15. That is, as discussed above, Noridian's proposal established the Provider Education Manager and Law Enforcement Liaison as a key personnel position, and described the position in a detailed manner that was reasonably evaluated by the agency as advantageous. In contrast, CGS's proposal included no such position or description, or any other similar information.
Cost Realism Evaluation
Palmetto and CGS contend that CMS's evaluation of Noridian's and the protesters' cost proposals was unreasonable. Although these arguments span a variety of cost elements, such as productivity rates and indirect rates, the protesters focus on the agency's determinations regarding provider enrollment productivity. The protesters argue that the agency's upward adjustment of $4,052,414 to Noridian's proposed costs for the Medicare Part B provider enrollment function was inadequate, and that the agency's upward adjustment to their respective proposed costs of $2,327,481 for the same function was excessive. AR, Tab 15A, BEP Report-Noridian, at 10; Tab 16A, BEP Report-CGS, at 18; Tab 17A, BEP Report-Palmetto, at 34.
When an agency evaluates a proposal for the award of a cost-reimbursement contract, an offeror's proposed estimated costs are not dispositive because, regardless of the costs proposed, the government is bound to pay the contractor its actual and allowable costs. FAR §§ 15.305(a)(1), 15.404-1(d); Tidewater Constr. Corp., B-278360, Jan. 20, 1998, 98-1 CPD ¶ 103 at 4. Consequently, the agency must perform a cost realism analysis to determine the extent to which an offeror's proposed costs are realistic for the work to be performed. FAR § 15.404-1(d)(1). An agency is not required to conduct an in-depth cost analysis, see FAR § 15.404-1(c), or to verify each and every item in assessing cost realism; rather, the evaluation requires the exercise of informed judgment by the contracting agency. Cascade Gen., Inc., B-283872, Jan. 18, 2000, 2000 CPD ¶ 14 at 8. Further, an agency's cost realism analysis need not achieve scientific certainty; rather, the methodology employed must be reasonably adequate and provide some measure of confidence that the rates proposed are reasonable and realistic in view of other cost information reasonably available to the agency as of the time of its evaluation. See SGT, Inc., B-294722.4, July 28, 2005, 2005 CPD ¶ 151 at 7; Metro Mach. Corp., B-295744, B-295744.2, Apr. 21, 2005, 2005 CPD ¶ 112 at 10-11. Because the contracting agency is in the best position to make this determination, we review an agency's judgment in this area only to see that the agency's cost realism evaluation was reasonably based and not arbitrary. Hanford Envtl. Health Found., B-292858.2, B-292858.5, Apr. 7, 2004, 2004 CPD ¶ 164 at 8.
By way of background, the record reflects that the agency, in preparing its cost realism analysis, received and considered audit reports from the Defense Contract Audit Agency, and prepared separate, lengthy, and detailed Business Evaluation Reports regarding the proposals submitted by Noridian, CGS, and Palmetto. AR, Tab 15A, BEP Report-Noridian; Tab 16A, BEP Report-CGS; Tab 17A, BEP Report-Palmetto. The record further reflects that the agency reviewed, among other things, the offerors' proposed labor rates and escalation factors and indirect rates, including their "pool and base composition for reasonableness, realism, and acceptability," as well as the offerors' proposed other direct costs "for allowability, realism and supportability." Id. at 2. The agency also considered the offerors' proposed labor mix in relation to the technical requirements, the proposed productive hours and "whether they are reasonable, realistic and computed correctly," and compared the "proposed/historical productivity [to] ensure that differences are explained." Id. at 2-3. The agency also considered, where available, historic data specific to Noridian, CGS, and Palmetto regarding the above elements, and compared the historic data for each offeror to that offeror's proposed labor mix, level of effort, productivity, and/or unit cost for a number of the functions set forth in the statement of work, including the provider enrollment function. Id. at 3.
As discussed previously with regard to the technical factor evaluation, Noridian proposed to automate the provider enrollment process through its RapidApp process. AR, Tab 6A, Noridian Technical Proposal, at C.2-5. Noridian explained in its proposal that RapidApp "streamlines Medicare provider enrollment by interviewing the enrollee and capturing the information that was previously recorded on multiple, complex paper forms." Id. Noridian's proposal further stated that RapidApp "automates many aspects of the [provider enrollment and validation] process, such as validation against sanction and licensure databases," and claimed in its proposal that RapidApp "reduces the time it takes us to process these [provider enrollment] applications from an average of 90 minutes down to 20 minutes." Id.
In evaluating Noridian's proposed costs, the agency noted that Noridian's proposed level of effort for the provider enrollment function was based upon its RapidApp innovation. Noridian claimed that use of RapidApp would create efficiencies in the processing of provider enrollment applications, and would result in an increase in its processing rate from Noridian's historical rate of 5 provider enrollment applications per day to [DELETED] provider enrollment applications per day, with the same level of effort. AR, Tab 15C, Technical Cost Findings-Noridian, at 8. The agency noted that because of certain uncertainties regarding RapidApp as discussed previously in this decision, including the "limited application of RapidApp at the time of [the] evaluation" and "when RapidApp may be deployed in [Jurisdiction E] and to what extent," it was "unable to estimate a level of productivity savings going forward." Id. at 8-9. The agency thus rejected the majority of Noridian's claimed savings here.
While the agency recognized Noridian's RapidApp as a strength, it nonetheless found the productivity gains that Noridian proposed to realize from RapidApp to be overstated. AR, Tab 15A, BEP Report-Noridian, at 10. The agency considered the options for making an adjustment to Noridian's proposed costs, given its inability to estimate a level of productivity savings going forward, and determined that the preferred method was to adjust Noridian's costs upwards based upon an application of the "most recent baseline [provider enrollment] productivity numbers as calculated by CMS using its internal productivity measures." AR, Tab 15A, BEP Report-Noridian, at 10; Tab 15C, Technical Cost Findings-Noridian, at 8. When applied to Noridian's proposed costs, this calculation, which yielded a baseline of 12 applications per day (rather than [DELETED] per day as proposed by Noridian), resulted in the aforementioned $4,052,414 upward adjustment to Noridian's proposed costs. AR, Tab 15A, BEP Report-Noridian, at 10; Tab 15C, Technical Cost Findings-Noridian, at 8.
The protesters argue that the agency should have rejected Noridian's assumption that the RapidApp innovation would have resulted in increased productivity, and therefore adjusted Noridian's costs upwards based upon the application of Noridian's historic provider enrollment productivity rate of five applications per day. The protesters further argue that the agency's adjustment of Noridian's proposed costs upwards based upon the application of the agency's calculated national average of 12 applications per day lacked a reasonable basis. Palmetto Comments at 35; see CGS Comments at 19-20; Palmetto Suppl. Comments at 4; CGS Suppl. Comments at 14-17.
As pointed out by the agency in response to the protests, and as explained previously in this decision with regard to the technical factor, the agency did not reject Noridian's RapidApp innovation. Nor does the record reflect that the agency rejected, in their entirety, the proposed productivity gains that Noridian claimed had been achieved and would be achieved as the result of its RapidApp innovation. Rather, as explained above, the record reflects that the agency reasonably found that while Noridian's RapidApp was a technical strength and may result in certain efficiencies, Noridian's claimed efficiencies were overstated, and a partial adjustment to Noridian's proposed costs was thus appropriate. While the protesters clearly disagree, and believe that the agency should have adjusted Noridian's proposed costs based upon Noridian's historic provider enrollment productivity rate, and without consideration of Noridian's RapidApp innovation, we cannot find unreasonable the agency's conclusion that a partial adjustment, based upon the agency's calculated average of 12 applications per day, was appropriate.
Palmetto and CGS
Both Palmetto and CGS proposed FCSO as their subcontractor to perform the Medicare Part B provider enrollment tasks required by the RFP. The record reflects that FCSO's proposed level of effort, and thus, in part, its proposed costs for the provider enrollment function, was based on a productivity rate of [DELETED] provider enrollment applications per day. AR, Tab 16C, Technical Cost Findings-CGS, at 7; Tab 17C, Technical Cost Findings-Palmetto, at 14. FCSO's proposal provided a narrative explaining the basis for its proposed provider enrollment productivity rates, which included descriptions of certain process improvements that it claimed would improve efficiency and increase productivity. AR, Tab 7E, FCSO Subcontract-CGS Vol. II, Tab F.1, Business Proposal Assumptions, at 3-4; Tab 8E, FCSO Subcontract-CGS Vol. II, Tab F.1, Business Proposal Assumptions, at 3-4. FCSO added elsewhere in its proposal that its proposed provider enrollment productivity rates were based upon its experience in performing the provider enrollment function in Jurisdiction 1, "adjusted for enhancements to the provider enrollment process as outlined in the RFP and FCSO initiated process improvements." Id. at 11. FCSO's proposal did not, however, include any historical data as to its provider enrollment productivity rates.
The agency considered FCSO's proposed process improvements, as well as FCSO's proposed provider enrollment productivity rates, and found that there were "no innovations or other technical proposals to support the Offeror's proposed productivity." AR, Tab 16C, Technical Cost Findings-CGS, at 7; Tab 17C, Technical Cost Findings-Palmetto, at 14. The agency also noted here that while in most cases it uses "the Offeror's historic workloads and level of effort data derived from various CMS systems and report . . . to arrive at historic productivity calculations," it was unable to do so with regard to provider enrollment productivity rates, given the manner in which that data is reported to CMS. Id. As such, the record reflects that the agency compared FCSO's proposed provider enrollment productivity rate of [DELETED] applications per day to the agency's "most recent baseline [provider enrollment] productivity numbers as calculated by CMS using its internal productivity measures"--as the agency had also done when considering Noridian's proposal. Id.; see AR, Tab 15A, BEP Report-Noridian, at 10; Tab 15C, Technical Cost Findings-Noridian, at 8. When applied to FCSO's proposed costs, this calculation, which yielded a baseline of 12 applications per day (rather than the [DELETED] per day proposed by FCSO), resulted in the aforementioned $2,327,481 upward adjustment to FCSO's proposed costs (and thus to those of Palmetto and CGS). Tab 16A, BEP Report-CGS, at 18; Tab 17A, BEP Report-Palmetto, at 34.
In challenging this aspect of the agency's evaluation, the protesters argue that the agency had no reasonable basis to reject FCSO's proposed provider enrollment productivity rate, as it was based upon FCSO's historical experience. Palmetto Comments at 30-31; see CGS Comments at 9; Palmetto Suppl. Comments at 11; CGS Suppl. Comments at 9-12. We disagree. As noted above, FCSO did not provide in its proposal (and the protesters have not provided during the course of this protest) the productivity rates on which its proposed rate was purportedly based, and thus failed to provide any verifiable support for its claim that its proposed provider enrollment productivity rate was based upon its historical experience. We also find without merit the protesters' assertion that the agency's evaluation here was "mechanical." CGS Comments at 12. Specifically, the record reflects that the agency considered the information in FCSO's proposal, as well as the lack of information regarding FCSO's historic provider enrollment productivity rates, and found no basis to conclude that FCSO would achieve a provider enrollment processing rate that exceeded the agency's calculated provider enrollment productivity baseline by [DELETED] percent. See CGS Comments at 12. In sum, although the protesters clearly disagree with the agency's evaluation of FCSO's proposal and its upward adjustment to FCSO's costs, we cannot find the agency's actions here to be unreasonable.
In sum, we find CMS's evaluation of the offerors' proposals, and its conclusion that Noridian's proposal was superior under the non-cost factors to the proposals submitted by Palmetto and CGS, to be reasonable and consistent with the RFP's evaluation scheme. We also find reasonable the agency's evaluation of the offerors' proposed costs, including the probable cost adjustments. Finally, we find reasonable the agency's conclusion that Noridian's proposal represented the best value to the government, particularly given the CO's determination that Noridian's proposal would remain the best value even if Noridian's proposal were considered on the basis of its probable costs and the proposals of Palmetto and CGS were considered on the basis of their proposed costs. See AR, Tab 20A, Source Selection Decision, at 60.
The protests are denied.
Susan A. Poling
 Pursuant to the Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA), 42 U.S.C. §§ 1395kk et seq. (2006), MACs perform the claims services that were previously performed by legacy contractors, acting as fiscal intermediaries or carriers, under Title XVIII of the Social Security Act, 42 U.S.C. §§ 1395c et seq. and 1395j et seq. (2000). Prior to the enactment of the MMA, fiscal intermediaries were generally responsible for processing claims from institutional providers, such as hospitals and nursing facilities, under Part A of the Medicare program; carriers were responsible for processing claims from professional providers, such as physicians and diagnostic laboratories, under Part B of the Medicare program. The MMA required the phase-out of the legacy contracting method, which did not require that contracts for fiscal intermediary or carrier services be competitively awarded, and imposed competition requirements and the use of Federal Acquisition Regulation (FAR)-based contracting, with the intent of improving Medicare's administrative services to beneficiaries and providers by bringing to Medicare the standard contracting principles that have long applied to other federal programs operating under the FAR. Highmark Medicare Servs., Inc., et al., B-401062.5 et al., Oct. 29, 2010, 2010 CPD ¶ 285 at 3 n.2.
 In the first phase of its Medicare modernization program, CMS divided the United States into 15 separate jurisdictions (Jurisdictions 1 through 15) for the purposes of acquiring and providing MAC services. The agency has since combined certain of the 15 jurisdictions, resulting in 10 separate jurisdictions (Jurisdictions E through N). RFP at 141; see TrailBlazer Health Enterprises, LLC, B-406175, B-406175.2, Mar. 1, 2012, 2012 CPD ¶ 78 at 3 n.2. Palmetto is the incumbent MAC services contractor for Jurisdiction 1, which is now Jurisdiction E.
 The remaining adjectival CPAR ratings and the numeric values assigned were as follows: very good = 0.8; marginal = 0.3; unsatisfactory = 0.2. AR at 4; Tab 9A, TEP Report, at 8.
 The remaining adjectival NIH Report ratings and the numeric values assigned were as follows: excellent = 0.8; fair = 0.3; poor or unsatisfactory = 0.2. AR at 5; Tab 9A, TEP Report, at 9-10.
 The remaining award fee earned percentages and numeric ratings assigned were as follows: 76 percent to 90 percent = 0.7; 51 percent to 75 percent = 0.6; 0 percent = 0.4. AR at 4; Tab 9A, TEP Report, at 8.
 Section 912 of the MMA implemented requirements for annual evaluation, testing, and reporting on security programs at both MAC contractors and existing carrier and intermediary business partners (to include their respective data centers). SOW at 47. As explained by the agency, "at its heart, Section 912 reviews evaluate the ability of the contractor to maintain the security of their Information Technology (IT) based systems." AR at 36. More specifically, Section 912 reviews check such things as "whether you apply vendor-supplied security patches; whether passwords and computer access [are] properly managed; whether IT access is promptly removed from terminated employees; [and] whether the security procedures are well documented." AR, Tab 2C, Statement of TEP Member, at 1.
 As mentioned previously, ratings from 0.1 to 0.4 correlated to a determination that the proposal was considered more likely to fail than succeed. AR, Tab 9A, TEP Report, at 6-7.
 The section of the CO's source selection decision detailing the offerors' past performance does not refer at any time to either the baseline numeric scores calculated by the TEP, or the final numeric scores assigned to the offerors by the TEP under the past performance factor.
 Palmetto, whose proposal was assigned a numeric score of 0.9 (the highest score available) under the implementation factor, does not raise any challenge to the agency's evaluation of proposals under this factor.
 As explained by the CO, the front end of RapidApp provides for the interview of the provider online by using question-and-answer input screens and electronically capturing the application data provided. CO's Suppl. Statement at 1. The back end of the RapidApp process takes this enrollee information and validates it against multiple external databases, such as licensure and sanction databases, thereby automating many of the required verifications that had previously been done manually. Id. It also allows for automatic letter generation. Id.
 In this regard, the protesters' submissions include the assertions that CMS "intended to cancel the [RapidApp] pilot," CGS Comments at 49; CGS Suppl. Comments at 22; that CMS had "rejected" the proposed RapidApp innovation, CGS Comments at 49; that CMS "will cancel the RapidApp pilot," CGS Comments at 49; that CMS "would not support further RapidApp development," CGS Suppl. Comments at 22; or that CMS would likely "eradicate" the RapidApp pilot program. Palmetto Suppl. Comments at 12.
 FCSO performed the same services on the incumbent contract.
 The agency explains that [d]ue to a clerical oversight in the process, the FCSO specific findings for Technical Understanding were included in the Palmetto [technical understanding] Findings file, but not the CGS Findings file. AR at 67 n.42. The agency explains, and the record reflects, that since FCSO was proposed by both Palmetto and CGS as their subcontractor for the audit and reimbursement function, and since FCSO submitted identical proposals for Palmetto and CGS, the evaluation of [FCSO] was identical between both Palmetto and CGS, and the findings documented within the Palmetto file are applicable to CGS. Id.
 ZPICs perform program integrity functions for Medicare Parts A, B, C, and D; Durable Medical Equipment, Prosthetics, and Orthotics Supplier; Home Health and Hospice; and the Medicare-Medicaid Data Matching Program (a partnership between Medicaid and Medicare designed to enhance collaboration between the two programs to reduce fraud, waste, and abuse). See TriCenturion, Inc.; SafeGuard Services, LLC, B-406032 et al., Jan. 25, 2012, 2012 CPD ¶ 52 at 2.
 The CO cited Noridian's proposed Key Personnel/Managers as one of the seven non-cost strengths supporting her determination that Noridian's proposal represented the best value to the agency. AR, Tab 20A, Source Selection Decision, at 60.
 As indicated above, Palmetto and CGS have made numerous other contentions concerning the evaluation of proposals under both the non-cost and cost factors, and the agency's source selection decision. Although these arguments are not all specifically addressed in this decision, each was considered, including the protesters' general and specific assertions that the agency's conduct reflected unequal treatment in a number of ways, and found either to be insignificant in view of our other findings, or to be without merit based upon the record as a whole.