
Data Systems Analysts, Inc.; Peerless Technologies Corporation; NCI Information Systems, Inc.

B-413028.6, B-413028.7, B-413028.8, B-413028.10, B-413028.11, B-413028.12  Feb 21, 2018

Highlights

Data Systems Analysts, Inc. (DSA), of Trevose, Pennsylvania; Peerless Technologies Corporation, of Fairborn, Ohio; and NCI Information Systems, Inc., of Reston, Virginia, protest the failure of the Defense Information Systems Agency (DISA), Department of Defense (DOD), to award "Encore III" contracts to the protesters under request for proposals (RFP) No. HC1028-15-R-0030 for information technology services supporting DOD and other federal agencies. The protesters challenge the agency's technical and cost/price evaluations and source selection decision.

We deny the protests.


DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  Data Systems Analysts, Inc.; Peerless Technologies Corporation; NCI Information Systems, Inc.

File:  B-413028.6; B-413028.7; B-413028.8; B-413028.10; B-413028.11; B-413028.12

Date:  February 21, 2018

David S. Cohen, Esq., John J. O’Brien, Esq., Laurel A. Hockey, Esq., and Daniel Strouse, Esq., Cohen Mohr LLP, for Data Systems Analysts, Inc.; Barbara A. Duncombe, Esq., Suzanne Sumner, Esq., and Erin R. Davis, Esq., Taft Stettinius & Hollister, LLP, for Peerless Technologies Corporation; and Daniel P. Graham, Esq., Elizabeth Krabill McIntyre, Esq., and Ryan D. Stalnaker, Esq., Vinson & Elkins LLP, for NCI Information Systems, Inc., the protesters.
Paul A. Debolt, Esq., James Y. Boland, Esq., Jengeih S. Tamba, Esq., and Emily A. Unnasch, Esq., Venable LLP, for Next Tier Concepts, Inc.; Jonathan D. Shaffer, Esq., John S. Pachter, Esq., Mary Pat Buckenmeyer, Esq., and Todd M. Garland, Esq., Smith Pachter McWhorter PLC, for Solers, Inc.; J. Alex Ward, Esq., Sandeep N. Nandivada, Esq., and R. Locke Bell, Esq., Morrison & Foerster LLP, for IAP C4ISR, LLC; Alexander B. Ginsberg, Esq., and Meghan D. Doherty, Esq., Pillsbury Winthrop Shaw Pittman LLP, for IndraSoft, Inc.; Mark D. Colley, Esq., Stuart W. Turner, Esq., Emma K. Dinan, Esq., and Alexandra L. Barbee-Garrett, Esq., Arnold & Porter Kaye Scholer LLP, for CSRA, Inc.; and Karen R. Harbaugh, Esq., and John R. Sharp, Esq., Squire Patton Boggs (US) LLP, for NetCentrics Corporation, the intervenors.
Colleen A. Eagan, Esq., Mark B. Grebel, Esq., and Daniel C. McIntosh, Esq., Defense Information Systems Agency, for the agency.
Heather Weiner, Esq., and Jennifer D. Westfall-McGrail, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1.  Protests challenging agency’s cost realism evaluation are denied where the agency’s evaluation was reasonable and supported by the record.

2.  Protests challenging the agency’s cost/price evaluations are denied where protesters cannot demonstrate that they were prejudiced by the agency’s actions.

3.  Protest challenging agency’s price realism evaluation is denied where the agency’s realism analysis reasonably assessed whether proposed prices presented a performance risk or demonstrated a lack of understanding of the RFP’s requirements.

4.  Protest challenging the technical acceptability of two awardees is denied where the agency reasonably evaluated the awardees’ proposals in accordance with the solicitation’s terms.  

DECISION

Data Systems Analysts, Inc. (DSA), of Trevose, Pennsylvania; Peerless Technologies Corporation, of Fairborn, Ohio; and NCI Information Systems, Inc., of Reston, Virginia, protest the failure of the Defense Information Systems Agency (DISA), Department of Defense (DOD), to award “Encore III” contracts to the protesters under request for proposals (RFP) No. HC1028-15-R-0030 for information technology services supporting DOD and other federal agencies.  The protesters challenge the agency’s technical and cost/price evaluations and source selection decision.

We deny the protests.

BACKGROUND

On March 2, 2016, the agency issued the solicitation, referred to as “Encore III,”[1] which anticipated the award of two separate suites of multiple indefinite-delivery, indefinite‑quantity (IDIQ) contracts, one resulting from full-and-open competition and the other set aside for small business concerns.[2]  For the full-and-open suite, which is the subject of this protest, the government intended to award up to 20 contracts, each with a 5-year base period and one 5-year option period.  RFP at 121.

The RFP provided for award using a lowest-priced, technically acceptable source selection process, considering the following evaluation factors:  technical/management approach, past performance, and cost/price.[3]  Id. at 144-49. 

As relevant here, the technical/management approach factor included the evaluation of seven subfactors.[4]  Id. at 145.  The solicitation required that offerors address their “proposed historical approach, as applicable, to meeting or exceeding the standard for an ‘acceptable proposal’ of each technical/management subfactor.”  Id. at 129.

In addition, under the cost/price factor, offerors were required to propose fixed-price and cost reimbursement labor rates for both government and contractor sites, for all 116 labor categories listed in the solicitation.  Id. at 132.  For both the fixed-price and the cost reimbursement labor rates, offerors were to include direct and indirect rate burdens, and detailed labor rate build-up information, including all formulas and methodology.  Id.  The solicitation provided that the agency would “calculate a Total Proposed Price [TPP] for each offeror by applying Government estimated labor hours for each year of contract performance to each offeror’s proposed fully burdened FP [fixed price] and CR [cost reimbursement] labor rates for each labor category at both site locations.”  Id. at 148.
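
The mechanics of this calculation are straightforward.  The sketch below, offered purely for illustration, walks through the TPP arithmetic as the RFP describes it; every labor category, hour quantity, and rate in it is an invented placeholder (the government's actual estimated hours were not disclosed to offerors), and hours are assumed constant across years for simplicity.

```python
# Illustrative sketch of the RFP's Total Proposed Price (TPP) arithmetic.
# All labor categories, hours, and rates are hypothetical placeholders.

# Government-estimated hours per (labor category, site), by contract type.
fp_hours = {("Systems Engineer", "Gov Site"): 1880,
            ("Systems Engineer", "Ktr Site"): 940}
cr_hours = {("Systems Engineer", "Gov Site"): 1000,
            ("Systems Engineer", "Ktr Site"): 500}

# One offeror's fully burdened labor rates for the same categories/sites.
fp_rates = {("Systems Engineer", "Gov Site"): 95.00,
            ("Systems Engineer", "Ktr Site"): 110.00}
cr_rates = {("Systems Engineer", "Gov Site"): 88.00,
            ("Systems Engineer", "Ktr Site"): 102.00}

def yearly_price(hours, rates):
    """Apply estimated hours to proposed rates; sum across categories and sites."""
    return sum(h * rates[key] for key, h in hours.items())

# TPP sums the fixed-price and cost-reimbursement portions over each year
# of performance (10 years here: 5-year base plus one 5-year option).
tpp = sum(yearly_price(fp_hours, fp_rates) + yearly_price(cr_hours, cr_rates)
          for _ in range(10))
print(f"TPP: ${tpp:,.2f}")
```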

With regard to the cost evaluation, the RFP provided that the agency would “perform a cost realism analysis on the proposed [cost reimbursement] labor rates in accordance with [Federal Acquisition Regulation] FAR 15.404-1(d).”  Id. at 149.  Specifically, the RFP provided that the agency would conduct the following standard deviation analysis to determine cost realism:

The cost/price team will develop an average for each CR [cost reimbursement] labor rate utilizing the proposed CR rates on the “CR Labor Rate Table” tab from ALL complete proposals within each suite (Full and Open and Small Business).  The team will then calculate the standard deviation of the average for each CR labor rate. . . .  The Government considers a rate that is 1 standard deviation below the average to be a realistic rate, subject to cost analysis techniques in accordance with FAR 15.404.  The initial calculations for Average and Standard Deviation will be utilized for the entirety of the evaluation and will not be recalculated if a competitive range is set.

Id.

In addition, the solicitation provided that “[i]f an offeror’s proposed CR labor rate is more than 1 standard deviation below the average for that labor rate, the Cost/Price Team will review the submitted supporting documentation at the component level for that rate.”  Id. The solicitation explained that, “[i]f it is determined that the supporting documentation supports the realism of the proposed rate, no adjustment will be made to the offeror’s rate.”  Id.  However, the solicitation further provided that, “[i]f inadequate or no justification is provided by the offeror for any component of that rate[,] . . . the Government will adjust the fully burdened CR Labor rate to be equal to the average for purposes of calculating the Most Probable Cost for that offeror.”  Id.

Based on the standard deviation cost realism analysis, the RFP provided that the agency would “calculate a total [m]ost [p]robable [c]ost (MPC) for the CR [cost reimbursement] only portion of the proposal for each offeror by applying Government estimated labor hours for each year of contract performance to each offeror’s most probable cost labor rates for each labor category at both Government and contractor sites.”  Id.
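
Read together, the screening and adjustment steps above amount to a simple procedure.  The following is a minimal sketch of that procedure using invented rates; the real proposal data are sealed, and the use of the sample standard deviation here is an assumption, since the RFP language quoted above does not specify a formula.

```python
import statistics

# Hypothetical CR rates for one labor category, one per complete proposal
# in the suite (invented numbers for illustration only).
proposed = {"A": 92.0, "B": 88.0, "C": 61.0, "D": 95.0, "E": 90.0, "F": 58.0}

avg = statistics.mean(proposed.values())
sd = statistics.stdev(proposed.values())  # sample SD; an assumption here
threshold = avg - sd

# Rates more than 1 standard deviation below the average trigger a
# component-level review of the offeror's supporting documentation.
flagged = [o for o, r in proposed.items() if r < threshold]

def mpc_rate(offeror, rate, documentation_supports_rate):
    """A flagged rate with inadequate justification is raised to the average
    for the most probable cost calculation; a supported rate is unchanged."""
    if offeror in flagged and not documentation_supports_rate:
        return avg
    return rate

print(f"avg={avg:.2f}, sd={sd:.2f}, flagged={flagged}")
print(f"F's MPC rate if unsupported: {mpc_rate('F', proposed['F'], False):.2f}")
```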

With regard to price analysis, as relevant here, the RFP provided that the government “reserve[d] the right, but [was] not obligated, to conduct a realism analysis of the offeror’s proposed price.”  RFP at 150.

The solicitation provided that the agency would then calculate a total evaluated price (TEP) by adding the TPP for the fixed price portion of the proposal to the MPC for the cost reimbursement portion of the proposal.  Id.  The RFP explained that the cost/price team would then “organize the proposals by their TEP price from lowest to highest for each suite,” and “[u]p to 20 (30 if a competitive range is established) of the lowest evaluated priced proposals for each suite will next be evaluated by the contracting officer [ ] for compliance with other terms and conditions of the RFP.”  Id. at 140.  After the contracting officer’s compliance review, the lowest evaluated priced proposals remaining in each suite would be evaluated under the non-cost/price factors.  Id.
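
In sketch form, with invented totals for three notional offerors, the TEP ranking step reduces to the following.

```python
# Hypothetical (FP TPP, CR MPC) totals for three notional offerors; the
# TEP is simply their sum, and proposals are ordered lowest to highest.
offers = {"A": (9.8e6, 9.1e6), "B": (9.5e6, 9.9e6), "C": (10.2e6, 8.6e6)}

tep = {name: fp_tpp + cr_mpc for name, (fp_tpp, cr_mpc) in offers.items()}
ranked = sorted(tep, key=tep.get)

CUT = 20  # 30 if a competitive range is established
advancing = ranked[:CUT]
print(advancing)  # ['C', 'A', 'B'] -- C's $18.8M TEP is lowest here
```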

The agency calculated the TEPs and established a competitive range of the 30 proposals with the lowest total evaluated prices.  DSA Combined Contracting Officer Statement & Memorandum of Law (COS/MOL) at 24.  The agency conducted several rounds of discussions and received and evaluated final proposal revisions.  Id.  The protesters’ proposals were ranked 22nd (Peerless), 23rd (NCI), and 24th (DSA) based on evaluated price.  Agency Report (AR),[5] Tab 12A1, DSA Debriefing, at 2; 12B1, NCI Debriefing, at 2; 12C1, Peerless Debriefing, at 2.  On November 2, 2017, the agency notified the protesters that their proposals had not been selected for award, and also provided the unsuccessful offerors with debriefing letters.  AR, Tab 12A1, DSA Debriefing, at 1; 12B1, NCI Debriefing, at 1; 12C1, Peerless Debriefing, at 1; Tab 11A, DSA Award Notification, at 1; 11B, NCI Award Notification, at 1; 11C, Peerless Award Notification, at 1.  These protests followed.

DISCUSSION

The protesters raise a number of challenges to the agency’s evaluation of proposals.  The main challenge, asserted by all three protesters, is that the agency’s cost/price evaluation was flawed.  For example, DSA and Peerless contend that the agency’s cost realism analysis was unreasonable because the agency accepted the offerors’ proposed costs without conducting a meaningful analysis of the supporting documentation.[6]  DSA also argues that the agency failed to conduct a proper price realism evaluation.  NCI, on the other hand, challenges the agency’s labor hour estimate, arguing that it was arbitrary, bore no rational relationship to the agency’s needs, and failed to meaningfully demonstrate the relative cost/price of the offers.[7]  Finally, DSA also challenges the technical acceptability of two of the awardees.  We have considered all of the protesters’ arguments, and although we only address the primary ones, we find that none provides a basis to sustain the protest.[8]

Cost Realism

DSA and Peerless first challenge the agency’s cost realism evaluation of the awardees’ proposals, primarily arguing that the agency accepted the awardees’ proposed cost reimbursement labor rates without conducting a meaningful analysis of the documentation submitted to justify the costs.  For the reasons discussed below, we find no basis to sustain the protests.

When an agency evaluates a proposal for the award of a cost-reimbursement contract, an offeror’s proposed costs are not dispositive because, regardless of the costs proposed, the government is bound to pay the contractor its actual and allowable costs. FAR §§ 15.305(a)(1), 15.404-1(d); CSI, Inc.; Visual Awareness Techs. & Consulting, Inc., B-407332.5 et al., Jan. 12, 2015, 2015 CPD ¶ 35 at 5-6.  Consequently, the agency must perform a cost realism analysis to determine the extent to which an offeror’s proposed costs are realistic for the work to be performed.  FAR § 15.404‑1(d)(1).  An agency is not required to conduct an in-depth cost analysis, or to verify each and every item in assessing cost realism; rather, the evaluation requires the exercise of informed judgment by the contracting agency.  Cascade Gen., Inc., B‑283872, Jan. 18, 2000, 2000 CPD ¶ 14 at 8; see FAR § 15.404-1(c).  The methodology employed must be reasonably adequate and provide some measure of confidence that the rates proposed are reasonable and realistic in view of other cost information reasonably available to the agency as of the time of its evaluation.  SGT, Inc., B‑294722.4, July 28, 2005, 2005 CPD ¶ 151 at 7.  Our review of an agency’s cost realism evaluation is limited to determining whether the cost analysis is reasonable; a protester’s disagreement with the agency’s judgment, without more, does not provide a basis to sustain the protest.  Imagine One Tech. & Mgmt., Ltd., B‑412860.4, B‑412860.5, Dec. 9, 2016, 2016 CPD ¶ 360 at 14-15.

The record reflects that, as specified in the RFP, the Cost/Price Evaluation Team (CPET) first developed an average for each cost reimbursement labor rate using the proposed cost reimbursement rates on the “CR Labor Rate Table” tab from all complete proposals within the full and open suite.  AR, Tab 9T, Peerless, Final Cost/Price Eval. Report, at 4; 9I, DSA, Final Cost/Price Eval. Report, at 4.  The CPET then calculated the standard deviation from the average for each cost reimbursement labor rate.  Id.  As also provided in the RFP, the CPET considered a rate that was one standard deviation below the average to be a realistic rate.  Id.  For any rates that were more than one standard deviation below the average for a particular labor category, the CPET reviewed the offeror’s submitted supporting cost information at the component level for that rate.  After reviewing the supporting documentation provided by the various offerors, the agency adjusted the proposed costs of only one offeror, NetCentrics, and only for five labor categories.  AR, Tab 9R, NetCentrics Cost/Price Eval. Report, at 6; Tab 10, Source Selection Decision Document (SSDD) (showing total proposed prices and total evaluated prices of all offerors in the competitive range).[9]

DSA and Peerless raise numerous challenges to the agency’s cost realism analysis of the awardees’ proposals.  For example, Peerless asserts that it was contradictory for the agency to conclude that NetCentrics’ proposed rates were unrealistic for five labor categories, while finding other offerors’ even lower rates realistic for the same five labor categories.[10]  As relevant here, the evaluators concluded that NetCentrics’ proposed rates for the five labor categories at issue were unrealistically low based on the agency’s determination that the awardee had not provided adequate documentation for those five rates in its proposal.  AR, Tab 9R, NetCentrics, Final Cost/Price Eval. Report, at 5.  In contrast, the record shows that the agency determined that other offerors, which proposed lower rates for those same five labor categories, had sufficiently supported their rates.  See, e.g., AR, Tab 9A, 22nd Century, Cost/Price Eval. Report, at 4-5; 9B, AASKI Cost/Price Eval. Report, at 4‑5; 9C, Ace Info Cost/Price Eval. Report, at 4-5; 9F, Booz Allen Cost/Price Eval. Report, at 4-5; 9G, CACI Cost/Price Eval. Report, at 4-5; 9V, QBase Cost/Price Eval. Report, at 4-5; and 9W, Solers Cost/Price Eval. Report, at 4-5.  We find nothing unreasonable regarding the agency’s evaluation in this regard.

Peerless also contends that the agency failed to conduct a meaningful evaluation of the rates proposed by the subcontractor of another awardee (Booz Allen), and instead, merely relied on Booz Allen’s assurance that it was willing to accept the risk of contract performance with the subcontractor’s rates.  Contrary to the protester’s assertion, however, the record reflects that the agency independently reviewed the realism of the subcontractor’s rates, and concluded that the subcontractor “provided adequate supporting cost information to support all 101 subcontractor labor rates proposed.”[11]  AR, Tab 9F, Booz Allen, Final Cost/Price Eval. Report, at 5-6.  Accordingly, we also find no basis to object to the agency’s determination that these subcontractor costs were realistic.

As another example, DSA argues that the agency’s cost realism assessment of another awardee’s proposal (Solers) was unreasonable because, for 16 labor categories, the supporting documentation failed to meet the RFP’s experience and/or education requirements.  In support of this argument, the protester points to Solers’ proposal, which, for example, mapped the RFP’s labor category for a principal information engineer (requiring an advanced degree with at least 10 years of experience), to a [DELETED] (requiring an advanced degree with at least seven years of experience).  DSA Supp. Comments at 30; AR, Tab 6W2, Solers Price Narrative, at 258. 

The protester maintains that it was unreasonable for the agency to accept Solers’ reliance on “noncompliant” labor categories, such as this example, as the basis for deriving its direct labor rates.  DSA Supp. Comments at 30.  As the agency noted in its evaluation of Solers’ cost proposal, the agency engaged in multiple rounds of discussions with Solers regarding the education and experience requirements of Solers’ proposed labor category positions.  AR, Tab 9W, Solers Final Cost/Price Eval. Report, at 10-11.  Specifically, the agency “verified that [Solers] met the minimum education and experience levels for all labor categories as identified in Section J, Attachment G2, ‘Encore III Labor Category Descriptions.’”  Id. at 6.  In making this determination, the CPET Chair explains in response to the protest that the evaluators considered Appendix A in Solers’ final proposal.  AR, Tab 20D, Declaration of Cost/Price Evaluation Team (CPET) Chair, Jan. 11, 2018, at 14.  Specifically, the CPET Chair states that the “Appendix included extensive discussion regarding how Solers mapped its salary survey data to the RFP labor categories,” and that “Table 4 in the Appendix specifically showed that the education and experience levels in the ‘Solers Proposed Years Experience’ and ‘Solers Proposed Degree’ column matched the requirements of the RFP.”  Id.; AR, Tab 6W2, Solers Price Narrative, at 169-180. 

Further, the CPET Chair points out that Solers’ proposal explained that “[a]ny time salary research is utilized, it may not exactly match RFP requirements, necessitating adjustments to the data to provide compliant cost estimates,” and that “[if] the [DELETED] data did not exactly map to the Encore III labor category requirements, . . . Solers adjusted the data by utilizing a [DELETED] salary band, and [it] explained [its] adjustment [in Table 12].”  AR, Tab 6W2, Solers Price Narrative, at 257.  The CPET Chair then explains that “Table 12 of Appendix A” of Solers’ final proposal “identifies the rate adjustments Solers made to account for degrees or experience required in the RFP that differed from the [DELETED] position utilized as the basis of Solers’ rates,” including the “principal information engineer/software architect” labor category.  AR, Tab 20D, Declaration of CPET Chair, Jan. 11, 2018, at 14. 

Based on our review of the record and the explanation provided by the agency, we find nothing unreasonable regarding the agency’s evaluation in this regard.[12]  Accordingly, we find that this argument also fails to provide a basis to sustain the protest.

DSA and Peerless also challenge the agency’s acceptance of geographic locations as justifications for several of the awardees’ cost-reimbursement, direct labor rates.  For instance, the protesters contend that it was unreasonable for the agency to allow Solers to calculate its rates using the locations of [DELETED], while other offerors based their rates on the national average for each labor category. 

As relevant here, the RFP advised offerors that performance associated with any task orders would be “throughout the United States and its territories and possessions,” as well as “in any country where the customer has a presence.”  RFP at 41. 

The agency sent evaluation notices to all offerors in the competitive range requesting that the offerors either confirm or provide additional support to show that their proposed rates could support the requirements of the PWS.  See, e.g., AR, Tabs 9N, Leidos Cost/Price Eval. Report, at 11; 9J, ECS Federal Cost/Price Eval. Report, at 10; 9R, NetCentrics Cost/Price Eval. Report, at 11; 9S, Next Tier Cost/Price Eval. Report, at 10; 9T, Peerless Cost/Price Eval. Report, at 9-10; 9V, QBase Cost/Price Eval. Report, at 10; and 9W, Solers Cost/Price Eval. Report, at 12.  For the offerors that used locations other than the national average or National Capital Region as the basis for their rates, the evaluation notice also sought confirmation that the offerors understood that contract performance would be required worldwide.  See, e.g., AR, Tabs 9J, ECS Federal Cost/Price Eval. Report, at 10; 9N, Leidos Cost/Price Eval. Report, at 11; 9S, Next Tier Cost/Price Eval. Report, at 10; 9T, Peerless Cost/Price Eval. Report, at 9-10; 9V, QBase Cost/Price Eval. Report, at 10; 9W, Solers Cost/Price Eval. Report, at 12. 

In responding to the evaluation notice, Solers explained that it used locations other than the national average as the basis for the majority of its rates, noting that it “collected data for five representative cities in close proximity to universities with Information Technology, Computer Science, or Engineering degree programs.”  AR, Tab 6W2, Solers Pricing Narrative, at 1-2.  Solers explained that the “representative cities have a low cost of living to support a geographically distributed workforce,” and will “allow Solers to leverage engineering and information technology talent from [DELETED] who have graduated and possess three (3) years or more of experience.”  Id.  Solers also stated that its “research and [ ] direct experience hiring in other university towns, reflects that a percentage of the graduates from each of these schools stay and work within proximity to the university following graduation,” and “[a]s such, these cities become recruiting-rich environments to attract and retain skilled engineering and IT [information technology] talent.”  Id.  Accordingly, Solers stated that it will “target them in hiring for the Encore III contract.”  Id.  Solers also noted that this assertion is “further supported by Department of Labor (DOL) May 2016 Metropolitan and Nonmetropolitan Area Occupational Employment and Wage Estimates for the engineering and IT related occupations, which can be found in Appendix A, Section A.6, Table 4.”  Id.

Solers explained, however, that “[t]he [five] cities utilized as representative locations for pricing purposes are not meant to imply a place of performance,” and that “Solers understands that the place of performance may be anywhere in either CONUS [continental United States] or OCONUS [outside the continental United States].”  Id.  In this regard, Solers stated that it “has been supporting DISA for over 17 years and has been called on to staff OCONUS requirements on numerous occasions.”  Id.  Solers further noted that its “long standing, successful policy is to offer salaries and benefits for OCONUS work consistent with [its] compensation packages for CONUS-based employees,” and that costs/prices “provided in [its] proposal [are] intended for both CONUS and OCONUS requirements.”  Id.  In addition, Solers confirmed that it would “only hire and apply personnel to task orders who meet or exceed the experience and degree requirements for Encore III,” and that Solers “will perform work in any country where the customer has a presence.”  Id.  In addition, Solers stated:  “We are highly confident that we can hire personnel in the localities bid and others like them who fully satisfy all Encore III requirements for years of experience and degree based on our DOL and salary.com research and our proven past performance hiring in similar university towns.”  Id.

In evaluating Solers’ cost/price proposal, the CPET noted that Solers’ direct labor rates were proposed using the locations in [DELETED].  AR, Tab 9W, Solers Final Cost/Price Eval. Report, at 5.  The CPET further explained that Solers planned to “leverage engineering and information technology talent from” local universities, as well as provided other supporting information from the DOL May 2016 Metropolitan and Nonmetropolitan Area Occupational Employment and Wage Estimates and salary.com “to support the ability to hire personnel at the experience and education levels in accordance with the RFP from the locations used to formulate their direct labor rates.”  Id.  Ultimately, the agency concluded that Solers provided adequate information regarding its proposed direct labor rates “to support the ability to hire personnel at the experience and education levels in accordance with the RFP.”  Id.

Although Peerless and DSA object to the agency’s evaluation process and conclusions as described above, the protesters have not shown that DISA acted unreasonably.  As noted above, an agency is not required to conduct an in-depth cost analysis or to verify each and every item in assessing cost realism; rather, the evaluation requires the exercise of informed judgment by the contracting agency.  Cascade Gen., Inc., supra; see FAR § 15.404-1(c).  To the extent that the protesters argue that the agency’s judgments were unreasonable, the protesters’ disagreement provides no basis to sustain the protest.  We find no basis to object to the agency’s evaluation.

In any event, DSA and Peerless have not demonstrated that they were prejudiced by any of the alleged defects in the agency’s cost realism analysis.  Competitive prejudice is an essential element of a viable protest; where the protester fails to demonstrate that, but for the agency’s actions, it would have had a substantial chance of receiving the award, there is no basis for finding prejudice, and our Office will not sustain the protest, even if deficiencies in the procurement are found.  HP Enter. Servs., LLC, B-411205, B‑411205.2, June 16, 2015, 2015 CPD ¶ 202 at 6; Booz Allen Hamilton Eng’g Servs., LLC, B-411065, May 1, 2015, 2015 CPD ¶ 138 at 10 n.16.

Here, the agency selected the awardees because they provided the 20 lowest-priced, technically acceptable proposals in response to the RFP.  With regard to the protesters, the agency ranked their proposals as 22nd, 23rd, and 24th lowest priced.  AR, Tabs 12A1, DSA Debriefing, at 4; 12B1, NCI Debriefing, at 4; 12C1, Peerless Debriefing, at 1.  Per the evaluation scheme set forth in the RFP, if an offeror did not adequately justify any component of its proposed labor rates, the agency would adjust that rate to the average labor rate for that labor category to calculate the offeror’s most probable cost.  RFP at 149.  As the CPET Chair explains in response to the protest, even if the agency “adjusted to the average each of the awardees’ labor rates that fell more than one standard deviation below the average, [the protesters] would not have [total evaluated prices] that [were] within the lowest 20 priced proposals.”  AR, Tabs 15A, DSA Declaration, at 2; 15B, NCI Declaration, at 2; and 15C, Peerless Declaration, at 2.  Thus, even assuming that the agency erred in finding that the awardees with labor rates more than one standard deviation below the average had adequately supported those rates, the protesters have not established that, but for the alleged errors, the protesters would have had a substantial chance for award.  See, e.g., Oasis Sys., LLC; Quantech Servs., Inc., supra (finding that protester did not demonstrate prejudice where it would not be next in line for award); DynCorp Int’l LLC, B‑411465, B-411465.2, Aug. 4, 2015, 2015 CPD ¶ 228 at 12-15 (denying protest that agency’s cost realism analysis should have resulted in upward adjustment to the awardee’s proposed labor hours where the protester failed to demonstrate that, but for the alleged errors, the protester would have had a substantial possibility for award).  For this reason, we also conclude that the protesters’ arguments concerning the agency’s cost realism evaluation of the awardees’ proposals fail because the protesters have not demonstrated a reasonable possibility of prejudice. 

Government Estimated Labor Hours

NCI’s challenge to the agency’s cost/price evaluation focuses on the government estimated labor hours used by the CPET for calculating the offerors’ most probable costs and total proposed prices.  As explained in the RFP, the government estimated labor hours were “not provided to the offerors until after award.”  RFP at 148.  The protester argues that the agency’s estimate was arbitrary, bore no rational relationship to the agency’s needs, and failed to meaningfully demonstrate the relative cost/price of the offers.  For the reasons discussed below, we find no basis to sustain the protest. 

Agencies are required to consider the cost to the government in evaluating competitive proposals.  FAR § 15.304(c)(1); Health Servs. Int’l, Inc.; Apex Envtl., Inc., B-247433, B‑247433.2, June 5, 1992, 92-1 CPD ¶ 493 at 3-4.  While an agency has discretion to decide upon an appropriate and reasonable method for proposal evaluation, it may not use an evaluation method that produces a misleading result.  Id.  The method chosen must include some reasonable basis for evaluating or comparing the relative costs of proposals, so as to establish whether one offeror’s proposal will be more or less costly than another.  SmithKline Beecham Corp., B-283939, Jan. 27, 2000, 2000 CPD ¶ 19 at 4-5.  A protester’s disagreement with an agency’s estimate does not provide a basis to sustain the protest. 

As set forth above, the RFP required offerors to propose both fixed-price and cost reimbursement labor rates for 116 labor categories listed in the solicitation for both government and contractor sites.  RFP at 131.  The agency explained that these labor categories were consistent with the Encore II labor categories, with the exception of the removal of 14 labor categories.  AR, Tab 4, Memorandum for Record (MFR), at 1.  The solicitation provided that the agency would calculate a total proposed price (TPP) for each offeror by applying government estimated labor hours for each year of contract performance to each offeror’s proposed fully burdened fixed price and cost reimbursement labor rates for each labor category at both site locations.  RFP at 139.

In establishing the estimated labor hour quantities for the cost/price evaluation, the agency relied on historical data from Encore II, which the evaluators documented as follows: 

To establish the estimated hours, the Government collected the hours proposed by each awardee for all Encore II task orders awarded during Base Year 5 and Option Years 1 and 2, Option Year 2 being the most recently exercised Option Year.  The task orders awarded were segregated by contract type ([fixed price] and cost reimbursement).  The Government tallied the total labor hours for each task order awarded for both Government site and Contractor site by labor category.  The total amount of hours per labor category based on Encore II historical data for [fixed price] task orders and cost reimbursable task orders will be used as the ‘Government Estimated Labor Hours’ for the [Encore III] cost/price evaluation when determining each offeror’s [total proposed price].  The Encore III team has determined that these estimated labor hours represent a valid estimate of distribution of labor for the Encore III base year because they are based on Encore II historical data. 

AR, Tab 4, MFR, at 1. 
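
A minimal sketch of that tallying step, using invented Encore II task-order records (the actual data are not in the public record), follows.

```python
from collections import defaultdict

# Invented task-order records: (contract type, labor category, site, hours).
task_orders = [
    ("FP", "Systems Engineer", "Gov Site", 1200),
    ("FP", "Systems Engineer", "Gov Site", 800),
    ("CR", "Systems Engineer", "Ktr Site", 500),
    ("FP", "Database Analyst", "Ktr Site", 300),
]

# Segregate by contract type, then tally proposed hours per labor
# category and site; the totals become the government estimated hours.
estimate = defaultdict(int)
for ctype, category, site, hours in task_orders:
    estimate[(ctype, category, site)] += hours

for key, total in sorted(estimate.items()):
    print(key, total)  # e.g., ('FP', 'Systems Engineer', 'Gov Site') 2000
```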

During the Encore III cost/price evaluation, the agency applied the estimated fixed-price and cost-reimbursement labor hour quantities to the proposed labor rates for each offeror, including NCI, for each year of the Encore III period of performance.  Id.; AR, Tab 9P, NCI Cost/Price Evaluation Consensus Report, at 3.  The agency explained that “[t]hese labor hours [were] utilized to calculate the Total Proposed Price of each offeror’s proposal received in response to the Encore III solicitation.”  Id.  In addition, in responding to the protest, the agency maintains that “[t]he hours proposed by [the] awardees of ENCORE II task orders were used, instead of the actual numbers of hours worked, because the task orders were performance-based, the majority were fixed‑price, and data was not available regarding the actual number of labor hours worked.”  AR, Tab 18A, Declaration of CPET Chair (Jan. 3, 2018), at 1; Agency Supp. Legal Memo (NCI) at 3, 8.

NCI argues that DISA’s labor hour estimate fails to reflect the agency’s actual needs for Encore III.  Specifically, the protester asserts that, because the agency used aggregate labor hours from three years of Encore II task orders, DISA’s “labor hour estimate is inflated.”  NCI Supp. Comments at 3.  The protester also contends that the agency’s labor hour estimate was arbitrary because it was based on proposed hours, rather than “hours actually necessary to accomplish a particular task.”  NCI Supp. Comments at 5.  In this regard, NCI contends that the proposed distribution of labor hours in the estimate--which was based “entirely on an awardee’s particular technical approach”--has no bearing at all on the actual distribution of labor hours that DISA might order.  Id.

Based on our review of the record here, we find no basis to object to the agency’s labor hour estimate.  The record reflects that the agency created the labor hour estimate from historical data (i.e., the labor hours proposed) from task orders awarded under Encore II for three years.  Although the protester asserts that it was “arbitrary” and “irrational” for the agency to base the estimate on three years of historical information, as the agency notes in response to the protest, its “use of historical hours merely represent[ed] a distribution of anticipated work among the labor categories as they are relative to each other.”  NCI Supp. AR at 1-2.  In this regard, as the agency points out, “the relative distribution of hours would remain the same,” regardless of whether the agency used an aggregate of the hours for all three years or the average number of hours per year, and therefore, “the relative price ranking of offerors would remain the same.”[13]  NCI Supp. AR at 1‑2.  The protester has not in any way demonstrated how the agency’s reliance on three years of historical data for the estimate was either unreasonable or prejudicial to NCI.  We find no merit to this argument.
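
The agency’s point here is, at bottom, a scaling argument: replacing the three-year aggregate with a per-year average divides every labor category’s hours by the same constant, which multiplies every offeror’s evaluated price by that constant and therefore cannot reorder the offerors.  A toy check with invented numbers:

```python
# Two notional offerors, two labor categories; all figures invented.
hours_aggregate = {"LC1": 3000, "LC2": 6000}  # three-year totals
rates = {"X": {"LC1": 90.0, "LC2": 70.0},
         "Y": {"LC1": 80.0, "LC2": 78.0}}

def price(hours, offeror_rates):
    return sum(h * offeror_rates[lc] for lc, h in hours.items())

hours_average = {lc: h / 3 for lc, h in hours_aggregate.items()}

rank_agg = sorted(rates, key=lambda o: price(hours_aggregate, rates[o]))
rank_avg = sorted(rates, key=lambda o: price(hours_average, rates[o]))
assert rank_agg == rank_avg  # uniform scaling preserves the ranking
print(rank_agg)  # ['X', 'Y'] under either hours basis
```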

The protester also argues that the agency’s labor hour estimate was improperly based upon proposed, rather than actual labor hours.  As noted above, the agency explains that it used the hours proposed by the Encore II task order awardees, instead of the actual numbers of hours worked because “the task orders were performance-based, the majority were fixed‑price, and data was not available regarding the actual number of labor hours worked.”  AR, Tab 18A, Declaration of CPET Chair (Jan. 3, 2018), at 1.  Based upon our review of the record, and the agency’s explanation, we find nothing unreasonable regarding the agency’s reliance on the proposed Encore II task order labor hours in establishing the estimate. 

Furthermore, although the protester disagrees with the quantity of labor hours included in the government estimate for seven labor categories (involving subject matter experts and consultants), the protester has failed to demonstrate how the agency’s reliance on the historical data with regard to these labor categories was unreasonable.  While NCI may disagree with the agency’s method for establishing the estimate, or believe an alternative method may have been better, the protester’s disagreement, without more, is insufficient to render the government’s estimate unreasonable or otherwise provide a basis to sustain the protest.  We find no basis to sustain the protest.

Price Realism

Next, DSA argues that DISA failed to conduct a proper price realism evaluation.  Specifically, the protester asserts that the agency failed to reasonably assess the realism of NetCentrics’ proposed fixed prices for five labor categories.[14]  For the reasons discussed below, we find no basis to sustain the protest.[15]

For a solicitation that anticipates the award of a fixed-price contract, an agency may include a requirement to evaluate the realism of proposed prices for the limited purpose of assessing whether an offeror’s price reflects a lack of technical understanding or risk.  Emergint Techs., Inc., B-407006, Oct. 18, 2012, 2012 CPD ¶ 295 at 5-6; see FAR § 15.404-1(d)(3).  The depth of an agency’s price realism analysis is a matter within the sound exercise of the agency’s discretion, and our review of a price realism analysis is limited to determining whether it was reasonable, consistent with the terms of the solicitation, and adequately documented.  GiaCare & MedTrust JV, LLC, B-407966.4, Nov. 2, 2016, 2016 CPD ¶ 32 at 7.

As noted above, the RFP advised that during the fixed-price portion of the agency’s evaluation of offerors’ cost/price proposals, the agency “reserve[d] the right” to “conduct a realism analysis of the offeror’s proposed price.”  RFP at 150.  DISA conducted a price realism evaluation on all proposals in the competitive range.  In its price realism evaluation of NetCentrics’ proposal, the CPET noted its concern that, “[d]ue to the direct labor rates for five labor categories not being adequately supported, these five [fixed‑price] labor rates may represent a level of performance risk, in the form of a lowered ability to hire and retain employees at the proposed rates.”  AR, Tab 9R, NetCentrics Final Cost/Price Eval. Report, at 8.  The CPET stated, however, that “[t]his performance risk [was] likely mitigated by the offeror’s Acceptable technical proposal and Acceptable past performance.”  Id.  In addition, the CPET explained that “further realism evaluation can be done at the task order level,” and that “offerors are not required to propose on all task orders,” which the CPET noted, “may diminish the performance risk to the Government.”  Id.  The CPET recommended that the Source Selection Authority (SSA) “review this issue and determine whether the offeror’s proposed [fixed price] rates create a performance risk.”  Id.

The SSA, in reviewing this issue, determined that “any perceived performance risk caused by NetCentrics proposed [fixed-price] labor rates has been adequately mitigated.”  AR, Tab 10, SSDD, at 15.  Specifically, the SSA explained that NetCentrics was determined to have acceptable Technical/Management and Past Performance proposals, which indicated a clear understanding of the RFP’s requirements.  Id.  In addition, the SSA noted that NetCentrics failed to provide adequate supporting documentation for only 5 of the total 116 proposed labor categories.  Id.  In this regard, the SSA pointed out that, at the base contract level, none of the labor categories are mandatory and when proposing for task orders, NetCentrics “will be able to make a business decision as to whether it should propose any of the five labor categories noted by the cost/price team.”  Id.

Although the protester asserts that the agency failed to adequately consider the technical risk of these five labor rates, the agency acknowledged that the five labor rates “may represent a level of performance risk, in the form of a lowered ability to hire and retain employees at the proposed rates,” but that the risks were sufficiently mitigated.  AR, Tab 9R, NetCentrics Final Cost/Price Eval. Report, at 8.  The agency therefore concluded that the risk did not signify that the proposed prices were either unrealistic or demonstrated a lack of understanding of the PWS requirements.  Id.  To the extent DSA asserts that the agency’s price realism assessment should have considered other risks, the protester’s disagreement with the agency’s evaluation, without more, does not establish that the agency’s evaluation was unreasonable or otherwise provide a basis to sustain the protest.  See GiaCare & MedTrust JV, LLC, supra, at 7 (“The depth of an agency’s [price realism] evaluation is a matter within the sound exercise of the agency’s discretion.”).  On this record, we find no basis to sustain the protest.

Technical Acceptability

Finally, DSA challenges the technical acceptability of awardee Next Tier’s proposal.  Specifically, the protester argues that Next Tier’s proposal failed to satisfy the RFP’s requirement that an offeror demonstrate experience “performing testing and evaluation.”[16]  DSA Supp. Comments at 39.  For the reasons discussed below, we conclude that the agency reasonably found that Next Tier’s proposal was technically acceptable, and therefore find no basis to sustain the protest.

In reviewing protests challenging an agency’s evaluation of proposals, our Office does not independently evaluate proposals; rather, we review the agency’s evaluation to ensure that it is consistent with the terms of the solicitation and applicable statutes and regulations.  SOS Int’l, Ltd., B-402558.3, B-402558.9, June 3, 2010, 2010 CPD ¶ 131 at 2.  We have consistently explained that the evaluation of proposals is a matter within the discretion of the procuring agency; we will question the agency’s evaluation only where the record shows that the evaluation does not have a reasonable basis or is inconsistent with the RFP.  Hardiman Remediation Servs., Inc., B-402838, Aug. 16, 2010, 2010 CPD ¶ 195 at 3.  An offeror’s disagreement with an agency’s evaluation, without more, is not sufficient to sustain the protest.  See Ben-Mar Enters., Inc., B‑295781, Apr. 7, 2005, 2005 CPD ¶ 68 at 7.

As referenced previously, the RFP required that offerors provide past experiences to demonstrate their ability to meet the requirements of seven technical/management evaluation subfactors.  RFP at 143.  As relevant here, under subfactor 4, test and evaluation (T&E), offerors were required to demonstrate the ability to “utilize technical and engineering documents to plan, conduct, and execute testing and evaluation.”  Id. at 145. 

In responding to this requirement, Next Tier’s proposal pointed to the offeror’s performance under two recent contracts, including, as relevant here, Next Tier’s performance under the Joint Improvised Explosive Device Defeat Organization (JIEDDO) Enterprise Portal project.  AR, Tab 5E; Supp. AR at 50-51.  Next Tier’s proposal provided the following information regarding its experience preparing test strategies and concepts under the JIEDDO Enterprise Portal project:

Experience preparing T&E [test and evaluation] [DELETED].  As the prime contractor, NT Concepts supported all testing aspects of the JIEDDO project.  These included [DELETED].

AR, Tab 5E3, Next Tier Tech/Mgmt Proposal, at 13.  Next Tier’s proposal also listed its “[DELETED].”  Id. at 11-12 (referencing Table 4, which demonstrated how Next Tier executed its T&E actions in accordance with its T&E strategy in performing the JIEDDO contract).  In addition, Next Tier’s proposal demonstrated how the offeror used its T&E strategy throughout the full life cycle process.  See id. (Table 5, JIEDDO Full Life Cycle T&E Process).

In evaluating Next Tier’s proposal under this subfactor, the agency concluded that Next Tier’s performance on “their JIEDDO project” clearly met the minimum technical requirements of the RFP.  AR, Tab 8F, Next Tier Tech/Mgmt. Eval. Report, at 11.  With regard to the testing and evaluation requirement, the evaluators stated:

The Subfactor 4 section of [Next Tier’s] technical proposal discusses their experience in Test and Evaluation (T&E).  [Next Tier] followed their CMMI processes for Risk Management, Technical Solution, Integration, and Planning as described in the PMP.  They also incorporated the Cybersecurity Risk Management Framework (RMF) concepts from DoD Instruction 8510.01 into their T&E Strategy.  They also developed a T&E Strategy for the JIEDDO project based on DoD Instructions 5000.02, 8330.01, 8510.01, and the DOT&E Test and Evaluation Master Plan (TEMP) Guidebook 3.0.

Id. at 12. 

In addition, the evaluators explained that, “[a]s the prime contractor, [Next Tier] supported all testing aspects of the JIEDDO project,” and that “[t]hese included development of the T&E Strategy (T&E Plan) as well as test strategies, concepts, reports, metrics, and scorecards for Developmental, Operational, Interoperability, and Cybersecurity testing.”  Id.  The evaluators documented a detailed assessment of Next Tier’s use of its T&E strategy in the “JIEDDO environment,” and concluded that:

This demonstrates the ability to utilize technical and engineering documents to plan, conduct, and execute testing and evaluation and experience with DoD/IC [intelligence community] Test and Evaluation of Software and/or IT/NS [network support] systems, industry best practices in Test Design techniques for T&E methodologies, establishing appropriate test hosting environments, and modeling and simulation of Software and/or IT/NS systems. This also demonstrates the ability to prepare Test and Evaluation Plans, Test Strategies, Test Concepts, Test Reports, T&E metrics, and T&E scorecards.

Id. at 12.

The protester points to a statement in Next Tier’s proposal, which provided that, “[DELETED].”  Id.  The protester asserts that this statement shows that Next Tier did not actually “execute” testing, as required by subfactor 4 (T&E).  DSA Supp. Protest at 54.  In response, the agency maintains that “[t]he fact that Next Tier noted that an objective third party would perform key testing roles is a standard practice,” and that “[t]his just explains, as Next Tier points out in its proposal, that there is a conflict of interest if Next Tier was the only entity to perform testing of a system that Next Tier developed.”  Agency Supp. Legal Memo (DSA) at 51.  The agency further notes that the protester’s argument ignores “the actual testing Next Tier stated that it did perform,” which is documented in Table 4 of Next Tier’s proposal, such as:  unit testing, functional testing, Section 508 compliance testing, system integration testing, regression testing, and user acceptance testing.  Id.

Based on our review of the record, we find nothing unreasonable regarding the agency’s evaluation.[17]  To the extent the protester contends that the agency should have found Next Tier’s proposal unacceptable under this subfactor, a protester’s disagreement with the agency’s evaluation, without more, provides no basis to sustain the protest.  Ben-Mar Enters., Inc., supra.

The protests are denied.

Thomas H. Armstrong
General Counsel



[1] Encore III is a follow-on to Encore II and represents an ongoing expansion of DISA’s Defense Enterprise Information Services (DEIS) I and DEIS II contracts.  RFP at 11, 13.

[2] On April 25, 2016, two protests were filed with our Office challenging the terms of the initial solicitation.  On August 3, 2016, we sustained the protests.  See CACI, Inc.‑Federal; Booz Allen Hamilton, Inc., B-413028 et al., Aug. 3, 2016, 2016 CPD ¶ 238.  In response, the agency amended the solicitation and accepted revised proposals.  In all, the RFP was amended seven times.  References herein are to the conformed version of the RFP that is inclusive of the seven amendments.

[3] The solicitation also provided that the agency would evaluate offerors’ small business subcontracting plans, and organizational/consultant conflict of interest plans.  RFP at 150.

[4] Specifically, these subfactors included:  requirements analysis, custom application development, product integration, test and evaluation, operations support (Performance Work Statement (PWS) C4.17.6)), operations support (PWS C4.17.5), and enterprise information technology policy and planning.  RFP at 145.

[5] The protests were not consolidated during development, and therefore, the agency responded to each protest with a separate legal memorandum and contracting officer statement.  The agency, however, submitted a single, consolidated record for all three protests.  References herein to the agency report are to the consolidated record of exhibits submitted for all three protests.

[6] The protesters also raised various arguments in their initial protests which they later withdrew.  For example, DSA initially challenged the technical acceptability of 22nd Century’s proposal, and argued that Next Tier did not have experience with modeling and simulation under technical subfactor 4 (T&E), but later withdrew these allegations.  See DSA Supp. Comments at 39; id., n.27.  Similarly, NCI initially challenged the technical acceptability of IAP C4ISR, LLC, but has withdrawn this allegation.  NCI Response to Dismissal Request at 1.   

[7] Although NCI initially challenged the agency’s cost realism evaluation, the protester explains that it “decided not to pursue its challenge to the realism evaluation of the awardees’ costs/price” because “[the] Agency Report indicates that, ‘even if the Agency had adjusted to the average each of the awardees’ labor rates that fell more than one standard deviation below the average, NCI would not have a [total evaluated price] that was within the lowest 20 priced proposals.’”  NCI’s Comments at 2, n.2 (citing AR, Tab 15B, NCI Declaration, at 2).  Further, NCI states that it “has not been able to disprove this assertion,” and “therefore cannot at this time demonstrate prejudice as a result of DISA’s cost/price realism evaluation of specific awardee labor rates.”  Id.

[8] Based on our review of the record, we identified one instance in which we question the reasonableness of the evaluators’ findings.  Specifically, DSA asserts that one awardee (Qbase) took exception to the RFP provision that “[p]rofit is not allowed on [other direct costs] ODCs for any task order.”  RFP at 64.  Despite this restriction, Qbase’s cost proposal states that its ODCs are “[DELETED].”  AR, Tab 6V8, Qbase Price Narrative, at 8.  Although the agency argues that Qbase’s proposal [DELETED], it asserts that Qbase did not intend [DELETED].  DSA Supp. AR at 2.  Our review of the record indicates otherwise.  See, e.g., AR, Tab 6V8, Qbase Pricing Narrative, at 8, 11 (showing that “[DELETED]”).  However, while we are not persuaded by the agency’s arguments that Qbase did not take exception to the RFP’s terms, we do not view this instance as prejudicial in light of the fact that DSA was the 24th lowest‑priced offeror.  Accordingly, even if Qbase’s proposal took exception to the RFP, and was eliminated from the competition, the protester here still would not be in line for award.  See Oasis Sys., LLC; Quantech Servs., Inc., B-408227.10 et al., Apr. 28, 2016, 2016 CPD ¶ 124 at 12 (finding that protester did not demonstrate prejudice where it would not be next in line for award).

[9] With regard to the protesters’ proposals, for example, the CPET found that Peerless’ proposal had six labor rates that were more than one standard deviation below the average for its respective labor categories.  AR, Tab 9T, Peerless, Final Cost/Price Eval. Report, at 5.  The CPET therefore reviewed Peerless’ supporting cost information for all of these rates.  Id.  Ultimately, the CPET found that Peerless “provided adequate cost justification to support the direct labor rates and indirect rates proposed,” and did not adjust any of Peerless’ cost reimbursement labor rates.  Id. at 5-6.  As for DSA, although the CPET found that DSA’s proposal had zero labor rates that were more than one standard deviation below the average, the CPET “reviewed a random sample of 10 labor categories to verify that the offeror provided adequate support for [its] proposed rates.”  AR, Tab 9I, DSA, Final Cost/Price Eval. Report, at 5.  The CPET also concluded that DSA provided adequate cost justification to support its proposed direct and indirect rates, and did not adjust any of DSA’s cost reimbursement labor rates.

[10] Peerless also argues that the agency relaxed the RFP requirement that each offeror propose cost reimbursable labor rates with a 5.5% fixed fee.  In support of this argument, the protester asserts that, by proposing blended rates, several of the awardees were able to artificially lower their proposed rates.  We, however, find no merit to this argument.  The solicitation clearly allowed offerors to use subcontractor or interdivisional rates in the development of their proposed rates.  In this regard, the RFP’s pricing template allowed offerors to apply different weights to their proposed subcontractors’ rates, or for interdivisional rates, that made up their total cost reimbursable labor rate for a specific labor category.  RFP, attach. L.4, Pricing Template, Tab (CR Labor Build), Column Q.  The pricing template also advised offerors that:  “Composite labor rates (i.e. blending rates from multiple internal [labor categories] or blending prime and subcontractor rates for a single [labor category]) shall include all information used in the development of the proposed rate.”  Id.  Furthermore, the RFP’s pricing template automatically added the 5.5% fixed fee to each offeror’s proposed costs.  RFP, attach. L.4, Pricing Template.  Based on our review of the record, we find no basis to conclude that the agency improperly relaxed any of the RFP’s requirements.

[11] Specifically, the CPET explained that “[d]ue to the uncertainties of the offeror’s Subcontractor analysis, the Government reviewed the Subcontractor’s proposal.”  AR, Tab 9R, NetCentrics, Cost/Price Eval. Report, at 5.  Further, the CPET noted that the subcontractor “provided a spreadsheet and supporting cost information showing the 10th BLS [Bureau of Labor Statistics] percentile salary,” and “[t]hen, depending on the job title,” applied a “rate premium.”  Id.  The CPET concluded that the subcontractor “adequately supported [its] proposed direct labor rates.”  Id.  With regard to indirect rates, the CPET explained that the subcontractor “applied a 4.17% indirect fee onto its direct rates to calculate their fully burdened rates,” and that the subcontractor “provided a list of what [was] included in the 4.17%.”  Id.  The agency questioned how the subcontractor could “hire and retain qualified personnel at the rates proposed,” and found that the offeror and subcontractor “provided adequate supporting cost information to support the 101 subcontractor labor rates proposed.”  Id. at 6-7.

[12] DSA also challenges the agency’s price realism analysis with regard to Solers’ proposal, asserting that it was unreasonable for Solers to base its direct fixed-price rates on the same 16 allegedly “noncompliant” labor categories.  DSA Supp. Comments at 30.  Although the protester contends that Solers proposed noncompliant labor categories, as explained above, this argument is based on a misreading of Solers’ proposal.  As such, we find no basis to sustain the protest.

[13] The agency also stated that it believed its use of the aggregate of all three years was reasonable because it expected an increase of use of the ENCORE III vehicle over the ENCORE II vehicle.  Agency Supp. Legal Memo (NCI) at 2.

[14] We note that DSA also challenges the agency’s price realism evaluation with regard to two other offerors--Solers and ECS.  Although we do not discuss all of the protester’s arguments in detail, we have considered each and find that none provides a basis to sustain the protest.  See footnote 10, supra.

[15] DSA also contends that the agency failed to conduct a reasonable unbalanced pricing evaluation because its analysis “consisted solely of a determination as to whether the awardees’ proposed labor rates for tiered labor categories ‘increased or decreased as expected’ based on the minimum experience requirements associated with each labor category.”  DSA Supp. Protest at 5; see, e.g., AR, Tab 9U, Cost/Price Eval. Report, at 4, 6.  Unbalanced pricing exists where the prices of one or more line items are significantly overstated, despite an acceptable total evaluated price (typically achieved through under pricing of one or more other line items).  Inchcape Shipping Servs. Holding, Ltd., B-403399.3, B‑403399.4, Feb. 6, 2012, 2012 CPD ¶ 65 at 4.  Here, as referenced above, the RFP required that offerors propose labor rates for certain, specified labor categories, but did not disclose the government estimated labor hours that would be used during the evaluation for calculating each offeror’s total evaluated cost and price.  RFP at 148.  In light of the fact that the RFP did not disclose the labor hour estimates, it is not clear how an offeror could have manipulated its pricing in such a way to achieve an overall total evaluated price among the 20 lowest, while overstating certain rates.  In any event, even assuming that some awardees did propose “unbalanced” labor rates, as asserted by the protester, the protester has failed to demonstrate how this would result in a material risk of the government paying unreasonably high amounts for contract performance.  See InfoZen, Inc., B-411530, B‑411530.2, Aug. 12, 2015, 2015 CPD ¶ 270 at 7 (even when an agency overlooks unbalanced pricing, we will not sustain the protest if no material risk to the government from the unbalancing is apparent from the record because the agency’s error has not prejudiced the protester).  As the agency points out, the fixed‑price rates are ceiling rates, and offerors are permitted to propose discounted rates when competing for task orders.  Agency Supp. Legal Memo (DSA) at 11-12.  On this record, we find no basis to sustain the protest.

[16] DSA also challenges the technical acceptability of NetCentrics’ proposal under subfactor 4 (T&E), arguing that NetCentrics’ proposal failed to demonstrate experience having “approximately 1,000 concurrent, [public key infrastructure]-enabled users.”  RFP at 144.  After reviewing NetCentrics’ proposal, the evaluators concluded that NetCentrics’ proposal adequately met this requirement, stating:

The NetCentrics Application Operations team launches 5 instances of the web front end on the first 15 days of each month, to allow for as many as 2,500 concurrent users, exceeding the 1,000 requirement, accessing the application, demonstrating experience in supporting 1000+ concurrent PKI-enabled users. 

AR, Tab 8C, NetCentrics Tech/Mgmt. Eval. Report, at 9-10.  Although DSA asserts that NetCentrics should have been required to provide more support to show it could meet this requirement, the protester has not demonstrated that the agency’s evaluation was unreasonable.  The protester’s disagreement with the agency’s judgment, without more, is not sufficient to render the agency’s evaluation unreasonable, or otherwise provide a basis to sustain the protest. 

[17] DSA also contends that the agency unreasonably concluded that Next Tier’s proposal demonstrated experience “leveraging DoD tools.”  DSA Supp. Protest at 55.  As relevant here, in evaluating Next Tier’s proposal, the agency determined that Next Tier met the relevancy requirement of “leveraging DoD enterprise or similar tools and capabilities” because “JIEDDO was ‘a software development project to create an enterprise application,’ [and] thus an enterprise application itself, demonstrating leverage of DoD enterprise tools.” AR Tab 8F, Next Tier Tech/Mgmt. Eval. Report, at 11.  Based on our review of the record, we find no basis to object to the agency’s evaluation of Next Tier’s proposal as acceptable.
