deciBel Research, Inc.

B-424046; B-424046.2, February 18, 2026

DOCUMENT FOR PUBLIC RELEASE

The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: deciBel Research, Inc.

File: B-424046; B-424046.2

Date: February 18, 2026

Damien C. Specht, Esq., James A. Tucker, Esq., and Brian E. Doll, Jr., Esq., Morrison & Foerster LLP, for the protester.
Brian G. Walsh, Esq., George E. Petel, Esq., W. Benjamin Phillips, III, Esq., Morgan W. Huston, Esq., and Nicholas T. Iliff, Jr., Esq., Wiley Rein LLP, for IERUS Technologies, Inc., the intervenor.
Major William R. Carpenter, Lieutenant Colonel Cali Y. Kim, and Robert B. Neill, Esq., Missile Defense Agency, for the agency.
Uri R. Yoo, Esq., and Alexander O. Levine, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1. Protest that agency unreasonably evaluated protester's management proposal is denied where the evaluation was reasonable and consistent with the solicitation.

2. Protest challenging agency's cost realism evaluation is denied where upward adjustments to the protester's proposed costs were reasonably based on the protester's proposed approach.

3. Protest of agency's price reasonableness evaluation is denied where the evaluation was consistent with the solicitation and applicable regulation.

DECISION

deciBel Research, Inc., a small business of Huntsville, Alabama, protests the award of a contract to IERUS Technologies, Inc., a small business of Huntsville, Alabama, under request for proposals (RFP) No. HQ0862-24-R-0002, issued by the Department of Defense, Missile Defense Agency (MDA), for radar system test and analysis. The protester contends that the agency unreasonably evaluated its proposal under the management factor. The protester also challenges aspects of the agency's cost/price evaluation.

We deny the protest.

BACKGROUND

The agency issued the solicitation on March 29, 2024, seeking contractor support for flight and ground testing and the analysis of radar systems. Agency Report (AR), Tab 8c, Statement of Work (SOW) at 1; Contracting Officer's Statement (COS) at 2‑3. The SOW encompassed flight and ground testing of various radar platforms, as well as the execution of system test-related activities including planning, executing, and analyzing sensor performance in system flight and ground tests. AR, Tab 8c, SOW at 1. The solicitation, issued as a set-aside for small businesses, anticipated award of a cost‑plus‑fixed‑fee contract with a 3‑year base period and two 3‑year option periods. COS at 3; AR, Tab 10b, Amended Instructions to Offerors at 30.

The solicitation advised that award would be made to the responsible offeror whose proposal represented the best value after a tradeoff considering the following five factors: (1) technical; (2) management; (3) corporate experience; (4) regulatory compliance; and (5) cost/price. AR, Tab 8b, Amended Evaluation Criteria at 4‑5. For the purpose of the tradeoff, the first three factors (technical, management, and corporate experience) were of equal importance, and all non-cost/price evaluation factors, when combined, were significantly more important than cost/price. Id. at 5. The solicitation informed offerors that even though price would be a substantial factor in the source selection, the competition might result in an award to a higher‑rated, higher‑priced offeror. Id.

As relevant here, the management factor consisted of three subfactors, listed in descending order of importance: (1) program management; (2) staffing/human capital management; and (3) transition plan. Id. Under the program management subfactor, the agency would evaluate each offeror's approach to, and understanding of, program management requirements. This would include the offeror's approach to “provide a constant [level of effort] for planning, execution and analysis to support fourteen [hardware-in-the-loop (HWIL)] strings” of listed radars, and to “manage this [level of effort] to execute various permutations of the [ground testing] schedule.” Id. at 9; AR, Tab 10b, Amended Instructions to Offerors at 28. The solicitation also specified that the proposed level of effort should “cover[] all ground test personnel supporting [contract line item number (CLIN)] 0003 (from SOW 3.9.1).” AR, Tab 10b, Amended Instructions to Offerors at 28. Under the staffing and human capital management subfactor, the agency would evaluate the offeror's approach to, and understanding of, staffing to support, among other things, the fourteen HWIL strings of radars, distributed ground testing, and full flight test teams under SOW section 3.1.1.3. Id.; AR, Tab 8b, Amended Evaluation Criteria at 9.

Under the cost/price factor, the solicitation advised that the agency would evaluate offerors' cost/price proposals using one or more of the techniques described in Federal Acquisition Regulation (FAR) section 15.404 and Defense Federal Acquisition Regulation Supplement section 215.404. AR, Tab 8b, Amended Evaluation Criteria at 12. As relevant here, the cost/price evaluation would include a cost realism analysis in accordance with FAR section 15.404-1 for cost-reimbursement type CLINs “to determine the probable cost of performance for each Offeror . . . consistent with the unique methods of performance described in the Offeror's technical and management proposals.” Id. at 13. Specifically, the agency's cost realism analysis would assess “whether the proposed hour quantities and skill mix, and direct and indirect rates, reflect a clear understanding of the requirement, are consistent with the Offeror's approach, and are realistic for the work to be performed.” Id. The agency would use the results of this analysis to develop a probable cost, “adjusting each Offeror's proposed cost, and fee when appropriate, to reflect any additions or reductions in cost elements to realistic levels.” Id. The probable cost would then be used to calculate the total evaluated price for the purpose of the best‑value tradeoff. Id. at 13. With respect to price reasonableness, the solicitation provided that, “[s]ince the Government anticipates adequate price competition, the Government expects to verify price reasonableness by comparing competitively proposed prices.” Id.

The agency received timely proposals from several offerors, including deciBel and IERUS. COS at 3. After evaluating the proposals and establishing a competitive range, the agency conducted discussions with the offerors in the competitive range, including deciBel and IERUS. Id. at 3‑4. Based on the offerors' final proposal revisions, the agency evaluated deciBel's and IERUS's proposals as follows:

 

                                      deciBel                   IERUS

Technical                             Acceptable/Low Risk       Outstanding/Low Risk
  Test Planning & Execution           Acceptable/Low Risk       Outstanding/Low Risk
  Analysis Support                    Acceptable/Low Risk       Outstanding/Low Risk
  Test Infrastructure                 Acceptable/Low Risk       Outstanding/Low Risk
Management                            Marginal/High Risk        Outstanding/Low Risk
  Program Management                  Marginal/High Risk        Good/Low Risk
  Staffing & Human Capital Management Marginal/High Risk        Outstanding/Low Risk
  Transition                          Acceptable/Low Risk       Outstanding/Low Risk
Corporate Experience                  Satisfactory Confidence   Substantial Confidence
Regulatory Compliance                 Acceptable                Acceptable
Proposed Cost/Price                   $273,661,952              $474,955,602
Evaluated Cost/Price                  $407,212,375              $513,734,575

AR, Tab 80, Source Selection Decision Document (SSDD) at 3.

As relevant here, in evaluating deciBel's management proposal, the agency assessed a significant weakness under the program management subfactor and two weaknesses under the staffing and human capital management subfactor based on deciBel's proposed level of effort and staffing. AR, Tab 78, Proposal Analysis Report (PAR) at 234, 239, 240, 244‑245. Based on this evaluation, the agency assigned the rating of marginal/high risk to deciBel's proposal under the two management subfactors, resulting in an overall rating of marginal/high risk for the factor. Id. at 233. In addition, based on a cost realism analysis concluding that deciBel's proposed level of effort was unrealistic, the agency upwardly adjusted deciBel's proposed cost by $133,550,423. Id. at 275.

The source selection advisory council (SSAC) conducted a detailed comparative analysis of offerors' proposals and recommended IERUS's proposal as representing the best value. Id. at 277‑298. In making the recommendation, the SSAC concluded that “the technical, management, and experience benefits for MDA's radar analysis and support execution from IERUS's proposal” were “significantly more important than the cost savings of $106,522,200” from deciBel's lower-rated, higher-risk proposal. Id. at 298. The source selection authority (SSA) considered the evaluation reports, as well as the SSAC's recommendation, and agreed with their findings. AR, Tab 80, SSDD at 4. After conducting a best‑value tradeoff analysis, the SSA also concluded that the technical, management, and corporate experience benefits represented by IERUS's proposal were worth the price premium over deciBel's proposal. Id. at 9.

On October 9, the agency awarded the contract to IERUS and notified deciBel of the award decision. After requesting and receiving a debriefing, deciBel filed this protest.

DISCUSSION

The protester first challenges the evaluation of its proposal under the management factor, arguing that the agency unreasonably identified weaknesses in deciBel's level of effort and staffing approach without considering the firm's unique technical approach. The protester also argues that the agency made unreasonable upward adjustments to deciBel's proposed cost based on a flawed assessment of the firm's proposed level of effort and staffing approach. The protester additionally asserts that the agency unreasonably determined the awardee's price to be fair and reasonable. While our decision does not specifically discuss every argument raised, we have considered all of deciBel's assertions and find that none provides a basis for sustaining the protest.[1]

The evaluation of an offeror's proposal is a matter within the agency's discretion. Oak Grove Techs., LLC, B‑415772, B‑415772.2, Mar. 15, 2018, at 4. In reviewing a protest challenging an agency's evaluation of proposals, our Office does not reevaluate proposals or substitute our judgment for that of the agency; rather, we review the record to determine whether the agency's evaluation was reasonable and consistent with the solicitation's evaluation criteria, as well as applicable statutes and regulations. Arctic Slope Mission Servs., LLC, B‑417244, Apr. 8, 2019, at 8. A protester's disagreement with the agency's judgment, without more, is insufficient to establish that the agency acted unreasonably. Trilogy Fed., LLC, B‑418461.11, B‑418461.18, Feb. 23, 2021, at 5.

Evaluation of deciBel's Management Proposal

The protester alleges that the agency unreasonably evaluated deciBel's proposal under the management factor. Specifically, the protester contends that the agency unreasonably assessed a significant weakness and two weaknesses for deciBel's proposed level of effort and staffing. Protest at 10‑21; Comments & Supp. Protest at 38‑42. In this regard, the protester argues that the agency erroneously relied on an unstated historical level of effort without considering the efficiencies offered by the protester's innovative approach, which leveraged cross‑training personnel and automation. Id. The protester also argues that the weaknesses assessed in deciBel's proposed level of effort were inconsistent with the agency's finding that the firm's technical approach of relying on automation was acceptable. On the record before us, we find that none of the protester's arguments provide a basis to sustain the protest.

For example, the agency assessed a significant weakness under the program management subfactor, finding that deciBel's proposal did not “demonstrate an understanding of the [level of effort] necessary for the analysis effort that must be performed in a timely manner to support reporting throughout each ground test phase to include overlapping events and overlapping test phases.” AR, Tab 78, PAR at 239. The evaluators further found that deciBel's approach “did not demonstrate that [deciBel] will be able to perform overlapping analysis activities of overlapping events in support of the ground test,” which would result in “delays to analysis products, responses to Joint Analysis Team observations, assessments of new capabilities, and delayed Warfighter deliveries.” Id. Based on these findings, the evaluators concluded that this significant weakness appreciably increased the risk of unsuccessful contract performance. Id.

We find the agency's conclusion unobjectionable and supported by the record. As an initial matter, the record does not support the protester's assertion that the agency failed to provide a rationale for finding deciBel's proposed level of effort to be insufficient. The record shows that deciBel proposed a total of [DELETED] analysts for all ground test analysis activities across the four radar systems, ranging from [DELETED] to [DELETED] analysts per radar. See AR, Tab 64, deciBel's Final Management Proposal at 38. The solicitation, however, required “a constant [level of effort] for planning, execution, and analysis” to support 14 HWIL strings of four different radar systems and to manage this level of effort to execute “various permutations of the [ground testing] schedule.” AR, Tab 8b, Amended Evaluation Criteria at 9. The agency further notes that ground test personnel are responsible for “supporting overlapping events (executing more than one Ground Test event at one time) and overlapping phases.” COS at 8. A test event comprises four different phases (requirements and scenarios; planning and integration; execution; and analysis), with possibly “multiple test events happening at once with each in a different phase of testing.” Id. The agency explains that the offeror must be able to “provide adequate staffing to support each test and each phase at the same time.” Id. In light of this requirement, the agency considered the protester's proposed level of effort of [DELETED] analysts to support all ground test analysis activities across the four radar systems and found that deciBel did not demonstrate “an understanding of the [level of effort] necessary for the analysis effort that must be performed” or that deciBel would be “able to perform overlapping analysis activities of overlapping events” with this level of effort. AR, Tab 78, PAR at 239; see Memorandum of Law (MOL) at 48‑49. As further discussed below, we find reasonable the agency's conclusion that a total of [DELETED] analysts was inadequate to support overlapping phases of overlapping test events.

The protester argues that its program management approach and level of effort for ground testing were based on its more efficient and “unique technical approach, which leverages automation and cross‑trained personnel.” Protest at 12; Comments & Supp. Protest at 35‑36, 38. The record shows, however, that MDA considered the protester's proposed management approach but concluded that it did not resolve the agency's concerns with the proposed level of effort.

For example, the agency documented its consideration of deciBel's proposed cross‑training approach and concluded that the approach did not resolve the significant risk of an insufficient level of effort when the contractor is performing overlapping analysis activities during overlapping ground test events. AR, Tab 78, PAR at 239. With respect to automation, contrary to the protester's assertions, deciBel's management proposal did not include an approach to reducing the level of effort through greater automation. See Comments & Supp. Protest at 35; AR, Tab 64, deciBel's Final Management Proposal at 15‑44. Instead, deciBel proposed its automation approach as part of its technical proposal. See Comments & Supp. Protest at 35‑36, 38‑41, citing AR, Tab 63, deciBel's Final Technical Proposal at 40, 47-48, 50, 51-57, 63, 70-72, 74, 84. Consistent with an offeror's burden to submit a well‑written proposal, however, an agency is not required to search for information about an offeror's approach to one solicitation requirement in the proposal section addressing another. See Monbo Grp. Int'l, B‑421554, June 22, 2023, at 5.

We also note that deciBel's discussion response on this significant weakness similarly failed to mention automation (or cross-training) as deciBel's approach to compensate for the reduced level of effort proposed. See AR, Tab 29, deciBel 1st Evaluation Notice Response at 53. Instead, while confirming that deciBel “underst[ood] the Government's concern about the number of analysts supporting the activities of 14 HWILs executing simultaneously, and for overlapping events and/or test phases,” the protester cited to “actuals from [its] previous work on the . . . [Ballistic Missile Defense System Integration Testing (BIT)] contract” as the basis for its proposed level of effort. Id. Expressing confidence that its proposed level of effort for analysis was sufficient for the two radars included in the BIT contract, deciBel offered a slightly increased level of effort for the analysis of the remaining two radars not covered under the BIT contract. Id. The agency found that this response did not resolve its concern about the low level of effort, with MDA noting that the BIT contract “represented a lower complexity” than the solicited effort because the “BIT contract concluded over 6 years ago before MDA adopted Continuous Developmental Integration [] testing.” AR, Tab 78, PAR at 331. The agency also noted that the two radar systems that were not part of the BIT contract were “significantly more capable radars,” meaning more advanced and complex, than the two that were covered by the BIT contract, so that deciBel's “actuals” from the BIT contract experience would not be an accurate basis of estimate for the work required under those two radars. Id. at 331‑332; see MOL at 48‑49. The agency concluded that “[n]o unique approach or technology was proposed to justify decreased [level of effort] support for future efforts.” AR, Tab 78, PAR at 332. On this record, we find the agency's conclusion reasonable.

Finally, we find unavailing the protester's contention that the agency conducted misleading discussions with respect to this significant weakness. In this regard, the protester complains that the agency misled deciBel into believing that the issue was resolved by raising the issue in the first round of discussions but not in the second round. See Protest at 13; Comments & Supp. Protest at 41‑42.

The agency here conducted two rounds of discussions with deciBel, each time providing an extensive list of evaluation notices addressing various weaknesses, significant weaknesses, and uncertainties identified in deciBel's proposal. See generally, AR, Tab 27a, deciBel's 1st Evaluation Notice; AR, Tab 42a, deciBel's 2nd Evaluation Notice. In the first round of discussions, the agency informed deciBel that its proposed level of analysis effort was a significant weakness because deciBel had not demonstrated that it would be “able to perform overlapping analysis activities of overlapping events in support of the ground test.” AR, Tab 27a, deciBel's 1st Evaluation Notice at 41. As discussed above, the protester's evaluation notice response indicated an understanding of the government's concern about the level of effort for analysis supporting overlapping events and test phases. The response further explained the basis for deciBel's proposed level of effort as “actuals from [its] previous work” on another contract and offered some increase to the level of effort for those radars not covered under the prior contract. AR, Tab 29, deciBel 1st Evaluation Notice Response at 53. As relevant here, the agency's second round of discussions, which included some, but not all, of the issues raised in the previous round, did not include further discussion of the significant weakness for the level of analysis effort. See generally, AR, Tab 42a, deciBel 2nd Evaluation Notice.

As a general matter, discussions, when conducted, must be meaningful--that is, they must identify deficiencies and significant weaknesses that exist in an offeror's proposal‑‑but that requirement is satisfied when an agency leads an offeror into the areas of its proposal that require amplification or revision. See, e.g., Epsilon Sys. Sols., Inc., B-409720, B-409720.2, July 21, 2014, at 16. Moreover, an agency is generally not required to afford an offeror multiple opportunities to cure a weakness remaining in a proposal that previously was the subject of discussions. Jacobs Tech., Inc., B‑422040, Jan. 4, 2024, at 9.

Here, the agency's first round of discussions identified, as a significant weakness under the management factor, deciBel's proposed level of effort for analysis and provided an opportunity for deciBel to address the issue. The agency had no obligation to raise the issue again in subsequent discussions. See Health Net Fed. Servs., LLC, B‑421405.2, B‑421405.3, Aug. 4, 2023, at 15 (finding no obligation for an agency, in discussions, to continue raising an unresolved issue simply because the issue was previously raised, whether once or more than once).[2] Accordingly, we are not persuaded that discussions were misleading merely because the agency failed to repeatedly raise the same significant weakness, having already afforded deciBel an opportunity to cure it.

Cost Realism Adjustment

The protester also challenges the agency's upward adjustment to deciBel's proposed level of effort and staffing levels. Protest at 21‑22; Comments & Supp. Protest at 37‑38. In this regard, the protester argues that the agency mechanically used outdated, historical levels of effort without considering deciBel's “efficient, highly automated, and proven staffing approach that required fewer hours.” Protest at 22. Based on our review of the record, we find no merit to the protester's arguments.

When an agency evaluates a proposal for the award of a cost-reimbursement contract, the offeror's proposed costs are not dispositive because, regardless of the costs proposed, the government is bound to pay the contractor its actual and allowable costs. FAR 15.404‑1(d); AECOM Mgmt. Servs., Inc., B‑418467 et al., May 15, 2020, at 4. Consequently, the agency must perform a cost realism analysis to determine the extent to which the offeror's proposed costs are realistic for the work to be performed. FAR 15.404‑1(d)(1); see MAXIMUS Fed. Servs., Inc., B‑419487.2, B‑419487.3, Aug. 6, 2021, at 16. An agency is not required to conduct an in-depth cost analysis, or to verify each and every item in assessing cost realism; rather, the evaluation requires the exercise of informed judgment by the contracting agency. Id.; see FAR 15.404‑1(c). Moreover, the agency's cost realism analysis need not achieve scientific certainty; rather, the methodology employed must be reasonably adequate and provide some measure of confidence that the proposed costs are reasonable and realistic in view of other cost information reasonably available to the agency at the time of its evaluation. QMX Support Servs., B‑408959, Jan. 6, 2014, at 4‑5. Because the contracting agency is in the best position to make this determination, our review of an agency's cost realism evaluation is limited to determining whether the cost analysis is reasonably based and not arbitrary. Smartronix, Inc., B‑413721.2, Feb. 22, 2017, at 5.

The solicitation here provided for a cost realism analysis to determine the most probable cost of performance. As relevant here, it informed offerors that the cost realism analysis would assess “whether the proposed hour quantities and skill mix, and direct and indirect rates reflect a clear understanding of the requirement, are consistent with the Offeror's approach, and are realistic for the work to be performed.” AR, Tab 8b, Amended Evaluation Criteria at 13. The solicitation further provided that the agency would determine the probable cost by “adjusting each Offeror's proposed cost, and fee when appropriate, to reflect any additions or reductions in cost elements to realistic levels based on the result of the cost realism analysis.” Id.

In conducting the cost realism analysis, the agency's cost/price evaluation team relied on the labor hour adjustments provided by the technical evaluation team after its evaluation of each offeror's “unique approach consider[ing] historical staffing and knowledge of the current [radar system test and analysis contract] staffing workload.” AR, Tab 78, PAR at 330, 332. For deciBel's cost proposal, the technical evaluation team upwardly adjusted deciBel's proposed total of [DELETED] labor hours by [DELETED] hours, based on the weaknesses identified in deciBel's management approach, resulting in a total probable labor hours of [DELETED] hours. Id. at 330. The cost/price evaluation team incorporated these adjusted labor hours across the labor categories supporting the ground and flight testing CLINs, resulting in a significant upward adjustment to deciBel's most probable cost. Id. at 331‑337.

The evaluators explained that these adjustments were based on deciBel's proposed level of effort and the staffing weaknesses identified in its non‑cost/price proposal. Id. at 331‑332. In this regard, the agency noted that “some of the labor hours remained unrealistic” in deciBel's final revised proposal despite the firm having been informed in discussions that its proposed level of effort was found to be unrealistically low for the work to be performed. Id. at 332. The evaluators also provided a rationale for each category of adjustments, noting their assessment of the protester's response to the evaluation notices. Id. at 331‑332. For example, in the rationale for an upward adjustment of [DELETED] hours for sensor ground test participation, the agency noted that deciBel's work under the BIT contract (which formed the basis of estimate for deciBel's proposed level of effort) was not analogous to the level of effort required under the current effort. AR, Tab 78, PAR at 331‑332, citing AR, Tab 29, deciBel 1st Evaluation Notice Response at 53. Moreover, as discussed above, nothing in deciBel's proposal supports its protest assertion that the firm proposed its automation approach as a “labor- and cost-saving” measure that would allow deciBel to perform the requirement with a level of effort much lower than historical levels. See Comments & Supp. Protest at 37‑38.

On this record, we find no basis to conclude that the agency mechanically applied historical staffing levels without considering deciBel's unique technical approach. Instead, we find that the agency, after reviewing the protester's technical and management approach, including its responses to the evaluation notices, reasonably concluded that “[n]o unique approach or technology was proposed to justify decreased [level of effort] support for future efforts.” AR, Tab 78, PAR at 332. We find unobjectionable the agency's decision, on the basis of this conclusion, to upwardly adjust deciBel's labor hours “to reflect the effort required.” See id. The protester's disagreement with the agency's judgment in this regard, without more, does not provide a basis to sustain the protest. See MAXIMUS Fed., supra at 16.

Finally, we find no merit to the protester's various other objections to the agency's cost realism analysis. For example, while deciBel asserts that the agency unreasonably made upward adjustments to labor categories unrelated to the weaknesses identified in deciBel's management proposal, Protest at 22, the agency explains that the adjustments to these additional labor categories were based on the protester's unique approach. COS at 13. Specifically, the agency notes that the cost/price evaluators spread out these additional labor hours across other labor categories (such as computer operator and senior engineer) based on deciBel's proposal to “cross-train personnel (operators, analysts, and responsible engineers)” to meet the requirements. Id.; see AR, Tab 78, PAR at 331. On this record, we find no basis to question these additional adjustments.

We similarly find unavailing the protester's objections to the fact that the agency's cost realism adjustments resulted in deciBel's probable labor hours being greater than the awardee's probable labor hours. See Comments & Supp. Protest at 37. As noted above, the solicitation provided for a cost realism analysis “consistent with the unique methods of performance described in the Offeror's technical and management proposals.” AR, Tab 8b, Amended Evaluation Criteria at 13. The evaluation record shows that the agency considered each offeror's proposed costs based on the offeror's proposed approach, and made realism adjustments based on that analysis, instead of mechanically applying a pre-determined level of effort to both proposals. See AR, Tab 78, PAR at 306‑317, 328‑338. We see no basis to object to the agency's reasonably based cost realism adjustments even where they resulted in different probable labor hour calculations for the two offerors.

Awardee's Price Reasonableness

The protester next challenges the agency's price reasonableness analysis, arguing that MDA erred in finding the awardee's “unreasonably high price” to be fair and reasonable. Protest at 23. DeciBel first argues that the agency's unreasonable upward adjustment to the prices proposed by deciBel and a third offeror resulted in an “artificially inflated and unreliable comparison baseline” for measuring the reasonableness of IERUS's price. Id. The protester also argues that, even if the upward adjustments were warranted, the agency unreasonably relied on price competition alone to determine reasonableness given the “large difference” between offerors' cost/price proposals. Id.

It is a fundamental principle of federal procurement law that procuring agencies must condition the award of a contract upon a finding that the contract contains “fair and reasonable prices.” FAR 15.402(a), 15.404‑1(a); see Crawford RealStreet Joint Venture, B‑415193.2, B‑415193.3, Apr. 2, 2018, at 8. The purpose of a price reasonableness analysis is to prevent the government from paying too high a price for a contract. Id. The manner and depth of an agency's price analysis is a matter committed to the discretion of the agency, which we will not disturb provided that it is reasonable and consistent with the solicitation's evaluation criteria and applicable procurement statutes and regulations. Torrent Techs., Inc., B‑419326, B‑419326.2, Jan. 19, 2021, at 6.

As noted above, the solicitation advised that the agency would evaluate offerors' proposed prices using one or more of the techniques described in section 15.404 of the FAR. AR, Tab 8b, Amended Evaluation Criteria at 12. One of the price analysis techniques described in the FAR is a “[c]omparison of proposed prices received in response to the solicitation,” where “[n]ormally, adequate price competition establishes a fair and reasonable price.” FAR 15.404‑1(b)(2)(i). Consistent with this method, the RFP here provided that, because the agency anticipated adequate price competition, it “expect[ed] to verify price reasonableness by comparing competitively proposed prices.” AR, Tab 8b, Amended Evaluation Criteria at 13.

After receiving proposals, the agency determined that “adequate price competition exists” and, accordingly, verified price reasonableness by comparing the competitively proposed prices. AR, Tab 78, PAR at 300. Specifically, the agency calculated the average total proposed price, the average total probable price, and the average total evaluated price of the three offerors included in the tradeoff, including deciBel and IERUS. COS at 14; AR, Tab 78, PAR at 301. The agency then compared each offeror's proposed, probable, and evaluated prices to the average proposed, probable, and evaluated prices. Id. As a result of this comparison, the agency concluded that deciBel's total evaluated price was “below the average by 12.6 [percent] . . . and IERUS['s] total evaluated price [was] above the average by . . . 9.9 [percent].”[3] AR, Tab 78, PAR at 301. Based on this analysis and finding no other indication that the proposed prices were unreasonable, the agency determined that there was adequate price competition and that each offeror's price was reasonable. Id. On this record, we see no basis to disturb the agency's price reasonableness analysis.
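The comparison method the agency used can be illustrated with a short sketch: each offeror's total evaluated price is measured against the average of the prices included in the tradeoff. The dollar figures below are hypothetical (the actual prices are redacted from the public decision); they are chosen only so that the resulting deviations match the 12.6 percent and 9.9 percent figures reported above.

```python
def deviation_from_average(prices):
    """Return each offeror's percent deviation from the average price.

    Negative values indicate a price below the average; positive
    values indicate a price above it.
    """
    average = sum(prices.values()) / len(prices)
    return {name: (price - average) / average * 100
            for name, price in prices.items()}

# Hypothetical total evaluated prices for the three offerors in the
# tradeoff (actual amounts are redacted); chosen so the average is
# $100M and the deviations mirror the decision's reported percentages.
evaluated_prices = {
    "Offeror A": 87_400_000,   # 12.6% below average (cf. deciBel)
    "Offeror B": 109_900_000,  # 9.9% above average (cf. IERUS)
    "Offeror C": 102_700_000,  # third offeror in the tradeoff
}

for offeror, pct in deviation_from_average(evaluated_prices).items():
    position = "below" if pct < 0 else "above"
    print(f"{offeror}: {abs(pct):.1f}% {position} the average")
```

This mirrors the structure of the agency's analysis as described in the record: compute the average, then state each offeror's position relative to it, rather than relying on the mere existence of competition.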

While the protester relies on our decisions in Technatomy Corp., B‑414672.5, Oct. 10, 2018, and Cognosante MVH, LLC, B‑417111 et al., Feb. 21, 2019, neither of these decisions is applicable here. Specifically, in both Technatomy and Cognosante, we sustained challenges to the price reasonableness analysis because the agency relied on the existence of competition alone without actually comparing the offerors' proposed prices. See Technatomy, supra at 13 (“there is no evidence in the record that the agency compared competitors' prices or acted upon that comparison”); Cognosante, supra at 6 (“The record here does not demonstrate that the [agency] performed any assessment or comparison of final proposal prices.”). In contrast, the agency here compared the offerors' total evaluated prices and determined that the awardee's price was reasonable. The protester's disagreement with the agency's conclusion, without more, does not provide a basis to find the agency's evaluation unreasonable. Accordingly, we deny the protester's allegations in this regard.

The protest is denied.

Edda Emmanuelli Perez
General Counsel


[1] Initially, deciBel also alleged that IERUS's proposed subcontractor has an impaired objectivity organizational conflict of interest (OCI). Protest at 5‑10; Comments & Supp. Protest at 4‑34; Supp. Comments at 3‑26. On January 15, 2026, the agency advised our Office that it had waived any OCIs regarding the award to IERUS and requested that our Office dismiss the OCI allegation as academic. Req. for Partial Dismissal at 1. As a general rule, our Office will dismiss as academic a protest challenging an OCI when the agency elects to waive the OCI. AT&T Gov't Sols., Inc., B‑407720, B‑407720.2, Jan. 30, 2013, at 4. Here, the agency submitted a waiver executed by the agency head, Req. for Partial Dismissal, exh. 1, OCI Waiver at 1‑2, and the protester did not challenge the sufficiency of this waiver under FAR section 9.503. See Resp. to Partial Dismissal Req. at 1‑2. Accordingly, we dismiss the protester's OCI allegations as academic.

[2] In this regard, the protester's reliance on our decision in ASRC Fed. Sys. Sols., LLC, B‑420443, B‑420443.2, Apr. 12, 2022, is inapposite here. In ASRC Federal, we concluded that discussions were misleading where an agency advised the protester that it considered an issue raised in discussions to be resolved but subsequently assigned a weakness on the basis of that issue. Id. at 9. In contrast, the agency here provided no such misleading advice nor made any other representation indicating that any weaknesses raised during prior discussions were resolved.

[3] We also find unpersuasive the protester's assertion that it was somehow improper for the agency to use the total evaluated price--i.e., the price reflecting any cost realism adjustments--for evaluating price reasonableness. The protester's approach would result in an illogical requirement for the agency to assess the reasonableness of a price proposal based on a proposed price that does not reflect the actual anticipated cost to the government. The solicitation here informed offerors that the agency would perform a cost realism analysis to determine the most probable cost and then use the most probable cost to calculate the total evaluated price for tradeoff purposes. AR, Tab 8b, Amended Evaluation Criteria at 13. Accordingly, we find that the agency properly used the offerors' total evaluated prices in its price reasonableness evaluation.
