
ManTech TSG-1, Joint Venture

B-411253.7, B-411253.8  Mar 01, 2017

Highlights

ManTech TSG-1 Joint Venture (MTTJV), of Fairfax, Virginia, protests the award of a contract to Jacobs Technology, Inc., of Tullahoma, Tennessee, by the Department of the Army, Army Materiel Command, under request for proposals (RFP) No. W91RUS-13-R-0008 for scientific and engineering support services for the Army's electronic proving ground at Fort Huachuca Military Installation and elsewhere in Arizona, as well as contingent support in other locations. The protester primarily argues that its proposal was misevaluated and that the Army made an unreasonable selection decision.

We deny the protest.


DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  ManTech TSG-1, Joint Venture

File:  B-411253.7; B-411253.8

Date:  March 1, 2017

Paul F. Khoury, Esq., Brian G. Walsh, Esq., Tara L. Ward, Esq., and J. Ryan Frazee, Esq., Wiley Rein LLP, for the protester.
Robert J. Symon, Esq., Elizabeth A. Ferrell, Esq., Aron C. Beezley, Esq., and Lisa A. Markman, Esq., Bradley Arant Boult Cummings LLP, for Jacobs Technology, Inc., the intervenor.
Debra J. Talley, Esq., and Susan D. Denley, Esq., Department of the Army, for the agency.
Paul N. Wengert, Esq., and Tania Calhoun, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protest that agency misevaluated protester’s proposal and applied unstated evaluation criteria is denied where the record shows that the evaluation was reasonable and consistent with the evaluation criteria, and where the selection decision tradeoff provided a rational explanation for the agency’s selection of the awardee’s higher-rated proposal at a higher evaluated cost/price. 

DECISION

ManTech TSG-1 Joint Venture (MTTJV)[1], of Fairfax, Virginia, protests the award of a contract to Jacobs Technology, Inc., of Tullahoma, Tennessee, by the Department of the Army, Army Materiel Command, under request for proposals (RFP) No. W91RUS-13-R-0008 for scientific and engineering support services for the Army’s electronic proving ground at Fort Huachuca Military Installation and elsewhere in Arizona, as well as contingent support in other locations.  The protester primarily argues that its proposal was misevaluated and that the Army made an unreasonable selection decision. 

We deny the protest. 

BACKGROUND

The RFP, issued on February 6, 2014, sought proposals to provide on-site contractor support for scientific and engineering services at the United States Army Electronic Proving Ground under a hybrid cost-plus-award-fee/fixed-price contract for a 2-month phase-in, a 22-month base period, a 24-month option, and a 12‑month option.  RFP at 3-11, 272.[2]

The statement of work (SOW) described the general types of services required, ranging from mission command and network systems testing to cyber security testing, intelligence systems developmental test support, and electromagnetic environmental effect testing.  RFP at 33-40.  The RFP explained that “[a] majority of the work efforts . . . under this contract will be assigned by the Government in the form of individual Work Assignment Orders (WAOs) . . . .”[3]  RFP at 165 (¶ J.4).  Accordingly, the SOW specified that

The contractor shall provide personnel as required to meet variable workload demands, as required by work authorization orders (WAO).  Variations in the total number of active test projects and other assigned responsibilities require that the contractor make provisions for accommodating a range in work-force size without adversely affecting performance. 

RFP at 47 (¶ C-2.14.7). 

As amended during the course of the procurement, the RFP stated that a contract would be awarded to the firm whose proposal provided the best value, considering five evaluation factors:  mission support, management, past performance, small business participation, and cost/price.  Id. at 274-75.  The management factor was slightly more important than the mission support factor.  Each was more important than the individual factors of past performance, small business participation, and cost/price, which were equally important.  Id.  The non-cost/price factors, when combined, were significantly more important than cost/price.  Id. at 274. 

The RFP identified two equally-weighted subfactors under the mission support factor--technical approach and technical expertise--and three equally-weighted subfactors under the management factor--management structure and corporate experience, phase-in, and financial system capability.  Id.  The mission support and management factors and their subfactors were to be rated adjectivally, using the ratings of outstanding, good, acceptable, marginal, and unacceptable.  Id. at 276‑78.  None of the other factors are at issue here. 

The Army received proposals from eight offerors, including MTTJV and Jacobs.  AR at 4.  On March 10, 2015, the Army awarded a contract to a third offeror, and four competitors filed protests (three with our Office and one with the contracting officer).  Id.  The Army took corrective action, held discussions, amended the RFP, and received and evaluated revised proposals.  Id.  On December 1, 2015, the Army awarded the contract to Jacobs, and two competitors protested.  Id.  The Army again took corrective action and requested proposal revisions.  Id. at 5.  After evaluating the revised proposals, the Army sent each offeror a summary of its evaluation and allowed them to submit final proposal revisions (FPR) by October 11, 2016.  Id.  As reflected in the contemporaneous record, the relevant results of the final evaluation were:

 

                                      Jacobs                    MTTJV

Mission Support                       Good                      Acceptable
  Technical Approach                  Outstanding               Acceptable
                                      1 significant strength    3 strengths
                                      4 strengths               2 weaknesses
  Technical Expertise                 Good                      Acceptable
                                      4 strengths               2 strengths
Management                            Outstanding               Acceptable
  Management Structure/               Outstanding               Acceptable
  Corporate Experience                6 strengths               1 strength
  Phase-in                            Outstanding               Acceptable
                                      1 significant strength    2 strengths
                                      1 strength
  Financial System Capability         Outstanding               Good
                                      7 strengths               3 strengths
Past Performance
  Overall Relevance                   Relevant                  Very Relevant
  Overall Confidence                  Substantial Confidence    Substantial Confidence
Small Business Participation          Acceptable                Acceptable
Evaluated Cost/Price                  $132.1M                   $125.3M


AR Tab 22, Source Selection Evaluation Board Report, at 21-41. 

A Source Selection Advisory Council (SSAC) reviewed the evaluation results for each offeror and recommended Jacobs for award.  The SSAC provided a briefing to the source selection authority (SSA) that described the offerors’ strengths, weaknesses, and adjectival ratings under the non-price factors and subfactors.  AR Tab 61, SSAC Report Briefing Slides, at 17-75.  The SSAC then reviewed the offerors’ evaluated cost/prices from the prior award, and the amount by which each had changed.  The SSAC’s analysis identified MTTJV as one of two offerors that had made significant reductions to their cost/prices since the first source selection decision.  In both cases, the SSAC pointed to reduced staffing for the instrumentation and software development requirement.  Id. at 76-77.  The SSAC stated that MTTJV had provided an “unsatisfactory explanation of the effects” of lowering the staffing for that requirement from its earlier proposal.  Id. at 77.  In contrast, the SSAC noted that the total evaluated cost/price for Jacobs was nearly unchanged from its initial proposal.  Id. at 78. 

After setting forth the detailed strengths and weaknesses of the offerors, the SSAC compared Jacobs to the other offerors in terms of adjectival ratings and evaluated cost/price.  Id. at 79-80.  For each offeror except Jacobs, the SSAC recommended that the offeror be eliminated from further consideration for award based on its lower ratings.[4]  Id.  The SSAC then reviewed specific evaluated strengths for Jacobs’s FPR.  Id. at 81‑82.  The SSAC concluded by noting that Jacobs’s adjectival ratings were superior to other offerors, that its cost/price had been “the most stable and consistent” throughout the procurement, and that it had remained the “[m]ost consistent proposal . . . since inception when considering Mission Support and Management factors.”  Id. at 83. 

The SSAC’s briefing for the SSA consolidated the evaluation of each offeror’s proposal and the SSAC’s recommendation that all other offerors be eliminated from competition and award made to Jacobs.  See generally, id.  The SSA reviewed the SSAC briefing and the evaluation summary.  She prepared a source selection decision, which reiterated the evaluation summary for each offeror and discussed whether Jacobs’s FPR provided sufficient technical superiority to justify award at its higher price compared to MTTJV and the other remaining competitors.  The SSA concluded that Jacobs’s proposal was the best value, and selected it for award.  AR Tab 21, Source Selection Decision Document (SSDD), at 10-11.  On November 14, the Army announced the award to Jacobs and this protest followed. 

ANALYSIS

MTTJV challenges the evaluation of its proposal under the mission support and management factors.  The protester argues that the Army unreasonably assigned weaknesses to its proposal based on unstated evaluation criteria, and failed to recognize as strengths several aspects of its proposal that were identified as important strengths in Jacobs’s proposal.  As a result, MTTJV argues that the Army unreasonably assigned MTTJV lower adjectival ratings, and then improperly relied on those adjectival ratings in the source selection decision as the basis to justify awarding the contract to Jacobs at its higher evaluated cost/price. 

We note at the outset that the evaluation of proposals is largely a matter within the contracting agency’s discretion; our Office’s role is not to reevaluate proposals, but to examine the record to determine whether the agency’s judgment was reasonable and consistent with the stated evaluation criteria and applicable procurement statutes and regulations.  Hi-Way Paving, Inc., B-410662, Jan. 21, 2015, 2015 CPD ¶ 50 at 3.  In making a best-value tradeoff between non-price factors and cost/price, a source selection authority has broad discretion, subject only to the tests of rationality and consistency with the evaluation criteria.  The propriety of the price/technical tradeoff decision turns on whether the selection official’s judgment concerning the significance of the difference in the technical ratings was reasonable and adequately justified.  Johnson Controls World Servs., Inc., B-289942, B-289942.2, May 24, 2002, 2002 CPD ¶ 88 at 6.  Rating systems such as point scores or adjectival ratings, however, are merely guides to intelligent decisionmaking; they cannot take the place of a reasoned tradeoff decision.  Science Applications Int’l Corp., B-407105, B-407105.2, Nov. 1, 2012, 2012 CPD ¶ 310 at 7; Shumaker Trucking & Excavating Contractors, Inc., B-290732, Sept. 25, 2002, 2002 CPD ¶ 169 at 6. 

We address a selection of MTTJV’s evaluation challenges as examples, and we consider the arguments over the source selection rationale below.  We have also reviewed several other arguments raised by MTTJV, such as its contentions that the evaluation reflected unequal treatment; although we do not address them in detail here, we have determined that they also lack merit. 

Mission Support Factor

Under the technical approach subfactor, MTTJV argues that the Army unreasonably assigned a weakness to its proposal based on concerns that the firm’s proposed staffing was “[DELETED],” and thus posed a risk of unsuccessful contract performance.  Protest at 16-18.  In this regard, the RFP indicated that the Army would evaluate whether the proposal showed the

ability to demonstrate how it plans to perform multiple simultaneous technical requirements at different locations while managing personnel and responding to potential problem areas; the extent to which the offeror demonstrates a thorough understanding of the required safety measures necessary to support multiple complex and hazardous tests in remote environments and proposes an effective approach to risk identification and mitigation; the offeror’s ability to manage shifts of workload based on the proposed management structure and the offeror’s approach of managing the work authorization order (WAO) process to include the cross-utilization of personnel, optimizing core experience and expertise across the WAOs . . . . 

AR at 6 (citing RFP at 275). 

MTTJV’s initial proposal identified a risk in its staffing, noting that the “biggest issue is [DELETED] contractor support in many capabilities across the facilities.  Specifically, some facility SME [subject matter experts] leads have [DELETED] on station.”  AR Tab 40, MTTJV Mission Support FPR (Redlined Version), at 15 (text shown as deleted from FPR).  During discussions, the Army expressed concern with the firm’s solution to [DELETED] staffing for facility leads.  AR Tab 19, Second Evaluation Summary for MTTJV, at 3.  In response, MTTJV’s FPR added an explanation describing its plans for cross-training and cross-utilization, identifying specific personnel to serve as backups for facility leads, and noting that it had access to [DELETED] testing and evaluation professionals with a “reach-back capability [to] provide[] continuity of effort in the event ‘[DELETED]’ SMEs are not available.”  AR Tab 39, MTTJV Mission Support FPR, at 12.  MTTJV further explained that in 2013, it had “received reach-back support from [an affiliated company] to help design test cases for DCGS-A [Distributed Common Ground System-Army] tests at USAEPG.”  Id. at 45.  MTTJV also stated that it had used its reach-back resources to obtain [DELETED] full-time equivalents for 5 months, at significant cost savings, to support fleet milestones, and had brought in an SME to develop scenarios and to lead panels of senior experts and customers to facilitate a technical exchange of analysis.  Id. at 47. 

In evaluating the FPR, the Army determined that the weakness should remain.  While the evaluators agreed that the FPR had addressed the concern over facility team lead staffing, they also concluded that MTTJV had not addressed “critical SMEs unique to each facility which may not necessarily be the facility leads . . . or if these positions are [DELETED].”  AR Tab 18, FPR Evaluation Summary for MTTJV, at 2.  MTTJV argues that the assessment of a weakness was unreasonable because it provided an adequate technique to address the potential risk, and showed successful use of that technique on the incumbent contract.  Further, MTTJV argues that the agency’s concern over the staffing of other than facility SME leads constituted the application of unstated evaluation criteria and was based on speculation.  Protest at 18-20. 

The Army counters that the evaluators properly could not consider information outside of MTTJV’s proposal in determining whether its efforts to mitigate the risk of [DELETED] staffing had been resolved.  AR at 11.  In response to MTTJV’s argument that the evaluation of this weakness was based on unstated evaluation criteria, the Army argues that it properly found that MTTJV had only addressed part of the agency’s concern over the firm’s explanation of the risk to performance of [DELETED] staffing.  Id. at 12. 

We see no basis to conclude that the evaluation of this weakness was unreasonable.  As explained above, MTTJV identified the risk of [DELETED] staffing “in many capabilities across the facilities” and specifically recognized the risk of a staffing approach in which “some facility SME leads have [DELETED] on station.”  AR Tab 40, MTTJV Mission Support FPR (Redlined Version), at 15.  While MTTJV argues that it properly mitigated that risk--principally by identifying a specific person capable of serving as the backup for one position at each facility--that approach did not address contingency planning for other critical facility SMEs, which was the basis for retaining the weakness.  In our view, the Army reasonably determined that its concerns were not completely addressed in MTTJV’s FPR.  We also disagree that the Army relied on unstated evaluation criteria in assessing a weakness for MTTJV’s planning for backups for other SMEs, beyond the facility leads.  Offerors were required to address their “plans to perform multiple simultaneous technical requirements at different locations while managing personnel and responding to potential problem[s].”  See RFP at 275.  On this record, we regard the agency’s evaluation as both reasonable and consistent with the criteria in the RFP. 

MTTJV next argues that the agency improperly assigned its proposal a weakness for failing to provide an analysis or explanation of the reduction in labor hours it included in its FPR for the base period of the contract.  Protest at 14. 

The Army explains that MTTJV reduced the number of full-time equivalents from [DELETED] in its initial proposal to [DELETED] in its FPR, and that more than half of these reductions came from tasks that the agency characterized as critical.  AR Tab 18, FPR Evaluation Summary for MTTJV, at 3.  The agency found that reducing the number of skilled and experienced positions impeded MTTJV’s ability to perform the mission.  The Army also noted that the proposal’s failure to provide an analysis or explanation of how the reduction benefited the mission increased the risk of unsuccessful contract performance.  Id. 

MTTJV argues that it did not provide an explanation in its FPR because its proposal indicated that it used the same process for labor-hour estimating in its FPR as it had in its initial proposal.[5]  Id. at 23.  In our view, this argument actually underscores the reason for the agency’s concern:  MTTJV used the same process to estimate its hours, but arrived at a very different result, with no explanation for the difference.  The protester further argues that over the course of the procurement, from the March 2014 initial proposals until the October 2016 final FPR submission, the firm’s work as the incumbent contractor allowed it to become more experienced and more efficient.  Id.  However, none of this explanation actually appears in the protester’s FPR.  An offeror is responsible for submitting a well-written proposal with adequately detailed information that allows for meaningful review by the procuring agency.  Abacus Tech. Corp.; SMS Data Prods. Group, Inc., B-413421 et al., Oct. 28, 2016, 2016 CPD ¶ 317 at 19.  As a result, we have no basis to question the reasonableness of the agency’s evaluation. 

The protester also argues that the Army’s final evaluation improperly removed a strength that had been assigned to its proposal during the initial evaluation concerning the firm’s cost reduction efforts.  Protest at 21.  MTTJV argues that the strength was based on its performance as the incumbent and that its successful efforts as the incumbent did not disappear during the corrective action period.  Id. 

The strength assigned to MTTJV’s initial proposal was based on its cost-reduction efforts from the incumbent contract, which were achieved through the use of [DELETED] personnel to supplement and bolster EPG’s core test support work force during peak labor periods.  AR Tab 20, MTTJV Initial Proposal Evaluation Summary, at 4.  The agency explains that, when MTTJV revised its cost proposal the first time, it reduced the number of labor hours for full-time personnel while increasing the proposed number of [DELETED] personnel.  The evaluators found that this change negated the initial strength and increased performance risk because, with the increased number of [DELETED] personnel, there was no longer a potential for cost reductions by using [DELETED].  AR Tab 19, Second Evaluation Summary for MTTJV, at 4; AR Tab 18, MTTJV Final Evaluation Summary, at 4.  The agency explains that the strength assigned to MTTJV’s initial proposal was based on the balance of full-time and [DELETED] personnel in meeting the requirement.  When MTTJV revised its cost proposal, the agency concluded that it had created a performance risk because of its increased reliance on [DELETED] personnel to perform critical functions.  AR at 15. 

MTTJV’s argument that its successful efforts as the incumbent did not disappear during the corrective action period fails to recognize that the strength was forward-looking.[6]  That is, the initial proposal provided a basis for the evaluators to conclude that the government would receive the same benefits in the future as in the past.  However, when MTTJV revised its proposal, the agency concluded that any benefit in cost reduction was now accompanied by a risk, and that the proposal no longer warranted the strength.  MTTJV has given us no basis to find that this conclusion was unreasonable. 

Finally, under the technical expertise subfactor, MTTJV contends that its proposal was evaluated unequally compared to that of Jacobs.  Specifically, the protester argues that both firms proposed similar access to SMEs, but only Jacobs’s proposal was evaluated as having a strength.  This argument has no merit.

Where a protester alleges that a technical evaluation is the product of unequal treatment, the protester must show that the differences in ratings were, in fact, the result of unequal treatment, rather than differences in the offerors’ proposals.  See Northrop Grumman Sys. Corp., B-406411, B-406411.2, May 25, 2012, 2012 CPD ¶ 164 at 8.  Here, the RFP required all offerors to propose access to SMEs.  E.g., RFP at 25 (SMEs “serve as technical consultants to project managers”), 32‑33 (SMEs provide input, coordination, and comments on test operating procedures), 46 (SMEs included in tier 3 support for complex hardware and operating system software).  The agency assigned a strength to Jacobs’s proposal not just because it proposed access to SMEs, but because it proposed access to several highly-skilled technical fellows with expertise in specified areas.  The Army concluded that this feature would be beneficial for addressing difficult technical problems.  AR Tab 22, Source Selection Evaluation Board Report, at 25. 

While MTTJV argues that it also proposed access to SMEs, it makes the argument generally, without identifying specific aspects of its proposal to show that it exceeded the RFP requirement to an extent that was equivalent to Jacobs’s FPR.  As a result we have no basis to find unequal treatment in the evaluation. 

Management Factor

Under the management structure and corporate experience subfactor, MTTJV also argues that the agency evaluated its proposal unequally compared to the Jacobs evaluation.  MTTJV contends that the agency credited Jacobs for its internal quality assurance process for management but made no mention of MTTJV’s quality management system.  Protester’s Comments at 4-5.  As with the firm’s unequal treatment argument above, the protester’s allegation here lacks merit.

The RFP required all offerors to address certain elements, as a minimum, concerning quality control.  RFP at 263-64, 277.  The agency states that MTTJV’s quality management system, unlike Jacobs’s, was deemed to have merely met the requirements.  Supplemental AR at 6.  MTTJV argues that its own quality control procedures were detailed and “top notch,” and disputes that Jacobs’s proposal could have been materially better.  Protester’s Comments at 5; Protester’s Supplemental Comments at 6.  The protester asserts that it used terminology similar to that used by the evaluators in assigning a strength to Jacobs’s approach (such as having a formalized system, supporting on-site quality control and process improvement, using metrics, and enhancing efficiency and effectiveness of services) to describe its own quality assurance approach.  Id.  We are not persuaded that these superficial similarities are sufficient to call into question the agency’s assignment of a strength for Jacobs’s quality assurance approach.  Northrop Grumman Sys. Corp., supra.

Under the phase-in subfactor, MTTJV’s supplemental protest similarly argues that its proposal was evaluated unequally compared to the Jacobs proposal, reasoning that, as the incumbent, its phase-in approach should have been rated at least as highly as Jacobs’s.  Protester’s Comments at 6.  In its supplemental agency report, the agency responded to the firm’s argument in detail, emphasizing the narrative explanation provided by the evaluators for rating Jacobs’s proposal highly.  Supplemental AR at 6-7.  In its comments on the supplemental report, MTTJV does not meaningfully challenge the agency’s rebuttal.  We will deny a protest where the protester fails to meet its burden of showing that an evaluation was unreasonable; mere disagreement with the agency, without more, does not suffice.  Globecomm Sys., Inc., B-405303.2, B-405303.3, Oct. 31, 2011, 2011 CPD ¶ 243 at 9; see also 22nd Century Techs., Inc., B-412547 et al., Mar. 18, 2016, 2016 CPD ¶ 93 at 10 (issues not rebutted in comments are deemed abandoned). 

Source Selection Decision

MTTJV argues that the source selection decision was unreasonable because it was based on the evaluation errors discussed above.  The protester also contends that the selection was unreasonable because, in recommending Jacobs for award, the SSAC emphasized a consideration unrelated to the evaluation criteria in the RFP:  that Jacobs’s proposal had been the “most stable and consistent” of all the offerors throughout the procurement.  Supplemental Protest at 7-8.  The protester also argues that the SSA over-relied on the SSAC’s views, which focused on the strengths of Jacobs’s FPR in recommending award to it, while disregarding advantages of MTTJV’s FPR. 

In making a source selection decision, the SSA need not review the proposals or complete evaluation documentation, but instead can rely upon a briefing that presents the results of the evaluation.  Sabreliner Corp., B-242023, B-242023.2, Mar. 25, 1991, 91‑1 CPD ¶ 326 at 11.  Furthermore, a source selection decision need not address and discuss every evaluated strength and weakness of the competing proposals; rather, the decision must sufficiently document the rationale for business judgments and tradeoffs made by the SSA, and the benefits associated with additional costs.  E.g., TeleCommunication Sys., Inc., B‑408269.2, Dec. 13, 2013, 2013 CPD ¶ 291 at 4; see also FAR § 15.308. 

With respect to MTTJV’s alleged evaluation errors, as discussed above, we find no flaw in the evaluation, and thus no related error in the best-value tradeoff.  With respect to the argument that the best-value tradeoff improperly considered the stability and consistency of Jacobs’s FPR, MTTJV acknowledges that the SSA did not include that consideration in explaining the source selection decision rationale, but argues that given the SSA’s reliance on the SSAC’s conclusions in other respects, the allegedly improper consideration of stability and consistency affected the SSA’s judgment. 

We recognize that, as MTTJV argues, much of the SSA’s stated rationale repeats the SSAC’s evaluation conclusions.  Nevertheless, we cannot adopt MTTJV’s argument that the SSAC’s application of an allegedly unstated and contradictory criterion should be presumed to have affected the SSA’s judgment.  In this regard, the SSA’s source selection decision makes no reference to the issue.  To the contrary, the record here adequately documents the SSA’s independent judgment--including omission of the SSAC’s consideration of proposal stability or consistency--and properly identifies aspects of Jacobs’s proposal that the SSA considered in determining that Jacobs’s proposal offered the best value to the government despite its higher evaluated cost/price.  Accordingly we deny MTTJV’s challenge to the best-value tradeoff. 

The protest is denied. 

Susan A. Poling
General Counsel



[1] The protester states that one of its joint venturers is the incumbent contractor.  Protest at 1. 

[2] Except where specifically indicated, citations to the RFP in this decision are to the conformed RFP provided at Tab 3 of the Agency Report (AR) exhibits.  The RFP also provided an option for up to 6 additional months of services per Federal Acquisition Regulation (FAR) clause 52.217-8.  RFP at 147. 

[3] The Army explains that WAOs are issued by Army Electronic Proving Ground (EPG) personnel to assign test services work to the contractor, and to track and bill the costs incurred.  However, WAOs are not issued by a contracting officer, and are not task orders.  Id. at 165; Email from Army Counsel to GAO, Feb. 3, 2017, at 1. 

[4] For one of these offerors, the SSAC also identified its higher price as supporting its elimination from consideration.  Id.

[5] MTTJV also argues that neither the RFP nor the agency required a certain level of effort and that the agency is improperly holding the firm to an unstated level of effort.  However, it is readily apparent from the record that the agency’s concern was, in fact, that MTTJV did not explain why its level of effort changed. 

[6] Here, and in other instances, MTTJV has argued that the contemporaneous record lacks a sufficient explanation for the Army’s evaluation judgments, and thus that the justifications the agency has offered in support of the evaluation during this protest should be afforded little weight.  However, in these circumstances, our Office will consider post-protest explanations because they provide a detailed rationale for contemporaneous conclusions and simply fill in previously unrecorded details.  Global Integrated Sec. (USA) Inc., B-408916.3 et al., Dec. 18, 2014, 2014 CPD ¶ 375 at 11; recon. denied, Global Integrated Sec. (USA) Inc.--Recon., B-408916.6, Sept. 30, 2015, 2015 CPD ¶ 305. 
