DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.
Matter of: M7 Aerospace LLC
File: B-411986; B-411986.2
Date: December 1, 2015
Protest is sustained in a negotiated procurement for award on a best-value basis, which provided for a comparative, qualitative evaluation of proposals, where there is no evidence in the contemporaneous record showing that the agency performed a qualitative assessment of the merits of the offerors’ differing approaches in connection with its source selection decision.
M7 Aerospace LLC, of San Antonio, Texas, protests the award of a contract to PAE Aviation and Technical Services d/b/a Defense Support Services LLC (DS2), of Marlton, New Jersey, under request for proposals (RFP) No. N00019-13-R-0092, issued by the Department of the Navy for maintenance and logistics support services for the agency’s fleet of F-5 Adversary aircraft. M7 argues that the agency misevaluated proposals and made an unreasonable source selection decision.
We sustain the protest.
The RFP contemplates the award, on a best-value basis, of an indefinite-delivery, indefinite-quantity contract for a base year and four 1-year options to perform organizational, selected intermediate, and limited depot-level aircraft maintenance, management, and logistics support services for the Navy’s fleet of variously configured F-5 Adversary aircraft located at the Naval Air Station, Fallon, Nevada; the Marine Corps Air Station, Yuma, Arizona; and the Naval Air Station, Key West, Florida. The RFP advised offerors that the agency would evaluate proposals considering price and two non-price factors, technical and past performance. RFP at 174. The technical and past performance factors were equal in importance; each was more important than price, and the two combined were significantly more important than price. Id. Each of the non-price factors included an enumerated list of elements that the RFP provided the agency would consider in connection with its evaluation. Id. at 175-76.
For the technical factor, the RFP provided that the agency would evaluate proposals considering the following elements: phase in/transition plan, staffing and manning levels, maintenance and maintenance management, management control systems, material management, and the offeror’s strategy for utilizing various categories of small businesses. RFP at 175. For the past performance factor, the RFP provided that the agency would evaluate proposals considering the following elements: meeting technical requirements, meeting schedule requirements, controlling contract cost, managing the contracted effort on similar programs, and small business utilization. Id.
For the evaluation of proposals under the technical factor, the RFP advised offerors that the agency would assign both an adjectival rating of outstanding, good, acceptable, marginal or unacceptable; and a risk rating of low, moderate or high. RFP at 177. The RFP included detailed definitions for each adjectival rating and risk rating. Id.
For past performance, the RFP advised that the agency would evaluate the offerors’ past performance examples for relevancy, and would assign performance confidence ratings based on consideration of both relevancy and quality of performance of the offeror’s past performance examples. RFP at 175-76, 178. The RFP included detailed definitions of the following past performance confidence ratings: substantial confidence, satisfactory confidence, limited confidence, no confidence or unknown (neutral) confidence. RFP at 178.
Finally, as to price, the RFP contemplates the award of a contract that includes fixed-price, cost-plus-fixed-fee, and cost-reimbursable elements. RFP at 176. The RFP advised offerors that all three types of contract elements would be evaluated for reasonableness and completeness, and also would be evaluated to determine whether the offeror had a clear understanding of the solicited requirements, and to ensure that the proposal did not include material imbalances between the contract line items and subline items. Id. In addition, the RFP provided that the fixed-price elements would be evaluated to ensure consistency between the price and technical proposals, and stated that any inconsistency between an offeror’s technical and price proposal for the fixed-price elements could result in the assignment of a proposal risk under the technical evaluation factor. Id. The RFP also provided offerors detailed information regarding how the agency would calculate total evaluated price or cost for the various contract elements, as well as how the agency would calculate overall evaluated price. Id.
In response to the RFP, the agency received six proposals, including those of the protester and DS2. The agency evaluated the proposals and included all six in the competitive range. The agency then engaged in discussions with the offerors and solicited, obtained and evaluated final proposal revisions (FPRs).
At the conclusion of its evaluation of FPRs, the agency assigned all six proposals an adjectival rating of acceptable and a risk rating of low risk under the technical evaluation factor. Agency Report (AR), exh. 26, Source Selection Decision Document (SSDD) at 1. In reaching this conclusion, the agency specifically found that none of the proposals included any strengths, risk reducers, significant weaknesses, uncertainties, or deficiencies. Id. The agency therefore concluded that all six proposals were technically equal, and that there was no basis to discriminate between the proposals under the technical evaluation factor.
Under the past performance factor, the record shows that the agency assigned the protester a satisfactory confidence rating and the awardee a substantial confidence rating. AR, exh. 26, SSDD, at 2. The record shows further that one other firm was assigned a substantial confidence rating, and that the remaining offerors were assigned satisfactory confidence ratings. Id.
The record shows that the awardee had the lowest price among all six offerors (approximately $181.5 million); that the protester had the second-low price (approximately $219.7 million); and that the other firm that had been assigned a substantial confidence rating had the third-low price (that offeror’s price was not included in the record). AR, exh. 26, SSDD, at 2. The record shows that the agency made a comparison of the awardee’s and the third-low offeror’s proposals (because both firms had been assigned the same technical and past performance ratings), and concluded that award to DS2 represented the best value to the government because DS2’s price was lower. Id. The agency also noted--without making a direct comparison of the two proposals from a non-price standpoint--that award to DS2 represented a savings of approximately 21 percent compared to making an award to the protester. Id. After being advised of the agency’s source selection decision and requesting and receiving a debriefing, M7 filed the instant protest.
M7 argues that the record here is completely lacking in documentation to support the agency’s evaluation finding that all six proposals--and more specifically its own and the awardee’s proposals--were technically equivalent. In this connection, the protester points out that the contemporaneous record does not include any information showing that the agency critically analyzed the comparative quality of the proposals submitted, or otherwise considered the offerors’ respective technical approaches to meeting the agency’s requirements. M7 asserts that the RFP included a “performance based” statement of work, under which offerors were required to propose a technical solution and an accompanying staffing approach to meeting the agency’s requirements. M7 further notes that the awardee proposed significantly fewer staff to perform the contract than it proposed. M7 argues that DS2’s proposed approach presents a significant risk to successful performance of the contract because of its inadequate staffing, and that the agency’s evaluation simply did not take cognizance of this fact, or critically analyze the impact of DS2’s substantially lower proposed staffing. M7 concludes that the agency’s evaluation here converted the acquisition from a best-value type acquisition to a lowest-priced, technically acceptable (LPTA) type acquisition.
In reviewing an agency’s evaluation of proposals and source selection decision, we examine the supporting record to determine whether the decision was reasonable, and in accordance with the RFP’s evaluation criteria, along with applicable procurement statutes and regulations. Cherry Rd. Techs.; Elec. Data Sys. Corp., B‑296915 et al., Oct. 24, 2005, 2005 CPD ¶ 197 at 6. The agency must have adequate documentation to support its judgment. Systems Research & Applications Corp.; Booz Allen Hamilton, Inc., B-299818 et al., Sept. 6, 2007, 2008 CPD ¶ 28 at 11-12. Where an agency fails to document or retain evaluation materials, it bears the risk that there may not be adequate supporting rationale in the record for our Office to conclude that the agency had a reasonable basis for its source selection decision. Id.
In addition, where, as here, the RFP contemplates that the relative merits of the proposals will be qualitatively compared, the evaluation may not properly be limited to determining whether proposals are merely technically acceptable. Rather, proposals should be further differentiated to distinguish their relative quality under each stated evaluation factor. Systems Research & Applications Corp.; Booz Allen Hamilton, Inc., supra at 24.
Finally, we note that adjectival or point score evaluation ratings are merely guides to intelligent decision making. Metis Solutions, LLC, et al., B-411173.2 et al., July 20, 2015, 2015 CPD ¶ 221 at 13. Evaluators and source selection officials are required to consider the underlying bases for the ratings assigned, including the advantages and disadvantages associated with the specific content of competing proposals. In this connection, Federal Acquisition Regulation § 15.308 specifically requires a documented decision based on a comparative assessment of proposals against all source selection criteria in the solicitation. Systems Research & Applications Corp.; Booz Allen Hamilton, Inc., supra. While a comparative assessment might be made in the underlying documents upon which the selection decision relies, or in the selection decision itself, it must be documented and reviewable.
Here, the record of the agency’s source selection produced in response to this protest consists of consensus evaluation materials for the protester and the awardee, including a source selection evaluation board (SSEB) report, a proposal analysis report (PAR) prepared by the source selection advisory committee (SSAC), two sets of briefing slides (one for the SSAC and one for the source selection authority (SSA)), and the source selection decision document (SSDD) itself. These materials factually describe each offeror’s proposal, but do not discuss to any meaningful degree the advantages or disadvantages of each offeror’s proposed approach, or the comparative differences between the proposals. Simply stated, there is no qualitative assessment or critical analysis of the relative merits of the offerors’ respective, differing, technical approaches.
In this regard, the SSEB report includes a brief factual description of M7’s and DS2’s technical proposals under each of the technical evaluation elements. These descriptions are largely identical, and vary only slightly based on factual differences between the proposals.
In describing both proposals under the phase in/transition plan, maintenance management, key personnel and small business management plan elements of the technical evaluation factor, the SSEB report for both proposals is word-for-word identical, except for differences in the number of days each firm proposed for phase in (M7 proposed [deleted] days while DS2 proposed [deleted] days), and the small business subcontracting goal each firm proposed (M7 proposed a goal of [deleted] percent, while DS2 proposed a goal of [deleted] percent). AR, exh. 22, SSEB Report at 6, 14.
In describing each firm’s proposal under the staffing element, the SSEB report for each proposal also is virtually identical, except for numeric differences in each offeror’s proposed staffing, along with a brief description of a minor discrepancy between DS2’s proposed staffing in its price proposal versus its technical proposal (DS2’s price proposal included four more full-time equivalents than its technical proposal). AR, exh. 22, SSEB Report at 6, 14.
Finally, under the management control systems element, the SSEB report is also virtually identical, except for specifying the name of the different management control systems proposed by each firm. AR, exh. 22, SSEB Report at 6, 14.
The remainder of the SSEB report for the two proposals is word-for-word identical. Significantly, the SSEB report for both offerors’ proposals includes the following identical statement of the agency’s rationale for assigning an acceptable adjectival rating and a low risk rating to the two proposals:
5.3 Technical Factor Technical Rating and Risk Assessment Rationale
The findings indicate the proposal meets the government’s requirements and has an adequate approach, and adequate understanding of the requirements. Therefore, the proposal technical rating is ACCEPTABLE.
No significant weaknesses or risk reducers were noted in the proposal. The proposal has little potential to cause disruption of schedule, increased cost or degradation of performance. Normal contractor effort and normal Government monitoring will likely be able to overcome any difficulties. Therefore, the proposal risk is rated LOW.
AR, exh. 22, SSEB Report, at 7, 15.
The PAR is similarly devoid of any meaningful analysis of the comparative merits of the proposals. In comparing all six proposals, the PAR provides, in its entirety, as follows:
In the evaluation of the Technical factor, a technical rating and a risk assessment rating were assessed for each Offeror. Each Offeror’s proposal received a technical rating of Acceptable because each proposal met the requirements and indicated an adequate approach and understanding of requirements. No strengths, deficiencies, or uncertainties in the Technical factor were identified in any Offeror’s proposal. Additionally, a risk assessment rating was assigned for the Technical factor, considering both risk reducers and significant weaknesses, to reflect the Government’s assessment of the potential for disruption of schedule, increased cost, degradation of performance, the need for increased Government monitoring, or the likelihood of unsuccessful contract performance. In the Technical factor, no risk reducers or significant weaknesses were identified in any Offeror’s proposal, therefore, each Offeror was assigned a risk assessment rating of Low.
AR, exh. 25, PAR, at 3.
The two sets of briefing slides also contain no meaningful analysis of the comparative merits of the proposals, and essentially are PowerPoint presentations of the information included in the SSEB report and the PAR described above. AR, exhs. 23, SSAC Final Briefing, and 24, SSA Final Briefing.
Finally, the SSDD does not include any critical analysis of the comparative merits of the proposals, concluding without elaboration as follows:
I agree with the assessment that all offerors are essentially equal in the Technical factor. Each proposal was assigned an Acceptable technical rating and a Low technical risk rating. None of the proposals had a strength, risk reducer, significant weakness, uncertainty, or deficiency. As such, there is no discrimination between the offerors in the Technical factor.
AR, exh. 26, SSDD at 1.
In sum, the contemporaneous record does not include any information to support the conclusion that the agency, in making its source selection decision, performed a meaningful, qualitative assessment or critical, comparative analysis of the proposals under the technical evaluation factor or its enumerated elements.
Against this backdrop, the record shows that there were substantial differences in the proposed staffing offered by M7 and DS2. Specifically, the record shows that DS2 proposed an average of 22 percent fewer full-time equivalent (FTE) employees compared to the staffing proposed by M7. This difference in proposed staffing is approximately equal to the difference in the offerors’ respective prices; DS2’s price was approximately 21 percent lower than M7’s. Moreover, although the record does not include any information about the other four offerors, inasmuch as the record shows that M7 was the second-low offeror, it appears that DS2’s proposed price--and its proposed staffing--were substantially below what the other four, higher-priced offerors proposed.
In the final analysis, it may well be that the agency had a reasonable basis for concluding, notwithstanding this significant difference in the proposed staffing of DS2 and M7, that the proposals nonetheless were technically equal. However, in the absence of any explanation in the contemporaneous evaluation record, we are left to guess at the reasonableness of the agency’s conclusion. In addition, and more fundamentally, the complete absence of any critical analysis or qualitative assessment of the proposals under the remaining elements of the technical evaluation factor other than staffing also leaves us to guess at the reasonableness of the agency’s broader conclusion that all six proposals submitted were technically equivalent under all of the RFP’s enumerated technical evaluation elements. We therefore sustain M7’s protest.
We recommend that the Navy reevaluate all offerors’ technical proposals and prepare a detailed, comprehensive evaluation record that adequately describes and documents the comparative assessment of proposals against all source selection criteria in the solicitation. We further recommend that the agency make a new source selection decision after performing that reevaluation. Should the agency conclude that another firm properly is in line for award of the contract, we recommend that the agency terminate DS2’s contract for the convenience of the government and make award to that firm, if otherwise proper. Finally, we recommend that the agency reimburse M7 the costs associated with filing and pursuing its protest, including reasonable attorneys’ fees. The protester should submit its certified claim for costs, detailing the time expended and costs incurred, directly to the contracting agency within 60 days after receipt of this decision. 4 C.F.R. § 21.8(f)(1).
The protest is sustained.
Susan A. Poling
 The agency redacted all information relating to the evaluation of the other four offerors. In addition, the record did not include any individual evaluator scoring sheets or reports.
 Offerors were required to propose staffing based on three operational tempos: surge, normal, and low. RFP Section B, and at 160. The record shows that M7 proposed, respectively, [deleted] FTEs for surge tempo, [deleted] FTEs for normal tempo, and [deleted] FTEs for low tempo. DS2 offered, respectively, [deleted] FTEs for surge tempo, [deleted] FTEs for normal tempo, and [deleted] FTEs for low tempo. This equates to DS2 offering approximately [deleted] percent fewer FTEs for surge tempo, [deleted] percent fewer FTEs for normal tempo, and [deleted] percent fewer FTEs for low tempo, or an average of 22 percent fewer FTEs than M7 offered.
 The record also shows that DS2’s costs for [deleted]. See AR, exh. 22 at 63. The record therefore shows that proposed staffing was the [deleted].
 During the course of this protest, the agency submitted a statement from the technical evaluation team lead. This statement includes a single paragraph in which the team lead provides a brief, largely conclusory, analysis of the DS2 staffing approach. In effect, he represents that the technical evaluation team performed an analysis of DS2’s proposed manning and concluded that it was adequate to meet the RFP requirements. AR, exh. 30, Technical Evaluation Team Lead Statement at 4. However, to the extent such an analysis was actually performed, it either was not documented, or not provided to our Office, and is not supported or corroborated in any way by information in the contemporaneous record. Thus, there is no evidence that this information was known to, or relied upon by, either the SSAC or the SSA at the time the agency made its source selection decision.
 In its initial protest, M7 challenged the agency’s evaluation of its proposal under the past performance factor; alleged that the agency had engaged in misleading discussions with it; and also alleged that the agency had treated the offerors unequally. In its comments responding to the agency report, M7 withdrew these allegations. In addition, in a supplemental protest filed after receipt of the agency report, M7 challenged the agency’s evaluation of DS2’s past performance. We have considered this latter allegation and find it to be without merit.