
Dewberry Crawford Group; Partner 4 Recovery

B-415940.12, B-415940.14, B-415940.20, B-415940.21, B-415940.26, B-415940.27  Jul 02, 2018

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  Dewberry Crawford Group; Partner 4 Recovery

File:  B-415940.12; B-415940.14; B-415940.20; B-415940.21; B-415940.26; B-415940.27

Date:  July 2, 2018

Terry L. Elling, Esq., Gregory R. Hallmark, Esq., Elizabeth N. Jochum, Esq., and Rodney M. Perry, Esq., Holland & Knight LLP, for Dewberry Crawford Group; Kevin P. Connelly, Esq., Kelly E. Buroker, Esq., Marques O. Peterson, Esq., and Tamara Droubi, Esq., Vedder Price, P.C., for Partner 4 Recovery, the protesters.
Robert J. Symon, Esq., Aron C. Beezley, Esq., and Lisa A. Markman, Esq., Bradley Arant Boult Cummings LLP, for CH2M Hill-CDM PA TAC Recovery Services, an intervenor.
Hillary J. Freund, Esq., and Nathaniel J. Greeson, Esq., Department of Homeland Security, for the agency.
Paula J. Haurilesko, Esq., Young H. Cho, Esq., and Laura Eyester, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1.  Protest that the agency unreasonably assigned significant weaknesses and weaknesses to the protesters’ proposals is denied, where the record shows that the agency reasonably concluded that the proposals did not meet the solicitation requirements.

2.  Protest that the agency engaged in unequal treatment is denied, where the differences in ratings stemmed from actual differences between the offerors’ proposals.

3.  Protest that the agency made an unsupportable selection decision is denied where the decision was reasonable and the source selection authority relied on a detailed draft decision document in exercising his independent judgment.

DECISION

Dewberry Crawford Group (DCG), of Fairfax, Virginia, and Partner 4 Recovery (P4R), of Germantown, Maryland, protest the award of a contract to CH2M Hill-CDM PA TAC Recovery Services (CCPRS), of Englewood, Colorado, under request for proposals (RFP) No. HSFE80-17-R-0004, issued by the Department of Homeland Security, Federal Emergency Management Agency (FEMA), for advisory and assistance services.  The protesters challenge the agency’s technical and price proposal evaluations and the selection decision. 

We deny the protests.

BACKGROUND

The RFP, which was issued on May 1, 2017, provided for the award of three indefinite-delivery, indefinite-quantity (IDIQ) contracts--one for each of three geographical zones--for nonprofessional and professional advisory and assistance services to support FEMA staff in providing disaster assistance through FEMA’s public assistance program, known as the Public Assistance Technical Assistance Contract (PA-TAC) program.  Agency Report (AR), Tab E, RFP, at 6, 11-12.  The RFP contemplated a 1‑year period of performance and four 1-year option periods.  Id. at 18.  These protests pertain to the award for Zone 3, which covers FEMA regions 2, 7, 9, and 10.[1]  Id. at 12.

The RFP provided for award to the offeror whose proposal offers the best value to the government, considering the following factors (in order of importance):  technical, past performance, and price.  Id. at 74.  The technical and past performance factors, when combined, were significantly more important than price.  Id.  The technical factor was comprised of the following three subfactors:  technical and management approach and capabilities; key personnel; and quality control plan.  Id. at 76, 149.  The RFP identified the key personnel as the program manager, deputy program manager, contract manager, and deployment/readiness manager.  Id.

The RFP required offerors to complete a pricing schedule that contained a fixed-price contract line item number (CLIN) for readiness management and administration, and four CLINs for disaster efforts:  management and administration (fixed price), labor (fully burdened labor rates), travel, and other direct costs.  Id. at 72, 124.  The RFP included "plug" numbers for the travel and other direct costs CLINs.  Id. at 124.

The RFP stated that prices would be evaluated for fairness and reasonableness for the base year and all four option years using one or more of the following techniques:  comparison of proposed prices, comparison with the independent government cost estimate (IGCE), comparison with available historical information, or comparison with resources proposed.  Id. at 79.  The RFP also stated that the price analysis would be performed on the total of the readiness management and administration and disaster CLINs located on the roll-up tab of the pricing schedule, which included the plug numbers.  Id. at 79, 124.

FEMA received eight proposals for Zone 3.  AR, Tab B, Source Selection Decision Document (SSDD), at 2.  The agency evaluated proposals, and awarded the contract to CCPRS on December 16, 2017.  COS/MOL at 13.  After receiving debriefings, DCG and P4R protested the award to CCPRS in January 2018.  Id.  FEMA subsequently advised our Office that it planned to review the parties’ proposals and evaluations to ensure that the evaluation criteria were applied in accordance with the solicitation, issue a new or revised source selection decision document, and if appropriate, make a new award decision.  FEMA Corrective Action Letter (B-415940 et al.), Feb. 20, 2018.  As a result, the protests were dismissed as academic on February 22, 2018.  Partner 4 Recovery; Dewberry Crawford Group, B-415940 et al., Feb. 22, 2018 (unpublished decision).

After the agency reevaluated proposals, the following adjectival ratings were assigned:[2]

Factor | DCG | P4R | CCPRS
TECHNICAL | Acceptable | Acceptable | Very Good
  Technical & Management Approach & Capabilities | Very Good | Acceptable | Very Good
  Key Personnel | Acceptable | Acceptable | Very Good
  Quality Control Plan | Marginal | Acceptable | Very Good
PAST PERFORMANCE | Substantial Confidence | Substantial Confidence | Substantial Confidence
PRICE | $501,575,705 | $399,505,623 | $477,891,967

AR, Tab B, SSDD, at 59, 76-77.

As part of the evaluation, the offerors’ proposals were assigned numerous significant strengths, strengths, and weaknesses under each of the technical subfactors and the past performance factor.[3]  In evaluating price, the evaluation team compared offerors’ total prices to the IGCE and to each other, noting that all offerors were below the IGCE.  AR, Tab N, Price Analysis, at 1.  The evaluators also noted a discrepancy between the plug numbers in the IGCE and the ones provided in the RFP for offerors to use for the other direct costs CLIN.  Id. at 1-2.  In comparing the offerors’ total prices against each other, the evaluators noted that P4R offered the lowest price, which was about 19.6 percent lower than the next lowest-priced offeror, CCPRS.  Id. at 2.  The evaluators also noted that the price difference between CCPRS and the next lowest-priced offeror, DCG, was about five percent.  Id.

Additionally, the evaluation team compared the offerors’ line item prices against each other.  Id. at 1-2.  The evaluators concluded that the differences in pricing and the higher priced CLINs did not present any risk to the agency.  Id.  All offerors’ prices were found to be fair and reasonable.  Id.

The source selection evaluation board (SSEB) provided the source selection authority (SSA) with a revised SSDD.  AR, Tab AF, Decl. of SSA, at 2.  The SSA performed a review of the revised SSDD, concurred with the revised findings and recommendations, and made a new award decision.  Id.

The SSA noted that CCPRS’s proposal was the highest technically rated proposal under the technical factor.  AR, Tab B, SSDD, at 78.  The SSA identified notable significant strengths assessed to CCPRS’s proposal under each subfactor.  Id.  The SSA noted that CCPRS’s proposal, which was rated as substantial confidence under the past performance factor, was not the only highly rated proposal for that factor.  However, when the technical and past performance factors were combined, the SSA concluded that CCPRS clearly demonstrated an overall better rating than the other offerors.  Id.

The SSA found that the remaining offerors, including DCG and P4R, were all rated acceptable under the technical factor, reflecting moderate risk and/or more weaknesses throughout their proposals.  Id. at 79.  The SSA observed that, with the exception of P4R, the remaining offerors, including DCG, were higher priced than CCPRS.  Id.  As a result, the SSA found that because CCPRS’s proposal was higher rated and lower priced, no tradeoff was necessary with those offerors.  Id.

With respect to the comparison between P4R and CCPRS, the SSA identified notable strengths and weaknesses of the offerors.  The SSA noted that P4R was assessed significant weaknesses under the key personnel subfactor with regard to the deployment/readiness manager’s experience as well as a weakness with regard to the contract manager’s experience and education.  Id.  By contrast, the SSA noted that CCPRS was assessed a single weakness for its contract manager but that CCPRS’s proposed program manager, deputy program manager, contract manager, and deployment/readiness manager had extensive experience in their respective positions.  Id.  The SSA noted that, under the quality control plan subfactor, while both offerors proposed the plan-do-check-act continuous quality improvement approach, P4R stated that it would not perform quality reviews on [DELETED] because P4R assumed FEMA did not desire such a hands-on approach.  Id. at 79-80.  By contrast, the SSA noted that CCPRS proposed a corrective action and preventative action plan, which was the strongest component of its quality control plan; a self-inspection plan demonstrating its understanding and commitment to addressing quality concerns; and a proactive approach in the acceptance of work.  Id. at 80.  With respect to past performance, the SSA noted that both offerors received a substantial confidence past performance rating, and neither had any weaknesses identified under the factor.  Id. at 29-30.

The SSA found that even though both offerors were assigned identical ratings under the past performance factor, CCPRS’s proposal, with its significant strengths and minimal weaknesses under the technical factor as compared to P4R’s proposal, represented the best value to the government.

In considering price, the SSA noted that CCPRS’s price was the second lowest-priced offer and that P4R offered the lowest price.  Id. at 80.  However, the SSA found that based on the comparative technical evaluation and basis for award, CCPRS’s technical proposal provided significant strengths and strengths, and minimal risk, when compared to P4R’s technical proposal, and those benefits would outweigh any cost savings.  Id.  The SSA stated that CCPRS’s price of $477,891,967 was considered fair and reasonable as it was the second lowest proposed price.  Id.  The SSA concluded that award to CCPRS was in the best interests of the government.  Id.

After a debriefing, in which FEMA provided offerors with the adjectival ratings, significant weaknesses, and total price, DCG and P4R protested to our Office.

DISCUSSION

DCG and P4R challenge various aspects of the evaluation of their own and CCPRS’s proposals.  For example, the protesters challenge the assignment of significant weaknesses and marginal ratings to their proposals, allege unequal treatment, and challenge the best-value tradeoff decision.  We have considered all of DCG’s and P4R’s many protest grounds, and although we address only a portion of the arguments, we find that none provides a basis to sustain the protests.

Technical Evaluation

In reviewing protests challenging the evaluation of proposals, we do not conduct a new evaluation or substitute our judgment for that of the agency but examine the record to determine whether the agency’s judgment was reasonable and in accord with the RFP evaluation criteria.  Watts-Obayashi, JV; Black Constr. Corp., B-409391 et al., Apr. 4, 2014, 2014 CPD ¶ 122 at 9.  A protester’s disagreement with the agency’s judgment, without more, is not sufficient to establish that an agency acted unreasonably.  22nd Century Techs., Inc., B-413210, B-413210.2, Sept. 2, 2016, 2016 CPD ¶ 306 at 8.  Moreover, it is an offeror’s responsibility to submit an adequately written proposal that demonstrates the merits of its approach; an offeror runs the risk of having its proposal downgraded or rejected if the proposal is inadequately written.  Id.

DCG Evaluation Challenges

Deputy Program Manager

DCG argues that FEMA unreasonably assessed a weakness against its proposal under the key personnel subfactor based on the education and experience of DCG’s deputy program manager.  DCG’s Comments & Supp. Protest, May 3, 2018, at 15-16.  DCG first argues that the agency’s criticism that the proposed deputy program manager lacked a bachelor’s degree in business or related technical field is unreasonable because DCG’s proposal showed that this individual possessed a master’s degree in physical oceanography management.  Id.  DCG contends that a bachelor’s degree in a technical field is a necessary precursor to the master’s degree possessed by this individual, and therefore, DCG demonstrated that this individual possessed a bachelor’s degree in a related technical field.  Id.  The protester next challenges the agency’s criticism that DCG’s proposal did not demonstrate this individual’s experience establishing and implementing objectives and monitoring project progressions.  Id. at 16.  In this regard, the protester points to this individual’s experience as a program manager under three contracts and argues that by acting as program manager for these three contracts, this individual "established and implemented objectives and monitored project progression."  Id.

The agency explains that while the solicitation explicitly required that the deputy program manager have a minimum of a bachelor’s degree in business or related technical field, DCG’s proposed deputy program manager’s resume only stated that this individual had a master’s degree in physical oceanography management and a bachelor of science degree from the U.S. Naval Academy, without specifying any fields.  COS/MOL at 54 (citing AR, Tab J, DCG Proposal, at 24); see also RFP at 14.  While the agency offers several responses for why it assessed a weakness here, it ultimately explains that this individual’s resume, which provided general information relating to experience, did not indicate that the individual ever had the responsibility of establishing and implementing objectives and monitoring project progression.  Id.

On this record, we have no basis to object to FEMA’s evaluation.  We agree with the agency that the deputy program manager’s resume does not state anywhere that this individual had the responsibility of establishing and implementing objectives and monitoring project progression.  See id. at 24-25.  To the extent DCG expected the agency to make that assumption because this individual had experience as a program manager on certain projects, it was DCG’s responsibility to submit an adequately written proposal that demonstrated the merits of its approach; an offeror runs the risk of having its proposal downgraded or rejected if the proposal is inadequately written.  22nd Century Techs., Inc., supra.  

Contract Manager

DCG challenges the agency’s assessment of weaknesses against its proposal under the key personnel subfactor with regard to its contract manager.[4]  Specifically, DCG argues that the agency’s assessment of a weakness for failure to include timeframes for its proposed contract manager’s experience in contract management was based on an unstated evaluation criterion.  DCG Comments & Supp. Protest, May 3, 2018, at 17-19. 

The agency explains that while DCG’s proposed contract manager’s resume included a number of relevant positions and reflected experience in both project management and contract management, the resume did not specify the duration for each position.  The agency states it therefore found that the resume did not contain sufficient information to determine whether this individual satisfied the requirement to possess a minimum of five years in contract management experience.  Id.; see also Supp. COS/MOL, May 31, 2018, at 11 (citing AR, Tab J, DCG Proposal, at 26-27). 

Here, the solicitation advised that the agency would evaluate the resume of each key person to determine how well the education, experience, years of experience, and certification conform to the tasks outlined in the performance work statement (PWS).  RFP at 76.  We do not find unreasonable the agency’s concerns about the lack of details with regard to the contract manager’s experience where the solicitation explicitly stated key personnel resumes would be evaluated to determine the years of experience.

In addition, DCG’s proposal identified a number of positions that the proposed contract manager held, but failed to provide information as to when or for how long he held each of those positions.  See AR, Tab J, DCG Proposal, at 26-27.  For example, this individual’s resume identified three contracts for which this individual was the program management office manager, program and contract manager, and regional task order manager.  Id.  However, of the three contracts, for only one contract did the resume indicate the duration.  Id. at 27.  Further, while the resume indicated it was a five-year IDIQ contract, the resume also stated that this individual was a "program and contract manager" without specifying the timeframes for which he held either position.  Id.  For the remaining two contracts, the resume did not indicate the duration of the contract and also included a mix of project management and contract management experience.  Id. at 26-27.  We agree with the agency that there is nothing in this individual’s resume or elsewhere in DCG’s proposal that provided specific details about duration and timing of the contract manager’s contract management experience.

Quality Control Plan

DCG challenges FEMA’s evaluation of its proposal as marginal under the quality control plan subfactor.  DCG Protest, at 14-18; DCG Comments & Supp. Protest, May 3, 2018, at 2-6.  In this regard, while FEMA assessed a strength to DCG’s proposal under this subfactor for various tools and procedures proposed for monitoring performance, FEMA also assessed a weakness for failing to address subcontractor performance under the individual performance appraisal section.  See AR, Tab M, Technical Evaluation Report (TER), at 28-29.  FEMA also assessed the following significant weaknesses:

Although [DCG] provided a table and matrix containing their quality management tools, procedures and quality standards for objectives 1 and 2, they failed to provide a detailed approach to how they will monitor performance, measure quality of service, perform corrective actions and prevent deficiencies.  [DCG] didn’t provide any substantive examples and provided the same corrective actions for each task process under [i]nvoicing and [r]eporting, i.e. ". . . perform a [DELETED]".  The metrics in [t]able 7 do not provide any measureable and quantifiable actions/outcomes for most of the categories identified (pg. 39-40).  [DCG’s] corrective action for reporting did not address the performance metric as it relates to specific timeframes.  Additionally, the offeror provided metrics and measures for internal training, but they were unclear whether this applies to training [DELETED] (page 34). 

[DCG] fails to describe how they manage staff and how work will be accepted and issued.  Overall, [DCG] demonstrate[s] a lack of understanding of all the requirements of the PWS and increases the risk of unsuccessful contract performance.

Id. at 29.  We address a few examples of DCG’s arguments below.[5]

DCG argues that its proposal included a detailed quality performance matrix for each task or process under the contract.  DCG Protest at 16.  According to DCG, this matrix provided specific performance metrics, a description of the surveillance methods that would be employed to measure compliance with those metrics, and a description of the specific potential corrective actions DCG would take in the event of a quality control issue.  Id.

The agency disagrees and argues that DCG’s quality control plan lacked sufficient details addressing this requirement.  COS/MOL at 57-58.  The agency explains that the solicitation clearly required the offeror’s quality control plan to demonstrate detailed management of all tasks and services.  Id.  The agency contends that this includes staffing, how work would be accepted and issued, and procedures followed to ensure services are performed in a timely manner and of high quality.  Id.   

Here, the solicitation required the contractor to prepare and adhere to an effective quality control plan for use on all task orders.  RFP at 18, 71.  The solicitation provided details on what offerors were to include and address in their quality control plans and how these different elements of the offeror’s quality control plan would be evaluated.  Id. at 18-19, 71, 76-77.  For example, the solicitation instructed offerors to describe measures taken for corrective actions if work was not performed in accordance with the contract terms and conditions.  Id. at 77. The solicitation also advised that the government would evaluate whether the proposal provided a detailed management approach for all tasks and services, and how well the proposal described the monitoring systems and methods that would be used for all aspects of the contract.  Id.

Our review of the record shows that, as DCG argues, its proposal included tables that described its quality management tools and procedures and an outline of quality standards for PWS objectives 1 (readiness management and administration) (table 6) and 2 (professional and nonprofessional services) (table 7).  AR, Tab J, DCG Proposal, at 34-40.  Tables 6 and 7 included columns for a specific task and the corresponding performance metric, method of surveillance, and potential corrective action(s) if a performance level was not achieved.  Id. at 35-40.  For example, in table 6, DCG set forth a column for reporting tasks.[6]  Nonetheless, while the performance metrics for this task stated what was required in terms of meeting a specific milestone within a specific timeframe, the proposal did not describe how deficiencies would be reported, what corrective action would be taken, or in what timeframe.  Id. at 38. 

More specifically, for the monthly task order status reports task, DCG stated that the performance metric was to provide a [DELETED] report.  Id.  The method of surveillance indicated that DCG’s contract manager would [DELETED] to ensure the report met the schedule requirement.  Id.  However, for the potential corrective action, the table simply stated that "[DELETED]" and "[DELETED]."  Id.  This same proposed corrective action is repeated verbatim for all of the invoicing and reporting tasks.  Id.  We agree with the agency that DCG’s proposal did not provide a detailed approach to performing corrective actions and preventing deficiencies. 

Similarly, in table 7, while DCG included a column for performance metrics, DCG only provided general statements that did not include any details regarding satisfying any milestones.  See id. at 39-40.  For example, the performance metrics for customer support services included:  [DELETED] representation of FEMA; effective, timely [DELETED] meeting; comprehensive, logical, and prioritized [DELETED]; regular, productive meetings with [DELETED]; timely scheduling and coordination of [DELETED]; confirm all damage descriptions are [DELETED]; appropriately identify [DELETED] support needs; professional, compassionate resolution of [DELETED] issues; and, conduct comprehensive, informational [DELETED].  Id. at 39.  In this regard, we agree that without any specific milestones to achieve within any specified timeframe, the agency was unable to determine whether DCG had proper procedures in place to ensure timeliness. 

Accordingly, on this record, the protester’s challenges do not provide any basis to object to the agency’s assessment of a significant weakness, and assignment of a marginal rating, under the quality control subfactor.

P4R Evaluation Challenges 

Joint Venture Executive Committee

P4R argues that FEMA unreasonably assigned its proposal a weakness under the technical and management approach and capabilities subfactor for mentioning its joint venture executive committee in its proposal without providing specifics on the role of the committee.  P4R Protest at 45-46.  In response, FEMA explains that P4R received a weakness for introducing a joint venture executive committee without providing specifics on the role of the committee, its overall contribution, or its benefit to FEMA and the public assistance program.  COS/MOL at 25.  FEMA states that P4R’s organizational chart includes a direct line between the committee and FEMA, without including any explanation of its role.  Id.  FEMA states that because P4R included the joint venture executive committee in its proposal, the agency could properly consider the adequacy of the level of detail provided by P4R on the role and responsibilities of the committee.  Id. at 25-26. 

P4R first notes that the RFP did not require offerors to propose a committee, and therefore P4R was not required to explain the role of its joint venture executive committee.  P4R Protest at 45.  In any event, P4R states that its organizational chart shows that the joint venture executive committee is organizationally outside the program management office, thus demonstrating that the committee has no bearing on the items the RFP asked offerors to address.  Id. at 46.  P4R also states that FEMA should have reviewed its joint venture agreement in volume 5 of its proposal for information on the role of the joint venture executive committee.  P4R Comments & Supp. Protest, May 3, 2018, at 32 (citing AR, Tab G, P4R Proposal, at V-20, V‑185).  P4R also argues that the following language in the quality control plan section of its proposal explains sufficiently the role of the committee:  "senior managers report directly to the Executive Committee and work with the [program management office] to ensure maximum effectiveness and impact on our delivery."  Id. at 32-33 (citing AR, Tab G, P4R Proposal, at II-36).

Based on the record before us, we find no basis to sustain the protest for the assessment of this weakness.  Under the technical and management approach and capabilities subfactor, the RFP required offerors to define, among other things, how organizational roles and responsibilities will be divided, and how decisions will be made.  RFP at 76.  Because P4R included the joint venture executive committee in its organizational chart and included a line creating the appearance that the committee had some role between the program manager and FEMA, we find reasonable the agency’s concern.[7]  See AR, Tab G, P4R Proposal, at II-4. 

Deployment/Readiness Manager

P4R argues that FEMA unreasonably assigned a significant weakness to its proposal under the key personnel subfactor for its proposed deployment/readiness manager’s lack of experience within all regions that comprise Zone 3.  P4R Protest at 54-55.  In this regard, P4R states that the RFP only required key personnel to demonstrate experience in Zone 3, and did not specify that an offeror must have experience in each region within the zone.  Id.

In response, FEMA explains that the RFP explicitly advised offerors that key personnel would be evaluated for their knowledge of the proposed zone.  COS/MOL at 33.  FEMA also explains that the RFP expressly outlined the geographical coverage of each zone by FEMA region.  Id. at 37 (citing RFP at 12).  The agency further explains that, based on these solicitation provisions, it is "axiomatic" that an evaluation of each key person’s knowledge of the zone would include the extent to which the individual demonstrated an understanding of each region within the zone.  Id.

Here, the RFP informed offerors that, although the predecessor contracts had a nation-wide span of operations, the current solicitation anticipated one contract for each of three geographic zones.  RFP at 12.  Additionally, the RFP identified the FEMA regions that comprise Zone 3--in this case, regions that include states as diverse as Alaska, California, Hawaii, Nebraska, and New York.  Id.  The RFP also stated that the agency expected the contractor for each zone to be responsible for the resource requirements for major disasters and emergencies declared within the zone.  Id.  Moreover, the RFP expressly required offerors to demonstrate each key person’s knowledge of the proposed zone.  Id. at 76.  Thus, in our view, the agency’s consideration of knowledge of a region is reasonably related to, and encompassed by, the subfactor’s stated criteria.[8]  Further, the resume of P4R’s deployment/readiness manager failed to provide evidence of experience in regions 7, 9, and 10.  See AR, Tab G, P4R Proposal, at II-30. 

P4R also argues that FEMA unreasonably concluded that its proposed deployment/ readiness manager lacks relevant experience in performing the duties of the position.  P4R Comments & Supp. Protest, May 3, 2018, at 42.  P4R contends that its proposed deployment/readiness manager demonstrated the requisite Zone 3 experience through her position as a site supervisor for the Department of Housing and Urban Development, in which she managed a survivor intake center in New York after Hurricane Sandy.  Id.  P4R also contends that the individual demonstrated deployment/readiness experience through her 11 years of experience in various positions under prior FEMA public assistance contracts.  Id.

FEMA explains that P4R was assigned a significant weakness because its proposed deployment/readiness manager had no relevant experience related to the position.  COS/MOL at 34.  FEMA further explains that most of the individual’s experience is with scheduling and tracking, which does not demonstrate the quantitative and qualitative aspects of the position--that is, matching the number and quality of staff with the government’s requirements.  Id. at 34‑35.  FEMA states that, although the individual’s resume broadly asserts that she "effectively served in a deployment/readiness management role" on two prior public assistance contracts, the details in her resume do not support this statement.  Id. at 35.  FEMA also explains it considered the individual’s experience at the Department of Housing and Urban Development, but concluded that it was not comparable in size, scope, or magnitude to this requirement.  Id.

Based on the record before us, we conclude that the agency’s evaluation of P4R’s proposed deployment/readiness manager was reasonable.  The individual’s Zone 3 experience was limited to supervising 26 temporary employees tasked with processing over 5,000 community development block grant applications in support of the Hurricane Sandy recovery efforts.  AR, Tab G, P4R Proposal, at II-30.  The individual’s FEMA public assistance contract experience, as identified by the protester, consisted of articulating management directives to staff and supervising the execution of work, performing first-line reviews of team deliverables, and providing input to management for team member evaluations.  P4R Comments & Supp. Protest, May 3, 2018, at 42.  P4R fails to explain how these experiences demonstrate experience matching staff with the agency’s requirements for the number and quality of staff.  Accordingly, the protester’s disagreement with the agency’s judgment does not provide a basis to sustain the protest.

Unequal Treatment

DCG and P4R assert that FEMA evaluated proposals unequally with respect to a number of weaknesses, strengths, and significant strengths.  We discuss a few of these allegations below.

It is a fundamental principle of federal procurement law that a contracting agency must treat all offerors equally and evaluate their proposals evenhandedly against the solicitation’s requirements and evaluation criteria.  ADNET Sys., Inc., et al., B-408685.3 et al., June 9, 2014, 2014 CPD ¶ 173 at 16.  Where a protester alleges unequal treatment in a technical evaluation, it must show that the differences in ratings did not stem from differences between the offerors’ proposals.  Abacus Tech. Corp.; SMS Data Prods. Grp., Inc., B‑413421 et al., Oct. 28, 2016, 2016 CPD ¶ 317 at 11; Beretta USA Corp., B-406376.2, B-406376.3, July 12, 2013, 2013 CPD ¶ 186 at 6.

Quality Control Plan

DCG argues that FEMA evaluated offerors’ proposals unequally under the quality control plan subfactor.  DCG first argues that FEMA credited CCPRS’s proposal with a significant strength for its self-inspection plan on the basis that it provided various methods of monitoring, including interviews with FEMA and CCPRS staff, performance evaluations, and deliverable reviews.  DCG contends that the agency failed to similarly credit DCG’s proposal even though its own self-inspection plan offered varied monitoring methods, including coordination with [DELETED] reviews.[9]  DCG Comments & Supp. Protest, May 3, 2018, at 28. 

In response, FEMA states that, although DCG identified items from CCPRS’s significant strength that DCG argues are similar to elements of its own plan, those items are not the same as those proposed by CCPRS.  Supp. MOL, May 18, 2018, at 30-31.  Moreover, FEMA explains that CCPRS’s self-inspection plan was much more detailed and comprehensive than the one offered by DCG and significantly exceeded the solicitation’s requirements.  Id. at 31.

DCG has not demonstrated that the agency evaluated offerors unequally.  The record shows that CCPRS’s proposal offered a detailed self-inspection plan that provided CCPRS’s internal approach to monitoring quality and addressing quality concerns and nonconformance during contract performance.  AR, Tab AG, CCPRS Proposal, Quality Control Plan Subfactor, at 4-10.  This plan included four sections:  monitoring systems and methods to ensure services are performed in a timely manner and with high quality; measures taken for corrective action; acceptance/issuance of work; and reporting.  Id.  Under the monitoring systems and methods section, CCPRS proposed interviews of CCPRS staff, interviews with FEMA staff, staff performance evaluation forms, review of work products/deliverables, random inspections, passive surveillance, and internal audits.  Id. at 4-5.

In contrast, the record shows that the self-inspection plan in DCG’s proposal describes the tools (software, forms, and processes) for monitoring system performance, where DCG’s [DELETED] and an associated [DELETED] serve as the underpinning of its quality monitoring approach.  AR, Tab J, DCG Proposal, at 34.  DCG’s proposal identified the following tools:  [DELETED], DCG web portal [DELETED], DCG [DELETED], DCG [DELETED] work plan, internal [DELETED] reviews, and DCG management [DELETED].  Id. at 34-35.  Although DCG disagrees with the agency’s judgment, the protester does not explain where in its proposal it identifies the same methods of quality control surveillance as those identified in CCPRS’s proposal.  Accordingly, we conclude that the differences in the assignment of a significant strength to CCPRS’s proposal were based on differences in the offerors’ proposals and not the result of unequal treatment.

DCG next argues that FEMA credited CCPRS’s proposal with a strength for its “plan-do-check-act” management method without crediting DCG’s proposal for its “define-measure-analyze-improve-control”[10] cycle that the protester contends provides the same benefits.  DCG Comments & Supp. Protest, May 3, 2018, at 29.  FEMA explains that, although DCG argued that its “define-measure-analyze-improve-control” cycle also deserves a strength, DCG cites to no portion of its proposal to support its contention.  Supp. MOL, May 18, 2018, at 32.  The agency further explains that each step of each process had its own objectives that led to different outcomes.  Id. at 33.

Based on the record before us, DCG has not demonstrated that the difference in the evaluations was the result of unequal treatment.  In the agency’s view, CCPRS proposed a continuous improvement system based on early identification of potential and actual problem areas, developing solutions, and preventing recurrence.  See AR, Tab AG, CCPRS Proposal, Quality Control Plan Subfactor, at 2.  In contrast, the agency concluded that DCG’s proposal suggests that its “define-measure-analyze-improve-control” cycle is the process describing how it developed its quality control plan.  For example, DCG’s proposal stated that it applied its define-measure-analyze-improve-control business process improvement framework to assess each of the [DELETED], which “led to the development of the [DELETED] . . .”.  AR, Tab J, DCG Proposal, at 31.  Further, DCG’s proposal states that it used the “define” step in the process to “identify [DELETED], assign [DELETED], and define [DELETED].”  Id. at 31-32.  DCG does not explain how its cycle provides the same benefits as CCPRS’s, but simply expresses its disagreement with the judgment of the agency.  As such, we find no basis to sustain the protest.

Joint Venture Board of Directors

P4R argues that FEMA treated offerors unequally when the evaluators did not assess a weakness to CCPRS’s proposal for failing to explain the roles and responsibilities of CCPRS’s joint venture board of directors, in contrast to the weakness assessed to P4R’s proposal for mentioning its joint venture executive committee in its proposal without providing specifics on the role of the committee.  P4R Supp. Comments & Supp. Protest, May 23, 2018, at 42.

FEMA states that P4R is mischaracterizing the awardee’s proposal.  Supp. COS/MOL, May 31, 2018, at 5.  FEMA explains that CCPRS’s proposal identifies the role and responsibilities of its joint venture board of directors.  Id.  FEMA explains, for example, that CCPRS’s proposal stated that the board “establishes policy and defines how the [joint venture] partners work together, and oversees the decision authorities and lines of communication of our [program management office].”  Id. (citing AR, Tab AG, CCPRS Proposal, at 18).  FEMA also identifies other examples of CCPRS explaining the role of its board of directors in its proposal.  Id.  FEMA explains that, in comparison, P4R’s joint venture executive committee is mentioned once in an organizational chart and is not referenced again.  Id.  Based on the record before us, P4R has not established that the difference in the offerors’ ratings resulted from unequal treatment.

Training Approach

DCG and P4R both argue that FEMA unequally assigned a significant strength to CCPRS’s proposal with respect to training approaches without also assigning a significant strength to their proposals.  DCG Comments & Supp. Protest, May 3, 2018, at 25-26; P4R Comments & Supp. Protest, May 3, 2018, at 8-9.  The protesters argue that there are no meaningful differences between their and CCPRS’s training approaches.  DCG Comments & Supp. Protest, May 3, 2018, at 26; P4R Supp. Comments & Supp. Protest, May 23, 2018, at 31.  In this regard, both protesters point to numerous features of their training approach.

FEMA states that the training approaches proposed by each offeror were not the same.  Supp. COS/MOL, May 18, 2018, at 20.  FEMA explains that the protesters did not propose [DELETED] training for all staff, [DELETED] plans for each candidate, or [DELETED] as a training tool.  Id. at 20-21.

Here, the agency has identified a number of specific features of CCPRS’s training approach that are not included in DCG’s and P4R’s proposals, and which the agency found to be of benefit to the government.  In this regard, FEMA identified a number of features in CCPRS’s proposal as forming the basis for the significant strength, such as:  a [DELETED] component to its training for all staff, professional and nonprofessional; [DELETED] plan for each candidate; and the use of [DELETED] to educate staff about FEMA’s PA-TAC program.  AR, Tab M, TER, at 17.  Although DCG contends that its proposal advised FEMA that it, too, offered [DELETED] plan for each candidate, the language that DCG identifies does not specifically state that each candidate would receive [DELETED] plan.[11]  See AR, Tab J, DCG Proposal, at 16.  Similarly, P4R’s plan to “[DELETED]” to promote readiness for deployment in [DELETED] does not equate to CCPRS’s plan to train all staff, professional and nonprofessional, in [DELETED].  Compare AR, Tab G, P4R Proposal, at II-7-8 with AR, Tab AG, CCPRS Proposal, Subfactor 1, at 4-5.  Based on this record, we find no basis to conclude that the agency treated offerors unequally.

Key Personnel

DCG and P4R allege that FEMA treated offerors’ proposals unequally in reviewing the experience of key personnel.  DCG argues that, although CCPRS’s contract manager’s resume, like the resume of DCG’s contract manager, did not include any dates or timeframes, the agency did not assess CCPRS the same weakness it assessed for DCG’s contract manager.  DCG Comments & Supp. Protest, May 23, 2018, at 18‑19.  P4R argues that the resumes of CCPRS’s proposed program manager, deputy program manager, and contract manager also failed to substantiate the experience of these individuals and therefore deserved a significant weakness similar to the one assessed for P4R’s deployment/readiness manager.  P4R Supp. Comments & Supp. Protest, May 23, 2018, at 44-45.

With respect to DCG’s and CCPRS’s contract managers, FEMA states that the concern over the lack of timeframes in DCG’s contract manager’s resume was because the resume indicated both contract and project manager experience, and the agency was unable to determine how much of the experience was related to contract management.  Id. at 11.  FEMA explains that CCPRS’s contract manager’s resume identified experience that was solely contract management experience, and did not commingle various job titles and work experiences.  Id.  In this regard, FEMA states that CCPRS’s proposed contract manager demonstrated experience fulfilling the duties of a contract manager through his experience as a contract manager for many years across multiple contracts, including the predecessor public assistance contract.  Id. at 11 (citing AR, Tab AG, CCPRS Proposal, Subfactor 2-Key Personnel, at 9-12).  With respect to P4R’s deployment/readiness manager, FEMA explains that P4R did not receive a significant weakness for failing to substantiate its deployment/readiness manager’s experience, but rather for the failure of her experience to align with the solicitation requirements for the position.  Supp. COS/MOL, May 31, 2018, at 9.  FEMA also explains that, in comparison, the resumes of CCPRS’s proposed program manager, deputy program manager, and contract manager align with the solicitation requirements.  Id. at 10. 

Here, the protesters have not demonstrated that the differences in the evaluation of proposals stem from unequal treatment.  Although the protesters are correct that the resumes of CCPRS’s key personnel also did not include time frames, the record shows that FEMA reasonably concluded that CCPRS’s key personnel met the minimum requirements based on the information provided that was specifically relevant to the requirements.  For example, the record shows that the resume for CCPRS’s contract manager reflected nothing but contract management experience through management of over 100 contracts, and experience as a contract manager in multiple regions.  See AR, Tab AG, CCPRS Proposal, Subfactor 2-Key Personnel, at 9-12.  Additionally, as noted above, the record shows that P4R received a significant weakness because its deployment/readiness manager’s resume did not demonstrate experience relevant to the position.  AR, Tab M, TER, at 54.  Thus, we find no merit to this contention.

Selection Decision

DCG and P4R raise a number of challenges to the SSA’s selection decision.  For example, DCG argues that the agency failed to consider whether the technical benefits associated with the significant strengths and strengths assessed to CCPRS’s proposal were worth the price premium, which DCG characterizes as “modest.”  DCG Comments & Supp. Protest, May 23, 2018, at 22.

With respect to the absence of a detailed comparative evaluation of the proposals, since the proposal selected for award was both higher technically rated and lower priced than DCG’s proposal, such a comparative evaluation--i.e., a price-technical tradeoff--was not required.[12]  MD Helicopters, Inc.; AgustaWestland, Inc., B‑298502 et al., Oct. 23, 2006, 2006 CPD ¶ 164 at 49 n.49; see also Exeter Gov’t Servs., LLC, B‑400977, B‑400977.2, Apr. 9, 2009, 2009 CPD ¶ 82 at 10 (where the proposal selected for award was both higher rated and lower priced than the protester’s proposal, a price-technical tradeoff was not required).

DCG and P4R also argue that the agency’s best-value determination was flawed because it was based on errors in the agency’s technical evaluation.  DCG Protest at 23-24; DCG Comments & Supp. Protest, May 3, 2018, at 17; P4R Supp. Comments & Supp. Protest, May 23, 2018, at 21.  As described above, the record does not support the protesters’ challenges to the agency’s evaluation.  Accordingly, we find no merit to this contention.  See SupplyCore, Inc., B-411648.2, B-411648.3, Feb. 21, 2017, 2017 CPD ¶ 72 at 17 (challenge to selection decision based on alleged evaluation errors is denied, where protest of the alleged errors is denied).

Consideration of Past Performance

P4R contends that the SSA failed to adequately consider the past performance discriminators, and failed to look behind the substantial confidence ratings of P4R and CCPRS in its tradeoff analysis.  P4R Protest at 31-36; P4R Comments & Supp. Protest, May 3, 2018, at 25-28.  P4R asserts that, even though P4R and CCPRS were both incumbent contractors, had the SSA looked behind the substantial confidence ratings, he would have recognized the superiority of P4R’s past performance over that of CCPRS, and would have realized that paying CCPRS’s price premium was unwarranted.  P4R Protest at 31.  In this regard, P4R asserts that, because its joint venture was formed from contractors that performed on two of the four incumbent PA-TAC III contracts, its past performance was superior to the performance of CCPRS.  Id.

FEMA states that the SSA properly considered past performance in his tradeoff decision.  COS/MOL at 69.  FEMA explains that both P4R and CCPRS presented past performance that met the relevance criteria, and both offerors received positive ratings on their past performance questionnaires.  Id.  FEMA also explains that CCPRS received a significant strength because of the awardee’s efforts across FEMA regions 2, 9, and 10, whereas P4R did not receive any significant strengths for past performance.  Id.  Finally, FEMA explains that the SSA dedicated the majority of the tradeoff discussion to the most significant discriminators, namely the technical subfactors, which were significantly more important than past performance.  Id. at 70.

Based on the record before us, we find no basis to object to the SSA’s consideration of past performance.  The SSA noted that both P4R and CCPRS received a past performance rating of substantial confidence and were assessed no significant weaknesses or weaknesses.  AR, Tab B, SSDD, at 80.  Additionally, the SSDD contained a summary of each offeror’s past performance contracts.  See id. at 73-74.  Thus, the SSA was aware of the various types of contracts that demonstrated the offerors’ past performance.  The SSA also noted that, because the technical factor was significantly more important than past performance, CCPRS’s proposal--with its significant strengths and minimum weaknesses under the technical factor as compared to P4R’s proposal--provided the best value to the government.  Although P4R contends that the SSA was required to conduct a more in-depth analysis, an agency is not required to further differentiate between the past performance ratings based on a more refined assessment of the relative relevance of the offeror’s prior contracts, unless specifically required by the RFP.  See Pro-Sphere Tek, Inc., B‑410898.11, July 1, 2016, 2016 CPD ¶ 201 at 9-11; University Research Co., LLC, B‑294358.6, B-294358.7, Apr. 20, 2005, 2005 CPD ¶ 83 at 18.  Here, the RFP did not contain such a requirement.

Consideration of Price

P4R argues that the SSA failed to appropriately consider the large differential between its and CCPRS’s proposed prices.  P4R Protest at 25; P4R Comments & Supp. Protest, May 3, 2018, at 61.  P4R contends that FEMA awarded to the highest-rated offeror, regardless of price, thereby abandoning a best-value award basis.  P4R Comments & Supp. Protest, May 3, 2018, at 60-62; id., May 23, 2018, at 13.  P4R states that the SSA failed to meaningfully consider the $78 million (20 percent) price premium for CCPRS’s “slightly higher” rated proposal over P4R’s lower-priced, slightly lower-rated proposal.  P4R Protest at 26-27; P4R Comments & Supp. Protest, May 3, 2018, at 64.

FEMA states that the SSA reasonably selected the highest technically rated proposal at the second-lowest price.  COS/MOL at 64.  FEMA explains that the SSA performed a detailed tradeoff between P4R and CCPRS, in which the SSA documented his assessment of the relative merits and underlying advantages and disadvantages of the offerors’ proposals.  Id. at 66.  FEMA states that P4R overstates the percentage difference in price, which the agency calculated as a 17.9 percent difference.  Id.

In reviewing protests of allegedly flawed “best value” determinations, our Office will examine the record to determine whether the agency’s judgments were reasonable and consistent with the solicitation’s stated evaluation criteria and applicable procurement laws.  The Bowen Group, B‑409332.3, Aug. 6, 2014, 2014 CPD ¶ 236 at 7.  Where, as here, a solicitation provides that technical factors are more important than price, source selection officials have broad discretion in determining whether one proposal’s technical superiority is worth its higher price, so long as the agency’s decision is reasonable, consistent with the solicitation’s stated criteria, and adequately documented.  TMM Investments, Ltd., B-402016, Dec. 23, 2009, 2009 CPD ¶ 263 at 4-5.  A protester’s argument that the cost premium is simply too large is not sufficient to establish that the tradeoff was unreasonable.  Beechcraft Def. Co., LLC, B-406170.2 et al., June 13, 2013, 2013 CPD ¶ 147 at 31; see General Servs. Eng’g, Inc., B-245458, Jan. 9, 1992, 92-1 CPD ¶ 44 at 11 (tradeoff reasonable where agency determined that technical superiority of awardee’s proposal was sufficient to offset 125 percent higher cost).

Based on the record before us, we find no basis to sustain the protest.  In his tradeoff between P4R and CCPRS, the SSA reviewed and considered the significant strengths, strengths, weaknesses, and significant weaknesses of the two offerors, enumerating various features of the two offerors’ proposals.  See AR, Tab B, SSDD, at 79-80.  The SSA also reviewed the price analysis.  Id. at 76-77.  The SSA noted that P4R’s proposal was rated as acceptable under the technical factor overall and under all of the technical subfactors, but contained less beneficial strengths and significant strengths than CCPRS’s proposal and introduced weaknesses that created a higher risk.  Id. at 79.  The record shows that the SSA also considered whether the relative merits of CCPRS’s proposal warranted paying the price premium.  The SSA concluded that CCPRS’s technical proposal would fully achieve the government’s requirement with minimum risk, and that this benefit would outweigh any cost savings that might accrue from P4R’s proposal.  Id. at 80.

SSA’s Independent Judgment

Finally, P4R also argues that the SSA’s concurrence with the draft SSDD provided by the SSEB without also receiving a debriefing, asking questions, or making any changes to the SSDD, demonstrates that the SSA failed to exercise his independent judgment.  P4R Comments & Supp. Protest, May 3, 2018, at 69.

FEMA argues that the SSA carefully reviewed the underlying evaluation documents prior to the corrective action.  Supp. COS/MOL, May 18, 2018, at 39.  After proposals were reevaluated, the SSA read the revised SSDD and concurred with the revised findings.  Id.  FEMA contends that the detailed draft SSDD, combined with the SSA’s knowledge from his review prior to making the original award, provided the SSA with a comprehensive understanding of the proposals and their relative merits, which enabled him to make an informed and reasoned judgment based on his independent review of the evaluators’ recommendations.  Id.

Section 15.308 of the Federal Acquisition Regulation (FAR) requires, in the context of a negotiated procurement, that a source selection decision be based on a comparative assessment of proposals against all of the solicitation’s source selection criteria.  The FAR further requires that, while the SSA “may use reports and analyses prepared by others, the source selection decision shall represent the SSA’s independent judgment.”  Source selection decisions must be documented, and include the rationale and any business judgments and tradeoffs made or relied upon by the SSA.  FAR § 15.308.

We have consistently recognized that agency selection officials have broad discretion in determining the manner and extent to which they will make use of the technical and cost evaluation results in making their determination.  See, e.g., U.S. Facilities, Inc., B‑293029, B‑293029.2, Jan. 16, 2004, 2004 CPD ¶ 17 at 15.  Our Office has explained that so long as the ultimate selection decision reflects the selection official’s independent judgment, agency selection officials may rely on reports and analyses prepared by others.  See, e.g., Puglia Eng’g of California, Inc., B-297413 et al., Jan. 20, 2006, 2006 CPD ¶ 33 at 8.  The fact that the SSA based his decision on the recommendation of the agency evaluators, without performing an independent review of all documentation, is not sufficient to show that the decision did not represent his own independent judgment.  InCadence Strategic Solutions Corp., B‑410431.2, Dec. 22, 2014, 2015 CPD ¶ 57 at 5. 

Here, the SSEB provided the SSA with a detailed draft SSDD that discussed the significant strengths, strengths, weaknesses, and significant weaknesses in each offeror’s proposal.  See AR, Tab B, SSDD, at 59-76.  The SSDD also contained the price analysis and a comparison of the relative merits of the offerors’ proposals.  Id. at 76-80.  Moreover, the SSA states that, based on the information presented in the SSDD, he had a comprehensive understanding of the proposals and relative merits, and concurred with the revised findings and recommendations and made the final selection decision.  AR, Tab AF, Decl. of SSA, at 2.  Accordingly, on this record, we have no basis to conclude that the SSA failed to exercise his independent judgment or to adequately document the rationale to support his source selection decision.

The protests are denied.

Thomas H. Armstrong
General Counsel



[1] Region 2 is comprised of New Jersey, New York, Puerto Rico, U.S. Virgin Islands, and eight tribal nations; region 7 is comprised of Iowa, Kansas, Missouri, and Nebraska; region 9 is comprised of Arizona, California, Hawaii, Nevada, and the Pacific Islands; and region 10 is comprised of Alaska, Idaho, Oregon, and Washington.  Combined Contracting Officer’s Statement & Memorandum of Law (COS/MOL) at 34 n.8.

[2] As relevant here, a very good rating meant the offeror’s proposed approaches/solutions were expected to result in full achievement of the government’s objectives with minimal risk; the offer contained significant strengths and minimum weaknesses; the offer indicated a high probability for effective, efficient, and innovative performance; and the offer included solutions for improving overall program compliance, responsiveness, and measurable customer satisfaction.  RFP at 77.  An acceptable rating meant the offeror’s proposed approaches/solutions introduced moderate risk but were considered likely to produce performance results meeting the government’s requirements, and the proposed solution contained a number of strengths, but also contained some weaknesses.  Id.  A marginal rating meant the offeror’s proposed approaches/solutions introduced risk that performance would not achieve the government’s requirements, contained few strengths, and contained significant weaknesses.  Id.

[3] The source selection plan defined a significant strength as an element of a proposal which significantly exceeds a requirement of the solicitation in a way that is very beneficial to the government; a strength as an element of a proposal which exceeds a requirement of the solicitation in a beneficial way to the government; a weakness as a flaw in a proposal that increases the chance of unsuccessful performance; and a significant weakness as a flaw in a proposal that appreciably increases the risk of unsuccessful contract performance.  AR, Tab F, Source Selection Plan, at 11.

[4] The agency evaluators identified the following weakness concerning DCG’s proposed contract manager: 

Although the [contract manager] has noted 21 years of experience in contract management, much of the work experience documented is in relation to project management versus contract management.  The proposal does not provide timeframes for any of the [contract manager’s] work offerings thus [we] cannot confirm if [the proposed contract manager] met the minimum years of requisite experience in contract management versus project management experience.  The candidate’s Zone 3 knowledge does not include Regions 7 and 10 (pages 26-27). 

AR, Tab M, TER, at 28.

[5] In its debriefing, DCG was provided a summary of its significant weaknesses.  In its initial protest, however, DCG did not challenge the significant weakness regarding its failure to manage staff and how work would be accepted and issued.  Compare DCG Protest, exh. A, at 1 with DCG Protest at 14-18.  Because the protester waited until its comments to the agency report to challenge this significant weakness, see DCG Comments & Supp. Protest, May 3, 2018, at 5-6, this protest ground is untimely under our Regulations.  See Red River Computer Co., Inc.; MIS Sciences Corp., B-414183.8 et al., Dec. 22, 2017, 2018 CPD ¶ 7 at 6-7 n.10.  In addition, DCG argued in its initial protest that, to the extent a weakness for failure to address subcontractor performance had been assessed, it was unreasonable.  See DCG Protest at 17-18.  The agency substantively responded to this allegation in its agency report.  COS/MOL at 58-59.  DCG failed to comment on this assigned weakness and we therefore find this protest allegation to have been abandoned.  Batelco Telecomms. Co. B.S.C., B-412783 et al., May 31, 2016, 2016 CPD ¶ 155 at 4 n.5. 

[6] The reporting tasks included monthly [DELETED] reports, [DELETED] reports, monthly [DELETED] reports, [DELETED] reports, and [DELETED] reports.  AR, Tab J, DCG Proposal, at 38. 

[7] To the extent that P4R argues that FEMA should have looked to other sections of its proposal to determine the committee’s role, contracting agencies evaluating one section of a proposal are not obligated to go in search of needed information which the offeror has omitted or failed to adequately present.  Robert F. Hyland & Sons, LLC, B-408940, Dec. 19, 2013, 2013 CPD ¶ 296 at 3.

[8] Although a solicitation must identify all major evaluation factors, it need not identify all areas within each factor that might be taken into account in an evaluation, provided such unidentified areas are reasonably related to, or encompassed by, the stated evaluation factors.  Leidos, Inc., B-414773, B-414773.2, Sept. 12, 2017, 2017 CPD ¶ 303 at 5. 

[9] CCPRS was assessed the following significant strength regarding its self-inspection plan: 

[CCPRS] has a self-inspection plan that provides various methods of monitoring such as:  interviews with FEMA staff, interviews with CCPRS staff, staff performance evaluation forms, review of work products/deliverables, random inspections, passive surveillance and internal audits; allowing the [o]fferor to monitor quality and address any concerns or nonconformance during the execution of PA TAC IV.  [CCPRS]’s use of detailed evaluation forms which acknowledge key performance areas such as PA knowledge, production of quality work products, timeliness, professionalism and adherence to established protocols and procedures to show their understanding and commitment to addressing quality concerns. 

AR, Tab B, SSDD, at 69-70; AR, Tab M, TER, at 20. 

[10] DCG’s proposal explains that the define-measure-analyze-improve-control cycle is a business process improvement framework it used to assess the processes in PWS objectives 1 and 2.  AR, Tab J, DCG Proposal, at 31.

[11] DCG argues that FEMA should have gleaned from its proposal that its training is [DELETED] based on statements that it analyzes particular staffing needs “[DELETED]” and “monitor[s] [its] disaster workforce to ensure [DELETED].”  DCG Supp. Comments & Supp. Protest, May 23, 2018, at 12 (citing AR, Tab J, DCG Proposal, at 16).

[12] In this regard, the record does not support DCG’s contention that it would have been rated overall very good under the technical factor had the agency properly considered the solicitation’s stated order of importance for each subfactor.  DCG Protest at 22-23; DCG Comments & Supp. Protest, May 3, 2018, at 19-22.  In a question and answer as to what the assigned weight for each subfactor was, the agency stated:  “In accordance with the solicitation Section M.8, the Government has not assigned weights for each subfactor.  Instead, the Government has assigned ‘order of importance.’”  RFP, Questions & Answers, at 149.  Here, the solicitation stated that the order of importance applied to the factors, not the subfactors.  See RFP at 74.  Consequently, the solicitation did not set forth any order of importance for the subfactors.  See RFP at 76.  As a result, the agency reasonably considered the subfactors to be of equal weight and assigned an overall acceptable rating to DCG under the technical factor based on the agency’s assessment of a very good rating for the technical and management approach and capabilities subfactor; an acceptable rating for the key personnel subfactor; and a marginal rating for the quality control plan subfactor.
