ASRC Communications, Ltd.

B-414319.2,B-414319.3,B-414319.4,B-414319.5: May 9, 2017



DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  ASRC Communications, Ltd.

File:  B-414319.2; B-414319.3; B-414319.4; B-414319.5

Date:  May 9, 2017

Kevin P. Mullen, Esq., Rachael K. Plymale, Esq., and Ethan E. Marsh, Esq., Morrison & Foerster LLP, for the protester.

Craig A. Holman, Esq., Stuart W. Turner, Esq., Michael Samuels, Esq., and Amanda Johnson, Esq., Arnold & Porter Kaye Scholer LLP, for OST, Inc., the intervenor.

Wade L. Brown, Esq., and Stephen J. Faherty, Esq., Department of the Army, for the agency.

Peter D. Verchinski, Esq., and Amy B. Pereira, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protest challenging an agency’s evaluation and award decision is denied where the agency’s evaluation was reasonable and consistent with the terms of the solicitation.

DECISION

ASRC Communications, Ltd., of Beltsville, Maryland, protests the award of a contract to OST, Inc., of Washington, DC, under request for proposals (RFP) No. W900KK-15-R-0012, issued by the Department of the Army, Army Contracting Command-Orlando, for system engineering and technical assistance (SETA II) services.  ASRC challenges the agency’s evaluation of proposals and source selection decision.

We deny the protest.

BACKGROUND

The RFP, anticipating a single award indefinite‑delivery, indefinite-quantity (IDIQ) contract, was issued on August 11, 2015, as a small business set-aside, and sought proposals to provide support services for the agency’s Program Executive Office for Simulation, Training, and Instrumentation (PEO STRI) for a 60-month period.  RFP at 28.  The RFP was issued under Federal Acquisition Regulation (FAR) part 15, and stated that award would be made to the offeror providing the best value to the government, considering the following evaluation factors (listed in descending order of importance):  management approach, sample task order approach, past performance, and price.  RFP at 97.  All non-price evaluation factors, when combined, were significantly more important than price.  RFP at 98.  The management approach evaluation factor was composed of three subfactors (all of equal importance):  corporate capability, transition plans, and recruitment and retention.  Id.

Under the corporate capability subfactor, the RFP provided that the agency would evaluate an offeror’s understanding, completeness, and feasibility of its proposed approach to accomplish the requirements through a review of the offeror’s management capability, risk identification and mitigation, flexibility to adjust staffing levels, subcontractor/teaming partner management methodologies, and the organizational structure’s application to the SETA II requirement, among other things.  RFP at 99.  Under the transition plan subfactor, the RFP provided that the agency would evaluate the completeness and feasibility of the proposed transition plan to ensure continuity of services, phase in/phase out strategies, and a mitigation plan to resolve any potential risks relating to transition.  Id.  Finally, under the recruitment and retention subfactor, the agency would evaluate the completeness and feasibility of the recruitment and retention plan “to ensure a stable and qualified workforce over the life of the contract by review of the Offeror’s ability and approach to recruit and retain qualified personnel . . . .”  Id.

With regard to the agency’s evaluation under the sample task order evaluation factor, the RFP provided that the agency:

will evaluate the completeness and feasibility of the proposed approach, and the Offeror’s overall understanding of the task order requirements set forth in the SETA II Sample Task Order Performance Work Statement, taking into consideration staffing, labor mix, efficiencies, and potential risks with the corresponding mitigations or solutions.

Id.

The agency received 17 proposals, including ASRC’s and OST’s, by the RFP’s September 16, 2015 closing date.  After conducting an initial evaluation, the agency included five proposals in the competitive range.  The agency conducted discussions, and the source selection evaluation board (SSEB) evaluated ASRC’s and OST’s revised proposals as follows:[1]

   

                                      ASRC                      OST

Management Approach
  Corporate Capability                Good                      Good
  Transition Plans                    Good                      Good
  Recruitment and Retention Plan      Good                      Good
  Overall Mgmt. Approach              Good                      Good
Sample Task Order Approach            Acceptable                Good
Past Performance                      Substantial Confidence    Substantial Confidence
Price                                 $141,892,164              $143,336,760

Agency Report (AR), Tab 33, Proposal Evaluation Report, at 19.

The SSEB’s ratings were supported by a narrative, which detailed the evaluators’ findings for each evaluation factor.  With regard to ASRC, the SSEB found a total of eight strengths, no significant strengths, no weaknesses, and no deficiencies under the management approach evaluation factor.[2]  Id. at 91-114.  The eight strengths consisted of two strengths for ASRC’s corporate capability, two strengths for ASRC’s transition plans, and four strengths for ASRC’s recruitment and retention plan.  Id. at 92-114.  For ASRC’s sample task order approach, the SSEB found only one strength, no significant strengths, no weaknesses, and no deficiencies.  Id. at 115-117. 

With regard to OST, the SSEB again found a total of eight strengths, no significant strengths, no weaknesses, and no deficiencies under the management approach evaluation factor.  Id. at 132-160.  The eight strengths consisted of three strengths for OST’s corporate capability, three strengths for OST’s transition plans, and two strengths for OST’s recruitment and retention plan.  Id. at 132-160.  For OST’s sample task order approach, the SSEB found three strengths, no significant strengths, no weaknesses, and no deficiencies.  Id. at 160-174. 

The source selection authority (SSA) reviewed the evaluators’ results and conducted a comparative assessment of the proposals under each evaluation factor.  AR, Tab 36, Source Selection Decision Document (SSDD), at 1-44.  The SSA concurred with the SSEB’s findings, incorporated them into his decision, and conducted a best-value tradeoff analysis.  With regard to ASRC’s lower‑rated and lower‑priced proposal, the SSA noted that the firm offered a lower price, but that OST offered “slight advantages” under the management approach for teaming with the incumbent, and that OST offered a more thorough approach and understanding of the sample task order requirement.  Id. at 43.  The SSA ultimately concluded that the advantages associated with OST’s proposal warranted paying “a contract price premium of 1%.”  Id.

The agency made award to OST, and this protest followed.

DISCUSSION

ASRC raises several challenges to the agency’s evaluation and award decision.  The protester contends that the agency’s evaluation of the offerors’ management approach was flawed because the agency engaged in disparate treatment in assigning strengths to the proposals, and improperly assigned a strength to OST’s proposal.  The protester also challenges the agency’s evaluation of the offerors’ sample task order approach, again asserting that the agency engaged in disparate treatment and improperly assigned strengths to the awardee’s proposal.  ASRC additionally alleges that the agency’s source selection decision was flawed because the agency improperly overvalued certain strengths of the awardee’s, and failed to recognize certain strengths of the protester’s.  For the reasons discussed below, we find no basis to sustain the protest.[3]

Management Approach

ASRC challenges the agency’s evaluation of OST’s and ASRC’s proposed management approach.  The protester contends that the evaluators assigned strengths unreasonably and unequally under the transition plan subfactor.  ASRC points to certain strengths that the agency assigned to OST’s proposal and asserts that either the strength was undeserved, or that the agency failed to recognize a similar strength in ASRC’s own proposal.  Specifically, ASRC alleges that it was improper for the agency to assign strengths to OST’s proposal (for example, for proposing [DELETED]) without acknowledging strengths in ASRC’s own proposed plans to retain incumbent personnel.[4]  We have reviewed each of the protester’s contentions in this regard and find that they do not provide a basis to sustain the protest.  

In reviewing a protest challenging an agency’s technical evaluation, our Office will not reevaluate the proposals; rather, we will examine the record to determine whether the agency’s evaluation conclusions were reasonable and consistent with the terms of the solicitation and applicable procurement laws and regulations.  The Kenjya Group, Inc.; Academy Solutions Group, LLC, B‑406314, B-406314.2, Apr. 11, 2012, 2012 CPD ¶ 141 at 4.  A protester’s disagreement with the agency’s conclusions, without more, does not render the evaluation unreasonable.  The Eloret Corp., B‑402696, B-402696.2, July 16, 2010, 2010 CPD ¶ 182 at 5.

Here, the protester argues that the strengths the awardee received under this evaluation factor were undeserved.  Specifically, the protester asserts that it was improper to assign OST a strength for [DELETED] because this was a “general statement” and the evaluators lacked “information with which they might determine how this would actually affect the employees’ likelihood of making the transition.”  Protester’s Comments, Apr. 5, 2017, at 5.  However, the agency’s evaluation found that [DELETED] was not something that was typically done, and that such a practice “would be considered above and beyond normal recruitment strategies.”  AR, Tab 36, SSDD, at 26.  ASRC does not challenge the agency’s finding in this regard, and its argument that the agency lacked sufficient detail to know whether this would ultimately affect recruitment amounts to mere disagreement with the agency’s determination.  While the protester essentially seeks to substitute its judgment for the agency’s exercise of discretion, our Office will not sustain a protest based upon a protester’s disagreement with an agency’s technical judgments where the protester has not shown that the agency’s evaluation lacks a reasonable basis.  See BNL, Inc., B‑409450, B‑409450.3, May 1, 2014, 2014 CPD ¶ 138 at 7.

The protester also argues that the agency engaged in an unequal evaluation of proposals under the transition plan.  In this regard, the protester argues that the agency found strengths in the awardee’s proposal for the awardee’s transition plan, while failing to acknowledge similar strengths of its own plan.  The protester points to aspects of its proposal, such as its plan to “hire at least 98% of the incumbent workforce,” its experience in achieving this level of retention on past contracts, and its specific strategies for achieving this level of hiring (including prioritizing early communications and offering retention bonuses), and argues that the agency unreasonably failed to acknowledge strengths for ASRC’s plan to retain incumbent personnel.  Protester’s Comments, Mar. 13, 2017, at 3.  We find no merit in these allegations.

As an initial matter, the agency explains, and the record shows, that where the agency found both proposals to have similar strengths, the agency awarded strengths to both offerors.  For example, OST and ASRC each received a strength for assigning a transition manager to facilitate the transition.  AR, Tab 36, SSDD, at 20, 26.  Here, however, the record shows that the remaining strengths assigned to the proposals were for unique aspects of the proposals.  Thus, ASRC received a strength for its post-transition assessment period, a strength not assigned to OST’s proposal.  Likewise, OST received strengths for its “pre-identification of incumbents that possess superior institutional knowledge” and for its [DELETED], strengths not assigned to ASRC’s proposal.  Id. at 26.  While ASRC asserts that it, too, “had a robust transition plan,” the agency concluded that the plan contained only general statements that did not warrant a strength.  Supplemental AR, Mar. 31, 2017, at 18‑19; Protester’s Comments, Mar. 13, 2017, at 3.  As the agency explains, it viewed ASRC’s transition plan in this regard as an approach that met the requirements but lacked detail.  Supplemental AR, Mar. 31, 2017, at 18‑19.  Given this explanation, we find nothing unreasonable in the agency’s determination not to award ASRC’s proposal a strength for its transition plan.

Sample Task Order Approach

ASRC challenges the agency’s evaluation of the offerors’ proposed sample task order approach.  The protester again contends that the evaluators assigned strengths unreasonably and unequally under this evaluation factor.  ASRC points to certain strengths that the agency assigned to OST’s proposal and asserts that either the strength was undeserved, or that the agency failed to recognize a similar strength in ASRC’s own proposal.  We have reviewed each of the protester’s contentions in this regard and find that they do not provide a basis to sustain the protest.  

For example, ASRC asserts that it was improper for the agency to assign a strength to OST’s proposed task order approach for proposing “an innovative way to support Fort Polk requirements that would result in a cost avoidance of over $70k.”  AR, Tab 36, SSDD, at 28.  ASRC asserts that OST’s cost savings were not repeatable on future task orders, and thus did not satisfy the definition of a “strength.”  Protester’s Supp. Comments, Apr. 5, 2017, at 9. 

In response, the agency explains that the strength was not given for saving $70,000 specifically, but for demonstrating an innovative way to perform the requirements.  Supplemental AR, Mar. 31, 2017, at 39.  In this regard, the agency explains that OST’s approach enhanced the merit of its proposal in a way that provided an advantage to the agency during contract performance, by demonstrating that it was able to offer an innovative approach that the Army had not previously considered.  Id. at 36.  Specifically, OST proposed to [DELETED], rather than [DELETED] originally envisioned in the RFP.  Id.  Furthermore, the Army found that this approach not only saved money, but provided other advantages, such as quicker response times and “face-to-face” support for Fort Polk.  AR, Tab 36, SSDD, at 36.  We find nothing improper about the agency’s decision to award a strength for an innovative approach to the sample task that the agency had not previously considered. 

The protester further argues that the agency engaged in an unequal evaluation because it assigned a strength to OST’s proposal for [DELETED] approach (that is, [DELETED]) to performing the work, while ASRC proposed a similar approach and yet did not receive a strength.  Specifically, OST received a strength for:

propos[ing] [DELETED]

AR, Tab 36, SSDD, at 28.  ASRC points to the language in its proposal that states the firm had:

[DELETED]

Protester’s Comments, Mar. 13, 2017, at 8 (quoting AR, Tab 19, ASRC’s Sample Task Order Approach Volume II, at 9).  Based on this language, ASRC argues it should have received a similar strength.

Where a protester alleges unequal treatment in a technical evaluation, it must show that the differences in ratings did not stem from differences in the proposals.  See Northrop Grumman Sys. Corp., B‑406411, B‑406411.2, May 25, 2012, 2012 CPD ¶ 164 at 8.

While the protester asserts generally that it was entitled to a strength for offering the same approach as the awardee, the protester has failed to explain why its approach was the same as the awardee’s.  In this regard, the protester points to two sentences in its proposal that indicate it would create “operational efficiencies” by allocating hours and CMEs to task workload estimates, and not assigning specific CMEs to one office or unit.  Protester’s Comments, Mar. 13, 2017, at 8.  In contrast, OST’s proposal provided that it was [DELETED].  AR, Tab 42, OST Sample Task Order Proposal, at 1-10.  In this regard, OST’s proposal spent several pages demonstrating how its approach affected numerous areas of the PWS.  Id. at 5-7.  The agency’s evaluation then cited those pages from OST’s proposal that demonstrated OST’s proposed functional approach, identified which sections of the PWS were affected by the approach, and concluded that the firm had shown how its approach could increase effectiveness and efficiency.  AR, Tab 47, Technical Eval. Consensus Form, at 5‑13.  Given these differences, we find nothing improper in the agency’s finding that OST’s proposed approach was entitled to a strength, while ASRC’s was not.

Finally, the protester argues that, in assigning the protester a rating of “acceptable” under the sample task order approach factor, the agency improperly found that its risk was “moderate.”  Protest at 11-12.  As stated above, an acceptable rating was defined as:

Proposal meets requirements and indicates an adequate approach and understanding of the requirements.  Strengths and weaknesses are offsetting or will have little or no impact on contract performance.  Risk of unsuccessful performance is no worse than moderate. 

RFP at 102.  The protester argues that the agency has failed to provide adequate justification for the agency’s evaluators’ finding of risk, as the evaluation record contains no explanation as to why it received a moderate risk rating, rather than a low risk rating.  Protester’s Comments at 8.  Furthermore, the protester points out that the agency assigned no weaknesses to its proposal under this evaluation factor.


As an initial matter, the protester’s contention that it received a moderate risk rating is not supported by the record.[5]  Instead, the record shows that the agency rated the firm’s risk as “no worse than moderate,” in accordance with the RFP’s definition of an acceptable rating.  AR, Tab 33, Proposal Evaluation Report, at 115.

Moreover, the agency’s conclusion that ASRC’s risk of unsuccessful performance was “no worse than moderate” was a conclusion based on the agency’s assessment of ASRC’s sample task order approach.  As the agency explains, the agency identified only one strength under ASRC’s proposed sample task order approach, and found that ASRC’s approach met the requirements, with an adequate approach and understanding.  AR, Tab 36, SSDD, at 22.  With regard to that single strength--that ASRC had demonstrated a clear understanding of the SETA PWS requirements, and demonstrated a “well-rounded multi-pronged process to develop their . . . approach”--the agency found that this strength alone did not rise to the level of a rating higher than acceptable.[6]

To the extent the protester is arguing that the agency’s failure to identify any weaknesses in ASRC’s proposal entitled the offeror to a risk rating that was “low” (and, thus, a good rating), the difference between a “low risk” (good) rating and a “no worse than moderate” (acceptable) rating was not simply the existence (or lack) of any weaknesses.  In this regard, the RFP did not require the agency to assign a good rating when the agency determined that a proposal contained only strengths under a particular factor.  See RFP at 102.  Conversely, to assign an acceptable rating, the RFP did not require the agency to identify a weakness or a moderate risk.  Id.  Rather, the definitions contemplated a nuanced assessment of the quality of a proposal.  As evidenced by the plain language, the definitions required the consideration of several criteria, including a qualitative assessment as to whether the proposal indicated a thorough approach and understanding, or merely an adequate approach.  Id.  The record shows that the evaluators assigned ASRC’s proposal a rating of acceptable for this factor because the agency determined that ASRC’s proposed approach met all the requirements and demonstrated an adequate approach, but did not rise to the level of a “thorough” approach and understanding.  We see no basis to conclude that the agency’s assignment of an acceptable rating (and thus a “no worse than moderate” risk) was unreasonable.

Best‑Value Decision


The protester argues that the agency’s best‑value decision was improper because it relied upon “an unreasonable and unequal preference” for OST “on the basis of nothing more than [OST’s] teaming with the incumbent.”  ASRC’s Comments, Mar. 13, 2017, at 10.  In support, ASRC points out that the source selection authority “mentions the fact that OST’s subcontractor is the incumbent no fewer than a dozen times in the comparison of offerors.”  Id.

While ASRC argues that the agency’s decision “overemphasized” the awardee’s use of the incumbent, there is nothing improper about an agency noting that the awardee proposes to use the incumbent in its approach, and finding such an approach valuable.  The existence of an incumbent advantage, in and of itself, does not constitute preferential treatment by the agency, nor is such a normally occurring advantage necessarily unfair.[7]  Government Bus. Servs. Group, B-287052 et al., Mar. 27, 2001, 2001 CPD ¶ 58 at 10.

The protester also argues that the agency’s source selection decision improperly emphasized one of the awardee’s strengths, while finding the protester’s same strength to be of little weight.  Specifically, the protester points out that both it and OST received an identical strength, under the sample task order approach evaluation factor, for “a clear understanding of SETA II Sample Task Order Performance Work Statement Requirements . . . .”  However, the agency found OST’s strength to be “advantageous to the Government during contract performance,” while ASRC’s strength would have “little impact on contract performance.”  AR, Tab 36, SSDD, at 22, 28, 40, 41.

Here, the record shows that, while the agency used the same language to identify both strengths, the SSA viewed the underlying merits of the proposals differently.  As to ASRC’s sample task order approach, the SSA viewed ASRC’s multipronged approach as “well rounded” and having “merit,” but ultimately having little impact on contract performance.  Id. at 41.  In contrast, the SSA viewed OST’s approach, which involved utilizing the firm’s own approach along with the “incumbent’s, their sole teammate, expert judgment and five-year’s experience . . . to determine the support requirements,” and then “validating those results [DELETED],” as an approach that would result in a “thorough understanding of the specific support requirements.”  Id.  Given that the strengths referred to differing underlying proposal approaches, where the agency valued the awardee’s approach more highly, we find nothing improper in the SSA’s conclusion that the awardee’s approach offered advantages during contract performance, while the protester’s approach would have little or no impact on performance.

We deny the protest.

Susan A. Poling
General Counsel



[1] Consistent with the RFP, the agency assigned combined technical/risk ratings of outstanding, good, acceptable, marginal, or unacceptable to the management approach  and sample task order approach evaluation factors.  RFP at 102.  As relevant to this protest, a good rating was defined as a proposal that meets the requirements and indicates a thorough approach and understanding of the requirements, contains strengths which outweigh any weaknesses, and has a low risk of unsuccessful performance.  Id.  An acceptable rating was defined as a proposal that meets the requirements and has an adequate approach and understanding of the requirements, contains strengths and weaknesses that are offsetting or have little or no impact on contract performance, and has a risk of unsuccessful performance that is no worse than moderate.  Id.  The agency also assigned past performance ratings of substantial confidence, satisfactory confidence, limited confidence, no confidence, or unknown confidence.  Id. at 102-103.

[2] The RFP provided definitions for a significant strength, strength, weakness, significant weakness, and deficiency.  RFP at 103.  As relevant to this protest, a strength was defined as “an aspect of an Offeror’s proposal that has merit or exceeds specified performance or capability requirements in a way that will be advantageous to the government during contract performance.”  Id.

[3] To the extent we do not address certain arguments or variations of arguments presented during the course of the protest, we have considered all of the allegations and find that none provides a basis for sustaining the protest.  Furthermore, ASRC withdrew several of its protest grounds.  For example, ASRC’s third supplemental protest challenged the agency’s evaluation of OST’s management approach on the basis that OST’s proposal had exceeded the page limitation.  ASRC Supp. Protest, Mar. 20, 2017, at 1.  After receiving the agency’s supplemental report, ASRC withdrew this ground of protest.  ASRC Comments, Apr. 5, 2017, at 1 n.1.

[4] The protester also alleges that the agency unreasonably assigned a strength to OST’s proposal under the recruitment and retention subfactor.  Specifically, the protester alleges that the strength--which was for [DELETED]--did not provide anything more than a trivial benefit to the agency, since “most staffing agencies perform background checks above and beyond security checks,” and since the awardee proposed to use the incumbent work force, which rendered any additional checks of their skills and background unnecessary.  Protester’s Comments, at 4.  In response, the agency explains that, even with the awardee’s plan to hire incumbent personnel, “there will be some turnover during a five year period of performance for which OST’s plan to hire qualified personnel will add value.”  Agency’s Supp. Report, Mar. 31, 2017, at 27.  Given the agency’s reasonable explanation for how OST’s proposed approach could add value and increase the likelihood of successful contract performance, we find nothing improper with the agency’s assignment of a strength under this subfactor.

[5] The agency points out that the record consistently refers to the protester’s evaluated risk as “no worse than moderate.”  Agency Supp. Report, Mar. 31, 2017, at 54.

[6] In its initial protest, ASRC argued that the strength the agency identified here entitled the firm to a higher sample task order rating.  Protest at 10‑11.  In its report, the agency explained that it viewed ASRC’s strength as applying to only one of several aspects of the sample task order approach that were to be evaluated under the solicitation, and thus the agency viewed its acceptable rating as reasonable.  AR, Mar. 1, 2017, at 42‑45.  The protester did not provide a response in its comments, and consequently we consider the issue to be abandoned.  Analex Space Sys., Inc.; PAI Corp., B-259024, B-259024.2, Feb. 21, 1995, 95-1 CPD ¶ 106 at 8 (where an agency’s report specifically addresses issues raised by the protester, and the protester fails to address the agency’s responses in its comments, we consider the issues to have been abandoned by the protester and will not further consider them).

[7] To the extent the protester is arguing that the agency’s source selection decision unfairly relied on the awardee’s use of the incumbent without also considering ASRC’s proposed approach of “hiring the entire incumbent workforce,” the agency explains, and we agree, that OST’s approach offered advantages not found in ASRC’s approach of simply hiring the incumbent workforce.  For example, the agency found that OST proposed to use the incumbent’s “existing systems and processes” during contract performance, rather than the incumbent workforce having to learn ASRC’s new systems and processes.  Agency Supp. Report, Mar. 31, 2017, at 60-61; AR, Tab 36, SSDD, at 39.
