OnPoint Consulting, Inc.

B-417397.3; B-417397.5; B-417397.6, October 3, 2019

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  OnPoint Consulting, Inc.

File:  B-417397.3; B-417397.5; B-417397.6

Date:  October 3, 2019

Kevin J. Maynard, Esq., Cara L. Lasley, Esq., and Sarah B. Hansen, Esq., Wiley Rein LLP, for the protester.
David S. Cohen, Esq., Norah D. Molnar, Esq., John J. O’Brien, Esq., and Daniel J. Strouse, Esq., Cordatis LLP, for Data Systems Analyst, Inc., the intervenor.
Vera A. Strebel, Esq., Anthony J. Balestreri, Esq., and Nati Silva, Esq., Defense Information Systems Agency, for the agency.
John Sorrenti, Esq., Glenn G. Wolcott, Esq., and Christina Sklarew, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1.  Protest that the agency’s technical evaluation was unequal is denied where the record shows that the agency assessed strengths to the awardee and not to the protester as a result of differences in the offerors’ proposals.

2.  Protest that agency applied an unstated evaluation criterion in assessing a strength to the awardee’s proposal is denied where the agency assessed the strength based on an aspect of the awardee’s proposal that was reasonably encompassed by the solicitation’s evaluation criteria.

3.  Protest that agency failed to reasonably evaluate price realism and reasonableness is denied where the record shows the agency’s evaluation was reasonable and consistent with the solicitation.

4.  Protest that the agency’s best-value determination was flawed is denied where record shows that the agency meaningfully considered the differences in offerors’ proposals and adequately documented its tradeoff decision.

DECISION

OnPoint Consulting, Inc. (OnPoint), of Arlington, Virginia, protests the issuance of a task order to Data Systems Analyst, Inc. (DSA) of Trevose, Pennsylvania, by the Defense Information Systems Agency under request for proposals (RFP) No. 831710869, for information and knowledge management solutions and services.  OnPoint protests that the agency’s technical evaluation was unequal and applied an unstated evaluation criterion; the agency failed to reasonably evaluate DSA’s most probable cost and did not meaningfully evaluate price reasonableness; and the agency conducted a flawed best-value determination.[1]

We deny the protest.

BACKGROUND

The agency issued the RFP on July 23, 2018, under the National Institutes of Health (NIH) information technology acquisition and assessment center Chief Information Officer-Solutions and Partners 3 (CIO-SP3) governmentwide multiple-award indefinite-delivery, indefinite-quantity (IDIQ) contract.  Agency Report (AR) Tab 1, RFP at 1.  The RFP sought services to support the Product Lead Military Technical (MilTech) Solutions with expertise in developing, acquiring, fielding, sustaining, and enhancing MilTech’s suite of applications and systems.[2]  AR, Tab 1A, PWS, at 3.  This includes support such as systems engineering, project management, systems administration, and technical support to provide a variety of services for MilTech’s technology portfolio.  Id.

The RFP contemplated the issuance of a cost-plus-fixed-fee/fixed-price task order with a 1-year base period and four 1-year option periods.  RFP at 1.  The solicitation provided for a best-value tradeoff based on a technical/management approach factor and a cost/price factor.  Id. at 3-4.  The technical/management approach factor was more important than the cost/price factor, and was evaluated using the following six subfactors, which were all of equal importance:  (1) MilTech sustainment; (2) MilTech software development; (3) infrastructure requirements; (4) MilTech capabilities; (5) program management; and (6) cyber security.  Id.  The infrastructure requirements, MilTech capabilities, and program management subfactors are relevant to this protest.

Under the infrastructure requirements subfactor, offerors were evaluated based on their “knowledge and capability to operate, administer, maintain, and upgrade the server infrastructures and integration effort supporting the MilTech products and applications.”  Id. at 3.  Offerors also had to demonstrate “experience with similar [Department of Defense] [e]nterprise class technology infrastructure.”  Id. at 3-4.  Under the MilTech capabilities subfactor, offerors were evaluated based on their “knowledge and capability to maintain the secure environments of milSuite, SharePoint, and the [single interface to the field (SIF) information technology service management (ITSM)] tool, as well as customize and develop enhancements to these capabilities.”[3]  Id. at 4.  Finally, the program management subfactor provided that an offeror’s management approach and staffing plan had to “address how it will hire, retain, track, and manage all resources to support this requirement in accordance with the PWS requirements.”  Id.

For the cost/price factor, the RFP stated that proposals would be evaluated to determine if the proposed price is reasonable and complete, and that the cost reimbursable contract line item numbers (CLINs) would be evaluated for realism.  RFP at 4.  The RFP also explained that the agency might require surge support during the base or any option period, but that any requirement for surge support was optional and not guaranteed.  Id. at 6.  The RFP instructed offerors to propose for the surge support CLIN an amount equal to 35 percent of the offeror’s total proposed cost for the base and all option periods.  Id.  The RFP also stated that surge support would be provided “at the same labor rates proposed and found fair and reasonable at time of contract/task order award for the applicable period of performance.”  Id.
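
As a minimal worked example of this pricing instruction (using an illustrative total, not a figure from the record), an offeror whose proposed cost for the base and all option periods totaled $100,000,000 would enter a surge support CLIN amount of:

\[
0.35 \times \$100{,}000{,}000 = \$35{,}000{,}000
\]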

The agency received proposals from three offerors, including OnPoint and DSA.  Following discussions, the TET evaluated the proposals and assigned a technical/risk rating for each subfactor.  See AR, Tab 7, Selection Recommendation Document (SRD).  The ratings the agency used are, from highest to lowest, blue/outstanding, purple/good, green/acceptable, yellow/marginal, and red/unacceptable.[4]  AR, Tab 1F, Evaluation Tables.  The final ratings assigned to the proposals submitted by OnPoint and DSA are as follows:

Technical/Management Approach Factor

  Subfactors                       OnPoint         DSA
  MilTech Sustainment              Acceptable      Acceptable
  MilTech Software Development     Acceptable      Acceptable
  Infrastructure Requirements      Acceptable      Good
  MilTech Capabilities             Good            Good
  Program Management               Acceptable      Good
  Cyber Security                   Acceptable      Acceptable

Cost/Price Factor                  $108,807,460    $128,994,148

AR, Tab 7, SRD, at 29-30.

Under the infrastructure requirements subfactor, the agency noted that DSA’s proposal was assigned two strengths “for its experience in migrating applications to [an] approved environment and its ability to upgrade [the] [g]overnment’s information system,” while OnPoint’s proposal was not assigned any strengths; therefore DSA’s solution was considered superior to OnPoint’s under this subfactor.  Id. at 31. 

Under the MilTech capabilities subfactor, OnPoint’s proposal was assigned two strengths for “proposing two enhancements to MilSuite environment,” while DSA was assigned two strengths for “proposing to enhance [the] MilSuite environment by proposing to transition [DELETED]; and . . . demonstrating superior knowledge and capability to maintain the secure environment of [the] SIF ITSM tool.”  Id.  The TET noted that while each offeror was assigned two strengths under the MilTech capabilities subfactor for proposing enhancements to MilSuite, “DSA is the only offeror that proposed to enhance MilSuite and demonstrated superior knowledge and capability to maintain the secure environment of SIF ITSM tool - two separate environments.”  Id.  The TET thus concluded that under the MilTech capabilities subfactor, DSA’s solution was “superior to OnPoint’s solution . . . because while OnPoint’s solution resulted in two strengths for one single environment, DSA’s solution resulted in two strengths for two separate environments.”  Id. 

Finally, under the program management subfactor, DSA was assigned two strengths for having a robust recruitment strategy and for utilizing a program management book of knowledge (PMBOK) four-phase approach for hiring personnel.  Id.  OnPoint’s proposal was not assigned any strengths under this subfactor, and therefore the agency found DSA’s solution to be “clearly stronger” than OnPoint’s solution.[5]  Id.  Based on its evaluation, the agency found that DSA proposed a “technically superior solution” to OnPoint under the technical/management approach factor.  Id.

In evaluating both the technical/management approach and cost/price factors together, the TET noted that OnPoint’s solution offered a 16 percent savings over DSA’s technically superior solution, but that OnPoint’s solution was assigned only two strengths while DSA’s solution was assigned six strengths.  Id. at 34.  Given that the technical/management approach factor was more important than the cost/price factor, the TET recommended DSA for award, concluding that “[a]fter thorough consideration of OnPoint’s technical solution and the cost/price savings that this solution would result in, the evaluation team determined that [the] technically superior solution[] proposed by DSA . . . will represent a better value to the [g]overnment than the cost/price savings associated with OnPoint’s technically inferior solution.”  Id. 
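
The roughly 16 percent savings figure cited by the TET is consistent with measuring OnPoint’s price advantage against DSA’s total evaluated price shown in the table above:

\[
\frac{\$128{,}994{,}148 - \$108{,}807{,}460}{\$128{,}994{,}148} \approx 0.156 \approx 16\%
\]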

The source selection official (SSO) reviewed the TET’s evaluation and “concur[red] with all aspects of the evaluation team’s evaluation of all offerors.”  AR, Tab 8, Price Negotiation Memorandum (PNM), at 7.  The SSO stated that “based on my review of the evaluator’s recommendations and my own independent assessment . . . award to DSA is in the best interest of the [g]overnment.”  Id. at 10.  In making this assessment, the SSO concluded that “OnPoint does not represent the best value to the government” and that “[t]he award will be made to the offeror who proposed a superior technical solution.”  Id.

The agency notified OnPoint of its award decision on June 21, 2019.  Following a debriefing, OnPoint timely filed a protest with our Office.[6]

DISCUSSION

OnPoint’s protest alleges that the agency:  (1) unequally evaluated DSA’s and OnPoint’s proposals under the infrastructure requirements and program management subfactors; (2) applied an unstated evaluation criterion in evaluating the MilTech capabilities subfactor; (3) failed to reasonably evaluate DSA’s most probable cost; (4) failed to evaluate the reasonableness of offerors’ proposed labor rates; and (5) conducted a flawed best-value tradeoff and determination.[7]  The agency contends that its evaluation was reasonable and consistent with the solicitation.  For the reasons discussed below, we deny all of OnPoint’s protest grounds.

Unequal Evaluation

OnPoint contends that the agency’s evaluation of the infrastructure requirements and program management subfactors was unequal.  OnPoint asserts that the agency assessed two strengths to DSA for its approach to cloud migration and proposed use of subject matter experts (SMEs) under the infrastructure requirements subfactor, and one strength for DSA’s hiring and retention activities under the program management subfactor, but failed to assess strengths for OnPoint’s own proposed cloud migration solution, use of SMEs, and hiring and retention approach.  Protester’s Supp. Comments at 3-7, 9-11.  OnPoint maintains that these three aspects of its proposal were the same or better than DSA’s approach, and therefore each should have been assessed a strength.  Id. 

It is a fundamental principle of government procurement that agencies must treat offerors equally, which means, among other things, that they must evaluate proposals in an even-handed manner.  Novetta, Inc., B-414672.4, B-414672.7, Oct. 9, 2018, 2018 CPD ¶ 349 at 15.  Where a protester alleges unequal treatment in an evaluation, it must show that the differences in ratings do not stem from differences in the proposals.  See Credence Mgmt. Sols., LLC; Advanced Concepts and Techs. Int’l, LLC, B-415960 et al., May 4, 2018, 2018 CPD ¶ 294 at 10-11; Credence Mgmt. Sols., Inc., B-417389.2, July 31, 2019, 2019 CPD ¶ 283 at 10. 

Infrastructure Requirements Subfactor - Cloud Migration

OnPoint argues that the agency evaluated proposals unequally for each offeror’s approach to cloud migration under the infrastructure requirements subfactor.  Under this subfactor, the agency assessed the following strength to DSA’s proposal for its proposed approach to cloud migration:

The offeror’s solution has merit or exceeds the requirements of [the infrastructure requirements] subfactor of upgrading the server infrastructures and MilTech products and applications.  Specifically, [o]fferor demonstrates the ability to reduce operating costs and increase performance significantly through upgrading a government information system to a commercial cloud solution . . . .  The [o]fferor has experience in migrating applications to approved cloud environments including milSuite, [National Archives and Records Administration Description and Authority Service], [Department of Homeland Security Immigration and Customs Enforcement] Vine, and [Department of Justice] Victim Notification System . . . .  This has merit or exceeds the requirement of subfactor 3 to upgrade the server infrastructures and integration effort supporting the MilTech products and applications.

AR, Tab 7, SRD at 18.

OnPoint asserts that it proposed a “nearly identical capability” to migrate MilTech applications to the commercial cloud and that it should have received a strength as well.  Second Supp. Protest at 4.  OnPoint contends that, similar to DSA, it proposed a multi-step approach to cloud migration and also had previous experience with cloud migration.  Id. at 4-5; Protester’s Supp. Comments at 4-6.  The agency responds that DSA’s proposal was more detailed in explaining how DSA intended to achieve cloud migration, while OnPoint’s proposal provided “vague” and “general” conclusory statements about how OnPoint would conduct the migration.  Second Supp. Memorandum of Law (MOL) at 7-11.

In response to OnPoint’s protest, the agency provided more explanation of its evaluation of each offeror’s proposed cloud migration approach.  See id. at 6-11; AR, Tab 16, Second Declaration of TET Lead, at 3-9.[8]  The agency noted that DSA’s cloud migration approach included [DELETED], and provided a summary of each step.  Id. at 6.  The agency stated that for the [DELETED], “DSA provided a detailed description of how it would focus on [DELETED] within MilTech that would need to be migrated to a cloud environment.”  Id.  The agency explained that “[b]y conducting an [DELETED], DSA demonstrated that it will obtain the necessary knowledge that would eliminate the expensive re-work later.”  Id.  The agency stated that this approach was beneficial because it would help the government determine whether to migrate certain applications and systems into a cloud environment early in the migration process.  Id. at 6-7.

In the [DELETED], the agency stated that “DSA described how it would [DELETED] in order to determine [DELETED].”  Id. at 7.  The agency explained that this would help ensure that the applications and systems were best positioned for a cloud migration.  Id. 

The agency next described how DSA’s [DELETED], provided a detailed description of how DSA intended to [DELETED], starting with an initial migration of a limited number of applications.  Id. at 7.  Based on the lessons learned from this initial migration, DSA would craft a migration plan for all applications.  Id. 

Finally, in the [DELETED], the agency explained that DSA would conduct [DELETED] for the new cloud environment.  Id.  The agency concluded that DSA’s overall cloud migration approach “would result in the efficient way of migrating MilTech’s applications and systems to a cloud environment and eliminating any costly re-work.”[9]  Id.

In contrast, the agency states, OnPoint’s proposal discussed its cloud migration procedure in general terms that simply showed that OnPoint’s approach met the requirement for cloud migration but did not rise to the level of a strength.  Id. at 7-10.  The agency explains that OnPoint proposed a [DELETED] approach.  In the [DELETED], OnPoint “presented a general description of how it would determine [DELETED], by gathering information from MilTech’s users and researching [DELETED] that are available.”  Id. at 8.  In the [DELETED], OnPoint described in general terms how it would develop a plan for the migration.  Id.  In the [DELETED], OnPoint “again provided general statements concerning its migration strategy, which included [DELETED], and [DELETED] services.”  Id.  The agency concluded that this information showed OnPoint had the ability to conduct cloud migration, but that OnPoint “did not provide any additional information or any additional details concerning its ability to conduct cloud migration that would rise to the level of a [s]trength.”  Id.

In response to the agency’s explanation, OnPoint identifies some similarities in both offerors’ approaches, and challenges the agency’s depiction of its proposed approach as being vague or general.  For example, OnPoint argues that both offerors proposed to conduct an [DELETED] before migrating them to the cloud, and that both offerors had experience in cloud migration. 

While OnPoint is able to identify some similarities in the two offerors’ proposed cloud migration approaches--and thus somewhat counter the agency’s argument that OnPoint’s approach was vague and general--OnPoint has not shown that it proposed a “nearly identical capability” to DSA.  In other words, OnPoint has not shown that it proposed a similar approach for all of the aspects of DSA’s proposal that the agency identified as warranting a strength, and therefore OnPoint has not met its burden to show that the differences in ratings do not stem from differences in the proposals.   

Based on our review of the record, we find unobjectionable the agency’s decision to assess a strength to DSA’s proposal but not to OnPoint’s for cloud migration under the infrastructure requirements subfactor.  The agency’s description of each offeror’s cloud migration approach is generally consistent with the record and provides a reasonable explanation for why the agency determined that DSA’s approach deserved a strength while OnPoint’s approach did not.

Infrastructure Requirements Subfactor - Use of SMEs

OnPoint also argues that the agency evaluated proposals unequally for SMEs under the infrastructure requirements subfactor.  Under this subfactor, the agency assessed the following strength to DSA for its proposed use of SMEs:

Offeror has designated identity [m]anagement SMEs that will collaborate with Army SMEs and provide input to maintain and upgrade current identity management solution used by MilTech capabilities . . . .  By having SMEs for this solution, this benefits the [g]overnment by reducing costs and schedule risk.

AR, Tab 7, SRD at 18.

OnPoint argues that this strength was assessed for “the provision of SMEs for particular solutions and tasks within the PWS . . . .”  Protester’s Supp. Comments at 7.  OnPoint asserts that because it also “offered a wide variety of SMEs to assist in various aspects of performance,” it too should have received a strength for the provision of SMEs.  Second Supp. Protest at 6.

In response, the agency explains that the infrastructure requirements subfactor referenced PWS section 6.6, which required the contractor to provide technical expertise to infrastructure technologies, among other things.  Second Supp. MOL at 12; AR, Tab 1A, PWS at 18.  In particular, the PWS listed two critical issues involving identity management solutions that contractors would be required to resolve.[10]  Second Supp. MOL at 12; AR, Tab 1A, PWS at 18.  The agency states that DSA proposed identity management SMEs that would “support the MilTech identify management initiatives” and would “focus on fostering close relations with [Department of Defense] and Army SMEs . . . to promote MilTech connectivity with future . . . identify management plans . . . [and] provide identity management solutions that support the ability to search for information spanning multiple MilTech capabilities.”  Second Supp. MOL at 13; AR, Tab 5A, DSA Tech/Management Approach Prop., at 21-22.  The agency states that DSA was assessed a strength for its “very detailed explanation concerning how it planned on providing the necessary technical support for MilTech’s infrastructure and how it planned on addressing the two problems outlined in PWS Section 6.6, Subtask 6.”  Second Supp. MOL at 14.  The agency asserts that in contrast, OnPoint’s proposal mentioned its use of SMEs generally, but did not provide detail as to how it planned to use those SMEs to resolve particular issues, which is why it was not assessed a strength.  Id. at 18-19. 

On this record, we find that the agency did not evaluate unequally the proposed use of SMEs under the infrastructure requirements subfactor.  The record supports the agency’s explanation that it assessed a strength to DSA for its detailed description of how DSA would utilize identity management SMEs to collaborate with other SMEs to solve the critical issues identified in the PWS.  This is consistent with the contemporaneous evaluation, which stated that DSA received a strength because it “has designated identity management SMEs that will collaborate with Army SMEs and provide input to maintain and upgrade current identity management solution used by MilTech capabilities.”  AR, Tab 7, SRD at 18.  OnPoint has not shown that its proposal contained a similar description of the use of SMEs, let alone specific identity management SMEs.  Thus, the agency’s evaluation here was reasonable.[11] 

Program Management Subfactor

OnPoint also contends that the agency evaluated proposals unequally under the program management subfactor.  Under this subfactor, the agency assessed DSA’s proposal the following strength:

The [o]fferor conducts their hiring and retention activities in the content [sic] of Program Management Book of Knowledge (PMBOK) Project Management Institute (PMI) based project phases 1-4, 1. Initiating; 2. Planning; 3. Executing; 4. Monitoring/Controlling . . . .  This has merit or exceeds the requirements of [the program management] subfactor . . . to address how the offeror will hire, retain, track, and manage all resources to support the award.  This benefits the [g]overnment by reducing manpower cost by ensuring the correct skillset fulfill[s] the requirements and reduces [o]fferor turnover.

AR, Tab 7, SRD, at 22.

OnPoint argues that the agency’s evaluation was unequal because OnPoint proposed a similar approach to hiring and retaining personnel that utilized distinct phases and provided the same benefits, and therefore OnPoint should have received a strength as well.  Protester Supp. Comments at 10-11.  The agency contends that DSA was assessed a strength because it proposed a detailed approach that had merit or exceeded performance requirements, while OnPoint provided a general description of its approach that met the requirements but did not contain additional detail that represented a strength.  Second Supp. MOL at 24-29.

Based on our review of the record, we find reasonable the agency’s evaluation and decision to assess a strength to DSA’s proposal, but not to OnPoint’s.  The agency reasonably determined that DSA’s proposal contained specific details about how it would conduct its hiring and retention to conclude that it had “merit or exceeds the requirements of [the program management] [s]ubfactor” and therefore deserved a strength.  For example, as the agency explains, DSA’s proposal detailed how it would identify potential employees, screen and interview them, confirm they are appropriate candidates for the respective positions, and ultimately hire them.  Second Supp. MOL at 27; AR, Tab 16, Second Decl. of TET Lead, at 22-23; see also AR, Tab 5A, DSA Tech/Management Approach Prop., at 39.  The agency also notes that DSA proposed to retain employees using a mix of training, competitive salaries and employee benefits, and other opportunities for professional development.  Second Supp. MOL at 28; AR, Tab 16, Second Decl. of TET Lead, at 22-23; see also AR, Tab 5A, DSA Tech/Management Approach Prop., at 39.  The agency also highlights DSA’s proposal to address employee absences and ensure that all positions would be filled with the properly qualified people.  Second Supp. MOL at 28; AR, Tab 16, Second Decl. of TET Lead, at 22-23; see also AR, Tab 5A, DSA Tech/Management Approach Prop., at 39-40.

We find further that the record supports the agency’s explanation that OnPoint’s proposal lacked detail showing that it deserved a strength.  For example, the agency explained that OnPoint proposed to conduct an initial review and analysis of the PWS to determine the labor mix for the contract.  Supp. MOL at 25; AR, Tab 16, Second Decl. of TET Lead, at 19-22; see also AR, Tab 5B, OnPoint Tech/Management Approach Prop., at 33-35.  After review and evaluation of this approach and OnPoint’s proposed labor mix, the agency concluded that this approach demonstrated only that OnPoint proposed a sufficient labor mix to perform the PWS requirements, but did not rise to the level of a strength.  Supp. MOL at 25; AR, Tab 16, Second Decl. of TET Lead, at 19-22.  The agency also explained that OnPoint’s statement that it had a pipeline of qualified candidates met the hiring requirement to have access to various pools of candidates, but did not demonstrate why this amounted to a strength.  Supp. MOL at 26; AR, Tab 16, Second Decl. of TET Lead, at 19-22; see also AR, Tab 5B, OnPoint Tech/Management Approach Prop., at 35-36.  Based on our review of the record, we find nothing objectionable with this evaluation; the agency reasonably concluded that OnPoint’s proposal met the requirements of the program management subfactor, but fell short of exceeding the solicitation requirements, and therefore did not deserve a strength.

Unstated Evaluation Criterion

OnPoint contends that the agency applied an unstated evaluation criterion to the MilTech capabilities subfactor when it assessed a strength to DSA’s proposal for DSA’s expertise in both BMC Remedy--the current ITSM tool--and ServiceNow--a competing ITSM product that is under consideration by the Army for designation as the standard ITSM tool.  Protester Supp. Comments at 8-10; see also AR, Tab 7, SRD at 19.  OnPoint argues that the evaluation criteria for the MilTech capabilities subfactor informed offerors that the agency would evaluate offerors’ approaches and experience only with the existing ITSM tool, and that expertise with a potential new ITSM tool was not reasonably understood to be part of the evaluation criteria.  Protester’s Supp. Comments at 9.

In response, the agency notes that the MilTech capabilities subfactor referenced PWS section 6.6, which required offerors to “support any changes to the technical architecture and baseline” and to “provide recommendations to the [g]overnment on growing technology infrastructure trends and available solutions.”  Second Supp. MOL at 20; AR, Tab 1A, PWS, at 18.  The agency argues that this language informed offerors that they would be required to support and propose recommendations for changes to the technical architecture and baseline.  Id.  The agency maintains that, given this language in the RFP and PWS, it was reasonable to assess a strength to DSA’s proposal for its ability to support ServiceNow.  Second Supp. MOL at 21-22.

Agencies are not required to identify all areas of each factor that might be taken into account in an evaluation, provided the unidentified areas are reasonably related to, or encompassed by, the established factors.  Northrop Grumman Sys. Corp., B-414312 et al., May 1, 2017, 2017 CPD ¶ 128 at 12; see also Global Analytic Info. Tech. Servs., Inc., B-298840.2, Feb. 6, 2007, 2007 CPD ¶ 57 at 4.

On this record, we find nothing objectionable about the agency’s assessment of a strength to DSA for its expertise with the ServiceNow product.  Contrary to OnPoint’s assertion, the evaluation criteria for assessing the MilTech capabilities subfactor was not limited to the existing ITSM toolset.  Rather, the RFP stated that the agency would evaluate the knowledge and capability to maintain the secure environments of the SIF ITSM tool, but did not restrict this to the existing SIF ITSM tool.  This allowed for the possibility that the SIF ITSM tool could change at some point during performance and the contractor would still need to maintain the secure environment for that new ITSM tool.  The language in the PWS also required offerors to support changes to the architecture and recommend technology infrastructure trends; a movement to a new ITSM tool could be one of those changes or trends.  As a result, there was nothing improper about the agency’s consideration of an offeror’s expertise with the existing ITSM tool and a potential new ITSM tool, as this was logically encompassed within the evaluation criteria.[12]  The agency thus reasonably assessed a strength to DSA’s proposal for its expertise with ServiceNow.[13]

Cost Realism Evaluation

OnPoint argues that the agency failed to reasonably evaluate the realism of DSA’s costs.  Specifically, OnPoint alleges that the agency made no most probable cost adjustments to DSA’s proposal, even though one of DSA’s subcontractors proposed a [DELETED], and another subcontractor proposed labor rates for two labor categories that were lower than the rates previously paid.  Protester’s Supp. Comments at 11-13.  OnPoint claims that these are the “exact types of concerns” that should have led the agency to make an upward adjustment to DSA’s proposed costs.  Second Supp. Protest at 12.

In response, the agency asserts that it properly determined that the profit and labor rates proposed by DSA’s subcontractors were realistic and did not warrant upward adjustments.  Second Supp. MOL at 30.  The agency explains that it issued evaluation notices (ENs) to each subcontractor, both of which provided satisfactory responses to the ENs.  Id. at 29-33.  The agency also contends that its conclusion was supported by its review of DSA’s own analysis of its subcontractors, which determined that the proposed labor rates were realistic.  Id.

An agency is not required to conduct an in-depth cost analysis, or to verify each and every item in assessing cost realism; rather, the evaluation requires the exercise of informed judgment by the contracting agency.  AdvanceMed Corp.; TrustSolutions, LLC, B-404910.4 et al., Jan. 17, 2012, 2012 CPD ¶ 25 at 13.  While an agency’s cost realism analysis need not achieve scientific certainty, the methodology employed must be reasonably adequate and provide some measure of confidence that the rates proposed are reasonable and realistic in view of other cost information reasonably available to the agency at the time of its evaluation.  Tantus Techs., Inc., B-411608, B-411608.3, Sept. 14, 2015, 2015 CPD ¶ 299 at 10.  Our review of an agency’s cost realism evaluation is limited to determining whether the cost analysis is reasonably based and not arbitrary.  TriCenturion, Inc.; SafeGuard Servs., LLC, B-406032 et al., Jan. 25, 2012, 2012 CPD ¶ 52 at 6.

We find the agency’s cost realism evaluation to be reasonable.  As OnPoint contends, the record confirms that one of DSA’s subcontractors, Integrated Data Services (IDS), proposed a [DELETED], while a second subcontractor, PKIMM, Inc., proposed direct labor rates for two labor categories that were lower than the rates the company was paying for these positions.  AR, Tab 6, Cost/Price Eval. Report, at 10.  Recognizing these issues, the agency issued ENs to IDS and PKIMM requesting that IDS explain how it intended to recover or redistribute its [DELETED], and that PKIMM provide documentation to support the proposed lower direct labor rates.  AR, Tab 5A1, DSA Cost/Price Supporting Documentation, at G-xxxvi, G-xxxviii.

In its response to the EN, IDS stated:

IDS is proposing a [DELETED].  IDS is accepting a [DELETED] at this time as it sees this work as strategic in nature.  This work will afford IDS the ability to: 1) continue supporting a long-term customer; 2) maintain an incumbent, long-term employee; 3) be able to show a relevant past performance for future work by continuing to support this customer; and 4) potentially be able to leverage the surge CLIN on this contract for additional work down the line which will potentially [DELETED] on this specific position.

AR, Tab 18, IDS Cost/Price Supporting Documentation, at A-3.  PKIMM responded to the EN by providing payroll screen shots that supported its direct labor rates and explaining that it “made a management decision to propose[] a reduced base rate for these labor categories” and that “[t]he reduction to each proposed base rate was made so that our proposed loaded labor rates are the same as the negotiated loaded labor rates for these categories of labor on our existing subcontract for this effort.”  AR, Tab 6, Cost/Price Eval. Report, at 6; AR, Tab 19F, Cost/Price Pay Rate Support.  PKIMM also stated that it had performed this effort for multiple years at the proposed rates, “with no degradation in the quality or quantity of required effort.”  Id.

In its cost/price evaluation, the agency acknowledged the responses to the ENs, and also noted that DSA conducted its own subcontractor price analysis by comparing the subcontractors’ proposed fully burdened labor rates with salary survey data from the Economic Research Institute (ERI).  AR, Tab 6, Cost/Price Eval. Report, at 6.  The agency noted that DSA’s analysis determined that all of its subcontractors’ fully burdened proposed labor rates were between the [DELETED] and [DELETED] percentile range of the salary data pulled from ERI.  Id.  Based on the subcontractors’ responses to the ENs and DSA’s analysis, the agency determined that it had no issue with the direct and indirect rates and fees proposed by either subcontractor.  Id.

On this record, we find the agency’s cost realism evaluation unobjectionable.  The agency raised its concerns and received an explanation from each subcontractor supporting the [DELETED] and lower labor rates.  The agency also reviewed DSA’s analysis finding the proposed rates to be within the range of comparable salary data from ERI, and found it to be reasonable.  The record therefore shows that the agency considered these issues and concluded that it could accept the subcontractors’ explanations and make no most probable cost adjustments to the subcontractors’ proposed rates.  We find the agency’s explanation reasonable and deny this protest ground.

Price Reasonableness Evaluation 

OnPoint also alleges that the agency only evaluated the reasonableness of the total proposed prices and failed to meaningfully evaluate the reasonableness of the proposed individual labor rates.  Protester’s Supp. Comments at 13-14.  OnPoint asserts that this was contrary to the solicitation, which “specifically required the [a]gency to evaluate the reasonableness of offerors’ proposed labor rates.”  Id. at 13.

The RFP stated that the agency would evaluate cost/price proposals “using one or more of the techniques defined in [Federal Acquisition Regulation (FAR)] Part 15.404 in order to determine if they are reasonable and complete.”  RFP at 4.  OnPoint claims that the RFP also required the agency to evaluate the reasonableness of the proposed labor rates because the RFP stated that “[s]urge support will be provided at the same labor rates proposed and found fair and reasonable at time of contract/task order award for the applicable period of performance.”  Protester’s Supp. Comments at 13-14; RFP at 6.  We do not agree that this language required the agency to evaluate the reasonableness of individual labor rates.  Rather, the RFP states that the agency would evaluate cost/price proposals for reasonableness using one of the techniques described in FAR § 15.404.  RFP at 4.  The record shows that is exactly what the agency did.

Section 15.404 of the FAR sets forth a number of ways an agency can evaluate whether proposed prices are fair and reasonable, including comparing the proposed prices received in response to the RFP to each other; to a competitive published price list; or to a government estimate.  FAR §§ 15.404-1(b)(2)(i), (iv), (v).  Here, the agency compared the proposed prices to each other and to the independent government cost estimate (IGCE).  AR, Tab 6, Cost/Price Eval. Report at 4.  The agency also compared the fully burdened labor rates to the established rates found on each offeror’s CIO-SP3 contract.  Id. at 5.  The agency determined that DSA’s proposed price was [DELETED] percent higher than the next lowest priced offeror and [DELETED] percent lower than the average total evaluated price.  Id. at 4.  DSA’s proposed price was also [DELETED] percent lower than the IGCE.  Id.  Finally, DSA’s proposed labor rates were approximately [DELETED] percent lower to [DELETED] percent higher than its established rates on the CIO-SP3 contract.  Id. at 5.  Based on this analysis, the agency found DSA’s proposed prices--including its proposed labor rates--to be reasonable.[14]  Id.

Based on our review of the record, we find the agency’s price evaluation to be reasonable.  As described above, to evaluate offerors’ prices the agency utilized three of the potential methods provided by FAR § 15.404.  One of those methods included an evaluation of the reasonableness of the proposed labor rates by comparing those rates to the rates in each offeror’s CIO-SP3 contract.  The agency’s price evaluation thus assessed the reasonableness of proposed labor rates, and was consistent with the terms of the RFP and the requirements of FAR § 15.404, and therefore was reasonable.[15]

Best-Value Tradeoff

Finally, OnPoint challenges the agency’s best-value tradeoff decision, alleging that it improperly focused on whether OnPoint’s lower-rated proposal was worth the cost savings and did not identify the technical benefits in DSA’s proposal that would justify the price premium.  Protester Supp. Comments at 14-15.

Source selection decisions must be documented, and include the rationale and any business judgments and tradeoffs made or relied upon by the source selection official.  FAR § 15.308.  However, there is no need for extensive documentation of every consideration factored into a tradeoff decision.  Id.; Terex Gov’t Programs, B-404946.3, Sept. 7, 2011, 2011 CPD ¶ 176 at 3.  To the extent a protester argues that the source selection decision should have evidenced a more precise determination or quantification as to whether the technical advantages associated with a proposal warranted a certain price premium, we note that such a degree of precision or quantification is not required.  See Highmark Medicare Servs., Inc.; Cahaba Gov’t Benefit Adm’rs., LLC; Nat’l Gov’t Servs., Inc., B-401062.5 et al., Oct. 29, 2010, 2010 CPD ¶ 285 at 22.  Rather, the documentation need only be sufficient to establish that the source selection official was aware of and considered the strengths and weaknesses of competing proposals, the proposals’ ratings under the RFP’s evaluation factors and overall, and the proposals’ prices.  New Orleans Support Servs., LLC, B-404914, June 21, 2011, 2011 CPD ¶ 146 at 8; FN Mfg. LLC, B-407936 et al., Apr. 19, 2013, 2013 CPD ¶ 105 at 6.

Here, the source selection decision was unobjectionable.  The record shows that the agency compared the strengths assessed to DSA’s and OnPoint’s proposals and determined that DSA’s solution was technically superior to OnPoint’s solution under the infrastructure requirements, MilTech capabilities, and program management subfactors.  See AR, Tab 7, SRD at 31.  The agency thus determined that DSA “proposed a technically superior solution to OnPoint” under the technical/management approach factor.  Id.  The agency also “determined that [the] technically superior solution[] proposed by DSA . . . will represent a better value to the [g]overnment than the cost/price savings associated with OnPoint’s technically inferior solution.”  AR, Tab 8, PNM at 9.  Although OnPoint disagrees with this judgment, it has not shown it to be unreasonable.

The protest is denied.

Thomas H. Armstrong
General Counsel

 

[1] This is the third protest filed by OnPoint challenging the agency’s decision to issue a task order to DSA.  OnPoint protested the agency’s initial award to DSA; in response the agency took corrective action, which resulted in award to DSA again.  OnPoint timely protested that award decision, and the agency again took corrective action.  After the second corrective action, the agency again awarded to DSA; this protest followed.

[2] The mission of the Product Lead MilTech Solutions is to support the Department of Defense and other partnered organizations by providing a broad range of information management and knowledge management solutions and services to a variety of government agencies.  AR, Tab 1A, Performance Work Statement (PWS), at 2.

[3] The PWS explained that milSuite referred to MilTech’s suite of social business solutions and that SharePoint is one of the core capabilities used for knowledge management.  AR, Tab 1A, PWS at 3.  The agency explained that the SIF ITSM tool “allows users to manage all of their information technology lifecycle.”  AR, Tab 17, Second Decl. of Technical Evaluation Team (TET) Lead, at 16.

[4] For the remainder of this discussion, we will refer only to the adjectival ratings of outstanding, good, acceptable, marginal, and unacceptable.

[5] The agency did not assess any strengths to DSA’s or OnPoint’s proposals under the other three subfactors and found the two offerors to be “technically equal in merit” for each of those subfactors.  AR, Tab 7, SRD, at 31.

[6] This protest is within our jurisdiction to hear protests of task orders placed under civilian agency IDIQ contracts valued in excess of $10 million.  41 U.S.C. § 4106(f)(1)(B); see Wyle Labs., Inc., B-413989, Dec. 5, 2016, 2016 CPD ¶ 345 at 4 (the authority under which we exercise our task order jurisdiction is determined by the agency that awarded the IDIQ contract under which the task order is issued, here NIH, rather than the agency that actually issues or funds the task order).

[7] In its initial protest and first supplemental protest, OnPoint raised a number of protest grounds to which the agency responded in detail in its initial and first supplemental agency report.  OnPoint failed to respond to the agency’s arguments and we therefore dismiss those protest grounds as abandoned.  4 C.F.R. § 21.3(i)(3).  In its second supplemental protest, OnPoint also raises other collateral arguments.  Although we do not address every argument, we have reviewed them all and find no basis to sustain the protest.

[8] To respond to a number of OnPoint’s protest grounds, the agency provided declarations from the TET lead and the cost/price analyst and referenced those declarations in defending its award decision.  See Second Supp. MOL; AR, Tab 16, Second Declaration of TET Lead; Tab 17, Second Declaration of Cost/Price Analyst.  OnPoint contends that the agency’s declarations constitute post-hoc rationalizations to support award to DSA.  Protester Supp. Comments at 3.  As we do not expect an agency’s evaluation report to “prove a negative,” such as documenting why an offeror did not receive a strength for a particular aspect of its proposal, we view the evaluators’ declarations to be post-protest explanations that provide a more detailed rationale for the agency’s contemporaneous conclusions, and not post-hoc rationalizations.  Compare NWT, Inc.; PharmChem Labs., Inc., B-280988, B-280988.2, Dec. 17, 1998, 98-2 CPD ¶ 158, with Boeing Sikorsky Aircraft Support, B-277263.2, B-277263.3, Sept. 29, 1997, 97-2 CPD ¶ 91; see also BillSmart Sols., LLC, B-413272.4, B-413272.5, Oct. 23, 2017, 2017 CPD ¶ 325 at 14 n.19.

[9] OnPoint argues that DSA was assessed a strength not for its proposed cloud migration approach, but for its experience in cloud migration, and that the “contemporaneous record focuses only on DSA’s experience in migrating applications to approved cloud environments.”  Protester Supp. Comments at 4.  Thus, OnPoint argues, the TET lead’s declaration and explanation of why DSA was assessed a strength contradicts the contemporaneous record and should be given no weight.  Id.  OnPoint’s argument has no merit.  The contemporaneous record clearly states that DSA received a strength because DSA’s proposal “demonstrates the ability to reduce operating costs and increase performance significantly through upgrading a [g]overnment information system to a commercial cloud solution.”  AR, Tab 7, SRD at 18.  While the agency also cited to DSA’s experience in describing the strength, the record shows that the agency assessed the strength because of DSA’s proposed approach to cloud migration, not only because of its experience.

[10] The two critical issues identified in the PWS were “[t]he ability to search and discover data across platforms within a [c]ommon [a]ccess [c]ard (CAC)-only [p]ublic [k]ey [i]nfrastructure (PKI) architecture” and “[t]he ability to develop and sustain an identity management solution across applications that allows secure access to include PKI based authentication and is not limited to the [Department of Defense] CAC and [o]pen [s]ingle [s]ign [o]n (SSO) solutions.”  AR, Tab 1A, PWS, at 18.

[11] OnPoint also alleges that the assessment of a strength for DSA’s proposed use of SMEs constituted improper double counting.  Protester Supp. Comments at 8.  OnPoint’s argument rests on the fact that the summary of this strength in the TET’s evaluation referenced DSA’s ability to reduce operating costs and increase performance by upgrading to a commercial cloud solution which, OnPoint argues, is the same as the strength DSA received for its cloud migration solution.  Id.; see AR, Tab 7, SRD at 28.  The agency responds that it inadvertently referenced the commercial cloud solution when summarizing the SME strength.  Second Supp. MOL at 15-16.  The initial description of the strengths assessed to DSA under the infrastructure requirements subfactor made clear that one was for the cloud migration solution and one was for the designated identity management SMEs.  AR, Tab 7, SRD at 18.  Moreover, the next sentence in the summary after the reference to the commercial cloud states that “DSA proposed identity management SMEs that will collaborate with Army SMEs and provide input to maintain and upgrade current [i]dentity [m]anagement solution used by MilTech capabilities.”  Id. at 28.  The record thus refutes OnPoint’s allegation that the agency engaged in improper double counting.

[12] The PWS described the SIF capability lead labor position as leading a team of developers and ITSM experts for ITSM implementation.  AR, Tab 1A, PWS § 14B, at 40.  Among the required qualifications for this position was “[e]xtensive experience with BMC Remedy and/or ServiceNow.”  Id. at 41.  The PWS therefore recognized the importance of ServiceNow experience for this position, which undercuts OnPoint’s claim that this type of expertise was not reasonably understood from the evaluation criteria. 

[13] In responding to this protest ground, the agency states that “as of right now, MilTech has no current or future plans to migrate from BMC Remedy to ServiceNow.”  Second Supp. MOL at 21.  OnPoint argues that “[i]t makes no sense to give DSA a strength for a specific migration capability and then affirmatively state no such change is being contemplated.”  Protester’s Supp. Comments at 10.  However, in assessing a strength to DSA for this capability, the TET explained that ServiceNow is a “competing ITSM product that is currently under consideration across the Army for designation as the Army standard ITSM tool.”  AR, Tab 7, SRD, at 19.  Thus, we do not find it improper for the agency to have assessed a strength to DSA for its ability to support a potential move to ServiceNow given that the Army could potentially migrate to the new ITSM tool.

[14] In comparison, OnPoint’s proposed price was the lowest of all offerors and was [DELETED] percent lower than the average total evaluated price and [DELETED] percent lower than the IGCE.  AR, Tab 6, Cost/Price Eval. Report, at 19.  OnPoint’s proposed fully burdened labor rates were approximately [DELETED] percent lower to [DELETED] percent higher than its rates on the CIO-SP3 contract.  Id.  The agency also found OnPoint’s price and labor rates to be reasonable.  Id.

[15] OnPoint also argues that the agency’s alleged failure to evaluate the reasonableness of individual labor rates also meant that the agency’s price evaluation did not reflect the actual cost of performance.  Second Supp. Protest at 14.  OnPoint maintains that the agency ultimately may have to pay more for surge support because an offeror could utilize more labor categories with allegedly unreasonably high rates for the optional surge support, which could drive up the cost of performance.  Id. at 16.  We find this to be an untimely challenge to the terms of the solicitation.  The RFP made clear that the agency would calculate the price of surge support by multiplying the offeror’s total proposed cost for the base and all option periods by 35 percent.  RFP at 6.  To the extent OnPoint is alleging that the agency should have evaluated the reasonableness of individual labor rates in order to reflect a more accurate cost for surge support, it was required to file this protest prior to the closing time set for receipt of proposals.  See 4 C.F.R. § 21.2(a)(1).
