IAP World Services, Inc.; Jones Lang LaSalle Americas, Inc.

B-411659,B-411659.2,B-411659.3: Sep 23, 2015


DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  IAP World Services, Inc.; Jones Lang LaSalle Americas, Inc.

File:  B-411659; B-411659.2; B-411659.3

Date:  September 23, 2015

J. Alex Ward, Esq., Rachael K. Plymale, Esq., Damien C. Specht, Esq., and James A. Tucker, Esq., Jenner & Block LLP, for IAP World Services, Inc.; and
Thomas A. Janczewski, Esq., Michael Best & Friedrich LLP, for Jones Lang LaSalle Americas, Inc., the protesters.
Robert J. Symon, Esq., Aron C. Beezley, Esq., Jennifer F. Brinkley, Esq., and Lisa A. Markman, Esq., Bradley Arant Boult Cummings LLP, for Jacobs Technology Inc., the intervenor.
Victoria H. Kauffman, Esq., Colleen Burt, Esq., Kevin F. Kouba, Esq., National Aeronautics and Space Administration, for the agency.
Stephanie B. Magnell, Esq., and Jonathan L. Kang, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protests challenging the agency’s evaluation of offerors’ technical proposals and past performance, and the selection decision, are denied where the record demonstrates that the agency’s evaluation was reasonable, and the other protest grounds are either untimely or not prejudicial.

DECISION

IAP World Services, Inc. (IAP), of Cape Canaveral, Florida, and Jones Lang LaSalle Americas, Inc. (JLL), of Chicago, Illinois, protest the award of a contract to Jacobs Technology, Inc. (Jacobs) of Pasadena, California, under request for proposals (RFP) No. NNA14497087R, which was issued by the National Aeronautics and Space Administration (NASA) for maintenance support services at Ames Research Center at Moffett Field, California.  Both protesters challenge NASA’s evaluation of their respective proposals and the agency’s best value tradeoff decision, and IAP challenges the agency’s evaluation of Jacobs’ proposal. 

We deny the protests.

BACKGROUND

On September 30, 2014, NASA issued the RFP, also known as the Ames Facilities Support Services (AFSS) contract, to obtain operations and maintenance services for NASA’s Ames Research Center.  Contracting Officer’s Statement of Facts (COSF)-IAP, ¶ 2.2.[1]  NASA solicited the AFSS procurement using full and open competition, intending to award a single hybrid contract consisting of fixed-price core services, fixed-price task orders under indefinite-delivery, indefinite-quantity (IDIQ) contract line item numbers (CLINs), and environmental and emergency services under cost-plus fixed-fee CLINs.  The maximum performance period, including all options, is 10 years.  RFP at 2.  The RFP provided for consideration of three evaluation factors of approximately equal weight in the best value tradeoff decision:  mission suitability, cost/price, and past performance.  RFP §§ M.3, M.4.  Award was to be made on a best value basis, in accordance with Federal Acquisition Regulation § 15.101-1, Tradeoff Process, and the agency reserved the right to make award without discussions.  RFP § M.2(b).   

The mission suitability factor consisted of three subfactors:  management, technical, and small business utilization.  RFP at 90.  Within the management subfactor, the agency was to evaluate offerors’ management plans and staffing plans.  Agency Report (AR), Tab 33, Source Selection Statement, at 2.[2]

The solicitation provided the following adjectival ratings and allowable percentile ranges for the mission suitability subfactors:[3]

Adjectival Rating (Percentile Range):  Definition

Excellent (91-100):  A comprehensive and thorough proposal of exceptional merit with one or more significant strengths.  No deficiency or significant weakness exists.

Very Good (71-90):  A proposal having no deficiency and which demonstrates overall competence.  One or more significant strengths have been found, and strengths outbalance any weaknesses that exist.

Good (51-70):  A proposal having no deficiency and which shows a reasonably sound response.  There may be strengths or weaknesses, or both.  As a whole, weaknesses not offset by strengths do not significantly detract from the Offeror’s response.

Fair (31-50):  A proposal having no deficiency and which has one or more weaknesses.  Weaknesses outbalance any strengths.

Poor (0-30):  A proposal that has one or more deficiencies or significant weaknesses that demonstrate a lack of overall competence or would require a major proposal revision to correct.

RFP § M.2(f). 

Under the management subfactor, the agency stated that it would:

[E]valuate the Offeror’s staffing plan including the demonstration of adequate staffing levels for the core and environmental/emergency services supported by a reasonable Basis of Estimates (BOEs). Review will include an assessment of the staffing plan’s ability to demonstrate a thorough understanding of the labor quantities and skills needed to successfully perform the contract requirements and identify and address any potential difficulties in fulfilling the staffing needs. NASA will also evaluate the Offeror’s approach to managing fluctuations (an increase or decrease) in the IDIQ workload without impacting the resources committed to the core/base portion of the contact.

RFP § M.3.  As relevant here, the RFP required offerors to “demonstrate adequate staffing levels” and “a thorough understanding of the labor quantities and skills needed to successfully perform the contract.”  RFP § L.13.1.1(b).  The RFP cautioned offerors that “[p]roposed resources that are unrealistic may be viewed as a lack of understanding which can generate a Mission Suitability weakness.”  Id. (emphasis removed).

Under factor 3, past performance, the agency was to evaluate each offeror’s “suitability to fulfill the requirements of [the] contract.”  Id. at 93.  The RFP provided that offerors would receive past performance confidence ratings, consisting of a “performance” component and a “pertinence”--or relevance--component.  Id. at 94.  The relevance evaluation considered the degree of similarity between the contract reference and the RFP in terms of dollar value, tasks, and complexity, as well as the recency and duration of the past performance.[4]

On November 18, 2014, NASA received eight timely proposals, including those of IAP, Jacobs and JLL.  COSF-IAP ¶ 3.1.  NASA evaluated each proposal for strengths and weaknesses.  Jacobs and JLL each received a significant weakness in the management subfactor because of understaffing.  Specifically, the agency described the significant weakness for each of Jacobs and JLL as follows:  “[t]he Offeror’s proposed staffing level is below the Government staffing estimate and it is unclear whether the proposed staffing level is sufficient to meet the requirements of the AFSS Contract, and this uncertainty appreciably increases the risk of unsuccessful contract performance.”  AR-JLL, Tab 23, JLL Consensus Findings, at 9; Tab 29, Jacobs Consensus Findings, at 8.  The offerors’ proposed staffing levels were as follows:

Offeror           Total Proposed Staffing in WYEs[5]
IAP               [DELETED]
Jacobs            [DELETED]
JLL               [DELETED]
NASA Estimate     [DELETED]

AR, Tab 31, Source Selection Authority (SSA) Presentation, at 40.  Within the mission suitability subfactors, the assignment of strengths and weaknesses for the proposals at issue here was as follows:

Subfactor                 Finding          IAP    Jacobs    JLL
Management (45%)          Sig. Strength    --     1         1
                          Strength         7      6         7
                          Weakness         3      --        2
                          Sig. Weakness    --     1         1
Technical (45%)           Sig. Strength    1      3         2
                          Strength         3      3         2
                          Weakness         1      --        --
                          Sig. Weakness    --     --        --
Small Bus. Util. (10%)    Sig. Strength    --     1         1
                          Strength         1      1         1
                          Weakness         --     --        --
                          Sig. Weakness    1      --        --

AR, Tab 31, SSA Presentation, at 39, 58, 84.

After determining weaknesses and strengths, the source evaluation board (SEB) assigned each mission suitability subfactor an adjectival rating, which corresponded to an established percentile bandwidth.  COSF-IAP ¶ 5.2; RFP § M.2(f).  The SEB then assigned the subfactor a percentile score within the range allowable for that adjectival rating.  COSF-IAP ¶ 5.2.  Subsequently, the assigned percentile range was multiplied by the maximum points available for that subfactor, which were weighted as follows:  management (450 points), technical (450 points), and small business utilization (100 points).  RFP at 90; COSF-IAP ¶ 5.2.  As relevant here, the agency’s final consensus ratings were as follows:

 

                                     IAP                         Jacobs                          JLL
Mission Suitability (1000 points)    725                         919.5                           879.5
Management (450 points)              Good / 315                  Very Good / 396                 Very Good / 378
Technical (450 points)               Very Good / 360             Excellent / 427.5               Excellent / 409.5
Sm. Bus. Util. (100 points)          Fair / 50                   Excellent / 96                  Excellent / 92
Past Performance                     High Level of Confidence    Very High Level of Confidence   High Level of Confidence
Evaluated Cost / Price               $258,939,658                $235,568,847                    $216,565,835

AR, Tab 33, Source Selection Statement, at 11-16.

After proposal evaluation, NASA conducted a best value tradeoff in which the SSA reviewed and adopted the findings of the SEB.  Id. at 19.  The SSA conducted a tradeoff between Jacobs and all other offerors with either a more competitive price or a higher past performance confidence rating (Jacobs’ proposal had the highest mission suitability), ultimately choosing Jacobs as the awardee.  Id. at 20-21.  These protests followed. 

DISCUSSION

IAP and JLL each challenge NASA’s evaluation of their respective proposals, and IAP also challenges the agency’s evaluation of Jacobs’ proposal.[6]  Each protester also challenges the agency’s best value tradeoff decision.  We address each protester’s arguments in turn, and for the reasons discussed below, find no basis to sustain either protest.

In reviewing protests challenging an agency’s evaluation, our Office does not reevaluate proposals; rather, we review the agency’s evaluation to determine whether it was reasonable and consistent with the solicitation, as well as applicable statutes and regulations.  ASRC Research & Tech. Solutions, LLC, B‑406164, B‑406164.3, Feb. 14, 2012, 2012 CPD ¶ 72 at 8; see also Halfaker & Assocs., LLC, B-407919, B-407919.2, Apr. 10, 2013, 2013 CPD ¶ 98 at 6.  An offeror’s disagreement with the agency’s evaluation, without more, is not sufficient to render the evaluation unreasonable.  Ben-Mar Enters., Inc., B-295781, Apr. 7, 2005, 2005 CPD ¶ 68 at 7.

IAP’s Protest Arguments

IAP argues that although NASA assessed Jacobs’ proposed staffing approach as a significant weakness, the agency failed to adequately weigh this issue in its evaluation of the awardee or the award decision.  IAP also challenges several of the weaknesses the agency assigned to its own proposal.  However, despite certain apparent errors in the agency’s evaluation of IAP’s proposal, we find no basis to conclude that the protester was prejudiced by these errors or that the agency misevaluated Jacobs’ proposal.

Jacobs’ Significant Weakness For Inadequate Staffing

IAP first argues that Jacobs should have received lower evaluation ratings based on NASA’s assessment of a significant weakness for the awardee’s proposed staffing level.  In this regard, IAP contends that, both under the RFP and as a matter of reasonableness, a proposal with a significant weakness for understaffing merited no more than a poor rating under the management subfactor.  The protester also argues that the agency did not give adequate weight to this significant weakness in the tradeoff decision.  We find no merit to these arguments.

Jacobs received a significant weakness based on the agency’s conclusion that it had not proposed adequate staffing.  AR, Tab 29, Jacobs Consensus Findings, at 8.  The agency found that Jacobs’ proposed staffing level was flawed as follows:

[S]ignificantly below the Government estimate for the overall staffing level for the core and [cost-plus-fixed-fee] requirements anticipated for the AFSS Contract.  It is not clear from the proposal how the Offeror’s management and technical approach would provide such an effective and efficient RCM [reliability centered maintenance] program that the Offeror would be able to fulfil the requirements of the SOW with the overall staffing level proposed.  This aspect of the proposal results in a concern to the government that the Offeror does not fully understand the requirements of the AFSS contract.

*         *         *         *         *

The Offeror’s proposal is below the Government staffing estimate and it is unclear whether the proposed staffing level is sufficient to meet the requirements of the AFSS Contract, [and] this uncertainty appreciably increases the risk of unsuccessful contract performance.

AR-IAP, Tab 29, Jacobs Consensus Finding, at 8.   

IAP argues that given this significant weakness for understaffing, NASA was required to assign Jacobs’ proposal a poor rating under the management subfactor.[7]  IAP Protest at 7.  IAP contends that “correction of this fundamental flaw by itself would likely result in award to IAP” because, even with a maximum value of 30 percent under the poor rating (135 points), Jacobs’ mission suitability rating would be no higher than 658.5, i.e., well below IAP’s score of 725.[8]  Id. at 8-9.  IAP further argues that inadequate staffing in a services contract clearly relates to factors essential to contract performance and that under the agency’s interpretation, there would be no significant weakness that could ever result in a poor rating.  IAP Comments (July 27, 2015) at 4. 

NASA argues that even though Jacobs’ proposal was assigned a significant weakness under the management subfactor, it merited a very good rating overall for this subfactor because it had no deficiencies, demonstrated “overall competence,” and because “[o]ne or more significant strengths have been found, and strengths outbalance any weaknesses that exist.”  AR-IAP at 2.  NASA contends that a poor rating would only be required if Jacobs “demonstrate[d] a lack of competence in this component of its proposal” or the proposal had “no other Significant Strengths and Strengths to balance the Significant Weakness(es), OR if the Significant Weakness(es) were related to a fundamental SOW requirement that was essential to contract performance.”  Id.; COSF-IAP ¶ 13.5.  Overall, in the agency’s view, a conclusion that a proposal demonstrates “a lack of competence” is distinguishable from a lingering uncertainty about whether “the Offeror would be able to fulfil the requirements of the SOW with the overall staffing level proposed.”  COSF-IAP ¶ 13.8; see RFP § M.2(f).

As discussed above, our Office does not reevaluate proposals; instead we determine whether the agency’s evaluation was reasonable and performed in accordance with the RFP.  Here, we find that the terms of the RFP do not preclude award of a very good rating or that the agency abused its discretion in assigning this rating to Jacobs.  To the extent the protester argues that the agency should have given greater weight to the weakness and assessed a lower evaluation rating, the protester’s disagreement with the agency’s judgment, without more, does not provide a basis to sustain the protest.  Furthermore, Jacobs received one significant strength and six strengths as part of the management subfactor, and the record does not support a conclusion that the agency was unreasonable in finding that these strengths outweighed the significant weakness.  Because the agency’s rating did not violate the terms of the RFP and was not unreasonable, on the record before us we have no basis to sustain the protest. 

As a related matter, IAP contends that NASA failed to assess weaknesses in the technical subfactor related to Jacobs’ understaffing under the management subfactor.  IAP Protest at 9.  NASA responds that the RFP’s evaluation plan under the technical subfactor does not include staffing, precisely because the agency intended to evaluate staffing solely as part of the management subfactor.  AR-IAP at 3.  The RFP states that, under the technical subfactor, NASA will evaluate “the Offeror’s understanding of, and approach to, implementing an RCM program that can support performance in a manner that is efficient, effective, and feasible.”  IAP Protest at 9; RFP at 91.  We agree with the agency here that the plain language of the RFP imposes no obligation upon the agency to evaluate staffing under the technical subfactor, and therefore we have no basis to sustain the protest.

Agency’s Evaluation of IAP’s Proposal

Next, IAP challenges the agency’s evaluation of its proposal, contending that it lacked a reasonable basis and was inconsistent with the RFP.  IAP Protest at 11.[9]  Although we identify certain possible errors in the agency’s evaluation, as discussed in the next section below, we find no possibility of prejudice.

Level of Monetary Authority

IAP first challenges the agency’s assessment of a weakness under the management subfactor for IAP’s plan to obtain second-tier approval when accepting task orders, or entering into subcontracts or contract modifications valued in excess of $100,000.  IAP Protest at 11; see also AR-IAP, Tab 23, IAP Consensus Findings, at 8. 

NASA explains that “the successful Offeror will have to execute hundreds of orders at any given time, and not having an adequate monetary authority on-site to execute all these task orders would likely create an inefficient process and result in delays to the completion of numerous IDIQ task orders.”  COSF-IAP ¶ 9.6.  The agency further asserts that, during IAP’s performance of the incumbent contract, over half of all of the task orders in excess of $100,000 “had issues at the start due to timely execution of the task order that resulted in documented deductions,” and that IAP’s general managers “explained to the Contracting Officers Representative (COR) that these delays were due to IAP’s corporate approval process.”  COSF-IAP ¶ 9.9.  Because NASA’s initial response to this protest ground was supported neither by a citation to the record nor by a declaration from the COR, we requested additional briefing as to the basis of the agency’s assessment of this weakness.  NASA’s response did not provide a record citation substantiating its concern regarding “issues relating to approval for task orders in excess of [$100,000].”  Supp. AR-IAP (Aug. 28, 2015) at 1. 

In response, IAP’s general manager provided a detailed statement rebutting the agency’s contention that a lack of authority concerning orders valued at more than $100,000 resulted in delays.  See Decl. Gen. Mgr. ¶¶ 4-5 (July 25, 2015).  Furthermore, the record shows that there were 540 task orders in 2011, 542 in 2012, and 520 in 2013, and that task orders valued in excess of $100,000 constituted 1 percent, 2 percent, and 3 percent of annual task orders, respectively.  RFP Attach. J2 at 691-699.[10]  NASA provided no explanation as to the relationship between this historical volume of higher-value task orders and its assessment of a weakness.

However, although we find the agency’s rationale for assigning a weakness here to be without support in the record, for the reasons discussed below, we do not sustain the protest because the protester has failed to demonstrate that it was prejudiced by this error.

Staffing Distribution

IAP next argues that the agency unreasonably assessed a weakness under the management subfactor for IAP’s proposed staffing distribution.  IAP Protest at 12. 

The SEB assigned IAP a weakness under the management subfactor of the mission suitability factor because the SEB concluded that “[t]he Offeror’s distribution of staffing resources stated in the proposal does not align with the Government estimate for various sections of the AFSS SOW and this raises the concern to the Government as to whether or not the Offeror fully understands the requirements and associated workforce distribution necessary for the AFSS contract . . . .”  AR‑IAP, Tab 23, IAP Consensus Findings, at 10.  Specifically, while the evaluators recognized that IAP proposed more staff than the government estimate, they expressed a concern that IAP’s proposed response to sections C5, C7, and C17 of the statement of work was “below the Government estimate to such a degree as would cause concern [that the] Offeror may have misunderstood the . . . two-person safety rule for working on plumbing and electrical requirements” and other performance requirements.  AR-IAP, Tab 23, IAP Consensus Findings, at 10. 

While an agency may rely on its own estimates of the staffing levels necessary for satisfactory performance in evaluating proposals for the award of a fixed-price contract, it is improper for an agency to downgrade a proposal simply because the offeror’s overall proposed work hours differ from the government’s estimate and there are no other factors supporting the agency’s assessment.  Native Res. Dev. Co., B-409617.3, July 21, 2014, 2014 CPD ¶ 217 at 3-4.  See also Olympus Bldg. Servs., Inc., B-285351, B-283351.2, Aug. 17, 2000, 2000 CPD ¶ 178 at 10; Allied Cleaning Servs., Inc., B-237295, Feb. 14, 1990, 90-1 CPD ¶ 275 at 3-4.

NASA’s assessment of a weakness is based on a comparison between NASA’s estimated WYEs for various functions and the WYEs per function proposed by IAP.  AR-IAP, Tab 23, IAP Consensus Findings, at 10 (“The Offeror’s distribution of staffing resources stated in the proposal does not align with the Government estimate for various sections . . . .”).  However, the record does not explain why NASA considered IAP’s proposed staffing distribution to be insufficient in certain areas in light of its proposed staffing approach--especially as IAP’s total proposed WYEs exceed the agency’s estimate.  Rather, the record reflects that NASA mechanically compared IAP’s WYE distributions with those set forth in the government estimate, without regard to the protester’s proposed technical approach.  AR-IAP, Tab 23, IAP Consensus Findings, at 10.  However, as discussed below, we conclude that resolution of this error would still not allow IAP to demonstrate that, overall, it has a substantial chance of being awarded the contract.[11]

Best Value Tradeoff Decision and Prejudice

IAP alleges that NASA’s best value tradeoff decision is flawed because the agency failed to reasonably consider the significant weakness assigned to Jacobs’ proposal under the management subfactor.  IAP Protest at 10, 17.  IAP also contends that the best value tradeoff decision was not only tainted by evaluation errors, but that the agency uses the artificial precision of a weighted numerical score to gloss over what would otherwise be a complex weighing of proposals’ strengths and weaknesses.  IAP Protest at 17, 18.  In this regard, IAP argues that NASA’s best value tradeoff decision fails to satisfy the requirement for intelligent decision-making.  Id.

In general, evaluation ratings are merely guides for intelligent decision-making in the procurement process; the evaluation of proposals and consideration of their relative merit should be based upon a qualitative assessment of proposals consistent with the solicitation’s evaluation scheme.  Highmark Medicare Servs., Inc., et al., B-401062.5 et al., Oct. 29, 2010, 2010 CPD ¶ 285 at 19.  In determining which proposal represents the best value to the agency, an agency may reasonably find that the benefits of specific features set forth in a proposal are not worth any additional cost associated with the proposal, as long as that decision remains consistent with the solicitation’s evaluation and source selection criteria.  Highmark Medicare Servs., supra.  See also Johnson Controls World Servs., Inc., B-289942, B-289942.2, May 24, 2002, 2002 CPD ¶ 88 at 10.  In reviewing an agency’s source selection decision, our Office examines the supporting record to assess whether the decision was reasonable, consistent with the stated evaluation criteria, and adequately documented.  Id. at 6.

The record shows that the SSA took specific cognizance of Jacobs’ significant weakness within the management subfactor.  AR, Tab 33, Source Selection Statement, at 13.  The SSA also confirmed that he had reviewed the offerors’ mission suitability and past performance proposal sections, and reviewed and adopted the findings of the SEB, including those that underpin the numerical scores.  Id. at 13, 19-21.  After noting that Jacobs had both the highest mission suitability score and past performance level of confidence rating of all offerors, the SSA decided to limit the best value tradeoff evaluation and did not compare Jacobs’ proposal to those (including IAP’s) with a combination of a lower mission suitability score, lower past performance level of confidence rating and higher total evaluated price.  Id. at 20.  Ultimately, he determined that Jacobs’ proposal “was superior to all Offerors in both Mission Suitability and Past Performance.”  Id. at 21. 

IAP argues that the agency used the artificial precision of a weighted numerical score to gloss over what would otherwise be a complex weighing of proposals’ strengths and weaknesses.  IAP Protest at 17-19; IAP Comments (July 27, 2015) at 31.  The record shows that although the SSA’s best value tradeoff discussion makes multiple references to the offerors’ numerical scores, the SSA was well aware of the strengths and weaknesses underlying the evaluations.  See AR, Tab 33, Source Selection Statement, at 20 (“I conducted my deliberations by first looking at the overall evaluated scores for Mission Suitability and the findings that led to those scores . . . .”).  While numerical scoring cannot substitute for a nuanced evaluation of proposals, here, the selection official had a basis to conclude that the scores were an accurate representation of the relative strengths and weaknesses of the proposals.

Finally, IAP argues that flaws in NASA’s evaluation rendered the selection decision unreasonable.  As discussed above, we find no merit to the protester’s arguments concerning the evaluation of Jacobs’ proposal.  With regard to the possibility of errors concerning the evaluation of IAP’s proposal, we note, as discussed above, that there is some basis for concern about the reasonableness of these evaluations.  Nonetheless, there is no basis to conclude that the protester was prejudiced here.

Our Office has consistently held that to prevail, a protester must demonstrate that it has been prejudiced by the agency’s errors.  Where the protester fails to demonstrate that, but for the agency’s actions, it would have had a substantial chance of receiving the award, there is no basis for finding prejudice, and our Office will not sustain the protest, even if deficiencies in the procurement are found.  HP Enter. Servs., LLC, B-411205, B-411205.2, June 16, 2015, 2015 CPD ¶ 202 at 6; Booz Allen Hamilton Eng’g Servs., LLC, B-411065, May 1, 2015, 2015 CPD ¶ 138 at 10 n.16; Colonial Storage Co.--Recon., B-253501.8, May 31, 1994, 94-1 CPD ¶ 335 at 2-3.

Under the management subfactor, IAP received 7 strengths and 3 weaknesses, which corresponded to an adjectival rating of good and a score of 315 points.  AR, Tab 33, Source Selection Statement, at 11.  In comparison, Jacobs received 1 significant strength, 6 strengths, and 1 significant weakness, which corresponded to an adjectival rating of very good and a score of 396 points.  Id. at 13.  Even removing the two contested weaknesses, we have no basis to conclude that IAP would have received the maximum 450 points under the management subfactor.  Furthermore, even with the maximum score and a higher overall mission suitability score of 860 (versus 725), IAP’s score would remain almost 60 points lower than Jacobs’ score of 919.5, which reflects the strengths and weaknesses in the other subfactors.  In addition, even if IAP received a very high confidence rating under the past performance factor, Jacobs would remain higher rated technically, and even if IAP received the same ratings and scores as Jacobs, IAP would still be higher-priced.[12]  Id. at 19.   

On this record, we find no basis to conclude that IAP was prejudiced by any of the challenged errors regarding the evaluation of its proposal, and therefore deny the protest.  See Triad Logs. Servs. Corp., B-406416.2, June 19, 2012, 2012 CPD ¶ 186 at 2; Online Video Serv., Inc., B-403332, Oct. 15, 2010, 2010 CPD ¶ 244 at 2.  See also Lanmark Tech., Inc., B‑410214.3, Mar. 20, 2015, 2015 CPD ¶ 139 at 8 (finding that even if the protester prevailed on its proposal evaluation challenges, this would not have changed the agency’s tradeoff analysis and award decision). 

JLL’s Protest Arguments

JLL challenges several aspects of the agency’s evaluation of its own proposal, as well as the agency’s selection decision.  For the reasons below, we deny the protest.

Improper Use of Plug Numbers in Total Price Comparison

JLL contends that the agency’s price evaluation was flawed because the agency included its own “plug numbers” totaling $47.15 million in determining each offeror’s total evaluated price, thereby improperly diminishing the impact of JLL’s $19 million price advantage over the awardee (approximately a 10 percent advantage when only the base numbers are used, as opposed to an 8 percent advantage when the plug numbers are included).  JLL Protest at 8-9.  In this regard, JLL argues that “an agency must exclude government plug numbers when performing any tradeoff analysis.”  Id.  JLL also argues that the agency failed to document why it deemed a $19 million price differential to be “minor.”  Id.  Ultimately, JLL argues that the use of plug numbers diminished the proposals’ price differentials by increasing the evaluated price, which was the basis of comparison.  Id.

NASA argues that this argument is untimely because the RFP specifically advised that offerors’ evaluated prices would include specific plug numbers, and that if JLL objected to this provision, it was required to protest prior to the deadline for submission of proposals.  AR-JLL at 2-3, citing RFP § L.14.3(c)(3); RFP Attach. J‑1 at 11; 4 C.F.R. § 21.2(a)(1).  We agree.

The RFP provided as follows: 

For evaluation purposes only, the subcontractor’s regular shift and off-shift labor rates are artificially burdened by using a plug number for the subs overall burden markup.  Offerors are to use the Government estimate for this plug number burden multiplier and shall not change the multiplier shown in the IDIQ [price schedule].

RFP § L.14.3(c)(3).  See also RFP at 81 (“The following chart of Other Direct Costs (ODCs) is provided for proposal purposes and shall be used as plug-in numbers. The amounts represent the Government’s current best estimate of contract requirements.”).

Our Bid Protest Regulations require offerors to protest apparent improprieties in a solicitation prior to the time set for receipt of initial proposals.  4 C.F.R. § 21.2(a)(1).  Because we find that the RFP clearly informed offerors that plug numbers would be used in the agency’s price evaluation, and because the protester did not raise this basis of protest prior to the time set for submission of proposals, we dismiss this basis of protest as untimely. 

Staffing Level

JLL contests the agency’s assessment of a significant weakness for its staffing, as compared to the internal government estimate, arguing that the RFP did not expressly provide for this comparison.  JLL Protest at 14-16; JLL Comments & Supp. Protest (Aug. 6, 2015) at 8.  NASA contends that no prejudice resulted from its assessment of a significant weakness in this area because the awardee similarly received a significant weakness for understaffing.  AR-JLL at 19.  Here, Jacobs proposed [DELETED] more WYEs than JLL, a [DELETED] percent higher level of staffing.  AR, Tab 31, SSA Presentation, at 40. 

JLL responds that similarly-based significant weaknesses would not necessarily carry the same weight in the agency’s best value tradeoff decision.  JLL Comments & Supp. Protest (Aug. 6, 2015) at 9 n.2.  We agree that significant weaknesses would not cancel each other, as in a math equation.  However, although JLL challenges the agency’s evaluation of its own proposal, JLL does not allege that the agency also misevaluated Jacobs’ proposal. 

JLL argues that the agency improperly assessed a significant weakness for its proposal because its staffing level ([DELETED]) was lower than NASA’s estimate ([DELETED]).  JLL Protest at 14-15; AR, Tab 31, SSA Presentation, at 40.  However, the record shows that the agency also assessed a significant weakness against the Jacobs proposal because its staffing level ([DELETED]) was lower than NASA’s estimate.  Id.  Because the agency assessed these weaknesses on the same basis, and because JLL proposed a lower staffing level than Jacobs, even if the significant weakness were removed from JLL’s evaluation it would similarly be removed from Jacobs’ evaluation, and their relative positions would not change.  On this record, we agree with the agency that JLL could not have been prejudiced by this evaluation because a decision in JLL’s favor does not increase its chances for award.  See Alcazar Trades, Inc.; Sparkle Warner JV, LLC, B-410001.4, B‑410001.5, Apr. 1, 2015, 2015 CPD ¶ 123 at 9, citing A-Tek, Inc., B-404581.3, Aug. 22, 2011, 2011 CPD ¶ 188 at 10 (“we will not sustain a protest when it is clear from the record that a protester has suffered no prejudice as a result of an agency evaluation error”).

Past Performance

Next, JLL alleges that its contract references should have all received the highest relevance rating of very highly relevant, as its contracts were at least as relevant as Jacobs’, in essence alleging both that the evaluation was flawed and that the agency engaged in disparate treatment between JLL and Jacobs.  JLL Protest at 11-12; JLL Comments & Supp. Protest (Aug. 6, 2015) at 10-11.  JLL also claims that the agency failed to adequately document the basis for its conclusion that JLL’s contract references merited the second-highest--rather than the highest--relevance rating.  JLL Comments & Supp. Protest (Aug. 6, 2015) at 10.

Based on our review of the record, we conclude that NASA’s evaluation contains detailed analyses of each of JLL’s contract references.  These analyses describe the work performed and include details from the agency’s follow-up telephone calls with the individuals who completed JLL’s underlying evaluations.  AR-JLL, Tab 23, JLL Consensus Findings, at 19, 21.  These evaluations clearly satisfy the agency’s documentation obligation.  JLL has not advanced a specific argument as to how the agency’s evaluations failed to comply with the RFP or were otherwise improper or unreasonable.  Without additional detail, let alone evidence in the record, we find no basis to sustain this protest ground.

Best Value Tradeoff Decision

Finally, JLL protests the agency’s best value tradeoff decision, alleging that it was mechanical and failed to account for the relative strengths and weaknesses of the offerors’ proposals.  JLL Protest at 10-11.  Specifically, JLL contends that neither the source selection statement nor the agency report explain which strengths and weaknesses ultimately impacted the agency’s tradeoff analysis.  JLL Comments & Supp. Protest (Aug. 6, 2015) at 4.  JLL argues that the agency’s tradeoff decision was not justified because “[w]ithout comparing the two proposals against each other, there is no way to determine whether the relative strengths justify the price premium involved.”  Id. (internal citation omitted). 

As discussed above, best value tradeoff decisions must be based upon a qualitative assessment of proposals consistent with the solicitation’s evaluation scheme.  Highmark Medicare Servs., Inc., et al., supra.  Our Office examines the supporting record to determine whether the decision was reasonable, consistent with the stated evaluation criteria, and adequately documented.  Johnson Controls World Servs., Inc., supra.

The agency responds that the record demonstrates that the SSA considered not only the ratings and scores, but also their underlying bases.  AR-JLL at 10.  We agree.  The record shows that the SSA reviewed and adopted the SEB’s evaluations, including the findings of strengths and weaknesses.  AR, Tab 33, Source Selection Statement, at 19.  Furthermore, the record shows that the SSA specifically compared the proposals of Jacobs and JLL, as follows:

Because Jones Lang LaSalle Americas had the second highest Mission Suitability score, received a High Past Performance Level of Confidence rating, and had the third lowest evaluated Cost/Price, I believed that the proposal submitted by this Offeror was the one that might be considered most competitive with the Jacobs Technology proposal.  In an effort to perform my due diligence, I thus read the entire Mission Suitability and Past Performance proposals of these two Offerors.

Id. at 21.  Although the evaluation is somewhat conclusory, the record shows that the SSA had a basis for the conclusion (including the advantages and disadvantages of the competing proposals), that the conclusion was adequately documented, and that the point scores assigned reasonably reflected the underlying merits of the proposals.  On this record, we find no basis to sustain JLL’s protest.

The protests are denied.

Susan A. Poling
General Counsel



[1] IAP currently serves as the incumbent contractor performing the majority of these services.  COSF-IAP ¶ 1.1.

[2] Where the agency report documents are identical in content and tab number, the citation does not distinguish between the two protests.  Otherwise, citations to the IAP and JLL agency records are to filings in their respective protests. 

[3] The RFP defined strengths and weaknesses as follows:  a significant strength was “an aspect of the proposal that appreciably increases the probability of successful contract performance;” a strength was “an aspect of the proposal that increases the probability of successful contract performance;” a weakness was “a flaw that increases the risk of unsuccessful contract performance;” a significant weakness was “a flaw that appreciably increases the risk of unsuccessful contract performance,” and a deficiency was “a material failure of a proposal to meet a Government requirement or a combination of significant weaknesses in a proposal that increases the risk of unsuccessful contract performance to an unacceptable level.”  RFP § M.2(g). 

[4] As relevant here, the solicitation provided the following past performance confidence ratings:

Very High Level of Confidence:  The Offeror’s relevant past performance is of exceptional merit and is very highly pertinent to this acquisition, indicates exemplary performance in a timely, efficient, and economical manner and very minor (if any) problems with no adverse effect on overall performance.  Based on the Offeror’s performance record, there is a very high level of confidence that the Offeror will successfully perform the required effort.

High Level of Confidence:  The Offeror’s relevant past performance is highly pertinent to this acquisition; demonstrating very effective performance that would be fully responsive to contract requirements. Offeror’s past performance indicates that contract requirements were accomplished in a timely, efficient, and economical manner for the most part, with only minor problems that had little identifiable effect on overall performance.  Based on the Offeror’s performance record, there is a high level of confidence that the Offeror will successfully perform the required effort.

RFP at 94.

[5] The RFP uses the term work year equivalent, or WYE, which is another way of expressing the concept of a full-time employee.

[6] Although IAP initially argued that Jacobs’ inadequate staffing required a cost/price adjustment, it later abandoned this protest ground and all others relating to the agency’s cost/price evaluation.  IAP Comments (July 27, 2015) at 3 n.1. 

[7] A rating of poor for a mission suitability subfactor corresponds to a proposal where there are “one or more deficiencies or significant weaknesses that demonstrate a lack of overall competence or would require a major proposal revision to correct.”  RFP § M.2(f). 

[8] IAP also argues that the other offerors should receive similar treatment, such that JLL’s adjusted mission suitability point value would be no more than 636.5.  IAP Protest at 9.

[9] IAP also disputes the agency’s evaluation of its past performance and the assignment of a high--rather than a very high--past performance confidence level, arguing that NASA’s evaluation of a proposed subcontractor’s contract reference was unreasonable.  IAP Protest at 16.  Specifically, IAP argues that because the contract reference was from NASA’s own facilities management contract for the Stennis Space Center, NASA had no basis to claim that there were uncertainties about the scope of work or the value of the subcontract.  IAP Comments (July 27, 2015) at 24-26.  Even if we agree with IAP that its relevance and performance ratings should have been viewed as comparable to Jacobs’, as discussed below we find that there was no prejudice to IAP.

[10] According to NASA, the following are the number of task orders in excess of $100,000 per year:  7 (2011); 13 (2012); 16 (2013).  COSF-IAP ¶ 9.4.

[11] IAP also protests the agency’s assessment of a significant weakness in its small business utilization plan.  IAP Protest at 13, 16.  We have evaluated these and other protest grounds and find that none provides a basis to sustain the protest.

[12] Furthermore, we have no basis to conclude that a very high confidence rating for IAP would necessarily mean that its past performance would be considered superior to Jacobs’, given the higher relevance and quality scores for Jacobs’ contract references.  See AR, Tab 31, SSA Presentation, at 100, 103.
