DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.
Decision
Matter of: IndraSoft Inc.
File: B-411212
Date: June 16, 2015
Devon E. Hewitt, Esq., and Laura Shelkey Yeo, Esq., Protorae Law PLLC, for the protester.
David S. Cohen, Esq., Gabriel E. Kennon, Esq., Amy J. Spencer, Esq., John J. O’Brien, Esq., and Daniel Strouse, Esq., Cohen Mohr LLP, for Intelligent Decisions, Inc.; David S. Black, Esq., Elizabeth N. Jochum, Esq., and Gregory R. Hallmark, Esq., Holland & Knight LLP, for SRA International, the intervenors.
Brenda Oswalt, Esq., and Earl Friedman, Administrative Office of the United States Courts, for the agency.
Charles W. Morrow, Esq., and Jonathan L. Kang, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.
DIGEST
Protest challenging the agency’s assignment of an adjectival rating under an evaluation factor is denied, where the record reflects that the agency’s evaluation was reasonable, and where, in any event, the protester could not have suffered prejudice because the agency’s award decision reasonably considered the relative strengths of each offeror’s proposal, rather than relying on the adjectival ratings assigned.
DECISION
IndraSoft, Inc., of Reston, Virginia, protests awards of contracts to ActionNet, Inc., of Vienna, Virginia, AT&T Government Solutions, Inc., of Vienna, Virginia, Camber Corporation, of Huntsville, Alabama, CRGT, Inc., of Reston, Virginia, Intelligent Decisions, Inc., of Ashburn, Virginia, and SRA International, Inc., of Fairfax, Virginia, under request for proposals (RFP) No. USCA14R0014, which was issued by the Administrative Office of the United States Courts (AOUSC) for information technology (IT) support services. IndraSoft challenges the evaluation of its proposal.
We deny the protest.
BACKGROUND
As part of its mission, AOUSC provides IT support services to the judicial branch through a multiple-award contract known as the Judiciary Multiple Award Services (JMAS) contract. RFP at 5. These services include software development, implementation, and maintenance; testing/quality assurance, systems security, project management, planning and acquisition support; and IT education and training. Id. The RFP, which was issued on April 1, 2014, sought proposals for the fourth iteration of this multiple-award contract (JMAS IV), which will provide a broad range of information technology support services spanning the entire systems development life-cycle, including services related to new and emerging technologies, with a primary focus on IT services and telecommunications. Id. at 6.
The RFP contemplated the award of up to six indefinite-delivery/indefinite-quantity (ID/IQ) contracts, which will have a base performance period of 1 year, four 1-year options, and an additional 6-month option. The contracts will provide for competition for task orders that will be fixed-price, labor-hour, time-and-material, or a hybrid of these types. Id. at 6, 54. Award was to be made on a best-value basis considering four technical evaluation factors and price. The first factor, technical capability and contract compliance certification, was to be evaluated on a go/no go basis; the other factors, listed in descending order of importance, were: (1) past performance, (2) management plan and approach, and (3) key personnel. Id. at 54‑55. Together the non-price factors were significantly more important than price. Id. at 54.
As relevant here, the RFP stated that evaluation under the management plan and approach factor would be based on a proposal’s completeness, feasibility, and demonstration of the offeror’s understanding of the requirements. Id. at 56. For purposes of this evaluation, the RFP required each offeror to address, in its proposal, 15 areas of information concerning its management plan and approach. See id. at 52-53.
Thirty offerors, including IndraSoft, ActionNet, AT&T, Camber, CRGT, Intelligent Decisions, and SRA, responded to the RFP by the closing date of May 21. The contracting officer found that each of the 30 proposals should be rated acceptable (“go”) under the technical capability and contract compliance certification evaluation factor. See Agency Report (AR), Tab 5.0, Source Selection Memorandum, at 4.
A technical evaluation team (TET), including a past performance evaluation team, evaluated proposals in accordance with the criteria established in the RFP and source selection plan (SSP). For past performance, the evaluation considered the quality of the offeror’s recent and relevant past performance, with greater weight being assigned to more relevant past performance, i.e., more similarity in terms of size, scope, and past performance. See RFP at 55-56; AR, Tab 6.0, SSP, at 5. For this purpose, each proposal was assigned a past performance confidence rating.[1] Under the management plan and approach, and key personnel factors, the TET assigned adjectival ratings to the proposals based on the documented strengths, weaknesses, and deficiencies identified in each proposal.[2] Price was evaluated for reasonableness. See id. at 14.
The pertinent results were as follows:
|  | Evaluated Price | Past Performance | Management Plan/Approach | Key Personnel |
|---|---|---|---|---|
| ActionNet | $266,813,671 | Good | Good | Good |
| SRA | $282,504,368 | Good | Outstanding | Good |
| Camber | $295,597,539 | Good | Outstanding | Outstanding |
| IndraSoft | $305,064,835 | Good | Acceptable | Good |
| AT&T | $310,157,546 | Good | Good | Good |
| CRGT | $315,919,884 | Good | Good | Good |
| Intelligent Decisions |  |  |  |  |
See AR, Tab 5, Source Selection Memorandum, at 6-7.
Based upon the evaluation, the source selection authority (SSA) found that the proposals of ActionNet, AT&T, Camber, CRGT, Intelligent Decisions, and SRA represented the best value to the government. In making this determination, the SSA compared all of the proposals under the past performance, management plan and approach, and key personnel factors. See AR, Tab 5.0, Source Selection Memorandum, at 8. The SSA concluded that each awardee’s proposal was a better value than IndraSoft’s proposal. See id. at 18. The AOUSC awarded contracts to ActionNet, AT&T, CRGT, Intelligent Decisions, SRA, and Camber, on February 27, 2015. After a debriefing, this protest followed.
DISCUSSION
IndraSoft contends that AOUSC unreasonably evaluated its proposal under the management plan and approach factor. Specifically, the protester argues that its proposal should have been assigned a good rating under this factor, rather than an acceptable rating, because the TET identified two strengths and no weaknesses in its proposal. Based on this allegation, the protester argues that the agency conducted a flawed best-value assessment of its proposal. The protester argues that had its proposal received a good rating, it would have been a better value than at least three of the awardees--CRGT, AT&T, and Intelligent Decisions--who had higher evaluated prices. In addition, the protester challenges the agency’s best-value tradeoff decisions concerning its proposal as compared to CRGT’s and Intelligent Decisions’ proposals. For the reasons discussed below, we find no basis to sustain the protest.
An agency’s evaluation of technical proposals is primarily the responsibility of the contracting agency, since the agency is responsible for defining its needs and identifying the best method of accommodating them. Wyle Labs., Inc., B‑311123, Apr. 29, 2008, 2009 CPD ¶ 96 at 5-6. In reviewing protests of an agency’s evaluation, our Office does not reevaluate proposals; rather, we review the record to determine whether the evaluation was reasonable, consistent with the solicitation’s evaluation scheme as well as procurement statutes and regulations, and adequately documented. See Research & Dev. Solutions, Inc., B-410581, B‑410581.2, Jan. 14, 2015, 2015 CPD ¶ 38 at 8; Wackenhut Servs., Inc., B‑400240, B‑400240.2, Sept. 10, 2008, 2008 CPD ¶ 184 at 6; Cherry Road Techs.; Elec. Data Sys. Corp., B‑296915 et al., Oct. 24, 2005, 2005 CPD ¶ 197 at 6. As our Office has consistently recognized, ratings, be they numerical, adjectival, or color, are merely guides for intelligent decision-making in the procurement process. Citywide Managing Servs. of Port Wash., Inc., B‑281287.12, B‑281287.13, Nov. 15, 2000, 2001 CPD ¶ 6 at 11. The evaluation of proposals and assignment of adjectival ratings should generally not be based upon a simple count of strengths and weaknesses, but on a qualitative assessment of the proposals consistent with the evaluation scheme. See Clark/Foulger-Pratt JV, B‑406627, B‑406627.2, July 23, 2012, 2012 CPD ¶ 213 at 14.
Evaluation of IndraSoft’s Proposal
First, IndraSoft argues that its proposal should have received a good rating under the management plan and approach factor. As discussed above, the TET rated IndraSoft’s proposal acceptable for this factor based on two strengths in its proposal. See AR, Tab 7.0, TET Report, at 9. The TET found that the proposal provided a “very good approach to [Deleted].” Also, the TET found that the proposal “described a very good approach to a [Deleted].” Id.
As relevant here, the SSP defined good and acceptable ratings as follows:
[Good] Proposal meets requirements and indicates a thorough approach and understanding of the requirements. Proposal contains strengths which outweigh any weaknesses. Risk of unsuccessful performance is low.
[Acceptable] Proposal meets requirements and indicates an adequate approach and understanding. Strengths and weaknesses are offsetting or will have little or no impact on contract performance. Risk of unsuccessful performance is no more than adequate.
AR, Tab 6.0, SSP, at 12-13.
IndraSoft argues that its proposal merited a good rating because the assessment of two strengths and no weaknesses was consistent with the SSP definition of a good rating, that is, a proposal that “contains strengths which outweigh any weaknesses.” AR, Tab 6.0, SSP, at 12. The protester further argues that a good rating was merited because the TET’s comments quoted above recognized the protester’s “very good” approach to managing a multiple-award task order contract and to managing employee training.
In response, the agency contends that the evaluation of IndraSoft’s proposal as acceptable was consistent with the definition for an acceptable proposal. We agree.
Based on the definition in the SSP set forth above, the evaluation of an offeror’s proposal under the management plan and approach factor entailed more than just an assessment of the number of strengths or weaknesses. Rather, the definitions stated that the agency would consider an offeror’s proposed approach to meeting the solicitation requirements and its understanding of those requirements. AR, Tab 6.0, SSP, at 12-13. In this regard, the RFP contemplated the evaluation of an offeror’s proposal under the management plan and approach factor for completeness, feasibility, and an indication of how well the offeror understood the requirements in 15 areas. See RFP at 52-53, 56.
The two strengths assessed for IndraSoft’s proposal under the management plan and approach factor related to: (1) proposed leadership, organizational structure, and relationships; and (2) employee training. See AR, Tab 7.0, TET Report, at 9. Although the protester is correct that the agency did not assess any weaknesses for its proposal, we agree with the agency that the absence of weaknesses does not demonstrate that the proposal merited a good rating. Specifically, the record does not support the conclusion that IndraSoft’s proposal “indicated a thorough approach and understanding of the requirements,” where the strengths in IndraSoft’s proposal encompassed only two of the 15 areas being evaluated by the agency.
Further, although the TET evaluators referred to each of these strengths as reflecting a “very good” approach for meeting the RFP requirements, we find these statements to be general descriptions of two aspects of the plan, as opposed to an overall assessment of the proposal for purposes of assigning a rating under the management plan and approach factor. See AR, Tab 7.0, TET Report, at 9. In our view, the evaluators’ overall assessment of IndraSoft’s proposal as acceptable, based on the identification of two strengths out of the 15 areas set forth in the solicitation, was reasonable under the RFP evaluation scheme, which required offerors to demonstrate a “thorough,” rather than an “adequate,” understanding of the requirements. See AR, Tab 6.0, SSP, at 12-13. Thus, we find no basis to conclude that the evaluation of IndraSoft’s plan as acceptable rather than good was unreasonable.
Additionally, as discussed below, we conclude that AOUSC’s award decision relied on a reasonable comparison of the offerors’ proposals that considered the specific strengths and weaknesses for each proposal, rather than a simple comparison of the numbers of strengths and weaknesses. For this reason, even if we determined that IndraSoft should have received a good, rather than an acceptable rating, we find no basis to conclude that the protester could have been prejudiced by such an error. Prejudice is an essential element of every viable protest, and we will not sustain a protest where it is clear from the record that a protester suffered no prejudice as a result of an agency evaluation error. A-Tek, Inc., B-404581.3, Aug. 22, 2011, 2011 CPD ¶ 188 at 10. Where the protester fails to demonstrate that, but for the agency’s actions, it would have had a substantial chance of receiving the award, there is no basis for finding prejudice. See, e.g., Special Servs., B‑402613.2, B‑402613.3, July 21, 2010, 2010 CPD ¶ 169 at 4. On this record, we find no basis to sustain the protest.
Best-Value Tradeoffs
Next, IndraSoft argues that AOUSC’s award decision was flawed because the agency made unreasonable best‑value tradeoffs between the protester’s proposal and those of CRGT and Intelligent Decisions. We conclude that the record reflects that the SSA did not simply rely on the adjectival ratings, but instead made a reasonable comparison of the relative merits of each offeror’s proposal. See AR, Tab 5.0, Source Selection Memorandum, at 37-39, 82-83, 105-106.
First, IndraSoft argues that the tradeoff between its proposal and CRGT’s proposal was unreasonable because the SSA gave greater weight to the technical evaluation factors than to the past performance factor, which was contrary to the RFP’s stated basis for award. We find no merit to this contention.
As discussed above, the RFP stated that all proposals that were found to be acceptable under the technical capability and contract compliance certification factor would be evaluated based on the following three remaining technical factors, which were listed in descending order of importance: (1) past performance, (2) management plan and approach, and (3) key personnel. RFP at 54-55. Together the non-price factors were significantly more important than price. Id.
The record does not show that AOUSC’s award decision improperly accorded more weight to the management plan and approach and the key personnel factors than to the past performance factor. In this regard, the SSA recognized that CRGT’s proposal was superior to IndraSoft’s under the management plan and approach and the key personnel factors, whereas IndraSoft’s proposal was superior to CRGT’s under the past performance and price factors. Based on the superiority under the two technical factors, the SSA found that CRGT’s technical advantage, and other benefits, outweighed IndraSoft’s advantages under the past performance and price factors, which justified paying the additional price premium associated with CRGT’s proposal. See id. at 82-83. While past performance was weighted more heavily than the other two non-price factors, the RFP’s evaluation scheme did not state that the past performance factor was more significant than the combined weight of the other two non-price factors. In addition, the evaluation scheme did not compel the agency to conclude that an offeror’s advantage under the past performance factor must, in all circumstances, outweigh advantages under the other two non-price factors. See RFP at 54‑55. Thus, we find that the SSA’s determination that CRGT’s superior technical ratings and quality outweighed IndraSoft’s superior past performance was consistent with the terms of the RFP.
Next, IndraSoft argues that AOUSC’s evaluation of its proposal under the management plan and approach factor was unreasonable as compared to the evaluation of Intelligent Decisions’ proposal. In this regard, the agency found two strengths in each proposal concerning the same solicitation requirements, but assigned IndraSoft’s proposal an acceptable rating and Intelligent Decisions’ proposal a good rating.
While the record confirms that the agency found both proposals to contain the same number of strengths, it also reflects that the SSA considered Intelligent Decisions’ two strengths in this area to be more significant. See AR, Tab 5.0, Source Selection Memorandum, at 105-106. In comparing the two proposals, the SSA found Intelligent Decisions’ “robust and comprehensive” management approach to be highly significant and its training plan to be moderately significant. Id. In contrast, the SSA found IndraSoft’s two strengths to be of moderate-to-high and moderate significance, respectively.
In challenging the SSA’s determination that Intelligent Decisions’ proposal represented the better value under this factor, the protester does not challenge the evaluation of its own proposal; instead, the protester argues that the record does not explain why the agency assigned a higher rating to Intelligent Decisions’ proposal or why the strengths assigned to Intelligent Decisions’ proposal were considered superior to those assigned to the protester’s proposal.
We think that the record reasonably reflects that the SSA made a qualitative assessment of the two proposals that supported the difference in their ratings. See Clark/Foulger-Pratt JV, supra. Specifically, the record shows that the SSA relied upon the TET evaluators’ findings, which identified qualitative distinctions between the strengths in Intelligent Decisions’ and IndraSoft’s proposals. AR, Tab 5.0, Source Selection Memorandum, at 106. The TET found that Intelligent Decisions’ proposed approach to managing multiple task orders was more robust and comprehensive than IndraSoft’s because: [Deleted]. AR, Tab 7.0, TET Report, at 11. The TET also found Intelligent Decisions’ training plan to be robust and comprehensive because it provided [Deleted]. Id. Thus, the record provides a rationale for the SSA’s qualitative distinction between these two offerors’ similar strengths.
Finally, IndraSoft argues that the agency’s evaluation under the key personnel factor was unreasonable, as there was no reasonable basis to distinguish between the experience of its proposed program director (PD) and Intelligent Decisions’ proposed PD. Specifically, the protester notes that although its proposed PD had [Deleted], the agency assigned its proposal a good rating; in contrast, although Intelligent Decisions’ PD had [Deleted], the agency rated Intelligent Decisions’ proposal as outstanding for this factor. See AR, Tab 7, TET Report, at 9.
AOUSC argues that the evaluation was reasonable because, while Intelligent Decisions’ proposed PD had [Deleted]. Specifically, the SSA concluded that Intelligent Decisions’ PD represented the better value based on the manager’s [Deleted]. AR, Tab 5.0, Source Selection Memorandum, at 106. We note that, for purposes of this evaluation, the RFP did not set forth a minimum number of years of experience, or otherwise direct a particular rating based on a certain number of years of IT experience. Rather, the RFP provided for the evaluation of the candidate’s work experience and capability to perform the work, particularly with regard to the requirements of the solicitation. RFP at 9, 53, 56. On this record, we conclude that the agency reasonably found Intelligent Decisions’ PD’s experience managing the current JMAS III task order more valuable than IndraSoft’s PD’s [Deleted].
In sum, we find that IndraSoft’s disagreement with AOUSC’s best-value judgments regarding the merits of the offerors’ proposals provides no basis to sustain the protest. Where, as here, an agency reasonably evaluates offerors’ proposals, mere disagreement with the agency’s evaluation judgments does not render those judgments unreasonable. See General Dynamics, Am. Overseas Marine, B‑401874.14, B‑401874.15, Nov. 1, 2011, 2012 CPD ¶ 85 at 10.
The protest is denied.
Susan A. Poling
General Counsel