J5 Systems, Inc.

B-406800: Aug 31, 2012


DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: J5 Systems, Inc.

File: B-406800

Date: August 31, 2012

Nancy O. Dix, Esq., Dawn E. Stern, Esq., and Brandi J. Gill, Esq., DLA Piper LLP (US), for the protester.
Kevin P. Mullen, Esq., and Charles L. Capito, Esq., Jenner & Block LLP, for Scientific Research Corporation, an intervenor.
Sandra Castro Cain, Esq., Ryan J. Friedl, Esq., and Kyle Eppele, Esq., Department of the Navy, for the agency.
Louis A. Chiarella, Esq., and David A. Ashen, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protest that contracting agency improperly evaluated offeror’s oral presentation is denied where the record shows that the agency reasonably determined that the protester did not adequately address all required topics in the context of the assigned sample task.

DECISION

J5 Systems, Inc., of San Diego, California, protests the decision not to award it a contract under request for proposals (RFP) No. N66001-11-R-0004, issued by the Department of the Navy, Space and Naval Warfare Systems Command (SPAWAR), Systems Center (SSC) Pacific, for systems engineering, analysis, and technical support services. J5 asserts that the agency’s evaluation of its proposal was unreasonable.

We deny the protest.

BACKGROUND

The Command and Intelligence Division within SSC Pacific provides systems engineering, development, integration, test, and life-cycle support for a wide range of Navy, joint, and national command, control, communications, computers, and intelligence (C4I) systems both ashore and afloat. Performance Work Statement (PWS) § 2.0. These systems serve to consolidate C4I functions along with cryptologic, navigation, environmental, and logistic capabilities to provide an integrated C4I capability to the warfighter. In general terms, the PWS required the contractor to provide C4I systems engineering, analysis, and technical support services to SSC Pacific, including command and intelligence systems analysis, concept definition, interface requirements, system development and design for implementation, integration, interoperability, documentation, upgrades, and training. Id., § 1.0.

The RFP, issued on March 28, 2011, contemplated the award without discussions of multiple indefinite-delivery, indefinite-quantity (ID/IQ)-type contracts with a 3-year base period with one 2-year option. The solicitation provided offerors with an estimated level of effort of 410,800 hours for 5 years with 39.5 full-time equivalents per year. Award was to be made to one or more offerors whose proposals represented the “best value” considering five evaluation factors: past performance, organizational experience, oral presentation, participation of small business, and cost. The RFP provided that the oral presentation, organizational experience, and small business participation evaluation factors were in descending order of importance; these non-cost factors, when combined, were significantly more important than cost.[1] RFP at 74.

The RFP established a complex, multi-step process by which the Navy would evaluate proposals. In Step 1, the agency was to evaluate offerors’ past performance and organizational experience. Organizational experience (as well as the subsequent oral presentation) was to be rated as either excellent, very good, satisfactory, marginal, or unsatisfactory. Past performance was to be rated as either acceptable, unacceptable, or neutral; offerors found to have unacceptable past performance were not to be considered further for award. RFP at 75. After the evaluation of past performance and organizational experience, the Navy was to issue an “advisory opinion letter” indicating whether the offeror was considered to be a “viable competitor” for further evaluation.[2] Id. at 76.

In Step 2, an offeror’s key personnel were to deliver an oral presentation. Each offeror was to be given at the time of the oral presentation a sample task to discuss. RFP at 72. By means of this sample task “test,” offerors were to demonstrate their understanding of the PWS objectives both overall and in technical detail. Id. at 72, 76. The Navy was then to evaluate offerors’ oral presentations with respect to their understanding of, and approach to performing, the sample task. Id. at 76.

Steps 3 and 4 involved the evaluation of offerors’ proposals under the small business participation and cost factors, respectively. In Step 5, the Navy was to determine which proposal(s) represented the best value to the government. Id. at 77-78.

Eight offerors, including J5, submitted proposals to the Navy by the April 28 closing date.[3] A Navy technical evaluation board (TEB) evaluated offerors’ submissions under the noncost factors. In Step 1, J5’s proposal was found to have “acceptable” past performance and “satisfactory” organizational experience. The contracting officer nevertheless issued J5 an advisory opinion letter stating that it was unlikely to be a viable competitor. Agency Report (AR), Tab 6, Navy Email to J5, Aug. 17, 2011. Notwithstanding the advisory opinion letter, J5 elected to proceed with the oral presentation and the balance of the Navy’s evaluation of its proposal. AR, Tab 7, J5 Letter to Navy, Aug. 19, 2011.

After oral presentations were held, the Navy completed its evaluation. The proposals were first evaluated individually by the TEB members. AR, Tab 83, Declaration of TEB Chair, July 16, 2012, at 1. Following their independent review, the TEB met to discuss each proposal and establish a consensus evaluation. Id. The evaluators’ adjectival ratings were supported by narratives detailing the various strengths and weaknesses identified in the offerors' proposals. Id., Tab 11, TEB Consensus Report at 41-48. The final consensus evaluation ratings and costs of the remaining offerors’ proposals were as follows:

Offeror      Oral Presentation   Organizational Experience   Sm. Business Participation   Evaluated Cost
Accenture    Excellent           Very Good                   Exceptional                  $43,390,845
BAH          Excellent           Very Good                   Exceptional                  $35,596,517
SRC          Excellent           Very Good                   Exceptional                  $25,473,634
FGM          Excellent           Satisfactory                Exceptional                  $29,492,903
SAIC         Excellent           Satisfactory                Exceptional                  $40,189,743
J5           Marginal            Satisfactory                Exceptional                  $40,121,470
Offeror G    Marginal            Satisfactory                Exceptional                  $40,489,156

Id., Tab 11, TEB Consensus Report, at 41-48, 79; Tab 12, Source Selection Decision, at 1-2, 9-11, 48-51.

The contracting officer as source selection authority subsequently determined that the top five proposals represented the “best value” to the government. On May 11, 2012, the Navy awarded contracts to Accenture, BAH, SRC, FGM, and SAIC. After receiving a postaward debriefing that ended on May 24, 2012, J5 filed this protest with our Office.

DISCUSSION

J5 challenges the agency’s evaluation of its own proposal with respect to organizational experience and the oral presentation. The protester asserts that the weaknesses identified by the Navy in these areas were unreasonable in light of the information presented to the agency. As detailed below, we find no basis on which to sustain J5’s protest.

Evaluation of Organizational Experience

J5 asserts that the Navy’s evaluation of its proposal with regard to the organizational experience factor was unreasonable. Specifically, the protester maintains that all of the weaknesses identified by the agency evaluators were in fact adequately addressed in J5’s proposal by the references the offeror submitted. Had the Navy fully considered all the information in its proposal, J5 argues, its proposal would have received a “very good” rating for organizational experience.

The RFP required offerors to submit reference information sheets for not more than five references demonstrating experience in the various key sections of the PWS. RFP at 66-67. In this regard, the solicitation provided that an offeror would be evaluated on the extent to which its proposal demonstrated relevant organizational experience performing across the PWS’s key sections. Id. at 76.

J5 submitted five references regarding its organizational experience. AR, Tab 4, J5 Proposal, at III-A-1 to III-A-18. The TEB evaluated J5’s organizational experience as “satisfactory,” but identified six significant weaknesses based on aspects of the PWS for which J5’s proposal did not evidence experience.[4] Id., Tab 11, TEB Consensus Report, at 43-47. The Navy first informed J5 of these organizational experience weaknesses in the August 17, 2011 advisory opinion letter, and then again in the May 24, 2012 postaward debriefing. Id., Tab 6, Navy Email to J5, Aug. 17, 2011; Tab 14, J5 Debriefing, May 24, 2012, at 4-9. However, in its initial protest J5 asserted only that, “[i]n light of J5’s excellent past performance ratings under the prior systems engineering contract, the agency’s conclusion that J5 lacked breadth and depth of relevant experience is unsupported and lacks a rational basis.” Protest, May 25, 2012, at 3. It was not until July 6 that J5 first raised the “separate [protest] ground” that its organizational experience proposal adequately discussed all of the areas identified by the TEB as weaknesses. J5 Comments, July 6, 2012, at 3.

We dismiss this latter issue as untimely. Under our Bid Protest Regulations, a protest based on other than alleged improprieties in a solicitation must be filed no later than 10 calendar days after the protester knew, or should have known, of the basis for protest, whichever is earlier. 4 C.F.R. § 21.2(a)(2) (2012). Where a protester initially files a timely protest, and later supplements it with independent protest grounds, the later-raised allegations must independently satisfy the timeliness requirements, since our Regulations do not contemplate the unwarranted piecemeal presentation or development of protest issues. AINS, Inc., B-405902.3, May 31, 2012, 2012 CPD ¶ 180 at 6 n.12; FR Countermeasures, Inc., B-295375, Feb. 10, 2005, 2005 CPD ¶ 52 at 9. Because the protest ground is based on information provided to J5 on or before May 24, and was raised for the first time in its July 6 filing--after the 10-day period permitted by our Bid Protest Regulations--the protest ground is untimely.

J5 argues that its protest here is timely insofar as its initial challenge was to the Navy’s organizational experience evaluation as a whole. J5 Comments, July 20, 2012, at 21. We disagree. While its initial protest challenged the agency’s organizational experience evaluation, J5 relied on its claimed excellent past performance ratings under the prior systems engineering contract. It was not until its response to the agency’s report that J5 first alleged--and offered the type of details required to state a valid basis for protest--that the organizational experience evaluation was unreasonable in light of the information in J5’s organizational experience proposal. This detailed submission purporting to support J5’s earlier broad allegation, however, was not filed until at least 6 weeks after the protester had received the information on which the protest relies. Where a protester raises a broad ground of protest in its initial submission but fails to provide details within its knowledge until later, so that a further response from the agency would be needed to adequately review the matter, these latter more specific arguments and issues cannot be considered unless they independently satisfy the timeliness requirements under our Bid Protest Regulations. Foundation Eng’g Scis., Inc., B-292834, B-292834.2, Dec. 12, 2003, 2003 CPD ¶ 229 at 6-7. Since J5 failed to provide these latter detailed contentions within 10 days of its receipt of the information upon which they are based, they are untimely and will not be considered further. Id.

Evaluation of Oral Presentation

J5 protests the Navy’s evaluation of its oral presentation, challenging each of the significant weaknesses and weaknesses attributed to its presentation. Further, the protester contends that the TEB’s consensus evaluation ratings were unreasonable because they were inconsistent with the evaluators’ preliminary ratings. Had the agency conducted a proper evaluation, J5 argues, it would have received an “excellent” or “very good” rating for the oral presentation and would have been among the offerors whose proposals were deemed the “best value” to the government.

Although we do not specifically address all of J5’s arguments regarding the Navy’s evaluation of the offeror’s oral presentation, we have fully considered all of them and find that they furnish no basis on which to sustain the protest.

In reviewing a protest challenging the agency’s evaluation of proposals, including oral presentations, our Office will not reevaluate proposals nor substitute our judgment for that of the agency, as the evaluation of proposals is generally a matter within the agency’s discretion. Naiad Inflatables of Newport, B-405221, Sept. 19, 2011, 2012 CPD ¶ 37 at 6; Science Applications Int’l Corp., B-290971 et al., Oct. 16, 2002, 2002 CPD ¶ 184 at 4. Rather, we will review the record only to determine whether the agency’s evaluation was reasonable and consistent with the stated evaluation criteria and with applicable procurement statutes and regulations. Shumaker Trucking & Excavating Contractors, Inc., B-290732, Sept. 25, 2002, 2002 CPD ¶ 169 at 3. An offeror’s mere disagreement with an agency’s judgment is insufficient to establish that the agency acted unreasonably. Birdwell Bros. Painting & Refinishing, B-285035, July 5, 2000, 2000 CPD ¶ 129 at 5. Further, since an agency’s evaluation is dependent on the information furnished in a proposal, it is the offeror’s responsibility to submit an adequate proposal for the agency to evaluate. See Structural Assocs., Inc., B-403085.2, Sept. 21, 2010, 2010 CPD ¶ 234 at 3. Agencies are not required to adapt their evaluation to comply with an offeror’s submission, or otherwise go in search of information that an offeror has omitted or failed adequately to present. Naiad Inflatables of Newport, supra; LS3 Inc., B-401948.11, July 21, 2010, 2010 CPD ¶ 168 at 3 n.1; Hi-Tec Sys., Inc., B-402590, B-402590.2, June 7, 2010, 2010 CPD ¶ 156 at 3.

As set forth above, the purpose of the oral presentation here was to assess the technical understanding of an offeror’s key personnel by means of a sample task “test.” RFP at 72. Following receipt of the sample task, offerors were given 2½ hours to caucus and prepare a presentation. AR, Tab 8, Oral Presentation Guidelines, at 1. They were then provided up to 90 minutes to deliver their oral presentation in response to the sample task. Id. The Navy made an audio recording of each offeror’s oral presentation,[5] and took photographs of any paper charts that the offeror created to support its presentation.[6] All TEB members attended J5’s oral presentation and had access to the audio recording and the photographs when performing their evaluation. AR, Tab 83, Declaration of TEB Chair, July 16, 2012, at 1-2; Tab 82, Declaration of R.B., July 16, 2012, at 1-2; Tab 84, Declaration of M.D., July 16, 2012, at 1-2.

The oral presentation sample task that offerors were given was as follows:

SSC Pacific Code 532 requires contractor support to design and implement Services Oriented Architecture (SOA)-based, net-centric maritime C4I prototype to be hosted in the Common Computer Environment (CCE).[7] You are tasked to provide systems engineering support for this effort.

Develop an approach to designing and implementing this capability, addressing: key system engineering issues; concept of operations; knowledge management; communications and data links; networks; databases and decision aids; classification levels; information assurance; information display and distribution; and accommodation of growth and changes in network nodes, organizations, information/ data, classification and security requirements, as well as emerging technologies.

Discuss the following: staffing and logistics; how industry best practices will be used to ensure successful design, implementation, and support; and how the experience and qualifications of key personnel will contribute to the successful execution of this tasking.

Id., Tab 10, Oral Presentation – Sample Task.

J5’s senior systems analyst/engineer and six other individuals delivered the offeror’s oral presentation. The record indicates that J5 organized its oral presentation along the lines of the PWS sections and/or subject matter expertise of its presenters rather than the specific sample task areas that were required to be addressed. For example, one of J5’s presenters concentrated on interface engineering (PWS § 3.4), Tr. at 39-46, while another discussed training support requirements (PWS § 3.7), id. at 58-62, even though these were not among the sample task areas required to be specifically covered.[8] Similarly, to the extent that J5 addressed the various sample task areas (e.g., classification levels), it was done in piecemeal fashion at various points in the oral presentation as part of the discussion of the overall PWS requirements.[9]

The TEB identified one major strength, three minor strengths, one minor weakness, and eight major weaknesses in its evaluation of J5’s oral presentation. AR, Tab 11, TEB Consensus Report, May 24, 2012, at 42. Relevant to the protest here, the Navy evaluators found that J5’s oral presentation did not discuss eight of the specifically-required areas as set forth in the sample task: (1) knowledge management; (2) communications and data links; (3) networks; (4) database and decision aids; (5) information display and distribution; (6) classification levels; (7) a plan to accommodate for growth; and (8) the application of best practices in relation to the prototype to be developed under the scenario.[10] Id. The TEB concluded that J5’s response did not indicate a clear understanding of the sample task and its technical detail, and that the offeror’s

presentation focused on the deployment and installation of a completed prototype C4I system but displayed no evidence of how J5 would utilize their collective knowledge to provide and support a SOA-based, net-centric maritime C4I prototype hosted on CCE equipment.

Id. The TEB therefore rated J5’s presentation as “marginal” overall. Id.

Based upon our review of the record, we find the evaluated weaknesses attributed to J5’s oral presentation to be reasonable. As a preliminary matter, as set forth above, J5 elected to organize its oral presentation along the lines of the overall PWS requirements (e.g., interface engineering, training support) rather than the specific sample task topics that were required to be addressed (e.g., networks, classification levels, information display and distribution). The record shows, and the protester does not dispute, that the offeror made no attempt to systematically address each of the required sample task areas in its oral presentation. As a result, to the extent J5 discussed each of the required topics, it did so in an extremely disjointed manner. The protester acknowledges, for example, that its discussion of a plan to accommodate for growth and changes occurred in nine disjointed instances of the oral presentation (sometimes no longer than a sentence or phrase), while its discussion of classification levels was in ten separate parts of the presentation. J5 Comments, July 6, 2012, at 22-26.

It is an offeror’s responsibility to prepare a well-written proposal, with adequately detailed information which clearly demonstrates compliance with the solicitation and allows for a meaningful review by the procuring agency. American Title Servs., a Joint Venture, B-404455, Feb. 4, 2011, 2011 CPD ¶ 38 at 4; International Med. Corps, B-403688, Dec. 6, 2010, 2010 CPD ¶ 292 at 8. This requirement is as applicable to oral submissions as it is to written ones. See Business Mgmt. Assocs., B-403315, B-403315.2, Oct. 19, 2010, 2011 CPD ¶ 143 at 4. By failing to address each required sample task topic in a methodical fashion, J5 essentially imposed upon the Navy evaluators the burden of piecing together numerous dispersed portions of its presentation and perfecting J5’s submission, a responsibility which we find the agency was not required to assume. See Keystone Sealift Servs., Inc., B-401526.3, Apr. 13, 2010, 2010 CPD ¶ 95 at 4.

We also find reasonable the agency’s conclusion that J5’s oral presentation did not adequately address various required topics in relation to the prototype to be developed under the scenario. We discuss examples of J5’s failure in this regard below.

Knowledge Management

The oral presentation sample task required each offeror to address knowledge management as part of developing an approach to designing and implementing a SOA-based, net-centric maritime C4I prototype to be hosted in the CCE. The term “knowledge management” is a broad concept, but generally involves the range of strategies and practices used in an organization to identify, create, represent, distribute, and enable adoption of information (knowledge). See, e.g., AR, June 25, 2012, at 16 n.21.

In its evaluation the TEB focused on several areas related to knowledge management. AR, Tab 9, Declaration of TEB Chair, June 25, 2012, at 2. First, the Navy determined that J5’s oral presentation did not describe or apply knowledge management to designing and implementing a SOA-based, net-centric maritime C4I prototype to be hosted in the CCE, as required by the assigned sample task. Specifically, J5 did not address any of the data or tools used to track and share systems requirements, design considerations and decisions, interface specifications, and current system deficiencies. Second, the Navy determined that J5 did not address the data and methods for storing and sharing system information in the operational C4I system. Moreover, as J5 did not directly address or expressly mention knowledge management in the oral presentation, neither did the offeror describe its understanding of that term of art. See AR, June 25, 2012, at 16.

J5 has not shown the agency’s evaluation here to be unreasonable. In this regard, the protester argues that because it did discuss knowledge management topics at various points in its presentation, it therefore satisfactorily addressed knowledge management in relation to the prototype to be developed under the sample task. J5 Comments, July 6, 2012, at 11-14. We disagree.

As an initial matter, many of the instances on which the protester relies do not actually discuss the topic of knowledge management. For example, J5 refers to a discussion of the experience and qualifications of its onsite, fleet support technician. J5 Comments, July 6, 2012, at 11, citing Tr. at 48-49; AR, Tab 49, Chart No. 421; Tab 50, Chart No. 422. J5’s oral presentation does not explain, nor is it otherwise evident, how such qualifications specifically relate to knowledge management. Similarly, while J5 discussed user feedback within the spiral development model and the need for measurable metrics, J5 Comments, July 6, 2012, at 12, neither was related by J5 to the topic of knowledge management.

Moreover, we agree with the agency that, even where J5 addressed knowledge management generally, it did not do so in the context of the assigned sample task. For example, J5’s oral presentation discussed J5’s use of the global command and control system (GCCS) engineering nexus of information and user support (GENIUS) system.[11] J5 Comments, July 6, 2012, at 11, citing Tr. at 30-32; AR, Tab 35, Chart No. 407. However, J5 did not discuss how GENIUS related to the topic of knowledge management, the kinds of information GENIUS would be used to manage, or if GENIUS would be implemented at the development stage or be an operational component of the prototype to be developed in the context of the sample task scenario. As a result, the TEB reasonably found that J5’s discussion of GENIUS did not address the considerations involved in the operational use of a SOA-based C4I system.

In sum, we find no basis to question the agency’s judgment that J5 did not discuss knowledge management in the context of the provided sample task.

Classification Levels

The oral presentation sample task required each offeror to address classification levels as part of developing an approach to designing and implementing a SOA-based, net-centric maritime C4I prototype to be hosted in the CCE. Classification is the categorization of information into different levels (e.g., unclassified, confidential, secret, top secret) based on the sensitivity of the information and potential for national security harm should it be revealed. AR, Tab 9, Declaration of TEB Chair, at 4. In the context of the assigned sample task as related to classification levels, the TEB considered whether an offeror’s oral presentation described the proposed data security level(s) for the prototype as well as the provisions for use of the proposed system in a single or multi-level classification environment. Id. If an offeror planned for its prototype to work in a multi-level environment, the agency determined that a description of classification levels should also describe methods for storing and transferring data so as to retain the proper classification and prevent spillage of higher classified information into a lower classification domain. Id.

The TEB found that while J5’s oral presentation described generally the concepts of system security engineering, the Department of Defense (DoD) Information Assurance Certification and Accreditation Process (DIACAP), and multi-level security systems, it did not directly address classification levels. AR, June 25, 2012, at 40. According to the agency, the protester’s oral presentation merely mentioned the term classification levels (or classification labels) at various instances in its presentation, but never fully discussed how classification levels would affect the prototype design and implementation process. Id. at 43. The agency concluded that the protester’s oral presentation did not specifically relate classification levels to a SOA-based, net-centric maritime C4I prototype as required by the assigned sample task. Id.

J5 has not shown the evaluation in this area to be unreasonable. In this regard, the protester again cites to various parts of its oral presentation (including charts) where it allegedly addressed classification levels in the context of the scenario provided in relation to the prototype to be developed. J5 Comments, July 6, 2012, at 22-24. We have examined the protester’s citations and find the assertions unfounded.

For example, the protester cites to its discussion during the oral presentation of security engineering. J5 Comments, July 6, 2012, at 22-24. The record, however, indicates that security engineering involves the development of secure systems methods and tests, including the transition of information technology systems from layered protection to true multi-level security systems.[12] PWS § 3.6. While security engineering was a PWS requirement, it was not a requirement of the oral presentation sample task.[13] Further, security engineering (which focuses on systems) is not the same as classification levels (which focuses on data). AR, July 16, 2012, at 43. J5’s presentation failed to discuss how security engineering related to classification levels, or discuss what classification levels now exist, or at what classification level(s) its designed prototype would operate as required by the sample task. AR, June 25, 2012, at 41.

The protester also cites its discussion of multi-level, system security requirements as related to software characteristics and databases. J5 Comments, July 6, 2012, at 22, citing Tr. at 21; AR, Tab 71, Chart No. 443. In this regard, the record shows that J5 provided a general overview of different databases, and mentioned multi-level security for operating systems. As noted by the agency, however, J5 did not address how software characteristics would be used to determine appropriate classification levels for the design and implementation of a SOA-based, net-centric maritime C4I prototype, as required by the sample task. AR, June 25, 2012, at 40.

The protester further cites its discussion of systems-engineering tradeoffs between security and responsiveness. J5 Comments, July 6, 2012, at 22, citing Tr. at 18; AR, Tab 64, Chart No. 436; Tab 65, Chart No. 437. In this regard, the protester argues that its discussion of system security requirements and encryption strengths was part of its presentation regarding classification levels. The protester, however, confuses or mischaracterizes the issue, since the record indicates that multi-level security is not equivalent to multiple classification levels. AR, July 16, 2012, at 43-47. Also, as noted by the agency, J5’s presentation did not explain how this topic specifically related to classification levels; did not identify the kinds of information a prototype C4I system would use; and did not state how this information would be transferred between different classification levels. AR, June 25, 2012, at 41.

In sum, we find reasonable the Navy’s determination that the mere fact that J5 may have mentioned classification levels at various points in its presentation was simply not a substitute for systematically addressing the topic in the context of the assigned sample task, which the protester did not do.

Preliminary Evaluator Ratings Differed from Final Consensus Ratings

Finally, J5 argues that the TEB’s consensus evaluation report is inconsistent with the individual evaluators’ notes and certain preliminary group notes.[14] J5 Comments, July 20, 2012, at 6-15; J5 Comments, July 30, 2012, at 1-5. This argument is without merit. It is not unusual for individual evaluator ratings to differ significantly from one another, or from the consensus ratings eventually assigned; indeed, the reconciling of such differences among evaluators’ viewpoints is the ultimate purpose of a consensus evaluation.[15] Hi-Tec Sys., Inc., supra, at 5; Neeser Constr., Inc./Allied Builders Sys., A Joint Venture, B-285903, Oct. 25, 2000, 2000 CPD ¶ 207 at 4. Our overriding concern is not whether an agency’s final ratings are consistent with preliminary ratings, but whether they reasonably reflect the relative merits of the proposals, consistent with the solicitation. See Naiad Inflatables of Newport, supra, at 11; Bering Straits Tech. Servs., LLC, B-401560.3, B-401560.4, Oct. 7, 2009, 2009 CPD ¶ 201 at 3. Based on our review, we see nothing unreasonable in the existence of differences between the evaluators’ preliminary ratings and the final consensus evaluation rating of J5’s oral presentation. See Naiad Inflatables of Newport, supra. Also, as discussed above, we find the consensus evaluation of J5’s oral presentation to be reasonable and reflective of the merits of J5’s proposal.

As the evaluation of J5’s proposal was reasonable, the Navy’s decision not to consider the protester’s proposal to be among those representing the “best value” to the government was also reasonable.

The protest is denied.

Lynn H. Gibson
General Counsel



[1] The RFP did not establish the relative importance of the past performance evaluation factor.

[2] An offeror could elect to participate further in the evaluation process despite an advisory opinion letter stating that it was considered not to be a viable competitor. Id. at 76.

[3] Other offerors included Accenture Federal Services LLC, Booz Allen Hamilton, Inc. (BAH), Scientific Research Corporation (SRC), FGM, Inc., and Science Applications International Corporation (SAIC).

[4] For example, the TEB found that, “[t]he offeror did not provide evidence of providing a turnkey solution, or spiral development processes for C4I proof-of-concept and prototype systems and networks.” Id., Tab 11, TEB Consensus Report, at 47.

[5] As part of the protest here, J5 created a written transcript of the audio tape recording of its oral presentation, which the parties agree to be accurate. J5 Comments, July 6, 2012, exh. C, J5 Oral Presentation Transcript (Tr.).

[6] J5 presented a total of 46 charts during its oral presentation, and the Navy took one or more photographs of each chart. AR, June 25, 2012, at 11 n.17.

[7] SSC Pacific Code 532 is the Navy Program Executive Office (PEO) for C4I. SOA is a set of principles and methodologies for designing and developing software in the form of interoperable services. “Net-centric” refers to the attributes of a robust, globally-interconnected network environment in which data is shared in a timely and seamless manner among users, applications, and platforms. Id., Tab 9, Declaration of TEB Chair, June 25, 2012, at 2.

[8] Likewise, J5’s charts specifically referenced the various PWS requirements rather than the sample task areas. See, e.g., AR, Tab 25, Chart No. 397; Tab 40, Chart No. 412; Tab 47, Chart No. 419; Tab 51, Chart No. 423; Tab 54, Chart No. 426.

[9] For example, with regard to classification levels, J5 contends that it discussed the topic in ten separate instances during its oral presentation. J5 Comments, July 6, 2012, at 22-24.

[10] The TEB also found that J5’s failure to address the tactical concept of operations in relation to the prototype to be developed was a minor weakness. AR, Tab 11, TEB Consensus Report, May 24, 2012, at 42.

[11] GENIUS is a software application that is used by the agency to relate system problem symptoms with related system documentation. AR, Tab 9, Declaration of TEB Chair, June 25, 2012, at 2. While the protester characterizes GENIUS as a knowledge management tool, J5 Comments, July 6, 2012, at 11, the Navy considers it to be a troubleshooting tool. AR, July 16, 2012, at 18.

[12] Security engineering may also involve new system security technologies, e.g., secure local area networks, smartcard technology, network encryption systems, virtual private networks, public key infrastructure, biometric authentication, and secure databases. PWS § 3.6.

[13] Security engineering is more closely related to the sample task topic of information assurance, which the TEB identified as a major strength in J5’s oral presentation. See AR, June 25, 2012, at 41-43; Tab 11, TEB Consensus Report, at 41.

[14] J5 also argues that the Navy failed to follow its source selection plan by not evaluating offerors’ oral presentations “immediately” after their conclusion. J5 Comments, July 6, 2012, at 7-8. This complaint does not establish a valid basis to object to the agency’s evaluation. An agency’s source selection plan is an internal guide that does not give rights to parties; it is the RFP’s evaluation scheme, not internal agency documents such as source selection plans, to which an agency is required to adhere in evaluating proposals. Meadowgate Techs., LLC, B-405989, B-405989.3, Jan. 17, 2012, 2012 CPD ¶ 27 at 6 n.7.

[15] Likewise, we are unaware of any requirement that every individual evaluator’s scoring sheet track the final evaluation report, or that the evaluation record document the various changes in evaluators’ viewpoints. See Smart Innovative Solutions, B-400323.3, Nov. 19, 2008, 2008 CPD ¶ 220 at 3.