Flairsoft, Ltd.

B-415716.37: Oct 21, 2019

Contact:

Ralph O. White
(202) 512-8278
WhiteRO@gao.gov

Kenneth E. Patton
(202) 512-8205
PattonK@gao.gov

Office of Public Affairs
(202) 512-4800
youngc1@gao.gov

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  Flairsoft, Ltd.

File:  B-415716.37

Date:  October 21, 2019

Matthew R. Keller, Esq., and Luke J. Archer, Esq., Praemia Law, PLLC, for the protester.
Alexis J. Bernstein, Esq., Colonel Patricia S. Wiegman-Lenz, Kevin P. Stiens, Esq., and Major Celina E. Duvall, Department of the Air Force, for the agency.
Katherine I. Riback, Esq., and Amy B. Pereira, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protest challenging agency’s evaluation of protester’s proposal under the technical experience evaluation factor is denied where the record shows that the evaluation was reasonable and consistent with the solicitation.

DECISION

Flairsoft, Ltd., a small business of Columbus, Ohio, protests the exclusion of its proposal from the competition by the Department of the Air Force under request for proposals (RFP) No. FA8771-17-R-1000 for information technology (IT) services.  Flairsoft argues that the agency unreasonably evaluated its proposal under the technical experience factor.

We deny the protest.

BACKGROUND

On September 28, 2017, the Air Force issued the Small Business Enterprise Application Solutions (SBEAS) RFP, which was set aside for small businesses, pursuant to the procedures of Federal Acquisition Regulation part 15.  Agency Report (AR), Tab 6, RFP at 162.[1]  The solicitation contemplated the award of 40 indefinite-delivery, indefinite-quantity (IDIQ) contracts with a 5-year base and 5-year option ordering period.  Id. at 138-139, 162.  The scope of the SBEAS RFP included a “comprehensive suite of IT services and IT solutions to support IT systems and software development in a variety of environments and infrastructures.”  Id. at 130.  Additional IT services in the solicitation included, but were not limited to, “documentation, operations, deployment, cybersecurity, configuration management, training, commercial off-the-shelf (COTS) product management and utilization, technology refresh, data and information services, information display services and business analysis for IT programs.”  Id. 

Proposals were to be evaluated based on two factors, technical experience and past performance.[2]  Id. at 164.  The technical experience factor was comprised of ten technical elements and various sub-elements (each with a designated point value), and one non‑technical experience element.[3]  Id. at 165-171.  The past performance factor was comprised of the following three subfactors in descending order of importance:  life‑cycle software services, cybersecurity, and information technology business analysis.  Id. at 164.  Award was to be made on a past performance tradeoff basis among technically acceptable offerors, using the three past performance subfactors.  Id. at 162. 

Section L of the solicitation instructed offerors that “[t]he proposal shall be clear, specific, and shall include sufficient detail for effective evaluation and for substantiating the validity of stated claims.”  Id. at 142.  Offerors were instructed to not simply rephrase or restate requirements, but to “provide [a] convincing rationale [addressing] how the [o]fferor’s proposal meets these requirements.”  Id.  The RFP also instructed offerors to assume that the agency has no knowledge of the offeror’s facilities and experience, and would “base its evaluation on the information presented in the [o]fferor’s proposal.”  Id. 

The solicitation provided that offerors should submit their proposals in four volumes:  capability maturity model integration (CMMI) documentation, technical experience, past performance, and contract documentation.[4]  Id. at 145.  As relevant to this protest, the technical volume was to contain a table of contents, a cross-reference matrix,[5] a glossary of terms, a self-scoring worksheet, and technical narratives (TNs).[6]  Id. at 149.  The RFP instructed offerors to describe, in their TNs, experience that supports the technical element points claimed in the self-scoring worksheet.  Id.

The solicitation stated that the agency intended to evaluate proposals and make awards without discussions to the offerors deemed responsible, and whose proposals conformed to the solicitation’s requirements and were judged, based on the evaluation factors, to represent the best value to the government.[7]  Id. at 162-163. 

Section M of the solicitation established a tiered evaluation process.  Id. at 163-164.  The first step of the evaluation was a CMMI appraisal, which required offerors to be certified at level 2 in CMMI.[8]  Id.  If an offeror passed the CMMI appraisal as level 2 certified, the agency would then evaluate an offeror’s technical experience using the self‑scoring worksheet and TNs provided by the offeror.  Id. at 164.  The solicitation provided that technical experience would receive an adjectival rating of acceptable or unacceptable.  Id. at 164-165.  A proposal would be considered acceptable when it attained 4,200 points per the self-scoring worksheet, and was “verified per the technical narratives.”  Id. at 165. 

In the event that technical experience was evaluated as acceptable, the agency would then evaluate the offeror’s past performance.  Id. at 164.  The agency would review the accompanying past performance narratives and evaluate each offeror’s past performance references for recency, relevancy, and quality.  Id. at 172.

Flairsoft timely submitted its proposal in response to the solicitation.  The agency evaluated the proposal as technically unacceptable because Flairsoft’s proposal did not obtain 4,200 points as required under the technical experience factor.  The protester filed a protest with our Office on August 24, 2018 (B-415716.12), challenging the agency’s evaluation of its proposal.  The agency notified our Office that it intended to take corrective action by reevaluating Flairsoft’s proposal.  Agency Notice of Corrective Action (Nov. 19, 2018).  On November 27, our Office dismissed Flairsoft’s protest as academic.  Flairsoft, Ltd., B-415716.12, Nov. 27, 2018 (unpublished decision).  Following the re‑evaluation, on June 28, 2019, the agency notified Flairsoft that its proposal was considered unacceptable and had been eliminated from further consideration because its proposal, having received 3,650 points, did not receive the minimum required 4,200 points under the technical experience factor.  AR, Tab 11, Flairsoft Notice of Removal from Competition (June 28, 2019), at 1.  On July 12, following its debriefing, Flairsoft filed this protest with our Office.

DISCUSSION

Flairsoft challenges the agency’s exclusion of its proposal from the competition, asserting that the agency failed to properly evaluate its proposal under the technical experience factor.  Specifically, the protester argues that the agency unreasonably deducted points under three of the sub-elements of the life-cycle software services element, two sub-elements of the information technology business analysis element, one sub-element of the tools/software development methodologies element, two sub-elements of the platforms/environments element, one sub-element of the database components element, three sub-elements of the mobile/internet of things element, and the server operating systems element.[9]  Although we do not specifically address all of Flairsoft’s allegations, we have fully considered all of them and find that none provide a basis on which to sustain the protest.[10]

Our Office will examine an agency’s evaluation of an offeror’s technical experience only to ensure that it was reasonable and consistent with the stated evaluation criteria and applicable statutes and regulations.  See Shumaker Trucking & Excavating Contractors, Inc., B-290732, Sept. 25, 2002, 2002 CPD ¶ 169 at 3.  A protester’s disagreement with a procuring agency’s judgment, without more, is insufficient to establish that the agency acted unreasonably.  WingGate Travel, Inc., B-412921, July 1, 2016, 2016 CPD ¶ 179 at 4-5.  In addition, it is an offeror’s responsibility to submit an adequately written proposal with adequately detailed information which clearly demonstrates compliance with the solicitation requirements and allows a meaningful review by the procuring agency.  See International Med. Corps, B-403688, Dec. 6, 2010, 2010 CPD ¶ 292 at 8.  An offeror’s technical evaluation is dependent on the information furnished, and an offeror that fails to submit an adequately written proposal runs the risk of having its proposal rejected as unacceptable.  Wegco, Inc., B-405673.3, May 21, 2012, 2012 CPD ¶ 161 at 4.  Because the solicitation provided that an offeror must score a minimum of 4,200 points to be rated technically acceptable, for the reasons discussed below, we need only address Flairsoft’s challenges to the agency’s evaluation under two sub‑elements of the life-cycle software services element (1a and 1e), one sub-element of the IT business analysis element (3b), one sub-element of the tools/development methodology element (5c), one sub‑element of the database components element (7a), and the server operating systems element (9).

Life-Cycle Software Services Element

The life-cycle software services element was comprised of five sub-elements: developing/implementation; re-engineering; data or system migration; modernization; and COTS/GOTS/FOSS enterprise resource planning software systems.  RFP at 165‑166.  As relevant here, Flairsoft challenges the agency’s evaluation of its proposal under the developing/implementation and COTS/GOTS/FOSS enterprise resource planning software systems sub‑elements of this element.  Protest at 10-18; Comments at 3-7.  In this regard, Flairsoft contends that the agency’s evaluation was unreasonable because the agency ignored other portions of Flairsoft’s proposal that established the required experience under these two sub-elements.  Id.

Developing/Implementation Sub-element

Flairsoft challenges the agency’s evaluation of its proposal under the developing/implementation sub-element (i.e., 1a) of the life-cycle software services element.  To receive the 500 points available under this sub‑element, an offeror was required to demonstrate experience in the design, build, test, and implementation of an information system in each of the following four areas:

  • The process of implementing software solutions to one or more sets of problems.  [hereinafter “design”]
  • The process by which source code is converted into a stand-alone form that can be run on a computer or to the form itself.  One of the most important steps of a software build is the compilation process, where source code files are converted into executable code.  [hereinafter “build”]
  • Obtaining, verifying, or providing data for any of the following:  the performance, operational capability, and suitability of systems, subsystems, components, or equipment items; or vulnerability and lethality of systems, subsystems, components, or equipment items.  [hereinafter “test”]
  • Planning; coordinating; scheduling; deploying/installing (or providing all needed technical assistance to deploy/install) and transitioning a technical solution (e.g., information system) into the operational environment.  [hereinafter “implementation”]

RFP at 165‑166 and 185.  The agency’s evaluation concluded that while Flairsoft’s proposal contained sufficient design, test, and implementation experience, the proposal did not demonstrate its experience building an information system (IS) as required by the solicitation.  AR, Tab 10, Flairsoft Technical Evaluation, at 5.  Finding Flairsoft to lack experience in one of the four areas, the agency awarded it no points for the developing/implementation sub‑element.  Id. at 6. 

Flairsoft argues that because this is a technical experience factor, not a technical approach factor, the agency unreasonably required that it demonstrate how it performed the stated functions, which amounted to the application of an unstated evaluation criterion.  The protester contends that the solicitation required it to identify recent contracts where it performed the tasks identified in the various sub-elements, which it did in its proposal.  Flairsoft cites to multiple sections of TN 1, a commercial software modernization project for Williams Energy Land Management, and TN 2, a project that dealt with the Michigan Statewide Automated Child Welfare Information System (MiSACWIS), that it contends demonstrate its experience building an IS.[11]  Protest at 11-16.  For example, with regard to TN 1, Flairsoft contends that:

Nowhere in the Solicitation does the Agency instruct offerors to describe their methods for converting and compiling source code.  Flairsoft stated in TN-1 that it performed that process for Williams when it designed, built, and implemented its Flairdocs solution for Williams.  Exhibit C-1, Volume II-Technical Experience, § 1.1.1.  It needed to say no more.

Id. at 8. 

In response, the agency contends that the solicitation required offerors to demonstrate the process by which source code was converted into a stand-alone form that could be run on a computer, or the form itself.  RFP at 166; COS at 12.  The agency states that it found that neither TN 1 nor TN 2 demonstrated the process Flairsoft used to build the Williams or MiSACWIS systems.[12]  COS at 19-20; AR, Tab 8, Flairsoft’s Self‑Scoring Worksheet, at 2; Tab 9, Flairsoft’s Cross-Reference Matrix, at 2.  For example, the agency’s evaluation of Flairsoft’s TN 1 stated the following:

While the proposal uses the terms “build” and “compilation,” it does not demonstrate the offeror’s process of converting source code into a standalone form and does not describe the compilation process used on the Williams project.  The proposal only identifies a code editor [DELETED] and development framework [DELETED] that was used, but does not demonstrate the offeror’s process to convert and compile the source code into an executable and stand-alone form as is required by the sub-element.

AR, Tab 10, Flairsoft Technical Evaluation, at 5.  The agency similarly concluded that TN 2 did not provide any information on the offeror’s actual process to convert source code into a stand-alone form, and did not demonstrate the compilation process the offeror performed for the MiSACWIS project.  Id. 

We find no basis to question the agency’s determination that the protester failed to demonstrate the build experience required under this sub-element.  As an initial matter, we reject the protester’s contention that simply because its proposal stated that it successfully completed a particular task, the agency was required to credit Flairsoft with that experience.  Although Flairsoft’s proposal uses some of the same terms from the solicitation’s description of the required build experience, the protester fails to establish that the agency improperly determined that its proposal lacked specific details to demonstrate the required experience executing a build process.  See Microwave Monolithics, Inc., B-413088, Aug. 11, 2016, 2016 CPD ¶ 220 at 6.  Consequently, we deny this protest ground.

COTS/GOTS/FOSS Enterprise Resource Planning Software Systems Sub‑element

Flairsoft disputes the agency’s evaluation of its proposal under the COTS/GOTS/FOSS enterprise resource planning (ERP) software systems sub‑element (i.e., 1e), of the life‑cycle software services element.  To receive the 200 points available under the COTS/GOTS/FOSS ERP software systems sub‑element, an offeror was required to demonstrate experience in one of the following:

  • Implementing one (1) COTS, GOTS or FOSS ERP software package to satisfy complex business processes in the finance, personnel, and/or supply chain/manufacturing domain for one or more customer organizations where the offeror’s COTS, GOTS or FOSS ERP software implementation was ultimately fielded for operational use by the customer.
  • OR
  • Providing lifecycle software service support for one (1) COTS, GOTS or FOSS ERP software implementation for which the offeror was not the original implementer at initial deployment where one (1) of the following is demonstrated:
    • the offeror played a key role in working with the customer to develop, define and/or blueprint operational business rules that were implemented by the COTS, GOTS or FOSS ERP software package; or
    • the offeror performed gap analysis and developed resulting custom reports, interfaces, data conversions, and functional extensions to the COTS, GOTS or FOSS ERP software product.

RFP at 166-167.  The RFP provided that the agency would not accept points claimed by the offeror if the offeror did not “identify the COTS/GOTS/FOSS SW/ERP package with which it ha[d] experience.”  Id. at 166-167.

The solicitation defines an ERP system as:

A configurable, packaged, commercial software package designed to enable an organization to integrate and manage the efficient and effective use of resources by providing a total, integrated solution for the organization information-processing needs.  Consider a supply chain system which tracks information from procurement and inventory to automatically raise purchase orders for approval; and is capable of generating multiple reports on a single click.  ERP systems are capable of accessing data from all the modules and run it as an enterprise-wide system.  ERPs tend to be very large, involve a multitude of stakeholders, and take a long time and considerable cost to implement.

RFP at 211.  Flairsoft’s proposal listed TN 1 and TN 4--a project involving the Oregon Right of Way Information Tracking System--as demonstrating experience under this sub-element.  Both of these narratives mention Flairdocs, which was defined in Flairsoft’s technical proposal as a “web-enabled Right of Way Software and Real Estate Management workflow and document management solution.”  AR, Tab 8, Flairsoft’s Self-Scoring Worksheet at 2; Tab 9, Flairsoft’s Cross‑Reference Matrix at 2; Tab 7, Vol. II, Flairsoft’s Technical Proposal at 24. 

The agency determined that Flairsoft did not demonstrate the required experience using an ERP product because Flairdocs, the product that Flairsoft claims demonstrated its required experience in TN 1 and TN 4, did not meet the RFP definition of an ERP.  AR, Tab 10, Flairsoft’s Technical Evaluation, at 14.  The agency found that Flairsoft’s definition of Flairdocs in its proposal, as a “workflow and document management solution,” does not meet the solicitation’s definition of an ERP, which is a “configurable, packaged, commercial software package designed to enable an organization to integrate and manage the efficient and effective use of resources by providing a total, integrated solution for the organization information-processing needs.”  AR, Tab 7, Vol. II, Flairsoft’s Technical Proposal, at 24; RFP at 211.  The agency maintained that, by relying on its commercial product, Flairdocs, the offeror’s proposal failed to demonstrate use of an ERP product and, thus, did not meet the requirements for this sub-element.  AR, Tab 10, Flairsoft’s Technical Evaluation, at 14.  The agency contends that while Flairsoft, in its protest, attempts to cite certain statements in its proposal to establish that Flairdocs is an ERP, none of these statements address any part of the ERP definition provided in the solicitation with regard to Flairdocs.  COS at 26.

Flairsoft contends that its proposal thoroughly explains how Flairdocs is an ERP, and demonstrates the required experience.  Flairsoft also contends that the agency improperly determined that Flairdocs is not an ERP. 

We have reviewed the record, and while Flairsoft argues that Flairdocs is an ERP, it largely relies on block quotations from its applicable TNs coupled with conclusory statements that these quotations meet certain portions of the ERP definition.  Protest at 16-18; Comments at 5-6.  Lacking in the protester’s arguments is a substantive explanation of how these quotations meet the solicitation requirements or how the agency’s evaluation was unreasonable.  As such, Flairsoft provides our Office with no basis to find the agency’s evaluation unreasonable.  Flairsoft’s arguments in this regard reflect Flairsoft’s general disagreement with the agency’s evaluation findings, and are not sufficient to establish that the agency acted unreasonably.  Trofholz Techs., Inc., B‑404101, Jan. 5, 2011, 2011 CPD ¶ 144 at 3-4.  Consequently, we deny this protest ground.

IT Business Analysis Element

Flairsoft challenges the agency’s evaluation of its proposal under the testing, validation and verification sub‑element (i.e., 3b) of the IT business analysis element.  This element was comprised of the following four sub-elements:  requirements analysis; testing, validation and verification; service desk/help desk; and functional business area expert.  RFP at 167-168.  In order to receive the 150 points available under this sub-element, offerors were required to demonstrate experience providing testing, validation and verification as a life-cycle software service in all of the areas defined below:

  • Obtaining, verifying, or providing data for any of the following:  the performance, operational capability, and suitability of systems, subsystems, components, or equipment items; or vulnerability and lethality of systems, subsystems, components, or equipment items.  [hereinafter “testing”]
  • Evaluating a system or software component, in the development process to determine whether the item satisfies specified requirements; [hereinafter “validation”] and
  • Confirming a system element meets design-to or build-to specifications.  [hereinafter “verification”]

RFP at 167-168.  The agency evaluated TN 2, a project that dealt with the MiSACWIS system, and TN 3, which concerns COTS software implementation for MDM Solutions,[13] both cited by Flairsoft for this sub‑element in its self-scoring worksheet.[14]  AR, Tab 8, Flairsoft’s Self-Scoring Worksheet at 3.  The agency determined these technical narratives failed to demonstrate Flairsoft’s experience with regard to validation and verification.  AR, Tab 10, Flairsoft Technical Evaluation, at 22.  Finding Flairsoft to lack experience in two of the three required areas, the agency awarded Flairsoft’s proposal no points for this sub‑element.  Id. at 24. 

Flairsoft argues that its proposal demonstrates both validation and verification, and that the agency “completely ignored” its technical narratives.  Protest at 9, 22-26.  As one example, the protester contends that the agency ignored the following description of its performance in TN 2, which focused on verification:

Our testing team relies on the [DELETED] and [DELETED] suite of tools to maintain a robust testing capability.  We can use [DELETED] or [DELETED] to provide functional testing.  We used [DELETED] in our MiSACWIS effort providing customer tester and our own internal testing team a common environment for test.  We standup a [DELETED] environment in days and provide user and functional testing.  We also implement [DELETED] to provide performance testing capability.  For our CREC effort, we delivered [DELETED] and similarly improved functional and performance testing times.  [DELETED] and [DELETED] all provide database for managing the test information, comprehensive test reporting, and test script automation. 

AR, Tab 7, Vol. II, Flairsoft’s Technical Proposal at 14. 

The agency responds that while Flairsoft’s proposal identified a testing tool [DELETED], and outlined the offeror’s “testing methodology,” its proposal did not demonstrate the offeror’s experience performing validation and verification services on the MiSACWIS project.  AR, Tab 10, Flairsoft’s Technical Evaluation at 23.  The agency determined that while this technical narrative provides the definition and purpose of the various tests, “[n]aming a tool and test does not demonstrate the use of that tool to provide validation or verification of MiSACWIS.”  Id.

The record demonstrates that the agency reasonably determined that while the protester “states generally the validation and verification process,” it failed to provide sufficient information that would demonstrate the offeror’s specific experience providing validation and verification, as defined in the solicitation.  Id.  In this regard, the agency reasonably determined that Flairsoft’s inclusion of general statements that it followed certain procedures, policies, and processes for validation and verification failed to meet the solicitation’s requirements regarding its experience.  Although Flairsoft contests the agency’s evaluation, we find its arguments amount to disagreement with the agency’s evaluation which, by itself, is not sufficient to establish that the evaluation was unreasonable.  Ben‑Mar Enters., Inc., B‑295781, Apr. 7, 2005, 2005 CPD ¶ 68 at 7.  Therefore, we deny this protest ground.

Tools/Development Methodology Element

Flairsoft contests the agency’s evaluation of its proposal under the testing sub‑element (i.e., 5c) of the tools/development methodology element.  Protest at 26-29.  The tools/development methodology element was comprised of the following four sub‑elements:  security; quality; testing; and software development methodologies.  RFP at 168-169.  In order to receive the 150 points available under the testing sub-element, offerors were required to demonstrate experience using a COTS, GOTS or FOSS tool in the functional area of testing to analyze source code for vulnerabilities during the life‑cycle of a project.  RFP at 169.  The RFP provided that the agency would evaluate the offeror’s experience using a common database to manage the test information for the system under test (to include capturing defects) or the offeror’s experience creating, maintaining, and executing test scripts using an automated tool.  Id.  The solicitation also stated that, in order to receive points under this sub-element, offerors were required to identify the tool with which they have experience, “to include but not limited to:  Hewlett Packard Application Lifecycle Management (HP ALM), Selenium, [and] Quick Test Pro.”  Id. 

The agency’s evaluation concluded that Flairsoft’s proposal failed to demonstrate experience using a common database to manage the test information for the system under test or Flairsoft’s experience creating, maintaining, and executing test scripts using an automated tool.  AR, Tab 10, Flairsoft Technical Evaluation, at 29-30.

The protester claims that it demonstrated its experience “using a common database to manage test information for the system under test” (RFP at 169) when it stated in TN 2 that it had “used [DELETED] in our MiSACWIS effort providing customer tester and our own internal testing team a common environment for test.”[15]  Protest at 28, quoting AR, Tab 7, Vol. II, Flairsoft Technical Proposal, at 14; Comments at 14.

The agency contends that Flairsoft’s declaration that the tool is used for “a common environment” for the customer and internal team failed to demonstrate its “experience using the tool as a common database/environment to manage test information for MiSACWIS or creating, maintaining and executing test scripts using an automated tool as is required by the sub-element.”  AR, Tab 10, Flairsoft’s Technical Evaluation, at 30.  While Flairsoft’s proposal states that it used [DELETED] and [DELETED] on MiSACWIS, the agency contends that the proposal provided no further information to actually demonstrate the use of either tool as required by the solicitation.  Id.  The agency states that the criteria for this sub-element required “more than a generic statement” about the program capabilities of [DELETED] and [DELETED].  COS at 37.  The agency argues that Flairsoft included such a generic statement in its proposal when it stated that the program capabilities of [DELETED] and [DELETED] are to “provide database for managing the test information, comprehensive test reporting, and test script automation.”  Id., quoting AR, Tab 7, Vol. II, Flairsoft Technical Proposal, at 14. 

Based on our review of the record, we find reasonable the agency’s conclusion that Flairsoft’s statements in its proposal did not meet the requirements of this sub-element.  We also agree that Flairsoft failed to demonstrate its experience using a common database to manage test information, or its experience creating, maintaining, and executing test scripts using an automated tool.  In this regard, we agree with the agency that Flairsoft’s statement that it had used [DELETED] did not demonstrate its use on the project as a common database or provide automated testing with the required test script experience, as required by the solicitation.  Flairsoft’s challenge to the agency’s evaluation amounts to disagreement with the agency’s evaluation which, without more, is not sufficient to establish that the evaluation was unreasonable.  This protest ground is denied.

Database Components Element

Flairsoft challenges the agency’s evaluation of its proposal under the RDBMS sub‑element (i.e., 7a) of the database components element.  This element was comprised of the following three sub-elements:  relational database management system (RDBMS); not only structured query language (NoSQL); and RDBMS or NoSQL.  RFP at 170-171.  In order to receive the 200 points available under this sub‑element, offerors were required to demonstrate experience developing, designing or maintaining a RDBMS database to include, but not limited to:  Oracle, SQL Server, DB2, Sybase, PostgreSQL, MariaDB, JasperSoft, and MySQL.  RFP at 170.  The solicitation stated that the agency would not accept points claimed by the offeror if the offeror did not identify the database with which it had experience.  Id. 

The agency evaluated Flairsoft’s referenced TNs and concluded that they did not demonstrate the offeror’s experience developing, designing or maintaining a RDBMS database.  AR, Tab 10, Flairsoft’s Technical Evaluation, at 38. 

Flairsoft disagrees.  The protester asserts that the agency improperly ignored portions of its cited technical narratives that provided the required information.  Id.  In support of its argument, Flairsoft provides several excerpts from TN 1, including the following:

We provided the complete technical solution to include the design, build, and testing of the new software system along with delivery of the system architecture design and database design. 

AR, Tab 7, Vol. II, Flairsoft Technical Proposal, at 5.  The agency responds that while Flairsoft’s proposal states that the offeror “provided the complete technical solution to include the design,” and references “database design,” neither reference demonstrates Flairsoft’s actual designing of the database.  AR, Tab 10, Flairsoft Technical Evaluation, at 38.  The agency contends that the protester’s rephrasing and restating of the evaluation criteria is not sufficient to demonstrate its experience.  COS at 42.  In this regard, the agency’s evaluation stated that, “[p]roviding a declaration of the delivery of the database design does not fulfill the sub-element requirement to demonstrate experience developing, designing or maintaining a RDBMS database.”  AR, Tab 10, Flairsoft’s Technical Evaluation at 38.  The agency also concluded that “stating that the offeror tailors forms, reports, UI, installs Oracle, and is an Oracle Gold Partner does not demonstrate the offeror’s experience developing, designing or maintaining an Oracle RDBMS as is required by this sub-element.”  Id. at 39.  In this regard, the agency argues that Flairsoft’s proposal did not provide sufficient details for the agency to substantiate the offeror’s claimed experience.

Based upon our review of the record, we find no basis to question the agency’s conclusion that the protester failed to demonstrate the experience required under this sub-element.  As noted above, the solicitation clearly instructed offerors that proposals “shall be clear, specific, and shall include sufficient detail for effective evaluation and for substantiating the validity of stated claims.”  RFP at 142; see also RFP at 165 (“The Government will deduct points claimed by the offeror for a technical experience element when a technical narrative does not demonstrate the required experience.”).  Therefore, while Flairsoft, in TN 1, stated that it “provided the complete technical solution to include the design,” and referenced “database design” (AR, Tab 7, Vol. II, Flairsoft Technical Proposal, at 5), the agency reasonably determined that Flairsoft failed to demonstrate its actual experience in developing, designing, or maintaining a RDBMS database.  Flairsoft’s disagreement with the agency’s evaluation, without more, is not sufficient to establish that the evaluation was unreasonable.[16]  Ben-Mar Enters., supra.  We deny this protest ground.[17]

Server Operating Systems Element

Finally, Flairsoft challenges the agency’s evaluation of its proposal under the server operating systems element (i.e., 9).  Protest at 38-41.  To receive the 300 points available under this element, an offeror was required to demonstrate experience providing life-cycle services to support the efficient operations of an information system for any of the following distribution servers:  Windows Server, Red Hat Enterprise Linux, SUSE (Software and Systems Development), UBUNTU.  RFP at 171, 188.  The solicitation further advised that the agency would not accept points claimed if the offeror did not identify one of the distribution servers previously listed, with which it had experience.  Id. at 171.  The RFP defined life-cycle services as “[t]he scope of activities associated with a system, encompassing the system’s initiation, development, implementation, operation and maintenance, and ultimately its disposal that instigates another system initiation.”  Id. at 216.

The agency’s evaluation concluded that Flairsoft’s proposal did not demonstrate experience providing services using one of the distribution servers required.  AR, Tab 10, Flairsoft Technical Evaluation, at 48-49.  In its protest, Flairsoft challenges the agency’s evaluation under this element by quoting language from TN 3.[18]  The protester argues that activities that it performed, and that it described in TN 3, such as installing security patches, application patches and human monitoring, “are the essence of ‘support[ing] the efficient operation of an IS’” on a referenced system.  Comments at 20-21. 

The agency evaluated TN 3 and found that Flairsoft provided a “generic statement of experience and responsibility,” but failed to correlate any of the services mentioned to supporting the efficient operation of an IS using any of the servers required.  AR, Tab 10, Flairsoft’s Technical Evaluation at 49.  The agency determined that Flairsoft failed to demonstrate the required experience of this sub-element.  Id.

Flairsoft’s filings with our Office fail to establish that the agency improperly ignored relevant information in its proposal regarding certain services that it had performed.  Rather, the agency reasonably determined that while Flairsoft stated in its proposal that it performed certain services, such as installing security patches, it failed to provide sufficient information in its proposal that would correlate these services to supporting the efficient operation of an IS using any of the servers required.  We therefore find that Flairsoft’s general challenge amounts to disagreement with the agency’s evaluation which, by itself, is not sufficient to establish that the evaluation was unreasonable.[19]  This protest ground is denied.

Given our determinations above, we need not address the protester’s other challenges to the agency’s evaluation because even if Flairsoft were to prevail with regard to its remaining challenges, its proposal would remain technically unacceptable.  As stated above, in order to receive an acceptable rating under the technical experience factor, a proposal had to receive a score of at least 4,200 points; Flairsoft’s technical proposal received a score of 3,650 points.  Thus, even if our Office agreed with Flairsoft regarding its remaining alleged evaluation errors, this would only afford Flairsoft an additional 450 points, for a total technical score of 4,100, which is 100 points below a technically acceptable score.[20]
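
Stated arithmetically, using the evaluated score from the record and the sub-element point values set out in note 20, below, the shortfall works out as follows:

  150 (3a) + 100 (8a) + 100 (8b) + 100 (8c) = 450 points remaining in dispute
  3,650 (evaluated score) + 450 = 4,100 points, which is 100 points below the 4,200-point threshold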

The protest is denied.

Thomas H. Armstrong
General Counsel



[1] Citations to the RFP are to the conformed copy provided by the agency.  AR, Tab 6, RFP.

[2] The solicitation stated that pursuant to “10 U.S.C. § 2305(a)(3)(C), as amended by Section 825 of the National Defense Authorization Act (NDAA) for Fiscal Year 2017, the Government will not evaluate cost or price for the IDIQ contract.  Cost or price to the Government will be considered in conjunction with the issuance of a task or delivery order under any contract awarded hereunder.”  Id. at 162.

[3] The technical experience factor was comprised of the following ten technical elements:  (1) life-cycle software services; (2) cybersecurity; (3) IT business analysis; (4) programming languages/frameworks; (5) tools/software development methodologies; (6) platforms/environments; (7) database components; (8) mobile/internet of things; (9) server operating systems; and (10) COTS/GOTS (government-off-the-shelf)/FOSS (free and open source software) software, as well as the non‑technical experience element of government facility clearance level.  Id. at 165‑171.  Under these ten elements are a series of sub-elements, designated by letters.  For example, under the first element are five sub-elements, designated as 1a, 1b, 1c, 1d, and 1e.  Id. at 165-166. 

[4] The technical experience volume and the past performance volume had page limits of 20 pages and 25 pages, respectively.  RFP at 145.  The CMMI documentation volume and the contract documentation volume had no page limitations.  Id.

[5] The RFP’s instructions directed offerors to complete a cross-reference matrix, which was attached to the solicitation.  Id. at 146, 179-183.  The offeror’s cross‑reference matrix was required to demonstrate “traceability” between the offeror’s contract references.  Id. at 146.  An offeror’s cross-reference matrix was required to show “which contract references [were] used to satisfy each technical element and each past performance sub-factor.”  Id.

[6] The solicitation allowed offerors to provide up to six contract references, each of which was to have its own TN, to demonstrate its technical experience.  Id. at 149.  TNs were to be submitted in numerical order (i.e., TN 1, TN 2, TN 3).  Id. 

[7] The agency’s estimated value for all of the SBEAS contract awards is a maximum of $13.4 billion.  AR, Tab 1, Contracting Officer’s Statement (COS) at 3. 

[8] CMMI is a process level improvement training and appraisal program that is administered by the CMMI Institute.

[9] Flairsoft initially challenged the agency’s evaluation under the commercial, non‑commercial, or hybrid cloud sub-element (i.e., 6c) of the platforms/environments element.  Protest at 29-30.  However, the agency provided a substantive response to this protest ground in its agency report, COS at 37-40, and Flairsoft failed to respond to this argument in its comments.  Comments at 13-14.  We therefore dismiss this protest ground as abandoned.  IntelliDyne, LLC, B-409107 et al., Jan. 16, 2014, 2014 CPD ¶ 34 at 3 n.3; see 4 C.F.R. § 21.3(i)(3).

[10] For example, the RFP instructed that technical experience and past performance information should be addressed in separate proposal volumes, each with specific page limitations.  RFP at 145.  The protester argues that the agency improperly failed to consider information found in Volume III of its proposal, titled Past Performance, regarding sub-elements 1a, 1d, and element 9, which were considered as part of the technical evaluation.  Protest at 41-45.  According to Flairsoft, this information was “too close at hand” for the agency not to consider.  Id. at 41.  While the protester uses the term “too close at hand,” the protester is in fact arguing that the agency should have considered information in its past performance volume, Volume III, as part of the evaluation of its technical experience volume, Volume II.  An offeror’s technical evaluation is dependent on the information furnished, and it is an offeror’s responsibility to submit a well-written proposal, with adequately detailed information which clearly demonstrates compliance with the solicitation requirements and allows a meaningful review by the procuring agency.  International Med. Corps, supra, at 8.  Here, the solicitation clearly stated that the technical experience evaluation would utilize the offeror’s self‑scoring worksheet and its TNs, and made no mention of using past performance narratives in the technical evaluation.  RFP at 165.  While Flairsoft’s past performance narratives may contain information that the agency found lacking in the company’s technical proposal, the agency was not required to piece together disparate parts of Flairsoft’s proposal to determine whether it met the solicitation requirements.  James Constr., B‑402429, Apr. 21, 2010, 2010 CPD ¶ 98 at 5.  We deny this protest ground. 

Flairsoft further argues that, under sub-element 6d, the agency improperly failed to consider information about which the agency had first-hand knowledge regarding a certain Air Force project that Flairsoft contends the agency should have considered in evaluating its proposal.  Protest at 44-45.  However, our Office has found that while agencies may consider such information when evaluating experience, they are not required to do so.  SNAP, Inc., B-409609, B-409609.3, June 20, 2014, 2014 CPD ¶ 187 at 8.  In short, the Air Force was not required to remedy Flairsoft’s failure to include information in its proposal.  Id. 

[11] In its protest, Flairsoft also argues that portions of TN 4 demonstrate its build experience.  Protest at 12.  Flairsoft’s prior protest of the agency’s evaluation under this sub-element did not contain any arguments that the agency improperly evaluated or failed to evaluate TN 4 under this sub-element.  Flairsoft Protest (B-415716.12) at 9-10, 15-16, and 24.  Yet, Flairsoft knew at that time that the agency had not credited it with the build experience it now argues is demonstrated by TN 4.  We dismiss the protester’s arguments regarding TN 4 as untimely because these arguments could have and should have been raised in its earlier protest.  The agency’s reevaluation does not provide the protester with a timely basis for raising an issue that could have been raised in the previous protest.  Savvee Consulting, Inc., B‑408416.3, Mar. 5, 2014, 2014 CPD ¶ 92 at 6-7 (dismissing as untimely new arguments that were based on information that was known to the protester in the prior protest); Loyal Source Gov’t Servs, LLC, B‑407791.5, Apr. 9, 2014, 2014 CPD ¶ 127 at 5-7.

[12] The agency notes that while the RFP’s Instructions to Offerors states that references should be the same for both the cross-reference matrix and self-scoring worksheet, Flairsoft’s proposal contained multiple references in its cross-reference matrix that were not included in its self-scoring worksheet, such as TN 2 described above, with regard to sub-element 1a.  COS at 20 n.3.  The agency states that in most instances it evaluated the TNs cited in the cross‑reference matrix in addition to those TNs cited in the self-scoring worksheet.  Id.  However, regarding sub-element 3b, Flairsoft’s self‑scoring worksheet listed TN 2 and TN 3 (AR, Tab 8, Flairsoft’s Self-Scoring Worksheet, at 3), and its cross-reference matrix listed TN 4 (AR, Tab 9, Flairsoft’s Cross-Reference Matrix at 2).  The agency evaluated only the technical narratives listed in the self-scoring worksheet. 

[13] The agency’s debriefing and the technical evaluation documents for this sub-element both contained a typographical error stating that the agency re-evaluated TN 2 and TN 4, rather than TN 2 and TN 3, as cited by Flairsoft for this sub‑element in its self‑scoring worksheet.  AR, Tab 8, Flairsoft’s Self-Scoring Worksheet at 3; Tab 12, Flairsoft Debriefing, at 12; Tab 10, Flairsoft Technical Evaluation, at 22.  Despite this error, both the debriefing and the technical evaluation state that Flairsoft’s proposal cited TN 2 and TN 3 for this sub-element.  Id.  In addition, the second evaluation narrative clearly refers to TN 3, which concerns COTS software implementation for MDM Solutions, since MDM Solutions is referred to throughout the agency’s evaluation narrative and the agency even specifically refers to “TN #3” in its evaluation.  AR, Tab 12, Flairsoft Debriefing, at 13-14; Tab 10, Flairsoft Technical Evaluation, at 23-24. 

[14] Flairsoft’s protest also provided citations to TN 1 and TN 4, TNs not included in Flairsoft’s previous protest challenging the agency’s evaluation under this sub-element.  Flairsoft Protest (B-415716.12) at 11, 28-29.  As discussed above, we dismiss these protest grounds as untimely because they could have and should have been raised in its earlier protest.  Savvee Consulting, Inc., supra; Loyal Source Gov’t Servs, LLC, supra.

[15] Flairsoft’s protest also included arguments that the agency improperly found that TN 1 did not show the required experience.  Protest at 27-29.  The agency provided a substantive response to these arguments in its agency report.  COS at 32.  While Flairsoft’s comments mention TN 1, they do not refute the agency’s conclusions with regard to that technical narrative.  Comments at 13-14.  We therefore dismiss its arguments regarding TN 1 as abandoned.  IntelliDyne, LLC, supra; see 4 C.F.R. § 21.3(i)(3). 

[16] Flairsoft also included in its protest and comments citations to TN 2 and TN 4, TNs not cited in Flairsoft’s previous protest contesting the agency’s evaluation under this sub-element.  Flairsoft Protest (B-415716.12) at 21.  As discussed above, we dismiss these protest grounds as untimely because they could have and should have been raised in its earlier protest.  Savvee Consulting, Inc., supra; Loyal Source Gov’t Servs, LLC, supra.

[17] The protester, for the first time in its comments, includes an explanation of how the sections of TN 1 cited in its protest provided the information required by the solicitation.  Compare Protest at 30-33, with Comments at 14‑16.  Our Bid Protest Regulations, 4 C.F.R. § 21.2(a)(2), require that protests other than those challenging the terms of a solicitation be filed within 10 days of when a protester knew or should have known of its basis for protest.  Further, our Bid Protest Regulations do not contemplate the piecemeal presentation or development of protest issues through later submissions citing examples or providing alternate or more specific legal arguments missing from earlier general allegations of impropriety.  See J5 Systems, Inc., B‑406800, Aug. 31, 2012, 2012 CPD ¶ 252 at 5.  We find that this additional explanation in Flairsoft’s comments amounts to a piecemeal presentation of information which we will not consider.  See Metasoft, LLC--Recon., B‑402800.2, Feb. 17, 2011, 2011 CPD ¶ 47 at 2‑3.

[18] While Flairsoft’s proposal cited TN 1 and TN 3 for this element, Flairsoft’s protest only addresses TN 3, so we will only address TN 3.

[19] Flairsoft’s protest also included citations to TN 2, a TN that was not raised in Flairsoft’s previous protest contesting the agency’s evaluation under this element.  Flairsoft Protest (B‑415716.12) at 13, 23-24, and 29.  As discussed above, we dismiss these protest grounds as untimely because they could have and should have been raised in its earlier protest.  Savvee Consulting, Inc., supra; Loyal Source Gov’t Servs, LLC, supra.

[20] In its protest, Flairsoft challenged the agency’s evaluation under the following 12 sub‑elements and one element:  1a, 1d, 1e, 3a, 3b, 5c, 6c, 6d, 7a, 8a, 8b, 8c, and 9.  As discussed above, the protester’s allegations with regard to 6c are dismissed, and its allegations with respect to 1a, 1d, 1e, 3b, 5c, 6d, 7a, and 9 are denied.  Consequently, even if meritorious, the remaining protest grounds would only result in an additional 450 points, according to the following breakdown:  3a = 150 points, 8a = 100 points, 8b = 100 points, and 8c = 100 points.  RFP at 185-188.