
B-297252.3, TruLogic, Inc., January 30, 2006

Decision

Matter of: TruLogic, Inc.

File: B-297252.3

Date: January 30, 2006

Theresa Armentrout for the protester.

Igor Boris, Command Technology, Inc., for the intervenor.

Maj. Jeffrey Branstetter, Department of the Air Force, for the agency.

Sharon L. Larkin, Esq., and James A. Spangenberg, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Source selection authority's (SSA) disagreement with the majority of the evaluators and acceptance of the minority's recommendation that the awardee be selected for award is unobjectionable and does not evidence a lack of "impartiality," where the SSA reached a reasoned conclusion, supported by the record, that the awardee's lower-priced, lower-rated proposal deserved a higher technical rating than was assigned by the majority and represented the best value to the government.

DECISION

TruLogic, Inc. protests the award of a contract to Command Technology, Inc. (CTI) under request for proposals (RFP) No. FA8104-05-R-0414, issued by the Department of the Air Force for Interactive Electronic Technical Manual (IETM) systems development and Technical Order (TO) "sustainment." TruLogic challenges the agency's technical and price evaluation, and contends that the source selection authority (SSA) was biased.

We deny the protest.

BACKGROUND

TruLogic previously protested an Air Force decision to award a contract for IETM system development to CTI under the RFP. We dismissed the protest after the Air Force notified our Office that it was taking corrective action. The agency then amended the RFP, sought and evaluated revised proposals, and once again selected CTI for award. This protest challenges the new award to CTI.[1]

The RFP, set aside for small businesses, contemplated the award of a fixed-price contract for IETM systems development and TO sustainment support for specific Air Force engines. An IETM is a secure computer solution that distributes standards-based electronic technical data. RFP, Statement of Work (SOW), sect. 2.1. The required tasks in the SOW included developing IETM systems from government-furnished digital TO source files, converting TOs from legacy paper format to an Air Force-approved IETM system format, providing sustainment support for applicable TOs after the IETM systems are built, and providing improvements and enhancements to the IETM systems during the course of the contract. Id. sect. 1.1. The agency did not dictate how the IETM systems were to be built. Rather, it left to the offerors to determine which "methodology" and "processes" to use to develop the IETM systems. The SOW did state that the IETM and all deliverables had to be compliant with Air Force, Department of Defense, and other identified standards, as well as certain specified "functionality requirements." Id. sections 2.1.2, 2.2.

The RFP provided that award would be made to a General Services Administration (GSA) contract holder under the firm's GSA Federal Supply Schedule contract for a base period with four 1-year options. The RFP stated that award would be made on a "best-value" basis, considering mission capability (including two equally weighted subfactors for program management[2] and technical performance[3]), proposal risk, past performance, and price. The first three factors were stated to be of equal importance and combined were "significantly more important than price; however, price will contribute substantially to the selection decision." RFP, Evaluation Factors for Award, sections 2.1, 2.2.

With regard to the program management subfactor of the mission capability factor, offerors were requested to "describe in detail a sound and rational approach to program management for the IETM [systems] program" that addressed "at a minimum" each of the identified "essential components" of the approach. With regard to the technical performance subfactor of the mission capability factor, offerors were instructed to "describe in detail a sound and rational approach to program management for the IETM [systems] program" and include with their proposals a "demonstration disk" to demonstrate "basic functionalities and enhancements" listed in the SOW. The RFP stated that "[i]f a functionality is not included in the disk, the proposal must state how the offeror intends to meet the required functionality." Offerors were informed that their "technical performance" would be evaluated to include their "technical approach as far as best value to the Government," and that "[t]his will include future enhancements, upgradeability, suitability, maintainability, and interoperability and integration with other systems." RFP, Instructions to Offerors, sect. 4.2.3.

The RFP stated that each of the mission capability subfactors would receive one of four adjectival ratings (blue, green, yellow, or red) based on the assessed strengths and proposal inadequacies within each offeror's proposal. The mission capability subfactor ratings were defined as follows:

Blue / Exceptional: Exceeds specified minimum performance or capability requirements in a way beneficial to the government; proposal must have one or more strengths and no deficiencies to receive a blue.

Green / Acceptable: Meets specified minimum performance or capability requirements delineated in the Request for Proposal; proposal rated green must have no deficiencies but may have one or more strengths.

Yellow / Marginal: Does not clearly meet some specified minimum performance or capability requirements delineated in the Request for Proposal, but any such uncertainty is correctable.

Red / Unacceptable: Fails to meet specified minimum performance or capability requirements; proposal has one or more deficiencies. Proposals with an unacceptable rating are not awardable.

RFP, Evaluation Factors for Award, sect. 2.4.

In addition, the RFP provided that each of these subfactors would be assessed risk ratings of high, moderate, or low risk. The RFP further advised that in evaluating risk, the agency would focus on "risks and weaknesses associated with an offeror's proposal approach," including an assessment of the "potential for disruption of schedule, increased cost, degradation of performance, and the need for increased Government oversight, as well as the likelihood of unsuccessful performance." RFP, Evaluation Factors for Award, sect. 2.5.

With regard to past performance, the RFP requested that offerors submit information on relevant contracts, performed within the 3 years prior to the issuance of the RFP, that would allow the agency to assess the offeror's probability of successfully performing the effort as proposed. From this information, the agency would assess "confidence" ratings of "high," "significant," "unknown," "little," or "no" confidence. Id. sect. 2.6.

With regard to price, the RFP contained numerous fixed-price contract line item numbers (CLIN) and sub-CLINs, for both the base period and each option year, covering IETM systems development and tasks related to the continued TO sustainment support, IETM improvements and additional functional requirements, and IETM source data files.[4] RFP, Schedule B. The RFP stated that "prices and rates" were to be provided for each CLIN, and that "no other rates will be allowed or considered for negotiating a price for each work request." RFP, Instructions to Offerors sect. 5.1(a). The offerors were also requested to state their prices "in exact amounts" and informed that prices "should not be rounded." Id. sect. 5.1.5.

The RFP stated that each offeror's proposed price would be evaluated for reasonableness based on the total price proposed for the base period requirements and all options. The prices of the "line items and sub-line items" were also to be evaluated to determine if the pricing was unbalanced. While offerors were requested to provide "information other than cost or pricing data" to support price reasonableness and cost realism in accordance with Federal Acquisition Regulation (FAR) sections 15.403-1(b), 15.403-3(a), and 15.403-5, id. sect. 5.1.4, according to the protester, offerors were instructed at the "industry day" briefing for this procurement not to provide "cost build up" information and to provide only CLIN pricing. Protest at 12.

Eight offerors, including TruLogic and CTI, responded to the RFP.[5] Different members of the source selection evaluation team (SSET) evaluated different aspects of the technical proposals. For example, with regard to the mission capability factor, three individuals evaluated the program management subfactor and the configuration management aspect of the technical performance subfactor, and four individuals evaluated the remaining aspects of the technical performance subfactor, including the demonstration disk. Two individuals evaluated past performance, and the contracting officer and an advisor evaluated price. Proposal risk was considered by the evaluators throughout the evaluation. Hearing Transcript (Tr.) at 29-31.[6] Once initial evaluations were complete, the SSET members developed "Evaluation Notices" (EN) that were sent to the offerors regarding technical and price issues. After several rounds of ENs, the entire SSET met to evaluate the responses.

With regard to the technical performance subfactor of the mission capability factor, the SSET members agreed that TruLogic's proposal demonstrated that its proposed IETM system provided "basic functionalities and several enhancements [that were] fully developed and available for immediate incorporation," was "extremely user-friendly," and presented a low risk solution. Agency Report (AR), Tab 7A, SSET Majority Report, Slide 24.

The SSET members also agreed that there were no "inadequacies" in CTI's proposal under the technical performance subfactor. However, the majority of the SSET determined that there remained "uncertainties" whether CTI's proposal met some of the functionality requirements of the SOW, for example, those concerning graphics display, bookmarking and annotation capability, and search functions. Id., Slides 17-19. The majority view that CTI's proposal had "significant system uncertainties" was based on their belief that CTI's demonstration disk did not fully demonstrate these capabilities, and reflected the majority's concern that the screen appearance and layout of some of the features were not "user-friendly" and thus would require "system improvement with Government direction." Id., Slide 24; Tr. at 14, 109; see AR, Tab 6C, Source Selection Decision (SSD), at 6. The minority of members disagreed, noting that CTI fully explained how it met the functionality requirements of the SOW in its EN responses, and that ease of use was not an RFP requirement; rather, appearance, formatting, and layout could be worked out during the initial "Guidance Conference" after award.[7] AR, Tab 7B, SSET Minority Report, Slides 16-19; Tr. at 35, 173; see AR, Tab 6C, SSD, at 6-7.

The majority SSET members drafted a final evaluation report, and the minority SSET members drafted a separate evaluation report documenting their disagreement. The resulting reports (both of which were presented to the SSA) reflected different ratings for CTI under the technical performance subfactor of the mission capability factor. In all other respects, the members agreed.[8] The proposal ratings assigned by the majority and minority SSET members were as follows:

TruLogic:

Mission Capability - Program Management: Green/Low (majority and minority SSET)

Mission Capability - Technical Performance: Blue/Low (majority and minority SSET)

Proposal Risk: Low (majority and minority SSET)

Past Performance: High Confidence (majority and minority SSET)

Price: $4,163,946

CTI:

Mission Capability - Program Management: Green/Low (majority and minority SSET)

Mission Capability - Technical Performance: Yellow/Moderate (majority SSET); Green/Moderate (minority SSET)[9]

Proposal Risk: Low/Moderate (majority and minority SSET)

Past Performance: Significant Confidence (majority and minority SSET)

Price: $3,283,235

AR, Tab 7A, SSET Majority Report, Slides 10, 12, 17; Tab 7B, SSET Minority Report, Slides 11, 16. The majority SSET report recommended TruLogic for award based on TruLogic's superior technical rating. AR, Tab 7A, SSET Majority Report, Slide 25. The minority SSET report recommended CTI for award, finding that CTI's proposal met the requirements of the RFP and was the "better value," given the firm's lower cost and essentially "equal" past performance. AR, Tab 7B, Minority SSET Report, Slide 27.

The SSA (who was also the contracting officer), in the SSD, agreed with the SSET's evaluation and "blue" rating of TruLogic's proposal under the technical performance subfactor of the mission capability factor, and with the assessment that TruLogic's proposal was technically superior to CTI's under this subfactor. In so doing, he set forth the various strengths in TruLogic's proposal that supported that rating. AR, Tab 6C, SSD, at 4-5.

However, the SSA did not concur in the majority SSET's yellow rating of CTI's proposal under this subfactor and instead adopted the minority SSET views that CTI's proposal met the requirements of the RFP in all areas identified by the majority SSET members as "uncertainties" and that it should be rated "green" under this subfactor.[10] Id. at 6. In addition, he found a number of strengths in CTI's proposal that were overlooked by the majority SSET members in their report. For example, he noted that CTI possessed experience converting TOs into the S1000D[11] format, which would be "beneficial to the Air Force in terms of decreased oversights and costs" and would "decrease the time required to field Technical Order systems in S1000D format." Id. at 7. Another strength was found in CTI's database system, which would make it "easier and less costly to make future upgrades when moving to a Type 2 IETM system," id. at 11, and would make it easier to transfer data to existing systems and gather and analyze engine operational and maintenance data. Id. at 7. The SSA also found additional strengths in CTI's configuration management and storage, backup, and security plan. Id. Accordingly, the SSA rated CTI's proposal green under the technical performance subfactor of the mission capability factor.

Under the program management subfactor of the mission capability factor, the SSA found that both of these offerors had excellent management structures and described the strengths of each proposal that supported this determination. He also stated that while TruLogic had "more experience in Air Force Technical Order sustainment than CTI," this was "balanced by" CTI's experience with the Navy and its "extensive commercial experience" with building IETM systems, as well as the fact that CTI's team members were "highly experienced" with Army, Navy, Air Force, and commercial entities. The SSA also determined that the personnel teams of each offeror were "equal." Based on the foregoing, the SSA rated both proposals green under the program management subfactor. Id. at 3.

Under the proposal risk factor, the SSA found that TruLogic's proposal presented the "least risk to the schedule and performance" because of the firm's experience on Air Force TO systems. However, the SSA found that CTI's proposal also deserved a low risk rating, rather than the moderate risk rating assigned by the majority SSET members under the technical performance subfactor, because, in his view, CTI's proposal did not contain the "uncertainties" identified by the majority SSET members in their report, and thus there was "little doubt" as to CTI's ability to perform. Id. at 8-9.

Under the past performance factor, the SSA found TruLogic's performance to be slightly superior to CTI's, but found that CTI also presented past performance examples that were "highly rated and very relevant in magnitude and complexity to this effort." As a result, the SSA concluded that he had "no doubt that either TruLogic or CTI could perform this requirement on schedule per the [SOW]." Id. at 11.

The SSA selected CTI for award, based on his belief that "CTI will successfully perform the requirement, and at a significantly lower price" than TruLogic and that CTI's proposal provided the "best overall value" to the Air Force. The strengths identified in TruLogic's proposal, in the SSA's opinion, did not "offer sufficient advantage over the features proposed by CTI such as to merit a 27% price premium." Id. at 12. The SSA notified TruLogic that CTI was selected for award, and this protest followed.
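For reference, the 27 percent figure cited in the SSD is consistent with measuring the price premium relative to CTI's evaluated price; the decision does not state the basis of the calculation explicitly, so this is only a reconstruction from the evaluated prices shown above:

\[
\frac{\$4{,}163{,}946 - \$3{,}283{,}235}{\$3{,}283{,}235} \approx 0.268 \approx 27\%
\]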

DISCUSSION

TruLogic challenges the agency's evaluation and source selection decision. Where an evaluation is challenged, our Office will not reevaluate proposals, but instead will examine the record to determine whether the agency's judgment was reasonable and consistent with stated evaluation criteria and applicable statutes and regulations. Sam Facility Mgmt., Inc., B-292237, July 22, 2003, 2003 CPD para. 147 at 3. A protester's mere disagreement with the agency's judgment is not sufficient to establish that an agency acted unreasonably. Entz Aerodyne, Inc., B-293531, Mar. 9, 2004, 2004 CPD para. 70 at 3.

TruLogic first argues that the SSA failed to act with "impartiality."[12] It asserts that the SSA unreasonably ignored the SSET majority view, instead relying on the minority view, and "manipulated" the evaluation to support his selection of CTI for award. TruLogic further contends that the SSA either ignored or only mentioned in a cursory way TruLogic's proposal strengths, while dedicating several paragraphs of the SSD to emphasize CTI's proposal strengths.

The record, however, shows a well-documented, reasoned evaluation and award decision without evidence of bias. Despite TruLogic's insistence that the SSA should have adopted the majority view, source selection officials are not bound by the evaluation judgments of lower level evaluators; they may come to their own reasonable evaluation conclusions. MW-All Star Joint Venture, B-291170.4, Aug. 4, 2003, 2004 CPD para. 98 at 3 n.3. Here, we find that the SSA reasonably concluded that the minority view was a more accurate assessment of CTI's proposal.[13] The record confirms that CTI's EN responses and final proposal revision adequately address the functionality requirements at issue, and supports the minority SSET report and the SSA's conclusion that CTI's proposal met the functionality requirements of the RFP.[14] The SSA did not "manipulate" the evaluation, as alleged, but documented in detail his disagreement with the majority of the SSET. To the extent that TruLogic complains that the SSD contains more paragraphs discussing CTI's proposal than TruLogic's, the agency explains that this was because the SSA was explaining his disagreement with the majority of the SSET, not because the SSA was ignoring the benefits of TruLogic's approach or over-emphasizing CTI's proposal strengths. In sum, our review of the record reveals that the SSD fairly considered the benefits and drawbacks of both TruLogic's and CTI's proposal features, and reasonably concluded that CTI's proposal provided the better value.

TruLogic challenges the assessment of strengths and weaknesses in its and CTI's proposals. For example, it complains that the agency unreasonably assessed a strength in CTI's proposal for its database system. The record shows that the agency reasonably concluded that CTI's database system offered advantages to the agency, not offered by TruLogic's system, including the ability to transfer and gather data more easily, to upgrade more easily in the future to a "Type 2" IETM system, and to provide more interactivity with existing and future systems. Tr. at 61-62, 83-84. Although TruLogic asserts that the agency misevaluated the offerors' systems in this regard, it has not demonstrated that the evaluation of this aspect of the proposals was unreasonable.[15]

TruLogic also complains that the agency evaluated unstated criteria by giving credit to CTI's proposal for its database system, asserting that a database system was not required by the RFP. However, the SOW contemplates that a database IETM system could be provided.[16] RFP, SOW, sect. 2.2.1.a (citing MIL-PRF-87268A-A1, which references requirements for a database system); Tr. at 70-71. Since the RFP informed offerors that their technical approaches would be evaluated for "future enhancements, upgradeability, suitability, maintainability, and interoperability and integration with other systems," RFP, Instructions to Offerors, sect. 4.2.3, we find no error in the Air Force's assessment that CTI's database system was deserving of a strength for providing benefits in these areas.

TruLogic next complains that the agency assessed CTI's proposal a strength for the firm's S1000D experience, but ignored TruLogic's "RCM" (that is, Reliability Centered Maintenance) experience.[17] However, the agency reasonably concluded that CTI's experience converting TOs to S1000D format for the Navy was easily transferable to the Air Force and provided a benefit to the Air Force since the RFP contemplated that future IETM systems would be S1000D compliant. AR, Tab 6C, SSD, at 7; see RFP, SOW, sections 2.2.2.a, 2.3.1. In contrast, TruLogic had not performed S1000D conversion efforts, and had not provided any deliverables under its RCM contract at the time of the evaluation such that the agency could have evaluated this work. Tr. at 99-102.

TruLogic also contends that the agency rated CTI's proposal too high under the program management subfactor for the firm's commercial and Navy experience, and rated TruLogic's proposal too low given TruLogic's assertedly more relevant Air Force experience. However, the record shows that, although the offerors' experience is discussed in the SSD, the essential bases for both firms' green ratings under this subfactor were the firms' "excellent" management plans as set forth in their respective proposals and discussed in the SSD. Although it is not clear why the SSD compared the firms' experience under this subfactor, given that the RFP did not state that experience would be evaluated under this subfactor, the record nonetheless shows that both offerors had comparable and valuable experience, as reasonably considered by the agency. In this regard, as stated above, the agency favorably considered TruLogic's Air Force experience, but reasonably found that this was "balanced by" CTI's experience with commercial entities and the Navy, and by the broad experience of CTI's team members.[18] AR, Tab 6C, SSD, at 3.

TruLogic also asserts that in order to accomplish the contract work, CTI must employ labor categories for technical writers, illustrators, and quality assurance personnel that are not identified in its GSA Schedule contract. However, as pointed out by the Air Force, technical writers and illustrators are not required under the RFP and their services are not being purchased by the agency here. Tr. at 52, 57; Contracting Officer's Statement (Nov. 16, 2005) at 5. According to the SOW, these tasks are being performed by the Air Force. RFP, SOW, sections 2.5, 3.2. With regard to quality assurance, the agency reasonably determined that tasks requiring this skill could be accomplished within the labor categories identified in CTI's GSA Schedule contract. See Tr. at 53, 170-71.

TruLogic also challenges the agency's evaluation of price, arguing that the agency deviated from the stated evaluation criteria, failed to evaluate price reasonableness,[19] and overlooked CTI's assertedly unbalanced prices. The record demonstrates, however, that the agency adhered to the RFP's evaluation criteria in evaluating price. In this regard, the agency compared each offeror's overall price and the price for each CLIN and sub-CLIN to other offerors' prices and to the government estimate for each year of the contract (including option years). Where the evaluators found unexplained outliers in price (that is, prices that deviated substantially from the other offerors or the government estimate), the agency issued ENs to the offeror to better understand the offeror's approach. From this analysis, which the agency explained in detail at GAO's hearing, we find that the agency reasonably concluded that CTI's price was reasonable and not unbalanced.

TruLogic raises a number of other specific concerns about CTI's price. For example, it asserts that CTI's price must be unbalanced since CTI did not alter its CLIN pricing based on the number of pages that must be processed. However, as the agency reasonably explains, CTI's proposed IETM system is not dependent on page counts and will not lead to additional costs for CLINs with a greater number of pages. See Tr. at 144-47.

TruLogic also complains that CTI's price contains "hidden" costs, that is, costs necessary to render CTI's proposal compliant with the RFP's functionality requirements identified by the majority SSET report as "uncertainties." However, as discussed above, we find that the SSA reasonably concluded that CTI's proposal met these functionality requirements and thus there are no "hidden" costs.

TruLogic also complains that CTI's prices for some CLINs were "per seat" unit prices prohibited by the RFP. In this regard, the SOW stated that "[t]he Government will not accept a per-seat licensing solution for the IETM program. The contractor's solution must provide the authority for unlimited Government distribution of the IETM system." RFP, SOW, sect. 2.1. TruLogic misinterprets CTI's pricing. CTI's CLIN price was developed by multiplying its unit price (which was "per seat") by some multiple to arrive at a total fixed price amount for "unlimited seats." Tr. at 163; AR, Tab 9A, CTI's Price Proposal, Vol. III, Schedule B. CTI thus offered an unlimited seat solution, not a "per seat" solution as TruLogic argues.

TruLogic further asserts that CTI's proposal did not comply with the requirement for "exact" rather than "rounded" pricing, noting that many of CTI's CLINs are identically priced and are rounded to whole dollar amounts. The agency explains that CTI's prices were not rounded, but were based on its GSA Schedule pricing, which was stated in whole dollars. Tr. at 163-64. According to documents included with CTI's price proposal, however, CTI's GSA Schedule pricing is not stated in whole dollars and its pricing, therefore, appears to be rounded. AR, Tab 9A, CTI's Price Proposal, Vol. III, GSA Schedule Price List. Although it appears from the record that the agency may have waived the requirement for "exact" pricing with regard to CTI's proposal, the record does not show that the competition was in any way compromised by the agency's acceptance of CTI's proposal or that TruLogic was otherwise prejudiced as a result. There is no evidence, and it is not otherwise apparent, that the agency's waiver of the requirement would in any way materially affect any offeror's proposed overall price (TruLogic does not claim, for example, that this would allow it to materially lower its overall price), such that TruLogic could overcome the substantial price differential between its and CTI's proposals in order to have a reasonable chance for award. In the absence of prejudice, the protest cannot be sustained on this ground. See Citywide Managing Servs. of Port Washington, Inc., B-281287.12, B-281287.13, Nov. 15, 2000, 2001 CPD para. 6 at 10 (waiver or violation of solicitation requirements does not provide basis to sustain protest where there is no prejudice).

Finally, TruLogic complains that CTI's price for the travel CLIN ($32,400) was underpriced and should have been accounted for in the price evaluation, especially given TruLogic's higher price ($166,903) for this CLIN. Protest at 13-14; see CLIN Summary Sheet. In contrast to the fixed-price CLINs for the other contract work, travel was to be evaluated based on estimated travel costs for two people making 27 trips from the offeror's place of business to Tinker Air Force Base (in Oklahoma) or San Antonio, Texas, using Joint Travel Regulation rates. RFP, Evaluation Factors for Award, sect. 2.7. The agency evaluated CTI's pricing based on the assumption of only one individual making the trip, since the agency was aware from CTI's proposal that CTI's teaming partner (which could provide the second individual) was located locally in Oklahoma and this would substantially reduce travel costs. Tr. at 151, 159, 162-63. Our review of the record confirms that CTI's teaming partner will be contributing support to the performance of this contract, such that it is reasonable to assume that CTI's travel costs will be less than those of an offeror such as TruLogic, which does not have such a local partner and which will necessarily have significantly higher travel costs. Consequently, we find that the Air Force's evaluation of this CLIN provides no basis to sustain TruLogic's protest.[20]

The protest is denied.

Anthony H. Gamboa

General Counsel



[1] The protester here proceeded pro se and thus did not have access to certain information in the record. Accordingly, our discussion in this decision is necessarily general in order to avoid reference to protected information. Our conclusions, however, are based on our review and consideration of the entire record.

[2] The program management subfactor included the evaluation of management structure, quality plan, safety plan, management of funds, and personnel staffing and training.

[3] The technical performance subfactor included the evaluation of a demonstration disk; storage, security and backup; configuration management; and hardware and software requirements.

[4] As discussed later in this decision, there were also CLINs for travel for which offerors were to submit estimated costs.

[5] Four offerors were later eliminated from the competitive range.

[6] Our Office conducted a hearing on the protest on January 5, 2006. Both the SSA and the author of the minority SSET report (discussed later in this decision) testified at the hearing.

[7] The SOW provided that a "Guidance Conference" would be held with the awardee within 15 days of award to discuss such issues as "style, work package format, illustrations, data that requires amplification, and missing data." RFP, SOW, sections 6.3, 6.4.

[8] The minority did not disagree with the evaluation of TruLogic's proposal.

[9] Although the minority SSET report did not change the risk rating from moderate to low for the technical performance subfactor, the discussion in the report regarding proposal risk addressed and rejected each of the concerns raised by the majority SSET members that led to the moderate risk rating.

[10] Contrary to TruLogic's interpretation of the redacted report, the agency does not admit that CTI's proposal deviates from solicitation requirements.

[11] S1000D is an international standard from the ISO 9001 area. Tr. at 93.

[12] TruLogic also complains that the agency did not make a "new" source selection decision, as it promised in its corrective action letter to our Office that led to our dismissal of its prior protest, but merely reselected CTI for award. However, the agency's promise of making a new decision did not mean that CTI was ineligible for award or that CTI could not be selected after the reevaluation, as is here suggested by TruLogic. In addition, TruLogic complains that, when taking corrective action, the agency amended the solicitation to alter requirements solely for the benefit of CTI. However, to the extent that TruLogic now complains about the RFP amendments, its protest is untimely, as challenges to defects in solicitation amendments must be raised before the date set for receipt of proposal submissions following the amendment, and these were not timely raised. 4 C.F.R. sect. 21.2(a)(1) (2005).

[13] TruLogic asserts that the majority SSET report presented the more accurate view since the majority members were "high ranking enlisted representatives" and managers, and were more qualified and experienced than the minority SSET members. In this regard, TruLogic speculates that the minority SSET report author was not an "actual 'user'" and only held an "administrative/management" position. However, the SSA explains, and the record confirms, that the minority members had more "direct" experience with TOs than the majority members whose experience was more "administrative." Tr. at 15-25, 40. TruLogic also contends that the majority view was more credible since the majority SSET members (or at least some of the majority members, as the record shows) reviewed the demonstration disk, and the minority members did not review the disk. However, the minority members discussed with the rest of the SSET the majority's concerns, and credibly explained why they did not need to review the disk to determine from CTI's EN responses and final proposal revision that CTI's proposal met the functionality requirements of the solicitation. See Tr. at 28-29. While TruLogic disagrees with the SSA's determination to give more weight to the minority report, it has not shown the SSA's judgment to be unreasonable. See Entz Aerodyne, Inc., supra.

[14] Although TruLogic contends that CTI's proposal should have been found technically unacceptable because its demonstration disk did not sufficiently address the functionality requirements and that the SSA should not have considered CTI's EN responses, the RFP provided that an offeror could explain how it would address the functionality requirements of the solicitation that were not addressed in its demonstration disk. RFP, Instructions to Offerors, sect. 4.2.3.

[15] For example, TruLogic asserts that the agency misinterpreted its proposal as offering a "Type 1" IETM system and that these categories are applicable only to the Navy and not the Air Force. However, as the agency explains, the classifications are applicable throughout the Department of Defense and, because TruLogic's proposed system did not offer the "interactivity" required of higher-level systems, the agency reasonably concluded that TruLogic's system was only a "Type 1" IETM system. Tr. at 78-79. Although we do not address all of TruLogic's numerous other arguments contesting the nature and relative capability of its and CTI's IETM systems, we have reviewed all of the arguments and find that they do not demonstrate that the evaluation of the proposals in this respect was unreasonable.

[16] Although TruLogic states that all IETM systems are database systems, the agency persuasively explains why, in its view, this is not the case. See Tr. at 70-71, 76-77.

[17] RCM refers to a program or system for collecting equipment inspection, maintenance, and failure data, so that the agency can better forecast the need for the acquisition or replacement of parts. Tr. at 84-85, 98.

[18] TruLogic also disputes the agency's determination that the firms' personnel were essentially "equal" under the program management subfactor. However, the record shows that the two firms offered personnel with similar years and range of experience.

[19] TruLogic complains that price reasonableness cannot be determined without "bulk rate and build up costs," which were not provided by CTI (or TruLogic, for that matter) or required by the agency. Since TruLogic was aware from the industry day briefing that this information was not required and did not protest prior to the closing date for receipt of proposals, its protest now that this information should have been required is untimely. 4 C.F.R. sect. 21.2(a)(1) (alleged solicitation defects must be protested prior to due date for receipt of proposals).

[20] While TruLogic argues that travel costs should not have been included in the price evaluation because they are too uncertain, this constitutes an untimely challenge to the terms of the solicitation. 4 C.F.R. sect. 21.2(a)(1).
