Matter of: CXR Telecom File: B-249610.5 Date: April 9, 1993




DIGEST

PROCUREMENT - Competitive Negotiation - Offers - Evaluation - Technical acceptability - Benchmark testing

1. Agency properly reran benchmark of all offerors' equipment after an offeror--that had pre-tested and certified its modem in accordance with the solicitation's detailed benchmark parameters--failed a benchmark conducted by the agency, where the solicitation provided for retesting if benchmark failure was attributable to causes other than a failure of the offeror's equipment, and the failure appears to have resulted from unanticipated test conditions and variations in the capabilities of the different test equipment used by the offerors and the agency.

PROCUREMENT - Competitive Negotiation - Offers - Evaluation errors - Evaluation criteria - Benchmark testing

2. Award to a firm, which failed the agency's first benchmark test, based on its passing of a second benchmark, was improper where the benchmark was a fundamental government requirement and the second benchmark was so technically flawed that it could not reasonably form a basis for award.


DECISION CXR Telecom protests the Department of Commerce's award of a contract to ComPro Systems, Inc., under request for proposals (RFP) No. 52-SOBC-1-00009 for approximately 4,500 high speed modems to be used by the Census Bureau. CXR protests the agency's pre-award testing of the modems, contending that the agency improperly relaxed the test parameters to permit ComPro's modem to pass a rerun of the initial test and that the agency should have made award on the basis of the initial test that ComPro's modem failed and CXR's modem passed.[1]

We sustain the protest.

The Census Bureau uses the modems to transfer data from lap-top computers in the field over telephone lines to agency computer facilities. Specifically, the RFP is for Consultative Committee for International Telephone and Telegraph (CCITT) V.32bis, 14,400 bits per second (bps) modems. The V.32bis is a CCITT standard or modulation protocol for high speed modems, ratified in 1991, that defines the encoding of the computer's digital signal into the telephone system's analog signal. The V.32bis is a full-duplex modulation protocol (i.e., the modem simultaneously sends and receives data) that uses the full bandwidth of the telephone voice channel, necessitating the use of echo cancellation technology.

Because of past difficulties experienced in transmitting data over local loops (i.e., local telephone lines that connect the user to a central telephone office), the agency decided to subject the modems to a modem analysis performance test (MAPT) that would reveal whether, in the face of simulated poor transmission conditions/impairments,[2] an offeror's modem could transmit data over the 13 different types of line impairment configurations representative of the public telephone system's transmission lines without exceeding a specified error rate. In formulating and conducting the pre-award tests, the agency employed an independent consultant, who is an expert in modem testing. In this regard, at the time of the procurement, this type of high speed modem had just appeared on the market and the telecommunications industry was still in the process of setting performance standards against which to judge the modem's performance.

The RFP presents the MAPT as Table J.1.1 (Table).[3] The Table is based on work from three sources: an industry group[4] (Committee), the independent consultant, and agency personnel. The Committee had published a standard[5] that defined six standardized transmission test characteristics, or line impairment configurations, for telephone lines likely to be encountered by slower (e.g., 2,400 bps) modems.[6] The independent consultant, a Committee member, gave the agency access to an unpublished Committee working document[7] that described the balance of the 13 types of line impairment configurations,[8] but only as they would be encountered by modems operating at 9,600 bps, and not as they would appear to the solicited V.32bis modems operating at 14,400 bps. In order to adapt the Committee's work to the requirements of the 14,400 bps modems, the agency and the independent consultant made further modifications to the Table's parameters, including the alteration of the levels at which many of the impairments were set, Tr. at 28, and dropping a fourteenth impairment[9] from the Table. Tr. at 30. These modifications were made as the independent consultant and the agency acquired new information concerning the characteristics of the telephone network and transmission simulation test equipment. Tr. at 30. The Table that appeared in the RFP was configured such that offerors could directly enter the test parameters into their respective test equipment without making any intermediate calculations. Tr. at 121, 224, 226.

The agency required all offerors to pass two identical pass/fail pre-award performance tests--in effect, two benchmarks, the first being a self-benchmark and the second, an agency benchmark. The RFP required (1) that both tests be "functionally identical and . . . conform to the test procedure described in Section J," and (2) that the test be performed on "identical proposed modems." Each offeror was required to certify that it had passed the self-benchmark, and to provide the agency with its test results. The RFP warned that:

"If the results of the [g]overnment's test vary with the proposer's submitted results to the degree that there is reason to believe that the proposer has attempted to defraud the [g]overnment, the [g]overnment will deal with the matter aggressively."

In an earlier procurement where the agency had not required actual test results and had only required offerors to certify that their modems had passed, only 3 of the 17 firms that had so certified were able to pass the agency benchmark. Tr. at 160.

The benchmarks had to be run on very specialized telephone network simulation test equipment (test equipment) that is intended to roughly approximate the conditions found on the public telephone network when two modems communicate. In describing the modem tests at issue here, the following network diagram may be consulted:

TABULAR OR GRAPHICAL MATERIAL OMITTED

The left side of the network diagram is designated as the A side (also called the far-end), and its right side as the B side (also called the near-end). On the diagram, the modem under test appears on the right side, or near-end, and is designated as Modem B, while the other modem (also known as the reference modem) that Modem B is communicating with over the network during the test is on the left side of the diagram and called Modem A. A signal being sent from Modem B to Modem A has to overcome numerous obstacles (i.e., the impairments or line conditions) in its journey across the network.

Only three firms manufacture the test equipment required to perform this type of testing. Although newer digital equipment was available from one firm, Processing Telecom Technologies (PTT), at the time the solicitation was issued, nothing in the solicitation indicated that the agency would run the benchmark on PTT or digital equipment, or that the offerors should not use their older analog test equipment, or that different test equipment could give different test results.[10]

In response to the RFP, 27 offerors submitted proposals. Tr. at 179. The agency summarily rejected 17 proposals during the technical evaluation because the accompanying self-benchmark certifications and test results showed that the offerors' modems had not passed all items of the benchmark.[11] Tr. at 190. In April 1992, the agency benchmarked the remaining 10 proposals. The 10 proposals included modems made by 4 different modem manufacturers. After the benchmark, the agency rejected 7 proposals, including ComPro's, because they failed the benchmark's line impairment configuration No. 12B test on the RFP Table. All of the rejected proposals offered the same manufacturer's equipment (i.e., U.S. Robotics, Inc. modems). The agency viewed the failure as significant because the modems could not establish the initial communications link, also called the "handshake."[12] This left in the competition three proposals, including CXR's, each offering a different modem manufacturer's equipment, all of which passed the April benchmark.

At the time of the failure of the April test, U.S. Robotics, itself an offeror, immediately objected, on behalf of itself and its six distributors,[13] to the agency's conduct of the benchmark, arguing that the echo impairment in the test was improperly measured. The echo impairment in the test is intended to emulate certain physical conditions in a telephone network that, when encountered by the signal of the modem being tested (Modem B), act like barriers and cause another signal, or echo, to be formed that bounces back to Modem B.[14] The echo signal interferes with the signal that Modem B is attempting to transmit to Modem A, and Modem B must compensate for the echo signal's presence if Modem A is to receive an error free transmission of Modem B's signal.[15]

Depending on where Modem B's signal encounters the obstacle that generates the echo signal, the echo so generated will take varying lengths of time to get back to Modem B--e.g., a close-in or near-side obstacle will cause an echo that returns sooner than an echo generated by an obstacle at a further location. This time lag is referred to as delay and where a specification does not state the location of an echo generating obstacle, it is possible to determine the obstacle's position by the delay associated with the echo signal. Echoes are generally referred to as either listener echoes or talker echoes; the talker echo has three varieties (near-end echo, intermediate echo, and far-end echo). The relative positions of the near-end echo (close to Modem B) and the far-end echo (close to Modem A) present no problem; however, the location of the intermediate echo can be anywhere between the far-end echo and the near-end echo. Here, the RFP Table did not specify exactly where the intermediate echo should be. Instead, the Table provided an intermediate delay of 20ms, which, from our review of the record, would normally be expected to place the intermediate echo nearer to Modem A than to Modem B.[16] Thus, the intermediate echo was inserted at Point A on the left or A side of the diagram as opposed to Point B on the right or the B side. It was the placement of the intermediate echo that was the focus of U.S. Robotics's complaint.

U.S. Robotics had run the self-benchmark on TAS test equipment that it used for general engineering and manufacturing tests, but U.S. Robotics also owned PTT equipment that it used for research and development testing. Tr. at 287. U.S. Robotics discovered that its modem could pass the test using PTT test equipment, but only if the test equipment was set up with the intermediate echo inserted at Point B, and that its modems failed when the intermediate echo was inserted at Point A on the PTT test equipment--which was where the agency had placed it during the April test.[17] U.S. Robotics argued that the agency should position the echo at Point B because the intermediate echo was only supposed to experience a 20ms delay, and when positioned at Point A, the intermediate echo instead encountered a 40ms delay that effectively placed the intermediate echo in the assertedly impossible position of residing out beyond the far-end echo, which was only supposed to be a 20ms delay away. Essentially, U.S. Robotics claimed that it was inconsistent to test using equipment that actually results in a 40ms delay when the specifications called for a 20ms delay in the intermediate echo.
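The arithmetic underlying U.S. Robotics's complaint can be sketched in a few lines. The sketch below is illustrative only: it assumes, as the record indicates (Tr. at 63-64), that the PTT equipment sums the programmed intermediate delay with the forward signal delay when the echo is inserted at Point A, and the function and constant names are hypothetical rather than any test equipment's actual interface.

```python
# Illustrative sketch of the delay arithmetic behind U.S. Robotics's
# objection. The 20ms figures come from the record; the names below
# are hypothetical, not any test equipment's actual interface.

FORWARD_SIGNAL_DELAY_MS = 20  # one-way delay across the simulated network
FAR_END_ECHO_DELAY_MS = 20    # far-end echo delay, as seen by Modem B

def effective_intermediate_delay(programmed_delay_ms, insertion_point):
    """Intermediate echo delay as actually experienced by Modem B.

    At Point A the PTT equipment sums the programmed delay with the
    forward signal delay; at Point B the programmed value applies as-is.
    """
    if insertion_point == "A":
        return programmed_delay_ms + FORWARD_SIGNAL_DELAY_MS
    return programmed_delay_ms

# The RFP Table called for a 20ms intermediate delay.
print(effective_intermediate_delay(20, "A"))  # 40 -- beyond the far-end echo
print(effective_intermediate_delay(20, "B"))  # 20 -- as the Table specified
```

On these assumptions, insertion at Point A yields a 40ms effective delay, greater than the 20ms far-end echo delay, which places the intermediate echo in the assertedly impossible position beyond the far-end echo.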

Although the agency initially rejected U.S. Robotics's arguments, basically on the grounds that the RFP Table dictated the parameters to be plugged into Point A of the test equipment and no further inquiry was necessary,[18] the independent consultant remained concerned that U.S. Robotics had passed the self-benchmark, but failed the April test he had administered, particularly since he had, prior to this procurement action, successfully tested both 9,600 bps and 14,400 bps U.S. Robotics modems on PTT equipment against essentially the same parameters and expected to see similar results. Tr. at 88.

When the agency decided, based on the independent consultant's advice, to conduct another pre-performance test in June 1992 (apparently for reasons related to echo shaping as discussed below), it also acceded to U.S. Robotics's request that the intermediate echo be inserted at Point B instead of Point A.[19] PTT apparently prompted the independent consultant's changed opinion as to the proper location of the intermediate echo when:

"they pointed out that by placing the echo in position A, even though I had been testing that way for quite a while beforehand and had modems operate and pass that, it was inaccurate, because I was now placing it-- since that delay is additive, I was placing it beyond the far-end echo geographically and time-wise, so that was an inappropriate thing to have done." Tr. at 84-85.

Earlier, on May 8, 1992, after studying the problem and consulting with PTT, the independent consultant advised the agency that one reason the U.S. Robotics modems did not pass the April test, as he anticipated, was likely attributable to differences in the configuration of the PTT test equipment during his earlier independent testing of the U.S. Robotics modems and the April test with regard to an "echo shaping"[20] impairment used in his earlier test. Basically, the independent consultant found that while the digital PTT equipment had the ability to simulate echo shaping, it could also be configured to mimic the analog test equipment that could not simulate echo shaping; the PTT test equipment did this by turning off its echo shaping function. The consultant reported that, during his earlier independent testing of U.S. Robotics modems, the PTT test equipment had the echo shaping function turned on, but the April test had been conducted with the echo shaping function off. The absence of echo shaping makes the test more difficult since the tested modem has to correctly interpret the transmitted signal while at the same time coping with a strong echo signal undiminished by any shaping. While the independent consultant explained that "no bandwidth shaping was applied because at the time of the release published test specifications did not call for it nor did any of PTT's competitors offer it," the agency, on May 14, decided to conduct a second benchmark with the echo shaping function turned on, reasoning that since echo is a form of noise that a modem has to deal with, the lack of echo shaping in the April test increased "the noise level above that [for] which our test had originally been designed."

Based on the foregoing, the agency advised offerors in the competitive range that it would conduct a second benchmark in June 1992. Even though it was the agency's understanding that the June test would be conducted by inserting the intermediate echo (a type of talker echo) at Point B and by including echo shaping, it nevertheless advised offerors that the retest was being conducted because the April test was "not reflective of the impairment description parameters, lines 7 [through] 13, of the MAPT. The listener echo calibrations used were not in accordance with the original intent of the test."

All four types of modems passed the June test and ultimately, on October 2, 1992, the agency made award to ComPro for the U.S. Robotics modem at a total evaluated price of $702,840.97. CXR protests that the June test should not have been performed and award should have been based on the April test, which the U.S. Robotics modem had failed, and that the June test was defective and represented a waiver of the specification requirements.[21]

We have previously emphasized the need for flexibility in the application of benchmark and other demonstration-type test requirements and the concomitant undesirability of "pass/fail" benchmark tests leading to the automatic exclusion of otherwise potentially acceptable offerors. CompuChem Laboratories, Inc., B-242889, June 17, 1991, 91-1 CPD Para. 572; OAO Corp.; 21st Century Robotics, Inc., B-232216; B-232216.2, Dec. 1, 1988, 88-2 CPD Para. 546; International Computaprint Corp., B-207466, Nov. 15, 1982, 82-2 CPD Para. 440. Instead, benchmark testing should be an inherent part of the negotiation process, during which deficiencies should be pointed out and then corrected if possible, see CompuServe Data Sys., Inc., 60 Comp.Gen. 468 (1981), 81-1 CPD Para. 374, and benchmarks should be rerun as necessary to correct a deficiency in order to maximize competition. The Computer Co., B-198876, Oct. 3, 1980, 80-2 CPD Para. 240, aff'd, 60 Comp.Gen. 151 (1981), 81-1 CPD Para. 1.

We find nothing objectionable in the agency's decision to rerun the April test after the U.S. Robotics modem--which had been tested and certified in accordance with the solicitation's detailed benchmark parameters--failed the April test. The solicitation expressly provided for retesting where benchmark failure was attributable to causes other than a failure of the offeror's equipment. Here, the failure appears to have resulted from unanticipated test conditions and variations in the capabilities of the different test equipment used by the offerors and the agency. In this regard, all of the agency testing was conducted on state-of-the-art PTT test equipment, rather than the older TAS test equipment used by U.S. Robotics or other testing equipment used by other offerors in self-testing their modems. Moreover, the agency was reasonably persuaded that the intermediate echo should be set at Point B during the retest benchmark. The record indicates that setting the intermediate echo at Point A on the PTT equipment--as was done in the April test--may provide anomalous test results because the PTT equipment will sum the intermediate echo delay and the round trip delay, which effectively places the echo in an unintended position. Tr. at 75, 278. In addition, the agency decided that "echo shaping" should be included in the test--which the agency admits would have the effect of making the benchmark less stringent. Tr. at 183.[22] Under the circumstances, given the legitimate concerns about the fairness of the April tests with regard to the U.S. Robotics equipment, the agency had a reasonable basis to retest it.

Nevertheless, the agency's implementation of its decision to retest was fatally deficient for a variety of reasons. First, the conduct of the June test, without notice to all offerors, constituted an improper relaxation of the solicitation requirement that the agency pre-performance tests be identical to the offerors' self-testing. In this regard, the June test was designed to be less stringent than the test that resulted from the direct entry of the RFP Table test parameters into the offerors' test equipment as was contemplated by the RFP. The agency's use of state-of-the-art test equipment and additional unannounced test parameters, specifically, the intermediate echo shaping and the insertion of the intermediate echo at Point B, effectively relaxed the Table's specific parameters.[23] The agency should have specifically advised offerors of the new relaxed parameters of the June benchmark.[24] See W.D.C. Realty Corp., 66 Comp. Gen. 302 (1987), 87-1 CPD Para. 248.

The more fundamental problem with the June test is that it was so flawed that the agency could not reasonably base an award on it. It is not disputed that it was critical to the agency that it benchmark modems rather than accepting the self-benchmark test results submitted by offerors. Here, the agency's independent consultant admits that the June test was technically flawed based on the information that has come to light during the course of this protest. Tr. at 275, 278. In this regard, the June test results, on which the agency based its award decision, contained erroneous attenuation levels (power levels). Tr. at 116, 275. The record also shows that the insertion of the intermediate echo value at Point B--as was done in the June test--has the side effect of providing the wrong attenuation levels, which essentially negates the validity of the test. Tr. at 275, 278. The agency's technical representative admitted that if the power levels on the June test are wrong, as they are, the test is invalid.[25] Tr. at 129. In sum, the problem with the June test was that the agency's "fix"--to compensate for the PTT test equipment's handling of the Table's delay values--effectively unbalanced the Table, distorting the other parameters--specifically the attenuation figures--rendering the June test invalid with regard to all tested modems. Tr. at 278. In contrast, the April benchmark was valid as specified in the Table with regard to all modems except for those manufactured by U.S. Robotics; thus, CXR would have been entitled to an award based on the April benchmark. Therefore, given the critical nature of the benchmark requirement, the award to ComPro, based on a benchmark that had no validity, cannot be said to represent the agency's actual requirements. See Mine Safety Appliances Co.; Interspiro, Inc., B-247919.5; B-247919.6, Sept. 3, 1992, 92-2 CPD Para. 150; Paper Corp. of U.S., B-229785, Apr. 20, 1988, 88-1 CPD Para. 388.

We understand that ComPro has substantially performed the contract. To the extent that orders have not been placed, ComPro's contract should be terminated and the agency's remaining modem requirements resolicited. Under the circumstances, CXR is entitled to be reimbursed its costs of preparing its proposal and pursuing the protest, including reasonable attorneys' fees. 4 C.F.R. Sec. 21.6(d)(1). The protester should submit its certified claim for such costs directly to the agency within 60 days of receipt of this decision. 4 C.F.R. Sec. 21.6(e).

The protest is sustained.

1. A hearing was conducted pursuant to 4 C.F.R. Sec. 21.5 (1993) to receive testimony regarding the conduct of the pre-award tests under the RFP.

2. Impairments are different telephone line conditions that affect the strength of telecommunications signals (voice or modem) as they transit a telephone network. Impairments include such things as frequency-dependent attenuation, phase jitter, echo, and echo shaping. Hearing Transcript (Tr.) at 29.

3. The Table is a 13 by 13 matrix. Its 13 columns represent the 13 types of line impairment configurations, while its 13 rows represent the 13 different impairments the agency was testing, with each impairment's setting (e.g., 20 decibels (dB), 30 milliseconds (ms), etc.) varying according to the type of line impairment configuration it was in.
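As a rough illustration of how offerors could enter the Table's settings directly into their test equipment, the matrix can be pictured as a simple per-cell lookup keyed by impairment and line impairment configuration. The impairment names and figures below are invented for illustration only; the actual settings appear in RFP Table J.1.1.

```python
# Hypothetical fragment of a Table-like parameter matrix. Rows are
# impairments; the integer keys stand in for line impairment
# configurations 1 through 13. All figures are invented for
# illustration; the real settings appear in RFP Table J.1.1.

table = {
    "attenuation_dB":        {1: 16, 2: 20, 12: 24},
    "intermediate_delay_ms": {1: 0,  2: 10, 12: 20},
}

def setting(impairment, configuration):
    """Return the value to enter directly into the test equipment,
    with no intermediate calculations, as the RFP contemplated."""
    return table[impairment][configuration]

print(setting("intermediate_delay_ms", 12))  # 20
```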

4. The industry group consists of the Electronics Industry Association-Telecommunications Industry Association TR30 Committee on modem standards and the TR30.3 working group on Data Communication Network Interfaces.

5. The Electronic Industries Association Recommended Standard 496-A (EIA-RS-496-A), Interface between Data Circuit-Terminating Equipment and the Public Switched Telephone Network.

6. The standard appears in the Table as the first six columns.

7. The document, TR-30.3/89-06048, was presented in June 1989. However, the two agency-run benchmarks were based on Committee publications of the specification as recent as April 12, 1990. Tr. at 60.

8. These appear in the Table as columns 7 through 13.

9. Non-linear distortion.

10. The three brands of test equipment are manufactured by PTT, Telcom Analysis Systems (TAS), and AEA/Consultronics (AEA). The test equipment costs between $30,000 and $40,000 and is available in analog (older technology) and digital (newer "state-of-the-art" technology introduced around 1989) models. Tr. at 280. Because of the high replacement cost, many smaller modem manufacturers still use analog test equipment and some larger firms continue to use their analog test equipment along with the newer digital equipment. Tr. at 281.

11. An agency witness stated that "[t]hose who almost passed would make statements like `nobody can pass this test; therefore, we are submitting this figuring that since nobody else can pass it, we are going to be the closest one to come to passing it.'" Tr. at 179-180. The same witness continued by stating "[j]ust that some flunked many lines and some flunked only one or two, but we had already stated in our criteria that flunking one was flunking the whole test." Tr. at 180.

12. "[T]he modems didn't just have a simple error rate . . . [t]hey actually failed to complete the handshake, which indicated a gross failure, not just a simple inability to handle some of the impairments." Tr. at 50.

13. U.S. Robotics performed one self-benchmark test of its Courier V.32bis modem and then made copies of the test results for dissemination to U.S. Robotics distributors. These distributors, in turn, submitted the results of that one test along with their respective proposals. Tr. at 168. The distributors tendered different modems (i.e., different serial numbers), albeit the same make and model, for the agency benchmark, as was permitted by the solicitation.

14. The telephone network can be viewed as being divided into different electrically homogenous blocks. Where these blocks interface, there are physical conditions (e.g., differences in wiring, such as going from 2-wire to 4-wire) that can cause the impedance of the transmission line to vary between the different blocks. This mismatch of impedance at the interfaces generates the echo signals. Thus, (1) near-end echo is generated by the modem's signal encountering the interface of the near-end local loop block and the central office block, (2) intermediate echo results from the interface between the central office block and the long distance carrier block, and (3) far-end echo is derived from the interface connecting the far-end central office block to the far-end local loop block. Tr. at 34-36.

15. The echo signal is subject to being misinterpreted by the modems as additional information or noise, which may result in the generation of data errors.

16. When questioned, following U.S. Robotics's allegations, about using an echo insertion point nearer to Point B instead of Point A, the agency was told by the independent consultant "that the echo signal was inserted [i.e., at Point A] where such a signal logically, when physically encountered, would originate."

17. The agency, after learning of U.S. Robotics's argument, privately experimented on its PTT test equipment to see what would happen if the echo was inserted at Point B; the private test resulted in a second U.S. Robotics modem failure. However, the agency later discovered that the second failure resulted from a bug in certain utility software used to upload to the test software the parameter set on the PTT test equipment; this software, unbeknownst to the agency, had merely reinserted the echo at Point A. Tr. at 279.

18. The agency position, taken in an internal memorandum, was that "[t]he test parameters, as they appear in Table J.1.1 [of the solicitation], are the actual parameters that are to be inserted at the appropriate points in the test set up menu."

19. At the hearing, the independent consultant, referring to the Table, explained why the April test was conducted with the echo inserted at Point A, and why this insertion point was in error, stating as follows:

"[W]e are told that the intermediate signal-echo ratio would be set to 20db, and the intermediate delay would be set to 20[ms]. In and of itself, there is no next line down that says intermediate echo is at position A or position B. It's left up to the operator to interpret that. But with a 20[ms] delay, we're looking at a placement of the echo in position A on this diagram here in terms of time frame of the presence of the echo away from the reference modem.

"At the time we ran these tests, it was my understanding that when I programmed into [EZBERT--the PTT test equipment's software] an echo delay value, that even though I'm setting it for the echo at position A, the value I'm programming in is what the modem at B would see as a delay. It turns out that I had incomplete information or understanding of the equipment at the time, and the value I programmed in at position A was summed with the forward signal delay in there, which is also set at 20[ms] on this particular test." Tr. at 63-64.

20. The term "shaping" is descriptive of the distortions and attenuations (loss of strength) that a signal experiences as it travels through a network. The amount of shaping a signal will experience on its journey depends on the length and size of the wires as well as the types of equipment that the signals pass through. Generally, the more shaping a signal encounters the weaker it will be when it arrives at its destination.

21. We disagree with the agency's initial contention that CXR's protest of the April and June tests is untimely, since CXR's protests of the tests were filed within 10 working days of when it was actually apprised of how the tests were conducted--which was disclosed in reports on earlier CXR protests. 4 C.F.R. Sec. 21.2(a)(2); see United Tel. Co. of the Northwest, B-246977, Apr. 20, 1992, 92-1 CPD Para. 374. CXR was not in a position to have earlier protested these matters, since, as indicated above, the agency misled the offerors concerning the reasons for the retest, and the record shows that the differences in the April and June tests took the form of changed parameter entries that were buried in EZBERT software menus that were, in turn, bundled into software configuration files that the agency ran automatically during the April and June tests. Consequently, no offeror's observer could reasonably have been aware of the agency's conduct of these tests. See Tr. at 62.

22. When asked if "the effect of the insertion of echo shaping in fact did relax the specification. Is that correct[?]," an agency witness replied, "[s]haping is definitely easier to handle than a flat line." Tr. at 183. Indeed, a reasonable reading of the Table is that it was intended to be a complete listing of the parameters that would be "plugged" into the modem manufacturer's test equipment. Tr. at 121. The Table already provided for intermediate echo and did not contemplate special echo shaping. In this regard, the RFP stated that the agency had "developed a test incorporating the attenuation distortion and envelope delay curves from the approved standard," and that "Table J.1.1 tabulates these parameters." The attenuation distortion impairment is found in row one of the Table and the envelope delay impairment is in row two. These are the only two stated impairments that constitute "shaping." Tr. at 40. If the agency's requirement included shaping beyond the shaping described in rows one and two of the Table (e.g., echo shaping) that aspect of the requirement was not described in the Table and thus was not contemplated by the RFP.

23. We note that, while the agency intended the June test to apply echo shaping, this actually never occurred because the agency's PTT test equipment is unable to apply shaping to an intermediate echo signal inserted at Point B. Tr. at 95, 131. The agency's ultimate inability to relax the test requirement does not contradict its conclusion that this was a more accurate statement of its actual requirements.

24. The agency's actions may have deterred offerors from offering less robust, lower cost modems, and if offerors were aware that the benchmark requirements were relaxed, as they were, there may have been a different competitive environment. Cylink Corp., B-242304, Apr. 18, 1991, 91-1 CPD Para. 384; MTS Sys. Corp., B-238137, Apr. 27, 1990, 90-1 CPD Para. 434. We note that 27 offerors submitted proposals, but only 10 were found technically acceptable as a result of the agency's strict pass/fail application of its identical test requirement as it applied to the self-tests; it seems apparent that other offerors, given the same relaxed tests, could have qualified under the self-tests. While all 27 offerors could have set up their self-benchmark test equipment in the same manner that the agency set up the April test, Tr. at 65, many of these offerors could not set up to the June test parameters due to the special attributes of the PTT test equipment that were lacking in other makes and older versions of test equipment. In fact, the agency never did ascertain what makes of test equipment the 17 rejected firms had used to perform their self-benchmarks. Tr. at 190. Also, the independent consultant testified that some manufacturers had more than one modem in their product lines capable of transmitting at the required speed. Tr. at 93.

25. We also note that the agency's lack of control over the June test configuration file rendered it unable to verify its theory that the erroneous power levels in the test results may have been attributable to a transcription error instead of erroneous entries in the configuration files under which the tests were conducted.