TV White Space Device Testing Off and Running
Yesterday at 10 AM a crowd of about 30 gathered at the FCC Lab in Columbia, Maryland, to watch the beginning of Phase 2 of the Docket 04-186 White Space Device testing. Four devices are being tested: Adaptrum, Microsoft, Motorola, and Philips. Some are detectors only; some are both detectors and transmitters.
It was announced that a fifth unit from the Singapore Institute for Infocomm Research (I²R - pronounced "i-squared-r") was on its way from Singapore for testing but hit a shipping snag: DHL had trouble posting a reexport bond with US Customs and shipped the unit back to Singapore. I²R is now using a different shipper, and arrival is expected shortly.
The Google prototype is missing in action without any explanation and FCC staff was vague about whether additional units would be accepted - implying that this was subject to the Chairman's usual micromanagement.
It was announced that testing schedule updates would be posted on the testing web site.
[But no information has actually been posted to date. I was amused to hear that in the current FCC micromanagement environment, Chairman's Office approval was needed - and had been obtained - for OET to post information to this site.]
As I speculated in the previous post, different developers got different guidance from FCC staff on what output their devices should provide to speed testing. Philips was told to report the results of 30 trials to detect a given frequency and interpreted that as reporting the percentage of trials that indicated the frequency was in use. Adaptrum was asked more specifically to deliver a text string of zeroes and ones indicating the outcome of each trial. I suspect this inconsistency comes from the inability of FCC staff members to issue written guidance on even such microscopic matters without multiple layers of oversight - so they just issue inconsistent verbal guidance.
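For readers curious how easily the two formats could be reconciled, here is a minimal sketch - my own illustration, not code from either vendor or the FCC - that reduces both kinds of report to a per-channel detection rate:

```python
def rate_from_percentage(percent_detected: float) -> float:
    """Philips-style report: a single percentage of trials that flagged the channel."""
    return percent_detected / 100.0

def rate_from_trial_string(trials: str) -> float:
    """Adaptrum-style report: a string of '1'/'0' outcomes, one per trial."""
    outcomes = [int(c) for c in trials if c in "01"]
    return sum(outcomes) / len(outcomes)

# Both reports below describe 30 trials on one channel with 27 detections.
print(rate_from_percentage(90.0))                   # 0.9
print(rate_from_trial_string("1" * 27 + "0" * 3))   # 0.9
```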
When the testing started, the 30 observers gathered in a lab room and watched initial testing of the Philips unit with undistorted DTV signals from a Rohde & Schwarz signal generator fed directly into the device. Testing started at -110 dBm and had reached -120 dBm when we broke for lunch. At that point the Philips device still had a perfect detection record.
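To put those generator settings in perspective, here is the back-of-the-envelope conversion from dBm to watts (my arithmetic, not a figure from the test plan):

```python
def dbm_to_watts(dbm: float) -> float:
    # dBm is power referenced to 1 milliwatt
    return 10 ** (dbm / 10.0) / 1000.0

for level_dbm in (-110, -120):
    print(f"{level_dbm} dBm = {dbm_to_watts(level_dbm):.1e} W")
# -110 dBm = 1.0e-14 W (10 femtowatts); -120 dBm is another factor of 10 smaller.
```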
MSTV observers repeatedly questioned whether devices might just always declare a signal was present to get a perfect record. Apparently MSTV has no faith in market forces, since such a device would fail in the market: it would never transmit a signal and also never cause interference. MSTV also quibbled that smaller antennas might make detectors less sensitive. While this is true in a technical sense, it is irrelevant for policy purposes since the FCC proposal, the position of most parties, and the 5 GHz U-NII precedent all deal with system performance, not just the detector electronics: if the total system doesn't meet the performance standard in the final rule, it can't be sold.
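To make the point concrete, here is a rough sketch - with placeholder numbers, not the values in any FCC proposal - of why a lower-gain antenna simply shifts the burden to the detector behind it when the rule is written at the system level:

```python
def required_detector_sensitivity_dbm(system_threshold_dbm: float,
                                      antenna_gain_dbi: float) -> float:
    # The level at the detector input is the system-referenced level plus the
    # antenna gain, so a lower-gain antenna demands a more sensitive detector
    # to meet the same system-level requirement.
    return system_threshold_dbm + antenna_gain_dbi

print(required_detector_sensitivity_dbm(-114.0, 0.0))   # -114.0 dBm with a 0 dBi antenna
print(required_detector_sensitivity_dbm(-114.0, -6.0))  # -120.0 dBm with a small -6 dBi antenna
```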
The initial testing also revealed ambiguities in the published test plan. The first test, I.A, tested detection at various DTV signal levels. The procedure actually used was to have a Rohde & Schwarz SFU signal generator directly produce signals at levels in the -110 dBm range (well below DTV receiver sensitivity) and lower, and to send them by cable to the equipment under test (EUT) a few feet away. There are lingering doubts whether the signal reaching the nearby EUT in such a setup is actually at the power indicated on the SFU, since at such very low signal levels unintended paths might contribute comparable signals. Oddly, a nearby screen room was not used for this test; it would have decreased uncertainty about unintended coupling. The test plan did not address this level of detail.
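A quick power-addition calculation (my own numbers, not a measurement from the lab) shows why unintended coupling matters at these levels:

```python
import math

def combine_dbm(*levels_dbm: float) -> float:
    # Power-sum of uncorrelated signals expressed in dBm.
    total_mw = sum(10 ** (p / 10.0) for p in levels_dbm)
    return 10.0 * math.log10(total_mw)

print(round(combine_dbm(-120.0, -120.0), 1))  # -117.0: an equal-level stray path adds 3 dB
print(round(combine_dbm(-120.0, -130.0), 1))  # -119.6: a path 10 dB down still shifts the level ~0.4 dB
```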
Indeed, it is puzzling why all this testing is necessary. In the 5 GHz case, NTIA and FCC determined two pairs of detection levels and maximum transmit powers that would protect the cochannel radar systems and left it to industry to develop systems that met the standard - truly a pass/fail system. The existence of such a rule would stimulate capital formation to finance the development of such systems while reliably protecting the few homes that rely on over-the-air TV reception. The current lingering uncertainty about what the standard is, and the confusion of that issue with prototype testing, is a big disincentive to capital formation - something I would have thought a Republican administration would understand.
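The structure of such a rule is almost trivially simple. The sketch below uses placeholder numbers - not the actual 5 GHz DFS values or any proposed white space values - just to show what a pure pass/fail certification check looks like:

```python
# Each tier pairs a maximum transmit power with a required detection threshold.
# Tiers are listed in order of increasing power; lower-power devices get the
# less demanding detection requirement in this illustration.
RULE_TIERS = [
    (23.0, -62.0),  # (max transmit power dBm, required detection threshold dBm)
    (30.0, -64.0),
]

def passes(measured_detection_threshold_dbm: float, tx_power_dbm: float) -> bool:
    for max_power_dbm, required_threshold_dbm in RULE_TIERS:
        if tx_power_dbm <= max_power_dbm:
            # The device must detect signals at or below the required level.
            return measured_detection_threshold_dbm <= required_threshold_dbm
    return False  # transmit power exceeds every tier the rule allows

print(passes(-66.0, 25.0))  # True: detects weaker signals than the rule requires
print(passes(-60.0, 25.0))  # False: not sensitive enough for this power tier
```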
At the birth of Wi-Fi and Bluetooth in Docket 81-413, many parties with vested interests quibbled over whether affordable electronics could be built to meet the rules. Wisely, the Commission chose a fail-safe approach: it adopted a standard that protected other systems and just waited to see if anyone could build equipment. Within two years the first commercial product reached the market from a startup firm, and several years later 802.11 standardization began. Now that Wi-Fi and Bluetooth are household names, few remember the uncertainties of the 1980s.
1 comment:
Regarding your comment on MSTV's lack of confidence in market forces, I think the point might have been missed. It has been MSTV's belief, according to past statements, that prototypes submitted for testing (and subsequent devices submitted for final approval) would perform well, meeting any minimal current and/or future stated specifications, but that the performance of the final manufactured products in consumer hands would be far less reliable or accurate and would not necessarily meet published specifications. (Actually, this is exactly the kind of market force I expect from a manufacturer of a regulated product when they can get away with it.)
On a similar note, whereas I agree philosophically with the concept of meeting system performance specifications, as exemplified by the 5 GHz U-NII service, the reality is that WSDs face a different set of parameters: the propagation characteristics of UHF spectrum versus 5 GHz, the number and type of incumbent devices, the fact that almost all 5 GHz U-NII systems use fixed antennas mounted high up outside (on top of roofs and light poles) while WSDs would be moving around indoors and out at 'ground level' in the millions, and the fact that most 5 GHz U-NII systems are point-to-point while white space service would be point-to-multipoint and omnidirectional. All of this makes a similar approach for WSDs less appropriate.
Henry Cohen
Production Radio Rentals