I'm confused.
An AP sending a CTS is telling others not to transmit so the AP can listen and successfully decode the transmission from the RTS originator. I don't see this as a denial of service, though a device blasting CTS frames might be able to create a DoS attack - not sure.
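To make the CTS deferral concrete, here's a minimal sketch of virtual carrier sense (the NAV): a station that hears a CTS defers until the Duration field expires. The class and units are illustrative, not from any real driver.

```python
# Hypothetical sketch of NAV-based deferral on hearing a CTS.
class Station:
    def __init__(self):
        self.nav_expiry_us = 0  # network allocation vector, in microseconds

    def on_cts(self, now_us, duration_us):
        # Extend the NAV only if this CTS reserves the medium for longer
        self.nav_expiry_us = max(self.nav_expiry_us, now_us + duration_us)

    def may_transmit(self, now_us):
        # Virtual carrier sense: medium is "busy" until the NAV expires
        return now_us >= self.nav_expiry_us

sta = Station()
sta.on_cts(now_us=0, duration_us=300)
print(sta.may_transmit(100))  # False: medium still reserved
print(sta.may_transmit(400))  # True: NAV expired
```

This is also why the CTS-blasting worry above is plausible: a rogue device repeatedly sending CTS frames with large Duration values would keep every listener's NAV pinned.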
Going back to the original question about ED-only CCA, which I think hits the issues Jonathan pointed out:
- The software might be simpler, but the hardware would need to be overspecified to the point of making it unreasonably expensive for consumer devices.
- Radio hardware generally has a significant TX/RX turnaround time, required for the RX deafening circuits to disengage. Without those deafening circuits, the receivers would be damaged by the comparatively vast TX power in the antenna.
- So in practice, it's easier to measure SNR at the receiver, or to infer it indirectly from packet loss, i.e. missing acknowledgements returned to the transmitter.
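The indirect approach in the last bullet can be sketched in a few lines: the transmitter counts frames sent versus ACKs received and derives a loss rate. Purely illustrative names; real drivers keep similar per-rate statistics.

```python
# Hypothetical transmitter-side link-quality estimate from missing ACKs.
class LinkStats:
    def __init__(self):
        self.sent = 0
        self.acked = 0

    def on_tx(self, ack_received: bool):
        self.sent += 1
        if ack_received:
            self.acked += 1

    def loss_rate(self) -> float:
        # Fraction of transmissions that drew no acknowledgement
        return 0.0 if self.sent == 0 else 1.0 - self.acked / self.sent

stats = LinkStats()
for ok in [True, True, False, True]:
    stats.on_tx(ok)
print(stats.loss_rate())  # 0.25
```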
Being a software guy, I hope it's ok to ask more "dumb" questions ;)
- Why would consumer "hardware" be unreasonably expensive? Is it a manufacturing yield thing? Not possible with the current state of CMOS radio process technology? Just curious what would drive the expense.
- Maybe indirect detection via packet loss is good enough - not sure. But you still can't get rid of the first-try transmit's EDCA backoffs, even when they aren't useful, e.g. when ED-only would have been sufficient. Can a device (TX) know the state of the EDCA arbitration and decide whether backoffs are likely to be required?
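For the backoff question above, a hedged sketch of EDCA-style contention may help: each attempt draws a random slot count from the contention window, and the window doubles (up to CWmax) after a failure. The CWmin/CWmax values are the usual best-effort defaults; the rest is illustrative.

```python
# Sketch of EDCA-style backoff draw; parameters assume the
# best-effort access category (CWmin=15, CWmax=1023).
import random

CW_MIN, CW_MAX = 15, 1023

def draw_backoff(retry: int) -> int:
    # Window doubles per retry: 15, 31, 63, ... capped at CWmax
    cw = min((CW_MIN + 1) * (2 ** retry) - 1, CW_MAX)
    return random.randint(0, cw)  # idle slots to count down

# Even the first attempt (retry=0) draws from [0, CWmin] -- the
# station can't skip it, which is the point of the question above:
# the backoff is paid whether or not it turns out to be needed.
for retry in range(4):
    print(retry, draw_backoff(retry))
```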
Again, thanks to all for the edification.
Bob