Don't you want to blame the size of the buffer, rather than the latency?
For example, say someone has some hardware and their line is fairly slow.
It might show RED on the graph because the buffer is quite big relative to
the bandwidth-delay product of the line. The test is telling them they have
bloated buffers.
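For concreteness, here's a minimal sketch of that comparison in Python (the
10 Mbit/s line and 40 ms RTT are made-up numbers for illustration, not from
anything above):

    # Bandwidth-delay product: how much data the line itself holds "in flight".
    #   BDP = line rate x round-trip time
    rate_bps = 10e6    # assumed 10 Mbit/s line
    rtt_s = 0.040      # assumed 40 ms round-trip time
    bdp_bytes = rate_bps * rtt_s / 8
    print(bdp_bytes)   # 50000.0, i.e. 50 KB; a 500 KB buffer would be 10x the BDP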
Then they upgrade to a much faster product, and suddenly that buffer is
relatively small, the incremental latency is low, and the line no longer shows
RED on the test.
What changed? The hardware didn't change, just the speed. So the test is
saying that for your particular speed the buffers are too big, but at a higher
speed they may be quite OK.
If you add 100ms to a 1 gigabit product the buffer has to be what, ~12.5MB?
But adding 100ms to my feeble line is quite easy: the Billion router only needs
a buffer of 100KB for that to be too much. Yet that same Billion in front of a
gigabit modem is only going to add at most ~1ms of latency, and nobody
would complain.
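To put numbers on that, here's the same arithmetic as a Python sketch (the
8 Mbit/s figure for the slow line is my assumption; the speed isn't stated
above):

    # Worst-case added latency is just the time to drain a full buffer:
    #   latency = buffer size / line rate
    def drain_time_ms(buffer_bytes, rate_bps):
        return buffer_bytes * 8 / rate_bps * 1000

    BUFFER = 100_000                   # the same 100 KB router buffer
    print(drain_time_ms(BUFFER, 8e6))  # 100.0 ms at an assumed 8 Mbit/s line: bloated
    print(drain_time_ms(BUFFER, 1e9))  # 0.8 ms at 1 Gbit/s: nobody notices

    # And the reverse: buffer size that adds 100 ms at 1 Gbit/s.
    print(1e9 * 0.100 / 8 / 1e6)       # 12.5 (MB)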
OK, I think I talked myself around in a complete circle: a buffer is only bad IF
it increases latency under load, not because of its raw size. That might explain
why these fiber connection tests don't show much latency change: their
buffers are inconsequential at those higher speeds?
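If you wanted to check that "latency under load" property yourself, a rough
sketch is to compare idle ping with ping while the link is saturated (this
assumes Python, a Unix ping, and an iperf3 server you can reach; the host
names are placeholders, not anything from this thread):

    import re
    import subprocess

    HOST = "8.8.8.8"                # placeholder latency target
    IPERF_SERVER = "iperf.example"  # placeholder iperf3 server used to load the link

    def avg_ping_ms(count=10):
        # Parse the avg RTT out of ping's summary line, e.g.
        # "rtt min/avg/max/mdev = 9.1/10.2/12.3/0.8 ms"
        out = subprocess.run(["ping", "-c", str(count), HOST],
                             capture_output=True, text=True).stdout
        return float(re.search(r"= [\d.]+/([\d.]+)/", out).group(1))

    idle = avg_ping_ms()

    # Saturate the downlink (-R: server sends) while pinging again.
    load = subprocess.Popen(["iperf3", "-c", IPERF_SERVER, "-t", "20", "-R"],
                            stdout=subprocess.DEVNULL)
    loaded = avg_ping_ms()
    load.wait()

    print(f"idle {idle:.1f} ms, loaded {loaded:.1f} ms, "
          f"bufferbloat adds {loaded - idle:.1f} ms")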