Interesting article from MIT: https://news.mit.edu/2022/algorithm-computer-network-bandwidth-0804

The paper can be found on Venkat Arun's website: https://people.csail.mit.edu/venkatar/

The main take-away (as I understand it) is something like "In real-world networks, jitter adds noise to the end-to-end delay, so any algorithm trying to infer congestion from end-to-end delay measurements will occasionally get it wrong, and this can lead to starvation". Seems related to Jaffe's work on network power ("Flow control power is nondecentralizable").
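
To make that concrete, here is a toy sketch of my own (not from the paper), assuming a simple fixed-threshold-on-RTT congestion detector and Gaussian jitter, with made-up numbers: when the jitter is on the same order as the queueing delay being detected, the detector misreads the queue state a noticeable fraction of the time, in either direction.

# Toy sketch, not from the paper: a delay-threshold congestion detector
# mis-classifying the queue state when jitter is comparable to the
# queueing delay it tries to detect. All parameters are assumptions.

import random

BASE_RTT = 20.0      # ms, propagation delay (assumed)
QUEUE_DELAY = 5.0    # ms, extra delay when the bottleneck queue builds (assumed)
JITTER_STD = 5.0     # ms, non-congestive delay noise, same order as QUEUE_DELAY
THRESHOLD = BASE_RTT + QUEUE_DELAY / 2   # declare "congested" above this RTT

def measured_rtt(congested: bool) -> float:
    """One end-to-end delay sample: base RTT + queueing (if any) + jitter."""
    queueing = QUEUE_DELAY if congested else 0.0
    return BASE_RTT + queueing + random.gauss(0.0, JITTER_STD)

def misclassification_rate(congested: bool, samples: int = 100_000) -> float:
    """Fraction of samples where the threshold rule gets the queue state wrong."""
    wrong = 0
    for _ in range(samples):
        says_congested = measured_rtt(congested) > THRESHOLD
        if says_congested != congested:
            wrong += 1
    return wrong / samples

if __name__ == "__main__":
    print(f"False 'congested' when the queue is empty: {misclassification_rate(False):.1%}")
    print(f"Missed congestion when the queue is full:  {misclassification_rate(True):.1%}")

Obviously this says nothing about the paper's impossibility argument itself, it just illustrates why a noisy delay signal forces a trade-off between reacting to congestion and being fooled by jitter.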

Thoughts?

--
Bjørn Ivar Teigen
Head of Research
+47 47335952 | bjorn@domos.no | www.domos.no