The main takeaway (as I understand it) is something like: "In real-world networks, jitter adds noise to the end-to-end delay, so any algorithm that tries to infer congestion from end-to-end delay measurements will occasionally get it wrong, and this can lead to starvation." Seems related to Jaffe's work on network power ("Flow Control Power Is Nondecentralizable").
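To make the intuition concrete, here's a toy sketch (my own, not from the paper): a delay-based detector compares measured RTT against the propagation delay plus a threshold, but jitter larger than that threshold produces false congestion signals even when the queue is nearly empty. All constants are made-up illustrative values.

```python
import random

random.seed(0)

BASE_RTT = 50.0      # ms, propagation delay (illustrative value)
QUEUE_DELAY = 2.0    # ms, actual queueing delay: link is NOT congested
THRESHOLD = 5.0      # ms, detector flags congestion above base + threshold
JITTER = 8.0         # ms, jitter amplitude, larger than the threshold

false_alarms = 0
N = 10_000
for _ in range(N):
    # Measured delay = propagation + queueing + random jitter noise
    measured = BASE_RTT + QUEUE_DELAY + random.uniform(0, JITTER)
    # Delay-based congestion test: is delay above base RTT by more
    # than the threshold? Jitter alone can push it over.
    if measured - BASE_RTT > THRESHOLD:
        false_alarms += 1

print(f"false congestion signals: {false_alarms}/{N}")
```

A flow that backs off on these spurious signals while a competitor doesn't is how the occasional misreading turns into persistent unfairness.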
Thoughts?