[Starlink] It’s the Latency, FCC
Colin_Higbie
CHigbie1 at Higbie.name
Wed May 15 10:55:11 EDT 2024
Sebastian,
You are, of course, correct that content providers have no contractual or other direct CONTROL over ISPs in terms of bandwidth. The term I've been using is that the MARKET dictates the bandwidth requirements an ISP must meet to be considered operating at a market-acceptable level. If that sounds like it's a self-fulfilling tautology, in a way it is. The market determines what it needs and all the players need to deliver what the market demands to be effective participants. That requirement can (and will) change over time, but at any given moment, to participate in the market as a serious player, the entrant must meet those market requirements, or recognize that they aren't a serious player relative to the competition.
It's the same thing in the commodity industry. If I grow corn or mine gold, I have to meet, within a few percent, the specs everyone else meets and sell at the market price (or sell for slightly less to out-sell my competitors, which in turn drags the market price down for everyone). It's far from a perfect analogy, of course, because a commodity is by definition a market where no individual supplier has any control over price, which is quite different from the semi-monopolistic market for ISP services. Still, it helps to illustrate the notion of market-standard requirements.
I would say anyone who sells or releases products or defines go-to-market strategy in any industry would agree with the statement that the market dictates the minimum requirements for their industry. The market isn't any good at innovation (a realm where engineers and some academics shine), but it is peerless at establishing its own requirements.
That doesn't mean someone can't come along and deliver less in particular cases: if customers are desperate with no alternatives or the alternatives are expensive in a certain region, those customers may accept a service that falls short of general market requirements because it's still better than their alternatives.
But in order to provide market-standard service, ISPs need to meet the market standards on bandwidth just as much as on latency (and reliability and other features; e.g., the market standard is a dynamic IPv4 address – many of us may want a static IP address, but that's not the standard). And the market standard on bandwidth is established by the mainstream services that customers can get via the Internet. Those are set by the streaming services in the aggregate (not by any one company dictating it).
Note that they are NOT set in a vacuum. The streaming services have settled on 25Mbps because a significant enough portion of their customers already have access to 25Mbps or greater bandwidth, compression algorithms allow 4K HDR to be delivered reliably within that 25Mbps bitrate, and movie and television studios have maxed out video resolutions and color depth at 4K HDR. No other popular service requires more bandwidth than 4K HDR video streaming, so that's the benchmark. The confluence of all these factors has led to a market REQUIREMENT of 25Mbps.
That required bitrate will likely increase in the future, but we can see that it will be years at least before 25Mbps becomes a painful constraint for any one stream (if you have multiple family members who all need access to the same services at the same time, it's already too small – see the rough numbers below). The drivers could be expectations of greater-than-4K pictures; game streaming becoming more popular (e.g., Xbox and PS consoles have a 6-8 year generation, by the end of which they seem terribly obsolete compared to their PC-gaming contemporaries, and streaming would ensure gamers always have access to the latest tech); IoT taking off due to some unforeseen killer app; etc. Bandwidth needs will almost certainly increase, not decrease, over time, in spite of compression improvements. Historically, compression has never reduced bandwidth requirements, because demand grows at least as fast as the codecs improve; it just raises the quality of what can be delivered within existing bandwidth.
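To put rough numbers on the 25Mbps figure and on the multi-viewer point (a back-of-the-envelope sketch only – the 60fps and 30-bits-per-pixel raw assumptions are mine, not anything the streaming services publish):

# Back-of-the-envelope: one 4K HDR stream, raw vs. the ~25 Mbps HEVC target,
# plus what a multi-viewer household actually needs.
# Assumptions (mine): 60 fps, ~30 bits/pixel raw (10-bit x 3 channels), 25 Mbps target.

width, height, fps, bits_per_pixel = 3840, 2160, 60, 30
raw_bps = width * height * fps * bits_per_pixel       # uncompressed bitrate
hevc_target_bps = 25e6                                # streaming-service ceiling

print(f"Raw 4K HDR:        {raw_bps / 1e9:.1f} Gbps")             # ~14.9 Gbps
print(f"Compression ratio: {raw_bps / hevc_target_bps:.0f} : 1")  # ~600 : 1

# Three simultaneous 4K viewers plus ~10 Mbps of background traffic:
streams, background_mbps = 3, 10
print(f"Household need:    {streams * 25 + background_mbps} Mbps")  # 85 Mbps

That roughly 600:1 ratio is why H.265 carries all the weight here, and the last line is why I call 25Mbps a floor for a single stream, not a comfortable number for a whole household.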
You also rightly point out that many customers don't do 4K. That's fine. And I have no objection to ISPs offering a lower bandwidth option at a lower price. What the market has definitively set is that in any region, at least in the U.S. (I know less about other parts of the world), a significant % of users will need 25Mbps service in order to access the mainstream streaming services. See above for reasoning.
This is not debatable except as a matter of taste (like saying chocolate is better than mint chip ice cream), as evidenced by the fact that ALL THE MAJOR STREAMING SERVICES have landed in the same place. Keep in mind that they compete with each other. If Disney thought it could get a leg up on Netflix by saying "we're better because we only require 10Mbps or 20Mbps for the same quality," they would promote that. It would give them a sales advantage over their arch-competitor in regions where bandwidths are limited. They don't, because 25Mbps is the market-set requirement.
Again, by definition of what the market sets, it has set it.
While you are correct and I agree with you on the technical facets to this, I can't support your statement that there is "enough uncertainty" in the market. There is zero uncertainty for the reasons I have outlined above. I can drill much deeper into any of these if you like (as long as my overly-verbose emails are, they are just scratching the surface on all the reasons that can be provided), but be aware that the fact that not all customers have the same preferences is not a relevant factor. Of course, different customers have different preferences (and that's a good thing).
You asked about filming cameras and 4K vs 8K. Lens aberration is the primary concern as you get much above 4K. Original filming for TV is now generally done at HD or 4K. Movie production does tend to use higher-resolution (usually very expensive IMAX) cameras that are in the 8K area. But note that the IMAX frame is taller (more square) than "normal" 21:9 or 16:9 frames, so the total resolution is not as much higher as that sounds (i.e., IMAX 8K is less than four 4K images). On the film side, IMAX is said to produce up to a 12K digital equivalent. Beyond 12K, even in bright light with high-speed film, there is just too much lens aberration for more pixels to add anything to image quality. And in less-than-ideal lighting, especially with fast-moving scenes, the problems get worse. That is, the natural sharpness/blurriness of the image can't benefit from more pixels, because those extra pixels are just storing the blur, which does not provide any additional information.
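For anyone who wants to sanity-check the optics claim, here's a rough diffraction-limit calculation (the sensor widths and the f/2.8 aperture are my assumptions, and this counts only diffraction, not aberration, so it's an upper bound – real glass hits its limits sooner):

# Rough optics sanity check: at what horizontal pixel count does the blur spot
# from diffraction alone get bigger than a pixel?
# Airy-disk diameter = 2.44 * wavelength * f-number.
# Sensor widths and f/2.8 are assumptions; aberrations in real lenses make this worse.

wavelength_um = 0.55                                   # green light, middle of the visible band
f_number = 2.8
airy_diameter_um = 2.44 * wavelength_um * f_number     # ~3.8 um blur spot

for name, sensor_width_mm in [("Super 35 (~25 mm wide)", 24.9), ("65 mm large format (~52 mm wide)", 52.0)]:
    max_useful_px = sensor_width_mm * 1000 / airy_diameter_um
    print(f"{name}: ~{max_useful_px / 1000:.1f}K horizontal pixels before diffraction dominates")

That lands at very roughly 6-7K for a Super 35-sized sensor and 13-14K for a 65mm-class format – the same ballpark as the 12K figure above – and again, real lenses with real aberrations run out of resolving power sooner than this.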
An important industry point here though: in general (with occasional possible exceptions for Disney and Warner Bros Discovery, which owns MAX, since both are movie studios and streaming services), the studios don't want home viewers to have access to picture quality as good as in theaters. 4K only became an option for home viewing as IMAX became popular for theaters. Now, to be fair, the delta in quality between home and theater has plummeted, so this argument may be fading, but these changes take time. Even if every theater were IMAX and showed movies in 8K, many studios would do everything within their power to keep 8K a theater-only experience.
By 1K, do you mean HD – 1920x1080? If so, yes, I strongly agree with you that the big jump in quality was up to HD. From HD to 4K is a much less noticeable jump (even 720p to 1080p is a more distinct jump than 1080p to 4K for most eyes on 65" TVs). What most people assume is the improvement between HD and 4K is actually the removal of compression artifacts. Typical cable and satellite TV channels are extremely compressed to save money on transmission, losing a tremendous amount of data. For most of their channels, they throw out a lot more data than the streaming services do (so HD Netflix looks better than the HD SyFy channel). You will often see them prioritize some channels (less compression) at the expense of others (highly compressed, typically news or little-watched stations where picture quality matters less). 1080p HD can look fantastic when there are no or minimal compression artifacts. Most serious gamers play at 1080p, NOT 4K, because that allows them to achieve 120+ fps, and almost every serious gamer will tell you that frames per second matter more than image quality, with 4K a pointlessly high image quality. When they do go above 1080p, it's usually to 1440p, and only for games that can still hit at least 120fps at 1440. Many gamers prefer 240fps, but at that point, like going beyond 4K in resolution, you're basically past what most human eyes can discern. (Personally, I can't tell the difference between 120fps and 240fps, but younger eyes can.)
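If it helps to see why gamers make that trade, the raw pixel throughput tells the story (a quick sketch; the resolution/framerate pairs are simply the common ones gamers argue about):

# Raw pixels per second a GPU has to render (and a game stream would have to carry)
# at the resolution/framerate combinations competitive gamers actually use.
modes = [
    ("1080p @ 120 fps", 1920, 1080, 120),
    ("1440p @ 120 fps", 2560, 1440, 120),
    ("4K    @  60 fps", 3840, 2160, 60),
    ("4K    @ 120 fps", 3840, 2160, 120),
]
for name, w, h, fps in modes:
    print(f"{name}: {w * h * fps / 1e6:7.1f} Mpixels/s")

# 1080p @ 120 (~249 Mpx/s) costs half of 4K @ 60 (~498 Mpx/s) and a quarter of
# 4K @ 120, which is why dropping resolution is the cheap way to buy frame rate.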
On 8K gaming, I do agree with your statement: even if it's included on both the next-gen Xbox and the PS6, I would not describe that as a "market requirement" in the way I describe 25Mbps for ISP bandwidth as a market requirement. In that case (assuming my prediction is correct, which it may not be), it would just be a high-end feature. Assuming the next-gen Xbox and PS6 place the same technical requirements on the TV to get there, however, I would call it a standard, just not a market-required standard, YET.
Also, please note that an option anyone can address by going to the store and purchasing something (i.e., a new 8K TV) is different from the maximum available bandwidth from the ISP, which, once established, often can't easily change for many, many years. If ISP XYZ builds out new Internet capacity into a formerly unserved rural area at 20Mbps, they probably won't circle back to upgrade it for more than a decade. There is nothing those residents can do to fix the problem (well, they could get Starlink, but I mean there is nothing they can do with that ISP). It's the semi-permanent nature of the rollout that concerns me so much, and it's why I say ISPs need to at least hit the market requirement in their top-tier offering: if they don't, there's no solution for those customers with that ISP, and because many ISPs do have de facto regional monopolies, this is a serious problem.
Cheers,
Colin
-----Original Message-----
From: Sebastian Moeller <moeller0 at gmx.de>
Sent: Wednesday, May 15, 2024 2:52 AM
To: Colin_Higbie <CHigbie1 at Higbie.name>
Cc: Alexandre Petrescu <alexandre.petrescu at gmail.com>; Frantisek Borsik <frantisek.borsik at gmail.com>; starlink at lists.bufferbloat.net
Subject: Re: [Starlink] It’s the Latency, FCC
Hi Colin,
since you bring this up again, I want to address a few points below...
> On 14. May 2024, at 21:23, Colin_Higbie via Starlink <starlink at lists.bufferbloat.net> wrote:
>
> Apologies, these had been routed to my junk folder. Just saw them, so this is a bit behind.
> Nothing wrong with musings and opinions (we all do it and have them), but I'm frustrated by the reluctance to accept data and by people trying to replace data with their opinions, with comments like, “This means that probably ISPs are inclined to do more than that 4K over the Internet, maybe 8K, to distinguish their service from DVB.”
> ISPs do NOT provide any significant portion of streaming video. That comes from Netflix, Disney, Amazon Prime, MAX, Hulu, Paramount Plus, Peacock, and other commercial streaming services that distribute TV shows and movies. The ISP needs to offer a bandwidth level sufficient to meet those content providers' requirements.
[SM] This is not how this works IMHO: as long as the streaming providers do not pay the ISPs, they cannot bring their requirements to the ISPs directly. I assume however that you allow for an indirect path via the ISP's end customers. But at that point the requirement is already diluted, as that end customer might not care for UHD quality.
> Again, I would feel like I’m beating a dead horse here, except that people keep resuscitating the horse by posting as if that bandwidth requirement is just my opinion. That bandwidth requirement is 25Mbps and not my opinion. Alex wrote: “- about 25mbit/s bw needs for 4K: hdmi cables for 4K HDR10 (high dynamic range) are specified at 18gbit/s and not 25mbit/s (mbyte?). These HDMI cables dont run IP. But, supposedly, the displayed 4K image is of a higher quality if played over hdmi (presumably from a player) than from a server remote on the Internet. To achieve parity, maybe one wants to run that hdmi flow from the server with IP, and at that point the bandwidth requirement is higher than 25mbit/s. This goes hand in hand with the disc evolutions (triple-layer bluray discs of 120Gbyte capacity is the most recent; I dont see signs of that to slow).”
[SM] I fully agree with the lossy compression versus 'raw' argument, even a measly VGA stream at 30 Hz would eat (640*480*30*3*8)/(1000^2) = 221.184 Mbps, so uncompressed video is still neither desired by the end customer (in the mass market, individual exceptions might exist but IMHO are not relevant here) nor by the content providers, as these have to pay for traffic capacity.
> Yes, if you put a UHD disc in a Blu-ray player or convey gaming video at 4K HDR10 120fps, it will send an UNCOMPRESSED video signal that uses an 18 - 48Gbps cable (small ‘b’ = bits, capital ‘B’ = bytes in these abbreviations). That has nothing to do with streaming bandwidth, which is COMPRESSED video using, primarily these days, H.265/HEVC. H.265 is an amazingly efficient compression system (effectively reducing stream size by a factor of several hundred, approaching 1,000 over the raw uncompressed stream). I don’t doubt there will be further improvements in the future, but I would not expect dramatic additional bandwidth savings. H.265 is a lossy compression scheme with a variable bitrate that depends on the scene and amount of movement. The requirement the streamers have for 25Mbps is a reasonably safe upper limit for 4K HDR video compressed via H.265. I believe (not 100% sure and somewhat subjective) that the most intensive scenes start to noticeably lose fidelity to fit within 25Mbps, but with buffering it’s never a real-world problem. Far more important, that’s all moot: what one person may WANT in terms of bandwidth or picture quality is completely irrelevant. THIS IS DICTATED BY THE MARKET.
[SM] Let me gently push back here... this assumes a perfect market, but that is an economist's fantasy and does not exist in real life.
> I don’t know if it’s because this is primarily an academic group that drives a reluctance here to accept the market as the definitive answer to what’s required, but if your target is consumer Internet and not the DOD for military use, then the MARKET IS THE ONLY FACTOR THAT MATTERS in determining what bandwidth is needed.
[SM] Let's talk data; we would need to know:
a) the percentage of streams >= 4K versus <4K; as far as I can tell that number is not public for individual streaming services, let alone for all of them together
b) the fraction of those 4K streams that are accidentally 4K compared to consciously selected
c) as a proxy we might be tempted to look at the capabilities of the different streaming tiers and the percentage at which these are booked; however we run into two showstoppers quickly: 1) we do not know the number of customers per tier and streaming service, 2) these tiers differ not only in maximum video quality but often also in things like the number of parallel streams, so we would also need to know the breakdown of why customers selected a 4K-capable tier
IMHO this is enough uncertainty to make any 'the market has spoken' argument somewhat suboptimal; it might have uttered something, but unfortunately mumbled so badly that it is hard to make sense out of it.
What IMHO is clear: all/most streaming providers offer 4K tiers and hence do have an interest in this tier being usable by their customers (but clearly not badly enough to actually talk to those that are expected to make this happen*).
*) And that is OK with me; as an end customer I expect my ISP to deliver on its contracted rates and otherwise get out of the way of my traffic. Asking content providers for extra money per customer would be double dipping that I am opposed to. What I could accept is streaming providers granting lump sums to ISPs to improve the connectivity of under-served areas, to enlarge the set of potential streaming customers, but I digress.
> Alex wrote: “a reduced audience, yet important, is that of 8K TV via satellites. There is one japanese 8K TV satcom provider, and the audience (number of watchers) is probably smaller than that of DVB 4K HDR. Still, it constitutes competition for IPTV from ISPs.”
> This also doesn’t matter if there is no 8K content available from studios.
[SM] OK, since you seem to be in the know, is it true that movie production is limited to 4-5K cameras? I thought I read somewhere that 8K cameras are used in movie production. And I naively(?) assumed they would have the money to get equipment like lenses that actually work at that resolution (IIUC movie gear is often rented, so it can be amortised over many productions, which would allow for 'gold-plated' lenses).
> That means it’s equivalent to the demo 8K, 12K, and 16K HDR content you can already stream from YouTube, which can (depending on motion of the scene – you’ll generally notice those high-resolution demos are brightly lit and very slow-moving) require more than 25Mbps, but that’s not market-demanded content from a production studio. These are just tech demos.
> The only mainstream content provider testing 8K streaming is Warner Bros Discovery for their MAX channel, as a special set of trailers in conjunction with Samsung to help Samsung sell 8K TVs. Currently, this is intended for store demos of those Samsung 8K TVs, not for home use, but this is the first indicator that 8K streaming MIGHT be coming to home use sooner than I had argued in my prior messages. If you believe 8K content is something ISPs need to support with sufficient bandwidth, that would be the most compelling data point to reference.
> The other would be game streaming (different from Eugene’s esports, which does not stream the video and only needs a few Mbps, often less than 1Mbps). Please note that game STREAMING is different from game PLAYING. Game Streaming means all the gameplay work is handled remotely. The gamer just has a controller and a dumb terminal (like a smart TV), which sends the controller data to the server and then receives the streaming video of the game. Game streaming can exceed 25Mbps even at 4K, because gamers want to play at 120fps, which is bandwidth equivalent to 8K @ 30fps.
[SM] Pretty sure that is not how that works for pre-rendered streaming content... 8K does not seem to require 4 times the bandwidth of 4K at equal framerate and equal perceptual quality. For online-rendered material in games, however, that might be correct: given the relatively tight latency requirements there is little buffer space/time to employ clever encoding tricks.
> The text and other HUD elements in games are also more susceptible to compression artifacts than TV/movie scenes, so they don’t compress as well as streaming video.
> Important for this group: Latency is absolutely critical to gaming (especially esports, but important for other forms of gaming too).
> I don’t personally believe there will be much market interest in 8K streaming in the next few years, because:
>
> • a viewer can’t tell the difference between 4K and 8K on a standard-size television at normal viewing distances
[SM] Fully agree, but then I also argue this for 1K versus 4K, where the difference for many users is not important... (over here cable TV still tops out at 1K, and viewers do not seem to be switching to streaming providers in large numbers). Then most video material tells a story, and if that is compelling enough viewers are willing to suspend their disbelief... that worked in the old SD (0.5K?) days and will still work today.
> • camera lenses and sensors (film or digital) are not good enough to capture a notably clearer picture at 8K over 4K (actually, they can capture up to about 5.5K, so that is slightly better than 4K) except in bright sunlight – so that next Mad Max movie, classically having most scenes recorded in very bright environments, might be a good example for an 8K movie
[SM] See above. I would have thought that this is primarily a question of money, and which lens sizes one is willing to accept?
> • 8K TVs still cost quite a bit more than 4K TVs
[SM] But that is their whole reason to exist, to 'justify' the higher prices of the top end TVs... if 8K became mainstream, we still would need something novel for the top end...
> and are just starting to hit the market, while content providers want to know that at least a reasonable % of the viewing market will watch on 8K before they go through the expense of trying to create it [this is the least of these factors, as high-end purchases of 8K TVs are growing and streaming services do try to appeal to the high end, provided it drives the market and isn't just bragging rights for a few videophiles]
> • For gaming, the esports players don’t particularly care about resolution, they care about refresh rates and latency, happily playing 1080p at 120+ FPS with a sub-10ms (they’d prefer sub-5ms) latency.
[SM] I have no reliable numbers on this, but this sounds reasonable.
> • For the other side of gaming, game streaming, this is mostly a cost-savings solution, which means the people doing the streaming tend not to have the latest display tech so also won’t use 8K, but just happen to live where high-speed Internet is readily available – e.g., inner cities.
> In spite of D and E above, my expectation is that 8K on a TV will be
> used for gaming significantly before it’s a widely supported streaming
> standard like 4K,
[SM] Given the assisted upsampling already used in games it is not a stretch to expect this to happen for 8K as well; how decent that looks in the end is a different question...
> but that’s my opinion (informed from working in this space, but still just my opinion) and subject to error. With gaming, the camera and recording problems don’t exist. I expect the next Xbox and PS6 will support 8K gaming.
[SM] Which IMHO is likely, but that would not mean that customers demand it; it would simply be an easy differentiator to set the new models apart from their predecessors, no?
> Alex also wrote, “Still, that growth in bandwidth requirement does not say anything about the latency requirement. That can be found elsewhere, and probably it is very little related to TV.”
> I strongly agree with that statement. Due to the ease of buffering video (but not gaming), latency and even jitter are largely irrelevant to streaming video (again, not for gaming though).
[SM] This is, to my surprise, a contentious point. I tend to agree that the latency requirements are milder than in gaming, but I do hear folks argue that the delay in switching channels, jumping around in a timeline, or fast forwarding/reversing makes streaming video latency sensitive. (My take is that for normal video consumption these should be rare events, and video production, where I would expect that to happen often, probably should not happen over the internet ;) ).
> My point in pushing the 25Mbps floor for a new top-tier offering from
> an ISP
[SM] I might have misunderstood the discussion, I thought we were discussing the minimal requirement here, not the top end?
> (i.e., they may offer cheaper plans that don’t reach that speed, but must at least offer a 25Mbps or higher tier) is to ensure that members of this group trying to persuade those ISPs to adopt Cake and FQ-CoDel and anything else that improves latency under load are armed with that knowledge, so you appear well-informed on the interests of the ISPs, AND to reduce the risk of anyone inadvertently leading some ill-informed ISP to add new services that fall short of 25Mbps, hurting the people in those regions for years to come, given how infrequently upgrades happen in the places that still lack high-speed Internet access today.
[SM] Pretty sure no ISP will need any of us to understand 'capacity sells'; after all, that is what they have been marketing on for the last two decades. What I expect is that some ISPs might switch away from ever more meaningless raw numbers and market something like '4K-streaming capable' instead.
Regards
Sebastian
> I’m sure all things being equal, with no change to cost, we’d all rather have more bandwidth and lower latency. 25Mbps is not great. It’s only enough for one 4K HDR stream plus some modest additional activity in the background. But it’s a sufficient minimum to do all the mainstream activities that the market provides over the Internet. And that’s the key.
> - Colin