[Starlink] It’s the Latency, FCC

Sebastian Moeller moeller0 at gmx.de
Wed May 15 02:52:06 EDT 2024


Hi Colin,

since you bring this up again, I want to address a few points below...


> On 14. May 2024, at 21:23, Colin_Higbie via Starlink <starlink at lists.bufferbloat.net> wrote:
> 
> Apologies, these had been routed to my junk folder. Just saw them, so this is a bit behind.
>  Nothing wrong with musings and opinions (we all do it and have them) but frustrated by the reluctance to accept data and when people try to replace data with their opinions with comments like, “This means that probably ISPs are inclined to do more than that 4K over the Internet, maybe 8K, to distinguish their service from DVB.”
>  ISPs do NOT provide any significant portion of streaming video. That comes from Netflix, Disney, Amazon Prime, MAX, Hulu, Paramount Plus, Peacock, and other commercial streaming services that distribute TV shows and movies. The ISP needs to offer a bandwidth level sufficient to meet those content providers' requirements.

[SM] This is not how this works, IMHO: as long as the streaming providers do not pay the ISPs, they cannot bring their requirements to the ISPs directly. I assume, however, that you allow for an indirect path via the ISP's end customers. But at that point the requirement is already diluted, as that end customer might not care for UHD quality.


>  Again, I would feel like I’m beating a dead horse here, except that people keep resuscitating the horse by posting as if that bandwidth requirement is just my opinion. That bandwidth requirement is 25Mbps and not my opinion.  Alex wrote: “- about 25mbit/s bw needs for 4K:  hdmi cables for 4K HDR10 (high dynamic range) are specified at 18gbit/s and not 25mbit/s (mbyte?).  These HDMI cables dont run IP.  But, supposedly, the displayed 4K image is of a higher quality if played over hdmi (presumably from a player) than from a server  remote on the Internet.   To achieve parity, maybe one wants to run that hdmi flow from the server with IP, and at that point the bandwidth requirement is higher than 25mbit/s.  This goes hand in hand with the disc evolutions (triple-layer bluray discs of 120Gbyte capacity is the most recent; I dont see signs of that to slow).”

[SM] I fully agree with the lossy compression versus 'raw' argument: even a measly VGA stream at 30 Hz would eat (640*480*30*3*8)/(1000^2) = 221.184 Mbps, so uncompressed video is still desired neither by the end customer (in the mass market; individual exceptions might exist but IMHO are not relevant here) nor by the content providers, as they have to pay for traffic capacity.
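
To make the scale of that gap concrete, here is a quick back-of-the-envelope sketch (an illustration only, assuming 3 colour channels at 8 bits each and no chroma subsampling; real HDR content uses 10 bits per channel, so the true raw rates are even higher):

# Raw (uncompressed) video bitrates and the compression ratio a 25 Mbps
# stream implies. Assumes 3 colour channels at 8 bit, no chroma subsampling.
resolutions = {
    "VGA (640x480)":   (640, 480),
    "HD  (1920x1080)": (1920, 1080),
    "4K  (3840x2160)": (3840, 2160),
    "8K  (7680x4320)": (7680, 4320),
}
fps = 30
stream_mbps = 25  # the figure under discussion in this thread

for name, (w, h) in resolutions.items():
    raw_mbps = w * h * fps * 3 * 8 / 1000**2
    print(f"{name}: raw {raw_mbps:8.1f} Mbps -> "
          f"~{raw_mbps / stream_mbps:.0f}:1 compression needed for {stream_mbps} Mbps")

For 4K at 30 Hz this comes out at roughly 5972 Mbps raw, i.e. a compression ratio of about 240:1 down to 25 Mbps (more at 10 bit or higher frame rates), which is consistent with the 'several hundred' figure above.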

>  Yes, if you put a UHD disc in a Blu-ray player or convey gaming video at 4K HDR10 120fps, it will send an UNCOMPRESSED video signal that uses an 18 - 48Gbps cable (small ‘b’ = bits, capital ‘B’ = bytes in these abbreviations). That has nothing to do with streaming bandwidth, which is COMPRESSED video using, primarily these days, H.265/HEVC. H.265 is an amazingly efficient compression system (effectively reducing stream size by a factor of several hundred, approaching 1,000 over the raw uncompressed stream). I don’t doubt there will be further improvements in the future, but I would not expect dramatic additional bandwidth savings. H.265 is a lossy compression scheme with a variable bitrate that depends on the scene and amount of movement. The requirement the streamers have for 25Mbps is a reasonably safe upper limit for 4K HDR video compressed via H.265. I believe (not 100% sure and somewhat subjective) that the most intensive scenes start to noticeably lose fidelity to fit within 25Mbps, but with buffering it’s never a real-world problem.  Far more important, that’s all moot: what one person may WANT in terms of bandwidth or picture quality is completely irrelevant. THIS IS DICTATED BY THE MARKET.

[SM] Let me gently push back here... this assumes a perfect market, but that is an economist's fantasy and does not exist in real life.


> I don’t know if it’s because this is primarily an academic group that drives a reluctance here to accept the market as the definitive answer to what’s required, but if your target is consumer Internet and not the DOD for military use, then the MARKET IS THE ONLY FACTOR THAT MATTERS in determining what bandwidth is needed.

[SM] Let's talk data, we would need to know:
a) the percentage of streams >= 4K versus < 4K; as far as I can tell that number is not public for individual streaming services, let alone for all of them together
b) the fraction of these streams that are accidentally 4K compared to consciously selected
c) as a proxy we might be tempted to look at the capabilities of the different streaming tiers and the percentage at which these are booked; however, we run into show stoppers quickly: 1) we do not know the number of customers per tier and streaming service, 2) these tiers do not only differ in maximum video quality but often also in things like the number of parallel streams, so we would also need to know the breakdown of why customers selected a 4K-capable tier

IMHO this is enough uncertainty to make any 'the market has spoken' argument somewhat suboptimal; it might have uttered something, but unfortunately mumbled so badly that it is hard to make sense out of it.

What IMHO is clear is that all/most streaming providers offer 4K tiers and hence do have an interest in this tier being usable by their customers (but clearly not badly enough to actually talk to those that are expected to make this happen*).

*) And that is OK with me; as an end customer I expect my ISP to deliver on its contracted rates and otherwise get out of the way of my traffic. Asking content providers for extra money per customer would be double dipping that I am opposed to. What I could accept is streaming providers granting lump sums to ISPs to improve the connectivity of under-served areas, to enlarge the set of potential streaming customers, but I digress.

>  Alex wrote: “a reduced audience, yet important,  is that of 8K TV via satellites.   There is one japanese 8K TV satcom provider, and the audience (number of watchers) is probably smaller than that of DVB 4K HDR.  Still, it constitutes competition for IPTV from ISPs.”
>  This also doesn’t matter if there is no 8K content available from studios.

[SM] OK, since you seem to be in the know: is it true that movie production is limited to 4-5K cameras? I thought I read somewhere that 8K cameras are used in movie production. And I (naively?) assumed they would have the money to get equipment like lenses that actually work at that resolution (IIUC movie gear is often rented, so it can be amortised over many productions, which would allow for 'gold-plated' lenses).

> That means it’s equivalent to the demo 8K, 12K, and 16K HDR content you can already stream from YouTube, which can (depending on motion of the scene – you’ll generally notice those high-resolution demos are brightly lit and very slow-moving) require more than 25Mbps, but that’s not market-demanded content from a production studio. These are just tech demos.
>  The only mainstream content provider testing 8K streaming is Warner Bros Discovery for their MAX channel as a special set of trailers in conjunction with Samsung to help Samsung sell 8K TVs. Currently, this is intended for store demos of those Samsung 8K TVs, not for home use, but this is the first indicator that 8K streaming MIGHT be coming to home use sooner than I had argued in my prior messages. If you believe 8K content is something ISPs need to support with sufficient bandwidth, that would be the most compelling data point to reference.
>  The other would be game streaming (different from Eugene’s esports, which does not stream the video and only needs a few Mbps, often less than 1Mbps). Please note that game STREAMING is different from game PLAYING. Game Streaming means all the gameplay work is handled remotely. The gamer just has a controller and a dumb terminal (like a smart TV), which sends the controller data to the server and then receives the streaming video of the game. Game streaming can exceed 25Mbps even at 4K, because gamers want to play at 120fps, which is bandwidth equivalent to 8K @ 30fps.

[SM] Pretty sure that is not how that works for pre-rendered streaming content... 8K does not seem to require 4 times the bandwidth of 4K at equal framerate and equal perceptual quality. For on-line rendered material in games, however, that might be correct: given the relatively tight latency requirements there is little buffer space/time to employ clever encoding tricks.
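
For reference, the raw pixel-rate equivalence invoked above is easy to verify (a trivial sketch; my doubt is only whether the encoded bitrate of pre-rendered content actually scales with it):

# Raw pixel throughput for the two cases being compared.
pixels_4k_120fps = 3840 * 2160 * 120  # 4K at 120 fps
pixels_8k_30fps  = 7680 * 4320 * 30   # 8K at 30 fps
print(pixels_4k_120fps, pixels_8k_30fps)  # both 995,328,000 pixels/s

So the raw pixel rates match exactly; for pre-rendered material, however, encoders can exploit the available encoding time and the spatial redundancy of higher resolutions, so the delivered bitrate does not grow linearly with the pixel rate.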


> The text and other HUD elements in games are also more susceptible to compression artifacts than TV/movie scenes, so they don’t compress as well as streaming video.
>  Important for this group: Latency is absolutely critical to gaming (especially esports, but important for other forms of gaming too).
>  I don’t personally believe there will be much market interest in 8K streaming in the next few years, because:
>  
>     • a viewer can’t tell the difference between 4K and 8K on a standard size television and normal view distances

[SM] Fully agree, but then I also argue this for 1K versus 4K, where the difference for many users is not important... (over here cable TV still tops out at 1K, and viewers do not seem to switch to streaming providers in large numbers). Also, most video material tells a story, and if that is compelling enough viewers are willing to suspend their disbelief... that worked in the old SD (0.5K?) days and will still work today.
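
For what it is worth, the usual viewing-distance argument (a rough sketch, assuming the common rule of thumb that 20/20 vision resolves about 1 arcminute; the exact threshold is debatable) already puts 4K at or below the visibility limit in a typical living room:

import math

# Angular size of one pixel, in arcminutes, for a 16:9 screen of a given
# diagonal viewed from a given distance. Rule of thumb: ~1 arcmin is about
# the resolving limit of 20/20 vision (an approximation, not a hard limit).
def pixel_arcmin(diagonal_inch, horizontal_pixels, distance_m):
    width_m = diagonal_inch * 0.0254 * 16 / math.hypot(16, 9)
    return math.degrees(math.atan2(width_m / horizontal_pixels, distance_m)) * 60

for label, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{label}: {pixel_arcmin(65, px, 2.5):.2f} arcmin on a 65-inch TV at 2.5 m")

That gives roughly 1.0 arcmin per pixel for 1080p, 0.5 for 4K and 0.26 for 8K, i.e. even the 1080p-to-4K step is marginal for many viewers at that distance, and the 4K-to-8K step falls well below the limit.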


>     • camera lenses and sensors (film or digital) are not good enough to capture a notably clearer picture at 8K over 4K (actually, they can capture up to about 5.5K, so that is slightly better than 4K) except in bright sunlight – so that next Mad Max movie, classically having most scenes recorded in very bright environments, might be a good example for an 8K movie

[SM] See above. I would have thought that this is primarily a question of money, and which lens sizes one is willing to accept?

>     • 8K TV’s still cost quite a bit more than 4K TV’s

[SM] But that is their whole reason to exist: to 'justify' the higher prices of the top-end TVs... if 8K became mainstream, we would still need something novel for the top end...

> and are just starting to hit the market, while content providers want to know that at least a reasonable % of the viewing market will watch on 8K before they go through the expense of trying to create it [this is the least of these factors, as high-end purchases of 8K TVs are growing and streaming services do try to appeal to the high end, provided it will be market driving and not just bragging rights with a few videophiles]
>     • For gaming, the esports players don’t particularly care about resolution, they care about refresh rates and latency, happily playing 1080p at 120+ FPS with a sub-10ms (they’d prefer sub-5ms) latency. 

[SM] I have no reliable numbers on this, but this sounds reasonable.

>     • For the other side of gaming, game streaming, this is mostly a cost-savings solution, which means the people doing the streaming tend not to have the latest display tech so also won’t use 8K, but just happen to live where high-speed Internet is readily available – e.g., inner cities.
>  In spite of D and E above, my expectation is that 8K on a TV will be used for gaming significantly before it’s a widely supported streaming standard like 4K,

[SM] Given the assisted upsampling already used in games, it is not a stretch to expect this to happen for 8K as well; how decent that looks in the end is a different question...

> but that’s my opinion (informed from working in this space, but still just my opinion) and subject to error. With gaming, the camera and recording problems don’t exist. I expect the next Xbox and PS6 will support 8K gaming.

[SM] Which IMHO is likely, but that would not show that customers demand it; it would simply be an easy differentiator to set the new models apart from their predecessors, no?

>  Alex also wrote, “Still, that growth in bandwidth requirement does not say anything about the latency requirement.  That can be found elsewhere, and probably it is very little related to TV.”
>  I strongly agree with that statement. Due to the ease of buffering video (but not gaming), latency and even jitter are largely irrelevant to streaming video (again, not for gaming though).

[SM] This is, to my surprise, a contentious point. I tend to agree that the latency requirements are milder than in gaming, but I do hear folks argue that the delay in switching channels, jumping around in a timeline, or fast forwarding/reversing makes streaming video latency sensitive. (My take is that for normal video consumption these should be rare events, and video production, where I would expect that to happen often, probably should not happen over the internet ;) ).


> My point in pushing the 25Mbps floor for a new top-tier offering from an ISP

[SM] I might have misunderstood the discussion; I thought we were discussing the minimal requirement here, not the top end?

> (i.e., they may offer cheaper plans that don’t reach that speed, but must at least offer a 25Mbps or higher tier) is to ensure that members of this group trying to persuade those ISPs to adopt Cake and FQ-Codel and anything else that improves latency under load are armed with that knowledge so you appear well-informed on the interests of the ISPs AND to reduce the risk of anyone inadvertently leading some ill-informed ISP to actually adding new services that fall short of 25Mbps, hurting the people in those regions for years to come, due to infrequent upgrades in the places that still today lack high speed Internet access.  

[SM] Pretty sure no ISP will need any of us to understand that 'capacity sells'; after all, that is what they have been marketing on for the last two decades. What I expect is that some ISPs might switch away from ever larger but meaningless numbers and instead market something like '4K-streaming capable'.

Regards
	Sebastian

>  I’m sure all things being equal, with no change to cost, we’d all rather have more bandwidth and lower latency. 25Mbps is not great. It’s only enough for one 4K HDR stream plus some modest additional activity in the background. But it’s a sufficient minimum to do all the mainstream activities that the market provides over the Internet. And that’s the key.
>  - Colin
>      From: Alexandre Petrescu <alexandre.petrescu at gmail.com>
> Sent: Monday, May 6, 2024 7:19 AM
> To: Frantisek Borsik <frantisek.borsik at gmail.com>; Colin_Higbie <CHigbie1 at Higbie.name>
> Cc: starlink at lists.bufferbloat.net
> Subject: Re: [Starlink] It’s the Latency, FCC
>   On 02/05/2024 at 21:50, Frantisek Borsik wrote:
> Thanks, Colin. This was just another great read on video (and audio - in the past emails from you) bullet-proofing for the near future.
>  To be honest, the consensus on the bandwidth overall in the bufferbloat related circles was in the 25/3 - 100/20 ballpark
>  To continue on this discussion of 25mbit/s (mbyte/s ?) of 4k, and 8k, here are some more thoughts:
> - about 25mbit/s bw needs for 4K:  hdmi cables for 4K HDR10 (high dynamic range) are specified at 18gbit/s and not 25mbit/s (mbyte?).  These HDMI cables dont run IP.  But, supposedly, the displayed 4K image is of a higher quality if played over hdmi (presumably from a player) than from a server  remote on the Internet.   To achieve parity, maybe one wants to run that hdmi flow from the server with IP, and at that point the bandwidth requirement is higher than 25mbit/s.  This goes hand in hand with the disc evolutions (triple-layer bluray discs of 120Gbyte capacity is the most recent; I dont see signs of that to slow).
> - in some regions, the terrestrial DVB (TV on radio frequencies, with antenna receivers, not  IP) run at 4K HDR10 starting this year.  I dont know what MPEG codec is it, at what mbit/s speed.  But it is not over the Internet.  This means that probably  ISPs are inclined to do more than that 4K over the Internet, maybe 8K, to distinguish their service from DVB.  The audience of these DVB streams is very wide, with cheap one-time buy receivers (no subscription, like with ISP) already widely available in electronics stores.
> - a reduced audience, yet important,  is that of 8K TV via satellites.   There is one japanese 8K TV satcom provider, and the audience (number of watchers) is probably smaller than that of DVB 4K HDR.  Still, it constitutes competition for IPTV from ISPs.
> To me, that reflects a direction of growth of the 4K to 8K capability requirement from the Internet.
> Still, that growth in bandwidth requirement does not say anything about the latency requirement.  That can be found elsewhere, and probably it is very little related to TV.
> Alex
> , but all what many of us were trying to achieve while talking to FCC (et al) was to point out, that in order to really make it bulletproof and usable for not only near future, but for today, a reasonable Quality of Experience requirement is necessary to be added to the definition of broadband. Here is the link to the FCC NOI and related discussion:
> https://circleid.com/posts/20231211-its-the-latency-fcc
>  Hopefully, we have managed to get that message over to the other side. At least 2 of 5 FCC Commissioners seems to be getting it - Nathan Simington and Brendan Carr - and Nathan event arranged for his staffers to talk with Dave and others. Hope that this line of of cooperation will continue and we will manage to help the rest of the FCC to understand the issues at hand correctly.
> 
> All the best,
>  Frank
> Frantisek (Frank) Borsik
>  https://www.linkedin.com/in/frantisekborsik
> Signal, Telegram, WhatsApp: +421919416714 
> iMessage, mobile: +420775230885
> Skype: casioa5302ca
> frantisek.borsik at gmail.com
>   On Thu, May 2, 2024 at 4:47 PM Colin_Higbie via Starlink <starlink at lists.bufferbloat.net> wrote:
> Alex, fortunately, we are not bound to use personal experiences and observations on this. We have real market data that can provide an objective, data-supported conclusion. No need for a chocolate-or-vanilla-ice-cream-tastes-better discussion on this. 
> 
> Yes, cameras can film at 8K (and higher in some cases). However, at those resolutions (with exceptions for ultra-high end cameras, such as those used by multi-million dollar telescopes), except under very specific conditions, the actual picture quality doesn't vary past about 5.5K. The loss of detail simply moves from a consequence of too few pixels to optical and focus limits of the lenses. Neighboring pixels simply hold a blurry image, meaning they don't actually carry any usable information. A still shot with 1/8 of a second exposure can easily benefit from an 8K or higher sensor. Video sometimes can under bright lights with a relatively still or slow moving scene. Neither of these requirements lends itself to typical home video at 30 (or 24) frames per second – that's 0.03s of time per frame. We can imagine AI getting to the point where it can compensate for lack of clarity, and this is already being used for game rendering (e.g., Nvidia's DLSS and Intel's XESS), but that requires training per scene in those games and there hasn't been much development work done on this for filming, at least not yet.
> 
> Will sensors (or AI) improve to capture images faster per amount of incoming photons so that effective digital shutter speeds can get faster at lower light levels? No doubt. Will it materially change video quality so that 8K is a similar step up from 4K as 4K is from HD (or as HD was from SD)? No, at least not in the next several years. Read on for why.
> 
> So far that was all on the production side. But what about the consumer side? Mass market TV sizes max out below about 100" (83" seems to be a fairly common large size, but some stores carry larger models). Even those large sizes that do reach mass-market locations and are available on Amazon, still comprise a very small % of total TV sales. The vast, vast majority of TV sales are of sub 70" models. This is not just because of pricing, that's a factor. It's also because home architecture had not considered screens this big. At these sizes, it's not just a matter of upgrading the entertainment console furniture, it's a matter of building a different room with a dedicated entertainment wall. There is a lot of inertia in the architecture and building that prevents this from being a sudden change, not to mention the hundreds of millions of existing homes that are already sized for TV's below 100".
> 
> And important to this discussion, at several feet from even a 70" - 90" screen, most people can't see the difference between 4K and 8K anyway. The pixels are too small at that distance to make a difference in the User Experience. This is a contrast with 4K from HD, which many people (not all) can see, or from SD to HD, an improvement virtually everyone can see (to the point that news broadcasts now blur the faces of their anchors to remove wrinkles that weren't visible back in the SD days).
> 
> For another real-world example of this curtailing resolution growth: smartphones raced to higher and higher resolutions, until they reached about 4K, then started pulling back. Some are slightly higher, but as often as not, even at the flagship level, many smartphones fall slightly below 4K, with the recognition that customers got wise to screens all being effectively perfect and higher resolutions no longer mattered.
> 
> Currently, the leading contender for anything appearing at 8K are games, not streaming video. That's because games don't require camera lenses and light sensors that don't yet exist. They can render dimly lit, fast moving scenes in 8K just as easily as brightly lit scenes. BUT (huge but here), GPUs aren't powerful enough to do that yet either at good framerates, and for most gamers (not all, but a significant majority), framerate is more important resolution. Top of the line graphics cards (the ones that run about $1,000, so not mainstream yet) of the current generation are just hitting 120fps at 4K in top modern games. From a pixel moving perspective, that would translate to 30fps at 8K (4x the # of pixels, 120/4 = 30). 30fps is good enough for streaming video, but not good enough for a gamer over 4K at 120fps. Still, I anticipate (this part is just my opinion, not a fact) that graphics cards on high-end gaming PCs will be the first to drive 8K experiences for gamers before 8K streaming becomes an in-demand feature. Games have HUDs and are often played on monitors just a couple of feet from the gamer where ultra-fine details would be visible and relevant.
> 
> Having said all of that, does this mean that I don't think 8K and higher will eventually replace 4K for mass market consumer streaming? No, I suspect that in the long-run you're right that they will. That's a reasonable conclusion based on history of screen and TV programming resolutions, but that timeframe is likely more than 10 years off and planning bandwidth requirements for the needs 10-years from now does not require any assumptions relating to standard video resolutions people will be watching then: we can all assume with reasonable confidence based on history of Internet bandwidth usage that bandwidth needs and desires will continue to increase over time.
> 
> The point for this group is that you lose credibility to the audience if you base your reasoning on future video resolutions that the market is currently rejecting without at least acknowledging that those are projected future needs, rather than present day needs.
> 
> At the same time, 4K is indeed a market standard TODAY. That's not an opinion, it's a data point and a fact. As I've said multiple times in this discussion, what makes this a fact and not an opinion are that millions of people choose to pay for access to 4K content and the television programs and movies that are stored and distributed in 4K. All the popular TV devices and gaming consoles support 4K HDR content in at least some versions of the product (they may also offer discounted versions that don't do HDR or only go to 1080p or 1440). The market has spoken and delivered us that data. 4K HDR is the standard for videophiles and popular enough that the top video streaming services all offer it. It is also not in a chaotic state, with suppliers providing different technologies until the market sorts out a winner (like the old Blu-ray vs. HD-DVD fight 15 years ago, or VHS vs. Beta before that). Yes, there are some variants on HDR (Dolby Vision vs. HDR-10), but as TV's are manufactured today, Dolby Vision is effectively just a superset of HDR-10, like G-Sync is a superset of Adaptive Sync for variable refresh rate displays needed for gaming. So, yes, 4K HDR is a standard, whether you buy a Blu-ray UHD movie at Walmart or Best Buy or stream your programming from Netflix, Disney+, Max, or Amazon Prime.
> 
> So again, this is why the minimum rational top bandwidth any new ISP should be developing (at least in developed countries – I think it's fair to say that if people have no Internet access within hundreds of miles, even slow Internet for connectivity to a local library in travel distance from home is far better than nothing) is 25Mbps as the established bandwidth required by the 4K providers to stream 4K HDR content. This does not mean more would not be better or that more won't be needed in the future. But if you are endorsing ISP buildout focused around low-latency under load at anything LESS THAN 25Mbps, you have simply shifted the problem for customers and users of the new service from poor latency (this group's focus) to poor bandwidth incapable of providing modern services.
> 
> To be taken seriously and maximize your chances at success at influencing policy, I urge this group's members to use that 25Mbps top bandwidth as a floor. And to clarify my meaning, I don't mean ISPs shouldn't also offer less expensive tiers of service with bandwidth at only, say, 3 or 10Mbps. Those are fine and will be plenty for many users, and a lower cost option with less capability is a good thing. What I mean is that if they are building out new service, the infrastructure needs to support and they need to OFFER a level of at least 25Mbps. Higher is fine too (better even), but where cost collides with technical capability, 25Mbps is the market requirement, below that and the service offering is failing to provide a fully functional Internet connection.
> 
> Sorry for the long message, but I keep seeing a lot of these same subjective responses to objective data, which concern me. I hope this long version finally addresses all of those and I can now return to just reading the brilliant posts of the latency and TCP/IP experts who normally drive these discussions. You are all far more knowledgeable than I in those areas. My expertise is in what the market needs from its Internet connectivity and why.
> 
> Cheers,
> Colin
> 
> 
> -----Original Message-----
> From: Starlink <starlink-bounces at lists.bufferbloat.net> On Behalf Of starlink-request at lists.bufferbloat.net
> Sent: Thursday, May 2, 2024 5:22 AM
> To: starlink at lists.bufferbloat.net
> Subject: Starlink Digest, Vol 38, Issue 13
> 
> Today's Topics:
> 
>    1. Re: It’s the Latency, FCC (Alexandre Petrescu)
> 
> 
> ----------------------------------------------------------------------
> 
> Message: 1
> Date: Thu, 2 May 2024 11:21:44 +0200
> From: Alexandre Petrescu <alexandre.petrescu at gmail.com>
> To: starlink at lists.bufferbloat.net
> Subject: Re: [Starlink] It’s the Latency, FCC
> Message-ID: <94ba2b39-1fc8-46e2-9f77-3b04a63099e1 at gmail.com>
> Content-Type: text/plain; charset=UTF-8; format=flowed
> 
> 
> On 30/04/2024 at 22:05, Sebastian Moeller via Starlink wrote:
> > Hi Colin,
> > [...]
> >
> >> A lot of responses like "but 8K is coming" (it's not, only 
> >> experimental YouTube videos showcase these resolutions to the general 
> >> public, no studio is making 8K content and no streaming service 
> >> offers anything in 8K or higher)
> > [SM] Not my claim.
> 
> Right, it is my claim.  '8K is coming' comes from an observation that it is now present in consumer cameras with ability to film 8K, since a few years now.
> 
> The SD-HD-4K-8K-16K consumer market tendency can be evaluated. One could parallel it with the megapixel number (photo camera) evolution, or with the micro-processor feature size.   There might be levelling, but I am not sure it is at 4K.
> 
> What I would be interested to look at is the next acronym that requires high bw low latency and that is not in the series SD-HD-4K-8K-16K.  This series did not exist in the times of analog TV ('SD' appeared when digital TV 'HD' appeared), so probably a new series will appear that describes TV features.
> 
> Alex
> 
> >
> >> and "I don't need to watch 4K, 1080p is sufficient for me,
> > [SM] That however is my claim ;)
> >
> >> so it should be for everyone else too"
> _______________________________________________
> Starlink mailing list
> Starlink at lists.bufferbloat.net
> https://lists.bufferbloat.net/listinfo/starlink
> _______________________________________________
> Starlink mailing list
> Starlink at lists.bufferbloat.net
> https://lists.bufferbloat.net/listinfo/starlink



