[Starlink] It’s the Latency, FCC
Alexandre Petrescu
alexandre.petrescu at gmail.com
Mon May 6 07:19:24 EDT 2024
On 02/05/2024 at 21:50, Frantisek Borsik wrote:
> Thanks, Colin. This was another great read on video (and audio, in
> your past emails) bullet-proofing for the near future.
>
> To be honest, the consensus on overall bandwidth in the
> bufferbloat-related circles was in the 25/3 - 100/20 ballpark
To continue this discussion of 25 Mbit/s for 4K, and of 8K, here are
some more thoughts:
- About the 25 Mbit/s bandwidth requirement for 4K: HDMI cables for 4K
HDR10 (high dynamic range) are specified at 18 Gbit/s, not 25 Mbit/s.
These HDMI cables don't carry IP. But, supposedly, the displayed 4K
image is of higher quality when played over HDMI (presumably from a
local player) than when streamed from a remote server on the Internet.
To achieve parity, one might want to carry that HDMI flow from the
server over IP, and at that point the bandwidth requirement is much
higher than 25 Mbit/s (a rough comparison follows this list). This goes
hand in hand with the evolution of discs (triple-layer Blu-ray discs of
100 GB capacity are the most recent; I see no signs of that slowing).
- In some regions, terrestrial DVB (TV over radio frequencies with
antenna receivers, not IP) starts broadcasting 4K HDR10 this year. I
don't know which MPEG codec it uses or at what bitrate, but it is not
over the Internet. This probably means ISPs will be inclined to offer
more than 4K over the Internet, maybe 8K, to distinguish their service
from DVB. The audience for these DVB streams is very wide, with cheap
one-time-purchase receivers (no subscription, unlike an ISP) already
widely available in electronics stores.
- A smaller, yet important, audience is that of 8K TV via satellite.
There is one Japanese 8K satellite TV provider, and its audience
(number of viewers) is probably smaller than that of DVB 4K HDR10.
Still, it constitutes competition for IPTV from ISPs.
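
As a rough sketch of the gap between the HDMI and streaming figures
above (assuming 10-bit RGB at 60 frames/s over HDMI and a 25 Mbit/s
compressed stream; all numbers are illustrative, not from a spec):

    # Why HDMI needs ~18 Gbit/s while a 4K stream fits in ~25 Mbit/s:
    # HDMI carries uncompressed video; streaming uses a codec (HEVC/AV1).
    WIDTH, HEIGHT = 3840, 2160        # 4K UHD
    FPS = 60                          # common HDMI refresh rate
    BITS_PER_PIXEL = 30               # 10-bit HDR, 3 color channels

    raw_bps = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL
    stream_bps = 25e6                 # typical 4K HDR streaming rate

    print(f"uncompressed: {raw_bps / 1e9:.1f} Gbit/s")         # ~14.9
    print(f"compression ratio: ~{raw_bps / stream_bps:.0f}:1")  # ~600:1

So an IP flow at HDMI quality would need roughly 600 times the 25
Mbit/s streaming figure, unless it is compressed just as aggressively.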
To me, that reflects a direction of growth from 4K toward 8K in the
capability required of the Internet.
Still, that growth in the bandwidth requirement says nothing about the
latency requirement. That must be found elsewhere, and it is probably
little related to TV.
Alex
> , but what many of us were trying to achieve while talking to the FCC
> (et al.) was to point out that, in order to really make it bulletproof
> and usable not only for the near future but for today, a reasonable
> Quality of Experience requirement needs to be added to the
> definition of broadband. Here is the link to the FCC NOI and related
> discussion:
> https://circleid.com/posts/20231211-its-the-latency-fcc
>
> Hopefully, we have managed to get that message over to the other side.
> At least 2 of the 5 FCC Commissioners seem to be getting it - Nathan
> Simington and Brendan Carr - and Nathan even arranged for his
> staffers to talk with Dave and others. I hope that this line of
> cooperation will continue and we will manage to help the rest of the
> FCC to understand the issues at hand correctly.
>
> All the best,
>
> Frank
>
> Frantisek (Frank) Borsik
>
> https://www.linkedin.com/in/frantisekborsik
>
> Signal, Telegram, WhatsApp: +421919416714
>
> iMessage, mobile: +420775230885
>
> Skype: casioa5302ca
>
> frantisek.borsik at gmail.com
>
>
>
> On Thu, May 2, 2024 at 4:47 PM Colin_Higbie via Starlink
> <starlink at lists.bufferbloat.net> wrote:
>
> Alex, fortunately, we are not bound to use personal experiences
> and observations on this. We have real market data that can
> provide an objective, data-supported conclusion. No need for a
> chocolate-or-vanilla-ice-cream-tastes-better discussion on this.
>
>     Yes, cameras can film at 8K (and higher in some cases). However,
>     at those resolutions (with exceptions for ultra-high-end cameras,
>     such as those used by multi-million-dollar telescopes), except
>     under very specific conditions, the actual picture quality doesn't
>     improve past about 5.5K. The loss of detail simply moves from a
>     consequence of too few pixels to the optical and focus limits of
>     the lenses. Neighboring pixels simply hold a blurry image, meaning
>     they don't actually carry any usable information. A still shot
>     with a 1/8-second exposure can easily benefit from an 8K or
>     higher sensor. Video sometimes can, under bright lights with a
>     relatively still or slow-moving scene. Neither of these
>     conditions lends itself to typical home video at 30 (or 24)
>     frames per second – that's only about 0.033s per frame. We can
>     imagine AI getting to the point where it can compensate for lack
>     of clarity, and this is already being used for game rendering
>     (e.g., Nvidia's DLSS and Intel's XeSS), but that requires training
>     per scene in those games, and there hasn't been much development
>     work done on this for filming, at least not yet.
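>
>     As a back-of-the-envelope illustration of that exposure budget (my
>     own rough numbers, assuming the sensor can expose for the full
>     frame interval, which in practice it often can't):
>
>         # Light-gathering time per video frame vs. a 1/8 s still shot.
>         for fps in (24, 30):
>             frame_time = 1 / fps                  # seconds per frame
>             print(f"{fps} fps: max exposure {frame_time * 1000:.1f} ms, "
>                   f"~{(1 / 8) / frame_time:.1f}x less light than 1/8 s")
>
>     A 1/8-second still gathers roughly 3-4x the light of any single
>     video frame, which is why stills out-resolve video in dim scenes.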
>
>     Will sensors (or AI) improve to capture usable images from fewer
>     incoming photons, so that effective digital shutter speeds can
>     get faster at lower light levels? No doubt. Will that materially
>     change video quality so that 8K is a similar step up from 4K as 4K
>     is from HD (or as HD was from SD)? No, at least not in the next
>     several years. Read on for why.
>
>     So far that was all on the production side. But what about the
>     consumer side? Mass-market TV sizes max out below about 100" (83"
>     seems to be a fairly common large size, but some stores carry
>     larger models). Even those large sizes that do reach mass-market
>     locations and are available on Amazon still comprise a very small
>     percentage of total TV sales. The vast, vast majority of TV sales
>     are of sub-70" models. This is not just because of pricing, though
>     that's a factor. It's also because home architecture has not
>     anticipated screens this big. At these sizes, it's not just a
>     matter of upgrading the entertainment-console furniture; it's a
>     matter of building a different room with a dedicated entertainment
>     wall. There is a lot of inertia in architecture and construction
>     that prevents this from being a sudden change, not to mention the
>     hundreds of millions of existing homes that are already sized for
>     TVs below 100".
>
>     And, important to this discussion, at several feet from even a 70"
>     - 90" screen, most people can't see the difference between 4K and
>     8K anyway. The pixels are too small at that distance to make a
>     difference in the user experience. This contrasts with the step
>     from HD to 4K, which many people (not all) can see, or from SD to
>     HD, an improvement virtually everyone can see (to the point that
>     news broadcasts now blur the faces of their anchors to remove
>     wrinkles that weren't visible back in the SD days).
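>
>     A quick angular-resolution sketch supports this (assuming ~1
>     arcminute of normal visual acuity, a 77" 16:9 screen, and an
>     8-foot viewing distance; all three are illustrative assumptions):
>
>         import math
>
>         # Angular size of one pixel, in arcminutes, at a given distance.
>         def pixel_arcmin(diag_in, h_pixels, dist_ft, aspect=16 / 9):
>             width_in = diag_in * aspect / math.hypot(aspect, 1)
>             return math.degrees(math.atan2(width_in / h_pixels,
>                                            dist_ft * 12)) * 60
>
>         for name, px in (("4K", 3840), ("8K", 7680)):
>             print(f"{name}: {pixel_arcmin(77, px, 8):.2f} arcmin/pixel")
>
>     Both come out below the ~1 arcminute the eye resolves (4K at ~0.6,
>     8K at ~0.3), so the extra 8K pixels are invisible at that distance.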
>
>     For another real-world example of this curtailing resolution
>     growth: smartphones raced to higher and higher resolutions until
>     they reached about 4K, then started pulling back. Some are
>     slightly higher, but as often as not, even at the flagship level,
>     smartphones fall slightly below 4K, with the recognition that
>     customers got wise to screens all being effectively perfect and
>     that higher resolutions no longer mattered.
>
>     Currently, the leading contender for anything appearing in 8K is
>     games, not streaming video. That's because games don't require
>     camera lenses and light sensors that don't yet exist. They can
>     render dimly lit, fast-moving scenes in 8K just as easily as
>     brightly lit ones. BUT (huge but here), GPUs aren't powerful
>     enough to do that yet either at good framerates, and for most
>     gamers (not all, but a significant majority), framerate is more
>     important than resolution. Top-of-the-line graphics cards (the
>     ones that run about $1,000, so not mainstream yet) of the current
>     generation are just hitting 120fps at 4K in top modern games. From
>     a pixel-pushing perspective, that would translate to 30fps at 8K
>     (4x the number of pixels, 120/4 = 30). 30fps is good enough for
>     streaming video, but not good enough for a gamer compared with 4K
>     at 120fps. Still, I anticipate (this part is just my opinion, not
>     a fact) that graphics cards in high-end gaming PCs will be the
>     first to drive 8K experiences for gamers, before 8K streaming
>     becomes an in-demand feature. Games have HUDs and are often played
>     on monitors just a couple of feet from the gamer, where ultra-fine
>     details would be visible and relevant.
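>
>     To make the fill-rate arithmetic explicit (a sketch; the 4K/120
>     figure is the one cited above, everything else follows from it):
>
>         # If a GPU sustains 4K at 120 fps, the same pixel rate yields
>         # only ~30 fps at 8K, which has 4x the pixels per frame.
>         RES = {"4K": (3840, 2160), "8K": (7680, 4320)}
>         budget = RES["4K"][0] * RES["4K"][1] * 120   # pixels/second
>
>         for name, (w, h) in RES.items():
>             print(f"{name}: ~{budget / (w * h):.0f} fps at that rate")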
>
>     Having said all of that, does this mean that I don't think 8K and
>     higher will eventually replace 4K for mass-market consumer
>     streaming? No, I suspect that in the long run you're right that
>     they will. That's a reasonable conclusion based on the history of
>     screen and TV programming resolutions, but that timeframe is
>     likely more than 10 years off, and planning bandwidth requirements
>     for the needs 10 years from now does not require any assumptions
>     about the standard video resolutions people will be watching
>     then: we can all assume with reasonable confidence, based on the
>     history of Internet bandwidth usage, that bandwidth needs and
>     desires will continue to increase over time.
>
>     The point for this group is that you lose credibility with the
>     audience if you base your reasoning on future video resolutions
>     that the market is currently rejecting, without at least
>     acknowledging that those are projected future needs rather than
>     present-day needs.
>
>     At the same time, 4K is indeed a market standard TODAY. That's not
>     an opinion; it's a data point and a fact. As I've said multiple
>     times in this discussion, what makes this a fact and not an
>     opinion is that millions of people choose to pay for access to 4K
>     content and to the television programs and movies that are stored
>     and distributed in 4K. All the popular TV devices and gaming
>     consoles support 4K HDR content in at least some versions of the
>     product (they may also offer discounted versions that don't do HDR
>     or only go to 1080p or 1440p). The market has spoken and delivered
>     us that data. 4K HDR is the standard for videophiles and popular
>     enough that the top video streaming services all offer it. It is
>     also not in a chaotic state, with suppliers providing different
>     technologies until the market sorts out a winner (like the old
>     Blu-ray vs. HD DVD fight 15 years ago, or VHS vs. Betamax before
>     that). Yes, there are some variants of HDR (Dolby Vision vs.
>     HDR10), but as TVs are manufactured today, Dolby Vision is
>     effectively just a superset of HDR10, like G-Sync is a superset
>     of Adaptive-Sync for the variable-refresh-rate displays needed for
>     gaming. So, yes, 4K HDR is a standard, whether you buy a Blu-ray
>     UHD movie at Walmart or Best Buy or stream your programming from
>     Netflix, Disney+, Max, or Amazon Prime.
>
>     So again, this is why the minimum rational top bandwidth any new
>     ISP should be developing (at least in developed countries – I
>     think it's fair to say that if people have no Internet access
>     within hundreds of miles, even slow Internet connectivity to a
>     local library within travel distance from home is far better than
>     nothing) is 25Mbps, the bandwidth established by the major
>     streaming providers as required for 4K HDR content. This does not
>     mean more would not be better or that more won't be needed in the
>     future. But if you are endorsing an ISP buildout focused on
>     low latency under load at anything LESS THAN 25Mbps, you have
>     simply shifted the problem for customers and users of the new
>     service from poor latency (this group's focus) to poor bandwidth
>     incapable of supporting modern services.
>
>     To be taken seriously and to maximize your chances of success at
>     influencing policy, I urge this group's members to use that 25Mbps
>     top bandwidth as a floor. And to clarify my meaning, I don't mean
>     ISPs shouldn't also offer less expensive tiers of service with
>     bandwidth of only, say, 3 or 10Mbps. Those are fine and will be
>     plenty for many users, and a lower-cost option with less
>     capability is a good thing. What I mean is that if they are
>     building out new service, the infrastructure needs to support, and
>     they need to OFFER, a level of at least 25Mbps. Higher is fine too
>     (better, even), but where cost collides with technical capability,
>     25Mbps is the market requirement; below that, the service
>     offering fails to provide a fully functional Internet connection.
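>
>     As a toy illustration of that floor (the per-stream rates are my
>     assumptions, roughly matching what streaming services publish: SD
>     ~3, HD ~5, 4K HDR ~25 Mbit/s):
>
>         # Which single streams fit within a given service tier?
>         REQUIRED = {"SD": 3, "HD 1080p": 5, "4K HDR": 25}  # Mbit/s
>
>         for tier in (3, 10, 25):
>             fits = [s for s, need in REQUIRED.items() if need <= tier]
>             print(f"{tier} Mbps tier: {', '.join(fits) or 'none'}")
>
>     Only the 25Mbps tier covers everything the market currently
>     sells, which is the argument for using it as the buildout floor.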
>
>     Sorry for the long message, but I keep seeing a lot of these same
>     subjective responses to objective data, which concerns me. I hope
>     this long version finally addresses all of them and I can now
>     return to just reading the brilliant posts of the latency and
>     TCP/IP experts who normally drive these discussions. You are all
>     far more knowledgeable than I am in those areas. My expertise is
>     in what the market needs from its Internet connectivity and why.
>
> Cheers,
> Colin
>
>
>     -----Original Message-----
>     From: Alexandre Petrescu <alexandre.petrescu at gmail.com>
>     Date: Thu, 2 May 2024 11:21:44 +0200
>     To: starlink at lists.bufferbloat.net
>     Subject: Re: [Starlink] It’s the Latency, FCC
>
>
>     On 30/04/2024 at 22:05, Sebastian Moeller via Starlink wrote:
> > Hi Colin,
> > [...]
> >
>     >> A lot of responses like "but 8K is coming" (it's not, only
>     >> experimental YouTube videos showcase these resolutions to the
>     >> general public, no studio is making 8K content and no streaming
>     >> service offers anything in 8K or higher)
>     > [SM] Not my claim.
>
>     Right, it is my claim. '8K is coming' comes from the observation
>     that consumer cameras able to film in 8K have been available for
>     a few years now.
>
>     The SD-HD-4K-8K-16K consumer market trend can be evaluated. One
>     could draw a parallel with the evolution of megapixel counts in
>     photo cameras, or with microprocessor feature size. There might
>     be a levelling-off, but I am not sure it is at 4K.
>
>     What I would be interested to look at is the next acronym that
>     requires high bandwidth and low latency and that is not in the
>     SD-HD-4K-8K-16K series. This series did not exist in the days of
>     analog TV ('SD' only appeared when digital 'HD' did), so probably
>     a new series will appear that describes TV features.
>
> Alex
>
> >
> >> and "I don't need to watch 4K, 1080p is sufficient for me,
> > [SM] That however is my claim ;)
> >
> >> so it should be for everyone else too"