From mboxrd@z Thu Jan 1 00:00:00 1970
From: Nathan Owens
Date: Mon, 6 May 2024 09:43:55 -0400
To: Alexandre Petrescu
Cc: Colin_Higbie, Frantisek Borsik, "starlink@lists.bufferbloat.net"
Subject: Re: [Starlink] It's the Latency, FCC
List-Id: "Starlink has bufferbloat. Bad."

You really don't need 25Mbps for decent 4K quality - it depends on the content. Netflix has some encodes that go down to 1.8Mbps with a very high VMAF:
https://netflixtechblog.com/optimized-shot-based-encodes-for-4k-now-streaming-47b516b10bbb

Apple TV has the highest-bitrate encodes of any mainstream streaming service, and those do top out at ~25Mbps. Could they be more efficient? Probably…

On Mon, May 6, 2024 at 7:19 AM Alexandre Petrescu via Starlink <starlink@lists.bufferbloat.net> wrote:
>
> Le 02/05/2024 à 21:50, Frantisek Borsik wrote:
>
> > Thanks, Colin.
> > This was just another great read on video (and audio, in past emails from you) bullet-proofing for the near future.
> >
> > To be honest, the consensus on overall bandwidth in bufferbloat-related circles was in the 25/3 - 100/20 ballpark
>
> To continue this discussion of 25 Mbit/s (Mbyte/s?) for 4K and 8K, here are some more thoughts:
>
> - about the 25 Mbit/s bandwidth need for 4K: HDMI cables for 4K HDR10 (high dynamic range) are specified at 18 Gbit/s, not 25 Mbit/s. These HDMI cables don't run IP. But, supposedly, the displayed 4K image is of higher quality when played over HDMI (presumably from a player) than from a remote server on the Internet. To achieve parity, one might want to run that HDMI flow from the server over IP, and at that point the bandwidth requirement is higher than 25 Mbit/s. This goes hand in hand with disc evolutions (triple-layer Blu-ray discs of 120 GByte capacity are the most recent; I don't see signs of that slowing).
>
> - in some regions, terrestrial DVB (TV on radio frequencies, with antenna receivers, not IP) runs at 4K HDR10 starting this year. I don't know which MPEG codec it uses or at what Mbit/s rate, but it is not over the Internet. This means ISPs are probably inclined to offer more than 4K over the Internet, maybe 8K, to distinguish their service from DVB. The audience of these DVB streams is very wide, with cheap one-time-purchase receivers (no subscription, unlike with an ISP) already widely available in electronics stores.
>
> - a reduced yet important audience is that of 8K TV via satellite. There is one Japanese 8K satcom TV provider, and its audience (number of viewers) is probably smaller than that of DVB 4K HDR. Still, it constitutes competition for IPTV from ISPs.
>
> To me, that reflects a direction of growth of the 4K to 8K capability requirement from the Internet.
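As a back-of-the-envelope check on the HDMI comparison above, here is a small sketch (my own illustrative figures, not from the thread) of the raw bitrate of a 4K HDR10 signal next to a 25 Mbit/s stream; the gap between the two is roughly what the video codec has to absorb:

```python
# Rough sketch (illustrative numbers, not from the thread): why a 25 Mbit/s
# stream can carry what an ~18 Gbit/s HDMI link carries uncompressed.
# Assumes 4K HDR10 at 60 fps, 10 bits per channel, no chroma subsampling.

def uncompressed_bitrate(width, height, fps, bits_per_pixel):
    """Raw (uncompressed) video bitrate in bits per second."""
    return width * height * fps * bits_per_pixel

raw = uncompressed_bitrate(3840, 2160, 60, 30)  # 10-bit RGB -> 30 bits/pixel
stream = 25e6  # typical top streaming bitrate for 4K HDR

print(f"raw:   {raw / 1e9:.1f} Gbit/s")            # 14.9 Gbit/s
print(f"ratio: {raw / stream:.0f}:1 compression")  # ~597:1
```

In other words, the 18 Gbit/s and 25 Mbit/s figures are not in contradiction: one is a raw link budget, the other is a compressed delivery rate, separated by a few hundred to one of codec compression.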
> Still, that growth in bandwidth requirement does not say anything about the latency requirement. That can be found elsewhere, and it is probably very little related to TV.
>
> Alex
>
> > , but all that many of us were trying to achieve while talking to the FCC (et al) was to point out that, in order to really make it bulletproof and usable not only for the near future but for today, a reasonable Quality of Experience requirement needs to be added to the definition of broadband. Here is the link to the FCC NOI and related discussion:
> > https://circleid.com/posts/20231211-its-the-latency-fcc
> >
> > Hopefully, we have managed to get that message over to the other side. At least 2 of the 5 FCC Commissioners seem to be getting it - Nathan Simington and Brendan Carr - and Nathan even arranged for his staffers to talk with Dave and others. I hope this line of cooperation will continue and we will manage to help the rest of the FCC understand the issues at hand correctly.
> >
> > All the best,
> >
> > Frank
> >
> > Frantisek (Frank) Borsik
> > https://www.linkedin.com/in/frantisekborsik
> > Signal, Telegram, WhatsApp: +421919416714
> > iMessage, mobile: +420775230885
> > Skype: casioa5302ca
> > frantisek.borsik@gmail.com
>
> On Thu, May 2, 2024 at 4:47 PM Colin_Higbie via Starlink <starlink@lists.bufferbloat.net> wrote:
>
>> Alex, fortunately, we are not bound to use personal experiences and observations on this. We have real market data that can provide an objective, data-supported conclusion. No need for a chocolate-or-vanilla-ice-cream-tastes-better discussion on this.
>>
>> Yes, cameras can film at 8K (and higher in some cases). However, at those resolutions (with exceptions for ultra-high-end cameras, such as those used by multi-million-dollar telescopes), except under very specific conditions, the actual picture quality doesn't vary past about 5.5K.
>> The loss of detail simply moves from a consequence of too few pixels to the optical and focus limits of the lenses. Neighboring pixels simply hold a blurry image, meaning they don't actually carry any usable information. A still shot with a 1/8-second exposure can easily benefit from an 8K or higher sensor. Video sometimes can, under bright lights with a relatively still or slow-moving scene. Neither of those conditions lends itself to typical home video at 30 (or 24) frames per second - that's only about 0.033s of exposure time per frame. We can imagine AI getting to the point where it can compensate for the lack of clarity, and this is already being used for game rendering (e.g., Nvidia's DLSS and Intel's XeSS), but that requires training per scene in those games, and there hasn't been much development work done on this for filming, at least not yet.
>>
>> Will sensors (or AI) improve to capture images faster per amount of incoming photons, so that effective digital shutter speeds can get faster at lower light levels? No doubt. Will it materially change video quality so that 8K is a similar step up from 4K as 4K is from HD (or as HD was from SD)? No, at least not in the next several years. Read on for why.
>>
>> So far, that was all on the production side. But what about the consumer side? Mass-market TV sizes max out below about 100" (83" seems to be a fairly common large size, but some stores carry larger models). Even the large sizes that do reach mass-market locations and are available on Amazon still comprise a very small percentage of total TV sales. The vast, vast majority of TV sales are of sub-70" models. This is not just because of pricing, though that's a factor. It's also because home architecture had not considered screens this big.
>> At these sizes, it's not just a matter of upgrading the entertainment-console furniture; it's a matter of building a different room with a dedicated entertainment wall. There is a lot of inertia in architecture and building that prevents this from being a sudden change, not to mention the hundreds of millions of existing homes already sized for TVs below 100".
>>
>> And important to this discussion: at several feet from even a 70"-90" screen, most people can't see the difference between 4K and 8K anyway. The pixels are too small at that distance to make a difference in the user experience. This contrasts with the step from HD to 4K, which many people (not all) can see, or from SD to HD, an improvement virtually everyone can see (to the point that news broadcasts now blur the faces of their anchors to remove wrinkles that weren't visible back in the SD days).
>>
>> For another real-world example of resolution growth being curtailed: smartphones raced to higher and higher resolutions until they reached about 4K, then started pulling back. Some are slightly higher, but as often as not, even at the flagship level, many smartphones fall slightly below 4K, with the recognition that customers got wise to screens all being effectively perfect, so higher resolutions no longer mattered.
>>
>> Currently, the leading contender for anything appearing at 8K is games, not streaming video. That's because games don't require camera lenses and light sensors that don't yet exist. They can render dimly lit, fast-moving scenes in 8K just as easily as brightly lit scenes. BUT (huge but here), GPUs aren't powerful enough to do that yet either at good framerates, and for most gamers (not all, but a significant majority), framerate is more important than resolution.
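At a fixed GPU pixel throughput, that framerate-vs-resolution tradeoff is roughly arithmetic. A hedged sketch (my own illustrative figures, assuming a card that sustains 4K at 120 fps; real GPU scaling is not perfectly linear):

```python
# Illustrative sketch (not a benchmark): at a fixed pixel throughput,
# quadrupling the pixel count divides the achievable framerate by four.

def pixels_per_frame(width: int, height: int) -> int:
    return width * height

# Assume a GPU that sustains 4K at 120 fps; its pixel budget per second:
budget = pixels_per_frame(3840, 2160) * 120

# The same budget spent on 8K (4x the pixels) yields a quarter the framerate:
fps_8k = budget / pixels_per_frame(7680, 4320)
print(fps_8k)  # 30.0 -- fine for video playback, far below a gamer's target
```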
>> Top-of-the-line graphics cards (the ones that run about $1,000, so not mainstream yet) of the current generation are just hitting 120fps at 4K in top modern games. From a pixel-throughput perspective, that would translate to 30fps at 8K (4x the number of pixels, so 120/4 = 30). 30fps is good enough for streaming video, but not for a gamer accustomed to 4K at 120fps. Still, I anticipate (this part is just my opinion, not a fact) that graphics cards in high-end gaming PCs will drive 8K experiences for gamers before 8K streaming becomes an in-demand feature. Games have HUDs and are often played on monitors just a couple of feet from the gamer, where ultra-fine details would be visible and relevant.
>>
>> Having said all of that, does this mean I don't think 8K and higher will eventually replace 4K for mass-market consumer streaming? No, I suspect that in the long run you're right that they will. That's a reasonable conclusion based on the history of screen and TV programming resolutions, but that timeframe is likely more than 10 years off, and planning bandwidth requirements for the needs 10 years from now does not require any assumptions about the standard video resolutions people will be watching then: we can all assume with reasonable confidence, based on the history of Internet bandwidth usage, that bandwidth needs and desires will continue to increase over time.
>>
>> The point for this group is that you lose credibility with the audience if you base your reasoning on future video resolutions that the market is currently rejecting, without at least acknowledging that those are projected future needs rather than present-day needs.
>>
>> At the same time, 4K is indeed a market standard TODAY. That's not an opinion; it's a data point and a fact.
>> As I've said multiple times in this discussion, what makes this a fact and not an opinion is that millions of people choose to pay for access to 4K content and to the television programs and movies that are stored and distributed in 4K. All the popular TV devices and gaming consoles support 4K HDR content in at least some versions of the product (they may also offer discounted versions that don't do HDR or only go to 1080p or 1440p). The market has spoken and delivered us that data. 4K HDR is the standard for videophiles and popular enough that the top video streaming services all offer it. It is also not in a chaotic state, with suppliers providing different technologies until the market sorts out a winner (like the old Blu-ray vs. HD DVD fight 15 years ago, or VHS vs. Beta before that). Yes, there are some variants on HDR (Dolby Vision vs. HDR10), but as TVs are manufactured today, Dolby Vision is effectively just a superset of HDR10, like G-Sync is a superset of Adaptive-Sync for the variable-refresh-rate displays needed for gaming. So, yes, 4K HDR is a standard, whether you buy a Blu-ray UHD movie at Walmart or Best Buy or stream your programming from Netflix, Disney+, Max, or Amazon Prime.
>>
>> So again, this is why the minimum rational top bandwidth any new ISP should be developing (at least in developed countries - I think it's fair to say that if people have no Internet access within hundreds of miles, even slow Internet for connectivity to a local library within travel distance from home is far better than nothing) is 25Mbps, the established bandwidth required by the 4K providers to stream 4K HDR content. This does not mean more would not be better or that more won't be needed in the future.
>> But if you are endorsing ISP buildout focused on low latency under load at anything LESS THAN 25Mbps, you have simply shifted the problem for customers and users of the new service from poor latency (this group's focus) to poor bandwidth incapable of providing modern services.
>>
>> To be taken seriously and to maximize your chances of success at influencing policy, I urge this group's members to use that 25Mbps top bandwidth as a floor. And to clarify my meaning, I don't mean ISPs shouldn't also offer less expensive tiers of service with bandwidth of only, say, 3 or 10Mbps. Those are fine and will be plenty for many users, and a lower-cost option with less capability is a good thing. What I mean is that if they are building out new service, the infrastructure needs to support, and they need to OFFER, a level of at least 25Mbps. Higher is fine too (better, even), but where cost collides with technical capability, 25Mbps is the market requirement; below that, the service offering fails to provide a fully functional Internet connection.
>>
>> Sorry for the long message, but I keep seeing a lot of these same subjective responses to objective data, which concerns me. I hope this long version finally addresses all of them, and I can now return to just reading the brilliant posts of the latency and TCP/IP experts who normally drive these discussions. You are all far more knowledgeable than I am in those areas. My expertise is in what the market needs from its Internet connectivity and why.
>>
>> Cheers,
>> Colin
>>
>>
>> -----Original Message-----
>> From: Starlink <starlink-bounces@lists.bufferbloat.net> On Behalf Of starlink-request@lists.bufferbloat.net
>> Sent: Thursday, May 2, 2024 5:22 AM
>> To: starlink@lists.bufferbloat.net
>> Subject: Starlink Digest, Vol 38, Issue 13
>>
>> Today's Topics:
>>
>>    1. Re: It's the Latency, FCC (Alexandre Petrescu)
>>
>> ----------------------------------------------------------------------
>>
>> Message: 1
>> Date: Thu, 2 May 2024 11:21:44 +0200
>> From: Alexandre Petrescu <alexandre.petrescu@gmail.com>
>> To: starlink@lists.bufferbloat.net
>> Subject: Re: [Starlink] It's the Latency, FCC
>> Message-ID: <94ba2b39-1fc8-46e2-9f77-3b04a63099e1@gmail.com>
>> Content-Type: text/plain; charset=UTF-8; format=flowed
>>
>> Le 30/04/2024 à 22:05, Sebastian Moeller via Starlink wrote:
>> > Hi Colin,
>> > [...]
>> >
>> >> A lot of responses like "but 8K is coming" (it's not, only
>> >> experimental YouTube videos showcase these resolutions to the general
>> >> public; no studio is making 8K content and no streaming service
>> >> offers anything in 8K or higher)
>> > [SM] Not my claim.
>>
>> Right, it is my claim. '8K is coming' comes from the observation that the ability to film 8K has been present in consumer cameras for a few years now.
>>
>> The SD-HD-4K-8K-16K consumer-market tendency can be evaluated. One could parallel it with the evolution of megapixel counts (photo cameras) or of micro-processor feature sizes. There might be a levelling-off, but I am not sure it is at 4K.
>>
>> What I would be interested to look at is the next acronym that requires high bandwidth and low latency and that is not in the SD-HD-4K-8K-16K series. This series did not exist in the times of analog TV ('SD' appeared when digital TV 'HD' appeared), so probably a new series will appear to describe TV features.
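For reference on how the SD-HD-4K-8K-16K series above scales, a small sketch (typical pixel dimensions; the 16K entry is a hypothetical extrapolation, not a shipping standard):

```python
# Pixel counts for the resolution series discussed above. From HD upward,
# each step doubles both dimensions, i.e. quadruples the pixel count, so
# raw bandwidth pressure grows geometrically even as codecs improve.

resolutions = {
    "SD (576p)":  (720, 576),
    "HD (1080p)": (1920, 1080),
    "4K UHD":     (3840, 2160),
    "8K UHD":     (7680, 4320),
    "16K":        (15360, 8640),   # hypothetical next step
}

for name, (w, h) in resolutions.items():
    print(f"{name:11s} {w * h / 1e6:6.1f} Mpixels")
```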
>>
>> Alex
>>
>> >
>> >> and "I don't need to watch 4K, 1080p is sufficient for me,
>> > [SM] That however is my claim ;)
>> >
>> >> so it should be for everyone else too"
>>
>> _______________________________________________
>> Starlink mailing list
>> Starlink@lists.bufferbloat.net
>> https://lists.bufferbloat.net/listinfo/starlink