[Starlink] It’s the Latency, FCC

Sebastian Moeller moeller0 at gmx.de
Sat Mar 16 15:32:49 EDT 2024


Hi Colin,


> On 16. Mar 2024, at 20:10, Colin_Higbie via Starlink <starlink at lists.bufferbloat.net> wrote:
> 
> Just to be clear: 4K is absolutely a standard in streaming,

[SM] Over here the lower-priced streaming tiers tend to offer only 1080p, not 4K, so 4K is certainly a standard, but not the standard in streaming, no?

> with 4K being the most popular TV sold today. 8K is not and likely won't be until 80+" TVs become the norm. The few 8K streaming videos that exist are available primarily as YouTube curiosities, with virtually no displays on the market yet that support it, and none of the big content providers like Netflix or Disney+ provide 8K streams.
> 
> Virtually all modern streaming programming on Netflix and Disney+ is 4K HDR. That is the standard to support.
> 
> The objective quality difference to the average human eye between SD and HD is huge and changes whether you see the shape of a face or the detailed expression on a face. Completely different viewing experience. The difference between HD and 4K is significant on today's larger TV displays (not so visible on the smaller displays that populated living rooms in prior decades).

[SM] This really is a function of pixel size and viewing distance: the human retina has a fixed resolution in degrees of visual angle (around 1-2 minutes of arc, depending on context), so it is a simple exercise in trigonometry to figure out at what distance a given screen resolution becomes coarser or finer than the retina's resolution... But this is far less important than one tends to think; we got by with SD resolution for decades and people still enjoyed the (low-pass-filtered) content.
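To make that trigonometry exercise concrete, here is a small illustrative sketch (my own, not from the thread; it assumes a 16:9 panel and 1 arcminute of visual acuity):

```python
import math

def retina_distance_m(diagonal_in: float, h_pixels: int,
                      acuity_arcmin: float = 1.0) -> float:
    """Distance (in metres) at which one pixel of a 16:9 screen
    subtends `acuity_arcmin` of visual angle; viewed from farther
    away than this, the pixel grid is finer than the retina can
    resolve."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)
    pixel_pitch_m = width_m / h_pixels
    return pixel_pitch_m / math.tan(math.radians(acuity_arcmin / 60))

# On a 65" panel, 4K pixels blend at roughly 1.3 m and 1080p at
# roughly 2.6 m -- sit farther back than that and the extra
# resolution is invisible anyway.
print(retina_distance_m(65, 3840), retina_distance_m(65, 1920))
```

So whether 4K beats HD in practice depends entirely on whether the couch is inside or outside that distance.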

> On an OLED TV (not so much on an LCD) the difference between SDR and HDR is bigger than the difference between HD and 4K. But because HDR generally comes with 4K and tends not to be used much on HD streams, the real standards to contrast are HD (in SDR) and 4K in HDR.

[SM] Not 100% sure; there is plenty of non-HDR 4K material out there, and e.g. our Apple TVs defaulted to 4K SDR (though that might depend on the TV they were first connected to).

> 
> The minimum bandwidth needed to reliably provide a 4K HDR stream is about 15Mbps. Because of the way video compression works, a simpler scene may get by with less than 10Mbps. A complex scene (fire, falling confetti like at the end of the Super Bowl) can push this up to near 20Mbps. Assuming some background activity on a typical network, safest is to think of 20Mbps as the effective minimum for 4K. Netflix says 25Mbps to add an extra safety margin.

[SM] Netflix recommendations like >= 15 Mbps for 4K cover the whole internet link, so they are already slightly generous...
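Putting the numbers from this exchange side by side, a toy headroom calculation (the 20 Mbps compression peak and 5 Mbps of background traffic are illustrative figures from the discussion above, not measurements):

```python
def link_headroom_mbps(link_mbps: float,
                       stream_peak_mbps: float = 20.0,
                       background_mbps: float = 5.0) -> float:
    """Capacity left on the link once one 4K stream at its
    compression peak and some background traffic are accounted
    for; a negative result means the stream must degrade."""
    return link_mbps - (stream_peak_mbps + background_mbps)

# Netflix's 25 Mbps recommendation is exactly peak + background:
print(link_headroom_mbps(25.0))  # 0.0
```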

> True that latency doesn't matter much for streaming. For streaming, unlike VoIP, video conferencing, and gaming, bandwidth is more important.

[SM] It is fast-forwarding, fast-reversing, and jumping around in the timeline where latency starts to become immediately perceptible, but those actions (at least for me) are not that common.

> 
> VoIP, Video conferencing, and gaming drive low-latency use cases (web browsing is also affected, but as long as the page starts to appear w/in about 1s and has mostly completed within about 5s, users don't notice the lag, which is why even geosync satellite Internet with its several hundred ms latency can be acceptable for browsing). 
> 
> Video conferencing drives high-upload (5Mbps minimum) use cases.
> 
> 4K streaming drives high-download (20Mbps per user or per stream with some safety and overhead) use cases. 
> 
> These are all valid and important overall in architecting needs for an ISP, but not all will necessarily be important to every user.

[SM] Yes, but ISPs need to do some planning ahead and hence, by necessity, need to operate with "average users" that average over all the differences in individual requirements and use cases.
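A sketch of what such "average user" planning could look like (the concurrency fraction and per-user rate here are hypothetical placeholders, not real ISP data):

```python
def peak_aggregate_mbps(subscribers: int,
                        peak_active_fraction: float = 0.4,
                        mbps_per_active_user: float = 25.0) -> float:
    """Estimate peak-hour aggregate demand by averaging over all
    users: only a fraction are active at once, each drawing about
    one 4K stream (with safety margin) worth of bandwidth."""
    return subscribers * peak_active_fraction * mbps_per_active_user

# 1000 subscribers -> about 10 Gbps of peak-hour backhaul demand
print(peak_aggregate_mbps(1000))
```

The individual extremes (the gamer, the 8K enthusiast) disappear into those averages, which is exactly the point of the "average user" abstraction.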

> 
> Cheers,
> Colin
> 
> -----Original Message-----
> From: Starlink <starlink-bounces at lists.bufferbloat.net> On Behalf Of starlink-request at lists.bufferbloat.net
> Sent: Saturday, March 16, 2024 1:37 PM
> To: starlink at lists.bufferbloat.net
> Subject: Starlink Digest, Vol 36, Issue 20
> 
>> ...
> 
> I think the 4K-latency discussion is a bit difficult, regardless of how great the codecs are.
> 
> For one, 4K can be considered outdated for those who look forward to 8K and why not 16K; so we should forget 4K.  8K is delivered from space already by a Japanese provider, but not over IP.  So, if we discuss TV resolutions we should look at these (8K, 16K, and why not 3D 16K for ever more stress testing).
> 
> Second, 4K etc. are for TV.  In TV, latency is rarely if ever an issue.  There are some rare cases where latency is very important in TV (I could think of betting on sports, time synchronization of clocks), but they don't require latency as low as our typical videoconference, remote surgery, or group music-playing use cases on Internet via Starlink.
> 
> So, I don't know how much 4K, 8K, or 16K might impose any new latency requirement on Starlink.
> 
> Alex
> 
> Date: Sat, 16 Mar 2024 18:21:48 +0100
> From: Alexandre Petrescu <alexandre.petrescu at gmail.com>
> To: starlink at lists.bufferbloat.net
> Subject: Re: [Starlink] It’s the Latency, FCC
> Message-ID: <d04bf060-54e2-4828-854e-29c7f3e3de98 at gmail.com>
> Content-Type: text/plain; charset=UTF-8; format=flowed
> 
> I retract the message, sorry; it is true that some teleoperation and videoconferencing also use 4K. So latency is important there too.
> 
> A videoconference in 8K or 3D 16K might have latency requirements too.
> 
> _______________________________________________
> Starlink mailing list
> Starlink at lists.bufferbloat.net
> https://lists.bufferbloat.net/listinfo/starlink
