Monday, December 30, 2019

Bandwidth to edge devices: when is enough really enough?

Hopefully this makes sense. I understand that server bandwidth inside a data center will always keep increasing to handle the workload of thousands of simultaneous connections. But traditionally, data center backbones run at far GREATER speeds than the end devices (workstations, etc.) they serve.

With that said, what I'm trying to think through is: will we ever reach a bandwidth speed to end devices that can accommodate any application, piece of software, connection, or resolution they could possibly need? Put another way: even the most bandwidth-intensive application at the highest resolution possible, what does that look like from an end user's bandwidth perspective?

I would think the biggest factor in figuring out what speed will be needed (10 Gig, 100 Gig, etc.) is the resolution of the application being used: 4K, 8K, 16K, virtual reality?
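
For a rough sense of scale, here's a back-of-the-envelope sketch in Python. Everything in it is an assumption on my part, not a published spec: 24-bit color, 60 fps for flat video, stereo 90 fps for the hypothetical VR case, and a generic 200:1 codec compression ratio (real codecs like HEVC or AV1 vary enormously with content and quality target).

# Back-of-the-envelope video bitrate estimates.
# Assumptions (mine, not from any spec): 24 bits per pixel (8-bit RGB),
# 60 fps for flat video, stereo 90 fps for the VR case, 200:1 compression.

def raw_bitrate_gbps(width, height, fps, bits_per_pixel=24, views=1):
    """Uncompressed bitrate in Gbit/s: pixels x bit depth x frame rate x views."""
    return width * height * bits_per_pixel * fps * views / 1e9

cases = {
    "4K  (3840x2160 @ 60fps)":      raw_bitrate_gbps(3840, 2160, 60),
    "8K  (7680x4320 @ 60fps)":      raw_bitrate_gbps(7680, 4320, 60),
    "16K (15360x8640 @ 60fps)":     raw_bitrate_gbps(15360, 8640, 60),
    "VR  (2 eyes x 4K @ 90fps)":    raw_bitrate_gbps(3840, 2160, 90, views=2),
}

for name, gbps in cases.items():
    # Delivered streams are a small fraction of the raw figure thanks to
    # compression; 200:1 here is an assumed, middle-of-the-road ratio.
    print(f"{name}: {gbps:6.1f} Gbit/s raw, ~{gbps / 200 * 1000:6.1f} Mbit/s compressed")

Running that gives roughly 12 Gbit/s raw for 4K, about 48 for 8K, about 191 for 16K, and about 36 for the stereo VR case; at the assumed 200:1 ratio, even 16K lands near 1 Gbit/s delivered. If those assumptions are anywhere close, a 10 Gig link already has headroom for any single compressed stream on this list.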

Have there been any bandwidth tests using these resolutions? I know YouTube now supports 4K video, but I have no idea how to find any information on 16K or virtual reality bandwidth requirements.

Does anyone have any clue where to find this information?
