Wednesday, December 23, 2020

Why do we need a certain amount of bandwidth when transmitting a signal in a wireless network?

I come from a CS background, so my EE and signal theory is not that great. I was studying wireless networks (take the old 802.11b standard, for example), where we have a bandwidth of 20 MHz per channel and a carrier signal at roughly 2.4 GHz, depending on the channel. Imagine a basic scenario with a simple modulation where one high frequency within the bandwidth represents a 1 bit and a low frequency represents a 0. My question is: why do we need all 20 MHz? Do we actually use all of it just by switching between these two frequencies? And if we take more advanced modulations, like amplitude or phase modulation, do we still need all 20 MHz just to change the amplitude? Shouldn't changing the transmit power change the amplitude? How are frequencies related to changing amplitude?
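To make the toy scheme concrete, here is a minimal NumPy sketch of the two-frequency (2-FSK-style) idea I have in mind. The sample rate, tone frequencies, and bit rate are made-up stand-ins for illustration, not real 802.11b parameters:

```python
# Minimal sketch of the toy scheme: bit 1 -> tone f1, bit 0 -> tone f0.
# All numbers are scaled-down placeholders, not real 802.11b values.
import numpy as np

fs = 1_000_000             # sample rate, Hz (arbitrary for the sketch)
f0, f1 = 100_000, 200_000  # "low" tone for 0 bits, "high" tone for 1 bits
bit_rate = 50_000          # bits per second
samples_per_bit = fs // bit_rate

def fsk_modulate(bits):
    """Emit a burst of f0 or f1 for each bit, back to back."""
    t = np.arange(samples_per_bit) / fs
    return np.concatenate(
        [np.sin(2 * np.pi * (f1 if b else f0) * t) for b in bits]
    )

signal = fsk_modulate([1, 0, 1, 1, 0])

# The part I lack intuition for: does abruptly switching between the two
# tones spread energy across the whole channel, or do we only ever
# "occupy" f0 and f1?
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, 1 / fs)
```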

Moreover, I remember a professor of an unrelated subject saying that "a rule of thumb, without any fancy modulation or anything, is that 1 Hz of bandwidth carries about 1 bit per second." But this still implies that we use all of the bandwidth, even when we are only using two frequencies (one high for 1, one low for 0). How does that work?
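Just to check that I'm reading the rule of thumb correctly, here is the back-of-envelope arithmetic it implies for the numbers above (the variable names are mine):

```python
# Back-of-envelope check of the "1 Hz ~ 1 bit/s" rule of thumb.
bandwidth_hz = 20e6              # the 20 MHz channel from the question
rate_bps = bandwidth_hz * 1.0    # 1 bit/s per Hz of bandwidth
print(f"rule of thumb: {rate_bps / 1e6:.0f} Mb/s")  # -> 20 Mb/s
# For comparison, 802.11b actually tops out at 11 Mb/s.
```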

Thanks!


