Hi,
I've lived in relative 5 GHz isolation for the past few years by using the DFS channels, which the FCC originally designated as unavailable for most residential wireless APs. Even after those channels were opened up for unlicensed use, most consumer APs were never updated to actually enable them.
Yesterday, I noticed an AT&T AP from one of my neighbors using the channel my AP was set to, 100 (5.500 GHz). I switched to 132 (5.660 GHz) just for my own sanity and went on with my day. Fast forward to today, and I noticed the AT&T box had jumped to 132 as well. I thought maybe the neighbor had changed it manually, but after I switched back to 100, the AT&T box followed within a few minutes. I repeated the switch to be sure, and sure enough, the same thing happened.
I'm not questioning the performance implications of sharing a channel; I realize co-channel sharing isn't a big performance hit. What I'm curious about is the question in the post title: why might this AT&T AP be programmed to jump to a channel already in use instead of one that's free? As a software engineer, I'm thinking it's poorly written code that just happens to result in this scenario playing out (something like the sketch below). However, a wireless engineer may recognize a deliberate reason behind it. I'm just curious.
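For what it's worth, here's the kind of bug I'm imagining. This is a purely hypothetical Python sketch of a naive auto-channel selector, not anything I actually know about AT&T's firmware: a routine that scores channels by observed occupancy, where a single inverted comparison (max instead of min) makes the AP chase the busiest channel instead of the quietest one.

```python
# Hypothetical auto-channel selector; illustrative only, not AT&T's firmware.
# One inverted comparison is all it takes to follow a neighbor around the band.

DFS_CHANNELS = [100, 104, 108, 112, 116, 120, 124, 128, 132, 136, 140]

def scan_occupancy(channel):
    """Pretend scan: number of foreign BSSIDs heard on the channel.
    Stub for illustration; a real driver would report RSSI/airtime too."""
    observed = {100: 1}  # e.g., my AP is currently on channel 100
    return observed.get(channel, 0)

def pick_channel_intended(channels):
    # What the author presumably wanted: the least-occupied channel.
    return min(channels, key=scan_occupancy)

def pick_channel_buggy(channels):
    # One-token bug: max instead of min. The AP now converges on whatever
    # channel its neighbor is using, and follows every time the neighbor moves.
    return max(channels, key=scan_occupancy)

print(pick_channel_intended(DFS_CHANNELS))  # 104 (an empty channel)
print(pick_channel_buggy(DFS_CHANNELS))     # 100 (the occupied one)
```

A bug this small would survive basic testing, too: in an empty RF environment every channel scores zero, and both versions behave identically.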
Any ideas?