Long story short: the previous guy planned wireless networks by pointing at the ceiling and saying “Gee, that looks like a good spot!” They also didn’t consider the range differences between 2.4 and 5 GHz.
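For context on that last point, the free-space path loss difference between the two bands works out to roughly 20·log10(5200/2400) ≈ 6–7 dB at any given distance, which is why 5 GHz cells are noticeably smaller than 2.4 GHz ones. A minimal sketch of the math (the channel centers and distance below are illustrative, not from my site):

```python
import math

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in meters, frequency in MHz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

# Same 20 m path, both bands (illustrative channel centers):
loss_24 = fspl_db(20, 2437)   # 2.4 GHz, channel 6
loss_5  = fspl_db(20, 5220)   # 5 GHz, channel 44
print(f"2.4 GHz: {loss_24:.1f} dB | 5 GHz: {loss_5:.1f} dB | delta: {loss_5 - loss_24:.1f} dB")
# ~6.6 dB worse at 5 GHz in free space, before wall attenuation,
# which penalizes 5 GHz even further.
```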
As of last week, I was baptized as the new “wireless guy” for my company, along with a complaint about a remote office’s WiFi quality. Due to budget constraints, we don’t have the travel funds to send me onsite. However, I was not surprised when I checked it out remotely (2 access points; L-shaped office; ~8,500 square feet; obviously huge 5 GHz coverage holes).
While gathering data remotely with an end user (I tried to do a basic passive site survey using a WiFi analyzer app), I noticed a disparity between the signal strength values from the client perspective and the AP perspective:
The controller/AP showed a client signal strength of -70 dBm.
The client showed an RSSI of -60 dBm.
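That 10 dB gap is consistent with an asymmetric link: the path loss is the same in both directions, but the AP typically transmits at higher power than the client radio, so each side “hears” the other at a different level. A rough sketch of the arithmetic (the TX powers and gains below are assumed for illustration, not measured values from my network):

```python
# One-way received power: RX = TX_power + antenna_gains - path_loss
path_loss_db = 80.0   # identical in both directions (channel reciprocity)

ap_tx_dbm     = 17.0  # assumed AP transmit power
client_tx_dbm = 7.0   # assumed phone/laptop transmit power (often much lower)
ant_gains_db  = 3.0   # combined antenna gains, same both ways

# What the client measures (downlink) vs. what the controller reports (uplink):
client_rssi = ap_tx_dbm + ant_gains_db - path_loss_db      # -60 dBm
ap_rssi     = client_tx_dbm + ant_gains_db - path_loss_db  # -70 dBm
print(f"client sees {client_rssi:.0f} dBm, controller sees {ap_rssi:.0f} dBm")
```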
Now on to my actual question:
In my research, I think I’ve learned that true passive site surveys are based on the client-side RSSI value.
What is the significance here as it relates to how I should interpret the data? Does the signal strength on the controller side “not matter at all” as long as the client-side RSSI is decent (-67 dBm)?
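To frame my own interpretation question: the common design rule of thumb targets roughly -67 dBm or better at the client for voice-grade coverage, since the client-side RSSI is what drives the client’s rate selection and roaming decisions. A toy grading of survey readings along those lines (the thresholds are the usual rules of thumb, not any standard):

```python
def grade_coverage(client_rssi_dbm: float) -> str:
    """Rough, rule-of-thumb grading of a client-side RSSI reading."""
    if client_rssi_dbm >= -60:
        return "excellent"
    if client_rssi_dbm >= -67:
        return "good (voice-grade)"
    if client_rssi_dbm >= -75:
        return "marginal (data only)"
    return "poor (expect drops)"

for reading in (-60, -67, -72, -80):
    print(f"{reading} dBm -> {grade_coverage(reading)}")
```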
I know there’s more to consider than just signal strength; with this question, I’m just trying to scope my thoughts as precisely as possible while also giving you all the relevant details about why I had an end user walking around with a WiFi analyzer app in the first place.