Hoping someone here might have some ideas for me.
I'm working on an issue where a wireless security camera at a customer's site, across a WAN from their data center, is experiencing intermittent interruptions to its stream. Running a Wireshark capture on the server that the RTP stream comes in to, I'm seeing an average inter-packet delta of about 150 ms, but every 5 minutes and 7 seconds there is a massive spike to over 3-4 seconds, then it drops back to the average.
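For what it's worth, the spike pattern is easy to flag programmatically once you export the packet arrival times. This is a rough sketch of how I'm looking at inter-packet deltas; the timestamps below are made up for illustration (in practice they'd come from a tshark/pyshark export of the RTP stream):

```python
# Sketch: flag inter-packet delta spikes in a list of packet arrival
# times (in seconds). Timestamps here are synthetic for illustration.
def delta_spikes(timestamps, threshold=1.0):
    """Return (index, delta) pairs where the gap between consecutive
    packets exceeds `threshold` seconds."""
    spikes = []
    for i in range(1, len(timestamps)):
        delta = timestamps[i] - timestamps[i - 1]
        if delta > threshold:
            spikes.append((i, delta))
    return spikes

# Synthetic stream: steady ~150 ms deltas with one ~3.5 s stall.
ts = [0.0, 0.15, 0.30, 0.45, 3.95, 4.10]
print(delta_spikes(ts))  # one spike at index 4, ~3.5 s
```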
I've got a debug log from the wireless controller (Aruba), and it looks like the controller sees the device as idle, waits 300 seconds, deauths it, and then the device immediately re-auths and reconnects. That whole process takes about 7 seconds from start to finish, by the looks of it.
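The numbers line up, assuming the 300-second idle timer plus the roughly 7-second deauth/re-auth cycle:

```python
# Sanity check: a 300 s idle timeout plus a ~7 s deauth/re-auth cycle
# matches the 5 min 7 s interval between delta spikes in the capture.
idle_timeout = 300   # Aruba idle timer per the debug log, in seconds
reauth_time = 7      # observed deauth -> reconnect duration, in seconds
interval = idle_timeout + reauth_time
minutes, seconds = divmod(interval, 60)
print(f"{minutes} min {seconds} s")  # 5 min 7 s
```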
I have used a wired camera at the same location on the same switch that the WAP connects through, and it does not exhibit the same issues.
I guess my question is: what could cause an Aruba WLC to believe that a device consistently streaming at between 400 kbps and 1.5 Mbps is idle?
The device uses H.264 for the video codec, audio is G.711, and the streams are interleaved. RTSP doesn't appear to be having issues, and the RTCP sender/receiver reports indicate there isn't a significant amount of loss for the most part. So it's really just delay being introduced into this stream.
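If it helps anyone reading along, the "fraction lost" field I'm reading out of the RTCP receiver reports is an 8-bit fixed-point value per RFC 3550, so converting it to a percentage looks roughly like this (the example byte value is made up):

```python
# Per RFC 3550, the RTCP receiver-report "fraction lost" field is the
# packets lost in the interval divided by the packets expected,
# expressed as an 8-bit fixed-point number with the binary point at
# the left edge, i.e. raw_byte / 256.
def fraction_lost_pct(raw_byte):
    return raw_byte / 256 * 100

# Hypothetical example: a raw value of 3 is a bit over 1% loss.
print(round(fraction_lost_pct(3), 2))  # 1.17
```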
Just trying to wrap my head around this. I've been spinning my wheels on it for a while trying to determine why this might be happening. Unfortunately I don't have direct access to the controller to grab logs myself. I'm using VLC to view the stream, which lets me direct the traffic to the system where I'm performing the capture.