Thursday, December 12, 2019

I think I'm misunderstanding something fundamental. Can someone help me understand why this broke? Here's the high-level design:

[Diagram: Network 1 and Network 2 both connected through the firewall, with Network 2 sitting behind Link 2]

There are several thousand clients on each network. In an effort to throttle the bandwidth available to Network 2, I statically set the interface on Link 2 to 100FDX (100 Mbps, full duplex). That link was constantly saturated, and whenever it was, Network 1 would turn to shit: lots of packet loss and latency jitter. To my understanding, each individual interface has its own buffer, right? Even if Link 2 filled its buffer, I would expect to see those issues only on that network. I never saw a large number of packets waiting in the firewall's global buffer.
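For a rough sense of the numbers, here's a back-of-the-envelope sketch in Python. The 1 MB buffer size is a made-up assumption (I don't know the actual buffer sizes on this firewall), but it shows how much queueing delay a saturated 100 Mbps link can add compared to gigabit:

```python
# Back-of-the-envelope queueing math. Assumptions: a hypothetical 1 MB
# per-interface buffer and full-size 1500-byte frames -- the real buffer
# sizes on this firewall are unknown to me.

BUFFER_BYTES = 1_000_000
FRAME_BYTES = 1500

for mbps in (100, 1000):
    bits_per_sec = mbps * 1_000_000
    # Time to serialize one frame onto the wire.
    frame_ms = FRAME_BYTES * 8 / bits_per_sec * 1000
    # Worst case: a packet arrives behind a completely full buffer.
    drain_ms = BUFFER_BYTES * 8 / bits_per_sec * 1000
    print(f"{mbps:>4} Mbps: {frame_ms:.3f} ms per frame, "
          f"~{drain_ms:.0f} ms to drain a full buffer")
```

So on paper, anything queued behind a full buffer on a saturated 100 Mbps link eats tens of milliseconds of delay (or gets dropped), while the same buffer drains ten times faster at gigabit.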

What gives?

After setting Link 2 back to auto (1000FDX, so no more bottleneck), all of my issues on Network 1 disappeared.
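As an aside on the throttling itself: the usual way to cap a network's bandwidth is with a shaper or policer rather than by forcing the link speed down. Below is a minimal token-bucket sketch in Python, purely to illustrate the idea; the class name, rate, and burst size are all invented for the example, and in practice this would be a QoS policy on the firewall (I'm not naming specific commands since the platform isn't stated here).

```python
import time

class TokenBucket:
    """Minimal token-bucket shaper: allows traffic up to rate_bps on
    average, with short bursts up to burst_bytes."""

    def __init__(self, rate_bps: float, burst_bytes: int):
        self.rate = rate_bps / 8.0          # refill rate in bytes per second
        self.capacity = burst_bytes
        self.tokens = float(burst_bytes)
        self.last = time.monotonic()

    def allow(self, packet_bytes: int) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at the burst size.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True                     # forward the packet
        return False                        # drop (or queue) the packet


# Example: cap traffic at 100 Mbps with a 64 KB burst allowance.
shaper = TokenBucket(rate_bps=100_000_000, burst_bytes=64_000)
print(shaper.allow(1500))                   # True: plenty of tokens for one frame
```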


