In the Cisco docs I noticed this:
Total Latency for bandwidths below 1 gigabit = (Delay*65536)/10, where 65536 is the wide-scale constant.
Total Latency for bandwidths above 1 gigabit = (10^7 * 65536/10) / Bw, where 65536 is the wide-scale constant.
Does that not result in a 1 Gbps and a 100 Mbps interface ending up with the same latency value?
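To put numbers on that, here is a minimal Python sketch of the two formulas exactly as quoted. The units are my assumption (delay in tens of microseconds, as in the classic EIGRP metric, and bandwidth in kbps), and the default interface delays used (100 usec for FastEthernet, 10 usec for GigabitEthernet) are the usual Cisco defaults:

```python
# A quick numeric check of the two latency formulas quoted above.
# Assumptions (not stated in the post): delay is taken in tens of
# microseconds, as in the classic EIGRP metric, and Bw is in kbps.
# The default interface delays used below (100 usec for FastEthernet,
# 10 usec for GigabitEthernet) are the usual Cisco interface defaults.

WIDE_SCALE = 65536  # the wide-scale constant from the quoted formulas


def latency_below_1g(delay_tens_of_usec: float) -> float:
    """Total latency for bandwidths below 1 gigabit: (Delay * 65536) / 10."""
    return delay_tens_of_usec * WIDE_SCALE / 10


def latency_above_1g(bw_kbps: float) -> float:
    """Total latency for bandwidths above 1 gigabit: (10^7 * 65536 / 10) / Bw."""
    return (10**7 * WIDE_SCALE / 10) / bw_kbps


# 100 Mbps interface: default delay 100 usec = 10 tens-of-usec units
print("100 Mbps:", latency_below_1g(10))         # -> 65536.0
# 1 Gbps interface: Bw = 1,000,000 kbps
print("1 Gbps:  ", latency_above_1g(1_000_000))  # -> 65536.0
```

Under those unit assumptions both expressions evaluate to 65536, which seems to be exactly what the question is getting at; if the delay were instead taken in plain microseconds, the 100 Mbps value would come out ten times larger than the 1 Gbps one.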
source: