Saturday, April 6, 2019

Why is ping the sum of send and receive times in gaming?

Suppose Machine1 sends a packet to Server1 in 30 ms and receives data from other computers, routed through the server, in 25 ms. We would then say that the latency between Machine1 and the server is 55 ms. But if the computer can send and receive simultaneously (which I am not sure about), then ping should be the maximum of the two latencies, i.e. 30 ms, because the incoming data would arrive within the same 30 ms window in which the outgoing data is being sent. Please clarify and explain whether a client can send and receive packets simultaneously or not.
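For reference, here is a minimal sketch of how a round-trip ping is commonly measured over UDP. The server address and port are hypothetical, and it assumes a simple echo server that sends each datagram straight back; the point it illustrates is that the reply to a single packet can only start its return trip after the outbound trip has finished, so the two legs of that one packet are timed back to back rather than in parallel.

```python
# Minimal sketch: measure "ping" as the round-trip time (RTT) of one packet.
# The server address below is hypothetical and assumes a UDP echo service.
import socket
import struct
import time

SERVER = ("203.0.113.10", 9999)  # hypothetical echo server

def measure_ping(sock: socket.socket) -> float:
    """Send one timestamped datagram and wait for its echo.

    The echo cannot leave the server until the request has arrived there,
    so for this single packet the outbound and return legs add up.
    """
    sent_at = time.perf_counter()
    sock.sendto(struct.pack("!d", sent_at), SERVER)
    sock.recvfrom(64)                      # blocks until the echo comes back
    received_at = time.perf_counter()
    return (received_at - sent_at) * 1000.0  # RTT in milliseconds

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
    s.settimeout(2.0)                      # give up if no echo arrives
    print(f"ping: {measure_ping(s):.1f} ms")
```

Note that this does not settle whether the network card can send and receive at the same time; it only shows why a per-packet round-trip measurement ends up as a sum of the two legs.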


