I am running a bandwidth test between a server-client pair along a one-hop path; the test tries to drive the path at maximal capacity. I am testing different TCP buffer sizes in order to find the one that gives the best client-side data rates. However, I noticed that for larger TCP buffer sizes the server-side data rates are several times higher than the client-side data rates, which seems unusual. For instance, with a 40 MB TCP buffer the server side reports data rates far above the roughly 30 Gbps I average on the client side.
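For reference, a minimal sketch of how a per-socket TCP buffer of this size might be requested (this assumes the buffer is set per socket via setsockopt on Linux; the actual tool and flags used in the test are not shown here):

```python
import socket

BUF_SIZE = 40 * 1024 * 1024  # 40 MB, matching the largest buffer in the test

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

# Request 40 MB send/receive buffers. On Linux the kernel clamps the request to
# net.core.wmem_max / net.core.rmem_max, so those sysctls must be raised as well.
sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, BUF_SIZE)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, BUF_SIZE)

# getsockopt() reports the effective size; Linux returns roughly double the
# requested value because it accounts for bookkeeping overhead.
print("effective SO_SNDBUF:", sock.getsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF))
print("effective SO_RCVBUF:", sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF))
```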
The server-side and client-side data rates appear approximately equal at TCP buffers of around 1 MB, where both average around 25-30 Gbps; at around 4-5 MB the behaviour starts to look odd, with the server side reporting data rates several times higher than the client side.
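To make the comparison concrete, here is a small loopback sketch of how the two rates might be measured independently; this is only an assumption about the methodology (the actual test tool isn't named above), with the sender counting bytes accepted by send() and the receiver counting bytes actually read:

```python
import socket
import threading
import time

# Hypothetical parameters; the actual tool and settings used in the test differ.
PORT = 5201
BUF_SIZE = 40 * 1024 * 1024   # 40 MB socket buffers, as in the reported run
CHUNK = 128 * 1024            # bytes handed to send()/recv() per call
DURATION = 5                  # seconds of sending

def receiver(results):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    # Set the receive buffer on the listening socket so the accepted socket inherits it.
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, BUF_SIZE)
    srv.bind(("127.0.0.1", PORT))
    srv.listen(1)
    conn, _ = srv.accept()
    received = 0
    start = time.time()
    while True:
        data = conn.recv(CHUNK)
        if not data:
            break
        received += len(data)
    results["receiver_gbps"] = received * 8 / (time.time() - start) / 1e9
    conn.close()
    srv.close()

def sender(results):
    cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    cli.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, BUF_SIZE)
    cli.connect(("127.0.0.1", PORT))
    payload = b"\x00" * CHUNK
    sent = 0
    start = time.time()
    while time.time() - start < DURATION:
        cli.sendall(payload)     # counts bytes accepted into the local send buffer
        sent += len(payload)
    results["sender_gbps"] = sent * 8 / (time.time() - start) / 1e9
    cli.close()                  # flushes whatever is still queued

results = {}
t = threading.Thread(target=receiver, args=(results,))
t.start()
time.sleep(0.2)                  # give the listener a moment to come up
sender(results)
t.join()

# The sender-side figure counts bytes queued locally, the receiver-side figure
# counts bytes actually delivered, so the two averages need not agree while a
# large socket buffer holds data in flight.
print(results)
```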
Thanks!