Hello everyone,
I am building a deterministic lock-step server for my mobile app. All the server does is take requests from players and forward them to every other player connected in the same room. There is no game logic on the server at all.
While benchmarking parts of the server to figure out how many players it can handle, I found something really disappointing.
It turns out that sending a packet is really expensive. On my server running on DigitalOcean, sending one packet takes a whopping 1,600 ticks! That seems incredibly costly.
I thought it might be because the server runs on Linux through Mono, so I also benchmarked on my Windows machine. There it takes roughly 800 ticks to send just one packet. That is still expensive if you are planning to have a few hundred players on one server.
For example, say it takes 1,600 ticks to send a packet and you send a packet 30 times a second (which means sending a packet every 33.33 milliseconds). Assuming 10,000 Stopwatch ticks per millisecond, 33.33 milliseconds = 333,300 ticks, and 333,300 / 1,600 ≈ 208. That means you can only have about 200 players connected at the same time if you want to be sure to send an update every 33.33 milliseconds.
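To make that back-of-the-envelope math easy to play with, here is a minimal C# sketch of the same calculation. Note that Stopwatch ticks per second is Stopwatch.Frequency and varies by machine, so the 10,000-ticks-per-millisecond figure above only holds if the frequency happens to be 10 MHz; the sketch uses the actual frequency instead. The 1,600-ticks-per-send number is just the figure from my DigitalOcean test.

using System;
using System.Diagnostics;

class TickMath
{
    static void Main()
    {
        // Numbers from the post: 1,600 ticks per send, 30 packets per second.
        const long ticksPerSend = 1600;
        const double sendsPerSecond = 30;

        // Frame budget in Stopwatch ticks. Stopwatch.Frequency is ticks per
        // second, so don't hard-code 10,000 ticks per millisecond.
        double frameBudgetTicks = Stopwatch.Frequency / sendsPerSecond;

        // Rough upper bound on players per frame if the send call were the only cost.
        double maxPlayers = frameBudgetTicks / ticksPerSend;

        Console.WriteLine($"Frame budget: {frameBudgetTicks:F0} ticks");
        Console.WriteLine($"Max players per frame: {maxPlayers:F0}");
    }
}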
That is a little disappointing for a server whose only job is to relay packets between players.
I'm wondering if you could help me by posting how many ticks it takes to send a packet on your server. You can easily check using the C# Stopwatch. For example, this is how I benchmarked mine:
long start = System.Diagnostics.Stopwatch.GetTimestamp();
// If it's the first datagram, begin the sending process
mSocket.BeginSendTo(buffer.buffer, buffer.position, buffer.size,
    SocketFlags.None, ip, OnSend, null);
Console.WriteLine("Send time: " + (System.Diagnostics.Stopwatch.GetTimestamp() - start));
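If you want to compare numbers, keep in mind that timing a single BeginSendTo call can be misleading (JIT compilation, first-use socket setup, and the fact that BeginSendTo only queues the send). Here is a rough sketch, assuming a plain UDP relay, that averages many synchronous sends and converts the result to microseconds via Stopwatch.Frequency. The loopback endpoint, port, and payload size are placeholders; point it at your own server.

using System;
using System.Diagnostics;
using System.Net;
using System.Net.Sockets;

class SendBenchmark
{
    static void Main()
    {
        // Placeholder endpoint; replace with your relay server's address and port.
        var target = new IPEndPoint(IPAddress.Loopback, 9000);
        var socket = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);
        var payload = new byte[64]; // roughly the size of a small lock-step packet

        const int iterations = 10000;
        long start = Stopwatch.GetTimestamp();

        for (int i = 0; i < iterations; i++)
        {
            // Synchronous SendTo keeps the measurement simple.
            socket.SendTo(payload, SocketFlags.None, target);
        }

        long elapsed = Stopwatch.GetTimestamp() - start;
        double ticksPerSend = (double)elapsed / iterations;
        double microsPerSend = ticksPerSend * 1000000.0 / Stopwatch.Frequency;

        Console.WriteLine("Average: " + ticksPerSend.ToString("F0") + " ticks (" +
            microsPerSend.ToString("F1") + " us) per send");
    }
}

Averaging over many sends keeps one-time costs from being counted against a single packet, so the numbers should be easier to compare across machines.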
Please also state what kind of server and hosting provider you have.
Thank you very much for reading this.