Thanks, Manmax, for your answer, but it missed my point a bit:
A few milliseconds of inaccuracy wouldn't be bad. But right now I get an inaccuracy of half a second, which is quite obvious at the start of a racing game. As I mentioned, the problem is that the ping stays stuck at the same high value. Seeing pings around 250 ms when the actual ping between two local instances should be around 16 ms is no surprise at first: the ping is measured at level start, so it is inflated by the level loading process. But I still don't get why the ping is not updated afterwards.
The game isn't far along yet (no heavy data processing), my computer here at work is remarkably fast, and the profiler data shows nothing suspicious.
//supplement
While writing this post I got an idea myself:
My initial approach was like this (like the quote in the first post): at level start, the server sent its time to the other player every half second while the countdown ran. The client then tried to calculate the offset between its own time and the server time, with the ping added on top. Since the ping never got updated, the offset was off by roughly twice the ping.
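For illustration, here is a minimal sketch of that kind of offset calculation (hypothetical names, not the actual TNet API), assuming the one-way latency is estimated as half the reported ping. It shows how a stale ping value feeds directly into the offset error:

```python
def estimate_offset(server_time_ms, client_recv_time_ms, ping_ms):
    """Estimated difference between the server clock and the client clock.

    The client receives the server's timestamp and assumes the message
    was in flight for half the reported round-trip ping.
    """
    one_way = ping_ms / 2.0
    return (server_time_ms + one_way) - client_recv_time_ms

# With a fresh ping (~16 ms) the offset comes out right:
accurate = estimate_offset(server_time_ms=1000.0,
                           client_recv_time_ms=1008.0,
                           ping_ms=16.0)   # -> 0.0

# With the stale 250 ms ping, the same packet produces an offset
# that is wrong by (250 - 16) / 2 = 117 ms:
stale = estimate_offset(server_time_ms=1000.0,
                        client_recv_time_ms=1008.0,
                        ping_ms=250.0)    # -> 117.0
```

The exact factor depends on where the ping is added, but the point stands: as long as the reported ping never refreshes, every offset estimate inherits the loading-time inflation.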
Now I let the client send (empty) RFC calls to the server at similar intervals. That way at least the server-side ping gets updated and now shows the expected 16-17 ms. The client-side ping still stays stuck at the high value and never updates. To work around this, I now simply add the ping twice on the server side. With that, the time is equal (enough) on both instances.
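The workaround could be sketched like this (again hypothetical names, not the real TNet calls): the client sends empty keep-alive RFCs so the server's ping measurement refreshes, and the server stands in for the client's stale ping by applying its own ping twice when it sends its time:

```python
import time

PING_INTERVAL_S = 0.5  # same half-second cadence as the countdown sync

def client_keepalive(send_rfc, should_stop):
    """Periodically fire an empty RFC so the server-side ping updates.

    send_rfc is a placeholder for whatever empty network call the
    framework provides; the payload does not matter.
    """
    while not should_stop():
        send_rfc()
        time.sleep(PING_INTERVAL_S)

def server_synced_time(server_time_ms, server_ping_ms):
    """Timestamp the server sends to the client.

    The server's own (fresh) ping is added twice, standing in for the
    client-side ping measurement that never updates.
    """
    return server_time_ms + 2 * server_ping_ms
```

With a fresh server-side ping of 16 ms, `server_synced_time(1000, 16)` yields 1032, i.e. the full round trip is compensated on the server alone. Whether doubling the server ping is exactly equivalent to the two half-trips depends on how symmetric the link is, which is probably why it only feels "equal enough".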
I am not totally happy with this solution, because it feels more like a hack (sending empty RFCs, relying only on the server-side ping) than a solid way to do it, but at least it works. I am going to test it with higher latency next.
Any hints or opinions are still welcome.
