ID:608661
Mar 31 2012, 8:33 am (Edited on Mar 31 2012, 12:10 pm)
|
When running a game locally in hardware mode, it maxes out at 60 FPS. If something in the game needs to happen faster than that (i.e., a bullet moving at 100 FPS), the engine seems to slow the entire game down to almost half speed to keep the bullet's speed proportional.
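The proportional slowdown described above can be sketched with a little arithmetic (this is an illustrative model, not actual engine code): if the renderer is capped at 60 FPS but one object needs a higher update rate, the whole world can only run at a fraction of real time.

```python
def world_speed_factor(required_fps, cap_fps=60):
    """Fraction of real-time speed the world runs at when one object
    needs a higher update rate than the renderer can deliver.
    (Hypothetical model of the behavior described in the post.)"""
    if required_fps <= cap_fps:
        return 1.0
    return cap_fps / required_fps

# A bullet that wants 100 updates/sec under a 60 FPS cap slows
# the whole world to 60% speed -- "almost half", as observed.
print(world_speed_factor(100))  # 0.6
```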
|
Nah, never mind that last post. Anyway, this is a local issue only. What kind of testing do you want me to do with network delays? Normally I have the delay set to 0.
|
If you try setting the delay to 1, 2, 3, etc., you may find a sweet spot where the server acts as it should and the client looks okay (although it will be skipping frames). I suspect the sweet spot will drop the client frame rate to 50 FPS at most; past that, I often start to see performance drop.
|
I've been running some tests on this myself, and I do see a performance change when changing .configure delay, so my theory holds up there. What I'm seeing in the code is that when the server bogs down in DS, it doesn't modify the delay to compensate. The reason is that the delay adjustment is based on messages piling up, but when the client and server are in the same process that never happens, so the delay is never adjusted in those cases.
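A rough sketch of the adjustment logic just described (names and thresholds are invented for illustration; this is not the actual engine code): the delay only rises when outbound messages back up, so a same-process client, whose messages are consumed immediately, never triggers it even when the server itself is bogged down.

```python
def adjusted_delay(current_delay, backlog, threshold=4):
    """Raise the delay one step whenever the outbound message backlog
    exceeds a threshold; otherwise leave it alone.
    (Hypothetical model of the backlog-based adjustment.)"""
    if backlog > threshold:
        return current_delay + 1
    return current_delay

# Networked client under load: backlog grows, so the delay rises.
print(adjusted_delay(0, backlog=10))  # 1
# Same-process client: the backlog is always 0, so the delay never
# changes -- even if the server is struggling.
print(adjusted_delay(0, backlog=0))   # 0
```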
|
In a game where the server doesn't need to do much, it might be feasible to simply lower the client's workload by rendering at a lower frame rate. One way to test this is with the .configure delay command, which should (even when hosting locally) alter the number of map messages sent to the client. If the client isn't getting map updates as frequently, its drawing workload should go down, freeing up time for server operations. If you can confirm this works, then I can probably come up with a system to adjust the network delay to account for DS-hosted worlds; however, it would be extremely helpful to have a demo.
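As a way to reason about the test above, here is a rough model (an assumption on my part, not the engine's actual formula) of how a delay setting could thin out the map updates the client receives: with delay d, suppose the client gets roughly one map message every d+1 server ticks.

```python
def client_update_rate(server_fps, delay):
    """Approximate map updates per second the client receives if a
    delay of d means one map message every d+1 ticks.
    (Hypothetical model -- not the engine's documented behavior.)"""
    return server_fps / (delay + 1)

# How the client's update rate would fall as the delay rises,
# assuming a 60 FPS server tick rate.
for d in range(4):
    print(d, client_update_rate(60, d))
```

If this model is anywhere near right, even a delay of 1 would halve the client's drawing workload, which is the kind of headroom the server would need.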