(See the best response by Ter13.)
So I'm trying to schedule one unit's attack on another unit to happen x time after the being-hit animation plays, but how can I calculate the animation time, and how can I be sure how that value will be rounded to milliseconds?


Animation frame delays are handled in milliseconds, just like every other measure of time in DM.
Best response
"Animation frame delays are handled in milliseconds, just like every other measure of time in DM."

No they aren't. Animation frame delays are measured in deciseconds.

1 standard tick = 1 decisecond = 0.1 seconds.

Server ticks are measured in standard ticks, ergo deciseconds.

40fps = 0.25 deciseconds per server tick = 0.025 seconds per server tick (1/40th of a second, or 1/4th of a standard tick)
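
As a quick sanity check of that math, here's a minimal sketch (the verb is hypothetical; the relationship world.fps = 10 / world.tick_lag is built into DM):

world
    fps = 40    // 40 server ticks per second

mob/verb/show_tick_lag()
    // world.fps and world.tick_lag are two views of the same setting:
    // world.fps = 10 / world.tick_lag, with tick_lag in deciseconds.
    // At 40fps this prints 0.25 ds (0.025 s).
    world << "tick_lag = [world.tick_lag] ds = [world.tick_lag / 10] s"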


As for timing in DM, you will never be totally sure when an animation happens on the client because of network latency. On the other hand, you can schedule the next animation to be sent roughly when the current one is intended to end, and given consistent latency, things will more or less line up.

As for rounding, anything < 1 frame is effectively 1 frame, and anything > 1 frame but < 2 frames is effectively 2 frames.

If an animation's delays are set in such a way that they don't align properly to the frame step on the client, you can wind up with frames skipping (a frame doesn't draw because the current time is already past that frame's end time).
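
To make the skipping concrete (hypothetical numbers): at client.fps = 40 the client draws every 25ms. Give an animation three frames of 0.16 deciseconds (16ms) each, so they end at 16ms, 32ms, and 48ms, while draws land at 0ms, 25ms, and 50ms. The first draw shows frame 1 and the second shows frame 2, but by the third draw (50ms) the current time is already past frame 3's end (48ms), so frame 3 never draws.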

The best thing you can do is understand how time is handled in DM and line up your animations properly, taking into account the number of frames each portion will take up. Always try to work in even multiples of the tick lag if you can, or otherwise tween according to multiples of the tick lag as well.
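
For instance, a minimal sketch of that idea (the proc name, the "hit" icon state, and the four-frame animation length are all made up for illustration):

mob/proc/attack_after_hit(mob/target)
    // Keep every frame delay an even multiple of the tick lag so
    // nothing falls between server ticks.
    var/frame_delay = world.tick_lag      // one server tick per frame, in deciseconds
    var/anim_length = 4 * frame_delay     // assumed 4-frame being-hit animation
    flick("hit", target)                  // play the being-hit state once
    sleep(anim_length)                    // wait out the animation (sleep resolves on server ticks)
    // ...resolve the follow-up attack here...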
In response to Ter13
Foiled again.
I never considered frame dropping, so I should use multiples of 10/client.fps for frame delays?

Well, with trial and error I can time the animations, but what I noticed is that sleep(numOfFrames*frameLag) is rounded up and I end up with a delay.

EDIT: Well, my issue seems to have been a very high world.fps; now that I've reduced it, I get far less lag and better sleep accuracy. Also, what exactly does client.fps do? I keep changing it and see no difference.
It's slightly more complicated than that. You need to round up to the nearest millisecond.

The correct math for your frame delays should be:

ceil(1000/client.fps)/100


Where ceil is:

#define ceil(x) (-round(-(x)))


Effectively that would make 60fps 0.17 deciseconds instead of 0.16 deciseconds.
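
Wrapped up as a proc (a sketch; frame_delay is a made-up name, and it relies on the ceil macro above):

proc/frame_delay(fps)
    // Smallest decisecond delay that doesn't undershoot the client's
    // frame step: round the per-frame milliseconds up, then convert.
    return ceil(1000 / fps) / 100

For example: frame_delay(60) == 0.17, frame_delay(40) == 0.25, and frame_delay(25) == 0.4.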