Descriptive Problem Summary:
The "time" variable of animate() disregards world.fps, so animations always play at 10 FPS.
Example: a world FPS of 40 results in 40 frames per second, or 40 ticks per second. This means a sleep(0.25) sleeps for exactly one tick. Using a while() loop that sleeps for tick_lag, I can update something at the world's full framerate.
This produces a smoother animation than the animate() procedure does.
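The tick math above can be sketched in DM. At world.fps = 40, world.tick_lag is 10/40 = 0.25, so sleep(world.tick_lag) waits exactly one tick. This is a minimal sketch, not code from the report; the proc name, the 32-pixel distance, and the default time are illustrative only:

```
// Sketch: per-tick update loop running at the world's full framerate.
// At world.fps = 40, world.tick_lag = 0.25, so each sleep() is one tick.
mob/proc/SlideRight(total_time = 10)
	var/steps = total_time / world.tick_lag   // 10 / 0.25 = 40 steps
	for(var/i in 1 to steps)
		pixel_x += 32 / steps                 // move 32 pixels over the full duration
		sleep(world.tick_lag)
```

At 40 FPS this yields 40 visual updates over a time of 10 (one second), whereas animate() with the same time reportedly yields only 10.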
Numbered Steps to Reproduce Problem:
Set world.fps to anything over 10 (40 or 60 suggested for testing).
Use an animation.
Expected Results:
The animation plays for the specified time at the framerate of the world (fps/tick_lag). A time of 10 at 40 FPS should result in 40 different animation frames.
Actual Results:
The animation's framerate is tied to the "time" variable. A time of 10 at 40 FPS produces only 10 frames.
Workarounds:
Use a while() loop together with the matrix Interpolate() procedure to produce similar results. By sleeping for tick_lag on each iteration, it's possible to replicate exactly what an animation should do at higher world FPS.
I would also suggest adding a "delay" argument to the animate() procedure that defaults to world.tick_lag. This would allow slower delays to be used when necessary.
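The workaround above can be sketched as follows. This is a hypothetical illustration, not code from the report; the proc name AnimateTransform and its parameters are invented, and it also demonstrates the suggested "delay" argument defaulting to world.tick_lag:

```
// Sketch of the workaround: manually stepping a transform each tick
// with matrix Interpolate(), sleeping for tick_lag per iteration.
// "delay" defaults to world.tick_lag, per the suggestion above.
mob/proc/AnimateTransform(matrix/target, time = 10, delay = world.tick_lag)
	var/matrix/start = transform
	var/elapsed = 0
	while(elapsed < time)
		elapsed += delay
		// Interpolate from the starting matrix toward the target,
		// clamping the blend factor at 1 on the final step.
		transform = start.Interpolate(target, min(elapsed / time, 1))
		sleep(delay)
```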
ID:1378785
Sep 17 2013, 5:35 am
Not a bug
Just posting a follow-up: Bravo1 has confirmed he was actually using 1205, not 1208 as previously thought. The bug that only showed animation when another map change occurred (in this case, most likely an icon animated with a 1/10s delay) was in play, causing the visual discrepancy.
As I said in the other thread, this is observation error only. The animations are done every "client tick", which is governed not by the 1/10s rule but by world.fps, and would only appear slower when the map drawing is skipping frames.
For the sake of due diligence, I've reviewed the entire process to make sure times can't be rounded off. From the time animate() is called to when it's transmitted to when it's used, the time var is only ever manipulated and stored as floating point, never an integer. But because sometimes analyzing the code isn't enough, I also ran this test:
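The actual test code is not included in this excerpt. A hypothetical reconstruction of a test matching the described behavior (a mob fading out and back in via chained animate() calls) might look like this; the verb name and times are assumptions:

```
// Hypothetical reconstruction of the described test, not the original code.
// Chained animate() calls queue sequential steps: fade to transparent,
// then back to opaque. If times were rounded to 1/10s increments, the
// fade would look like a blink instead of a smooth ramp at 40 FPS.
mob/verb/FadeTest()
	animate(src, alpha = 0, time = 5)    // fade out over half a second
	animate(alpha = 255, time = 5)       // then fade back in
```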
Again, this is in a 40 FPS project. If the times were being rounded off to 10 FPS, the mob would have blinked. Instead, it fades out and back in smoothly, though quickly. Posting output from the debugger confirmed that the animation is happening at the proper times.