world/fps = 10
client/fps = 60
What I'm expecting is that a pixel slide from one point to the next should take at least 0.1 s, since the next slide can't begin until the next world tick.
What I'm actually seeing is that the slide happens almost instantaneously when moving at low speeds (a few pixels per world tick). As speed increases, movement appears to become smoother, but at low speeds it's as if there's no interpolation at all.
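Here's the arithmetic behind that expectation, written as a throwaway verb (show_glide_math is just a hypothetical helper I'm using to illustrate the numbers, assuming glides are supposed to be interpolated across client frames):

mob/verb/show_glide_math()
    // sanity-check the numbers from the settings above; assumes client/fps = 60 was actually applied
    var/frames_per_tick = client.fps / world.fps    // 60 / 10 = 6
    src << "tick = [world.tick_lag / 10] s, ~[frames_per_tick] client frames per world tick"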
[GIF]
(I'm only recording at 30 FPS though)
I start off taking small, short steps, then start moving continuously.
As far as I can tell, there are maybe one or two frames of interpolation, instead of the ~6 frames I'd expect.
Next, I start taking bigger steps, and then move continuously at that larger step size.
The interpolation is much more apparent there and it actually looks close to 60 FPS (although still noticeably choppy).
Are clients not informed of world.fps for calculating pixel glides?
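One workaround I'm considering is setting glide_size manually. This is just a sketch based on my assumption that glide_size means pixels travelled per world tick while gliding (the default of 0 picks a speed automatically):

mob
    New()
        ..()
        // assumption: glide_size = pixels per world tick while gliding, so matching it
        // to step_size should stretch each step over exactly one tick (0.1 s here)
        glide_size = step_size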
Also, setting animate_movement to FALSE causes weird jumpiness when crossing tile boundaries.
[GIF]
It would be nice if that setting disabled client-side interpolation on a per-object basis.
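For reference, this is the setting I'm toggling (FALSE is the same as the built-in NO_STEPS constant):

mob
    animate_movement = NO_STEPS    // i.e. FALSE/0, instead of the default SLIDE_STEPS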