ID:260529
 
See [link]

I don't know whether the delay can actually be abused, but I do think it would be better if the developer had control over this delay.

Introducing client.tick_lag: the default value of -1 means the client decides what this value should be; any other value overrides the client's setting.

Perhaps even client.virtual_tick_lag, reporting the tick_lag currently set by the client: you could run a loop that checks whether this is out of certain bounds, and if so set client.tick_lag to 0 and then back to -1 to reset it.
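A rough sketch of that watchdog loop, assuming the proposed client.tick_lag and client.virtual_tick_lag vars existed (neither does today; the bound of 2 is just an example):

```dm
client
	New()
		..()
		spawn() WatchTickLag()

	proc/WatchTickLag()
		// Hypothetical: tick_lag and virtual_tick_lag are the proposed
		// vars from this request, not existing client vars.
		while(src)
			if(virtual_tick_lag > 2)	// client delay out of acceptable bounds
				tick_lag = 0			// force the client back in line
				sleep(1)
				tick_lag = -1			// then hand control back to the client
			sleep(10)
```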

Or you could, of course, take the easy way out and simply prevent the network delay from being set any higher than 50, though even that may still throw timing off.

-- Data
".configure delay" seems to be a purely client-side effect. To see this, try:

mob/Move()
	world << world.time
	return ..()


Then set the delay to 100 and move around. The times output will be correct; it's just that you (not the server) won't see any effect of the movement except once every ten seconds.
In response to Garthor
Garthor wrote:
".configure delay" seems to be a purely client-side effect.

Yeah, but couldn't it be abused if, say, I set mob.sight |= (SEE_TURFS|SEE_OBJS|SEE_MOBS), or something akin to that, for one second? With a long delay, the player could potentially keep seeing the expanded view for longer than intended, allowing them to work out exactly what surrounds them.
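For context, a temporary reveal like the one described might look like this (a sketch; the verb name and one-second duration are illustrative):

```dm
mob/verb/reveal()
	var/old_sight = sight
	sight |= (SEE_TURFS|SEE_OBJS|SEE_MOBS)
	spawn(10)				// 10 ticks = 1 second at the default tick_lag
		sight = old_sight	// the server revokes it, but a delayed client could display it longer
```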

-- Data
In response to Android Data
Perhaps, but I don't see that as being a huge issue. However, here's something similar that is a real problem, because the src setting is only handled client-side: with a high network delay, you can mash a verb that its src setting should only allow you to use once. For example:

obj/teleporter
	verb/teleport()
		set src in oview(1)
		loc = locate(1,1,1)


The fix would be to add the line "if(!(src in oview(1))) return" as a server-side check on the same condition.
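Putting that together, the teleporter verb with the server-side guard might look like this (same sketch as above, with the check added):

```dm
obj/teleporter
	verb/teleport()
		set src in oview(1)			// client-side filter only; a lagged client can bypass it
		if(!(src in oview(1)))		// server-side re-check; oview() defaults to usr's view
			return
		loc = locate(1,1,1)
```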