So I notice that if I put the FPS up, everything that was slow is now super hyper fast. How can I fix this so I don't have to go and edit every icon state, turf, etc. to slow it down?

The FPS is literally the frames per second; if your icon_state has something going for 1.5 frames and another for 3, they will be included in the equation (f/s) just the same way.

For the icons, you'll have to change the frames.
However, if you're wanting to change movement speed, you can simply change the step_size or go into the mob Move proc and add a delay.

mob
    var/tmp{moving = FALSE; delay = 5}
    Move()
        if(moving) return
        moving = TRUE
        spawn(delay) moving = FALSE
        ..()
In response to Konlet
Konlet wrote:
The FPS is literally the frames per second; if your icon_state has something going for 1.5 frames and another for 3, they will be included in the equation (f/s) just the same way.

For the icons, you'll have to change the frames.
However, if you're wanting to change movement speed, you can simply change the step_size or go into the mob Move proc and add a delay.

> mob
>     var/tmp{moving = FALSE; delay = 5}
>     Move()
>         if(moving) return
>         moving = TRUE
>         spawn(delay) moving = FALSE
>         ..()
>


No, they won't. The icon_state delay is independent of the FPS...

Also, your movement example is horrible. Use time comparison, not spawn(). And you're not even returning the value in Move()! Why would you suggest that? The example you gave could potentially break his existing code because of how dysfunctional it is.
icon_state delays are measured in ticks. Changing the FPS does not affect them unless the ticks are no longer divisible by the world's tick_lag. If the ticks are not divisible by tick_lag, you can get frameskipping.
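The divisibility rule described above can be sanity-checked with a quick sketch outside DM. This is Python, not DM, and the delay values are made up purely for illustration:

```python
def tick_lag(fps):
    """BYOND's tick_lag in 1/10s units: tick_lag = 10 / fps."""
    return 10 / fps

def divides_cleanly(delay, fps, eps=1e-9):
    """True if an icon_state frame delay (in 1/10s ticks) lands on a
    whole number of server ticks; otherwise frameskipping is possible."""
    ticks = delay / tick_lag(fps)
    return abs(ticks - round(ticks)) < eps

# At the default 10 FPS (tick_lag = 1), integer delays are always fine:
print(divides_cleanly(3, 10))    # True
# At 40 FPS (tick_lag = 0.25), a delay of 1.5 is exactly 6 server ticks:
print(divides_cleanly(1.5, 40))  # True
# At 40 FPS, a delay of 1.1 is 4.4 server ticks -> frameskipping risk:
print(divides_cleanly(1.1, 40))  # False
```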

As for your code snippet, Konlet, it's... well, there's no other way to put it, it's just completely wrong. The spawn is a bad idea, and it will completely break the way Move() is supposed to work because it returns the wrong values.

#define TILE_WIDTH 32 //you may have to adjust these
#define TILE_HEIGHT 32
#define FPS 40 //set to whatever your world.fps value is
#define TICK_LAG (10/FPS)

#define floor(x) round(x) //better rounding analogues
#define ceil(x) (-round(-(x)))

mob
    appearance_flags = LONG_GLIDE //make diagonal movement smoother
    var/tmp
        next_move = 0
        move_delay = 2.5 //4 tiles per second. You may need to make this evenly divisible by your FPS value.

    Move(atom/NewLoc, Dir=0)
        if(next_move > world.time) return 0
        var/frames = ceil(move_delay/TICK_LAG)
        var/speed = frames*TICK_LAG
        glide_size = TILE_WIDTH/frames
        . = ..()
        next_move = world.time + speed
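For what it's worth, the frames/speed/glide_size arithmetic in that snippet can be checked outside DM. Here's a rough Python equivalent (the function name and defaults are mine, not anything from BYOND):

```python
import math

def movement_params(move_delay, fps, tile_width=32):
    """Snap a desired move delay (in 1/10s units) to a whole number of
    server ticks and derive the matching glide_size, mirroring the
    arithmetic in the DM snippet above."""
    tick_lag = 10 / fps
    frames = math.ceil(move_delay / tick_lag)  # server ticks per tile move
    speed = frames * tick_lag                  # actual delay after snapping
    glide_size = tile_width / frames           # pixels glided per server tick
    return frames, speed, glide_size

print(movement_params(2.5, 40))  # (10, 2.5, 3.2)
```

So at 40 FPS a move_delay of 2.5 costs exactly 10 server ticks, and the mob glides 3.2 pixels per tick to cover a 32-pixel tile in that time.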


Using a spawn every time the user tries to move is a recipe for trouble, and setting a boolean like that is a second recipe for disaster even without mentioning the fact that your return values in Move() are all wrong.
You don't have to change your icons unless they're BYOND 3.0-era .dmi files. In those, the delay is measured in ticks. Nowadays, all icon delays are measured in units of 1/10s, a standard tick, so changing your FPS won't impact that at all.

What probably will change are things like gliding. If you're using gliding instead of pixel movement, you might want to lower glide_size. (And if you don't have a glide_size, you might want to set one, at least as a default, for all movables.)

If you delay movement to a certain speed, and use sleep() or spawn() to reenable movement, that shouldn't be a problem because those go by 1/10s units too. But if you use world.time, you'll need to adjust that because world.time counts ticks. [Edit: world.time is in fact counted in standard ticks, not server ticks.]
If you delay movement to a certain speed, and use sleep() or spawn() to reenable movement, that shouldn't be a problem because those go by 1/10s units too. But if you use world.time, you'll need to adjust that because world.time counts ticks.

Ummm... This bit. It's intensely misleading. I know what you mean, but it's extremely misleading to anybody that doesn't understand how the ticker works.

There is no distinction between ticks and 1/10s units: a 1/10s unit is a standard tick. When you change FPS, you begin using a non-standard tick.

All intervals of time are measured in 1/10ths of a second in BYOND. It's just that TICK_LAG can be measured in fractions of a 1/10th-second interval, because legacy is a bastard and it should never have been measured in 1/10ths of a second in the first place.
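The relationship is just tick_lag = 10/fps, so the default 10 FPS gives the standard 0.1s tick and anything else gives a "non-standard" one. A trivial Python sketch of the mapping:

```python
def tick_lag(fps):
    """world.tick_lag in 1/10s units for a given world.fps."""
    return 10 / fps

for fps in (10, 20, 40, 60):
    print(fps, tick_lag(fps))
# 10 -> 1.0 (the standard 0.1s tick), 20 -> 0.5, 40 -> 0.25, 60 -> ~0.1667
```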
In response to Ter13
I think we're on different wavelengths here. When I say "tick", I mean a single server tick, based on whatever tick_lag is set to. A standard tick is 0.1s.

Although I actually have to correct myself on one point: world.time is apparently measured in standard ticks, not server ticks.
Although I actually have to correct myself on one point: world.time is apparently measured in standard ticks, not server ticks.

...I don't think that's actually true, because dumping world.time every frame at 60fps gives me multiples of world TICK_LAG.

Either I'm not understanding something, or (more likely) discussing the issue of BYOND's ticker setup is ridiculous in the first place because it's one of those areas of the engine that is pure nonsense.

EDIT: Confirmed: world.time is 100% measured in server ticks, not 1/10th second ticks.
In response to Ter13
Ter13 wrote:
Although I actually have to correct myself on one point: world.time is apparently measured in standard ticks, not server ticks.

...I don't think that's actually true, because dumping world.time every frame at 60fps gives me multiples of world TICK_LAG.

That would be correct, then. If world.fps is 40, then I get multiples of 0.25 for world.time when I output it on every frame. So world.time is measured in standard ticks. If it were measured in server ticks, then each frame it would increment by exactly 1 regardless of FPS.

My revision to what I said earlier was because I looked at the code and saw world.time returns the internal value world_fticks, which is set as world_ms / 100.0 on every server tick.
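Under that description, world.time advances by tick_lag worth of 1/10s units on every server tick, which is why its values come out as multiples of tick_lag. A small Python simulation of that behavior (this is my own sketch of the described mechanism, not BYOND source):

```python
def world_time_samples(fps, n_ticks):
    """Simulated world.time after each server tick, per the description
    above: world.time = world_ms / 100, and each server tick advances
    world_ms by 1000/fps milliseconds."""
    tick_ms = 1000 / fps
    return [tick * tick_ms / 100 for tick in range(1, n_ticks + 1)]

print(world_time_samples(40, 4))  # [0.25, 0.5, 0.75, 1.0]
```

At 40 FPS each server tick adds 0.25 to world.time, i.e. a quarter of a standard tick, matching the multiples both of you observed.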

Maybe we need some more consistent terminology here, so instead of "standard tick" we could say "BYOND Time Units". Although BTU is taken.

[edit]
To clarify, when I say "server tick" I mean one actual tick on the server end, which happens faster or slower depending on world.tick_lag. A standard tick is 0.1s, a fixed unit of time.
or (more likely) discussing the issue of BYOND's ticker setup is ridiculous in the first place because it's one of those areas of the engine that is pure nonsense.

QFE
In response to Lummox JR
Maybe you could call it something like the 'Standard Time Resolution'.