ID:1045715
 
BYOND Version: 497
Operating System: Windows 7 Pro 64-bit
Web Browser: Chrome 23.0.1271.64
Applies to: DM Language
Status: Not a bug

This is not a bug. It may be an incorrect use of syntax or a limitation in the software. For further discussion on the matter, please consult the BYOND forums.
Setting world.fps to 55 or higher starts adding extra frames per second that were never requested.

For example, setting world.fps to 55 will actually result in world.fps being 56.

Setting world.fps to 54 gives you 54. Setting world.fps to 60 gives you 63.

This does not seem intended.
What BYOND game (or any 2D indie game, for that matter) runs at anything higher than ~35?
In response to Kumorii
Kumorii wrote:
What BYOND game (or any 2D indie game, for that matter) runs at anything higher than ~35?

Please stay on topic.
Cloud Magic wrote:
Could it be that anything over 55 FPS is pushing the engine to rev things so fast, that sometimes they rev too fast?

Considering the engine runs on DirectX, and DirectX is capable of whatever your hardware can handle, no.
I suspect it has something to do with rounding, but it's impossible to set it to 64 FPS, for example, which would give you pixel-perfect, frame-locked gliding in certain situations. That should be possible, both functionally and mathematically, but currently isn't.
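For context, here's the arithmetic behind the gliding claim (a hedged sketch: the 32-pixel tile is BYOND's default icon_size, and the 2-pixels-per-tick step is just an illustrative choice, not anything from this thread):

// Why 64 fps would be convenient for gliding (illustrative numbers only):
// at 2 pixels per tick, a 32-pixel tile takes exactly 16 ticks to cross,
// and 16 ticks at 64 fps is exactly 0.25 seconds -- no fractional frames.
mob/verb/glide_math()
    var/tile = 32     // BYOND's default icon_size
    var/step = 2      // hypothetical pixels moved per tick
    var/ticks = tile / step
    src << "[ticks] ticks per tile = [ticks / 64] sec at 64 fps"

The appeal of 64 specifically is that it divides power-of-two tile and icon sizes evenly.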
It can't possibly be working too hard for 60 frames per second. I can push my game to the max of 100 frames per second and it runs just fine. There's nothing about it that's working too hard. I get the same exact CPU usage I got at 30. It has to be a bug.
Metamorphman wrote:
Cloud Magic wrote:
Could it be that anything over 55 FPS is pushing the engine to rev things so fast, that sometimes they rev too fast?

Hahahaha are you serious? 'Rev too fast'? This is not a car engine.

FIREking wrote:
setting world.fps to 55 will actually result in world.fps being 56.

How did you find out what fps it was actually showing you?


world
    fps = 60

mob
    verb/announce_fps()
        world << "[world.fps]"
Sorry, I deleted my post because I answered the question for myself just a minute afterwards, haha!
But anyway, yeah, I just tried that out for myself. The associated tick_lag value still seems to be correct, though.
In response to Metamorphman
Metamorphman wrote:
Sorry, I deleted my post because I answered the question for myself just a minute afterwards, haha!
But anyway, yeah, I just tried that out for myself. The associated tick_lag value still seems to be correct, though.

Is it?
world
    fps = 64

mob
    verb/announce_fps()
        world << "[world.fps]"
        world << "[world.tick_lag]"
        world << "[10/64]"


Here's what this outputs:

67
0.15
0.15625

I guess this will just turn into a feature request for higher-resolution ticks, but they'll just say no, so it's probably pointless.
There's definitely a pattern here. I tested it with 60, 70, 80, 90, and 100. The "even" ones (60 and 80) each gained 3 FPS. The "odd" ones (70 and 90) each gained 1. 100 didn't change. The values in between still deviated from the FPS I specified, but I can't find any pattern for them...
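A quick way to map the whole range, if anyone wants to hunt for the pattern (a sketch that assumes world.fps can be reassigned at runtime; the verb name is made up for this example):

mob/verb/scan_fps()
    // Assumes world.fps is writable at runtime; restores the old rate after.
    var/old = world.fps
    for(var/n = 50 to 100)
        world.fps = n
        world << "requested [n] -> reported [world.fps] (tick_lag [world.tick_lag])"
    world.fps = old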
Sort of. I think the issue is that it's being cut off at one decimal point without the rounding (as you said), which causes problems. Though this may be intended.

world/tick_lag = 10/60 // fps = 60

client/verb/test()
    src << world.tick_lag
    src << 1/6
    src << world.fps
    src << 10/world.fps


output:

0.16
0.166667
63
0.15873

[EDIT] Beaten to the punch! :P
Blah. They're gonna blame it on Windows not giving accurate timing resolution (which is true, but in 2012 there are ways around it now). *sigh*
Windows does not in fact give truly accurate timer resolution, although it's not too bad generally. (There are other methods of timing, it's true, but for technical reasons they don't apply.)

The real reason you're seeing a discrepancy though has nothing to do with timer accuracy, and everything to do with how the var is actually stored (and passed to the client). The world.fps var is just a shorthand for world.tick_lag. Internally, tick_lag is always stored as milliseconds. Setting world.fps=60 is rounding that down to 16 ms, and when you read the var back again it's rounding to 63 fps.
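In other words (my own model of the arithmetic Lummox JR describes, not the actual engine code; it assumes the tick is floored to whole milliseconds on write and the reported fps is rounded to the nearest whole number on read):

mob/verb/model_fps(n as num)
    // Floor the requested rate to whole milliseconds, as described above.
    var/ms = round(1000 / n)          // one-argument round() floors in DM
    // Reading world.fps back: nearest whole frames per second.
    var/reported = round(1000 / ms, 1)
    src << "fps [n] -> [ms] ms -> reported [reported]"

That model matches the 60 -> 63 and 64 -> 67 readings above, and the 55 -> 56 from the original report.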
Lummox JR resolved issue (Not a bug)
You couldn't run a daemon thread and get a higher-resolution timer? Or use a timer library with a proper license?

The real reason there's a discrepancy is that you don't store a higher-accuracy version of the tick rate, because you don't support that high a rate.
Actually, Windows does give a high-accuracy timer; it's good enough to count even nanoseconds.

http://msdn.microsoft.com/en-us/library/windows/desktop/ms644904(v=vs.85).aspx
In response to Zaoshi
Zaoshi wrote:
Actually, Windows does give a high-accuracy timer; it's good enough to count even nanoseconds.

http://msdn.microsoft.com/en-us/library/windows/desktop/ms644904(v=vs.85).aspx

They probably still support Windows 98, so that's out the door, haha.

Edit: I was right.
In response to FIREking
FIREking wrote:
You couldn't run a daemon thread and get a higher-resolution timer? Or use a timer library with a proper license?

We have a higher-res timer, but found it was causing bugs in the frontend that were basically irreconcilable otherwise. Most notably, this was responsible for the infamous file dialog crash.

The real reason there's a discrepancy is that you don't store a higher-accuracy version of the tick rate, because you don't support that high a rate.

As I said, it's that the actual tick rate is in milliseconds and has 1ms resolution. Switching to a float tick rate would require changes to the messaging and other internals, and considering the discrepancy is so small it would be pretty pointless. The world.fps var really only exists as a nod to the fact that it should be easier to just say you want a certain rate rather than having to calculate tick_lag manually, which is the way users used to have to do it. It's just a convenience value.
In response to Lummox JR
Lummox JR wrote:
FIREking wrote:
You couldn't run a daemon thread and get a higher-resolution timer? Or use a timer library with a proper license?

We have a higher-res timer, but found it was causing bugs in the frontend that were basically irreconcilable otherwise. Most notably, this was responsible for the infamous file dialog crash.

The real reason there's a discrepancy is that you don't store a higher-accuracy version of the tick rate, because you don't support that high a rate.

As I said, it's that the actual tick rate is in milliseconds and has 1ms resolution. Switching to a float tick rate would require changes to the messaging and other internals, and considering the discrepancy is so small it would be pretty pointless. The world.fps var really only exists as a nod to the fact that it should be easier to just say you want a certain rate rather than having to calculate tick_lag manually, which is the way users used to have to do it. It's just a convenience value.

Fair enough.