In response to Lummox JR
Lummox JR wrote:
The final equation for the x and y of each corner has six terms, combining t, sin(n+mt), cos(n+mt), t*sin(n+mt), t*cos(n+mt), and a constant.

Why are the arguments of sine and cosine linear? I'm not seeing why you'd have that just for applying a rotation.
In response to Popisfizzy
Popisfizzy wrote:
Lummox JR wrote:
The final equation for the x and y of each corner has six terms, combining t, sin(n+mt), cos(n+mt), t*sin(n+mt), t*cos(n+mt), and a constant.

Why are the arguments of sine and cosine linear? I'm not seeing why you'd have that just for applying a rotation.

Interpolation of transforms is done by this formula, when there is an angle difference (other than exactly 180°):

M = S * R * T

The S matrix represents scale and shear, and T is translation; both of those are done linearly. R is strictly a rotation matrix, and the angle is interpolated linearly. Therefore R is always in the form of matrix(cos(a+b*t), sin(a+b*t), 0, -sin(a+b*t), cos(a+b*t), 0).

So basically with the S matrix every element (except the translation parts, which aren't used) is in the form of a+b*t, and in R it's either sin(a+b*t) or cos(a+b*t). That gives you four types of terms (sine and cosine with and without being multiplied by t), and then the translation adds a t term and a constant.
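To spell out how that produces the six-term form from the first post (c1 through c6 below are just collected constants of my own naming, not anything from BYOND): every entry of S*R is a product of something linear in t with a sine or cosine whose argument is linear in t, and T contributes a constant plus a term in t, so one corner coordinate works out to

x(t) = c1 + c2*t + c3*sin(n+m*t) + c4*cos(n+m*t) + c5*t*sin(n+m*t) + c6*t*cos(n+m*t)

and likewise for y(t) with its own constants.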
In response to Lummox JR
Lummox JR wrote:
Have to agree with you on that. Everything up to and including the calculus level can be quite beneficial to programming, but anything more advanced than that probably isn't.

Being able to identify whether a problem is NP-hard and approximable, NP-hard and inapproximable, or solvable in polynomial time will save you a lot of time when you're structuring a section of your code around the assumption that an algorithm does or doesn't exist to tackle the specific problem you are interested in. Especially when you want to know how effectively you can approximate a problem, you need to understand some reasonably difficult math.

Sometimes it's not so obvious. For example, finding the shortest path from A to B is easy, but (unless P = NP) you can't even find a reasonable approximation to the longest path. Maybe you want to create an automatic map coloring tool that, for minimalism's sake, colors your map in as few colors as possible? No. Four colors? Okay. You have two map files stored in different formats and want to automatically check whether they construct the same map? Are your maps placed on a grid? Fine. Nodes connected to each other in some arbitrary way? Better consult the best theoreticians of your generation. Pack items in your inventory into the smallest area possible? Nah. A close-to-optimal area? Okay, if your inventory space is large enough that players don't notice the additive constant.
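As a toy illustration of why shortest path is the "easy" one here, this is a minimal breadth-first-search sketch in DM; the /node datum and its neighbors list are hypothetical stand-ins, not from any real codebase:

/node
    var/list/neighbors = list() // adjacent /node datums

/proc/shortest_path_length(node/start, node/goal)
    // Breadth-first search runs in polynomial time; the analogous longest-path
    // question has no known efficient (or efficiently approximable) algorithm.
    var/list/queue = list(start)
    var/list/dist = list()
    dist[start] = 0
    while(queue.len)
        var/node/current = queue[1]
        queue.Cut(1, 2) // pop the front of the queue
        if(current == goal)
            return dist[current]
        for(var/node/n in current.neighbors)
            if(n in dist) // already visited
                continue
            dist[n] = dist[current] + 1
            queue += n
    return null // goal is not reachable from start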

Of course, it's a fallible rule of thumb, since at times polynomial-time algorithms have such large overhead that they can't be used in practice. But familiarity with a lot of classes of algorithms is not unhelpful as a "take a quick smell of your surroundings" kind of tool.
In response to Toadfish
Well yeah, some understanding of how algorithms connect to math is kind of a big deal as you get deeper into it.
While math is useful for learning how to write concise algorithms, I find that language skills are far more applicable to 90% of what programmers do.

Understanding linguistics is a really good way to understand code. Most of what we're doing involves math at some level, but really we're shuffling data and creating abstract data models by which to think of our code in the general sense.

If you guys have ever read any of my explanations of things, I like to use metaphors relating to language rather than math/engineering concepts to explain complex programming concepts.

Particularly for this community, I find it far more effective than explaining it in the way you'd typically see in the college intro to programming world.
In response to Ter13
Ter13 wrote:
project-wide settings:

> #define FPS 40
> #define TICK_LAG 0.25
> #define TILE_WIDTH 32
> #define TILE_HEIGHT 32
>
> world
>     fps = FPS
>     icon_size = TILE_WIDTH
> ...
> ...
> ...
> sleep(TICK_LAG) //sleep for one frame
> sleep(16*TICK_LAG) //sleep for 16 frames
> sleep(10) //sleep for 1 second
>

Using these settings will allow you to change the desired FPS or tile size of the project on the fly without having to update hundreds of lines of code.


This caught my attention.

There is no excuse for using defines for these.

world.tick_lag and world.fps are intertwined, so setting one sets the other, and you can access them both in your code.

world.icon_size is also accessible.

The key reason I bring this up is that it seems like it would make more sense to just have all of your code read world.tick_lag/world.fps/world.icon_size, mainly so you can change the first two at will mid-round as needed or for testing.

world vars are built-in vars, which is why you can't make new ones. So they are compiled in and accessed at the same speed as a proc var, if not faster.

The speed benefit of defining them isn't enough to lose the ability to change the tick rate mid-round.
In response to MrStonedOne
Oh shut up. sleep() calls are used extensively across many user code files, and the speed difference does show under a large number of iterations. Your rant is unnecessary; for optimal performance you should be using defined constants rather than repeatedly accessing variables that never change.

I don't care that you think it's "cleaner", because the VM doesn't optimize for that at all.

Edit: Seriously, your rant is nothing but stupid.
In response to MrStonedOne
MrStonedOne wrote:
world vars are built-in vars, which is why you can't make new ones. So they are compiled in and accessed at the same speed as a proc var, if not faster.

Technically no, they're not as fast as proc (local) vars. They're accessed at roughly the same speed as built-in atom vars: first a routine uses a switch() based on the type (world), and then it calls the right routine for world vars, which does another switch() for the name.

Local vars are much, much faster to access because there's a straight array and they're indexed by number, not by name. Args are the same way.
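If anyone wants to see the gap for themselves, here's a minimal benchmark sketch; the proc name, the iteration count, and the use of world.timeofday for coarse timing are just illustrative choices, not anything from the engine:

#define TICK_LAG 0.25

/proc/benchmark_tick_lag_access()
    var/iterations = 1000000
    var/total = 0
    var/start

    // Inlined constant: the preprocessor substitutes 0.25 at compile time.
    start = world.timeofday
    for(var/i in 1 to iterations)
        total += TICK_LAG
    var/define_time = world.timeofday - start

    // Built-in world var: resolved by type and name at runtime.
    start = world.timeofday
    for(var/i in 1 to iterations)
        total += world.tick_lag
    var/worldvar_time = world.timeofday - start

    world.log << "define: [define_time] ds, world var: [worldvar_time] ds"

(world.timeofday only has decisecond resolution, so crank the iteration count until the numbers are comfortably above the noise.)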

The speed benefit of defining them isn't enough to lose the ability to change the tick rate mid-round.

I suspect the speed benefit is not very high, but to be honest I've never understood why anyone would want to change the tick rate mid-round. Typically there's no reason you'd ever want to deviate from whatever rate was compiled in.
@stonedone:

If you need to be able to change the tick lag:

#define FPS 40
#define TICK_LAG (10/FPS)

world
    fps = FPS

#undef FPS
#undef TICK_LAG
#define FPS world.fps
#define TICK_LAG world.tick_lag


It's less about locking them to something that's faster; it's about making them changeable project-wide without having to touch thousands of lines of code.

There's just a slight added speed benefit from using a constant when you don't need a variable tick rate. (Which, TBH, in a polished product is IMO a terrible idea, because input is tied to the tick rate. Inconsistent interface response times tend to be a bad thing, not a good thing, with respect to game design.)
In production, not as much, but we do regular testing and need to consistently test how the code does at other FPS values, as downstream server bases don't always have the same hardware.

In /tg/station, world.fps is a config option set at world init, so it can't be a compile-time define. We also change it a lot for memes: "Time dilation field detected. Please report any space-time related oddities." followed by setting world.fps really low. I even did one round where I set it uncannily high, to the point that players were asking for 17fps back because it was too smooth.

It's just that nothing that wants to know the current tick rate should be using a define; it should just read directly from world.tick_lag.

It seems like you are duplicating information by making it a define, and that doesn't make sense. What's next? Do you do this with world.area/world.turf? It's stupid and unnecessary and an abuse of defines.

You could get the same result of "It's less about locking them to something that's faster; it's about making them changeable project-wide without having to touch thousands of lines of code." if you just have those thousands of lines of code use world.tick_lag rather than TICK_LAG, and the same for FPS and TILE_SIZE.

In /tg/station, world.fps is a config option set at world init, so it can't be a compile-time define. We also change it a lot for memes: "Time dilation field detected. Please report any space-time related oddities." followed by setting world.fps really low. I even did one round where I set it uncannily high, to the point that players were asking for 17fps back because it was too smooth.

The situation you've presented here is absolutely horrid design practice. You could emulate time dilation without changing the FPS, and if you do it that way, the response times and interaction between UI elements will stay consistent.
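For what it's worth, a minimal sketch of what I mean; the time_dilation global and the wrapper proc are made-up names, not anything from SS13's code:

var/time_dilation = 1 // 1 = normal speed; raise it to slow gameplay down

/proc/dilated_sleep(delay)
    // Gameplay delays get scaled, but world.fps is untouched, so input and
    // UI response timing stay exactly the same.
    sleep(delay * time_dilation)

Calling dilated_sleep(10) sleeps one second normally and two seconds while time_dilation is 2, which gets you the "time dilation" effect without touching the tick rate.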

Every Space Station 13 build I've laid eyes on has been absolute shit, and this is another fine example of why they are.

It seems like you are duplicating information by making it a define, and that doesn't make sense. What's next? Do you do this with world.area/world.turf? It's stupid and unnecessary and an abuse of defines.

Larger games mean higher iteration counts, and the difference does show. It's not abuse whatsoever; it's optimal programming.

You could get the same result of "It's less about locking them to something that's faster; it's about making them changeable project-wide without having to touch thousands of lines of code." if you just have those thousands of lines of code use world.tick_lag rather than TICK_LAG, and the same for FPS and TILE_SIZE.

You could override the FPS define to
#define FPS world.fps

and you'd only have to change one other line of code: the one that initially defines it. Your argument is weak.
TILE_SIZE

Er... No. No you can't. world.icon_size can sometimes be a rectangular size, which means a string you would need to parse. In those cases, my TILE_WIDTH/TILE_HEIGHT definitions are quite useful. I've been using them a long time, and they serve me well. I don't see how something is useless just because the feature I elect not to use would only need an almost inconsequential code change to support.
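(For reference, this is a minimal sketch of the parsing I'm talking about when the tile isn't square; the proc name is just for illustration:)

/proc/get_tile_dimensions()
    // world.icon_size is a plain number for square tiles, or a "WIDTHxHEIGHT"
    // string (e.g. "32x64") when the tiles are rectangular.
    if(isnum(world.icon_size))
        return list(world.icon_size, world.icon_size)
    var/xpos = findtext(world.icon_size, "x")
    var/width = text2num(copytext(world.icon_size, 1, xpos))
    var/height = text2num(copytext(world.icon_size, xpos + 1))
    return list(width, height)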

I write a ton of code that has to be flexible and descriptive enough to fit with just about any project because I offer a ton of help around these parts.

I keep these defines because I disagree that it's always necessary to depend on a variable tick rate. Where you do depend on the variable tick rate, the preprocessor macro can simply be set to world.fps/world.tick_lag without consequence.

You and I develop things based on completely different ideologies. I tend to develop based on retro fetishism. Projects emulating NES, SNES, and Gameboy era single-player videogames.

You develop online round-based action/roleplay type of games.

Our logic on this is very different because our mediums have very different ideologies.

For you, interface considerations come second to hard gameplay. For me, interface and UI responsiveness are everything. I don't change the tick rate because it changes the way the user interacts with the project, and I don't recommend changing the tick rate dynamically either, for that reason. It can make sense for certain projects, but for those that want to focus on consistent visual and control response, a static tick rate using a preprocessor macro is completely fine.

What's next? Do you do this with world.area/world.turf? It's stupid and unnecessary and an abuse of defines.

Nice straw man there. I agree, using it with world.area/world.turf is kind of pointless and stupid. However, the "abuse of defines"... Eh... I disagree on that.

There are all kinds of neat things you can do with preprocessor macros that would be considered "abuse". They are quite useful regardless of whether you consider them abuse or not.

My approach suits my needs and allows me to recycle plug-and-play code that can be configured to one of several different permutations just by swapping around a few defines in the DME.

It probably wouldn't suit your needs for SS13 because you'd just be disabling it and might as well be using world.fps/world.tick_lag in the first place.

I prefer the ability to use a constant masked behind a preprocessor definition. You don't. It doesn't make my approach stupid or pointless at all.
You and I develop things based on completely different ideologies. I tend to develop based on retro fetishism. Projects emulating NES, SNES, and Gameboy era single-player videogames.

You develop online round-based action/roleplay type of games.

Uhh, no. It has nothing to do with genre; it has more to do with working with a large team and keeping team collaboration manageable. But it doesn't really matter... He's still wrong.

Let's not forget this guy hasn't made Space Station 13; he helps develop a server of it. He's the equivalent of a ripper on BYOND, and the game itself plays awfully like one.

I prefer the ability to use a constant masked behind a preprocessor definition. You don't. It doesn't make my approach stupid or pointless at all.

The systems/approaches in Space Station 13 itself are stupid and inefficient. He shouldn't be changing fps to emulate time dilation; that's just lazy and stupid.
You could get the same result if you just have those thousands of lines of code use world.tick_lag rather than TICK_LAG, and the same for FPS and TILE_SIZE

Address this.


It seems like you are duplicating information

Address this
Or, and here is another one.

What benefit does making it a define give, that you can't get with using the world. var?

What does it add to the code?

The answer is nothing; it restricts the code and gives no benefit.
In response to MrStonedOne
MrStonedOne wrote:
You could get the same result if you just have those thousands of lines of code use world.tick_lag rather than TICK_LAG, and the same for FPS and TILE_SIZE

Address this.

Common ripper ideology:
Result != performance.

It seems like you are duplicating information

Address this

He's inlining the constants for better processing. But he already did address it, and said that in some cases it needs to be parsed, in which case a constant would be a better alternative. Can you not read?

MrStonedOne wrote:
Or, and here is another one.

What benefit does making it a define give, that you can't get with using the world. var?

What does it add to the code?

The answer is nothing; it restricts the code and gives no benefit.

Oh, so in other words, to you, "performance doesn't matter." No wonder Space Station 13 is trash. You're the one restricting yourself by dynamically changing world.fps to emulate time dilation like an idiot.

I'd rather have responsive UI interactivity, give the user a smooth experience, and have inlined constants for better processing, than write lazy code.
In response to MrStonedOne
MrStonedOne wrote:
You could get the same result if you just have those thousands of lines of code use world.tick_lag rather than TICK_LAG, and the same for FPS and TILE_SIZE
...
What benefit does making it a define give, that you can't get with using the world. var?

What does it add to the code?

The answer is nothing; it restricts the code and gives no benefit.

Actually those questions dovetail together.

I don't have a dog in this fight, but I see the utility of the #defines. Namely, by using a #define you can change things up very easily at any time. If you wanted to switch from using world.tick_lag to a hard-coded value, it'd be trivial to change the macro.

Actually the BYOND source has lots and lots and lots of #define'd constants, and this has been an enormous blessing when it's come to making changes to certain things.
Well, yes, but replacing world.tick_lag with a define for whatever you assigned to world.tick_lag, in places where only world.tick_lag makes sense (like, say, calculating what glide_size to use based on world.tick_lag and when you know the mob will be able to move next), just doesn't make sense. In your example use case you still can't move to a system where world.fps is, say, 40 but the FPS define is 20 (to make things smoother without a rise in CPU usage), because it breaks in movement code, where the REAL tick_lag/fps becomes extremely relevant.
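To be concrete, here's the kind of calculation I mean; the proc and the move_delay argument are just illustrative (and 32 stands in for your tile width):

/mob/proc/update_glide_size(move_delay)
    // glide_size is pixels moved per tick. If this mob can move again in
    // move_delay deciseconds, it should cover one 32-pixel tile over
    // (move_delay / world.tick_lag) ticks -- which only works if you read
    // the real world.tick_lag, not a compile-time constant.
    var/ticks_per_move = move_delay / world.tick_lag
    glide_size = 32 / ticks_per_move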

Either way, you have to change a bunch of code.

Don't get me wrong, defines are useful for every other use case but this one. It duplicates information needlessly and brings no gain, on top of locking the code to a static tick_lag for, again, no gain.

Yeah, you can just replace it with #define FPS world.fps, but now you've still gained nothing, and you've slowed down compile time just that much more from defines (yes, this is something we've had to start taking into account at /tg/station).
In response to MrStonedOne
It's not "duplicating" code in any way at all. However I do acknowledge that it could have a minor impact on compile time, and in a project of SS13's size that could add up. (Although I doubt it adds up to more than a couple of seconds.)
However I do acknowledge that it could have a minor impact on compile time, and in a project of SS13's size that could add up.

Acknowledged. It's just not a concern for projects of the size I generally see or work with.

Compile times really don't stack up in my case, even with complex projects, because I don't use a number of the deeply flawed methodologies that SS13's codebase is built on.

One of the reasons that SS13's compile times are so high actually has a lot to do with token duplication.

Every single polymorphic override or definition contains the full object prototype path. Being worried about something as marginally influential as a global preprocessor macro taking a few cycles during the compiler's expansion phase is completely irrelevant when you are throwing dozens of redundant tokens at the compiler every third line of code.
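To illustrate what I mean by token duplication (the /obj/item/weapon/sword path here is a made-up example, not from any particular codebase):

// Full-path style: every definition repeats the entire prototype path,
// so the compiler re-parses the same tokens over and over.
// /obj/item/weapon/sword/proc/sharpen()
//     return
// /obj/item/weapon/sword/proc/dull()
//     return

// Tree style: the same two procs, with the path stated once. (Use one style
// or the other; compiling both would be a duplicate definition.)
/obj/item/weapon/sword
    proc/sharpen()
        return
    proc/dull()
        return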

For my use case, the defines work more than fine. I haven't run into a case where they are a hindrance. At worst, they are of no help, but they are also not a significant problem.

If it's not for you or the projects you are aiming for, that's all well and good. It's absolutely not worthless, and it saves me a lot of headache porting code from 16x16 or 8x8 40fps projects to 32x32 60fps projects.


EDIT:

...Just logged out and noticed that there are people on my ban list involved in this one.

The tone of this one I've read so far as respectful between the three main posters. If any interlopers want to look at this like a fight, I haven't interpreted it that way. StonedOne has his perspective, I have mine. We disagree. We both care about information and logic. This is a disagreement coming from two different perspectives, not two people trying to piss on one another screaming "my way is better." We just have different constraints operating on us in terms of development, because he works within a hulking mammoth of a codebase with hundreds of cooks stirring the pot, and I largely write small chunks of code to hand off to randoms and tinker with my own single-player stuff on my own time.

I just don't want anyone thinking that this is a pissing match, and if any interlopers kick the volume up thinking it is a fight, you are on your own with that. I respect StonedOne and what he does, and definitely am not here to piss in anyone's cornflakes or anything.