Off topic, but while we're on the subject.

I don't really like the 0.33333... * 3 = 0.99999... = 1 = 1/3 * 3 "proofs", because they assume various things about how multiplication is defined for something like 0.99999.... To prove 0.999... = 1 you first need to define what you mean by 0.999.... Some intuitive ways to define it might be 'the limit of the sequence 0.9, 0.99, 0.999, ...' (equivalently, the sum of the series 9/10 + 9/100 + 9/1000 + ...), or perhaps 'the smallest real number that is bigger than 1 - (1/10)^n for every n'. From both of these definitions it follows that 0.999... is a limit which equals 1.
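If you want to see the first definition in action, here's a minimal Python sketch (my own illustration, using exact fractions so floating-point rounding doesn't muddy the point):

```python
from fractions import Fraction

# Partial sums 0.9, 0.99, 0.999, ... of the series 9/10 + 9/100 + ...
# With exact arithmetic the gap to 1 is exactly (1/10)^n after n terms,
# and that gap shrinks below any positive number -- which is all that
# "the limit equals 1" means.
partial_sum = Fraction(0)
for n in range(1, 8):
    partial_sum += Fraction(9, 10**n)
    print(n, partial_sum, 1 - partial_sum)   # gap is 1/10**n
```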

However, one might feel cheated by the above proofs as well, since it seems as though defining 0.999... to be a limit is kinda 'cheating'. This is actually a valid concern. Ultimately, the decision to define 0.999... as a real number seems mostly a matter of notational convenience and of intuition gained from long division. Nothing says that in some more obscure arithmetical system you couldn't find that there is in fact a difference between 0.999... and 1.

For these reasons it's kind of hard to demonstrate 0.999... = 1 to someone who is working only with his/her arithmetical intuition. It's simply a convention of definition. What we can do instead is justify, without any awkward assumptions, the intuition behind the definition: there is no obvious way to explain how the intuitive sizes of 0.999... and 1 differ. 0.999... is bigger than every finitely long decimal of the form 0.(...) - in particular, bigger than 0.99...9 no matter how many nines you write. And yet it's obviously not bigger than 1. A careful analysis of these two claims shows that, reasonably, the difference between 0.999... and 1 should be defined as 0 (or as 0.000...1, but hopefully you will agree that this number might also be defined as 0).
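To spell out that 'careful analysis' a bit (my own squeeze-style paraphrase, assuming we've already agreed that 0.999... should be a number sitting between every truncation 0.99...9 and 1):

```latex
% If x = 1 - 0.999... is to be a non-negative real number, then for every n
%   0 <= x <= 1 - 0.99...9 (n nines) = (1/10)^n,
% and the only non-negative real number below (1/10)^n for every n is 0,
% so the difference is forced to be 0.
0 \le 1 - 0.999\ldots \le (1/10)^n \quad \text{for every } n
\;\Longrightarrow\; 1 - 0.999\ldots = 0 .
```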

But the thing is, again, it's ultimately convention and convenience. Unless you're in the unlikely situation where you actually need to concretely define what 0.999... means... which certainly won't happen when you're programming games... you don't need to care what it means. Round it up, round it down, kill it with fire, whatever works for you.
All mathematics is definitional. Imagine you had three numbers, i, j, and k, all distinct, such that i*i = j*j = k*k = -1. Also, i * j = k, j * k = i, k * i = j. Finally, they anticommute: j * i = -k, k * j = -i, i * k = -j. So what does that mean? Well, it turns out that this system provides a nice representation of three-dimensional rotations. It's referred to as the quaternions.
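Here's a tiny Python check of those rules (my own sketch, just verifying the stated multiplication table under the usual Hamilton product, nothing more):

```python
# Quaternions stored as (w, x, y, z), meaning w + x*i + y*j + z*k.

def qmul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)
k = (0, 0, 0, 1)

assert qmul(i, i) == qmul(j, j) == qmul(k, k) == (-1, 0, 0, 0)  # i^2 = j^2 = k^2 = -1
assert qmul(i, j) == k and qmul(j, k) == i and qmul(k, i) == j   # cyclic products
assert qmul(j, i) == (0, 0, 0, -1)                               # anticommutation: j*i = -k
```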

In the standard definition of the rational or real numbers, there is no difference between 0.999... and 1. They are the same number. This may conflict with intuitive feelings about how arithmetic works - those intuitive feelings, however, do not define the real numbers as used by mathematicians. They form their own mathematical system, which may or may not be consistent or useful.

I don't see why 0.999... being the output of a limit is a concern. The value is well-defined. What other way could you possibly define it?
This is a major misunderstanding one might have. There are a couple of standard definitions of the reals, and mathematicians might work with them in highly theoretical settings. But there are no standard 'real numbers used by mathematicians'. Mathematics is not so structured that, say, whenever you're proving a theorem in combinatorics, you start from some set of axioms, pass through the definition of the real numbers, and finally work your way down to whatever it is you want to prove. This isn't even implied. It's a lot more informal. The value of a proof ultimately depends on what it 'does', what it shows. So you have no need to define 0.999... formally until you actually come to a setting where this is useful, and if at that time making 0.999... not equal 1 is useful to you, then you should go ahead and do it.

Since by convention and intuition it's very convenient to define 0.999... to be 1, you will (almost) never see anyone needing to change this. But there is nothing to say that it can't be anything else. And thus when most people doubt that 0.999... equals 1, they're not explicitly wrong; they are simply pointing out one area of the system of real numbers they're working with that hasn't been clarified yet.

As for another way you could define 0.999..., first you need to define a certain system of infinitesimals (an infinitesimal is, intuitively, what 1/infinity or perhaps 0.000...1 would be) and hyperreals in general, and then you can define 0.999... to be the Kth element of the sequence (0.9, 0.99, 0.999, ...), where K is an infinite hypernatural number, so that the Kth element of this sequence is still smaller than 1. This is not necessarily possible to do for all systems of infinitesimals, but it is for some.
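For the curious, here's one concrete way this can look (my assumption about which construction is meant; the ultrapower construction of the hyperreals is the standard one):

```latex
% In the ultrapower construction *R = R^N / U (U a nonprincipal ultrafilter),
% a hyperreal is an equivalence class of real sequences. One could then set
0.999\ldots \;:=\; \big[(0.9,\ 0.99,\ 0.999,\ \ldots)\big]
            \;=\; 1 - \big[(10^{-1},\ 10^{-2},\ 10^{-3},\ \ldots)\big],
% which is 1 minus a positive infinitesimal, hence strictly smaller than 1 in *R.
```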
So "Assuming you're not talking real numbers, here's a sensible alternate definition". We're talking real numbers, not the hyperreals. There's no sensible way to define the reals such that they're essentially congruent with reals as used by all mathematicians everywhere and not have 0.999... be equivalent to 1. You need infintesmals, and they don't exist in anything that works with reals.
I can see where my post is unclear. When I discuss the reals I'm simply talking about the general concept every middle schooler has of a 'real number' (basically what a middle school teacher calls a real number), you know, those numbers of the form 1/3, 1, pi or 3.1444444(...), not a formal system of real numbers that you might learn in a Real Analysis class. Hyperreal numbers are also called nonstandard reals. The number 0.0(...)1 can be considered a 'real number' in this sense.

Let's try this again. Your misunderstanding is that you're assuming that whenever we refer to 0.999... we're talking about the formal system of the reals we might've been taught in a Real Analysis class. That's true in the context of such a class, but it does not follow that whenever a mathematician refers to 0.999... he is somehow talking about the real numbers in the sense of, say, Dedekind cuts. No such thing is assumed unless the context makes it obvious that this is the intent. And never mind mathematicians: when a child argues that 0.999... is smaller than 1, he is not wrong, since he's not associating 0.999... with any formally defined class of numbers but rather with his intuition. According to his intuition, 0.999... is something like 1 minus an infinitesimal.

This is why it's perfectly sensible for a child to argue that 0.999... is smaller than 1. Unless we associate these numbers with a specific system, the question is unanswerable. And in middle school, for example, we certainly don't discuss the definitions of numbers, so we certainly can't supply an actual proof one way or the other.

Ultimately, telling someone 0.999... = 1 in a non-formal setting is pure nonsense. What we should say is that 'in the standard system of real numbers, 0.999... = 1'. Beyond that, the answer is really whatever you please. The argument is not that it's untrue that in one formal system or another 0.999... = 1; it's simply that assuming one definition or another of 0.999... in a non-formal context is nonsense. There's certainly no reason to assume it isn't a hyperreal number, for example.
When performing the math on 1/3 exactly, you would end up with 3/3, which is 1.
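In exact rational arithmetic that's literally what happens; a quick Python sketch of the point (my example, using the standard fractions module):

```python
from fractions import Fraction

# Exact rational arithmetic: (1/3) * 3 really is 3/3, i.e. 1.
third = Fraction(1, 3)
print(third * 3)                  # 1
print(third * 3 == 1)             # True

# Any finite decimal truncation of 1/3 falls short of that:
print(Fraction("0.333333") * 3)   # 999999/1000000, not 1
```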