I always knew there were people who disputed or disbelieved this fact. That did not worry me in the least, because for every fact there are people who disbelieve it. There is even a man out there who claims that (–1)·(–1)=–1.

But then I came across conversations on the discussion forums of the popular *xkcd* webcomic (about which I wrote in this article), and that *did* worry me. After all, the adherents of this comic are all (presumably) scientifically educated or at least interested—or they could not appreciate the comic fully. So those educated people were discussing the matter as well, without reaching a consensus. I concluded that it must be a point worth clearing up once and for all, and hence this article.

On some website of jokes, in a section titled “You know you are a mathematician if …”, I read that you could recognize mathematicians by their interest in the question whether 0.999999… is 1 or not. This made me angry; in fact, you can much more readily recognize non-mathematicians by it. Mathematicians don’t ponder the question because they know the answer, and it is boringly simple; they certainly do not discuss it with their peers. If they ever talk about it, it is in order to explain it to non-mathematicians who have doubts. And this, by the way, is exactly why I am doing it here.

So let me note that the argument usually brought forth against 0.999999… being 1 is this:

> 1 is a normal number, whereas 0.999999… is not a number in the usual sense, but a process.

and I will address this in detail below.

There is a Wikipedia article on the topic, but having skimmed it, I think it does not address the arguments commonly brought up against 0.999999… being 1. It simply takes for granted the very things that people dispute. I thought I could do better. If, for all that, you still feel that your objection is not addressed by my article, please leave a comment, and I will try to clarify that point!

### Decimal literals

First let’s look at a finite decimal literal. What it means is a (finite) sum of fractions, e.g.

$$0.999999 = \frac{9}{10} + \frac{9}{100} + \frac{9}{1000} + \frac{9}{10000} + \frac{9}{100000} + \frac{9}{1000000}$$
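To make the finite case concrete, here is a small sketch in Python, using exact fractions so that no rounding sneaks in (the six-nines example is just an illustration):

```python
from fractions import Fraction

# A finite decimal literal is just a finite sum of fractions:
# 0.999999 = 9/10 + 9/100 + ... + 9/1000000
value = sum(Fraction(9, 10 ** i) for i in range(1, 7))

print(value)         # 999999/1000000
print(float(value))  # 0.999999
```

Note that the finite literal is *not* 1: it falls short of it by exactly one millionth.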

Now a never-ending literal like 0.999999… is therefore an infinite sum, which in mathematics is called a *series*. It can be written with the “Σ” symbol:

$$\sum_{i=1}^{\infty} \frac{9}{10^i}$$

And that is the point where the objection from above comes in. Summing up infinitely many numbers is a thing that simply cannot be done, because, indeed, it is a process that would never end. And all of mathematics only deals with operations that end at one point. So the argument is really to the point.

But notations like the ones above are used all the time in mathematics. How can we mathematicians use them if they are impossible? Simple: We mathematicians are a lazy lot, and we use infinite sums simply as a convenient shorthand notation for a quite different thing. In fact, all of mathematics is a tower of shorthands of shorthands of shorthands of shorthands. If a mathematician sees that infinite sum he/she knows that it is just a shorthand for

$$\sum_{i=1}^{\infty} \frac{9}{10^i} \;:=\; \lim_{N\to\infty} \sum_{i=1}^{N} \frac{9}{10^i},$$

which is the *limit* of the sequence of the finite sums, all of which can be easily calculated. This gives us the sequence 0.9, 0.99, 0.999, 0.9999, …
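Those finite sums can be computed directly; here is a sketch with exact fractions (the cutoff N = 6 is an arbitrary choice):

```python
from fractions import Fraction

# The finite sums  sum_{i=1}^{N} 9/10^i  for growing N.
partial_sums = [
    sum(Fraction(9, 10 ** i) for i in range(1, N + 1)) for N in range(1, 7)
]
for s in partial_sums:
    print(float(s))  # 0.9, 0.99, 0.999, ...
```

Every one of these finite sums is strictly below 1; only the *limit* of the sequence equals 1.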

This is the actual, precise meaning of a literal like 0.999999…, which you probably already knew. But what is a limit, actually? That is the issue the whole question hinges on. You may not have learned this at school, so let me show you.

### Limits of sequences

I think we all have a fuzzy intuitive notion of a limit. If people are asked to describe it without mathematics, they will say things like “it is the point where the sequence goes when the indices become infinite.” It is very much like the popular phrase that is fed to school kids about how parallel lines “meet at infinity”. It is a convenient way of visualizing the thing (in a fuzzy way), but if you think about it with any degree of rigor, you get into trouble, because it is simply *wrong*. You cannot track a sequence up to infinity, because it is a never-ending process. Also, infinity is neither a place nor a number. Parallel lines do not meet at infinity because there is no such place. They do not meet at all. Likewise the sequence 0.9, 0.99, 0.999, 0.9999, … *never* reaches 1.

No, limits are a mathematically precisely defined concept. For a sequence $a_1, a_2, a_3, a_4, \dots$ we call a number $a$ the limit of the sequence if and only if

$$\forall \varepsilon > 0:\; \exists N_\varepsilon \in \mathbb{N}:\; \forall n > N_\varepsilon:\; |a - a_n| < \varepsilon$$

This is of course mathematical notation, which you might not understand, but I just could not resist putting it here because it is so beautiful. What it means in plain English is this:

> A sequence is called *converging* to a number (which is then called the *limit* of that sequence) if every arbitrarily small vicinity of the limit is eventually entered by the sequence and never left again.

With “vicinity” we mean an interval of real numbers containing the limit, where the limit is not one of the endpoints. Usually we make these intervals symmetric, namely (if *a* is the limit) [*a*–ε, *a*+ε], which is the interval of all numbers whose distance from *a* is at most ε.

So there has to be a sequence index *n* where the sequence element *a _{n}* has a distance from *a* of less than ε, and *all subsequent elements do so too.* For example, a sequence in which every third element is 5 does *not* converge to zero, because for every distance smaller than 5 the sequence will break out of the vicinity around 0 at every third element.

So obviously not every sequence has a limit (i.e., is convergent), but if it has a limit, it has only one. There cannot be multiple different limits to one sequence, which is very easy to prove.
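For a finite prefix of a sequence, this ε-condition can be checked mechanically. Here is a toy sketch (the function name, the example sequences, and the ε values are my own choices, not anything standard):

```python
from fractions import Fraction

def entry_index(prefix, limit, eps):
    """First index from which every later element of this finite prefix
    stays within eps of the limit, or None if the last element doesn't."""
    n = None
    for i, x in enumerate(prefix):
        if abs(x - limit) >= eps:
            n = None      # the sequence left the vicinity again
        elif n is None:
            n = i         # the sequence (re-)entered the vicinity
    return n

# 0.9, 0.99, 0.999, ... as exact fractions
nines = [1 - Fraction(1, 10 ** (i + 1)) for i in range(10)]
print(entry_index(nines, 1, Fraction(1, 10000)))  # 4

# a sequence where every third element is 5 (the rest are 0)
spiky = [0, 0, 5] * 4
print(entry_index(spiky, 0, 1))  # None
```

Of course a program can only inspect finitely many elements, so this is an illustration of the condition, not a proof of convergence.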

Note that this approach (invented by Cauchy in the early 19^{th} century) works backwards, in a way, from the intuitive notion. We can *not* take a sequence, perform the limiting process, and arrive at a limit number. Instead we have to start with a number we think is the limit (by educated guessing, intuition, or one of many tricks that exist for this purpose), and then merely *test* whether it satisfies the condition for a limit stated above. This may sound like a frustrating state of affairs, but there is no better way.

Let’s consider a stupid illustrative example. Imagine a turtle that crawls towards a goal line. After one minute it arrives at a position one meter before the line. But the run has made it tired, so now it is getting slower and in the next minute it only manages half of the distance remaining, ending up at half a meter from the line. It’s getting more tired still, so after the third minute it is still a quarter of a meter from the line. After the fourth minute one eighth of a meter remains and so on.

Now the intuitive approach would be to ride the turtle, so to speak, and see where it will arrive. As we know by now, it won’t work, because it will never arrive anywhere but will crawl on without end.

The Cauchy method means that we guess where the turtle is going (the goal line), so we position ourselves on that line and consider: for any small distance from the line, will the turtle eventually come that close and stay there? The answer is of course yes; we can easily give a formula for when it will pass any point with distance ε>0 from the goal line (it involves a logarithm). Also, we know it will never overshoot the goal, so the limit is not somewhere *beyond* the line. So we have proven that the goal line is precisely the limit of the turtle’s movement.
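The logarithm formula alluded to above can be made explicit. Assuming the distances 1 m, ½ m, ¼ m, … from the story, the turtle’s distance after minute n is 2^(−(n−1)), and solving 2^(−(n−1)) < ε gives the following sketch (the function name is mine):

```python
import math

# After minute n the turtle's distance to the goal line is 2 ** -(n - 1):
# 1 m, 1/2 m, 1/4 m, ...
def first_minute_within(eps):
    """First minute after which the turtle is strictly closer than eps
    to the goal line, from solving 2**-(n-1) < eps for n."""
    return math.floor(math.log2(1 / eps)) + 2

print(first_minute_within(0.1))  # 5 (distance 1/16 m < 0.1 m)
```

So for every ε, no matter how small, there is a concrete minute from which on the turtle stays within ε of the line: exactly the Cauchy condition.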

I think I have said enough now about limits, and you understand the concept. But there is another thing I want to mention because it is fun (for me, at least).

The famous Argentine author Jorge Luis Borges once wrote an essay titled *The Eternal Footrace*, which deals with the famous ancient Greek logical paradox of the race between Achilles and a turtle. (Invented by Zeno of Elea; for us it is no longer a paradox. We have solved it, precisely by applying Cauchy’s definition of a limit.) To quote from that essay:

> One only needs to numeralize the speed of Achilles as one meter per second to give the time he needs as follows:
>
> 10 + 1 + 1/10 + 1/100 + 1/1000 + 1/10000 + …
>
> The final value of the sum of this infinite geometric progression is twelve (more exactly, eleven and a fifth; more exactly, eleven and three twenty-fifths), but it will never be reached.

The funny thing is that Borges correctly identified the series as a geometric one, but totally failed to give the correct limit. He gave several values, as you saw, but admitted they were only approximations. I have no idea why he could not do it. The correct formula to derive the limit is taught in schools, and has been known for thousands of years. In this case the result is **eleven and a ninth**. Also you might have noticed how turtles seem to be important mathematical animals.
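The limit Borges missed is easy to check with the school formula for a geometric series, a/(1 − r); exact fractions make the “eleven and a ninth” explicit:

```python
from fractions import Fraction

# Borges's series: 10 + 1 + 1/10 + 1/100 + 1/1000 + ...
# From the "1" onward it is geometric with first term a = 1 and
# ratio r = 1/10, so that part sums to a / (1 - r) = 10/9.
tail = Fraction(1) / (1 - Fraction(1, 10))
limit = 10 + tail

print(limit)  # 100/9, i.e. eleven and a ninth
```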

### Wrapping things up

Now we can apply this definition to our sequence 0.9, 0.99, 0.999, 0.9999, …. I claim the limit is exactly one; let’s test if the Cauchy criterion holds.

As with the turtle example, the sequence will never overshoot 1; therefore the limit is certainly not larger than 1. Take for example a distance of 0.0001 from 1: it will be reached by the sequence with the fourth element, and the following ones will not drop back down again. A distance of 0.0000000001 will be reached with the 10^{th} element and so on. Generally, a distance of ε>0 will be reached with the first element whose index is at least log(1/ε), where “log” is the common (base-10) logarithm.
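This index bound can be checked numerically; a sketch (I use exact fractions to avoid float noise, and the strict inequality from the ε-definition, so for ε an exact power of ten the index comes out one higher than the non-strict reading):

```python
import math
from fractions import Fraction

# The n-th element of 0.9, 0.99, 0.999, ...; its distance from 1
# is exactly 10 ** -n.
def element(n):
    return 1 - Fraction(1, 10 ** n)

# First index whose element is strictly within eps of 1: the first
# integer n with 10**-n < eps, i.e. n > log10(1/eps).
def first_index_within(eps):
    return math.floor(math.log10(1 / eps)) + 1

n = first_index_within(Fraction(1, 10 ** 7))
print(n)  # 8
```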

So that’s the proof, and therefore

$$0.999999\ldots = 1.$$

What more is there to say on the issue? I don’t know. You might be saying, “you claim a limit is that thing you described, but I say it is something else.” Well, that is exactly as if you said that “♩” is not a quarter note, but something else. I am sorry, but it *is*. It just is defined that way.

Mathematical expressions like “0.999999…” are just strokes on the paper (or screen) to which mathematicians have ascribed certain meanings. And if you use these strictly defined meanings, you arrive at 0.999999…=1 as shown. There is no hidden notion there that transcends the definition. It’s just black strokes, and we define what they mean.

Now please note this: You may very well assign a different meaning to these penstrokes and symbols, one which does not give you the value of 1. It is allowed, and mathematicians play with such things all the time. There is no conspiracy or establishment or mainstream at work to keep you from it. The usual definition is in no way more true than any other (provided it is logically consistent!), just like baseball is not more true than softball. Just be aware that your results do not tell you anything about the results of people who are using a different definition. In addition, always clearly state your definitions if they deviate from the usual ones, so others can see what you are doing. For example, the English word “also” means “too” or “as well”, whereas in German “also” means “therefore”. Would you say the English word proves that German is wrong?