(note: this post has been closed to comments; comments about it on other pages will be deleted!)
Okay, this is going to be my last post on the topic of how .999...=1, started in this post, and continued here. I will engage in more refuting, but I have begun to see how useless the refutations are because many of the "non-believers" (and I use that word jokingly because, as I wrote here, I don't really consider it a matter of belief) don't bother to visit the refutations page, or don't read it if they do. This blog has gotten over 70,000 hits since the original post was on the front page of digg. The discussion there and at numerous other small forums makes it clear that the refutations aren't being read. There have even been meta-discussions about how a post on this fact could end up flagged on digg as containing inaccurate information, even though every knowledgeable source agrees with me. The only reasonable criticism I found on the digg site is that this doesn't belong in the news, because the proof has been around for so long that it ought not count as news. Unfortunately, enough people seem to disbelieve it that even old news needs to be explained just one more time, in the hopes that a few people will come to understand the math better.
But on to the refutations. This time, I will do them in decreasing order of sophistication of the argument. This puts the most reasonable first. (And again, I give full credit to some of the comments for their help in refuting.)
Variations on: 1 - .999... = 1/infinity, which is greater than zero.
I already discussed this a little bit, but there's more to be said. While there are some mathematicians who have formalized the notion of infinitesimals, those systems require a very careful formal extension of the real numbers to what they call the hyperreal numbers. I admit that I am not well versed in this material; it goes well beyond what I have studied. However, I understand enough to know that the hyperreals are not meant to replace the real numbers, but to extend them. The inventors of hyperreal numbers and the algebra of infinitesimals assuredly still agree that within the real numbers (a caveat that I have been very careful to include ever since this objection arose), .999...=1.
Within the real numbers, 1/infinity (which I will represent in this essay with the variable E, for epsilon, a standard name for very small (but not infinitely small!) numbers) behaves just like the number zero. As an example of this, I will refute:
Variations on: .333... doesn't really equal 1/3. It is just slightly less than 1/3 in the same way that .999... is just slightly less than 1.
(Now remember, I have to put some words into objectors' mouths, since they do not present their objections precisely, which is the whole problem. I apologize if I mischaracterize an objection somehow.) The only way to make this consistent, I think, is to claim that:
1/3 = .333... + E (remember that E is 1/infinity)
This would mean that:
1 = 3/3 = 3(.333... + E) = .999... + 3E
So if you object like this, tell me about this 3E number. Is it greater than E? If it is, then .999... isn't really as close as you can get to 1, is it? "Aha, Mr. Polymath! You yourself said that you can't use normal algebra on infinity, so it's still possible that the 3E number (the residue from tripling 1/3) might be the smallest possible positive number!" Yes, indeed, I said that. But if you're going to say that E and 3E are really the same thing, then you are saying exactly what I'm saying. If E and 3E are the same, then E is behaving exactly like the number zero. Or, if you don't like using algebra on this, we can just consider it like this: if .333... lacks something when it tries to be 1/3, and if .999... lacks the same thing when it tries to be 1, then either that lacking quantity is behaving just like the number 0, or .999... is actually not as close to 1 as you can get.
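(For anyone who would rather see this computed than argued: here is a small Python sketch, purely my own illustration rather than part of the proof, that does exact arithmetic on finite truncations of .333.... The names in it, like third_n and gap, are just labels I made up.)

```python
# My own illustration (not part of the argument): exact arithmetic on finite
# truncations of .333..., using Python's built-in Fraction type.
from fractions import Fraction

for n in (1, 5, 10, 20):
    third_n = Fraction(10**n // 3, 10**n)   # .3, .33333, ... (n threes)
    gap = Fraction(1, 3) - third_n          # what the truncation lacks vs. 1/3
    tripled_gap = 1 - 3 * third_n           # what 3 * truncation lacks vs. 1
    print(n, gap, tripled_gap, tripled_gap == 3 * gap)

# Every finite truncation lacks something, but that lack is 1/(3*10**n), and it
# shrinks below any positive number you care to name as n grows. The only real
# number smaller than all of these gaps is 0, which is exactly the point:
# within the reals, E behaves just like zero.
```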
Variations on: Why do you insist that there be a number between .999... and 1? Can't .999... simply be the next number down from 1?
The real numbers have been defined in such a way as to guarantee that they are "dense". That means that between any two real numbers, you can always find another real number (infinitely many, actually, but certainly at least one). The easiest one to find is typically the average (x+y)/2. But whether you specifically refer to the density property or not, I would still challenge these objectors to tell me what happens if you add .999... and 1, and then divide by 2. I assume they would have to answer that 1+.999... = 1.999..., and then 1.999.../2 is what? .999...5? I covered in the last refutations essay how .999...5 is an abuse of notation and doesn't have any meaning because you can't tell me the denominator of the place value represented by the 5. Sooooo.....what is 1.999.../2? If it's something larger than .999..., then you've contradicted yourself since now .999... is not the "next number down" from 1. If you say it's less than .999..., then you've just completely ruined the idea of taking an average and ending up between the numbers you started with. And if you say it's equal to .999... then (using N for .999...) you're saying that:
(1 + N)/2 = N, which means (if you multiply both sides by 2):
1 + N = 2N, which means (if you subtract N from both sides):
1 = N, which is what I've been saying all along.
The only thing left for you to say is that the average of 1 and the "next number down" from 1 doesn't exist, since that average can't be less than, greater than or equal to .999.... And if the average doesn't exist, you'll have to deny the basic fact (called closure) that if you add two real numbers or divide a real number by 2, you'll end up with another real number. I suspect you're not really trying to deny the closure of the reals.
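(One more sketch of my own, in the same spirit: using exact fractions, you can check that the average of 1 and any finite string of nines always lands strictly between them, so no finite truncation can play the role of the "next number down.")

```python
# My own illustration: the average of 1 and a finite string of nines is always
# strictly between the two, for every length of string you try.
from fractions import Fraction

for n in (1, 3, 6, 12):
    nines_n = 1 - Fraction(1, 10**n)    # .9, .999, ... (n nines)
    average = (1 + nines_n) / 2
    print(n, nines_n < average < 1)     # True every time

# The only real number N satisfying (1 + N)/2 = N is N = 1, which is just the
# three lines of algebra above, solved.
```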
Variations on: .333... is an approximation of 1/3, but it's not actually equal to 1/3.
These objectors seem to think of the number .333... as having some independent existence from the number 1/3, and thus as having the option of being the same as or different from it. Actually, .333... is merely a notational description of what happens when you divide 3 into 1, which is the meaning of 1/3 (see below). If you do long division to determine the decimal equivalent of 1/8 (and I'm not going to demonstrate that here), you'll find that the tenths place contains a 1, the hundredths place contains a 2, the thousandths place contains a 5, and the ten-thousandths place (and every place after that) contains a 0. We write that as 1/8=.125 with no problem (although it clearly could also be written as .125000...). When you try the same thing with the long division for 1/3, you'll find that the tenths place contains a 3, the hundredths place contains a 3, the thousandths place contains a 3, the ten-thousandths place contains a 3, and eventually you'll notice the pattern. Every place you might ask about contains a 3. How are we going to write that? We have decided to write it as .333.... That's all. It's just a notation. .333... has to be the same as 1/3 because it's merely a notation for the result of dividing 1 by 3.
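(If you'd like to watch that long division happen, here is a short Python sketch of my own; the function name decimal_digits is simply something I made up for the illustration.)

```python
# My own illustration of the long-division process described above: carry the
# remainder along, one decimal place at a time, and record each digit.
def decimal_digits(numerator, denominator, places):
    digits = []
    remainder = numerator % denominator
    for _ in range(places):
        remainder *= 10
        digits.append(remainder // denominator)   # the digit for this place
        remainder %= denominator                  # what is left to divide
    return digits

print(decimal_digits(1, 8, 8))   # [1, 2, 5, 0, 0, 0, 0, 0]  i.e. .12500000
print(decimal_digits(1, 3, 8))   # [3, 3, 3, 3, 3, 3, 3, 3]  i.e. .33333333

# For 1/3 the remainder is 1 at every single step, so every place you ask about
# contains a 3. ".333..." is just the notation we settled on for that outcome.
```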
Variations on: 1/3 isn't really a number because you can't truly divide anything equally into 3 parts without a residue left over.
(Biiiiig breath here...) Wow. 1/3 isn't a number. This represents a pretty basic problem with the understanding of how numbers work, and explaining something this basic to someone who doesn't understand is probably futile, but here goes.
None of these objectors was really claiming that 1/2 isn't a number. They see that dividing 1 by 2 gives .5, but they claim that 3 doesn't really "fit" into 1 (they might be interested in my theory of remainders). While I can (sort of) see how you might give a special privilege to dividing something into halves that you don't want to give to dividing into thirds, do you really also want to give that special privilege to 1/5 (which also has a nice, easy decimal representation)? Or even 1/125? (Of course the reason some numbers make nice, easy decimals is the accident of our base 10 counting system. You can't easily take a third of a dollar, but you can take a third of 90 cents.)
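(Here's one more thing you can check yourself, sketched below in Python; this is my own illustration of the "accident of base 10" point, not something from the earlier posts: 1/d comes out as a terminating decimal exactly when d has no prime factors other than 2 and 5, which is why 1/2, 1/5, and 1/125 look "nice" while 1/3 doesn't.)

```python
# My own illustration: 1/d terminates in base 10 exactly when stripping out all
# factors of 2 and 5 leaves nothing, because only then does some power of 10
# divide evenly by d.
def terminates_in_base_10(d):
    for p in (2, 5):
        while d % p == 0:
            d //= p
    return d == 1

for d in (2, 3, 5, 8, 125):
    print(d, terminates_in_base_10(d))

# 2, 5, 8, and 125 terminate (.5, .2, .125, .008), and 3 does not. That says
# nothing about the number 1/3 itself; it's an artifact of counting in tens.
```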
The fact is, it's so possible to divide 1 by 3 that the very meaning of the symbol 1/3 is "the number you get when you divide 1 by 3". We created 1/3 to actually be the result of dividing 1 by 3, and we made up a symbol for it (namely, 1/3) that is supposed to remind you of it. If you don't think 1/3 is really a number, you'd better stop all those 2nd-grade kids who are learning about it.
(Whew, that's the best I can do at that one.)
(Actual quote from comments) "Is is NOT and WILL not EVER be exactly 1, because, by DEFINITION, it is LESS than 1. If you fail to understand this, then you simply fail to understand the definition of .9 repeating. I was sick in school, and never went on to uni and higher math, but even I know you're waay wrong." (emphasis mine)
Of course, by now you'll know how I'm going to answer this person mathematically: the definition of .999... (the value that the sequence .9, .99, .999, ... closes in on) makes it exactly equal to 1, not less than 1. But that's not my point in bringing up this objection, which basically is just disagreement by intuition.
This comment (and others, of which this is the most blatant) seems to imply the following:
"You mathematicians are just sooooo pointy-headed and obstinate that you have your little blinders on and you can't think outside the box. Out here in the real world, we know what's what, and we don't have to study your stupid Cauchy sequences or Dedekind cuts to know stuff about math. In fact, not having studied formal mathematics actually gives me more credibility, so you should believe what I say."
Along with this seems to come the notion that I am irresponsible for teaching this, and that people like me are what's wrong with education today—namely, a bunch of geeks who don't teach anything about the real world.
No, no, no, no, no. The thing that's wrong with education today is that so many people (not just kids) just want the shortcutting, life-applicable, "give me the bullet" answer without taking the time to really think about anything. Do you truly believe that in the thousands of years of brilliant, creative thinking about math, no one noticed that .999... seems at first to be less than 1? The sun seems to revolve around the earth, but we don't question the astronomers. Matter seems to be infinitely divisible, but we don't question atomic physicists. Consuming sugar seems to be the right thing to do when I feel that my blood-sugar level is low, but I didn't question my doctor when he (correctly) explained that this would (counterintuitively) just perpetuate the low-blood-sugar cycle. We trust professional chefs to make our restaurant food taste good, airline mechanics to fix our planes, and architects to design our skyscrapers. All of these people have spent years and years studying their fields, and they hold lifetimes of research and intuition in their heads. But mathematicians are just too dorky and clueless to understand a basic (although slightly counterintuitive) fact about representing numbers in our base 10 notation system? Give me a break.
I'm not claiming to be right on everything, just this fact about math. And I'm not claiming to be the greatest teacher in the world. But I'm not what's wrong with our education system. Your non-appreciation for rigorous math, and the centuries of thought that went into it, is what's wrong with our education system. Sticking your fingers in your ears and going La-La-La-La-I-Won't-Listen-To-Math-Geeks-La-La is what's wrong with our education system. Show some humility.