(note: this post has been closed to comments; comments about it on other pages will be deleted!)
Okay, so I'm still on vacation, but it's clearly time for me to step back into this. Thanks to the people at ScienceBlogs, namely Good Math/Bad Math and EvolutionBlog, for taking up my cause. And thanks very much to the people in the comments of my other posts for doing their best to explain. Some of my explanations here use some of their ideas.
Let's do this by classifying the problems people have with my posts, from the most ridiculous to the least (roughly):
You must be a public school teacher, and I fear for your students. You don't know enough math to teach it. Stop filling their heads with nonsense.
Wow. I actually teach at a private school. Some of my former students have gone on to get perfect scores on the SATs, study advanced math at top-level universities, and place well at the national level in math competitions. I have a B.A. in math from a top-20 school. All the information online (you can start here and here) agrees with me. In addition, while I know that there are public high school teachers who are not very good with math, all of the ones I know who teach at, say, the pre-calculus level or higher also agree with me. Please don't insult the entire profession just to discredit me. If you're that worried, go to graduate school, study math, and teach it yourself.
.999... is clearly less than 1, but mathematics isn't advanced enough to handle infinity, so you can't prove it. My intuition isn't flawed, math is.
This is another one that's so wrong that I barely know where to start. Historically speaking, this debate is quite old. In the 19th century, this apparent paradox (.999... is clearly less than 1 vs. .999... equals 1) was addressed by some great mathematicians (Cauchy and Dedekind, among others). In particular, they formalized the notion that the real numbers are infinitely finely divisible, and in their formulations, all arithmetic operations worked the way they seemed like they should. Using their formulations, all proofs result in .999... = 1. Those formulations are discussed more at the Wikipedia site linked above. Mathematics has been handling infinity well (using definitions that don't even require the use of the word 'infinity') for at least 100 years now. Until you've studied that work, you might want to be careful about saying that math can't handle infinity.
Variations on: 1 - .666... = .444..., but .999... - .666... = .333..., so 1 and .999... can't be equal.
You'd think that people trying to argue with mathematicians would at least check their work:
1 - .666... = .444...? If that were true, then to check it, I should be able to add .666... and .444... and get 1. But .6 + .4 is already exactly 1, and .666... and .444... are each strictly larger than .6 and .4, so their sum must be strictly larger than 1. The check fails.
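If you'd rather check this with exact arithmetic than with intuition, here's a minimal sketch in Python (the repeating decimals .666..., .444..., and .333... are the fractions 2/3, 4/9, and 1/3, and the fractions module computes with them exactly):

from fractions import Fraction

two_thirds = Fraction(2, 3)    # .666...
four_ninths = Fraction(4, 9)   # .444...

print(1 - two_thirds)            # 1/3, i.e. .333..., not .444...
print(two_thirds + four_ninths)  # 10/9, which is bigger than 1

So the claimed subtraction fails its own check.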
Variations on:
10x = 9.5
- x = .5
9x = 9

So x = 1 and also x = .5. SEE!! I can use your stupid method to prove that 1 = .5!
If x is .5, then 10x is 5, not 9.5. Your two starting equations are inconsistent with each other, so of course you can derive something false from them. (In the real proof, both equations are true of the same x.)
(A lot of birds with one stone):
10 * .999... isn't 9.999..., it's 9.999...0
1 - .999... is .000...1
2 * .999... is 1.999...8
This is a common mistake among my students. You're mistaking notation for mathematics. Notation is not mathematics. Mathematics is the study of ideas about patterns and numbers, and we have to invent notations to communicate those ideas. Just because you can write something that looks mathematical, that doesn't imply that what you wrote has meaning. The number written as .999... has meaning:
9*(1/10) + 9*(1/100) + 9*(1/1000) + ...
Every 9 in the list means 9*(1/something). But .000...1, for example, is an abuse of notation. It doesn't correspond with any meaning, so it doesn't communicate anything. If you think it means something (and putting 1 at the END of an ENDLESS list of zeros shouldn't mean anything), you're going to have to tell me what's in the denominator of the fraction represented by that decimal place. If you can't tell me that denominator, you're not using the notation right. If you tell me the denominator is 'infinity', then see the next entry.
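To make 'the denominator of the fraction' precise: a decimal expansion is, by definition, just compact notation for a sum (in LaTeX, since plain text gets clumsy here):

0.d_1 d_2 d_3 \ldots = \sum_{n=1}^{\infty} \frac{d_n}{10^n}

Every digit d_n sits over a definite denominator 10^n for some whole number n. The final 1 in '.000...1' sits over no such denominator, which is exactly why the notation means nothing.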
1 - .999... is 1/infinity, which is a number bigger than 0.
Mathematicians don't really use the noun 'infinity' very much, and when they do, it's usually as a shorthand for an idea that is relatively easy to define without the concept of infinity. They do use the adjective 'infinite' to describe lists and sets, and the adverb 'infinitely' to describe how the real numbers are divisible. While some intuitive ideas can be captured by using the idea of 'infinity' as a kind of number, you have to be very careful with it, and standard arithmetic doesn't usually work. But as an intuitive idea, anything that might be written as 1/infinity never behaves differently from the number 0. I can't prove that, however, since 1/infinity doesn't really mean anything. Using infinity as a number creates fallacies that even the doubters of .999... = 1 would disagree with.
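If it helps, here's a small sketch in Python using exact rational arithmetic: cutting .999... off after n nines leaves a gap of exactly 1/10^n, and that gap drops below any positive number you care to name.

from fractions import Fraction

for n in (1, 5, 10, 50):
    n_nines = Fraction(10**n - 1, 10**n)  # 0.9, 0.99999, etc.
    print(n, 1 - n_nines)                 # the gap is exactly 1/10^n

The only non-negative number that is smaller than every one of those gaps is 0, which is why anything you'd want to call 1/infinity can't behave differently from 0.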
.999... effectively equals 1, but it doesn't actually equal 1. It rounds off to 1. You can never really write .999... going on forever because you can't live long enough to write it.
These are all really arguments that claim that .999... isn't really a number, and that you therefore have to stop writing it or round it off at some point. Look, either you allow the possibility that there could be infinitely many (note the use as an adverb!) decimal places or you don't. If you don't allow it, you'll have a lot of trouble with proofs that pi or the square root of 2 can't be written as numbers with finitely many decimal places. If you do allow it, you have to be prepared to discuss what happens when all of those places equal 9 at the same time, and you have to discuss it without rounding or talking about when they end.
.999... = 1 if you allow limits, but not if you're just talking about numbers. The limit of the series isn't the same as adding up the numbers.
EvolutionBlog, linked at the beginning of this post, has an excellent discussion of this. The upshot is that once you admit that there's an infinite geometric series here (which you have admitted as soon as you merely write .999...), there is no difference between the limit and what the thing actually equals. They have to be the same under any definition that is internally consistent.
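For the record, the computation is short. The n-th partial sum of the series is 1 - 1/10^n, so in LaTeX:

0.999\ldots = \lim_{n \to \infty} \sum_{k=1}^{n} \frac{9}{10^k} = \lim_{n \to \infty} \left( 1 - \frac{1}{10^n} \right) = 1

There is no room between the partial sums and 1 for .999... to live anywhere else.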
Your fraction argument only works if I admit that 1/3 equals .333.... I don't think it does, so I don't think your argument works. 1/3 can't be precisely expressed with decimal numbers.
Well, at least the people who argue this are not abusing notation, and they're not attacking me personally, and they understand that the assumptions have to be correct in order for the argument to be correct. So I'm giving some credit to this one. But unfortunately, .333... really does equal 1/3. If you think 1/3 equals some other decimal, you're going to have to tell me what it is. If you think that you can't express it with decimals, then remember that the very word 'decimal' itself comes from our base 10 number system, and that's a biological coincidence due to our 10 fingers. In a different base, 1/3 might be no problem, but 1/2 might be. Any problem results from notation, not from the concept of 1/3. Remember, notation is not math, notation just communicates math.
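If you want to see the base-dependence for yourself, here's a quick sketch in Python (the helper function and its name are mine, just for illustration): it generates digits of a fraction in any base by repeatedly multiplying by the base and peeling off the integer part.

from fractions import Fraction

def expansion(frac, base, count=8):
    # First `count` digits of `frac` after the point, written in base `base`.
    digits = []
    for _ in range(count):
        frac *= base
        digit = int(frac)    # peel off the integer part: that's the next digit
        digits.append(digit)
        frac -= digit        # keep only the fractional part
    return digits

print(expansion(Fraction(1, 3), 10))  # [3, 3, 3, 3, 3, 3, 3, 3]: repeats forever
print(expansion(Fraction(1, 3), 3))   # [1, 0, 0, 0, 0, 0, 0, 0]: exactly 0.1 in base 3
print(expansion(Fraction(1, 2), 3))   # [1, 1, 1, 1, 1, 1, 1, 1]: 1/2 repeats in base 3

Exactly as promised: 1/3 is no problem in base 3, and 1/2 picks up the endless repetition instead.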
Okay, I really have to go now...I'M ON VACATION, PEOPLE!!!
Thanks... this clears up a problem I was having. I thought that if you subtracted .999... from 1, you would get .111... But this clearly can't be right, because then .999... plus .111... would add up to a number larger than one. Great article!
Posted by: mark | June 21, 2006 at 10:41 AM
Wow, I too can't believe the brouhaha that resulted from your post. I came over from Good Math/Bad Math, and when I saw that you were closing in on 100 comments, I was astonished.
Enjoy your vacation
Posted by: franky | June 21, 2006 at 11:51 AM
I just don't see it. Logically it doesn't make sense. I see the math, but how on a number line could 2 different numbers in a sense be one? It's not two different forms of writing it. I see that 1/3 + 1/3 + 1/3=1 but there just seems no effective way of representing 1/3 as a decimal and still retain what it is.
I can't blindly say I believe your explanation until I understand it, so convince me.
Posted by: sean | June 21, 2006 at 12:40 PM
"I see that 1/3 + 1/3 + 1/3=1 but there just seems no effective way of representing 1/3 as a decimal and still retain what it is."
x = 0.33...
[mult. both sides by 10]
10x = 3.33...
[subtract x from both sides]
9x = 3
[div both sides by 9]
x = 3/9 = 1/3
So, 0.33... = 1/3
It is. Not approximately.
Posted by: j | June 21, 2006 at 12:46 PM
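One way to see why the subtraction step above is safe (a numerical sketch, not a new proof): run the same trick on longer and longer cut-off versions of 0.33... with Python's decimal module and watch 10x - x close in on exactly 3.

from decimal import Decimal

for n in (5, 10, 20):
    x = Decimal("0." + "3" * n)  # 0.33... chopped after n digits
    print(n, 10 * x - x)         # 2.99997, then 2.9999999997, ... -> 3

Every extra digit shrinks the shortfall by another factor of 10, and the actual 0.33..., which never gets chopped, has no shortfall at all.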
"x = 0.33...
[mult. both sides by 10]
10x = 3.33...
[subtract x from both sides]
9x = 3
[div both sides by 9]
x = 3/9 = 1/3
So, 0.33... = 1/3
It is. Not approximately. "
Your math fails.
You can't start off with x = 0.33... since 0.33... is not a number.
Posted by: Adam | June 21, 2006 at 01:24 PM
Sean, you're getting tripped up on language. They're not actually two different numbers; .999... and 1 are two different *names* for the same *point* on the number line. If you think of our notation as simply being names for abstract entities (numbers, or points on the number line), the fact that .999...=1 is just saying that the same point has two names. Which isn't so strange; the fraction 1/2 also has the names 2/4, 5/10, 100/200, etc.
And Adam, .333... is a perfectly good number. Or rather, a perfectly good name of a perfectly good number.
Posted by: Davis | June 21, 2006 at 01:37 PM
0.33... is a number. It's two pieces of a six-slice pizza.
Posted by: j | June 21, 2006 at 01:40 PM
Adam, all you're doing is announcing to the world that you are a fool. Decimal representations are perfectly fine for numbers. They're just notation for a very well-understood and well-formulated concept. YOU obviously just do not understand them.
Come back when you're done with 3rd grade.
Posted by: Garthnak | June 21, 2006 at 02:29 PM
"Adam, all you're doing is announcing to the world that you are a fool."
No, I am announcing to the world that the world is not flat, and everyone is calling me a fool for it.
History does repeat itself.
I stick by my word, no matter how blind everyone may be.
"And Adam, .333... is a perfectly good number."
Prove it is a number, show me it.
"0.33... is a number. It's two pieces of a six-slice pizza."
No, two pieces of a six-slice pizza is 2/6 or 1/3 or a six-slice pizza.
Posted by: Adam | June 21, 2006 at 02:36 PM
*of a six-slice pizza.
Posted by: Adam | June 21, 2006 at 02:38 PM
"Prove it is a number, show me it."
I believe you are being deliberately obtuse.
If you are asking us to express 1/3 in decimal form without using ... or bars or brackets or any other shorthand for recurring decimals, then there isn't time in the universe to express it or space on this blogger's server to store it.
Go read.
http://en.wikipedia.org/wiki/Repeating_decimal
Posted by: j | June 21, 2006 at 02:43 PM
Yeah, you're a real Galileo, fighting against the raging hordes of oppressive, ignorant...mathematicians. They might throw a protractor at you!
You don't have to write out an entire number in order to reason about it, if there is an identifiable pattern or abstraction. In the case of 0.333..., I can tell you what any digit of the number is. Just ask me, and I'll tell you. I don't even have to write it all out, because I have this thing in my head called a brain. It's pretty cool, actually. Brains are great for recognizing patterns. You know, when you use them. See, I can take my brain and point it at "0.333..." and it instantly knows that the ellipsis describes the rest of an infinite series, so it can predict that EVERY digit following the initial 0.333 is going to be a "3". Because MY brain knows how real numbers work. Therefore, it can think, "hmm, how many times would that go into 1?" And it knows: three times. Or if it can't figure it out, it could even write out that real number as an infinite geometric series, like I did in my comment on the other post, and then determine that the series converges to 1/3.
Your brain obviously does not understand patterns and cannot reason about them. You should have that looked at.
Posted by: Garthnak | June 21, 2006 at 02:46 PM
"If you can asking us to express 1/3 in decimal form without using ... or bars or brackets or any other shorthand for recurring decimals, then there isn't time in the universe to express it or space on this blogger's server to store it."
That's the whole point, you can't prove it. I am representing a point in a way that you find out on your own. That is far from being obtuse.
And as for your post, Garthnak, it is not the stray protractor that I should watch out for, it is people like you who, out of frustration over a losing debate, resort to belittling me instead of keeping a mature, non-cynical tone.
*dodge*
Posted by: Adam | June 21, 2006 at 03:10 PM
I'll just quote our host's words, point you again to the wiki article on repeating decimals and leave it at that:
"But unfortunately, .333... really does equal 1/3. If you think 1/3 equals some other decimal, you're going to have to tell me what it is. If you think that you can't express it with decimals, then remember that the very word 'decimal' itself comes from our base 10 number system, and that's a biological coincidence due to our 10 fingers. In a different base, 1/3 might be no problem, but 1/2 might be. Any problem results from notation, not from the concept of 1/3. Remember, notation is not math, notation just communicates math."
In base 10, in standard real numbers, 0.99... = 1
Posted by: j | June 21, 2006 at 03:26 PM
Losing? You've provided no proof, only bizarre claims that only demonstrate a poor understanding of mathematics. I'll concede I'm losing when you've successfully disproven a single one of our host's proofs. Stop flattering yourself - I'm frustrated not by losing, but by your obtuseness and outright ignorance. It's like arguing with a two-year-old, who's just SO convinced that the Sun revolves around the Earth.
Posted by: Garthnak | June 21, 2006 at 03:44 PM
"But unfortunately, .333... really does equal 1/3."
Still waiting for the proof.
"If you think 1/3 equals some other decimal, you're going to have to tell me what it is."
There is no number in base 10 that represents 1/3 perfectly.
"In a different base, 1/3 might be no problem, but 1/2 might be."
Showing that different bases can represent these problem fractions without repeating decimals would only help my case, not hinder it.
"Any problem results from notation, not from the concept of 1/3. Remember, notation is not math, notation just communicates math."
What is your point?
Posted by: Adam | June 21, 2006 at 03:57 PM
"You've provided no proof, only bizarre claims that only demonstrate a poor understanding of mathematics."
First, I want to point out that you did not deny my original claim that you resorted to belittling me instead of keeping a mature, non-cynical tone; instead you jumped straight for the bait. I added 'losing' specifically to see if you would skip everything and attack that first, thus showing me whether you actually had an open mind or were just defending your ego.
Now that my social engineering showed me the truth, I will refrain from further communication with you as it will only be a waste of my time.
Post what you will after this, I'm not going to see it as I rest my case and bid you good day.
Ciao~
Posted by: Adam | June 21, 2006 at 04:04 PM
No, I didn't deny anything, but I have no apologies for being cynical at this point.
However, my mind IS open - if you can provide any proof of your position. These guys have done their leg-work; you do yours. You have demonstrated a seeming failure to understand repeating decimal notation, so I don't know where we can really go from there, hence my frustration - which I believe should be understandable. Oh, and bravo, you exposed an obvious, unhidden truth through your clever use of "social engineering."
I shall await your complete disproofs with bated breath. Calling "1/3 = 0.333..." into question is not allowed unless you have some ABSOLUTELY BRILLIANT new insight that HUNDREDS OF YEARS of mathematicians have SOMEHOW MISSED. Remember, we don't live in Galilean times any more. It's not like "1 = 0.999..." is written in some holy book somewhere; it's something we believe we've proven, time and again, in numerous different ways.
Oh, and if you have such a new insight about decimals, I would really, truly be interested in that. You'll be world-famous. We'll name the damned thing after you - the Adam Principle, demonstrating conclusively that all mathematics using decimal notation has introduced gaping flaws. If you've come up with that, I'll be the first to apologise, shake your hand, and buy you a beer at the nearest drinking establishment. So please, make with the proofs.
Posted by: Garthnak | June 21, 2006 at 06:24 PM
Dearest Adam,
I am new to this post, and I've come with a very open mind, I can assure you. In any argument, I am the kind of person who tends to drift towards the side that is least well argued. Unfortunately, I believe that at this moment, that side is yours. However, a "Devil's Advocate" (so to speak) is a necessary component in any group discussion. I am very willing and interested to hear your reasons for believing that .999... does not equal 1.
You wrote: "Showing that different bases can represent these problem fractions without repeating decimals would only help my case, not hinder it."
Could you please explain this point? I do not understand.
Posted by: Steve Albright | June 21, 2006 at 07:08 PM
This whole circus is amazingly unbelievable.
Posted by: Dave | June 21, 2006 at 11:33 PM
Awesome discussion. I'm happy to see that my math education was thorough. I have seen that proof before and needed no convincing, but your arguments and descriptions are great. Kudos.
Posted by: Tree | June 22, 2006 at 01:42 AM
"I am representing a point in a way that you find out on your own."
--Adam
This is a meaningless sentence. Too bad Adam has left the party.
Boy, Polymath is some kinda bad-ath, isn't he?
Posted by: versus | June 22, 2006 at 09:38 AM
If you're in the UK, the Guardian also mentioned this in their weekly maths/stats column by Gavyn Davies (former chairman of the BBC).
http://www.guardian.co.uk/Columnists/Column/0,,1803028,00.html
I'm fairly certain now that he reads this blog, or maybe scienceblogs which links to this.
IMHO, the best way to illustrate that 0.999... = 1 is by elucidating what 0.999... actually is. Any other way really requires an implicit algebra of limits to be rigorous - it doesn't obviously make sense that you can multiply 0.333... by 3 to get 9s in every position.
If we define 0.99..., or indeed infinite decimal expansions in general, as the limit of a sequence (e.g. 0.9, 0.99, 0.999, ...), then the result should be obvious. Or maybe that's my university maths education talking.
Posted by: FhnuZoag | June 22, 2006 at 10:52 AM
Adam: you don't know what you are talking about. Go and buy a book on real analysis and you will see how these sorts of problems were solved in the 19th century, and also hopefully learn what a decimal is, and then you will see that 0.3r is a perfectly good one.
The problem is that you are trying to reason using something you think you understand, when you actually don't. I would be interested to know: do you have a problem with things like sqrt(2) and Pi, whose decimal expansions don't even repeat?
Posted by: richardj | June 22, 2006 at 03:08 PM
If Adam ever comes back to read here, maybe he'll be nice enough to explain to me why 0.333... does not equal 1/3.
By definition, 1/3 means "1 divided by 3", right? So, if we go back to classic long division (this is really basic math here) and start dividing 1 by 3, what happens?
3 can't go into 1, so the first number we get is 0, and now we have to go down a decimal place to make that 1 into a 10. 3 can go into 10 three times, so
10-(3x3)=10-9=1
So far we have 0.3 with a remainder of 1. We continue our long division by turning that remainder of 1 into another 10 and dividing by 3 again. (The answer of 3 with a remainder of 1 repeats forever.)
Unless Adam is saying that basic long division is fundamentally flawed, I can't see where he gets the notion that 1/3 doesn't equate to 0.3333(repeating 3s).
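That remainder loop is easy to mechanize. Here's a minimal sketch in Python (the function name is just for illustration) doing the same long division step by step:

def decimal_digits(numerator, denominator, count=10):
    # Long division: first `count` digits of numerator/denominator after the point.
    digits = []
    remainder = numerator % denominator
    for _ in range(count):
        remainder *= 10                          # make the 1 into a 10
        digits.append(remainder // denominator)  # 3 goes into 10 three times
        remainder %= denominator                 # leaving a remainder of 1, every time
    return digits

print(decimal_digits(1, 3))  # [3, 3, 3, 3, 3, 3, 3, 3, 3, 3]

The remainder of 1 recurs forever, so the 3s do too.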
If he (or someone else) claims that 1/3 is not an actual number in reality, then the very presence of fractions such as "half" or "quarter" is suspect as well.
Posted by: Monimonika | June 22, 2006 at 03:12 PM