 tastee freeze said: "... one realizes that no further explanation is necessary because none will be accepted."

That is not true. I might be won over. I might be the one person who proves your statement false :-). I just haven't been won over *yet*. There's a trust issue here. Not that I distrust you. Please don't get me wrong. It's just that if I am going to begin believing and saying that 0.999... = 1, then I need verification and explanation from sources that I fully trust.

Indeed, at least one math expert whom I've talked to today confirms that it is generally accepted in the world of math that 1 = 0.9999.... Now, I still want to read and hear that from other sources before I fully buy into it.

More later, maybe. I really have to get back to work now. Good thing my boss doesn't know how much time I've burned today thinking about whether 0.999... really does equal 1. Sheesh.

"Wrong. Repeating decimals are not approximations.

0.33... = 1/3

Equals. Not approximates. Not is almost. Equals. "

Prove to me that 0.33... = 1/3 perfectly.

I want you to perform this calculation on paper and show me the answer when you're finished, without using expressions such as ... or the infinity symbol.

As soon as you show me the math done to completion, I will believe that 1/3 = 0.333... instead of it being an approximation.

Good luck, 'cause you will need it.

"If you could show me a mathematical proof that 1 + 1 = 3, that does not mean 1 + 1 = 3; it means there is something wrong with the laws of our math in general."

So you're saying that something is wrong with math because you don't like the fact that .999...=1? Gosh, I wish I had known that before spending all these years studying math in graduate school.

"We know instinctively that 1 does not equal 0.999999..."

Do you *really* have an instinct for what .999... is? I doubt it. Your intuition is leading you astray, which happens often when you start getting into unexpected results in math. The interesting thing is that I can't at the moment think of any division problem other than 1/3, or one that reduces to 1/3, that will yield an infinite series like 0.333... That alone suggests that maybe it is reasonable to think in terms of 1/3 being equal to 0.333...

OTOH, the argument that 0.333... never quite reaches 1/3 is a very intuitive one. I can certainly sympathize w/ it.

BTW, if someone can come up with a division problem that yields 0.333... without also reducing to 1/3, I'd be interested in hearing about that. I fiddled with the problem a bit this morning, but couldn't come up with anything other than 1/3.

Keep an open mind, Adam. Read some other sources too. It's important to rely on more than one person's blog.

Jonathan, you won't be able to come up with any other division problem that does that, because .3333... = 1/3.

.33333 is an approximation; .33333... is not.

I took some time over lunch to read the Wikipedia article. It's a well-done article, and I'm now ready to accept that 0.999... = 1.

The "From first principles" section in the Wikipedia article is interesting. I don't fully understand everything in that section yet, but I understand enough that I'm willing to buy into the conclusion.

The "Alternative algebras and expansions" section of the Wikipedia article is interesting as well. Apparently there are some number systems in which 0.999... <> 1. At least the section hints at that. But I'm way out of my depth at this point, and so-called standard reals are all I really care about right now.

I can even make the conclusion that 0.999... = 1 intuitive in my own mind. It's just a matter of adjusting the way that I think about numbers, of adjusting my underlying mental model.

What I find interesting about all of this is that ultimately, at some level, it seems to me that 0.999... = 1 is true because we say it's true. A friend once pointed out to me that mathematicians strive for elegance, and orthogonality, and for consistency (i.e., they hate exceptions). It then seems to me that a guiding principle in determining the rules of math is that rules are accepted when they increase elegance and consistency, and rejected when they lead to exceptions and inelegance.

Maybe I'm way off base here, but is it possible that 0.999...=1 is true because all the underlying rules (axioms?) and assumptions and definitions that lead to that conclusion also lead to desirable elegance and consistency? And if we decided that we really wanted 0.999...=1 to be false, would the ripple effect from that decision not then force us to revise underlying assumptions and definitions and axioms to the point of increasing exceptions and decreasing elegance? Surely the question of whether 0.999...=1 is bound up in the very fabric of mathematics, and to change the answer to that question is to change the fabric, yes?

I know I probably have a strange way of thinking about things. You all probably think I'm crazy by now. At least you've convinced me on the question of what 0.999... really means.

x = 0.33...
[mult. both sides by 10]
10x = 3.33...
[subtract x from both sides]
9x = 3
[div. both sides by 9]
x = 3/9 = 1/3

"So you're saying that something is wrong with math because you don't like the fact that .999...=1?"

Nope. Never said that.

"Do you *really* have an instinct for what .999... is? I doubt it."

No, I have an intuition that 1 does not equal 0.999...

I have an open mind; I see the points people like j and Davis are trying to make, but at the same time I see faults in their logic.

"A friend once pointed out to me that mathematicians strive for elegance, and orthogonality, and for consistency (i.e., they hate exceptions)."

This is a problem, as it shows that mathematicians are more worried about things being simple and pretty than about being correct and precise. Instead of admitting that there could actually be something wrong with their math, they just invent new rules and math to circumvent the exceptions. I've heard the term 'beautiful math' used to describe these methods of working around an exception to make it 'just work', and I find that very unscientific.

Kinda like how superstring theory's math falls apart so badly that they had to invent the concept of us living in a 16-dimensional universe in order for their math to 'just work'.

(book ref: HyperSpace by Michio Kaku)

"x = 0.33...
[mult. both sides by 10]
10x = 3.33...
[subtract x from both sides]
9x = 3
[div both sides by 9]
x = 3/9 = 1/3"

Sorry, you have been disqualified for using expressions like ... to represent your math instead of doing the actual work as per my original instructions. No cookie for you. :/

If you don't allow me to write 0.33..., I can't very well prove anything about it, now can I?

"..." doesn't represent my math. It it the decimal representation of 1/3, which is what we're talking about. "If you don't allow me to write 0.33... I can't very well prove anything about it, now can I?"

Exactly!

You can't prove that 1 = 0.99...!

That was my point.

Also, there is no 100% precise representation of 1/3 using decimal.

"Also, there is no 100% precise representation of 1/3 using decimal."

Yes, there is.

It's 0.33...

This is essentially what you're asking me to do, from your own words:

"Prove to me that 0.33... = 1/3 perfectly without using expressions such as ..."

Tell me how I could _possibly_ do this.

"This is a problem, as it shows that mathematicians are more worried about things being simple and pretty than about being correct and precise."

It's clear that you've never met any mathematicians.

There's absolutely no fault in the logic given in any of the proofs here. Yes, you have an intuition that .999... does not equal 1, but intuition is not logical proof. I have an intuition that there shouldn't be space-filling curves (there are) and functions that are everywhere continuous but nowhere differentiable (there are), but that intuition is wrong.

The fact that .999...=1 follows from the axioms of the Real number system; it also follows from the study of infinite series in calculus. It's a very subtle fact, and there's nothing wrong with finding it hard to believe. However, it is a fact nonetheless.

You want him to prove that "0.333... = 1/3" without using "0.333..."? Now who's being unreasonable?

This is as good as you're getting:

0.3333... = 0.3 + 0.03 + 0.003 + ...

= 3 * 0.1 + 3 * 0.01 + 3 * 0.001 + ...

= 3 * (1/10)^1 + 3 * (1/10)^2 + 3 * (1/10)^3 + ...

= Σ(k=1,∞) 3 * (1/10)^k

= lim(n→∞) Σ(k=1,n) 3 * (1/10)^k

= 3 * (1/10)^1 / (1 - (1/10))

= (3/10) / (9/10)

= 30 / 90

= 1 / 3
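For anyone who wants to see the convergence concretely, here's a quick Python sketch of my own (not part of the proof above) that sums the first few terms of Σ(k=1,∞) 3 * (1/10)^k using exact fractions:

```python
from fractions import Fraction

# Partial sums of 3*(1/10)^1 + 3*(1/10)^2 + ... computed exactly.
# After k terms the gap to 1/3 is exactly 1/(3*10^k), so the
# partial sums close in on 1/3 without ever exceeding it.
partial = Fraction(0)
for k in range(1, 8):
    partial += 3 * Fraction(1, 10) ** k
    print(partial, "gap to 1/3:", Fraction(1, 3) - partial)
```

The gaps print as 1/30, 1/300, 1/3000, ...: each term closes nine-tenths of what remains, which is exactly the geometric-series behavior the convergence formula captures.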

In step 6 I use the property of the convergence of an infinite geometric series:

http://en.wikipedia.org/wiki/Geometric_series#Infinite_geometric_series

I will accept the premise that 1/3 = 0.3333... and 2/3 = 0.6666..., because this is just a limitation of the base-10 (decimal) system. However, unless someone can show me a math problem using REAL numbers yielding an answer of 0.9999..., I say that the premise is DEAD wrong and this is garbage.

3/3 = 1.0, NOT 0.9999...

Adam: a proof without ellipses:

Long division. I've been working on this for days (yeah, right!). Actually for a few minutes anyway.

1/3 is one-third. Also called 1 over 3. Also known in division as 1 divided by 3. So by long division

1 / 3 = 3 tenths, remainder 1: 0.3

I have a remainder, the same remainder as the original value, but at the tenths place. Intuitively, at this point, I know that every time I divide 3 into 1, regardless of where in the decimal places, the next decimal place will be 3 with a remainder of 1. The first six terms are

0.3 r .1
0.33 r .01
0.333 r .001
0.3333 r .0001
0.33333 r .00001
0.333333 r .000001

Each with a remainder 1 to be divided by 3. This is practically the definition of a never-ending series. It doesn't matter where I stop, I will always have more work to do. So unlinking from day-to-day reality and digging into the abstract, I have to realize that the proper representation that exactly equals 1/3 is 0.33333|3 with the threes continuing forever. Even past the end of all time and all existence, the 3s will continue, never a place to stop and rest, because there's always that remainder 1 ready for another round of divide by 3.
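That repeating remainder is easy to mechanize. Here's a small Python sketch (the function name is my own) that runs the long division one decimal place at a time and shows the remainder 1 coming back at every step:

```python
# Long division of 1 by 3, one decimal place at a time.
# The remainder after every step is 1 again, so the quotient
# digit 3 repeats forever: exactly the pattern described above.
def long_division_digits(numerator, denominator, n_digits):
    digits = []
    remainder = numerator
    for _ in range(n_digits):
        remainder *= 10
        digits.append(remainder // denominator)
        remainder %= denominator
    return digits, remainder

digits, remainder = long_division_digits(1, 3, 6)
print(digits, remainder)  # [3, 3, 3, 3, 3, 3] 1
```

Running it for any number of digits gives all 3s with remainder 1 left over, so the process never terminates; compare 1/4, where the remainder hits 0 and the expansion stops.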

If you were to write this as a limit, you might sum the terms 3*(1/(10^n)) for n starting at 1 and continuing forever. The value of the limit is exactly 1/3.

The tricky part is understanding that even though humans cannot actually reach the never-ending nth position, we can know the sum of that sequence.

In this case, 1/3.

There, I never once used your prohibited terms.

On a side note, what about divergent series? I've read one which tried to show that 2 = 1...

lim(x→∞) (1 + (1/x))^x = e

If 1/x > 0, then 1 - (1/x) < 1 for every finite x, and as x → ∞,

1 - (1/x) = .9999999...

Mathematicians are not making up rules to make things work. If 0.9999... were different from 1, then our whole notion of real numbers would be wrong. The issue here is that precisely defining real numbers is not easy. The definition of a real number is NOT a decimal expansion (possibly infinite).

To construct the reals, start with the rationals and take Cauchy sequences of rational numbers. A Cauchy sequence is a sequence (a_n) such that for all epsilon > 0 there exists an N such that for all n, m > N, |a_n - a_m| < epsilon. Intuitively, Cauchy sequences are sequences that have a limit (although we can't talk about this limit yet, because it's a "real number").

Take the set of all Cauchy sequences of rational numbers. For those who know what it is, this set is a ring with respect to the usual addition and multiplication.

Look at all Cauchy sequences with limit 0 (we can talk about these, because we already constructed 0). This set of Cauchy sequences is (by definition) the real number 0. Formally, this set is an ideal of the ring of all Cauchy sequences. To construct all real numbers, we take the ring of all Cauchy sequences of rationals and mod out by this ideal (Cauchy sequences converging to 0). The real numbers are the elements of this quotient ring.
What this amounts to saying is that a real number is defined as a set of Cauchy sequences of rational numbers that also get arbitrarily close to each other as n goes to infinity. In particular, the constant sequence 1, 1, 1, ... and the sequence 0.9, 0.99, 0.999, ... are two Cauchy sequences of rationals that get arbitrarily close to one another, so they both ARE 1.
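A tiny Python sketch of that last claim (illustrative only, with exact rationals): the constant sequence 1 and the sequence 0.9, 0.99, 0.999, ... differ by exactly 10^-n at position n, so the distance between them goes to 0, which in this construction is precisely what it means for the two sequences to name the same real number.

```python
from fractions import Fraction

# Termwise distance between the constant sequence 1, 1, 1, ...
# and the sequence 0.9, 0.99, 0.999, ... as exact rationals.
# The distances are 1/10, 1/100, 1/1000, ... shrinking toward 0.
ones = [Fraction(1)] * 7
nines = [1 - Fraction(1, 10**n) for n in range(1, 8)]
distances = [a - b for a, b in zip(ones, nines)]
print(distances)
```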

Hope this helps get some intuition about why this is both precise and correct.

I have noticed numerous references to the fact that:

"Adding more nines on the end of 0.9 makes it closer to one, but it will NEVER reach there."

I did not know a number could flux.

Also, has anyone mentioned this proof?

0.9999... = 0.9 + 0.09 + 0.009 + 0.0009 + ...

= 9·0.1 + 9·0.01 + 9·0.001 + 9·0.0001 + ...

= 9·10^-1 + 9·10^-2 + 9·10^-3 + 9·10^-4 + ...

= Σ(n=1,∞) 9·10^-n

:= lim(N→∞) Σ(n=1,N) 9·10^-n

= lim(N→∞) ( 9·10^-1 + 9·10^-2 + ... + 9·10^-N )

= lim(N→∞) ( 9·0.1 + 9·0.01 + ... + 9·0.000...0001 )   [N digits in the last term]

= lim(N→∞) ( 0.9 + 0.09 + ... + 0.000...0009 )   [N digits in the last term]

= lim(N→∞) ( 0.999...999 )   [N nines]

= lim(N→∞) ( 1 - 0.000...0001 )   [N digits]

= lim(N→∞) ( 1 - 10^-N )

= lim(N→∞) 1 - lim(N→∞) 10^-N

= 1 - 0
= 1
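The pivotal step in that chain (N nines after the point equals 1 minus 10^-N) can be checked with exact arithmetic; a quick Python sketch of my own:

```python
from fractions import Fraction

# 0.9...9 with N nines is the exact rational (10^N - 1) / 10^N,
# which equals 1 - 10^-N; the gap to 1 is 10^-N and shrinks to 0.
for n in range(1, 8):
    n_nines = Fraction(int("9" * n), 10**n)
    assert n_nines == 1 - Fraction(1, 10**n)
    print(n_nines, "gap to 1:", 1 - n_nines)
```

Every finite truncation falls short of 1 by exactly 10^-N, and the limit of that gap is 0, which is the whole content of the last few lines of the proof.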

Thanks to: http://qntm.org/pointnine

I think I can solve this issue with a little thing called common sense. Let's say I have an infinity dollar bill. Great! I'm on my way to Gluephoria, my local teachers' store. No, seriously. Wouldn't I rather have 9999999... dollars? I mean, they'd never end, right? Each time I looked in my wallet, there would be another nine dollars. So, problem solved. 9999... is more.

---

You're missing the point. How is 9999... more than infinity? 999... EQUALS infinity.

---

uhhh, that post was intended to be satire. i'm sure, because i know the poster.

---

you people are such nerds

---

Who are these people who think it more reasonable that math is mistaken than that their intuition is? Most normal uses of the word intuition imply a guess as to the nature of something. Don't these people realize that by guessing (or intuiting) they are acknowledging that they might be wrong? It seems to me that this case is a good time to declare one's guess wrong, no?

---

I don't think 0.99.. is equal to 1 because... ok, let's backtrack a bit. When is it equal to 1? After you add '9' infinitely many times. But there is no 'after' after an infinity, because otherwise it's not infinite. Therefore 0.99.. is a misleading way to describe a limit. Any description should expressly deal with the key idea of 'after infinite operations'. That's exactly why it's misunderstood.

.99.. is not YET equal to one. It will be, in an infinite amount of years, starting... NOW. You can only post saying that it's equal after you've written all those numbers and taken a photo as proof. 1/3 is not .333.. for the same reason. It will be, eventually, but it's not just yet.

Going from the other direction: .999.. is defined as the result of adding these successive numbers if they could be added as a final operation of an infinite sequence, and if we agree to define an imaginary final operation as resulting in a number equal to 1, then this is it.
