
Comments

Adam

I wrote a short computer program to solve this.

CODE:

Try
    ' Ask the computer to settle it once and for all.
    If 1 = 0.9999999999... Then
        Console.WriteLine("True")
    Else
        Console.WriteLine("False")
    End If
Catch ex As Exception
    Console.WriteLine(ex.Message)
End Try

The result: "Error: Can not convert theoretical values into real world values."

There you have it folks! End of discussion.

Mathgeek292

I think I can solve this issue with a little thing called common sense. Let's say I have an infinity dollar bill. Great! I'm on my way to Gluephoria, my local teachers' store. No, seriously. Wouldn't I rather have 9999999... dollars? I mean, they'd never end, right? Each time I looked in my wallet, there would be another nine dollars. So, problem solved. 9999... is more.

Michael

I can't believe people believe this. Adding more nines on the end of 0.9 makes it closer to one, but it will NEVER get there.

A googolplex has more zeros in it than there are subatomic particles in the universe. Even with a googolplex of nines after the decimal point, it still won't equal one.

Infinity is no different. The difference between 0.999... and 1 is infinitely small, but to claim that it doesn't exist is complete nonsense.

j

"The difference between 0.999... and 1 is infinitely small, but to acknowledge that it doesn't exist is complete nonsense."

What is the difference between 0.999... and 1? Can you express it? What can be added to 0.999... to make 1?

What's the difference between "infinitely small" and 0?
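
One way to pin that down, as a quick sketch in Python with exact fractions (the variable names are just for illustration):

CODE:

from fractions import Fraction

# 0.99...9 with n nines is exactly (10**n - 1) / 10**n, so its gap to 1
# is exactly 1 / 10**n. Name any positive number as "the difference" and
# enough nines undercut it; 0 is the only value the difference can take.
for n in (1, 5, 10, 20):
    nines = 1 - Fraction(1, 10**n)
    print(n, 1 - nines)    # 1 1/10, then 5 1/100000, and so on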

Adam Drake

Michael:
Your intuition is completely natural. However, mathematics is tricky and often your intuition will fail you.

I have written a proof that .9999... = 1 on my site at http://adrake.blogdns.com

If you prefer, you can download the pdf from the following link
http://adrake.blogdns.com/wp-content/uploads/proof.9equals1.pdf

If you have any questions or would like to discuss the problem or others, please contact me.

Another example of intuition being wrong is the idea that there is only one quantity called infinity.

In actuality, some infinities are BIGGER than others.

For example, the list of integers is infinite in length and consequently in size.

Your first instinct says that a list with all the integers plus the halves between them must be larger. Surprisingly, it isn't: the two lists can be paired off one-to-one, so they are the same size. The real numbers, on the other hand, cannot be paired off with the integers at all...a genuinely BIGGER infinity.

Tricky thing that infinity.
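
If it helps, here is a minimal sketch of such a pairing in Python (the function name is mine, purely for illustration):

CODE:

from fractions import Fraction

def nth_half_integer(n):
    # Pair the natural number n with one half-integer, hitting each
    # exactly once: 0, 1/2, -1/2, 1, -1, 3/2, -3/2, ...
    # Since every half-integer receives exactly one n, the combined
    # list of integers plus halves is no bigger than the naturals.
    k = (n + 1) // 2 if n % 2 else -(n // 2)
    return Fraction(k, 2)

print(*(nth_half_integer(n) for n in range(7)))
# 0 1/2 -1/2 1 -1 3/2 -3/2

No such pairing exists for the real numbers; that is Cantor's diagonal argument, and it is what makes the reals a strictly larger infinity.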

Warmest Regards,
Adam Drake

Ughb

"What's the difference between "infinitely small" and 0?"

Well, 0 is nothing. Infinitely small is something, but not quite nothing. I'd assume it would be the closest number to 0 that you could possibly express.

The problem is how to express it, assuming you can't do something like 0.000...1

JJ

Oh, oh, I'm afraid you've just introduced your (great) blog's trolls to that Wikipedia article...

j

"The problem is how to express it, assuming you can't do something like 0.000...1"

No, you can't. There's no such thing as a 1 at the end of an infinite number of zeros. There is no end to an infinite number of zeros: every digit in a decimal expansion sits at some finite position, and an endless run of zeros has no last position for a 1 to occupy.

ondra

http://qntm.org/pointnine

Kevin

Just because you can't express something doesn't mean it doesn't exist. It just means your method of expression is insufficient. Infinitely small and not at all are significantly different.

j

So, Kevin, you are willing to believe in numbers so tiny that our understanding of mathematics is insufficient to even express them...

...but you won't believe the mathematical proofs shown here and elsewhere time and again.

Why's that?

Cat

j: I think you just answered your own question.

ib

The "non-believers" have to just suck it up and listen to the experienced mathematicians here.

This 1 = 0.999... deal isn't something that a few crackpots just dreamt up. This is time-tested, proven, and BASIC mathematical fact.

j

'The "non-believers" have to just suck it up and listen to the experienced mathematicians here.'

That's the thing. No, they don't. There's no need for arguments from authority here. The proof is in the proof.

Garthnak

But j, you have to understand the terms in the proof in order to comprehend it. These folks obviously can't even begin to understand the proofs, so it's just easier to say "listen to the experienced mathematicians". They won't even believe us that we're using the language of math correctly; they have their own bizarre beliefs about how numbers and ADDITION and SUBTRACTION work. How do you correct THAT?

j

What gets me is that people will call into question decimal representation, base 10, algebraic proof, even mathematics itself rather than admit that this makes sense.

Blob

Here's a table that shows how much you'd have to add to make each decimal equal to 1:

Decimal   Amount short of 1
0.9       0.1
0.99      0.01
0.999     0.001

It doesn't matter how many nines you add at the end, because there will always be that small remainder. And it's _impossible_ for it to go away.

Davis

Sheesh, I'd have to fail half of you from my math classes.

There is a *huge* gap between having finitely many 9s and infinitely many. Very, very huge, and that's where so many of you are being led astray.

Numerous valid proofs have been presented in the discussion showing that .999...=1. As far as math goes, that's the end of the story *unless you can find errors in the proofs.* It's fine to have these misunderstandings about math, especially regarding a subtle fact like this -- ignorance can be corrected with effort. It's not fine to obstinately insist your understanding of math somehow surpasses that of every working mathematician on the planet (who all understand this fact).
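
To spell the gap out: a finite string of 9s is a partial sum, while 0.999... denotes the limit of those partial sums (that is the standard definition, not something invented in this thread):

0.99...9 (n nines) = \sum_{k=1}^{n} 9/10^k = 1 - 10^{-n}

0.999... = \lim_{n \to \infty} (1 - 10^{-n}) = 1

Every finite partial sum falls short by exactly 10^{-n}; the limit falls short by nothing.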

Ken Summers

Polymath, you have just become my idol (and no doubt, my daughter's just as soon as I email this link to her)

Jonathan Gennick

I was thinking about this problem last night. The assertion that 1 = 0.999... is unintuitive, and it bothers me. Last night I finally put my finger on why the assertion bothers me, and, for what it's worth, I'll lay out my argument here.

(Note: Math is a game, and it's all about the rules. I don't have enough stature to lay down those rules. I'll lay down an argument against the idea that 1 = 0.9999..., but at the end of the day I have to accept that I'm a follower of the rules, not a maker. Someone like Stephen Wolfram might have enough stature to make some new rules, but I am not Stephen! I'll come back to this issue at the end of my response.)

Interestingly, at the end of his article, the guy says:

"The only way out for you now, if you still don't believe it, is to deny the very existence of the number .9999...."

And indeed, my line of thought forces me to deny that 0.999... is a number. That may not be a well-accepted assertion, and I'm not sure I'm really willing to live with the consequences (because I don't know what they all are!), but bear with me a bit while I explain.

The crux of the issue here is one of representation. A repeating decimal does not represent a "number". It represents our best attempt to *approximate* a number that our decimal system does not allow us to represent precisely.

A repeating decimal does not so much imply a number as it does a division problem that will take forever to solve. Consider representing the fraction 1/9 in decimal notation. We can begin by writing:

0.1

But that's really 1/10, leaving us just a bit short of 1/9. We can get our representation closer to 1/9 by adding 0.01 to get:

0.11

But we're still short, so we can add 0.001 to get 0.111, and we can keep on adding small amounts forever. This is why we commonly say that:

1/9 = 0.1111111...

But it's not really correct to say that "one-ninth *equals* point-one repeating". It's more correct to say that "the division of 1 by 9 in our decimal system will leave us writing 0.111... forever, and we'll never, ever get to the answer, because our decimal system cannot represent the precise result." That infinitely long series of 0.111... will not "equal" 1/9. It is simply our way of representing the result of a division problem without end. Thus, 0.111... is not a number.
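
To make the "division without end" concrete, here is a small sketch of the long division in Python (the names are mine, for illustration only):

CODE:

def decimal_digits(numerator, denominator, how_many):
    # Grade-school long division: one digit after the decimal point
    # per step. For 1/9 the remainder is 1 after every single step,
    # so the process never terminates.
    digits = []
    remainder = numerator % denominator
    for _ in range(how_many):
        remainder *= 10
        digits.append(remainder // denominator)
        remainder %= denominator
    return digits, remainder

print(decimal_digits(1, 9, 12))
# ([1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1], 1)

The remainder never reaches 0, and that is exactly the "forever" I mean.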

Well, what about numbers like pi and e that go on forever? I'm out of my depth in arguing about those numbers, but it seems to me that in pi and e we have managed to define numbers that we are unable to represent precisely. Thus, we use the pi and e symbols to represent those values. In the real world, when we want a discrete result from an expression involving pi or e, we must settle for an approximation.

The whole business of multiplying 0.999... by 10 to get 9.999... is then invalid, because 0.999... is not a number. In order to do that math, we must first decide on a number, and that means no repeating decimals. We'd need to decide on a discrete value to use in place of 0.999..., and the moment we do that, the rest of the "proof" falls apart.
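
(For readers who haven't seen it, the argument I'm objecting to is usually written out like this:

x = 0.999...
10x = 9.999...
10x - x = 9.999... - 0.999... = 9
9x = 9, so x = 1

My objection is to the very first line, which assumes that 0.999... names a number that can be multiplied at all.)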

Now, I need to come back to "math as a game". There are implications to the assertion that 0.999... is not a number. Am I willing to live with those implications? I'm honestly not so sure. Mathematicians have been "playing the game" and thinking about the implications for centuries, far longer than I've been around. If people who have spent their whole lives playing the game have decided that the rules work out better when repeating decimals such as 0.999... are considered to be numbers, then I am reluctant to toss out their collected wisdom on the basis of one consequence of those rules that I don't (initially) like.

I'm not fully convinced either way yet on the issue of whether point-nine repeating is the same as one. I'm keeping an open mind on the point for now. I do know a few mathematicians whom I trust not to mislead me with numerical trickery, and I'm hoping to get the chance to discuss the issue with them.

Adam

"Numerous valid proofs have been presented in the discussion showing that .999...=1."

If you could show me a mathematical proof that 1 + 1 = 3, that does not mean 1 + 1 = 3; it means there is something wrong with the laws of our math in general.

We know instinctively that 1 does not equal 0.999999...
If you can use math to show differently, then that proves not that 1 = 0.99999... but that there is something wrong with your math, or the laws of our math itself.

Thus, every proof shown in these discussions that tried to show 1 = 0.999... is wrong.

1 != 0.999...

The problem here is that usually only math teachers understand the problem well enough to explain it, and unfortunately they are also the least likely candidates to step out of the box and dare to consider that the laws of math they swear by are actually at fault.

Adam

The post above this one by Jonathan Gennick hit the nail on the head.

0.9999... is not a number and therefore cannot be used in a mathematical equation until you define its finite limit.

Thus all proofs used to show 1 = 0.999... fall apart from the start since they all failed to convert the 0.999... into an actual number first.

Like I said above in my previous post, if your proof can prove that 1 + 1 = 3, it is not our intuition that is wrong; it is your math.

j

More "math is wrong" arguments.

"The crux of the issue here is one of representation. A
repeating decimal does not represent a "number". It
represents our best attempt to *approximate* a number that
our decimal system does not allow us to represent precisely."

Wrong. Repeating decimals are not approximations.

0.33... = 1/3

Equals. Not approximates. Not is almost. Equals.

Any finite representation of that infinite decimal is an approximation.

It is possible for a number to have multiple decimal representations.

0.5 = 0.49...
0.25 = 0.249...
and
1.0 = 0.9...
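
The first of those, for example, is just a geometric series evaluation:

0.4999... = 4/10 + \sum_{k=2}^{\infty} 9/10^k = 4/10 + (9/100)/(1 - 1/10) = 4/10 + 1/10 = 1/2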

tastee freeze

It is actually sort of liberating to read the resolute posts expressing denials of not only the proof, but mathematics in general. Instead of feeling that someone MIGHT get it after a few more attempts at explanation, one realizes that no further explanation is necessary because none will be accepted. Who would've thought that talking until you're blue in the face would feel so good?

Jonathan Gennick

[begin quote]
Wrong. Repeating decimals are not approximations.

0.33... = 1/3

Equals. Not approximates. Not is almost. Equals.
[end quote]

From a certain point of view, I can see this. Yet at the same time I can think from a different point of view that leads me to disagree. This is where my "math is a game" comment comes into play. The rules we accept are the foundation of our arguments, and our arguments are pointless unless we all agree to play by the same rules, use the same definitions, etc.

I don't know the guy behind this blog, so I don't yet have the level of trust required to take his word for the underlying rules. Nor do I know the person whom I quoted well enough to take his word for anything. The original blogger did get me interested in the issue though, enough that I'm digging into it on my own. I give him credit for that. It's been an interesting question to think about too.

I remain for now with an open mind. I've a Wikipedia article to read, some people to talk to, etc.
