
How do you know you are not "A.I."?

Ouroboros

Coincidentia oppositorum
Actually, if they repeat, they're not irrational, i.e., they can be written as a fraction.
They're rational.
Yup.

That's the very definition of rational numbers: they can be expressed as a ratio (a/b). The decimal expansion can be finite or recurring (which is what 1/3 is).
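A quick arithmetic illustration of where that leads (the classic informal argument, not a formal proof):

    1/3 = 0.333...
    3 x (1/3) = 3 x 0.333...
    1 = 0.999...

If you accept that 1/3 and 0.333... are the same number, multiplying both sides by 3 forces 0.999... and 1 to be the same number too.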
 

LegionOnomaMoi

Veteran Member
Premium Member
My (made up) short hand for representing the difference between the two would be: 0.000...1. I honestly feel that covers the difference between 1.0 and .999...
The problem, however, is where that last "1" goes in your representation of the distance between 0.999... and the "1". No matter how many 0's you put in front of that 1 to form the number that is the distance between 1 and 0.999..., we find that there is no such distance between the two. So, for example, if you state that the distance between 0.999... and 1 is 0.00001, I can demonstrate that this can't be so because clearly 1 - 0.99999999 is smaller than 0.00001.
Your short hand is actually a great demonstration that 0.999...= 1. You keep making your "distance" number (i.e., 0.000...1) smaller and smaller, and I keep showing that the number 0.999... is closer to 1 than the "distance" number you provide. No matter how small of a number you give me, saying "here's the difference between 0.999.... and 1", I can show that there is no such distance between 0.999... and 1. This is because, in fact, there quite literally isn't any distance between 0.999... and 1, as they are equal.
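If it helps to see the game played out, here is a tiny Python sketch of it (illustrative only; the function name and numbers are mine, and it uses exact fractions so floating-point rounding can't muddy the result):

    from fractions import Fraction

    def nines(n):
        # 0.999...9 with n nines, as an exact fraction: 1 - 10**(-n)
        return 1 - Fraction(1, 10 ** n)

    # Propose any positive "distance" between 0.999... and 1, e.g. 0.00001:
    gap = Fraction(1, 10 ** 5)

    # A finite truncation of 0.999... already sits closer to 1 than that,
    # so the proposed gap cannot be the difference:
    print(1 - nines(6) < gap)   # True, since 10**-6 < 10**-5

The same move works for any positive gap you propose (just take one more nine than your gap has decimal places), and that is the whole point: no positive number can be the distance.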
 

LegionOnomaMoi

Veteran Member
Premium Member
Seems like one assertion is just an assumption without a way to verify.
The method used to verify involves limits or epsilonics or sequences or one or more of several tools that serve as foundations for the whole of the real number line and any and all operations on it (and more than just the real number line; much more). The problem is that this isn't trivial (well, it actually is, but "trivial" is a relative term). It is, for example, vastly easier to show that 0.999... and 1 are equal than it is to show that one can perform simple arithmetic with real numbers (as, among other things, this requires formally constructing the real numbers, something we teach students after a few semesters of calculus and other college level math).
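For what it's worth, the shortest version of the standard computation treats 0.999... as the geometric series the notation stands for:

    0.999... = 9/10 + 9/100 + 9/1000 + ... = (9/10) / (1 - 1/10) = 1

where the last step is the usual closed form for a geometric series with ratio 1/10 (a sketch of the limit argument, not a full formal proof).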
 

viole

Ontological Naturalist
Premium Member
Still not seeing it as verifiable. Still haven't seen that shown in this thread. I'm not able to debunk it, as I'm claiming it is not (inherently) verifiable. Needs an (inherent) limit put on infinite decimals to (strongly) suggest it must be equal to 1, rather than (forever) unequal to 1.

It is very easy to verify. If 0.9999999999... were different from 1, then there would be a rational number in between and different from both. There is no such number.

In other words, if they were different, then a = 1 - 0.999999999... >= 0 would be nonzero. So, let's suppose a > 0. But that is absurd, since 0.999999... converges to one, and so the distance between them can be made arbitrarily small, contradicting the premise of a minimal positive distance. Ergo a = 0. Therefore

1 - .9999999... = 0 which entails 1 = 0.999999....
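To spell out the "arbitrarily small" step with one inequality chain: suppose a > 0, and pick n large enough that 10^-n < a. Since 0.999... is at least 0.99...9 (n nines), which equals 1 - 10^-n, we get

    a = 1 - 0.999... <= 10^-n < a,

which is absurd. Hence a = 0.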

Ciao

- viole
 

Ouroboros

Coincidentia oppositorum
The method used to verify involves limits or epsilonics or sequences or one or more of several tools that serve as foundations for the whole of the real number line and any and all operations on it (and more than just the real number line; much more). The problem is that this isn't trivial (well, it actually is, but "trivial" is a relative term). It is, for example, vastly easier to show that 0.999... and 1 are equal than it is to show that one can perform simple arithmetic with real numbers (as, among other things, this requires formally constructing the real numbers, something we teach students after a few semesters of calculus and other college level math).
My son who's getting his math major told me a few months back that he finally learned the proof (I think) for what numbers are and how to add. Something with empty sets and stuff. Apparently, it's more complicated than I thought.
 

Acim

Revelation all the time
The problem, however, is where that last "1" goes in your representation of the distance between 0.999... and the "1". No matter how many 0's you put in front of that 1 to form the number that is the distance between 1 and 0.999..., we find that there is no such distance between the two.

Because it is unverifiable?

So, for example, if you state that the distance between 0.999... and 1 is 0.00001, I can demonstrate that this can't be so because clearly 1 - 0.99999999 is smaller than 0.00001.
Your short hand is actually a great demonstration that 0.999...= 1. You keep making your "distance" number (i.e., 0.000...1) smaller and smaller, and I keep showing that the number 0.999... is closer to 1 than the "distance" number you provide. No matter how small of a number you give me, saying "here's the difference between 0.999.... and 1", I can show that there is no such distance between 0.999... and 1. This is because, in fact, there quite literally isn't any distance between 0.999... and 1, as they are equal.

I would say because it is not verifiable, to assume equal or to assume infinitely unequal are assumptions. Thus a matter of faith really. I have yet to see verification that .999... equals 1. I prefer to see it as infinitely unequal, but I recognize that is an assumption because there is no way to know / verify that it is unequal (or equal).
 

LegionOnomaMoi

Veteran Member
Premium Member
Because it is unverifiable?

It is verifiable. You've been given two informal demonstrations. A formal proof would say much the same, but without all the ambiguity and appeals to intuitive (or even counter-intuitive) notions, concepts, etc.

I would say because it is not verifiable, to assume equal or to assume infinitely unequal are assumptions.
They aren't, actually, but that doesn't matter because assumptions are a great way to start a proof (using a method of proof called "proof by contradiction"). If you make an assumption, and find that this assumption implies a logical contradiction, then it cannot be true. For example, imagine I assume that there are only finitely many integers (i.e., at some point there is an integer greater than all others). I know that, given any integer, I can always add 1 and get another integer. Call the largest integer that I've assumed to exist n. I can form the integer n+1, which is necessarily greater than n, which by assumption is the greatest integer. Thus this assumption leads to a logical contradiction, and I've proved my assumption is false.

Likewise, let us denote the number 0.999... by L. Let us assume, as you do, that this number is not equal to 1. It follows then that there must be some difference (or "distance") between L and the number 1 (equivalently, you are assuming that |L-1| > 0, because both numbers are positive and if the absolute value of the difference between L and 1 were 0, it would mean the two numbers are the same).
Now we see if your assumption leads to a logical contradiction. Again, your assumption implies that there must be some difference between the two numbers. So there must be some positive number/value, which we will denote by ε, that separates/is between L and 1. You can pick any value for ε you like (make it as small as you wish) and you CANNOT EVER find a value that separates L and 1 such that 1 - L = ε.

Now, such demonstrations (even if you were shown them formally) may not convince you because infinities are often conceptually difficult and lead to unintuitive results. But you rely on this kind of reasoning all of the time, even without knowing it. Numbers like pi are uniquely defined only because we can do this very same thing: show that a unique sequence (3, 3.1, 3.14, 3.141, ...) converges to pi and only pi, the same way 0.999... converges to 1. In fact, the entire real number line is made "complete" and rigorous because every Cauchy sequence of rational numbers converges to a real number. Exponential functions, square roots, logarithms, even the Pythagorean theorem (when sides a and b are 1, making the length of c the square root of 2) require this method.
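To make "converges" concrete: any two truncations of pi past the n-th digit differ by less than 10^-n, so the truncations bunch up arbitrarily tightly (they form what is called a Cauchy sequence). Completeness of the reals is exactly the guarantee that every such sequence has a real number as its limit, and the uniqueness of that limit is what lets an infinite decimal expansion name a single number.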



My son who's getting his math major told me a few months back that he finally learned the proof (I think) for what numbers are and how to add. Something with empty sets and stuff. Apparently, it's more complicated than I thought.

The construction of the real numbers is anything but trivial. Yes, it is still undergraduate mathematics, but

1) that's only because some of the most brilliant mathematicians who ever lived took several centuries developing the mathematical tools and subjects necessary just to be ABLE to rigorously define the real numbers, and we get to stand on their shoulders
&
2) many students (usually those in sciences like engineering or physics) never actually learn what the real numbers "really" are

There are a few ways of doing it, the most common pedagogical method probably being Dedekind cuts, but all of them are typically conceptually challenging for students and involve highly unintuitive results. Perhaps my favorite is the fact that even though the rational numbers are dense in the real number line (i.e., between any two rational numbers there are infinitely many more rational numbers), such that there are no "gaps" (pick any rational number and you can find another rational number arbitrarily close to it), they make up a tiny "amount" of the numbers on the real number line. So in e.g., the interval [0,1] we find
1) there are more irrational numbers in this interval than there are rational numbers in the entire real number line
&
2) despite the fact that the rational numbers are packed so closely together the distances between them can be shown to be arbitrarily small, virtually all of the interval consists of irrational numbers (the rational numbers have measure 0).
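In case that measure-0 claim sounds impossible, the standard covering argument is three lines: list the rationals in [0,1] as q1, q2, q3, ... (they are countable), and given any ε > 0, cover qk with an interval of length ε/2^k. The total length of all the covering intervals is at most

    ε/2 + ε/4 + ε/8 + ... = ε,

so the rationals can be trapped inside covers of arbitrarily small total length.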
 

Acim

Revelation all the time
It is verifiable. You've been given two informal demonstrations. A formal proof would say much the same, but without all the ambiguity and appeals to intuitive (or even counter-intuitive) notions, concepts, etc.

I really do not see the demonstrations as verifying the notion that .999... = 1. I will concede that they do for you, but would stipulate that I still see it as based on assumption.

They aren't, actually, but that doesn't matter because assumptions are a great way to start a proof (using a method of proof called "proof by contradiction"). If you make an assumption, and find that this assumption implies a logical contradiction, then it cannot be true. For example, imagine I assume that there are only finitely many integers (i.e., at some point there is an integer greater than all others). I know that, given any integer, I can always add 1 and get another integer. Call the largest integer that I've assumed to exist n. I can form the integer n+1, which is necessarily greater than n, which by assumption is the greatest integer. Thus this assumption leads to a logical contradiction, and I've proved my assumption is false.

Likewise, let us denote the number 0.999... by L. Let us assume, as you do, that this number is not equal to 1. It follows then that there must be some difference (or "distance") between L and the number 1 (equivalently, you are assuming that |L-1| > 0, because both numbers are positive and if the absolute value of the difference between L and 1 were 0, it would mean the two numbers are the same).
Now we see if your assumption leads to a logical contradiction. Again, your assumption implies that there must be some difference between the two numbers. So there must be some positive number/value, which we will denote by ε, that separates/is between L and 1. You can pick any value for ε you like (make it as small as you wish) and you CANNOT EVER find a value that separates L and 1 such that 1 - L = ε.

This last assertion (especially with the emphasis) strikes me as restating that it is unverifiable.

Again, I assume that .000....1 is the distance between the two. I accept that this is not a rational number, but feel I am able to make sense of it in the way I make sense of .999...
It assumes that at some point (which is irrational) there is a 1 at the end of infinity (also irrational) that would make for the precise difference between .999... and 1 (logical assumption). I do not find it to be verifiable, but also do not find .999... (or .333... or similar such numerical depictions) to be rational.

Now, such demonstrations (even if you were shown them formally) may not convince you because infinities are often conceptually difficult and lead to unintuitive results. But you rely on this kind of reasoning all of the time, even without knowing it. Numbers like pi are uniquely defined only because we can do this very same thing: show that a unique sequence (3, 3.1, 3.14, 3.141, ...) converges to pi and only pi, the same way 0.999... converges to 1. In fact, the entire real number line is made "complete" and rigorous because every Cauchy sequence of rational numbers converges to a real number. Exponential functions, square roots, logarithms, even the Pythagorean theorem (when sides a and b are 1, making the length of c the square root of 2) require this method.

So far nothing you are conveying is convincing me that I use this kind of reasoning all the time, without knowing it, but I am willing to consider that possibility (feel very open to it). Partially because philosophically and/or spiritually, I do think I use this kind of reasoning some of the time, and do so knowingly, but understand it as based on assumptions regarding things that more or less appear irrational, but are seemingly sound/logical. Such as: 'now' is an eternal occurrence. Not sure how I would ever verify that, but it also seems (or is) sound in my understanding.
 

LegionOnomaMoi

Veteran Member
Premium Member
Again, I assume that .000....1 is the distance between the two. I accept that this is not a rational number
It is a rational number.

but feel I am able to make sense of it in the way I make sense of .999...
The difference is that .000...1 terminates. It is not an infinite decimal. Both are rational numbers (an irrational number is not merely an infinite decimal, but a non-repeating infinite decimal). With an infinite decimal like 0.999..., the fact that it never terminates means it gets "infinitely" close to 1 such that there is no number ε greater than 0 that satisfies the equation 1 - 0.999... = ε. The only number which satisfies this equation is 0, and indeed 1 - 0.999... = 1 - 1 = 0.
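One worked line may help: if your proposed gap d = 0.000...1 has k zeros before the final 1, then d = 10^-(k+1), and

    1 - d = 0.99...9   (exactly k+1 nines, then it stops),

which is strictly less than 0.999... (which never stops). So your gap d is already larger than the distance from 0.999... to 1, no matter how large you make k.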
It assumes that at some point (which is irrational)
There are no irrational numbers here. 0.999... is a rational number. Recall that 1/3 = .333....
You might find that the notion is not rational, but it is a rational number nonetheless.


that there is a 1 at the end of infinity (also irrational)
Again, nothing here is irrational, and the number 0.000...1 is necessarily finite. That's why the ellipsis, the "...", appears before the numeral 1. Again, 0.000...1 is a finite decimal, while 0.999... is not. Neither is an irrational number, and the reason that the second is equal to 1 is that your number that separates them is finite (a terminating decimal).

I do not find it to be verifiable
Ok. Do you find anything verifiable that concerns numbers like pi or the square root of 2 or even rational numbers like 1/3 (all of which have infinite decimal expansions)? What do you mean by verifiable? Is 2+2=4 verifiable? Or, slightly more seriously, how would you respond to the tortoise's argument to Achilles here: What the Tortoise Said to Achilles
, but also do not find .999... (or .333... or similar such numerical depictions) to be rational.
You don't find 1/3 to be rational?


So far nothing you are conveying is convincing me that I use this kind of reasoning all the time, without knowing it
It's not so much the reasoning that you use, but rather the reasoning you depend on in various ways without even knowing it. (Do you believe in the Pythagorean theorem? Do you believe that 1/3 is a single number? Do you believe that pi is a number? How about e? Do you believe that probability theory, physics, the technology you use daily (from GPS and navigation systems to your computer and TV), etc., are derived/formulated using faulty assumptions? Do you think logic is subjective?)
 

Acim

Revelation all the time
It is a rational number.

So, I know you state this many times. And I get that for you and likely other seasoned mathematicians it is 'most certainly rational.' I hope you get that for me, it is not. And that however you are using that word, is not making sense to me. How I'm using that word is (along the lines of): logically understood, verifiable by some logical proof. Even that definition isn't great because all you (and others) have to do is say, 'it is logical. We have provided you with the proof to verify this.' So, I go with what my dictionary says 'rational' means and as it has one designated definition for mathematics, I'll provide that: expressible, or containing quantities that are expressible, as a ratio of whole numbers. When expressed as a decimal, a rational number has a finite or recurring expansion.

So, everything up to the last 2 words of the definition is, to me, not occurring with the number .999... because I fully believe (and find logical) that no one (or computer) could ever express the quantity of nines that are intrinsic to that actual number, while I do believe the .999... is very convenient shorthand. I just do not see it as sufficient. The last 2 words of that definition are perhaps all that would matter for another to explain to me that the definition does cover this number in terms of being rational, but it currently doesn't and I'd really appreciate it being explained rather than just told.

I wish I could tie this back to A.I. (topic of this thread) in terms of how I am not understanding it, but am only being told, "it is this." Alas, I don't think I can, but this is for me like someone saying, "who created the universe?" And another person saying "God." And while not a great analogy as there are so many diverse tangents to go from there, just imagine if someone followed that up with, "that doesn't make sense to me, how is that logical?" And the person then responds, "because God is logical. It makes perfect sense." That's how I feel in this thread.

The difference is that .000...1 terminates. It is not an infinite decimal. Both are rational numbers (an irrational number is not merely an infinite decimal, but a non-repeating infinite decimal). With an infinite decimal like 0.999..., the fact that it never terminates means it gets "infinitely" close to 1 such that there is no number ε greater than 0 that satisfies the equation 1 - 0.999... = ε. The only number which satisfies this equation is 0, and indeed 1 - 0.999... = 1 - 1 = 0.

I do get this as an attempt to explain what rational numbers are (and are not), but what I'm getting from this paragraph is:
a) .999... does get infinitely close to 1 (but never reaches it)
b) there is no rational number to fill in the gap (that we know of)
c) we have made up our minds that it must be 0, therefore we are right about it being .999... equaling 1
d) this cannot be disputed. Resistance is futile. You will be assimilated.

(Hey, I did kinda sorta come back to A.I.)

Again, nothing here is irrational, and the number 0.000...1 is necessarily finite. That's why the ellipsis, the "...", appears before the numeral 1. Again, 0.000...1 is a finite decimal, while 0.999... is not. Neither is an irrational number, and the reason that the second is equal to 1 is that your number that separates them is finite (a terminating decimal).

For me, if .999... equals 1, then it is (somehow) necessarily finite; otherwise it is an unverifiable (and therefore irrational) number that is getting infinitely close to 1, but is not equal to 1 (ever). I already pointed out in a previous post how I concede that .000...1 is not rational (by my definition) but that it does go on for infinity and terminates with 1. That does strike me as logically inconsistent (thus irrational) but seems to work wonders in filling that alleged gap between .999... and 1. IMO, the gap arguably doesn't exist, which would seem to indicate (to me) that I'm coming as close as I have to saying it does equal 1. But what I feel I'm stating is that it is irrational to say there is some finite gap to fill between .999... and 1, as that gap to me would also contain some aspect of infinity, hence my .000...1 number. So, I'm essentially saying .999... plus the gap is 1, while I hear you (and many other mathematicians) say, no, the .999... number equals 1.

Ok. Do you find anything verifiable that concerns numbers like pi or the square root of 2 or even rational numbers like 1/3 (all of which have infinite decimal expansions)? What do you mean by verifiable? Is 2+2=4 verifiable? Or, slightly more seriously, how would you respond to the tortoise's argument to Achilles here: What the Tortoise Said to Achilles

By verifiable, I mean it can be observed and/or logically expressed as accurate, though in this case, I would go with 'as precise.' So, all the irrational decimal items you bring up (irrational by my understanding) are not verifiable, but limits to the decimal sure as heck help to work further with those numbers so that further math can be done. By limit, I mean terminating the decimal number at some point, based on convenience and/or assumption (that it can rightfully be terminated and still be fairly accurate, but not precise).

I actually do question or have some skepticism around 2+2=4, but mostly find it to be observably accurate and able to be demonstrated to me how 2 items combined with 2 other items would equal 4 items.

You don't find 1/3 to be rational?

As a fraction, (mostly) yes, as a decimal, no.

It's not so much the reasoning that you use, but rather the reasoning you depend on in various ways without even knowing it. (Do you believe in the Pythagorean theorem? Do you believe that 1/3 is a single number? Do you believe that pi is a number? How about e? Do you believe that probability theory, physics, the technology you use daily (from GPS and navigation systems to your computer and TV), etc., are derived/formulated using faulty assumptions? Do you think logic is subjective?)

Too many questions for me to answer right now. If you feel after reading the rest of my post that asking any one of these questions would help you to help me (understand), then please ask them and we shall see how I respond.
 

viole

Ontological Naturalist
Premium Member
I really do not see the demonstrations as verifying the notion that .999... = 1. I will concede that they do for you, but would stipulate that I still see it as based on assumption.



This last assertion (especially with the emphasis) strikes me as restating that it is unverifiable.

Again, I assume that .000....1 is the distance between the two. I accept that this is not a rational number, but feel I am able to make sense of it in the way I make sense of .999...
It assumes that at some point (which is irrational) there is a 1 at the end of infinity (also irrational) that would make for the precise difference between .999... and 1 (logical assumption). I do not find it to be verifiable, but also do not find .999... (or .333... or similar such numerical depictions) to be rational.



So far nothing you are conveying is convincing me that I use this kind of reasoning all the time, without knowing it, but I am willing to consider that possibility (feel very open to it). Partially because philosophically and/or spiritually, I do think I use this kind of reasoning some of the time, and do so knowingly, but understand it as based on assumptions regarding things that more or less appear irrational, but are seemingly sound/logical. Such as: 'now' is an eternal occurrence. Not sure how I would ever verify that, but it also seems (or is) sound in my understanding.

I believe you are equivocating between sequences and numbers. They are two completely different beasts.

0.99999... is not a sequence. It is a number. Numbers do not approximate other numbers without ever reaching them. Sequences do, so to speak.

In this case, 0.99999... is the limit of the sequence 0.9, 0.99, 0.999, ... The sequence "gets closer to 1" without ever touching it. But its limit is "static". It is a number. And since its limit is 1, and limits of convergent sequences are unique, then 0.99999... = 1.
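Why uniqueness matters here: write sn for the n-th term 0.99...9 (n nines). If the sequence converged both to 1 and to some other number L, then for every n

    |1 - L| <= |1 - sn| + |sn - L|,

and both terms on the right can be made as small as we like. That forces |1 - L| = 0, i.e. L = 1. So whatever number the notation 0.99999... names, it can only be 1.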

Ciao

- viole
 

viole

Ontological Naturalist
Premium Member
So, I know you state this many times. And I get that for you and likely other seasoned mathematicians it is 'most certainly rational.' I hope you get that for me, it is not. And that however you are using that word, is not making sense to me. How I'm using that word is (along the lines of): logically understood, verifiable by some logical proof. Even that definition isn't great because all you (and others) have to do is say, 'it is logical. We have provided you with the proof to verify this.' So, I go with what my dictionary says 'rational' means and as it has one designated definition for mathematics, I'll provide that: expressible, or containing quantities that are expressible, as a ratio of whole numbers. When expressed as a decimal, a rational number has a finite or recurring expansion.

So, everything up to the last 2 words of the definition is, to me, not occurring with the number .999... because I fully believe (and find logical) that no one (or computer) could ever express the quantity of nines that are intrinsic to that actual number, while I do believe the .999... is very convenient shorthand. I just do not see it as sufficient. The last 2 words of that definition are perhaps all that would matter for another to explain to me that the definition does cover this number in terms of being rational, but it currently doesn't and I'd really appreciate it being explained rather than just told.

I wish I could tie this back to A.I. (topic of this thread) in terms of how I am not understanding it, but am only being told, "it is this." Alas, I don't think I can, but this is for me like someone saying, "who created the universe?" And another person saying "God." And while not a great analogy as there are so many diverse tangents to go from there, just imagine if someone followed that up with, "that doesn't make sense to me, how is that logical?" And the person then responds, "because God is logical. It makes perfect sense." That's how I feel in this thread.



I do get this as an attempt to explain what rational numbers are (and are not), but what I'm getting from this paragraph is:
a) .999... does get infinitely close to 1 (but never reaches it)
b) there is no rational number to fill in the gap (that we know of)
c) we have made up our minds that it must be 0, therefore we are right about it being .999... equaling 1
d) this cannot be disputed. Resistance is futile. You will be assimilated.

(Hey, I did kinda sorta come back to A.I.)



For me, if .999... equals 1, then it is (somehow) necessarily finite; otherwise it is an unverifiable (and therefore irrational) number that is getting infinitely close to 1, but is not equal to 1 (ever). I already pointed out in a previous post how I concede that .000...1 is not rational (by my definition) but that it does go on for infinity and terminates with 1. That does strike me as logically inconsistent (thus irrational) but seems to work wonders in filling that alleged gap between .999... and 1. IMO, the gap arguably doesn't exist, which would seem to indicate (to me) that I'm coming as close as I have to saying it does equal 1. But what I feel I'm stating is that it is irrational to say there is some finite gap to fill between .999... and 1, as that gap to me would also contain some aspect of infinity, hence my .000...1 number. So, I'm essentially saying .999... plus the gap is 1, while I hear you (and many other mathematicians) say, no, the .999... number equals 1.



By verifiable, I mean it can be observed and/or logically expressed as accurate, though in this case, I would go with 'as precise.' So, all the irrational decimal items you bring up (irrational by my understanding) are not verifiable, but limits to the decimal sure as heck help to work further with those numbers so that further math can be done. By limit, I mean terminating the decimal number at some point, based on convenience and/or assumption (that it can rightfully be terminated and still be fairly accurate, but not precise).

I actually do question or have some skepticism around 2+2=4, but mostly find it to be observably accurate and able to be demonstrated to me how 2 items combined with 2 other items would equal 4 items.



As a fraction, (mostly) yes, as a decimal, no.



Too many questions for me to answer right now. If you feel after reading the rest of my post that asking any one of these questions would help you to help me (understand), then please ask them and we shall see how I respond.

And you are equivocating between what rational means in mathematics and what it means in general. For instance, your view of rational numbers is not rational :) Rational in math means being the ratio of two integers. Nothing more, nothing less. Therefore 1/3 is rational, by definition. Pi or the square root of 2 are not, because they are not the ratio of two integers.
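(Note, too, that since 0.999... = 1 = 1/1, it is the ratio of two integers and hence rational; the recurring expansion is just a second way of writing the same number.)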

I hope you do not rate imaginary numbers as figments of our imagination.

That reminds me of the old joke about an argument between i (the imaginary unit) and pi (a real, but not rational, number):

I: be rational!
Pi: get real!

:)

By the way, i (the imaginary unit) raised to the power of i is real. Go figure.
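For the curious, the standard computation (taking the principal value): since i = e^(i*pi/2), we get i^i = e^(i * i*pi/2) = e^(-pi/2), which is approximately 0.2079. Perfectly real.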

Ciao

- viole
 

Acim

Revelation all the time
Sorry doesn't help. It's telling, not explaining. May as well just say, "it takes faith to understand our religion. After that, it is perfectly logical."
 

Revoltingest

Pragmatic Libertarian
Premium Member
I think I see the problem.
Some appear to believe that infinity has an end.
That's why they don't accept that when one repeating decimal is subtracted from another (with the decimal point shifted one place), the repeating tails cancel exactly, leaving nothing over.
The definition of "infinite" (i.e., it goes on without limit) makes it work.
Without this, even calculus is just an approximation.
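Spelled out, the shift-and-subtract argument (here with the decimal point moved to the right, the more common form):

    x = 0.999...
    10x = 9.999...
    10x - x = 9        (the infinite tails cancel exactly, because neither one ends)
    9x = 9, so x = 1

The only step a skeptic can reject is the cancellation, and rejecting it amounts to giving infinity an end.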
 

LegionOnomaMoi

Veteran Member
Premium Member
So, I know you state this many times. And I get that for you and likely other seasoned mathematicians it is 'most certainly rational.' I hope you get that for me, it is not. And that however you are using that word, is not making sense to me.
Fair enough.

How I'm using that word is (along the lines of): logically understood, verifiable by some logical proof.
The problem is that even though it was believed that 0.999... and 1 were the same using informal arguments before the epsilon-delta definition of limits was rigorously formulated, it was BECAUSE of logic and DUE TO logic that the equivalence was proved. It is fairly non-trivial to take the explanations present in the thread and translate them into a formal logical proof, as this is at the heart of analysis. So I have to ask, what do you mean by logical proof?
Even that definition isn't great because all you (and others) have to do is say, 'it is logical. We have provided you with the proof to verify this.'
Ok, let me give you another restatement of this but now followed by a proof:
[attached image: restatement of the claim, followed by its proof]

Now, how does one logically prove that, e.g., 0.999... converges to 1? First, here's a definition of convergence:
[attached image: definition of convergence]
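(In case the attachment doesn't display: the standard definition is that a sequence (an) converges to L if for every ε > 0 there is an N such that |an - L| < ε for all n >= N.)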

And, perhaps even more importantly, a logical statement of what a limit (i.e., the thing that "something" like a function or sequence converges to) is. That is, a number L is the limit of a function f (similarly for a sequence) if
[attached image: epsilon-delta definition of a limit]
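(Again, in case the image doesn't display, the standard epsilon-delta statement: L is the limit of f(x) as x approaches a if for every ε > 0 there is a δ > 0 such that 0 < |x - a| < δ implies |f(x) - L| < ε.)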

(I included this part merely to show how strictly the requirement that definitions, proofs, propositions, etc., in mathematics must adhere to logic).
Here is a formal proof given in plain language rather than formal (symbolic) logic:
[attached image: proof, page 1]

[attached image: proof, page 2]
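(If the images don't load, the standard ε-N argument runs as follows; this is the usual proof, not necessarily Hubbard & Hubbard's exact wording. The n-th truncation of 0.999... is 1 - 10^-n. Given any ε > 0, choose N with 10^-N < ε. Then for every n >= N, |1 - (1 - 10^-n)| = 10^-n <= 10^-N < ε. So the sequence of truncations converges to 1, and 0.999..., being defined as the limit of that sequence, equals 1.)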


Even this proof is an attempt by the authors (Hubbard & Hubbard from Vector Calculus, Linear Algebra, and Differential Forms, 3rd Ed.) to render the logical proof into more intuitive language. In a course in formal logic, one learns early on how to prove/derive conclusions logically that have no meaning, because logical proofs are not concerned with soundness but validity (i.e., granting some set of premises are true, then necessarily some conclusion follows).
 

lewisnotmiller

Grand Hat
Staff member
Premium Member
Fair enough.


The problem is that even though it was believed that 0.999... and 1 were the same using informal arguments before the epsilon-delta definition of limits was rigorously formulated, it was BECAUSE of logic and DUE TO logic that the equivalence was proved. It is fairly non-trivial to take the explanations present in the thread and translate them into a formal logical proof, as this is at the heart of analysis. So I have to ask, what do you mean by logical proof?

Ok, let me give you another restatement of this but now followed by a proof:
[attached image: restatement of the claim, followed by its proof]

Now, how does one logically prove that, e.g., 0.999... converges to 1? First, here's a definition of convergence:
[attached image: definition of convergence]

And, perhaps even more importantly, a logical statement of what a limit (i.e., the thing that "something" like a function or sequence converges to) is. That is, a number L is the limit of a function f (similarly for a sequence) if
[attached image: epsilon-delta definition of a limit]

(I included this part merely to show how strictly the requirement that definitions, proofs, propositions, etc., in mathematics must adhere to logic).
Here is a formal proof given in plain language rather than formal (symbolic) logic:
[attached image: proof, page 1]

[attached image: proof, page 2]


Even this proof is an attempt by the authors (Hubbard & Hubbard from Vector Calculus, Linear Algebra, and Differential Forms, 3rd Ed.) to render the logical proof into more intuitive language. In a course in formal logic, one learns early on how to prove/derive conclusions logically that have no meaning, because logical proofs are not concerned with soundness but validity (i.e., granting some set of premises are true, then necessarily some conclusion follows).

Is your brain single?
 

Revoltingest

Pragmatic Libertarian
Premium Member
I stands for 'Intelligence'. I'm clearly disqualified.
And here I thought all along that it was about "Artificial Ignorance".

It's time to admit that there's no such thing as infinity.
More admissions.
- Pi is exactly equal to 3.14.
- The standard number line is wrong because it leaves out "bleen", the number between 4 & 5.
- The famous mathematician, Rudolf Lipschitz, is actually a made up entity....those wacky Germans!
(A lot of people fell for that name.)
 