
How do you know you are not "A.I."?

Revoltingest

Pragmatic Libertarian
Premium Member
Still not seeing it as verifiable. Still haven't seen that shown in this thread. I'm not able to debunk it, as I'm claiming it is not (inherently) verifiable. It needs an (inherent) limit put on infinite decimals to (strongly) suggest it must be equal to 1, rather than (forever) unequal to 1.
There is no limit to the number of 9s to the right of the decimal point.
That's just how infinity is defined.
That's why 0.999.... = 1
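One standard way to make that precise (a sketch, assuming the usual definition of an infinite decimal as the limit of its partial sums, not something spelled out in the post itself):

```latex
0.999\ldots
  = \sum_{k=1}^{\infty} \frac{9}{10^{k}}
  = \lim_{n\to\infty}\Bigl(1 - \frac{1}{10^{n}}\Bigr)
  = 1,
\qquad \text{since } \frac{1}{10^{n}} \to 0 \text{ as } n \to \infty.
```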
 

Acim

Revelation all the time
There is no limit to the number of 9s to the right of the decimal point.
That's just how infinity is defined.
That's why 0.999.... = 1

I see the 3rd statement as a non-sequitur to the 1st and 2nd statements.

For me, it is infinitely unequal. If I were in some type of class where regurgitating data was needed to get a passing grade, I'd go with the idea that it equals 1, and just move on from the class as if it were really a trivial consideration. There's part of me that doesn't see it as trivial, but it goes well beyond math, for me, at that point.
 

Revoltingest

Pragmatic Libertarian
Premium Member
I see the 3rd statement as a non-sequitur to the 1st and 2nd statements.
It follows from my earlier post.
For me, it is infinitely unequal.
Okey dokey.
If you know any mathematicians, ask'm what they think.
If I were in some type of class where regurgitating data was needed to get a passing grade, I'd go with the idea that it equals 1, and just move on from the class as if it were really a trivial consideration. There's part of me that doesn't see it as trivial, but it goes well beyond math, for me, at that point.
Here is more detail....
https://en.wikipedia.org/wiki/0.999...
 

Ouroboros

Coincidentia oppositorum
 

LegionOnomaMoi

Veteran Member
Premium Member
Some cool basic algebra.....
...assuming the construction of the reals, which is anything but trivial or basic.

Again, how is that verifiable?
Let's call the number 0.999... "x" and call "1"...well, "1". Let's say I maintain that the two are not the same, and Revoltingest rightly says I don't know what I am talking about because they are. But, being the gracious debater he is, instead of just writing me off he asks me to show that the two numbers aren't equal. We both start by imagining the positions of these numbers on the real number line. For me, they are very close but "obviously" do not occupy the same place, while for Revoltingest, whose mathematical knowledge surpasses my own, the two numbers are really one number located at the same place on the number line. As I think there are two different numbers here, I maintain that there must be some "distance" between 0.999... and 1 on the number line. Rather than call me a fool, Revoltingest asks that I tell him how far apart the number 0.999... is from the number 1. Clearly, I'm about to win this debate. All I have to do is provide a tiny, tiny, tiny, tiny difference: if there is any difference between 0.999... and 1, then they are two different numbers.
That's when I slam into a brick wall. No matter how small an amount I choose, e.g., 0.000000000000000001, 0.0000000000000000000000000000000001, 0.00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001, etc., it turns out that the difference between 0.999... & 1 is smaller than this amount. In fact, there exists no number small enough that 1 - 0.999... is equal to that number. The distance between 0.999... and 1 is smaller than every possible positive number. That's because 0.999... = 1.
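Put slightly more formally (a sketch of the same argument; the symbol d is just a label for the hypothetical gap, not anything named in the post): the difference d = 1 - 0.999... is a non-negative real number smaller than 10^(-n) for every n, and the only such real number is 0.

```latex
d = 1 - 0.999\ldots \ge 0,
\qquad
d \le 1 - \underbrace{0.9\ldots 9}_{n\ \text{nines}} = \frac{1}{10^{n}}
\quad \text{for every } n
\;\;\Longrightarrow\;\; d = 0
\;\;\Longrightarrow\;\; 0.999\ldots = 1.
```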
 

Midnight Rain

Well-Known Member
Pretty much... just...... How do you know you are not "A.I."? :shrug:
Because I am a biological organism and every piece of evidence available to me supports this. Naturally occurring biological organisms wouldn't be A.I. AI is by definition a form of computerized computation that mimics some functions of biological mental computations. I believe there will come a time when we may actually create AI so well that it will be indistinguishable from regular intelligence.
 

Midnight Rain

Well-Known Member
...assuming the construction of the reals, which is anything but trivial or basic.


Let's call the number 0.999... "x" and call "1"...well, "1". Let's say I maintain that the two are not the same, and Revoltingest rightly says I don't know what I am talking about because they are. But, being the gracious debater he is, instead of just writing me off he asks me to show that the two numbers aren't equal. We both start by imagining the positions of these numbers on the real number line. For me, they are very close but "obviously" do not occupy the same place, while for Revoltingest, whose mathematical knowledge surpasses my own, the two numbers are really one number located at the same place on the number line. As I think there are two different numbers here, I maintain that there must be some "distance" between 0.999... and 1 on the number line. Rather than call me a fool, Revoltingest asks that I tell him how far apart the number 0.999... is from the number 1. Clearly, I'm about to win this debate. All I have to do is provide a tiny, tiny, tiny, tiny difference: if there is any difference between 0.999... and 1, then they are two different numbers.
That's when I slam into a brick wall. No matter how small an amount I choose, e.g., 0.000000000000000001, 0.0000000000000000000000000000000001, 0.00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001, etc., it turns out that the difference between 0.999... & 1 is smaller than this amount. In fact, there exists no number small enough that 1 - 0.999... is equal to that number. The distance between 0.999... and 1 is smaller than every possible positive number. That's because 0.999... = 1.
I like your explanation. Another one that might be easier for some people to grasp is 1/3.
The fraction 1/3 is .33333... in decimal. Multiply by 3 and you get .99999999... in decimal, but as a fraction you get 1. They are still equal.
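For anyone who wants to check the fraction version with exact arithmetic, here is a minimal sketch in Python (my own illustration, not from the thread; the variable names are mine). The standard-library fractions module keeps 1/3 as an exact rational, so no rounding is involved:

```python
from fractions import Fraction

one_third = Fraction(1, 3)      # the exact rational number 1/3
print(one_third * 3)            # prints 1 (exactly, with no rounding)

# Finite truncations 0.3, 0.33333, ... only approximate 1/3;
# the leftover gap shrinks toward 0 as more 3s are appended.
for n in (1, 5, 10):
    truncation = Fraction(10**n // 3, 10**n)   # 0.3...3 with n threes
    print(n, float(one_third - truncation))
```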
 

fantome profane

Anti-Woke = Anti-Justice
Premium Member
Still not seeing it as verifiable. Still haven't seen that shown in this thread. I'm not able to debunk it, as I'm claiming it is not (inherently) verifiable. It needs an (inherent) limit put on infinite decimals to (strongly) suggest it must be equal to 1, rather than (forever) unequal to 1.
How do you feel about the statement that:

one = 1.

Is that fair? "one" and "1" are just different ways of denoting the same concept. 1 = one. Saying 0.999... is the same thing. It is a way of denoting the same concept: one, 1, 0.999...

How do you express the concept of one third? You could express that as 1/3. Or you could express it as 0.333.... Same thing.

One third plus one third plus one third equals one.

1/3 + 1/3 + 1/3 = 1

0.333.... + 0.333.... + 0.333... = 0.999.....

one = 1 = 0.999.....
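Another standard sketch, one that avoids fractions entirely (an illustration added here, not something posted above), is the multiply-by-ten manipulation:

```latex
x = 0.999\ldots
\;\Longrightarrow\; 10x = 9.999\ldots
\;\Longrightarrow\; 10x - x = 9.999\ldots - 0.999\ldots = 9
\;\Longrightarrow\; 9x = 9
\;\Longrightarrow\; x = 1.
```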
 

ThePainefulTruth

Romantic-Cynic
Because I am a biological organism and every piece of evidence available to me supports this. Naturally occurring biological organisms wouldn't be A.I.

My deistic theology theorizes that God created the universe precisely to evolve creatures like us with full self-awareness and thus free will. Thus we would be naturally occurring. Whether to call ourselves A.I. or not is essentially a moot point and unknowable in this life since there's no evidence for or against the existence of God.

AI is by definition a form of computerized computation that mimics some functions of biological mental computations.

So then, by definition, would a cyborg with full self-awareness be an AI or not? And if "mimics" is in the definition, then the definition needs to be amended. It may mimic, but it doesn't necessarily.

I believe there will come a time when we may actually create AI so well that it will be indistinguishable from regular intelligence.

Me too, and probably sooner than we think given the dawn of quantum computers, in this universe which is essentially a giant quantum computer. And if that AI is indeed fully self-aware and self-replicating, would the source of its creation matter more to them than ours to us? In fact, one could say that whatever created us (God or happenstance), created them.
 

Ouroboros

Coincidentia oppositorum
...assuming the construction of the reals, which is anything but trivial or basic.


Let's call the number 0.999... "x" and call "1"...well, "1". Let's say I maintain that the two are not the same, and Revoltingest rightly says I don't know what I am talking about because they are. But, being the gracious debater he is, instead of just writing me off he asks me to show that the two numbers aren't equal. We both start by imagining the positions of these numbers on the real number line. For me, they are very close but "obviously" do not occupy the same place, while for Revoltingest, whose mathematical knowledge surpasses my own, the two numbers are really one number located at the same place on the number line. As I think there are two different numbers here, I maintain that there must be some "distance" between 0.999... and 1 on the number line. Rather than call me a fool, Revoltingest asks that I tell him how far apart the number 0.999... is from the number 1. Clearly, I'm about to win this debate. All I have to do is provide a tiny, tiny, tiny, tiny difference: if there is any difference between 0.999... and 1, then they are two different numbers.
That's when I slam into a brick wall. No matter how small an amount I choose, e.g., 0.000000000000000001, 0.0000000000000000000000000000000001, 0.00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001, etc., it turns out that the difference between 0.999... & 1 is smaller than this amount. In fact, there exists no number small enough that 1 - 0.999... is equal to that number. The distance between 0.999... and 1 is smaller than every possible positive number. That's because 0.999... = 1.
It's a little like infinity + 1.
The delta between 1 and .999...(inf) would be 0.000...(inf)...1, the single 1 beyond the infinite sequence.
 

Acim

Revelation all the time
...assuming the construction of the reals, which is anything but trivial or basic.


Let's call the number 0.999... "x" and call "1"...well, "1". Let's say I maintain that the two are not the same, and Revoltingest rightly says I don't know what I am talking about because they are. But, being the gracious debater he is, instead of just writing me off he asks me to show that the two numbers aren't equal. We both start by imagining the positions of these numbers on the real number line. For me, they are very close but "obviously" do not occupy the same place, while for Revoltingest, whose mathematical knowledge surpasses my own, the two numbers are really one number located at the same place on the number line. As I think there are two different numbers here, I maintain that there must be some "distance" between 0.999... and 1 on the number line. Rather than call me a fool, Revoltingest asks that I tell him how far apart the number 0.999... is from the number 1. Clearly, I'm about to win this debate. All I have to do is provide a tiny, tiny, tiny, tiny difference: if there is any difference between 0.999... and 1, then they are two different numbers.
That's when I slam into a brick wall. No matter how small an amount I choose, e.g., 0.000000000000000001, 0.0000000000000000000000000000000001, 0.00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001, etc., it turns out that the difference between 0.999... & 1 is smaller than this amount. In fact, there exists no number small enough that 1 - 0.999... is equal to that number. The distance between 0.999... and 1 is smaller than every possible positive number. That's because 0.999... = 1.

My (made-up) shorthand for representing the difference between the two would be 0.000...1. I honestly feel that covers the difference between 1.0 and .999...
 

Acim

Revelation all the time
I like your explanation. Another one that might be easier for some people to grasp is 1/3.
The fraction 1/3 is .33333... in decimal. Multiply by 3 and you get .99999999... in decimal, but as a fraction you get 1. They are still equal.

I see .333... as infinitely unequal to 1/3.

Therefore 3 x .333.... plausibly does equal .999....

.999... is infinitely unequal to 1.
 

Acim

Revelation all the time
It's a little like infinity + 1.
The delta between 1 and .999...(inf) would be 0.000...(inf)...1, the single 1 beyond the infinite sequence.

I honestly had not seen this written before my post, nor have I ever seen another person write it out, though I thought of it as a logical possibility between the two a few years ago. Not saying I can comprehend 0.000...1, but I am also saying that .999... equaling 1 is not verifiable. Saying it is continues to strike me as laziness, and/or adding a limit for the sake of convenience. The 'proofs' I've seen so far are not convincing and appear to me the way I imagine 0.000...1 appears to a seasoned mathematician. They also look to me like:

Blue + elephants = 5
Therefore green = 7
 

Ouroboros

Coincidentia oppositorum
Here's an interesting take on it: if instead of base 10 we used base 3, then 0t0.1 would equal exactly a third, without an infinite sequence.
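As an aside (not part of the post above, just an illustration): in base 3 the fraction 1/3 does terminate, but the same repeating-digit question simply reappears with 2s in place of 9s:

```latex
\frac{1}{3} = 0.1_{3},
\qquad
0.222\ldots_{3} = \sum_{k=1}^{\infty} \frac{2}{3^{k}} = 1.
```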
 

Midnight Rain

Well-Known Member
My deistic theology theorizes that God created the universe precisely to evolve creatures like us with full self-awareness and thus free will. Thus we would be naturally occurring. Whether to call ourselves A.I. or not is essentially a moot point and unknowable in this life since there's no evidence for or against the existence of God.
I don't understand what you are asking if you are asking anything here.


So then, by definition, would a cyborg with full self-awareness be an AI or not? And if "mimics" is in the definition, then the definition needs to be amended. It may mimic, but it doesn't necessarily.
You mean android? A cyborg by definition is still part biological. But a fully mechanical or cybernetic intelligence designed and created by humans would be AI no matter how advanced. In fact, there is room to go beyond us in advancement, but it would still be AI.
 

Midnight Rain

Well-Known Member
I see .333... as infinitely unequal to 1/3.

Therefore 3 x .333.... plausibly does equal .999....

.999... is infinitely unequal to 1.
The problem with infinitely repeating decimals like that is that infinitely small and non-existent are the same thing mathematically. What it really boils down to is the fact that the decimal system, while easiest for calculating more advanced figures and mathematics, is an inherently flawed base-10 system that lacks even splits in all directions. For example, if I gave you 1/3rd of a pizza I would be giving you .333333... of a pizza. They would be the same. It's just that one is written strangely, since it doesn't appear to be "possible" to split it evenly into 1/3rd. However, in the real world it is simple to do so. So any issue with .99999... being 1 has to do with the original decimalization of real or imagined quantities. .9999... is infinitely unequal to 1 in the same way 1 is infinitely unequal to 1.
 

Acim

Revelation all the time
.9999... is infinitely unequal to 1 in the same way 1 is infinitely unequal to 1.

I do not see how the latter is plausible. Appears like a non sequitur.

Seems like one assertion is just an assumption without a way to verify. The verifications, thus far that I've seen (here and elsewhere) all appear like assumptions. 1/3 being equal to .333... equals assumption. Multiply .333... by 3, and it presumably equals .999... which then equals 1, ya know, cause we assume that .333... does exactly equal 1/3.

To me, it is possibly as simple as realizing that the decimals are not (ever) going to be exactly equal. Keep dividing 1 by 3, and you'll keep adding another 3 after the string of 3's, without end. May as well just say .3, rather than .333..., is equal to 1/3. If needing more than one placement after the first .3, then there is really no actual limit to the placements of 3's. It goes on forever, or in essence is indeterminate. The three dots that substitute for 'infinity' seem like an incredibly poor representation of the assumption behind that which is said to equal 1/3. IMO, it is equal to saying .3 equals 1/3. Might not be precisely accurate, but then again, neither is .333... Whereas, perhaps if there were a verifiable representation of .333 (followed by an infinite amount of 3's), however that may appear, it would be 'precise.' I have yet to see that accurate representation.

If 1 doesn't in fact equal 1, and is instead infinitely unequal, that would seem like monumental news in the world of mathematics. I'm very okay with the idea that it might be the case. But for the life of me, I can't understand how that is seen as (infinitely) unequal.
 

Midnight Rain

Well-Known Member
I do not see how the latter is plausible. Appears like a non sequitur.

Seems like one assertion is just an assumption without a way to verify. The verifications, thus far that I've seen (here and elsewhere) all appear like assumptions. 1/3 being equal to .333... equals assumption. Multiply .333... by 3, and it presumably equals .999... which then equals 1, ya know, cause we assume that .333... does exactly equal 1/3.
Great thing about math is that things can be proven.
1/3rd is 1 divided by 3. 1 divided by three is .333... And .333... multiplied by 3 is in fact .999999...
Do you contest either of those, and if so why?
To me, it is possibly as simple as realizing that the decimals are not (ever) going to be exactly equal. Keep dividing 1 by 3, and you'll keep adding another 3 after the string of 3's, without end. May as well just say .3, rather than .333..., is equal to 1/3. If needing more than one placement after the first .3, then there is really no actual limit to the placements of 3's. It goes on forever, or in essence is indeterminate. The three dots that substitute for 'infinity' seem like an incredibly poor representation of the assumption behind that which is said to equal 1/3. IMO, it is equal to saying .3 equals 1/3. Might not be precisely accurate, but then again, neither is .333... Whereas, perhaps if there were a verifiable representation of .333 (followed by an infinite amount of 3's), however that may appear, it would be 'precise.' I have yet to see that accurate representation.
.3333... is precisely accurate. Eventually we have to round it to use the number, but not always. For example, if .33... were treated as "X" in an equation, we would have it fixed and workable without having to round the answer.
If 1 doesn't in fact equal 1, and is instead infinitely unequal, that would seem like monumental news in the world of mathematics. I'm very okay with the idea that it might be the case. But for the life of me, I can't understand how that is seen as (infinitely) unequal.
You misunderstand. It's not a matter of having a difference that is infinitely small. I am saying that they are the same. There is no more difference between 1 and 1 than between 1 and .9999... Or 1 and I. Or 1 and one. Or 1 and uno.

Let's do this in an equation. 1 divided by 3 = X. If we solve here with normal arithmetic we get .333...

But it's totally legal in algebra to multiply both sides by 3. So we get 1 = 3X. We just proved through arithmetic that X = .333..., so that means .3333... multiplied by 3 is 1.
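Written out as a single chain (a restatement of the steps above, assuming the standard identification of 1/3 with the infinite decimal 0.333...):

```latex
\frac{1}{3} = 0.333\ldots
\;\Longrightarrow\;
3 \times \frac{1}{3} = 3 \times 0.333\ldots
\;\Longrightarrow\;
1 = 0.999\ldots
```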
 