
If you believe in free will, respond to these two objections

idav

Being
Premium Member
"programmer, this is step debugger. step debugger, this is programmer.

I'll leave you two alone now, it seems you have a lot to catch up on..."
When a program is designed to think for itself, it wouldn't need the programmer to input every step, since the program is actually learning. You can't send a robot into the city and predict what it might learn.
 

not nom

Well-Known Member
When a program is designed to think for itself, it wouldn't need the programmer to input every step, since the program is actually learning.

that's not what a step debugger is. a step debugger lets you run the program step by step, while inspecting its state. you can watch it grab input, and how *exactly* it reacts to it and why.
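to make that concrete, here's a minimal sketch in python using the standard `sys.settrace` hook (the function and variable names are made up for illustration): every executed line of a function can be observed, together with its exact local state at that step -- which is essentially what a step debugger gives you interactively.

```python
import sys

def accumulate(values):
    total = 0
    for v in values:
        total += v
    return total

trace_log = []

def tracer(frame, event, arg):
    if event == "line":
        # record the line number and a snapshot of the locals at this step
        trace_log.append((frame.f_lineno, dict(frame.f_locals)))
    return tracer

sys.settrace(tracer)
result = accumulate([1, 2, 3])
sys.settrace(None)

print(result)  # 6 -- and trace_log now holds every step that produced it
```

nothing in that run is hidden: each addition to `total` shows up in the log with the state that caused it.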

my point was that if programmers don't know what their programs are doing -- and unless those programs interact a lot with user input or other programs, or have been written by large groups, etc. -- that's just programmers losing track, not non-deterministic programs.

You can't send a robot into the city and predict what it might learn.

that's because you don't have complete knowledge of the city, which doesn't really mean the robot is non-deterministic. "not predictable by our means" is not the same as "not predictable/non-deterministic".
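a sketch of that distinction (all names made up): a "robot" driven by a seeded random walk looks arbitrary from the outside, yet identical initial conditions -- the seed standing in for complete knowledge of the city -- reproduce its path exactly, step for step.

```python
import random

def robot_walk(seed, steps=10):
    # the seed plays the role of complete knowledge of the "city"
    rng = random.Random(seed)
    position = 0
    path = []
    for _ in range(steps):
        position += rng.choice([-1, 1])  # looks unpredictable from outside
        path.append(position)
    return path

# two runs with the same initial conditions agree completely:
print(robot_walk(42) == robot_walk(42))  # True
```

the walk is only "unpredictable" to an observer who lacks the seed; the process itself is fully deterministic.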
 

idav

Being
Premium Member
But it will still be entirely possible to follow the program's train of thought.
Sure. I guess it is more accurate to say that a machine will learn to do other than what it was originally programmed to do. Experiences would essentially modify the original program.
 

not nom

Well-Known Member
Sure. I guess it is more accurate to say that a machine will learn to do other than what it was originally programmed to do. Experiences would essentially modify the original program.

if it was programmed to be self-modifying, yes. but that means it does exactly what it was programmed to do, don't you see?
 

Nakosis

Non-Binary Physicalist
Premium Member
so? that doesn't mean identical effects don't produce the same results -- just that that never happens.

No, but there will always be slight variances. Some are just not significant enough to alter the output.

However, if there was an alteration, one can usually trace it to a cause that was significant enough to alter the output.
 

not nom

Well-Known Member
when we talk about hard determinism, as you can at least have in closed, artificial systems, then the entire outcome is already present in the initial situation. not just as a potential, but as something that *will* come to pass.
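a toy illustration of such a closed system (names made up): a one-dimensional cellular automaton. the whole future history is fixed by the initial row plus the rule, so replaying from the same start can never surprise you.

```python
def step(cells, rule=90):
    # each cell's next value is looked up from the rule number,
    # indexed by its (left, self, right) neighborhood, with wrap-around
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def evolve(initial, generations):
    history = [initial]
    for _ in range(generations):
        history.append(step(history[-1]))
    return history

start = [0, 0, 0, 1, 0, 0, 0]
# the entire history is already implied by `start` and the rule:
print(evolve(start, 3) == evolve(start, 3))  # True
```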

sure, then there's quantum mechanics. not that I understand them, but I get there's something there ^^

but why assume quantum mechanics means individuals have free will? I don't even follow why it would mean the universe has free will -- being subject to several, even infinite random outcomes instead of one fixed outcome doesn't equal having a say -- but I *surely* don't understand how it doesn't utterly smash the notion of individual agents (which you have to accept first before you can ponder their free will): you cannot talk about non-local effects of the brain and ignore non-local effects of the universe, that is just silly :p
 

PolyHedral

Superabacus Mystic
watson wasn't programmed to play jeopardy?

uhuh. you're just making stuff up now, which is why I asked for your point. had a gut feeling ^^
It was programmed to learn to play Jeopardy. They couldn't possibly program it to play Jeopardy and win; they didn't know how to do it themselves. It had to figure it out on its own, pretty much identically to a human.
 

Nakosis

Non-Binary Physicalist
Premium Member
It was programmed to learn to play Jeopardy. They couldn't possibly program it to play Jeopardy and win; they didn't know how to do it themselves. It had to figure it out on its own, pretty much identically to a human.

Could it learn to play chess?
Without some person altering its programming?
 

not nom

Well-Known Member
It was programmed to learn to play Jeopardy. They couldn't possibly program it to play Jeopardy and win; they didn't know how to do it themselves. It had to figure it out on its own, pretty much identically to a human.

aha. so when I said it did exactly what it [well, not watson, any program really] was programmed to do, I was exactly right -- it learned how to play jeopardy, as it was programmed to do, and yes it *was* also programmed to then use that knowledge to actually play jeopardy, your weak sophistry notwithstanding -- and you still don't have a point. I know that, I pointed it out, now let's wait until that trickles through to you.
 

PolyHedral

Superabacus Mystic
Could it learn to play chess?
Without some person altering its programming?
Playing Chess is not part of the specification; it is not a generalized problem solving AI, unlike a human. You wouldn't want IBM to make one of those, incidentally, because they'd be a terrifying prospect.
 

idav

Being
Premium Member
that's because you don't have complete knowledge of the city, and doesn't really mean that's a non-deterministic robot. "not predictable by our means" is not the same as "not predictable/non-deterministic".
Humans are less predictable than other animals. Why do you think that is?
 

Nakosis

Non-Binary Physicalist
Premium Member
Humans are less predictable than other animals. Why do you think that is?

Because humans can imagine the world being other than what it is and act according to the world as they've imagined it to be.

So not only do we have memories of actual experience to deal with in trying to predict human behavior, but also memories of imagined experiences, which we don't have full access to.

Hard to make accurate predictions when one doesn't have access to all the information that caused the behavior.
 

idav

Being
Premium Member
Because humans can imagine the world being other than what it is and act according to the world as they've imagined it to be.

So not only do we have memories of actual experience to deal with in trying to predict human behavior, but also memories of imagined experiences, which we don't have full access to.

Hard to make accurate predictions when one doesn't have access to all the information that caused the behavior.
One of the issues is that we can just make stuff up. What stops anyone from rejecting any of the influences on us in favor of the stuff we make up? That's part of what makes us illogical and therefore unpredictable.
 