
God's responsibility

Saint Frankenstein

Here for the ride
Premium Member
Assuming that the robot is capable of rational thought and empathy, it is its own fault since it's capable of knowing the difference between right and wrong and knowing the effect of harmful actions on others.
 

Mycroft

Ministry of Serendipity
Say some scientist creates a perfect robot with free will and decides to let it loose on the world. Then the robot decides to murder most of humanity because it became evil. Is that the robot's fault, or the fault of the scientist who let it loose on the world?

There doesn't seem to be any justification for God here: if he intentionally allowed evil, then it follows that it is God's fault. Even granting some gift of free will, we would put our creations on a leash or face the consequences.

The question is this:

Is it God's responsibility? Or is that merely an expectation of God's responsibility that you project?
 

9-10ths_Penguin

1/10 Subway Stalinist
Premium Member
Say some scientist creates a perfect robot with free will and decides to let it loose on the world. Then the robot decides to murder most of humanity because it became evil. Is that the robot's fault, or the fault of the scientist who let it loose on the world?
I have problems with perfection as a concept, but if what you mean by "perfect" includes anything like "not evil at all", then your hypothetical scenario is logically contradictory. If the robot was not evil at all, then giving it free will would not result in it becoming evil.

I think of free will as being like riding a bike without training wheels: with the wheels attached, you can't fall over, but once they're removed, falling isn't inevitable; that depends on how good your sense of balance is.

There doesn't seem to be any justification for God here: if he intentionally allowed evil, then it follows that it is God's fault. Even granting some gift of free will, we would put our creations on a leash or face the consequences.
This is an issue where the limited nature of humanity comes into play: we're only responsible for the consequences of our actions to the extent that those consequences were foreseeable, and even then, the positive consequences can outweigh the unavoidable negative consequences.

The consequences associated with having kids are good more often than not, so we're justified in having them even if their effects are sometimes negative.

God doesn't get to use a human-style justification, though:

- thanks to omniscience, all consequences of God's creation are foreseeable to God.
- thanks to omnipotence, none of the negative consequences of God's creation are unavoidable.

So even if God doesn't intend a particular consequence, he's still negligent, since any negative consequence was foreseeable and avoidable to God.
 

Thruve

Sheppard for the Die Hard
Uhm, didn't God initially decide to destroy us all in the story of Noah and the ark? So, though you're stating it's his fault, in theory at least, are you glad he didn't destroy us, or would you rather he had?
 

9-10ths_Penguin

1/10 Subway Stalinist
Premium Member
Uhm, didn't God initially decide to destroy us all in the story of Noah and the ark? So, though you're stating it's his fault, in theory at least, are you glad he didn't destroy us, or would you rather he had?

Was that at me?

I'm not a theist, and I certainly don't think that the flood happened, but in that story, the fact that humanity got to the point that God felt the need to kill nearly everyone on Earth was God's own fault.
 

Sir Doom

Cooler than most of you
Say some scientist creates a perfect robot with free will and decides to let it loose on the world. Then the robot decides to murder most of humanity because it became evil. Is that the robot's fault, or the fault of the scientist who let it loose on the world?

So the robot's motivation for killing is simply that 'it became evil'?

If there is culpability, it's probably on whatever made it 'become evil' (whatever that means).

There doesn't seem to be any justification for God here: if he intentionally allowed evil, then it follows that it is God's fault. Even granting some gift of free will, we would put our creations on a leash or face the consequences.

J.R.R. Tolkien created Sauron. Thus, J.R.R. Tolkien must be accountable for Boromir's death. He could have written it any way he wanted, but instead he robbed him of his sanity with an evil ring, and then, when redemption was just on the horizon, he shot him full of arrows. His lasting memory in his companions' minds was one of betrayal and dishonor.

He could have written it any way he wanted. He could have left Sauron out completely, or the ring, or the orcs, or arrows. It could have been a novel about building hammocks and drinking lemonade (it kind of is, actually haha). So, why did Tolkien make a universe where Boromir would have to die? Is it because Tolkien is evil?
 

idav

Being
Premium Member
Assuming that the robot is capable of rational thought and empathy, it is its own fault since it's capable of knowing the difference between right and wrong and knowing the effect of harmful actions on others.

Yes, they use the knowledge to inflict good or evil. Maybe they conclude we are a parasite.
 

Mycroft

Ministry of Serendipity
Yes, they use the knowledge to inflict good or evil. Maybe they conclude we are a parasite.


They don't care. Machines have no ambition. Hollywood tells you the machines are going to take over, but no machine ever says, "Gee, I'd like to run the lives of people." Machines don't have a gut reaction and they don't get hungry. They don't feel resentment.
 

idav

Being
Premium Member
The question is this:

Is it God's responsibility? Or is that merely an expectation of God's responsibility that you project?
It's a question about God's motives.

I have problems with perfection as a concept, but if what you mean by "perfect" includes anything like "not evil at all", then your hypothetical scenario is logically contradictory. If the robot was not evil at all, then giving it free will would not result in it becoming evil.

I think of free will as being like riding a bike without training wheels: with the wheels attached, you can't fall over, but once they're removed, falling isn't inevitable; that depends on how good your sense of balance is.
Perfect means to the designer's specifications. If he wanted altruism, then the creation couldn't be free to choose.

This is an issue where the limited nature of humanity comes into play: we're only responsible for the consequences of our actions to the extent that those consequences were foreseeable, and even then, the positive consequences can outweigh the unavoidable negative consequences.

The consequences associated with having kids are good more often than not, so we're justified in having them even if their effects are sometimes negative.

God doesn't get to use a human-style justification, though:

- thanks to omniscience, all consequences of God's creation are foreseeable to God.
- thanks to omnipotence, none of the negative consequences of God's creation are unavoidable.

So even if God doesn't intend a particular consequence, he's still negligent, since any negative consequence was foreseeable and avoidable to God.
Now, if I tried to create a good guy and they turned out evil, that wasn't the intention. I can see that some foresight would need to be part of it. However, omniscience can't be so strong that it hinders omnipotence. Knowing would actually mean knowing possible worlds, and God, with omnipotence, makes the world he wants. He could choose to look, but he would be looking at his choices as well as the alternatives, due to omnipotence.
So the robot's motivation for killing is simply that 'it became evil'?

If there is culpability, it's probably on whatever made it 'become evil' (whatever that means).



J.R.R. Tolkien created Sauron. Thus, J.R.R. Tolkien must be accountable for Boromir's death. He could have written it any way he wanted, but instead he robbed him of his sanity with an evil ring, and then, when redemption was just on the horizon, he shot him full of arrows. His lasting memory in his companions' minds was one of betrayal and dishonor.

He could have written it any way he wanted. He could have left Sauron out completely, or the ring, or the orcs, or arrows. It could have been a novel about building hammocks and drinking lemonade (it kind of is, actually haha). So, why did Tolkien make a universe where Boromir would have to die? Is it because Tolkien is evil?
They would rationally deduce their reasoning. Maybe it just wants power.

If God is sending all sorts of evils at us, it doesn't seem all that fair; then you're just messing with your creation. I can see why people might think God is malevolent.
 

idav

Being
Premium Member
They don't care. Machines have no ambition. Hollywood tells you the machines are going to take over, but no machine ever says, "Gee, I'd like to run the lives of people." Machines don't have a gut reaction and they don't get hungry. They don't feel resentment.

It would do what's in its best interest.
 

9-10ths_Penguin

1/10 Subway Stalinist
Premium Member
Perfect means to the designer's specifications.
So then this robot - including its form and function - is exactly as deliberately intended?

If so, then I think the answer to your question in the OP is obvious: since the robot is functioning exactly as designed, how it functions - i.e. its actions - is the fault of its designer.

If he wanted altruism, then the creation couldn't be free to choose.
How do you figure? Free will is the freedom to choose to act on our will; it isn't the freedom to choose our desires.

Now, if I tried to create a good guy and they turned out evil, that wasn't the intention. I can see that some foresight would need to be part of it. However, omniscience can't be so strong that it hinders omnipotence.
How could omniscience ever "hinder omnipotence"?
 

9-10ths_Penguin

1/10 Subway Stalinist
Premium Member
I don't think perfect means good.

Fair enough. As I mentioned, I was assuming your intended meaning. Personally, I reject the idea of perfection except where it's measured against an artificial standard (e.g. a perfect game of bowling).
 

idav

Being
Premium Member
How do you figure? Free will is the freedom to choose to act on our will; it isn't the freedom to choose our desires.
In a sense, will is desire.
How could omniscience ever "hinder omnipotence"?
If it knows what will happen, then it isn't free to do otherwise. Omniscience should be open to possibilities: if a then b; if a1 then b1; else c. Something like the toy sketch below.
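Here's that branching idea as a minimal sketch (purely illustrative; the choices "a", "a1" and outcomes "b", "b1", "c" are made-up placeholders from my if/then example, nothing more):

    # Toy model: omniscience as a map from every possible choice to its
    # outcome; omnipotence as the freedom to actualize any branch.
    # All names and outcomes are illustrative placeholders.
    possible_worlds = {
        "a": "b",    # if a then b
        "a1": "b1",  # if a1 then b1
    }

    def foresee(choice):
        """Know the outcome of any possible choice without fixing it."""
        return possible_worlds.get(choice, "c")  # else c

    def actualize(choice):
        """Freely pick a branch, already knowing where it leads."""
        return foresee(choice)

    # Survey every branch...
    for c in ["a", "a1", "something else"]:
        print(c, "->", foresee(c))

    # ...then freely choose which world to make actual.
    print(actualize("a1"))  # -> b1

On this picture, knowing all the branches doesn't remove the freedom to pick one; the knowledge is conditional ("if a then b"), not a fixed prediction of a single outcome.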
So. No evidence whatsoever. I'm glad we cleared that up.
It's a hypothetical for a sentience we would create. The sentience has to be programmed to some extent.
 