
God's responsibility

idav

Being
Premium Member
Say some scientist creates a perfect robot with freewill and decides to let it loose on the world. Then the robot decides to murder most of humanity because it became evil. Is that the robot's fault, or the fault of the scientist who let it loose on the world?

There doesn't seem to be a justification for God with this: if he intentionally allowed evil, then it follows that it is God's fault. Even with some gift of free will, we would put our creations on a leash or face the consequences.
 

Kilgore Trout

Misanthropic Humanist
Human beings don't have a very good history of inventing gods and religions which are logically consistent. Then again, the finer points of logic are lost on most members of the species.
 

Amechania

Daimona of the Helpless
It would be the scientist at fault, because he programmed the robot to murder all of humanity and it chose to only murder most. God would be exonerated at any rate, because he was out of the office that whole week with the "flu."
 

Quintessence

Consults with Trees
Staff member
Premium Member
Setting aside for a moment my disagreements with the language of blame and fault, this seems pretty straightforward to me.

This situation isn't any different from parents and their child. If a man and woman have a child, and that child commits a crime, the parents are not held responsible for the child's crime. If their child is still a minor, they are likely to be involved in (or responsible for) disciplinary action against their child, but they are not considered responsible for the crime itself. They aren't.
 

idav

Being
Premium Member
The parent aspect is similar but not the same as purposely creating an entity that you unleash on the world. As parents we don't have the ability to put a leash on the children, figuratively speaking of course.
 

Quintessence

Consults with Trees
Staff member
Premium Member
The parent aspect is similar but not the same as purposely creating an entity that you unleash on the world.

It's not? Are you saying all pregnancies happen by accident? That nobody really wants to have children?

As parents we don't have the ability to put a leash on the children, figuratively speaking of course.

We don't? We do until they turn eighteen. And after that, leashes are put on them by non-parental sources ranging from social norms to state laws.
 

idav

Being
Premium Member
It's not? Are you saying all pregnancies happen by accident? That nobody really wants to have children?
The creation aspect of it is a bit out of our hands with children.
We don't? We do until they turn eighteen. And after that, leashes are put on them by non-parental sources ranging from social norms to state laws.
What we don't have control over is a person's freedom of choice. We can lock them up, but choices are theirs. We can't control a child the way we would be able to control a robot of our own making. The maker of humans has the control, not the people producing via procreation.

I think the robot analogy is a lot more accurate. It would be completely our choice to bring into the world a robot whose devastating consequences are completely within our means to control.
 

ametist

Active Member
If the robot has all the programs needed to tell harm from good and all the means to avoid harm and do good, and also has a program to recognize that murdering most of humanity is evil, then it is the robot's fault, even though a human is not a robot.
 

Quintessence

Consults with Trees
Staff member
Premium Member
The creation aspect of it is a bit out of our hands with children.

You said that a scientist creates "a perfect robot with freewill." If an entity has free will, is it not "a bit out of our hands"?

What we don't have control over is a person's freedom of choice. We can lock them up, but choices are theirs. We can't control a child the way we would be able to control a robot of our own making.

Okay, now I'm getting super confused about your OP. Does this robot have free will or doesn't it? If it has free will, it is analogous to a child that you can't control. What exactly is the scenario you mean to present here?

I think the robot analogy is a lot more accurate. It would be completely our choice to bring into the world a robot whose devastating consequences are completely within our means to control.

Is it? If the robot has free will, how is it controllable? If it really has free will, it's not controllable. Wasn't that the entire point of your positing "a perfect robot with freewill"? Again, I'm getting super confused here.

But some of this could be because I reject the idea of free will completely, so I have great difficulty supposing it actually exists. XD
 

idav

Being
Premium Member
If the robot has all the programs needed to tell harm from good and all the means to avoid harm and do good, and also has a program to recognize that murdering most of humanity is evil, then it is the robot's fault, even though a human is not a robot.

You said that a scientist creates "a perfect robot with freewill." If an entity has free will, is it not "a bit out of our hands"?



Okay, now I'm getting super confused about your OP. Does this robot have free will or doesn't it? If it has free will, it is analogous to a child that you can't control. What exactly is the scenario you mean to present here?



Is it? If the robot has free will, how is it controllable? If it really has free will, it's not controllable. Wasn't that the entire point of your positing "a perfect robot with freewill"? Again, I'm getting super confused here.

But some of this could be because I reject the idea of free will completely, so I have great difficulty supposing it actually exists. XD
As the programmer of a robot, we make that choice. We wouldn't know whether said robot would be good or evil, since we wouldn't program such a thing. It is, however, our choice to unleash a powerful entity on a roll of the dice as to what it would do. We do still have control, though: we could program it to never kill humans, which would defeat free will. We could also simply design a kill switch, since we are the maker, designer, and programmer. That is unlike children; we didn't design humans.
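
A minimal sketch (purely hypothetical, not anything from the thread; all names are made up for illustration) of the two kinds of control described above: a hard-coded constraint that removes the harmful choice before it is ever made, versus a kill switch that leaves the choice intact but lets the maker override it afterwards.

```python
# Hypothetical sketch: two different kinds of "control" a maker could retain.

class Robot:
    def __init__(self, hard_constraint=False):
        self.hard_constraint = hard_constraint   # programmed to never choose harm
        self.killed = False                      # state of the maker's kill switch

    def choose_action(self, options):
        """The robot picks freely from its options -- unless its maker intervened."""
        if self.killed:
            return None                          # kill switch: choice overridden from outside
        if self.hard_constraint:
            # the harmful option is removed in advance, so there is no "free" choice to make
            options = [o for o in options if o != "harm humans"]
        return options[0] if options else None   # stand-in for a genuinely free choice

    def kill_switch(self):
        self.killed = True                       # the maker can always pull the plug


# A robot built with the hard constraint never even sees "harm humans" as an option;
# a robot built without it chooses freely until the maker flips the switch after the fact.
robot = Robot(hard_constraint=False)
print(robot.choose_action(["harm humans", "help humans"]))  # free choice
robot.kill_switch()
print(robot.choose_action(["harm humans", "help humans"]))  # None: overridden
```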
 

ametist

Active Member
But we aren't given any info on the purpose of the robot's creation, are we?
You are trying to use the negative conclusion as the determiner of the reason. The thing is, when free will is involved, that can't be the case for sure.
 

idav

Being
Premium Member
But we aren't given any info on the purpose of the robot's creation, are we?
You are trying to use the negative conclusion as the determiner of the reason. The thing is, when free will is involved, that can't be the case for sure.

The object could be to create sentience, so free will would be a sort of must. The robot could turn good or evil of its own accord, and the scientist could get a medal for creating a robot that saves the world or be held responsible for its mass genocide of humans. Without creating parameters to keep the robot in check, it is the fault of the designer, whatever the creation may do. We shouldn't be surprised when the creation does as it was designed to do, with freedom to choose and be independent.
 

ametist

Active Member
You assume that a robot can only be created to do good on earth; good is your point of attention in this assumption. Creating a robot with freewill, on the other hand, might indicate that the robot itself is the point of attention, not particularly the goodness created on earth, though that will surely be an indication of how successful the robot's practice was. In the second case, goodness chosen on earth by the robot is an indication, not the purpose, of the creation.
 

idav

Being
Premium Member
I said the creation could do either, and therein lies the danger. We could choose to let loose something with the power to do great harm or great good, but would we really want to take such a risk?
 

ametist

Active Member
We wouldn't, because we are humans and we think our physical existence is all we've got. Thus we simply want all robots to serve our existence and our good.
 

idav

Being
Premium Member
We wouldn't, because we are humans and we think our physical existence is all we've got. Thus we simply want all robots to serve our existence and our good.

Yeah, that sounds a lot like what the God of the Bible wants. Yet we have to be able to choose to believe.
 

ametist

Active Member
Can we perceive anything objectively when it is a matter of spirituality? You don't need to be an egg to understand what an omelette is, but that is only true for earthly matters; it never works for spirituality. No matter what source you study from, your god will be coloured by your perception of self. We have to look real deep and know ourselves to perceive our god correctly. We ought to be the egg to understand the omelette, so to speak.
 

fantome profane

Anti-Woke = Anti-Justice
Premium Member
Say some scientist creates a perfect robot with freewill and decides to let it loose on the world. Then the robot decides to murder most of humanity because it became evil. Is that the robot's fault, or the fault of the scientist who let it loose on the world?

There doesn't seem to be a justification for God with this: if he intentionally allowed evil, then it follows that it is God's fault. Even with some gift of free will, we would put our creations on a leash or face the consequences.
The deciding factor in this for me is: did the scientist foresee the outcome, or should we reasonably have expected the scientist to be able to foresee the outcome? This is often used in courts of law to determine both civil and criminal responsibility.

And to relate this to the "God" concept we have to ask could "God" foresee the outcome?


So should humans be bound by Asimov's three laws of robotics? Should you be bound by these laws? Are you bound by them right now?


https://www.youtube.com/watch?v=BKkEI7q5tug
 

bobhikes

Nondetermined
Premium Member
Say some scientist creates a perfect robot with freewill and decides to let it loose on the world. Then the robot decides to murder most of humanity because it became evil. Is that the robot's fault, or the fault of the scientist who let it loose on the world?

There doesn't seem to be a justification for God with this: if he intentionally allowed evil, then it follows that it is God's fault. Even with some gift of free will, we would put our creations on a leash or face the consequences.

The problem I have is that God created everything, not just a robot. God also supposedly has an end plan. Life is more like a movie where, in the end, everyone moves on to their new future. All the death, hardship, and successes during the movie weren't real, even though we saw them and possibly emoted with them.
 

idav

Being
Premium Member
fantôme profane said:
The deciding factor in this for me is: did the scientist foresee the outcome, or should we reasonably have expected the scientist to be able to foresee the outcome? This is often used in courts of law to determine both civil and criminal responsibility.

And to relate this to the "God" concept we have to ask could "God" foresee the outcome?


So should humans be bound by Asimov's three laws of robotics? Should you be bound by these laws? Are you bound by them right now?


https://www.youtube.com/watch?v=BKkEI7q5tug

I don't think the scientist would even know whether or not a weapon is being built, but there is knowledge that the potential is there.

I like those laws. We are bound to them by instinct: the instinct to survive, and even the instinct to save someone at the expense of your own life. Not sure how hard-coded that is, though; we seem to be able to deviate.
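
A rough sketch (hypothetical, not from the thread or from Asimov's fiction itself) of what "hard-coding" the three laws might look like if they were strictly prioritized checks; as noted above, real minds seem able to deviate from anything like this.

```python
# Hypothetical sketch: Asimov's three laws as strictly prioritized checks.
# An action is only allowed if no higher-priority law forbids it.

def allowed(action):
    """Return True if the proposed action passes the three laws, in priority order."""
    # First Law: do not injure a human, or allow one to come to harm through inaction.
    if action.get("harms_human") or action.get("lets_human_come_to_harm"):
        return False
    # Second Law: obey human orders, except where that would conflict with the First Law.
    if action.get("disobeys_order"):
        return False
    # Third Law: protect your own existence, unless that conflicts with the first two laws.
    if action.get("self_destructive") and not action.get("required_by_order_or_rescue"):
        return False
    return True


print(allowed({"harms_human": True}))                  # False: the First Law always wins
print(allowed({"self_destructive": True,
               "required_by_order_or_rescue": True}))  # True: overridden by a higher law
```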
 