That's not the way sovereignty works in relation to beings with free will. I get upset when my dog disobeys as soon as my back is turned. That doesn't mean I would turn her into a robot to save myself from having to put up with her disobedience.
We act on impulse and do what we should not because we choose sin over God. We are responsible for our actions, not God.
If we can not make deliberate decisions, then nothing we do is really us doing it and we are just automatons.
This robot argument doesn't make sense to me. It seems to be a justification for a deity creating a world in which people make moral errors, and it works by suggesting that such an arrangement is preferable to one in which all people are hardwired to pursue virtue, calling the latter robotic. I assume that what is meant by that is some kind of subhuman existence similar to zombification or lobotomy, where the human spirit disappears along with the ability to make immoral choices.
I think many people would agree with me that if they had omniscience and omnipotence and wanted to create a sinless humanity, they would simply create people with only the will to do good. There is no reason to think that such a life is any less full or meaningful than one where people have to deal with competing urges and tame one of them. In fact, it ought to be a better life, one free of shame or self-loathing, one with no need for remorse.
Like many others, I've spent a lifetime attempting to tame these urges with some success. I haven't had the urge to steal in forever, but there was a time when I had to choose between stealing coins in a friend's bedroom and not stealing them. I made the wrong choice at times, and eventually I not only learned not to steal, I lost the desire to. Did I become a bit robotic when I culled that immoral impulse from my repertoire of desires? Isn't the human condition the conflict between baser survival urges inherited from our reptilian and pre-human mammalian ancestors, and the strictly human intellectual and moral imperatives arising from higher cortical centers - you know, the conflict represented by an angel sitting on one shoulder, a devil on the other, and the two "arguing" with one another by speaking through the ears? Those are our two voices, the base urges and the higher ones. Are we really better off having to endure this struggle, and at times failing?
Is this what is meant by not being a robot, having two voices instead of one? The suggestion seems to be that if the base voice were silenced and immoral impulses ceased, so that one heard only the angel side of his nature, he would be less than human. Think about it. That somebody would be the best that a human can be.
I've nearly got my devil side tamed. I hardly hear from it anymore; that is, I rarely have the desire to do something that I later regret, I experience little inner conflict, and I rarely fail to heed my conscience and reasoning faculty, since they have almost no competition. Is that not also free will? Is that not exactly the path to right living and right thinking?
You wrote, "If we can not make deliberate decisions, then nothing we do is really us doing it and we are just automatons." How is what is being described not making deliberate decisions? Most of the time, we are of one mind, as I am while writing these words. There is no inner conflict or urge to write immorally. I am of one mind, and I choose to follow its exhortations. Am I not making deliberate decisions when I am of one mind, or am I an automaton?
Yesterday I was in a check-out line in a market that required shoppers to be masked. The woman behind me had hers under her chin. I felt the devil well up, and soon thereafter, the angel. The devil wanted to say something to her angrily. It wasn't hard to suppress that, but I was not able to conceal the disdain on my face. I think I failed by my own standards. I would have preferred not having to deal with that base urge at all, or, at a minimum, being able to conceal my reaction completely.
My wife has no such impulse. For her, there was no desire to say anything to the woman at all. She had one will. I had two in conflict. Her state, being of one will, is what is being called being a robot. My internal struggle is being called the gift of free will, granted by a loving deity to avoid making men into robots, as if it were the preferred state. It's not.
This is why so many of us say that a loving god with a desire that man behave according to its rules would download those rules into his conscience, which is what we try to do with ourselves when we attempt to place our higher centers in complete control and indulge the base urges only when the exclusively human part of us approves, that is, when there is no conflict between different parts of the mind.
It's also what we try to do for our children. We labor tirelessly to help them tame those base urges and direct them only to support the agenda of the higher centers, as when we try to stop siblings from "hating" one another. If we had the power, we would reach into them and purge those impulses from them. Do we think of converting them from a mind where the devil can prevail to one where it isn't even heard as making them into robots? No. We think of that as raising them well.
And many of us are applying that same way of thinking to the matter of a deity that lets man fail morally by endowing him with a mind capable of generating two wills, a devil and an angel, rather than one, just an angel. Isn't that the model for God, who we are told only wants to do good, and has no internal moral struggle? Isn't that the robot that apologists describe when others suggest that being of one perfectly moral mind would be the preferred state in man as well?
This is why I reject the idea of man's present condition, living in a head with more than one will, one of which will cause him to make moral errors, as a preferred state. It's not. It's the one that evolution has given us. The beasts don't have these moral dilemmas, for lack of a conscience, and the perfected soul doesn't have them either, just as a god wouldn't. It's poor humanity, caught between these two worlds, the one he came from and the one he is headed toward, who experiences moral dilemmas and shame.
Once again, this is not a gift. Nor is it desirable. But it's what we have and who we are, so the religious apologist attempts to make it seem desirable, since he believes that it is the design of a tri-omni God. And in attempting to defend this arrangement, when the skeptic says that if he were God he would make man of one mind and perfectly moral, the apologist answers, "God doesn't want robots." It sounds good until one looks at that claim closely, as I have tried to do here.