As soon as there is more than one sapient mind, goals are no longer particularly absolute.
If I build a sapient robot with a goal, and it decides that goal is bad and proposes a different one, then my will is no longer absolute over the robot. Nothing inherent in the creator->created relationship dictates that the creator's morality is absolute.
The only theistic context where morals would seem to be absolute is in monist propositions where the only thing that exists is god (all is one). Beyond that, any anthropomorphic/monotheist depiction of a deity is just as prone to relative morality as anything else.