I'm not sure myself. I ask because I recently read a couple of articles written by Christians. They were trying to make the case that secularism holds that morality is relative, and that without the never-changing "Word of God", morality is subject to the whims of the people. Eventually, such a secular society will morally decay.
Right? Wrong?