Beaudreaux
Well-Known Member
Whenever I talk to Christians about... well, almost anything, they refer to the Bible as the authoritative Word of God. If you are a Christian reading this, I am curious: why did you choose the Bible to believe in? You were not born believing this, so there must have been a time in your life when you didn't believe the Bible was God's Word. What caused you to start believing in the Bible over other texts that claim to be holy?