Ryan2065
Well-Known Member
I was reading some of the other forums and a topic came up that made me start to wonder. The basic question was "How do you know the Bible really is the word of God?" and the answer is: you don't. Really, anyone who believes that was just told it by another person.
So when it comes to religion in general, aren't you just told what to believe to some extent?