-Peacemaker-
Why does it seem like the vast majority of "news" the secular media covers regarding the Church (Christianity) revolves around sex scandals and homosexuality? Is that really the extent of the Church's relevance in the 21st century? Do people realize that the Church is at the forefront of caring for the sick and the homeless? Why does the media seem to shy away from covering stories in which people's lives improved because of what the Church provided?

Which leads to another question: is the term "secular" in America often a code word for a "religion," one whose tenets are built on rationalism (i.e., worship of the self) and sexual "freedom"? In other words, is "secularism" in America another name for a movement that opposes everything Jesus stands for and is, in fact, anti-Christian? Is the media actively trying to promote its "values"?