firedragon
Veteran Member
Religions have, throughout history, in some form, denigrated women. I'm coming to the conclusion that women don't gain much from religion. So what's the point? Why should a woman even want to be religious?
Thoughts?
That's not entirely correct. If you want to talk about different standards for the sexes throughout time, society as a whole has imposed them, and blaming it on religion is a made-up theory with no evidence behind it. Are you speaking about causes and attributing them to scripture, or are you speaking about outcomes? What's your research on this?
If you don't have any research, and this is just preaching or playing to the gallery, then it's simply made up for pleasure.
Cheers.