OK, obviously my answer is no.
But people who leave Christianity, whether drifting off into a secular life or actually converting to a different religion, sometimes cite a negative perspective toward humanity or too much emphasis on 'sin' in Christianity as the reason they were turned off from it.
Do you agree that Christianity has a more negative outlook than other religions? Please say whether you are or were a Christian, and if you left, what made you do so. Was it the teachings, the people, a particular experience?
From my own perspective, I think Christianity is a very hopeful and positive religion, perhaps the most hopeful of all. I can understand, however, that the emphasis on sin, and especially the doctrine of Original Sin, is viewed by many as a negative aspect of Christianity, especially when combined with some Protestant teachings about predestination and hell. Personally, I think that while some meditation on sin and hell (as separation from God) can deepen our faith, to emphasize only these aspects is a shallow, hollow approach to Christianity, and yes, I consider that a very negative face of the religion. It is also worth noting that not all Christian denominations hold Original Sin as part of their doctrine, notably the Eastern Orthodox, and that the ideas of theosis and apokatastasis are not, and have not always been, viewed as heresies.