I was just saying that it doesn't really matter if atheism grows in the West. It doesn't mean that Christianity is going to die out, just that the global center of Christianity is changing.
Truthfully, the West is on the verge of collapse. Western Europe is in turmoil and American culture is falling apart. Culture and society are decaying and becoming more and more degraded. The economy is in the crapper and the gap between the rich and the poor keeps getting wider. So an increase in atheism isn't surprising, since the culture encourages hedonism, selfishness and individualism. A lot of people aren't very educated about religion, either. Plus, the public face of religion in America is mostly a joke. You have morons like Pat Robertson, Joel Osteen and a number of others who make a mockery of religion, promote hatred and use it as a tool to get rich.
But true Christianity will persevere as it always has. Besides, if we really are getting close to the Eschaton (the end of time), all of this has been prophesied. There is nothing to fear, as always. God is with us.