I think when it comes to the West and imperialism (whether it's military, cultural, economic, or nationalistic), we probably do have a lot of dark pages in our history - and it's not entirely over yet, either. But to be honest, I would have expected greater resistance to Western cultural imperialism in terms of popular culture - movies, music, styles, fashion, and so on. Religion seems like it would take a back seat to those more blatant examples of cultural imperialism, which are evident on a global scale.
However, when it comes to those who push Western religion, I don't get the sense that they're working on behalf of any government. That is, I don't see a geopolitical ulterior motive behind it. Many are probably true believers who genuinely think they're on a mission from God to convert people to their religion.
That's not something I would choose to do myself, since I don't really believe in God (though I still like to argue about the hypothetical possibilities sometimes). But I guess some people just feel a certain "calling" to go out into the nations, preach the gospel, and spread the "good news." I try not to read too much into that, but calling it "imperialism" seems a bit of a stretch - at least in this day and age. Admittedly, some of the older colonial empires were strict monarchies that forced all of their subjects to follow the same religion. The Spanish Empire was particularly notorious in that regard, what with the Spanish Inquisition, which nobody ever expected.
But fortunately, it's not like that anymore.