Obviously I have a different take on it, though I'm not immune to using fossilized phrases like "God willing" and "God forbid" myself. But I have always wondered why people blame God and point the finger at Him/Her/It for all the bad things that happen. My grandparents were "right off the boat" from Italy; my parents were born in the US. From earliest childhood, all I ever heard from them, and from aunts and uncles, when something bad happened... like a child dying... was "well, it's God's will". Aw bulldookie! I always felt that if there is a God, he would be just as sad and grieving over the death of a child. I never believed God took the child. So where does "why does God let it happen" come from?