Is there a belief system out there that believes humans are evil deep down, and that society is what makes us "not evil"? I heard someone say something about this before, but I don't know anything about it. Is it a religion, or is there a word for it or something similar?
I think I heard someone say the author of "Lord of the Flies" believed in something similar, but I couldn't find any information on it.