It would seem to me that white people are not merely aware but are actively fomenting change, like pushing for more accountability. My question is: why relegate it to just white people?
White people are the majority and have historically and systemically held the most power, wealth, and so on (I realize conservatives have a learned, automatic aversion to the word, but the shorthand for all of this is privilege). If things are to change on a broad scale in our society, white people will have to be the ones pushing those changes. White people will have to come to terms with being the inheritors and beneficiaries of a racist system, and work to undo the damage it has done.