I decided to start this thread to understand how feminism has evolved since the 20th century.
Today women are protected by many laws that safeguard their rights and punish gender inequality and discrimination, so they can no longer be considered a "weak" gender.
So what I am wondering is: is feminism in the West still a political priority?
I guess it's not.
In the West, feminists seem to fight over non-existent problems, forgetting that feminism is desperately needed in countries where women have no rights and are considered inferior to men.
I am asking feminists to explain this to me: thank you in advance.