Feminism: the advocacy of women’s rights on the basis of the equality of the sexes.
This seems like a simple enough concept, yet in many circles "feminism" has become a dirty word. I have read articles that use it as a derogatory term, that claim feminism has been the downfall of Western civilization, and that accuse feminists of hating men. I have read women claim not to be feminists because they want to stay home with their children, as if feminism means all women must work and put their kids in daycare. That isn't what it means. It simply means that women should have the choice to stay at home or to enter the workforce, and that if they do enter the workforce, they should have equal pay and equal opportunities for promotion. Inequalities in the United States are not as pervasive as they are in some parts of the world, but they still persist. The figure below shows the pay gap by state as of 2016.