Feminism is the belief that women should enjoy the same political, economic, and social rights as men. Feminists criticize the unjust treatment of women in societies across the world. Feminism also refers to a global social movement that seeks to raise awareness of gender inequalities through campaigns for women’s rights, including the rights to vote, to bodily autonomy and freedom from violence, to education, to enter contracts, to equal rights within marriage, and more. The issues raised by feminists have become subjects of research and theoretical interpretation that have transformed the social sciences, and societies, along the way.


Feminists have identified and contested the subordination of women through political activism and research. Over time, they have developed sophisticated theories of gender relations and inequality worldwide. Feminist ideas ...
