Feminism

Feminism has been defined as a belief that women have been treated unfairly in society and that the situation should be rectified. This definition encompasses the two major aspects of feminism: It is a body of social theory that seeks to ...
