Throughout much of the history of the United States, medicine has been intimately connected with ideas of masculinity. Physicians, the majority of whom have been men, have often embodied ideas of masculine professionalism. In addition, definitions of health have been intertwined with masculine ideals, and physician advice about disease treatments has reflected and reinforced social, cultural, and political ideas about masculinity. The history of medicine reveals changing ideas about professional identity, masculine character, and the relationship between the two.

Early America

Much analysis of pre-nineteenth-century American medicine has centered on differences between the emerging medical profession in America and the established tradition practiced in Europe. While American medical structures were derived in some ways from their European predecessors, American physicians tended to define themselves in contrast to ...
