HEALTHCARE IN THE United States encompasses myriad activities, from basic research funded by the National Institutes of Health to Medicaid, which provides medical care to the very poor. Even though the federal government does not provide Americans with universal health insurance, it nonetheless funds over half of the nation's overall health expenditures. Because the United States lacks national health insurance, however, discussion of healthcare in America tends to revolve around that issue. The fragmented nature of the U.S. healthcare system reflects the dominance of conservative views that oppose government financing of medical care.

Until the 20th century, government influence over healthcare was minimal. In 1798, the federal government began a program of health insurance for mariners, but that was the extent of its ...
