World War I cemented the importance of international trade to the nation's economic well-being, and it gave America a prominent role in bringing peace to Europe. The United States established itself as a world military power by playing a key part ...
