Is the United States an Empire?

Let's see what these prominent historians think. Despite national myth to the contrary, during the long 19th century the United States engaged in imperial expansion. American imperial ambitions manifested themselves in expansion across the North American continent, the acquisition of overseas holdings, and the exercise of economic or military influence over other nation-states. From its earliest westward expansion shortly…