American imperialism
American imperialism is the economic, military, and cultural influence of the United States on other countries. Such influence often goes hand in hand with expansion into foreign territories. The concept of an American Empire was first popularized during the presidency of James K. Polk, who led the United States into the Mexican–American War of 1846 and the eventual annexation of California and other western territories via the Treaty of Guadalupe Hidalgo and the Gadsden Purchase.


© This article uses material from Wikipedia® and is licensed under the GNU Free Documentation License and the Creative Commons Attribution-ShareAlike License.