Western United States
The Western United States, commonly referred to as the American West or simply the West, traditionally refers to the region comprising the westernmost states of the United States. Because European settlement in the U.S. expanded westward after its founding, the meaning of the West has evolved over time. Prior to about 1800, the crest of the Appalachian Mountains was seen as the western frontier.



© This article uses material from Wikipedia® and is licensed under the GNU Free Documentation License and under the Creative Commons Attribution-ShareAlike License