The Western United States, commonly referred to as the American West or simply the West, is traditionally the region comprising the westernmost states of the United States. Because European settlement in the U.S. expanded westward after the country's founding, the meaning of the West has evolved over time. Prior to about 1800, the crest of the Appalachian Mountains was seen as the western frontier.