Definitions
There is one meaning of the phrase Western United States.
Western United States - as a noun
The region of the United States lying to the west of the Mississippi River.
Synonyms (Exact Relations)
West