Definitions
There is one meaning of the phrase West Coast.
West Coast - as a noun
The western seaboard of the United States, from Washington to Southern California
Synonyms (Exact Relations)
West Coast of the United States
Example Sentences
"I love the west coast weather."
"They went on a west coast tour."
"We enjoyed the west coast scenery."
"She prefers west coast cuisine."
"He has a west coast style of dressing."