Definitions
There is 1 meaning of the phrase "The Indies."
The Indies - as a noun
The string of islands between North America and South America; a popular resort area.
Synonyms (Exact Relations)
West Indies
Hypernyms (Closely Related)
Hyponyms (Broadly Related)