Definitions
There is 1 meaning of the phrase United States Army.
United States Army - as a noun
The army of the United States of America; the agency that organizes and trains soldiers for land warfare.
Synonyms (Exact Relations)
army, U. S. Army, US Army, USA