Definitions
There is 1 meaning of the phrase
U.S. Army.
U.S. Army - as a noun
The army of the United States of America; the agency that organizes and trains soldiers for land warfare.
Synonyms (Exact Relations)
army, U. S. Army, United States Army, USA