Definitions
There is 1 meaning of the phrase West Germany.
West Germany - as a noun
A republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990
Synonyms (Exact Relations)
Federal Republic of Germany