Noun west germany has 1 sense
  1. West Germany, Federal Republic of Germany - a republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat in World War II; reunified with East Germany in 1990
    --1 is a kind of European country, European nation