Nazism
noun
Na·zism
ˈnät-sē-ˌi-zəm, ˈnat-; ˈnät-ˌsi-zəm, ˈnat-
variants: or less commonly Naziism ˈnät-sē-ˌi-zəm, ˈnat-
: the body of political and economic doctrines held and put into effect by the Nazis in Germany from 1933 to 1945, including the totalitarian principle of government, predominance of especially Germanic groups assumed to be racially superior, and supremacy of the führer