territorialism
noun
ter·ri·to·ri·al·ism
ˌter-ə-ˈtȯr-ē-ə-ˌli-zəm
1
2 : the principle established in 1555 requiring the inhabitants of a territory of the Holy Roman Empire to conform to the religion of their ruler or to emigrate