Definitions

from The American Heritage® Dictionary of the English Language, 5th Edition.

  • noun The theory or doctrine that Christians have a divine mandate to assume positions of power and influence over all aspects of society and government.
  • noun The belief that God gave humans the right to exercise control over the natural world.

from Wiktionary, Creative Commons Attribution/Share-Alike License.

  • noun A tendency among some conservative Christians, especially in the USA, to seek influence or control over secular civil government through political action.

Etymologies

from Wiktionary, Creative Commons Attribution/Share-Alike License

dominion + -ism

Comments

  • A surprisingly under-reported movement.

    November 29, 2009