Western medicine
noun
: the typical methods of healing or treating disease that are taught in Western medical schools