tropical medicine
noun
: a branch of medicine dealing with tropical diseases and other medical problems of tropical regions