dermatology
noun
der·ma·tol·o·gy
ˌdər-mə-ˈtä-lə-jē
: a branch of medicine dealing with the skin, its structure, functions, and diseases