: a broad field of health-care professions made up of specially trained individuals (such as physical therapists, dental hygienists, audiologists, and dietitians) who are typically licensed or certified but are not physicians, dentists, or nurses
—often used before another noun
allied health sciences
the allied health professions