doctor
noun
A doctor is a person who has studied medicine and treats people who are ill or injured. If you have any health problems, a doctor examines you, tells you what is wrong, and then treats you or gives you appropriate recommendations so that you feel better soon. Doctors also prescribe medicine if needed. They typically wear a white coat.