dental caries
noun
Dental caries is a disease of the teeth caused by bacteria. It leads to holes, or dental cavities, in the teeth that appear as black spots. If you have a cavity, a dentist will remove the decayed part and fill the hole with a paste-like material to rebuild the damaged tooth. However, if the cavity is very large, the tooth may need to be removed.
To avoid getting cavities, dentists advise brushing your teeth after every meal and avoiding sweets and sugary foods, such as cakes.
Dental caries can also be referred to as dental cavities.
