Dentists are medical professionals who specialize in the care of the teeth, gums, and mouth. Their duties include filling cavities, performing oral surgery, treating gum disease, and educating patients on oral health and the prevention of tooth decay. Dentists typically hold a degree in dentistry from an accredited dental school and must pass state and professional licensing exams. While many dentists run their own practices with support staff, some work in larger facilities such as hospitals. Aspiring dentists can get a head start by enrolling in a pre-dentistry program at a university.