When most people think about the term work benefits, health insurance comes to mind first, followed closely by dental insurance. This should come as no surprise, since dental insurance sits atop many people’s lists of important job perks. But why exactly is dental insurance so important and necessary? That answer can be found in the midst of a debilitating toothache that seems to throw your world into one giant…