The dental industry in America is massive. Why is it such an important part of the American lifestyle?

  • kava@lemmy.world · 1 year ago

    The US spends a lot more on healthcare than anywhere else, and dental health is still healthcare. From a quick bit of research, it doesn’t seem like the US spends significantly more on the dental industry than other countries do.

    US healthcare is expensive for a number of reasons, the main one being that health insurance and drug companies like $$$ and own our politicians.