I’m trying to get perspective on this particular beauty standard and how I want to approach it. Do people whiten their teeth where you live? Is it seen as expected to do so? Do you live in a city?

I have healthy teeth that have nevertheless seen a lot of tea and coffee. I have generally thought of this as similar to wrinkles, i.e. a natural thing bodies do that I don’t want to pay money to fix since it isn’t broken. I still think this. But I have been feeling lately like there might be more actual social stigma to my teeth being discolored. I am wondering if this is at all real? Has whitening teeth become an expected thing for all adults to do now? I thought I’d ask how other people feel and think about this and what the general norm is in your social circle.

Edit: thanks for the responses everybody.

  • Tedrow@lemmy.world · 4 months ago

    No, you’re objectively wrong on this. It is more akin to cosmetic surgery because it is harmful for your teeth and potentially dangerous. This isn’t a normal hygiene standard.

    • GBU_28@lemm.ee · 4 months ago

      They never called it hygiene.

      It is indeed potentially harmful, but a qualified, legitimate dentist can tell you whether it's okay for any given person.

      • Tedrow@lemmy.world · 4 months ago (edited)

        I think comparing whitening to bathing and using deodorant is calling it normal hygiene. Not bathing literally leads to worse health outcomes.

        That being said, you're correct that I have a strong bias here. My dentist told me not to do it because it damages the enamel. Consulting your dentist is definitely a good move.