I’m trying to get perspective on this particular beauty standard and how I want to approach it. Do people whiten their teeth where you live? Is it seen as expected to do so? Do you live in a city?
I have healthy teeth that have nevertheless seen a lot of tea and coffee. I have generally thought of this as similar to wrinkles, i.e. a natural thing bodies do that I don’t want to pay money to fix since it isn’t broken. I still think this. But lately I have been feeling like there might be more actual social stigma attached to my teeth being discolored, and I am wondering if that is at all real. Has whitening become something all adults are now expected to do? I thought I’d ask how other people feel and think about this and what the general norm is in your social circle.
Edit: thanks for the responses everybody.
It may be natural, but so is body odor, and we still shower and use deodorant.
Yellow teeth don’t look good, it’s just that simple, and whitening isn’t expensive for most people - just buy a generic whitening kit from a drugstore. If it works for you, you’ve won the whitening lottery.
Read the directions and the warnings, and follow them.
I can’t use most of them, as they hurt my teeth (I’m sensitive to the ingredients).
No, you’re objectively wrong on this. It is more akin to cosmetic surgery because it is harmful to your teeth and potentially dangerous. This isn’t a normal hygiene standard.
They never called it hygiene.
It is indeed potentially harmful, but a qualified, legitimate dentist can tell any given person whether it’s okay for them.
I think comparing whitening to bathing and using deodorant is calling it normal hygiene. Not bathing literally leads to worse health outcomes.
That being said, you’re correct that I have a strong bias here. My dentist has told me not to do it because it damages the enamel. Consulting your dentist is definitely a good move.