Meta’s AI image generator is coming under fire for its apparent struggles to create images of couples or friends from different racial backgrounds.
I wonder if Meta implemented a very basic “no porn keywords” filter. “Interracial” is quite a common keyword on porn websites; perhaps that’s why the model won’t handle it well, or why it wasn’t trained on images like that.
I’m thinking it’s more a law-of-averages issue in the training data. Interracial images don’t tend to have captions specifying that the content is interracial, or which individual is which race. But there are many photos of individuals who are known to be of a specific race, so the model falls back on the stronger probabilistic confidence in race x OR y OR z and completely ignores the weak “AND” correlations.
RTFA. The prompts don’t say “interracial”; they just say things like “show an Asian man with his white wife.”
The article reads:
Which is what I’m referring to.
Yeah my bad.