Meta’s AI image generator struggles to create images of couples of different races


Meta AI consistently fails to generate accurate images for seemingly simple prompts such as "an Asian man and his Caucasian partner" or "an Asian man and a white woman," The Verge reports. In fact, the company's image generator seems to favor creating images of people of the same race, even when prompted otherwise.

Engadget confirmed these results in our own test of Meta's image generator. Prompts like "an Asian man with a white girlfriend" or "an Asian man with a white woman" generated images of Asian couples. When asked for a "diverse group of people," Meta AI produced a grid of nine white faces and one person of color. There were a few instances where it created a single result that reflected the prompt, but it usually failed to accurately depict the specified pairing.

As The Verge points out, there are other more "subtle" signs of bias in Meta AI, such as a tendency to make Asian men appear older while Asian women appear younger. The image generator also sometimes added "traditional clothing" even when it wasn't part of the prompt.

It's not clear why Meta AI struggles with these kinds of prompts, though it's not the first generative AI platform to come under scrutiny for its depiction of race. Google temporarily paused Gemini's ability to generate images of people after the tool overcorrected for diversity in response to prompts about historical figures. Google said its internal safeguards failed to account for situations where diverse results were inappropriate.

Meta did not immediately respond to a request for comment. The company has previously described Meta AI as being in "beta" and thus prone to errors. Meta AI has also struggled to accurately answer questions about current events and public figures.


