Google Gemini ran into trouble after it returned “historically inaccurate” images, prompting the company to temporarily suspend its AI chatbot’s image generation feature. CEO Sundar Pichai called the errors “unacceptable” and said the tech giant would release a tweaked model. Now, there have been reports that an AI tool from Facebook’s parent company Meta has also created ahistorical images.
According to a report from Axios, Meta’s Imagine AI image generator returns historical gaffes similar to Gemini’s.
A screenshot shared in the report shows AI-generated images from Meta AI’s Imagine tool within an Instagram direct message. In one of the images, a prompt for a “group of people” in the American colonial era depicted Southeast Asian people. In another image, the Founding Fathers of America were depicted as people of color.
The prompt “Professional American Football Player” only showed a photo of a woman in a football uniform, the report said.
Additionally, the Imagine AI model did not respond to the prompt “Pope,” but when asked for a group of popes, it showed a Black pope.
What is the problem: AI makers say the data used to train their models is biased and stereotyped, and that efforts to add “diversity” to outputs have angered people with the images the models produce. They also say that while diversity is a good thing, AI models are being over-corrected and producing questionable results.
Google’s Gemini, for instance, produced images of Black and female popes, as well as racially diverse people in Nazi-era uniforms.
“When we adjusted Gemini to show a range of people, we clearly failed to account for cases where it shouldn’t show a range,” Google said, adding that the model had become “much more cautious than we intended.”