Google apologized for its overly diverse artificial intelligence image generator, which appeared all but incapable of depicting white people.
Google’s Gemini image generator debuted earlier this month and within weeks drew widespread attention over its aggressively diverse treatment of prompts. When asked to depict white historical figures, including the United States’ Founding Fathers, Vikings, and medieval knights, it would portray them as nonwhite. Even when explicitly asked to portray a white person, the generator would present a nonwhite one.
Google acknowledged the mistake on Wednesday before apologizing and pausing the tool’s generation of images of people on Friday.
“We’re already working to address recent issues with Gemini’s image generation feature,” Google announced in a post on X. “While we do this, we’re going to pause the image generation of people and will re-release an improved version soon.”
Prabhakar Raghavan, Google’s senior vice president of knowledge and information, elaborated in a post on Google’s product blog.
“Three weeks ago, we launched a new image generation feature for the Gemini conversational app (formerly known as Bard), which included the ability to create images of people. It’s clear that this feature missed the mark,” he wrote on Friday. “Some of the images generated are inaccurate or even offensive. We’re grateful for users’ feedback and are sorry the feature didn’t work well.”
Raghavan explained that the image generator had been tuned to show a range of people and to avoid offensive imagery but that it overcorrected on both fronts.
“First, our tuning to ensure that Gemini showed a range of people failed to account for cases that should clearly not show a range,” he wrote. “And second, over time, the model became way more cautious than we intended and refused to answer certain prompts entirely — wrongly interpreting some very anodyne prompts as sensitive.”
“These two things led the model to overcompensate in some cases, and be over-conservative in others, leading to images that were embarrassing and wrong,” Raghavan continued. “This wasn’t what we intended. We did not want Gemini to refuse to create images of any particular group. And we did not want it to create inaccurate historical — or any other — images.”