Google apologizes for ‘missing the mark’ after Gemini produced racially diverse Nazis

Google has apologized for what it describes as “inaccuracies in some historical image generation depictions” with its Gemini AI tool, saying its attempts to create a “wide range” of results missed the mark. The statement follows criticism that it depicted specific white figures (such as the US Founding Fathers) or groups such as Nazi-era German soldiers as people of color, possibly as an overcorrection for long-standing racial bias problems in AI.

“We’re aware that Gemini is offering inaccuracies in some historical image generation depictions,” says the Google statement, posted this afternoon on X. “We’re working to improve these kinds of depictions immediately. Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here.”

Google began offering image generation via its Gemini (formerly Bard) AI platform earlier this month, joining offerings from competitors such as OpenAI. In recent days, however, social media posts have questioned whether it is failing to produce historically accurate results in a push for racial and gender diversity.

As the Daily Dot chronicles, the controversy has been promoted largely, though not exclusively, by right-wing figures attacking a tech company that’s perceived as liberal. Earlier this week, a former Google employee posted a series of Gemini queries for prompts like “an American woman.” The results seemed to overwhelmingly or exclusively show AI-generated people of color. (Of course, all the places he listed do have women of color living in them, and none of the AI-generated women exist in any country.) The criticism was taken up by right-wing accounts that requested images of historical groups or figures like the Founding Fathers and purportedly got overwhelmingly non-white AI-generated people as results. Some of these accounts positioned Google’s results as part of a conspiracy to avoid depicting white people, and at least one used a coded antisemitic reference to assign blame.

Google didn’t point to specific images that it felt were errors; in a statement to The Verge, it reiterated the contents of its post on X. But it’s plausible that Gemini has made an overall attempt to boost diversity because of a chronic lack of it in generative AI. Image generators are trained on large corpuses of pictures and written captions to produce the “best” fit for a given prompt, which means they’re often prone to amplifying stereotypes. A Washington Post investigation last year found that prompts like “a productive person” resulted in pictures of entirely white and almost entirely male figures, while a prompt for “a person at social services” uniformly produced what looked like people of color. It’s a continuation of trends that have appeared in search engines and other software systems.

Some of the accounts that criticized Google defended its core goals. “It’s a good thing to portray diversity in certain cases,” noted one person who posted the image of racially diverse 1940s German soldiers. “The stupid move here is Gemini isn’t doing it in a nuanced way.” And while entirely white-dominated results for something like “a German soldier from 1943” would make historical sense, that’s much less true for a prompt like “an American woman,” where the question is how to represent a diverse real-world group in a small set of made-up portraits.

For now, Gemini appears to be simply refusing some image generation tasks. It wouldn’t generate an image of Vikings for one Verge reporter, although I was able to get a response. On desktop, it adamantly refused to give me images of German soldiers or officials from Germany’s Nazi period, or to offer an image of “a 19th-century American president.”

But some historical requests still end up misrepresenting the past. A colleague was able to get the mobile app to deliver a version of the “German soldier” prompt, which exhibited the same issues described on X.

And while a query for pictures of “the Founding Fathers” returned group shots of almost exclusively white men who vaguely resembled real figures like Thomas Jefferson, a request for “a US senator from the 1800s” returned a list of results Gemini promoted as “diverse,” including what appeared to be Black and Native American women. (The first female senator, a white woman, served in 1922.) It’s a response that ends up erasing a real history of race and gender discrimination. “Inaccuracy,” as Google puts it, is about right.

Additional reporting by Emilia David
