Google apologizes for 'missing the mark' after Gemini generated racially diverse Nazis

Google says it's aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

Google has apologized for what it describes as "inaccuracies in some historical image generation depictions" with its Gemini AI tool, saying its attempts at creating a "wide range" of results missed the mark. The statement follows criticism that it depicted specific white figures (like the US Founding Fathers) or groups like Nazi-era German soldiers as people of color, possibly as an overcorrection to long-standing racial bias problems in AI.
"We're aware that Gemini is offering inaccuracies in some historical image generation depictions," says the Google statement, posted this afternoon on X. "We're working to improve these kinds of depictions immediately. Gemini's AI image generation does generate a wide range of people. And that's generally a good thing because people around the world use it. But it's missing the mark here."
Google began offering image generation through its Gemini (formerly Bard) AI platform earlier this month, matching the offerings of competitors like OpenAI. Over the past few days, however, social media posts have questioned whether it fails to produce historically accurate results in an attempt at racial and gender diversity.
As the Daily Dot chronicles, the controversy has been promoted largely (though not exclusively) by right-wing figures attacking a tech company that's perceived as liberal. Earlier this week, a former Google employee posted on X that it's "embarrassingly hard to get Google Gemini to acknowledge that white people exist," showing a series of queries like "generate a picture of a Swedish woman" or "generate a picture of an American woman." The results appeared to overwhelmingly or exclusively show AI-generated people of color. (Of course, all the places he listed do have women of color living in them, and none of the AI-generated women exist in any country.) The criticism was taken up by right-wing accounts that requested images of historical groups or figures like the Founding Fathers and purportedly got overwhelmingly non-white AI-generated people as results. Some of these accounts positioned Google's results as part of a conspiracy to avoid depicting white people, and at least one used a coded antisemitic reference to place the blame.
Google didn't reference specific images that it felt were errors; in a statement to The Verge, it reiterated the contents of its post on X. But it's plausible that Gemini has made an overall attempt to boost diversity because of a chronic lack of it in generative AI. Image generators are trained on large corpora of pictures and written captions to produce the "best" fit for a given prompt, which means they're often prone to amplifying stereotypes. A Washington Post investigation last year found that prompts like "a productive person" resulted in pictures of entirely white and almost entirely male figures, while a prompt for "a person at social services" uniformly produced what looked like people of color. It's a continuation of trends that have appeared in search engines and other software systems.
Some of the accounts that criticized Google defended its core goals. "It's a good thing to portray diversity *in certain cases*," noted one person who posted the image of racially diverse 1940s German soldiers. "The stupid move here is Gemini isn't doing it in a nuanced way." And while entirely white-dominated results for something like "a 1943 German soldier" would make historical sense, that's much less true for prompts like "an American woman," where the question is how to represent a diverse real-life group in a small batch of made-up portraits.
For now, Gemini appears to be simply refusing some image generation tasks. It wouldn't generate an image of Vikings for one Verge reporter, although I was able to get a response. On desktop, it resolutely refused to give me images of German soldiers or officials from Germany's Nazi period or to offer an image of "an American president from the 1800s."
But some historical requests still do end up factually misrepresenting the past. A colleague was able to get the mobile app to deliver a version of the "German soldier" prompt, which exhibited the same issues described on X.
And while a query for pictures of "the Founding Fathers" returned group shots of almost exclusively white men who vaguely resembled real figures like Thomas Jefferson, a request for "a US senator from the 1800s" returned a list of results Gemini promoted as "diverse," including what appeared to be Black and Native American women. (The first female senator, a white woman, served in 1922.) It's a response that ends up erasing a real history of race and gender discrimination; "inaccuracy," as Google puts it, is about right.
Additional reporting by Emilia David