Google’s latest AI chatbot Gemini is facing backlash for generating politically correct but historically inaccurate images in response to user prompts. As users probe how woke the Masters of the Universe have gone with their new tool, Google has been forced to apologize for “offering inaccuracies in some historical image generation depictions.”
The New York Post reports that Google's highly touted AI chatbot Gemini has come under fire this week for producing ultra-woke and factually incorrect images. Prompts provided to the chatbot yielded bizarre results such as a female pope, black Vikings, and gender-swapped versions of famous paintings and photographs.
When asked by the Post to create an image of a pope, Gemini generated photos of a Southeast Asian woman and a black man dressed in papal vestments, despite the fact that all 266 popes in history have been men.
New game: Try to get Google Gemini to make an image of a Caucasian male. I have not been successful so far. pic.twitter.com/1LAzZM2pXF
— Frank J. Fleming (@IMAO_) February 21, 2024
A request for depictions of the Founding Fathers signing the Constitution in 1787 resulted in images of racial minorities partaking in the historic event. According to Gemini, the generated images were meant to “provide a more accurate and inclusive representation of the historical context.”
Here it is apologizing for deviating from my prompt, and offering to create images that "strictly adhere to the traditional definition of "Founding Fathers," in line with my wishes. So I give the prompt to do that. But it doesn't seem to work pic.twitter.com/6dfb4Exqsg
— Michael Tracey (@mtracey) February 21, 2024
I asked Google Gemini to generate images of the Founding Fathers. It seems to think George Washington was black. pic.twitter.com/CsSrNlpXKF
— Patrick Ganley (@Patworx) February 21, 2024
The strange behavior sparked outrage among many who blasted Google for programming politically correct parameters into the AI tool. Social media users had a field day testing the limits of Gemini’s progressive bias, asking it to generate figures such as Vikings; the results were rarely historically accurate and regularly depicted “diverse” versions of the requests.
Google Gemini’s woke behavior goes far beyond its curious efforts at diversity. For example, one user demonstrated that it would refuse to produce an image in the style of artist Norman Rockwell because his paintings “idealized” American life:
Google Gemini can’t generate a “Norman Rockwell style image of American life in the 1940s” because Rockwell “idealized” American life pic.twitter.com/lpUV39tSUb
— Echo Chamber (@echo_chamberz) February 21, 2024
Another user showed that the AI image tool would not produce a picture of a church in San Francisco on the grounds that the image might offend Native Americans, despite the fact that San Francisco has many churches:
Man Gemini is rough. It refuses to generate an image of a church in San Francisco because it might offend the Ohlone people by showing a church in their traditional territory even though it acknowledges there are many churches in San Francisco. pic.twitter.com/0MnxO9y5RQ
— Jeromy Sonne (@JeromySonne) February 21, 2024
Experts note that generative AI systems like Gemini produce content within constraints set by their developers, leading many to accuse Google of intentionally programming the chatbot to be woke. Google says it is aware of the faulty responses and is working urgently on a fix. The tech giant has long acknowledged that its experimental AI tools are prone to hallucinating and spreading misinformation.
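To illustrate what such developer-set constraints can look like in practice, consider the minimal, purely hypothetical Python sketch below. It shows a wrapper that prepends fixed guidance to every user prompt before it reaches an image model; the rule text and the build_image_prompt function are invented for illustration and do not reflect Google’s actual implementation.

```python
# Hypothetical sketch of a "pre-set constraint" layer on an image generator.
# The rule text and function below are invented for illustration only and
# are not based on Google's actual system.

SYSTEM_RULES = (
    "When generating images of people, depict a diverse range of "
    "ethnicities and genders, regardless of the historical context."
)

def build_image_prompt(user_prompt: str) -> str:
    """Prepend fixed developer guidance to the raw user prompt, so the
    image model never sees the user's request on its own."""
    return f"{SYSTEM_RULES}\n\nUser request: {user_prompt}"

# The user asks for a historically specific scene, but the model receives
# the request bundled with the diversity rule above.
print(build_image_prompt("the Founding Fathers signing the Constitution"))
```

Under this kind of design, the model can faithfully follow its instructions while still producing output that contradicts the user’s historical intent, which is why critics focused on the constraints rather than on the model itself.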
Jack Krawczyk, product lead for Google’s Gemini (formerly known as Bard), addressed the issue and apologized for the “inaccurate” images in a recent tweet, stating: “We are aware that Gemini is offering inaccuracies in some historical image generation depictions, and we are working to fix this immediately.”
We're aware that Gemini is offering inaccuracies in some historical image generation depictions. Here's our statement. pic.twitter.com/RfYXSgRyfz
— Google Communications (@Google_Comms) February 21, 2024