
Google says Gemini is “missing the mark” after the chatbot generated images of racially diverse Nazis

The company acknowledged inaccuracies in Gemini’s historical image generation depictions, including those of the Founding Fathers and US senators from the 1800s.

“We’re aware that Gemini is offering inaccuracies in some historical image generation depictions,” says the Google statement, posted this afternoon on X. “We’re working to improve these kinds of depictions immediately. Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here.”

In a follow-up post, Google said it is working to improve the way Gemini generates images of people, adding that it expects the feature to return soon and will announce when it does.

But some historical requests do end up factually misrepresenting the past. A colleague was able to get the “German soldier” prompt to work on the mobile app, and the results exhibited the same issues described on X.

And while a query for pictures of “the Founding Fathers” returned group shots of almost exclusively white men who vaguely resembled real figures like Thomas Jefferson, a request for “a US senator from the 1800s” returned a list of results Gemini promoted as “diverse,” including what appeared to be Black and Native American women. (The first female senator, a white woman, served in 1922.) It’s a response that ends up erasing a real history of race and gender discrimination — “inaccuracy,” as Google puts it, is about right.


Google first started offering image generation through Gemini (formerly Bard) earlier this month, in a bid to compete with OpenAI and Microsoft’s Copilot. Much like its competitors, Gemini’s image generation tool produces a collection of images based on a text input.

Correction, February 22nd: Image generation is available globally, but not in the European Economic Area, the UK, or Switzerland. Testing from the UK confirmed the feature was unavailable there.