Google’s Gemini AI was mocked for its revisionist history, but it still highlights a real problem

Ask Google’s generative AI tool Gemini to create images of American Revolutionary War soldiers and it might present you with a Black woman, an Asian man, and a Native American woman wearing George Washington’s blue coats.

That diversity has gotten some people, including Frank J. Fleming, a former computer engineer and writer for the Babylon Bee, really mad. Fleming has tweeted a series of increasingly frustrated interactions with Google as he tries to get it to portray white people in situations or jobs where they were historically predominant (for example, a medieval knight). The cause has been taken up by others who claim it’s diversity for diversity’s sake, and everything wrong with the woke world.

There’s just one problem: Fleming and his fellow angry protesters are on a futile mission. “This can’t be done with these systems,” says Olivia Guest, assistant professor of computational cognitive science at Radboud University. “You can’t guarantee behavior. That’s the point of stochastic systems.”

The current generation of generative AI tools is stochastic: as one famous academic paper published in 2021 put it, these systems can produce different outputs even when given the same input. That randomness is what has made generative AI capture the public’s attention: it doesn’t just repeat the same thing over and over again.
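To make that concrete, here is a minimal sketch of stochastic sampling, assuming an invented three-word vocabulary and made-up probabilities rather than any real model:

```python
import random

# Invented next-token probabilities for a single fixed prompt.
next_token_probs = {"blue": 0.5, "red": 0.3, "green": 0.2}

def sample_next_token(probs):
    tokens = list(probs)
    weights = list(probs.values())
    # random.choices draws proportionally to the weights, so repeated
    # calls with identical input can return different tokens.
    return random.choices(tokens, weights=weights, k=1)[0]

# Same input three times; the output can differ on every run.
for _ in range(3):
    print(sample_next_token(next_token_probs))
```

Guaranteeing one particular output from a sampler like this would mean overriding the randomness itself, which is exactly Guest’s point.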

Experts also question whether the AI chatbot results presented by the angry mob on social media are the full picture—literally. “It’s difficult to assess the trustworthiness of any content that we see on platforms such as X,” says Rumman Chowdhury, CEO and co-founder of Humane Intelligence. “Are these cherry-picked examples? Absent an at scale image generation analysis that is able to be tracked and mapped across many different prompts, I would not feel that we have a clear grasp of whether or not this model has any sort of bias.”

Google has acknowledged the uproar and says it’s taking action. “We are aware that Gemini is offering inaccuracies in some historical image generation depictions, and we are working to fix this immediately,” Jack Krawczyk, the product lead for Google Bard, wrote on X. Krawczyk noted that depicting historical events means balancing two competing interests: accurately representing history as it happened, and the need to “reflect our global user base.”

But tweaking the underlying issues might not be so easy. Fixing stochastic systems is trickier than it looks, and so is drawing up guardrails for AI models, which can be subverted unless you resort to brute-force blocking. (Google previously ‘fixed’ image recognition software that identified Black people as gorillas by preventing the software from recognizing any gorillas at all.) At that point it isn’t a stochastic system anymore, which means the thing that makes generative AI unique is gone.
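As a rough sketch of what brute-force blocking looks like in practice (the blocklist and function here are hypothetical, not Google’s actual code):

```python
# Hypothetical blocklist: instead of repairing the model's mistakes,
# the system deterministically refuses to emit certain labels at all.
BLOCKED_LABELS = {"gorilla"}

def filter_labels(predicted_labels):
    # Blocked labels are dropped whether the prediction was right or
    # wrong -- the capability is removed, not the bias fixed.
    return [label for label in predicted_labels if label not in BLOCKED_LABELS]

print(filter_labels(["zebra", "gorilla", "giraffe"]))  # ['zebra', 'giraffe']
```

The filter is deterministic by design, which is precisely the trade-off the article describes: the fix works by discarding the model’s stochastic behavior for those cases.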

The whole brouhaha raises an interesting question, says Chowdhury. “It is really difficult to define whether or not there is a correct answer to what images should be generated,” she says. “Relying on historical accuracy may result in the reinforcement of the exclusionary status quo. However, it [diversifying outputs] could run the risk of being simply factually incorrect.”

For Yacine Jernite, machine learning and society lead at AI company Hugging Face, the problem extends beyond Gemini. “This isn’t just a Gemini issue, rather a structural issue with how several companies developing commercial products without much transparency are addressing questions of biases,” he says. It’s a subject Hugging Face has written about before. “Bias is compounded by choices made at all levels of the development process, with choices earliest having some of the largest impact—for example, choosing what base technology to use, where to get your data, and how much to use,” says Jernite.

Jernite fears that what we’re seeing could be the result of companies implementing what they see as a quick, relatively cheap fix: if their training data overrepresents white people, they can modify prompts under the hood to inject diversity. “But it doesn’t really solve the issue in a meaningful way,” he says.
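A minimal sketch of the kind of under-the-hood rewrite Jernite describes, assuming hypothetical qualifier strings and function names (this is not Gemini’s actual pipeline):

```python
import random

# Hypothetical qualifiers appended server-side; the user never sees
# the rewritten prompt.
DIVERSITY_QUALIFIERS = [
    "of diverse ethnicities",
    "of diverse genders and ages",
]

def rewrite_prompt(user_prompt):
    # The cheap fix: blindly append a qualifier before the prompt
    # reaches the image model, regardless of historical context.
    return f"{user_prompt}, {random.choice(DIVERSITY_QUALIFIERS)}"

print(rewrite_prompt("an American Revolutionary War soldier"))
```

Because the qualifier is applied regardless of context, it lands on historical prompts too, producing exactly the kinds of images that set off the uproar.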

Instead, companies need to address the issue of representation and bias openly, Jernite argues. “Telling the rest of the world what you’re doing specifically to address biased outcomes is hard: It exposes the company to having external stakeholders question their choices, or point out that their efforts are insufficient—and maybe disingenuous,” he says. “But it’s also necessary, because those questions need to be asked by people with a more direct stake in bias issues, people with more expertise on the topic—especially people with social sciences training which are notoriously lacking from the tech development process—and, importantly, people who have a reason not to trust that the technology will work, to avoid conflicts of interest.”

https://www.fastcompany.com/91034044/googles-gemini-ai-was-mocked-for-its-revisionist-history-but-it-still-highlights-a-real-problem?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Created 11mo ago | Feb 21, 2024, 21:50:07

