In the ever-evolving world of artificial intelligence, Google's Gemini has hit a speed bump. The tech giant's flagship generative AI suite of models is taking a brief hiatus from generating images of people while Google addresses accuracy issues.
Google announced on social media platform X that it's putting a "pause" on Gemini's people image generation. The reason? The AI has been generating images that are historically inaccurate, causing quite a stir on social media.
Imagine seeing the U.S. Founding Fathers depicted as American Indian, Black or Asian. That's exactly what Gemini has been doing, leading to a wave of criticism and ridicule across the digital sphere. Even Paris-based venture capitalist Michael Jackson weighed in on LinkedIn, labelling Google's AI as a "nonsensical DEI parody" (DEI standing for 'Diversity, Equity and Inclusion').
Google confirmed on X that it's "aware" of inaccuracies in some of Gemini's historical image depictions. The company stated, "We're working to improve these kinds of depictions immediately. Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here."
Generative AI tools, like Gemini, produce outputs based on training data and other parameters. They've often faced criticism for producing stereotypical outputs, such as sexualized imagery of women, or depictions of white men when prompted for high-status job roles.
Google has faced similar backlash before. Back in 2015, an AI image classification tool by Google misclassified Black men as gorillas, causing outrage. Google promised to fix the issue, but as reported by Wired, the 'fix' was a workaround that blocked the tech from recognizing gorillas altogether.
As Google works on improving Gemini, it's a stark reminder that even in the world of AI, accuracy and sensitivity are crucial. Stay tuned for more updates on this intriguing story.
Made with TRUST_AI - see the Charter: https://www.modelprop.co.uk/trust-ai