Google’s failed artificial intelligence image generator is proof that the leaders of the company have allowed their ideologically driven racial brain rot to control every aspect of their jobs.
Google announced that it would pause the generation of images of people with its AI tool Gemini after users discovered the forced diversity being imposed on image prompts. Examples included rendering Revolutionary War soldiers as Asian women, depicting the pope as a woman, and turning 1950s NASCAR drivers into women of various skin colors.
Along with that, the tool declared that it could not make images of white people when prompted because doing so would produce “discriminatory and biased content,” adding that “I promote diversity and inclusion in all that I do.”
Gemini, of course, is not the problem here. As even the tool said when asked by Seth Mandel, its responses “are based on the vast amount of text and code” it has been fed and that the data used to train it “can potentially influence” how it generates images. In other words, the Google employees working on the AI tool are so consumed by their twisted view of racism that they turned the bot into an ahistorical and flagrantly racist tool based on the data they used to create it.
That problem is not going to be fixed either, at least in any serious way. Jack Krawczyk, the head of product for Google’s AI division, said in response to all of this that the tool would keep its racist image generation guidelines for “open ended prompts” but that historical images “have more nuance” to “accommodate.” They are not going to make the tool less racist; they just want to make the racism less obvious.
This entire ordeal should be an embarrassment to everyone at Google, especially Krawczyk. Their brains are so addled by racism that they created a racist tool that refuses to create images of people of one skin color while inserting racial tokens into historical images. On top of that, the tool showed that Google employees are also beholden to the Chinese Communist Party, as it refused to generate images of the Tiananmen Square massacre.
This should spark a companywide reflection on just how Google could produce a product so embarrassingly subpar before you even get to the racism angle. Instead, Krawczyk and his team will almost certainly try to sweep this under the rug and make the racism in the AI tool a little less obvious. That would be another failure for Google on top of this one, and proof that the brains behind the company are riddled with racial decay.