Google acknowledges its biased AI program infuriated users


Google's chief executive has publicly addressed Gemini, the company's controversial AI program, and the biased results users have been getting from it.

He called some of the results users were seeing "unfair" and "unacceptable," and said Gemini's output was not good enough. For example, Gemini, Google's newest AI program, generated images depicting German soldiers from the Second World War as people of color, which drew widespread criticism.


In a message to employees, Sundar Pichai said that Gemini's results had angered a lot of people. "I know that some of its responses have hurt feelings and shown bias, and that's not okay. We made a mistake," he wrote.

"We have teams working nonstop to fix these problems. We're already seeing big improvements on a lot of prompts," Pichai wrote.

Many posts on social media have shown the results users were getting from the program: images of historical figures such as the Pope, the Founding Fathers, and the Vikings rendered with the wrong race or gender. If the technology is to keep improving, this kind of accuracy will have to be a priority.

This is not the first time AI programs have produced distorted depictions of people. Other systems have shown similar biases: OpenAI's image generator, for example, tended to depict a judge as white and a gunman as Black.

Pichai said Gemini would be revised and re-evaluated to make the program work better and produce more accurate results.

As the technology has improved, interest in AI has grown enormously, and many businesses now use AI systems. Several companies have adjusted their programs to ensure that the results they produce are fair and accurate.

