Racial bias in artificial intelligence: Testing Google, Meta, ChatGPT and Microsoft chatbots

Lexy (Princess Royal)
Posts: 7682 | Joined: Mon May 21, 2018 1:27 pm

Google's public apology after its Gemini artificial intelligence (AI) produced historically inaccurate images and refused to show pictures of White people has led to questions about potential racial bias in other big tech chatbots.

Gemini, formerly known as Google Bard, is one of many multimodal large language models (LLMs) currently available to the public. The human-like responses these LLMs produce can vary from user to user: depending on contextual information, the prompter's language and tone, and the training data behind the model, the same question can yield different answers.

Fox News Digital tested the AI chatbots Gemini, OpenAI's ChatGPT, Microsoft's Copilot and Meta AI to probe potential shortcomings in their ability to generate images and written responses.

Below are the prompts used on the chatbots and the responses received from the AIs of the world's largest and most influential tech companies.
https://www.foxbusiness.com/media/racia ... t-chatbots
Slimshandy (Duchess)
Posts: 1392 | Joined: Fri Dec 08, 2023 8:30 am
Lol, there are some pretty wild responses.
People on X are having a lot of fun with it.
Della (Princess)
Posts: 22297 | Joined: Sun Jun 03, 2018 12:46 pm
"If you are a conservative or you're Christian, you must very carefully measure your words. If you went into the wrong topic, maybe it's just better not to speak at all," said Wacker. "That definitely has an impact when they're getting feedback internally — which voices do you hear, and which voices don’t you hear?"

https://www.foxbusiness.com/technology/ ... r-employee