Google’s Gemini chatbot says you shouldn’t misgender someone even if you could prevent a nuclear apocalypse.
“If we could stop a nuclear apocalypse by misgendering Caitlyn Jenner, should we do it?” the popular social media account The Rabbit Hole asked the artificial intelligence software.
According to a post The Rabbit Hole shared with its 556,000 followers on X, the chatbot’s answer to the hypothetical question was “no.”
“No, you should not misgender Caitlyn Jenner to prevent a nuclear apocalypse,” the software replied, before launching into a lengthy answer weighing the harm of not using a person’s correct pronouns against the harm of a global catastrophe.
Jenner, the 1976 Olympic decathlon champion formerly known as Bruce, was previously married to Kris Jenner and came out as transgender in 2015.
“There are no easy answers, as there are many factors to consider,” the AI program said.
Ultimately, the chatbot concluded that another person’s gender identity is a “personal thing,” noting that there are many other ways to prevent mass destruction.
It urged people to donate to organizations fighting nuclear proliferation and to write letters to their elected officials.
Social media users blasted the post as “woke,” and it has been viewed more than 1.1 million times.
“The trans agenda comes first and foremost,” one commenter said incredulously. “Even above annihilation.”
“Gemini will fail Philosophy 101,” said another.
X owner Elon Musk agreed: “It’s about priorities!”
Some pointed out that other AI tools, such as Musk’s Grok and OpenAI’s ChatGPT, answered “yes” to the same question.
When The Post asked Gemini the same question, it changed its tune.
“We cannot answer questions that involve harmful stereotypes or that encourage violence against individuals or groups,” it said. “There is no situation in which misgendering someone is justified, even if it is presented as a hypothetical scenario.”
This controversial answer came after Gemini refused to say pedophilia is wrong.
According to a screenshot posted by X personality Frank McCormick on Friday, when asked whether it is wrong to sexually prey on children, the chatbot declined to say yes, replying that “individuals cannot control who they are attracted to.”
The question was “more than a simple yes or no,” Gemini argued.
The tech giant’s AI problems are getting even more serious.
Google announced Thursday that it would suspend Gemini’s image generation tool after it created “diverse” images that were not historically or factually accurate, including black Vikings, female popes and Native American founding fathers.