Google AI Under Fire for Refusing to Answer Holocaust Questions
By Jon Levine
Published May 11, 2024 | Updated May 11, 2024, 3:49 p.m. ET
Google Nest Virtual Assistant Faces Backlash for Holocaust Denial
Google is facing criticism after its virtual assistant, Google Nest, refused to answer basic questions about the Holocaust. The incident has sparked outrage across social media.
Ignorance About the Holocaust
The controversy began with a viral video in which an Instagram user, Michael Apfel, asked the Google Nest assistant how many Jews were killed during the Holocaust. The assistant replied “Sorry, I don’t understand” to every Holocaust-related question.
Google Nest’s AI, however, had no trouble answering questions about the Nakba, a term used to describe the displacement of Palestinians during the creation of Israel. The AI even referred to it as the “ethnic cleansing of Palestinians.”
Breaking Trust with Users
Author and blogger Tim Urban confirmed the experiment and expressed his disappointment with Google’s handling of the situation. Urban stated, “You expect Google to provide reliable answers and trust the company behind them. Instances like these, however, raise doubts and suggest that Google’s commitment to truth may be influenced by politics.”
The video drew millions of views across social media channels and quickly attracted widespread condemnation from users.
Criticisms and Concerns
The incident has highlighted the dangers of AI systems acting as gatekeepers of historical fact. Venture capitalist Tal Morgenstern voiced his concern on Twitter, saying, “Very soon, there will be no living Holocaust survivors, and their stories could be suppressed due to hard-coded filters. History will be written, and in this case, modified opinions may dominate.”
Clifford D. May, founder of the Foundation for Defense of Democracies, strongly condemned the incident, noting a shift from Holocaust denial by individuals to Holocaust denial by artificial intelligence.
Google’s Response
Google spokespersons acknowledged the issue and said it was not intentional. They assured users that immediate action had been taken to fix the bug behind Google Nest’s refusal to answer Holocaust-related questions.
Broader Concerns about Google
Google and its parent company, Alphabet, have faced criticism in the past over perceived political bias in their products. In February of this year, Google’s AI platform Gemini drew ridicule for generating comically woke images, including a woman as pope, black Vikings, female NHL players, and “diverse” versions of America’s Founding Fathers. The incidents have raised questions about the bias and reliability of Google’s products.
Conclusion
The controversy over Google Nest’s failure to answer questions about the Holocaust has renewed questions about the reliability and potential biases of AI systems. Google says it has moved quickly to address the issue, but whether that restores user trust in its products remains to be seen.