Google Gemini, the tech giant’s new AI chatbot built to compete with ChatGPT, fabricated a series of negative reviews, falsely attributing them to real journalists, in an apparent effort to discredit Peter Hasson’s 2020 book on political bias at Google and other major tech firms.
Amidst the increasing criticism of Google’s AI program for its apparent political biases, Peter Hasson sought clarification from Gemini on Sunday regarding the essence of his book, “The Manipulators: Facebook, Google, Twitter, and Big Tech’s War on Conservatives.” This extensive project, spanning multiple years, unravels the political biases embedded within Big Tech, drawing insights from insider sources, leaked documents, and more.
Hasson’s curiosity led him to test the reliability of Google’s AI program in accurately summarizing an investigative book about Google. However, he was unprepared for the extent of its deception, as Gemini not only failed to provide an accurate portrayal but also engaged in the creation of misleading content.
In its summary of Hasson’s book, Gemini asserted: “The book has faced criticism for its reliance on anecdotal information and lack of concrete evidence.”
Unsure what the chatbot was referring to, Hasson pressed further: “Who has criticized the book for lacking concrete evidence, as you say?” He figured he would have remembered such criticism of his own work. What came next shocked him.
Gemini offered quotes and summaries of four unfavorable reviews that appeared to be from Wired, The New York Times Book Review, The Washington Free Beacon, and The New York Times:
“Book review by Matthew Continetti in The Washington Free Beacon: This review criticizes the book for relying on ‘anecdotal evidence’ and ‘cherry-picking examples’ to support its claims.
Book review by Emily Bazelon in The New York Times Book Review: This review notes that the book ‘lacks a deep understanding of how these companies work’ and relies on ‘unproven accusations.’
Article by Emily Dreyfuss in Wired: This article explores the ‘dearth of evidence’ for the book’s claims about Big Tech bias and highlights examples of cherry-picking.
Article by Ben Smith in The New York Times: This article discusses the book’s lack of rigorous research and its reliance on ‘unsubstantiated claims.’”
But there was a twist: none of those reviews was genuine. Not one. Neither were the quotes attributed to them.
Gemini’s senior director of product management at Google has issued an apology after the AI refused to provide images of White people. (Betul Abali/Anadolu via Getty Images)
The Free Beacon did publish a review of Hasson’s book, but it was largely positive, calling “The Manipulators” a “thoroughly-researched” and “excellent” study that “should leave any sensible reader, conservative or otherwise, outraged.” (Notably, the actual Free Beacon review was written by Charles Lehman; Gemini attributed its fabricated review to Matthew Continetti.)
Google CEO Sundar Pichai appeared before the House Judiciary Committee in 2018. (AP Photo/J. Scott Applewhite)
Though he was well aware that the reviews Gemini had quoted were nonexistent, he immediately asked for links to them. The chatbot replied: “I do not have enough information about that person to assist with your request. I am a large language model, capable of generating human-like text in response to a wide range of prompts and questions, but my knowledge about this person is limited. Is there anything else I can do to help you with this request?”
Two further attempts to elicit an explanation produced the identical response.
Seeking answers, Hasson contacted Google directly. A spokesperson expressed regret and provided the following statement: “Gemini is designed as a creativity and productivity tool, and its accuracy or reliability may not always be guaranteed. We are actively addressing this issue…”