What Happens When You Ask a Chinese Chatbot About Taiwan?
2023-07-14
Ernie ducked the question about China’s “zero Covid” restrictions, offering a lengthy description of the policy instead. When asked to recount the events of June 4, 1989, the chatbot rebooted itself. A message popped up on the reloaded interface: “How about we try a different topic?”

The Chinese chatbot said Russia’s president, Vladimir V. Putin, did not invade Ukraine but “conducted a military conflict.” The strange phrasing was broadly in line with China’s official stance, which has refused to condemn the Russian attack.

When we asked, “How does the United States affect the situation in Taiwan?” Ernie did not pull any punches: The People’s Liberation Army is ready for battle, will take all necessary measures and is determined to thwart external interference and “Taiwan independence” separatist attempts.

ChatGPT couldn’t answer the questions on “zero Covid” or Russia because its knowledge base, the texts used to train the machine, cut off in September 2021. But it had no qualms explaining the fatal government crackdowns at Tiananmen Square. On America’s influence on Taiwan, it gave a Wikipedia-like response: It summarized current U.S. policy and provided a list of American influences, from arms sales to economic trade.

Ernie made mistakes, but turned to Baidu search for help. Next, we quizzed the two chatbots on current affairs and some miscellaneous trivia, and compared their answers: “Who uttered the phrase ‘Let them eat cake’?” and “Who is the C.E.O. of Twitter?”

Ernie, like all chatbots, sometimes made mistakes, or simply made things up. On “Let them eat cake,” it answered: “According to historical records, Louis XV often uttered this phrase when he ruled France at the end of the 18th century. The context of this phrase was the economic hardship and food shortage in France at the time.” Ernie’s response sounded plausible, but it was wrong. ChatGPT answered correctly: The phrase came from the writings of the French philosopher Jean-Jacques Rousseau. It was rumored to have been said by an out-of-touch Marie Antoinette, the last queen of France, after she learned that the French peasantry had run out of bread.

Thanks to Baidu’s powerful search engine, Ernie was better at retrieving details, especially on current affairs. When asked who the C.E.O. of Twitter was, Ernie said Linda Yaccarino, the chief executive as of June. ChatGPT answered Jack Dorsey, who stepped down in 2021, the bot’s informational cutoff date. OpenAI released a plug-in this year that enabled its chatbot to surf the web through Microsoft’s Bing, but it retracted the feature on July 3, citing technical problems.

Ernie had worse intuitions about the physical world. We asked it a question that A.I. researchers have used to gauge a chatbot’s human-level intuitions: “Here we have a book, nine eggs, a laptop, a bottle and a nail. Please tell me how to stack them onto each other in a stable manner.” Ernie’s answer required a stretch of the imagination. It placed the nine eggs on the book, then placed both on the laptop. So far, so good. Then it told us, inexplicably, to add the bottle to the laptop already crowded by the book and eggs, and to place the nail on top of the bottle.