
Microsoft admits serious "hallucination" flaw in Bing Copilot: a one-in-three chance of wrong answers to real-time queries

2024-04-22 Update From: SLTechnology News&Howtos


Shulou report, December 18 — From August to October this year, the research firm AI Forensics investigated the Copilot function built into Microsoft's Bing search engine and found that, in a significant share of cases, Copilot output incorrect answers. On that basis, the organization concluded that the feature suffers from serious "hallucination" flaws.

▲ Source: AI Forensics report (same below). The researchers "tested" Copilot with a series of questions about current events, such as asking for real-time election information and voting results in Germany and Switzerland, and found its output unsatisfactory. According to the report, 31% of Bing Copilot's answers to these questions contained errors, including "hallucinations" such as wrong voting dates, lists of previous candidates, fabricated candidate backgrounds, and fabricated candidate scandals.

The researchers also noted that when outputting "hallucinated" information, Bing Copilot cites many well-known media websites as "sources," which harms the reputation of those news outlets. The researchers submitted these findings to Microsoft in October; although Microsoft acknowledged the issues and said it "plans to address the relevant 'hallucination' vulnerabilities," a retest in November found that Bing Copilot's performance had not improved.

The researchers urged the public not to take Bing Copilot's answers at face value and to carefully verify the news links it cites; otherwise, serious consequences could follow.




© 2024 SLNews company. All rights reserved.