This article comes from the WeChat official account SF Chinese (ID: kexuejiaodian). Author: SF.
In recent years, the rapid development of artificial intelligence has not only advanced society but also handed fraudsters new tools. Using a new deep forgery technique, fraudsters can even merge two people's faces into one and use that single face to successfully apply for documents under both identities.
Author | Alex Hughes
Translation | Li Yi
Editor | Zhao Jiaming
The original article was published in Science Focus, No. 12, 2023.
If you often watch short videos, you have probably seen clips from well-known films and TV dramas in which some characters' faces have been replaced with other people's, usually other celebrities'. This is deep forgery.
Deep forgery refers to using artificial intelligence to fabricate audio and even video realistic enough to pass as genuine. With just a laptop and a short period of training, anyone can master it.
Can everyone use deep forgery? The answer is yes. In fact, anyone can download the relevant software from the Internet and, with a few mouse clicks, easily produce deep-forged works. Moreover, the training cost of deep forgery is remarkably low: it no longer requires data samples lasting minutes or hours; samples of just a few seconds are enough to complete the training.
The amount of data required for deep forgery is also very small, and it is usually data already available on the Internet, which is why celebrities are the most obvious targets of "forgery". But in the Internet age, anyone's personal data can be found online. A student once downloaded just one audio recording of my course, and that was enough for him to collect the data to clone my voice and fabricate new content. Some reports even point out that a highly realistic, highly convincing voice can be produced from an audio sample of only 3 to 10 seconds.
The prosperity of artificial intelligence is therefore a double-edged sword: on the one hand, it has become a powerful tool available to everyone; on the other, it can be abused by people with ulterior motives, to humanity's detriment.
For most of us, deep forgery is just a fun trick. However, some people with ulterior motives have also begun to exploit it. More worryingly, beyond swapping faces, deep forgery can also be used to fake voices: imitating anyone's voice very realistically takes little time or effort.
How do fraudsters use deep forgery to cheat? What I described above is only the technical side. If, on that basis, fraudsters weave in emotional factors such as family affection, keep continuous pressure on the victim, and create a sense of urgency so that nervousness and panic cloud the victim's judgment, then even a not-so-realistic fake voice can succeed in deceiving.
In many cases, fraudsters impersonate the victim's relatives or friends and, over the phone or even on video calls, invent emergencies such as car accidents or kidnappings, demanding money urgently.
The audio and video material used in these deep forgeries usually comes from the Internet. The forgery does not need to be 100% convincing, because once victims fall into panic and despair, their judgment deteriorates sharply, letting the fraudsters succeed with ease.
Now that we know how fraudsters use deep forgery, how can we guard against it?
How can we prevent deep forgery scams? When caught in one, we often lose our judgment in a moment of panic. So when we receive an urgent plea for help from a relative or friend, we must stop and think: is this really their voice? Would they really be in this kind of trouble?
Although there is plenty of software that can detect whether audio or video has been deep-forged, we are unlikely to think of it in the heat of an emergency call. So when a sudden "emergency" comes in, it is best to call or video-call the relative or friend directly to confirm where they are and what they are doing, and then, once calm, assess the authenticity of the information as a whole. Calm, careful thinking is more reliable than any software.
Beyond taking precautions in the moment, is there any way to give ourselves a "vaccination" in advance?
Today we live in a world where personal privacy is hard to protect properly, especially as many of our daily activities, such as socializing and shopping, have moved from offline to online. Our personal information and data have moved online with them, so images, audio, and video of us can be found all over the Internet.
To rule out fraud completely, you would need to erase your entire digital "footprint" from the Internet, which in today's society is simply impossible. What we should do instead, and what we realistically can do, is stay calm and objective when facing a potential deep forgery scam and analyze carefully whether the situation is a fraud.
With the rapid development of artificial intelligence in recent years, deep forgery fraud has also spawned new forms and methods that are hard to guard against.
One new deep forgery method, called "face fusion" (facial morphing), fuses the facial features of two people into a new facial image. If the synthesis is stopped partway, the result is a face that does not exist in the real world yet carries the features of both people at once. Lawbreakers have begun using this "neither-one-nor-the-other" composite to forge identity cards and other documents: for example, merging two people's document photos into one new photo, and even using that photo to apply for a passport under either of the two identities and getting it approved.
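Real facial morphing first warps two photos so their facial landmarks line up, then interpolates both geometry and texture. As a crude, self-contained illustration of only the interpolation step, here is a pixel-level cross-dissolve sketched in NumPy (an assumption for illustration; the article does not describe any specific implementation). Stopping the blend midway, at alpha around 0.5, yields exactly the kind of composite the article describes: an image that is neither person, yet resembles both.

```python
import numpy as np

def cross_dissolve(face_a: np.ndarray, face_b: np.ndarray, alpha: float) -> np.ndarray:
    """Blend two same-shaped uint8 face images.

    alpha = 0.0 returns face_a, alpha = 1.0 returns face_b;
    stopping midway (~0.5) gives a composite of both faces.
    (Real morphing would first warp both images so facial
    landmarks align; this sketch assumes pre-aligned inputs.)
    """
    assert face_a.shape == face_b.shape, "inputs must be pre-aligned and same size"
    blended = (1.0 - alpha) * face_a.astype(np.float32) + alpha * face_b.astype(np.float32)
    return np.clip(blended, 0, 255).astype(np.uint8)

# Toy "images": 2x2 grayscale arrays standing in for aligned photos.
a = np.full((2, 2), 100, dtype=np.uint8)
b = np.full((2, 2), 200, dtype=np.uint8)
halfway = cross_dissolve(a, b, 0.5)  # every pixel is 150, midway between the two
```

A naive cross-dissolve of unaligned photos produces obvious ghosting, which is precisely why practical morphing adds the landmark-warping step before blending.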
In fact, it is very difficult to stay calm in an emergency. And fear is not the only weakness fraudsters exploit; there is also human greed.
How do fraudsters take advantage of human greed? We have probably all heard, or even seen, the message claiming that someone has millions of dollars of inherited assets locked in a bank account, that it takes $500 to unlock it, and that if you lend him the money you will be rewarded with $1 million once it is done. We have long been familiar with this form of fraud, but fraudsters keep refining their deceptions. The core, however, is the same: they claim that a small investment will bring you a huge return.
These scams all prey on greed, and once greed is aroused, they are hard to resist. Yet guarding against them is also simple: never believe that a pie will fall from the sky, whether it is a huge cash windfall or an investment project that promises to make you rich overnight.