Artificial intelligence (“AI”) refers to the technological ability of a machine or software to “think.” You may not realize it, but you likely use it almost every day. Did you search for something on Google? Has Netflix recommended something you might like? Did you ask Siri what the weather would be today?
While AI can be very useful, it can also be misused. As with many new technologies, scammers are using AI in sophisticated ways to trick people, and these scams can cost victims thousands of dollars.
Newer Artificial Intelligence Scams
The newest artificial intelligence scams involve so-called “deepfake” images, videos, and audio. In a deepfake, media are digitally manipulated to replace one person’s likeness or voice with another’s, and the results can appear very real.
For example, just this past summer the FBI warned about the use of deepfake images and videos in sextortion schemes. In these schemes, sexually explicit images or videos are edited to make it appear that the victim is depicted in them. Scammers then contact victims and blackmail them, threatening to publish the edited material unless they are paid.
Similarly, AI can be used to clone a person’s voice. In these voice imposter scams, a caller who sounds like a loved one claims to be in jail or to have been kidnapped, and the victim is asked to pay bail or a ransom to get the family member released. In early 2023, the FTC released a warning about these types of scams.
What to Do When You Suspect an Artificial Intelligence Scam
A few tips can make a difference in keeping you from being the victim of an artificial intelligence scam:
- If you’re being threatened or extorted in any way, call the police.
- Be skeptical of any call, email, or letter asking you for money. Warning signs include demands for payment by unusual methods (such as gift cards or PayPal) or within a very short deadline.
- Double-check email addresses and URLs. The text you see is not always the link you’ll actually visit or the email address you’ll actually be contacting.
- Don’t trust your phone’s caller ID. Scammers can spoof their phone numbers to appear as someone in your contacts or as a legitimate business. Hang up, and then call the person or business yourself at a number you know is genuine.
- Make your social media accounts visible only to people you actually know. Scammers can pull your image and voice from anything you make available to the public.
- Secure your accounts with strong passwords and two-factor authentication, and keep your software up to date.
Proposed Legislation in NY on Artificial Intelligence
Because of the increase in these scams, law enforcement and legislators are taking notice. Joshua Lafazan, a legislator from Nassau County, recently proposed the “Artificial Intelligence Privacy Act” to crack down on such fraud.
Under the proposed law, the unauthorized cloning of someone’s voice or image would be criminalized. The law would also affirm a right to privacy from AI tools and would provide resources for police training and collaboration with federal authorities. If the legislation passes, violating a person’s privacy rights through AI could carry penalties of up to $1,000 and possible jail time.
- Michael Kan, “FBI: Scammers Using Public Photos, Videos for Deepfake Extortion Schemes,” PC Magazine (June 5, 2023). Available at: https://www.pcmag.com/news/fbi-scammers-using-public-photos-videos-for-deepfake-extortion-schemes (last accessed Sept. 25, 2023).
- Joe Hernandez, “That panicky call from a relative? It could be a thief using a voice clone, FTC warns,” NPR (Mar. 22, 2023). Available at: https://www.npr.org/2023/03/22/1165448073/voice-clones-ai-scams-ftc (last accessed Sept. 25, 2023).
- Tara Rosenblum, “Turn to Tara investigation prompts proposed law to combat AI voice cloning scams,” News12.com (Sept. 21, 2023). Available at: https://westchester.news12.com/turn-to-tara-investigation-prompts-proposed-law-to-combat-ai-voice-cloning-scams (last accessed Sept. 25, 2023).
Image by rawpixel.com