- Artificial intelligence (AI) has a host of uses, from coding to creating visuals.
- AI technology is now also being used by people to scam others.
- Here are some ways scamsters are using AI and how you can protect yourself.
According to reports, AI has been misused to scam people in various ways. Scammers are now using it to imitate individuals, from creating deepfake videos to cloning a person’s voice in order to dupe their friends and family members.
AI deepfake
In July 2023, a 73-year-old retired government employee from Kozhikode, Kerala, was scammed out of ₹40,000 through an AI deepfake of his friend. He received a call from someone posing as a former office colleague, asking for money for an emergency surgery. The scammer not only cloned the colleague’s voice but also made a video call, using AI to impersonate his likeness, which convinced the victim to transfer the money.
In a similar incident, a Chinese individual lost over ₹5 crore in an AI deepfake scam after sending money to a person pretending to be his friend.
How to protect yourself
Some of the things you can do to protect yourself from AI deepfake scams are –
- Do not send money to someone who claims to be your friend or family member if they call from a new number. Verify whether they are genuine or not.
- Avoid publicly sharing your images or videos, which can be used to create deepfakes of you.
AI voice cloning
Scamsters are also using AI to clone a person’s voice and dupe their friends and family by impersonating them. Hearing a familiar voice convinces victims that the caller is genuine, leading them to transfer money to the scammer.
According to a report by McAfee, 47% of the Indians surveyed said that they had either been a victim of AI voice cloning or knew someone who had been scammed using the technology.
How to protect yourself
Some of the things you can do to protect yourself from AI voice cloning scams are –
- Do not blindly believe someone if they call from an unknown number claiming to be your friend or family member.
- Use questions and past memories to verify if the caller is genuine, even if the voice matches the person.
- Limit your social media posts to friends and family to prevent others from gaining access to recordings of your voice.
Extortion
Scamsters are also resorting to extortion. According to an advisory from the Federal Bureau of Investigation (FBI), criminals are using AI to create sexually explicit images of victims. The scammer then uses these doctored images to threaten the victim into paying, and shares the images on social media if the victim does not comply.
How to protect yourself
To protect yourself from such extortion scams, you can follow these tips –
- Avoid posting your or your loved ones’ pictures and videos publicly.
- Limit your social media accounts to your friends and family.
- Do not share your images or videos with strangers.
- Parents should monitor where their children post their images and videos.