
Recently, there has been an increase in cases where people's voices are recorded under the pretext of social surveys and then used to create deepfake videos with artificial intelligence, Tatyana Deshkina, Deputy Product Director at VisionLabs, told RIA Novosti.
According to her, in such fraudulent schemes, attackers introduce themselves as employees of sociological research centers or survey companies and try to keep the person talking for as long as possible. The goal is to record the voice in various intonations and without background noise for at least 20 seconds.
"The longer the recording, the more realistic and accurate the cloned voice can be. This allows scammers to create fake voice messages or videos that appear as if a real person is involved," Deshkina explained.
Convincing Fraud Through Fake Videos
The most dangerous aspect is the combination of voice clones with fake video footage. Scammers may create fake profiles on Telegram, WhatsApp, or other messaging and social platforms and send video or audio messages on behalf of someone you know.
Even tech-savvy users can fall for such tricks, which could lead to the loss of personal or financial information.
How to Protect Yourself from Scammers:
- Be cautious if someone calls you from an unknown number and tries to engage you in a long conversation.
- If you feel pressured to give suspicious or unnecessary information, end the call immediately.
- Do not respond to voice or video messages with unusual or unexpected requests unless you are sure they are genuine.
Technology continues to evolve, and so do the tactics of scammers. Always remain alert when communicating.
Stay vigilant—don’t let your voice become a scammer’s weapon!