Banks warn customers against deepfake AI scams

Hyderabad: Banks have issued alerts warning customers against deepfake AI scams and other forms of cyber fraud, urging them to remain vigilant while making financial transactions.

In advisory messages sent to customers, banks cautioned them not to transfer money based on urgent video or voice calls. They said cyber scammers were increasingly using deepfake AI technology to impersonate known persons and create panic.

“Don’t send money on urgent video or voice calls. Be wary of AI scams. Be alert and bank safely. Also, report cyber frauds at Cybercrime.gov.in or call 1930,” the message stated.

Despite repeated warnings from authorities, incidents of cyber fraud continue to rise, according to media reports. Many victims have been duped after responding to fake distress calls or manipulated audio and video clips.

Rising concern over deepfake AI scams in banking fraud

Banks said customers must independently verify any request before transferring money. They advised people to contact the person concerned directly through trusted phone numbers rather than the number from which the call was received.

Officials also stressed the importance of reporting fraud immediately. Prompt complaints through Cybercrime.gov.in or the helpline 1930 can help authorities freeze fraudulent transactions before the money is moved.

Banks reiterated that awareness remains the strongest defence against AI scams. They urged customers to stay alert and safeguard their savings.