A “grandmother” bot has appeared to talk to phone scammers


Despite all the downsides of artificial intelligence (such as encouraging people to eat deadly mushrooms), it can sometimes be used for good. O2, the largest mobile operator in the UK, has deployed an AI-powered voice chatbot to draw phone fraudsters into confusing, fruitless conversations. The chatbot, called Daisy, or “dAIsy,” mimics the voice of an elderly person, the demographic most commonly targeted by phone scammers.

Daisy’s goal is to automate “scambaiting,” the practice of deliberately wasting phone scammers’ time to keep them away from real potential victims for as long as possible. Scammers use social engineering to exploit the naiveté of the elderly, convincing them, for example, that they owe taxes and will be arrested if they do not immediately transfer funds.

When a scammer calls Daisy, however, they are in for a long conversation that ultimately leads nowhere. If the call reaches the point where the fraudster asks for personal information, such as bank details, Daisy makes up fake data. O2 says it gets fraudsters to call Daisy in the first place by adding her phone number to the lists of “easy targets” that scammers use to find potential victims.

In a video demonstrating Daisy, excerpts from real conversations show fraudsters becoming increasingly irritated, staying on the line for up to 40 minutes in the hope of getting a credit card number or bank details. The AI model O2 has created sounds very convincing and does all its processing in real time; fortunately, the task is made easier by the fact that older people tend to speak quite slowly.

Of course, the problem with chatbots like Daisy is that the same technology can be used for the opposite purpose: we have already seen cases where the voices of real people, such as CEOs of large companies, were faked to trick others into sending money to scammers. Older people are already quite vulnerable; if they receive a call from someone claiming to be their grandchild, they will almost certainly believe it is a real voice.

Ultimately, the ideal solution would be to block fraudulent calls and shut down the organizations that run these scams. Operators have gotten better at identifying fraudsters and blocking their numbers, but it remains a cat-and-mouse game. Fraudsters use automated dialing tools that dial numbers in rapid succession and notify them only when someone answers. An AI bot that frustrates fraudsters by answering and wasting their time is better than nothing.
