CSETv1_Annotator-1 Taxonomy Classifications
Taxonomy Details
Incident Number
492
Notes (special interest intangible harm)
4.5 - Scammers may target elderly people because they are perceived to be more vulnerable to voice cloning scams.
Special Interest Intangible Harm
no
Date of Incident Year
2023
Date of Incident Month
01
Date of Incident Day
10
Incident Reports
Reports Timeline
- View the original report at its source
- View the report at the Internet Archive
#chatgpt #ai #scam #scammers #voiceai #protect #prevention #awareness #education ai and scammers.
- View the original report at its source
- View the report at the Internet Archive
The man calling Ruth Card sounded just like her grandson Brandon. So when he said he was in jail, with no wallet or cellphone, and needed cash for bail, Card scrambled to do whatever she could to help.
"It was definitely this feeling of …
- View the original report at its source
- View the report at the Internet Archive
You may very well get a call in the near future from a relative in dire need of help, asking you to send them money quickly. And you might be convinced it’s them because, well, you know their voice.
Artificial intelligence changes that. Ne…
- View the original report at its source
- View the report at the Internet Archive
A couple in Canada were reportedly scammed out of $21,000 after they received a call from someone claiming to be a lawyer who said their son was in jail for killing a diplomat in a car accident.
Benjamin Perkin told The Washington Post the …
- View the original report at its source
- View the report at the Internet Archive
- AI voice-generating software is allowing scammers to mimic the voice of loved ones.
- These impersonations have led to people being scammed out of $11 million over the phone in 2022.
- The elderly make up a majority of those who are targeted. …
- View the original report at its source
- View the report at the Internet Archive
The parents of Benjamin Perkin (Canada) received a call from what sounded like their son, which was actually an AI-generated fake, saying he was being held in custody and urgently needed $15,000.
The nightmare for the family of Perkin, 39, began when a man claiming to be a lawyer ca…
- View the original report at its source
- View the report at the Internet Archive
Bail Out
Ruthless scammers are always looking for the next big con, and they might've found it: using AI to imitate your loved ones over the phone.
When 73-year-old Ruth Card heard what she thought was the voice of her grandson Brandon on…
Variants
Similar Incidents