What to know about A.I. scams as they grow on Long Island, nationwide
Increasingly complex artificial intelligence scams have grown significantly nationwide this year, preying on the anxieties of vulnerable seniors on Long Island and throughout the country, while leaving a path of financial and emotional devastation in their wake, industry experts and law enforcement officials said.
Through the first three quarters of the year, Americans lost nearly $2 billion in so-called “impostor scams,” where a criminal uses A.I. technology to pretend to be a trusted individual — such as a family member or government official, credit card or bank representative — to convince their mark to send them money, according to Federal Trade Commission data.
By year’s end, that figure could top the record $2.7 billion lost in 2022 to impostor scams, which federal officials say now have become among the most common — and lucrative — fraud methods nationwide.
Seniors targeted on Long Island
Peter Nicoletti, global chief information security officer for California-based Check Point Software Technologies, said A.I. technology is evolving rapidly, with scammers utilizing genuine voice recordings they find online, often through social media, to “spoof” or duplicate an individual's voice.
More than half of all Americans, he said, have voice samples online, providing scammers with the main ingredient needed to create a digitally generated copy of their voice, known as a “deepfake,” that can be used to victimize loved ones.
WHAT TO KNOW
Long Islanders should be wary of callers alleging:
To be part of a familiar, legitimate organization.
There is a problem or a prize.
A family member is injured or in jail.
You need to act immediately or your safety is threatened.
Payment must be made in a specific form, such as cash, Venmo or a gift card.
Source: Suffolk County Police Department
How to protect yourself against A.I. scams:
- Have a safe word only you and your loved ones know; not something easily discovered online.
- Don't share too much information with the caller, and avoid words such as “yes” that can be used to unlock bank accounts.
- If the individual claims to be from an agency or financial institution, ask for a name and callback number, then hang up.
- If the individual claims to be a loved one, call that person back on your own and verify the story.
- Have a conversation with elderly family members about these scams.
Source: Check Point Software Technologies
“There’s a lot of tools that are on the dark web that are being sold by criminals to criminals for voice cloning,” Nicoletti said. “They don’t have any guardrails. They don't have any watermarks. They’re designed specifically for voice cloning and creating threats using artificial intelligence.”
In addition to robocalls, texts also can be spoofed to mask the originating number and make it appear that they are coming from a number you’re more likely to trust. Some typical examples include phishing scams designed to obtain sensitive information and those suggesting the recipient has won a prize.
In many cases, including two recent incidents in Suffolk County, scammers use cloned voices to convince an unsuspecting family member to send them money.
Last month, an 85-year-old Hauppauge man received a call from an individual claiming to be his granddaughter, who said she was arrested and needed $19,000 for bail, Suffolk police said. The grandfather met an individual two separate times at his home and turned over the requested cash, authorities said.
In another recent con, a 70-year-old Smithtown woman spoke to an individual purporting to be an Amazon representative who said someone was attempting to use her bank account to purchase electronics, police said.
The scammer told the woman that since her bank account was compromised, she should turn over her cash to keep it safe. The victim then made three separate withdrawals, totaling more than $100,000, and the cash was picked up by the same man on three different dates, police said.
And in Nassau last spring, an elderly Great Neck woman was scammed out of $25,000 by a woman who claimed to be an employee of Chase's fraud department and said the cash was needed to reverse a fraudulent charge, Nassau police said.
All three incidents remain under investigation.
“We are reminding residents to resist the urge to act immediately,” now-former Suffolk Police Commissioner Rodney K. Harrison said in a statement before he left the position last week. “Before sending money, speak to a trusted family member or call police. No legitimate company makes threats or demands cash.”
Richard McGee, 68, of Garden City, was one of the lucky ones who did not fall for an A.I. phone scam.
On two occasions in recent months, McGee said he received calls from an individual purporting to be his grandson and claiming he was in prison in Mexico and needed cash.
Just one problem: McGee's two grandchildren are ages 5 years and 8 months. After toying with the caller, whose voice McGee believes was A.I.-generated, he hung up.
“People can be unsuspecting,” McGee said. “They want to do the right thing. They want to help, especially a grandchild or a child in need. So somebody can be easily scammed.”
'Professional con men'
Even family members of top law enforcement officials are not immune.
Suffolk District Attorney Raymond Tierney said his father-in-law fell victim to a scam in which a caller pretended to be his granddaughter, saying she needed cash to get her wrecked car out of an impound lot. Tierney told Newsday in July that his father-in-law lost $3,000 before realizing his granddaughter had never been in an accident.
“There’s nothing shameful about this,” Tierney told Newsday at the time. “These guys are professional con men. They’re very good at what they do.”
Seth Boffeli, senior adviser for AARP’s national fraud prevention program, said while 40% of fraud reported to the FTC comes from individuals under the age of 30, older Americans are typically duped for larger sums.
“And criminals know that,” Boffeli said. “So seniors are always going to be a major target for these criminals. They're also more likely to be home, more likely to answer the phone and they're probably also more likely to be [trusting]. So all of these factors add up to seniors in all likelihood getting more phone calls, than, say, younger demographics.”
The hallmark of an impostor scam, experts said, is creating an emergency situation requiring immediate action that is designed to raise the victim's anxiety level.
Individuals who may be concerned about the legitimacy of the caller on the other end, Nicoletti said, should follow some simple steps to protect themselves.
They include asking the caller for a callback number, avoiding extensive conversation, and avoiding words such as “yes.” Then, immediately contact the family member or official who supposedly had reached out for help.
“It's hacked information and they're combining it with social media information and creating these hyper-targeted phishing and hyper-targeted phone calls,” said Nicoletti, who advises families to establish a safe word to be used during emergency situations. “And the problem is, it's at virtually no cost to create these robocalls. It's fractions of a penny. And very rarely are these guys caught.”
Lawmakers and regulators respond
In November, Democratic New York Sen. Kirsten Gillibrand and colleagues reintroduced legislation establishing new penalties for A.I.-powered scams and robocallers, including prison time.
The Deter Obnoxious, Nefarious, and Outrageous Telephone Calls (DO NOT Call) Act would make repeated violations of the Telephone Consumer Protection Act, which regulates telemarketing calls, punishable by up to three years in prison and a fine of up to $20,000.
“These calls are more than just an annoying nuisance,” Gillibrand said during a virtual news conference this month. “They are a massive drain on our economy. They hurt real people.”
While public reporting indicates an increasing number of families are being targeted by voice clones, the precise number of Americans duped by generative A.I. remains unknown, said Gillibrand, who wrote to the FTC seeking additional information on its efforts to track the scams.
The FTC, which did not respond to requests for comment, is not the only federal agency looking to crack down on A.I.-generated scam calls.
Last month, the Federal Communications Commission voted unanimously to assess what tools are available to respond to the emerging A.I. threat and how to define artificial intelligence in the context of already regulated robocalls and robotexts. The agency also plans to examine whether A.I. fraud can be combated with artificial intelligence itself.
“A.I. technologies can bring new challenges and opportunities,” FCC Commissioner Anna Gomez said during the agency's Nov. 15 meeting. “Responsible and ethical implementation of A.I. technologies is crucial to strike a balance, ensuring that the benefits of A.I. are harnessed to protect consumers from harm rather than amplifying the risks in an increasingly digital landscape.”