AI is getting ever more adept at mimicking voices and distorting the truth. Credit: Getty Images/fStop/Malte Mueller

Did you see the image?

It coursed through social media in the wake of Hurricane Helene, and showed a frightened little girl in a motorboat fleeing the storm while clutching a puppy. It was a powerful visual. And it was entirely fake, generated by artificial intelligence.

That didn't stop people from posting and reposting it, even after they were made aware it was fake. As one woman from Georgia who had posted the photo later wrote on X: "Y’all, I don’t know where this photo came from and honestly, it doesn’t matter. It is seared into my mind forever. There are people going through much worse than what is shown in this pic. So I’m leaving it because it is emblematic of the trauma and pain people are living through right now."

It didn't matter to her whether the image was real, in other words, only that it felt real. Which brings into focus what a strange and scary — and, sadly, predictable — place we've arrived at, where sometimes all that matters is whether something seems to be true, whether it fits one's expectation of what is true.

It's a new twist on an old phenomenon, confirmation bias, a term that describes the tendency humans have always had to interpret something new as confirmation of something they already believe. Nowadays, it comes most prominently in political packaging as our warring sides, not content with the ammunition at hand, invent new bombs to lob at each other.

An ad released this fall by Indiana Sen. Mike Braun pictured a rally for his Democratic gubernatorial opponent, Jennifer McCormick, where supporters arrayed behind her were seen holding signs with pictures of gas stoves inside a red "no" symbol. In a state with no shortage of libertarians, a message about government coming for your gas stoves hits home. Except that it was fabricated. The campaign signs were digitally altered.

The ad confirmed concerns about artificial intelligence deepfakes. It turns out AI is becoming adept at mimicking voices even more quickly.

Numerous audio impersonations are circulating of the major presidential contenders. In one, former President Donald Trump appears to insult the intelligence of some of his most loyal followers. "They believe anything on Fox News," the fake voice says. "I could lie, and they'd still eat it up."

Others make it seem like Vice President Kamala Harris is celebrating President Joe Biden's decision to drop out of the race. Another mimics her saying: "I was selected because I am the ultimate diversity hire: I'm both a woman and a person of color. So if you criticize anything I say, you're both sexist and racist."

Manipulating truth and trying to alter reality are not new, of course. Charlatans and con men are as old as humanity. Ditto lies and conspiracy theories, though spouting them can seem shame-free these days.

What's different and troubling is the change in those being lied to: They seem to want it. Despite all the fact-checking and attempts to educate people about the dangers of online disinformation, some folks want it — when it feels right to them. Like when they see a little girl in a flood with a puppy, or hear a politician they don't like saying something that verifies their aversion.

In this craven new world, the truth is not a course correction. It's an obstacle to be hurdled on the way to accepting what's not real. We saw that in the aftermath of Helene, when people already prone to distrust government wanted to believe posts like "FEMA is actively hindering relief efforts," which had 16.5 million views at last count — people like the armed man arrested in North Carolina after making threats against FEMA staff.

It's always been difficult to counter mistruth, no matter who creates it. Adding the alchemic ability of AI to present fiction as fact only makes the war that much harder to win.

Columnist Michael Dobie's opinions are his own.
