Animating with AI to keep up with the lies
I arrived at Stanford University with a smidge of guilt, wondering if I was retreating into the ivory tower at a crucial time of political upheaval.
After years of creating satirical political animation and social impact videos — while getting buffeted by chaos in the journalism business and unpredictable social media algorithms — the time was right for a reset.
Guilt, it turns out, was unnecessary, as universities became one of the main targets in the new Trump administration.
It’s not like they didn’t warn us. (See: JD Vance’s exhortation to “attack the universities.”)
Once in office, they attacked funding, attacked academics and attacked students.
I sure picked an interesting time to come to Stanford as a 2025 John S. Knight Journalism Fellow — with the goal of figuring out how to preserve and amplify the positive impacts satire can have on society.
If anything, though, the Trump administration’s assault on universities, government and society as a whole has reinforced my passion for truth-telling through satire, cartoons and animation.
Many of these attacks come in the form of outright disinformation spread by the administration — be it about elections, climate change, immigration or a host of other issues.
For that reason, I was drawn to misinformation research, arriving on campus not long after the dissolution of the Stanford Internet Observatory, which studied the spread of false information during elections.
The Internet Observatory was gutted after Republican congressmen launched kangaroo court “investigations,” with some right-wing figures even falsely accusing the organization’s research director Renee DiResta of being a CIA agent.
When Donald Trump took office again, he upped the ante with an executive order declaring that combating “misinformation” and “disinformation” was nothing more than government censorship.
Flipping reality on its head, Trump called out those terms as “the favorite words of censors.”
Now, apparently, even saying those words was verboten.
Academics and conference organizers knew that using those words in the current environment would draw negative attention to their work. After all, who wants to be hauled before Rep. Jim Jordan because they study disinformation and its impacts?
But just because we aren’t supposed to say the words doesn’t mean misinformation, disinformation and lies don’t exist.
It turns out there is loads of peer-reviewed science on misinformation and the best ways to combat it.
According to psychologists, one of the most effective ways is through inoculation. Think of lies and misinformation like a viral infection — one that we can inoculate against much like we inoculate against an infectious disease.
But it’s important to inoculate quickly, before the virus/lie has had time to infect more of the population.
(Remember the adage, “a lie can travel halfway around the world while the truth is still putting on its shoes.”)
I suddenly realized that the work I have done for years — simplifying complex information in an accessible way with cartoons and animation — was backed by science!
There are studies that show the effectiveness of cartoons, animation and games in inoculating against misinformation.
Not only that, but my initial explorations into how AI image generation could speed up my animation production process could become an integral part of fighting lies and misinformation.
Which is why I have also been working with a brilliant Stanford researcher to develop an ethical AI image generator that is trained on my cartoons and animation.
We’re creating this custom AI technology as a powerful tool to speed up animation production and help create work that wouldn’t otherwise be possible. (A tool designed to help humans, not to replace humans.)
Because the faster we can spread the truth, the faster we can combat the lies.
It wasn’t long after I learned the concept of inoculating against misinformation that I had a chance to put the psychological framework into action, thanks to a particularly egregious case of Trumpian disinformation.
Donald Trump has spread disinformation about California wildfires and water policies before, but in the early days of his second term as president, he tried to back up his lies with action.
Trump ordered the U.S. Army Corps of Engineers to release water from two reservoirs in California as the Pacific Palisades wildfires were burning, then proceeded to claim “victory” and say that had we listened to him, “there would have been no fire.”
Scientists and water managers of all kinds pointed out that this was completely bogus.
What the President really did was release dangerous flows of water and reduce the amount of water available to California farmers for the coming summer months.
The above cartoon represents the first time I incorporated the psychology of inoculation into my work, with this basic framework:
- introduce the actual facts
- follow with a small dose of the lie
- explain why the lie is demonstrably false
- restate the facts
This animated short also contains my first use of AI image generation trained on my cartoons and animation: specifically, the background image in the drought scene.
But it’s not enough to counter lies with truth and speed up production with AI — particularly in a time when so many people are not bound by facts.
The third leg of the stool happens to be the superpower of satire and cartoons: Storytelling.
How you tell the story and what the narrative conveys plays a huge role in countering disinformation and lies.
The scholar Walter Fisher theorized that humans are “storytelling animals,” driven more by narratives than by facts alone.
Yes, you counter lies with facts, but you also counter deceptive narratives with excellent storytelling that speaks the truth.
For me, that truth-telling happens with satire, humor and animation.
And thanks to my time at Stanford, I now know that what I used to chalk up to a cartoonist’s instinct is backed by actual psychological research.