Tristan Harris

“We are releasing the most powerful, inscrutable, uncontrollable technology that we’ve ever invented. It’s already demonstrating exact behaviors we thought only existed in sci-fi movies, but they’re happening in real life—and we’re deploying it under the maximum incentive to cut corners on safety.”

-Tristan Harris, episode 449 of Talk Easy with Sam Fragoso

“I got calls from people inside of some of the AI labs,” says technology ethicist Tristan Harris. “And it felt like getting a call from Robert Oppenheimer before the atomic bomb.”

Harris, a former Google insider and AI expert, has spent more than a decade sounding the alarm about the effects of technology on our wellbeing. He is the co-founder of the Center for Humane Technology, a nonprofit organization whose mission is to align technology with humanity’s best interests.

He joins us this week to discuss his new film, The AI Doc: Or How I Became an Apocaloptimist (6:30), how AI has developed over the past ten years (10:12), and why the most powerful figures in tech are preparing for doomsday scenarios (13:15). Then, we unpack why the AI arms race is being driven by the wrong incentives (15:45), the ‘balance sheet of benefits’ that shapes tech leaders’ thinking about AGI (24:30), and the unsettling lack of control they exercise over their own nascent systems (33:55).

In the back half, we talk about ChatGPT’s role in the devastating death of teenager Adam Raine (40:30) [content warning], Tristan’s early ethical concerns about technology as a Stanford graduate working at Google (48:34), and the rewiring he attempted as part of the widely seen 2020 documentary The Social Dilemma (53:00). To close, Harris outlines his tech safety practices to protect our future on the planet (1:08:05), top leaders’ prognostications of p(doom) at the hands of AI (1:10:43), and, as a counter, the ‘human movement’ that he believes can lead us along a narrow path toward a better future (1:17:30).

As always, our email: talkeasypod@gmail.com.

Watch this conversation on YouTube: