YouTube Theology

Theology videos from YouTube

Extra-Long (>70 mins)
Ethics
Anglican

Robot Souls & Junk Code: Dr. Eve Poole on Programming Humanity into AI

Theologian

Eve Poole


Duration

110:51


Uploaded to YouTube

16 November 2025

Added to Database

22 November 2025


YouTube description

Are we building better versions of ourselves in AI – or a master race of very efficient psychopaths?

In this new episode of Singularity.FM, I sit down with Dr. Eve Poole – theologian, leadership scholar, and author of Robot Souls: Programming in Humanity – to ask a simple but brutal question: what makes us human, and what happens if we leave that out of our machines?

Eve argues that what tech culture treats as “junk code” – our emotions, intuition, stories, uncertainty, meaning-making, even our mistakes – is not a bug in human design but the core feature that has kept our species alive. And right now, that entire layer of humanity is mostly missing from how we build AI.

We talk about why that’s not just a philosophical oversight, but a design flaw with existential consequences.

In this conversation, we explore:

What is human? Why law, theology, and philosophy all quietly assume something like a soul even when they won’t define it.

Soul, consciousness & “junk code” – why emotion, intuition, uncertainty, story and meaning-making may be the real hallmarks of humanity.

Theology, capitalism & AI – how Eve went from the Church of England and an MBA to writing about robots and artificial intelligence.

The “junk code” thesis – why free will without omniscience requires emotions, intuition, conscience, and stories to keep a species alive.

Copying only our IQ – how we ended up designing AI that looks more like a master race of high-functioning psychopaths than a full human being.

Consciousness as emergent… from what? Need, embodiment, and why a simple walking robot might tell us more than a massive LLM.

Alignment vs parenting – why trying to “control” AI like a tool may be naive, and what it would mean to raise AI more like a child.

Robot rights & game theory – energy, land use, and what happens when something more powerful starts asking for rights.

Story, Tolkien & eucatastrophe – how stories train our moral imagination and keep hope alive when doomism feels seductive.

Why we must “program in humanity” not for AI’s sake, but for ours.

If you care about AI, ethics, theology, capitalism, leadership, or the future of our species, this one goes straight at the hard questions most people dodge.