Is a truly “sentient” AI even possible? How could code develop the capacity for feelings, experiences, or intentionality? Even if our best algorithms could one day perfectly mirror human behavior, would they be conscious? How one answers such questions depends on one’s anthropology. What are people? Are we merely “computers made of flesh”? Or is there something more to us than the sum of our parts, a true ghost in the machine? A true ghost in the shell?