Caring machines may be the only way to scale empathy across our species.
We can think of empathy as another form of perception. Two kinds of empathy interact to give us a sense of the experience of others. Cognitive empathy is a top-down process: we think about what’s happening and make inferences about another person’s internal state. Affective empathy is a bottom-up process: feelings from our own bodies and senses allow us to share in the experiences of others. We literally feel what someone else feels, vicariously experiencing their pain (or joy). Feelings are about something: they are mental projections of the state of the body, and they give us an embodied sense of personal vulnerability. Feelings aren’t processed in a value-neutral way. How you feel, and what you do about it, matters. Vicarious feelings motivate us to act prosocially, but they can also be unbearable.
Some prominent neuroscientists, including Antonio Damasio and Jonas Kaplan, argue that if AI has no machine version of feeling, no sense of vulnerability, it will be unable to align itself with humans. A machine whose artificial empathy includes a sense of personal vulnerability may be more adaptive because it is imbued with goals related to homeostasis, the process of keeping the body in equilibrium. In other words, a machine that has a visceral sense of how humans feel will make better decisions about humans. Such an AI would need a real or simulated body that provides homeostatic signals and is vulnerable to its environment. And it would need to recognize which of its internal signals actually originated in another being.
How could this be done? First, it may be necessary to build a body, or something like it. An AI would need a sense of embodiment so that it can be trained to want to preserve its own integrity, and it would need to learn what the ups and downs of embodiment feel like over time. Once an AI has a sense of its own valence and homeostasis, it can be trained to interpret the conditions of others. This forms the beginning of empathic contagion: understanding what others feel because it feels something itself. Only then would the AI be ready to be trained to maximize both its own integrity and the integrity of others, the stage that forms the basis of prosociality. Bottom-up, affective AI avoids the thorny problem of explicitly encoding human values, an approach that can produce overly drastic outcomes and sociopathic robots. A minimal sketch of this staged reward structure follows.
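To make the staged picture concrete, here is a minimal sketch of homeostatic reward shaping, assuming a simple drive-reduction setup of the kind studied in homeostatic reinforcement learning. It is not Man and Damasio's implementation; the internal variables, the set point, and the `care` weight are all illustrative assumptions.

```python
# Minimal sketch of homeostatic reward shaping (illustrative, not the
# authors' implementation). The agent is rewarded for moving its internal
# variables toward a set point; a "care" term adds vicarious reward for
# improvements in another agent's homeostasis.

import numpy as np

SETPOINT = np.array([0.7, 0.7])  # desired levels of, say, energy and integrity

def drive(internal_state: np.ndarray) -> float:
    """Homeostatic drive: distance from the set point (lower is better)."""
    return float(np.linalg.norm(internal_state - SETPOINT))

def homeostatic_reward(state: np.ndarray, next_state: np.ndarray) -> float:
    """Stages one and two: reward equals the reduction in one's own drive."""
    return drive(state) - drive(next_state)

def empathic_reward(self_state, self_next, other_state, other_next,
                    care: float = 0.5) -> float:
    """Stage three (prosociality): weight the other's homeostasis too."""
    own = homeostatic_reward(self_state, self_next)
    vicarious = homeostatic_reward(other_state, other_next)
    return own + care * vicarious

# Toy transition: an action that restores the agent's own energy while
# also improving the other's earns extra, prosocial reward.
me, me_next = np.array([0.4, 0.7]), np.array([0.6, 0.7])
you, you_next = np.array([0.3, 0.7]), np.array([0.5, 0.7])
print(empathic_reward(me, me_next, you, you_next))  # ~0.3 = 0.2 own + 0.1 vicarious
```

The point of the sketch is the structure of the reward: the machine is never handed a list of human values. Prosociality falls out of weighting another agent's homeostasis alongside its own, which is the distinction the paragraph above draws.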
This could lead to machines that care about us in a visceral, emotionally driven way, creating the capacity for the fast transfer of empathy across a larger group of humans and machines. Empathy might then become an emergent phenomenon, with mixed communities of machines and humans cooperating in more flexible and compassionate ways.
Caring machines would bring new biases: people tend to be more empathetic towards people who are like them, and such machines could prioritize the well-being of an in-group over the larger community. But caring AI may also be able to counter one of the great human constraints: our limited cognitive capacity, which restricts our ability to solve very complex social problems. Machines could apply their talents and generate more “remote trajectories of continued existence in the world, over time, among greater numbers of individuals in interaction.” Caring machines may be the only way to scale empathy across our species.
Homeostatically Motivated Intelligence for Feeling Machines
Kingson Man and Antonio Damasio
Other things that caught our eye this week.
From Wired: Optimizing Machines Is Perilous. Consider ‘Creatively Adequate’ AI. Angus Fletcher and Erik Larson.
From Quanta: Researchers Build AI That Builds AI.
From Stanford Digital Economy Lab: The Turing Trap: The Promise & Peril of Human-Like Artificial Intelligence. Erik Brynjolfsson.
The Artificiality Weekend Briefing: About AI, Not Written by AI