Can culture evolve as fast as the technology itself?

How might AI machines affect human culture? And how will we adapt to new forms of cultural inequality?

Culture & AI

Take a moment to imagine this: you stand in the middle of Grand Central Station as throngs of people hurry past. You wonder, who are these people? Where have they been? Where are they going? What are their goals, beliefs, worries, and hopes? What idiosyncrasies do they have? What does their life look like to them?

Everyone has a rich inner life, one as complex as your own. This sudden realization has a name—sonder. It’s a meta-empathy, a deep sense of connectivity and apartness coexisting in a single emotion. Sonder makes us loyal to humanity.

As our brains evolved, specialized areas developed through selection, practice, and learning: intuitive physics, cause and effect, trial and error. With a community of experts, we combine our computational capacity, overcoming the limitations of individual cognition. Feeling "sonder" energizes us as we realize others have the same capacity to feel. Our need to belong to a group exists in tension with our individualistic desires. Survival depends on being part of the group. The rituals of art, ceremony, and story evolved to serve our community of shared thinking. This is what we experience as culture.

Machines have different skills than humans do. They can complement us by thinking in high-dimensional spaces and at speeds we cannot match. They are powerful tools, but their power is not evenly distributed across people, communities, and cultures.

Machines create a distinct cultural power. The new “others” are a specific kind of cyborg. They are humans who have preferential access to machines that use the data of human culture and behavior to build worlds. These others build new cultures because of their relationship with machines. They are the builders and owners of this technology—the ones with the money to fund computational infrastructure and those with the resources to build powerful AI.

AI has already had a profound impact on the professional status of those who design it. In companies that build artificial intelligence, it’s common for the people who build the nervous system, guts, and brains of the intelligence to have higher status and more influence than those who build the hearts and minds of the technology.

Within society, industry, and organizations, technologists gain status and decision power at the expense of diversity. Problems that resist mathematization are seen as less tractable, which changes how resources are directed toward solving them. Problems that machines cannot address become externalities for non-cyborgs, such as childcare workers, to absorb.

When the AI intelligentsia applies datafication to human culture, they gain unique power. Machines can create a sense of “unbelonging,” shifting power to those fluent in AI. Ethicists have alerted us to biased AI, driving public awareness and new standards for responsible AI. 

AI makes visible what was once invisible, leading to cultural change. Ethicists play a crucial role in moving decision boundaries toward fairer outcomes, because models embed technologists' choices about what to optimize, and those choices often touch societal values that society might want to leave open. Consider traditional bank lending: a manager could weigh informal factors like community relationships or unique circumstances when evaluating loans. AI systems must use explicit, quantified rules. While this increases consistency, it eliminates flexibility for valid but non-standard cases, such as a small business owner with strong local relationships but unconventional income patterns. This illustrates how AI forces precise definitions onto decisions that society might prefer to keep contextual and flexible.

A recent meta-analysis found that human evolution is shifting from gene-dominated to culture-dominated through the interplay of technology, environment, and group dynamics. The finding speaks to a long-running debate over whether "culture steers human evolution" or "genes hold culture on a leash."[1] AI is a cultural technology that produces different outcomes depending on its cultural context. Weirdly, it is possible that AI could change the course of human evolution sooner than we might think.

If you think culture can’t beat genetics, consider how the success of the Cesarean procedure has marginally relaxed genetic selection in humans, slightly increasing the likelihood that a daughter born by Cesarean will herself require one. AI could conceivably amplify this kind of gene–culture coevolution, creating adaptive advantages of cultural inheritance that drive the evolution of specific genetic traits, and opening new channels for cultural evolution to influence genetic evolution rather than the other way around. In human evolution, culture may come to matter more than code.

If AI can provide adaptive advantages that drive cultural inheritance, the use of machines may create a distinct advantage for the "coding elite" of software developers, tech CEOs, investors, and computer science and engineering professors. By controlling algorithms, the coding elite concentrates power and can even influence political outcomes. As algorithms become more integrated into society, institutions that were once out of reach of algorithmic systems become increasingly subject to them. Consequently, culture at the highest level is shaped by the people who shape the algorithms.

Culture plays a vital role in connecting individuals and communities, enabling us to leverage our unique talents, share knowledge, and solve problems together. However, the rise of an intelligentsia of machine soothsayers highlights the need to consciously design new ways for humans to connect and belong for the age of machines.

As algorithms reshape our cultural inheritance with unprecedented speed, the challenge is human. We will need to design AI in such a way as to preserve the foundational aspects of human connection that gave rise to culture itself, while steering its evolution in an age where code and culture have become inextricably linked.

How do our intuitions adapt when working alongside a system that doesn’t share them? What forms of empathy emerge—or erode—when AI mediates our interactions? Can we preserve spaces for judgment, creativity, and intuition that resist rigid automation? How will we, as society more broadly, judge "success"—by intent or by consequence? And as new forms of belonging and influence take shape, who gets to design their architecture, and who gets left out of the blueprint? These are the questions we’re asking, not to provide easy answers, but to better understand the shifting ground beneath us.


[1] Timothy Waring and Zachary Wood, “Long-term gene–culture coevolution and the human evolutionary transition,” Proc. R. Soc. B 288: 20210538, 2021. http://doi.org/10.1098/rspb.2021.0538
