Explore the shift from the attention economy to the intimacy economy, where AI personalizes learning experiences based on deeper human connections and trust.
Thanks to our partners GSV and Dash Media for their help and support in creating this report.
Listen to an NPR-style discussion of this article, created by Google NotebookLM.
The attention economy, where companies monetize users' time and focus on digital platforms, has become a familiar concept. AI algorithms curate personalized content feeds and power the recommendation systems, keeping users engaged based on their online behaviors and inferred preferences.
This model of personalization is limited by the data we consciously provide, yet also relies on inferences drawn from information we might not even realize we are sharing. At times, the “people like you” style inferences accurately predict preferences by drawing on patterns of behavior observed in similar users. But often these inferences reveal how little current technology truly understands us, as we expose only a fraction of ourselves online. That is all about to change.
Conversational, generative AI changes our relationship with technology. By enabling genuine interaction, it lets us converse, learn, and co-create. It can help us solve problems, even our most personal problems, if we tell it enough about ourselves.
This shift opens the door to what we call the Intimacy Economy. In this new paradigm, humans trade intimate information for enhanced functionality: the more context you give an AI system about yourself, the better it can interpret the intent of your interactions and help you in your current workflow. This exchange, and the deep understanding and trust it requires, is the foundation of the Intimacy Economy, and it hinges on evolving concepts of trust and privacy.
Intimacy will center around our needs, goals, motivations, and desires, both explicit and implicit. As AI evolves to perform tasks and exhibit traits once considered "uniquely human," such as understanding complex emotions, creating art, and making decisions, it will reshape our interaction with technology. For instance, imagine an AI assistant that schedules your day. It starts by considering your explicit inputs—your work meetings, deadlines, and personal appointments. But over time, it begins to recognize implicit patterns, like when you're most productive, how you feel after long meetings, or when you tend to skip tasks. Eventually, the AI suggests not only when to schedule meetings but also recommends taking a break before difficult tasks, or even rearranging your schedule based on your mood and energy levels. As it becomes more attuned to your needs, the AI transitions from simply being a tool to feeling like a deeply personalized assistant, influencing how you structure your day in ways that blur the line between functionality and intimacy.
Crucially, machine intelligence is also transforming our understanding of learning itself. AI-enabled learning technologies not only create educational tools but also provide insights into human cognition and intelligence. This symbiotic relationship between machine and human learning is particularly significant in the context of the emerging Intimacy Economy, where users expect deeply personalized experiences. The success of AI in enhancing and personalizing learning over time could be the proving ground for the Intimacy Economy's broader applications.
For instance, at the simplest level, education-oriented AI will automate tasks like grading quizzes or offering personalized practice exercises based on a student's performance. As it expands, AI-powered platforms may provide adaptive learning experiences, offering real-time feedback and tailoring lessons to individual learning styles. Looking further ahead, AI might evolve into a lifelong learning coach—tracking progress, suggesting new areas of study, providing mentorship, and offering personalized guidance throughout a person’s education and career.
However, as AI becomes more useful, the trade-off between intimacy and functionality becomes more problematic. To be effective, a lifelong learning assistant would need to deeply understand not just a student’s academic performance, but also their personal goals, habits, and emotional states. The more data it collects to enhance its usefulness, the more it challenges our concepts of privacy, raising concerns about how much personal information we're willing to share in exchange for such a tailored experience. For many users, who programs and controls the AI may shape how willing they are to trust it.
Mark Naufel is an education innovator as well as a Professor of Practice and Executive Director of the Luminosity Lab at Arizona State University. He has a vision of AI as a "K to Grey" lifelong learning partner, grounded in the principles of the intimacy economy. "My vision is a personal learning management system for life," he says. He believes that, in the near future, a comprehensive AI tool will revolutionize K-12 education by helping students identify their passions and dreams, teaching them, tracking their knowledge, and assisting with college applications or alternative career paths. This tool will continue to support individuals throughout their lives, helping them achieve fulfilling careers and providing day-to-day support in their roles. Learning is the "sharp end of the spear" for the intimacy economy, he says. "If personal AI is not anchored first and foremost in learning and personal growth, it will fail." Naufel's approach to personal AI accordingly puts learning at the center of everything.
But what will it mean to learn in the intimacy economy? To answer this, we need to consider three angles: the nature of intimacy with machines, how this might change learning, and the emerging needs and preferences of a new generation.
The vision of an all-knowing AI assistant has been with us for some time, shaped by stories from science fiction, and probably needs no explanation. In reality, AI assistants have thus far been underwhelming. Most people use Siri for simple things like setting a timer and that's about it. Personalization has also fallen short and often feels invasive, with data sharing across platforms resulting in ads for things you might have glanced at in a store or for products you’ve already purchased.
Much of the partial, incomplete personalization of our online lives results from the decontextualization inherent in data collection practices. For decades, the tech industry has distilled rich, real-world experiences into quantifiable data—clicks, views, and time spent—which are then neatly categorized into rows and columns so interactions can be used for profit. This approach has undoubtedly boosted the bottom line of Big Tech but at the cost of stripping the context and meaning from our online behaviors. Machines can store every transaction, yet they fail to capture the 'why' behind our actions. Why do we choose a song? Sometimes it's about mood or company, or even to block out an earworm from a morning commute. These nuances matter because they represent the true sum of our experiences.
Context is everything. Machines currently lack the ability to understand this context, but generative AI, especially modern large language models, holds the promise of escaping this limitation. These models can process vast amounts of unstructured data, potentially capturing some semblance of the human context embedded within. How effectively they can recreate or understand our context remains to be seen, but the potential to do so is here.
We think of this exchange as the Intimacy Surface. We use the word "surface" in this context to mean a dynamic, multidimensional interface between humans and machines. It's not necessarily limited to the physical or digital interface, but a conceptual space that includes all of the contact between humans and AI. It's a malleable, responsive surface that can shift, allowing the user to choose the level of intimacy based on the nature of the interaction. The Intimacy Surface adapts to the user's level of trust and willingness to disclose as well as their needs and desires in context. This layer is highly attuned to emotional connection, context, and trust, enabling more meaningful interactions beyond just basic functionality. It fosters deeper, more personalized, and impactful exchanges.
The Intimacy Surface is composed of five key dimensions:
These dimensions could work together to create an AI learning assistant that's not just a source of information, but a true partner in the learning process. It might adapt to each student's needs, promote deeper understanding and self-awareness, and help connect learning to personal growth and real-world applications.
In the spirit of the saying, "the future is already here, it’s just not evenly distributed," we can glimpse what the future of learning might be in the intimacy economy through current AI innovations. Embedded AI systems in tools such as Apple Intelligence, Slack, Zoom, and Microsoft Office demonstrate a shift towards deeper contextual understanding. Wellness and journaling apps exemplify "walled gardens" for personal digital information, creating secure environments for managing intimate life details.
The demand for such personal apps is strong, particularly among younger generations. Surveys indicate a willingness to invest in services that enhance health and wellness, with many planning to spend significant amounts on new health and wellness subscriptions. These apps, which help users achieve their health and wellness objectives, illustrate how much people are willing to pay for services in the intimacy economy.
Yes, Big Tech may want to extend the negative side of social media and use these new intimate understandings to tell us what we want to hear (and sell us more stuff). But the potential exists for this intimate understanding to push us out of our comfort zones. Truly intimate AI might know what friction you need and won’t just be about efficiency or convenience or instant gratification.
Of course, none of this is possible unless the intimacy economy overcomes its biggest obstacle: establishing trust. As our chart shows, data privacy concerns are widespread among Americans, with a large majority viewing it as an increasing issue. This skepticism extends to AI companies and their handling of personal information. The younger generation, which might be expected to embrace new technologies, shows a notable lack of trust in large businesses. Even business leaders acknowledge that consumers have reason to be cautious. For the intimacy economy to thrive, it must overcome this pervasive mistrust and convince users that their personal data, as the foundation of intimate, personalized experiences, will be handled responsibly and ethically.
AI's potential to know us better than we know ourselves is both intriguing and unsettling, particularly if we value our agency. We often struggle to set meaningful goals for ourselves; are we ready to empower an AI to optimize our lives? This raises profound questions about our humanity and how much we should outsource to machines. Lovejoy recognizes AI's dual role in both fueling capitalist growth and presenting new challenges, balancing its potential for innovation with the ethical and economic dilemmas it creates. His passion is for human-centered AI that transcends simple predictions, building deeper relationships based on genuine long-term improvement.
When asked about this, Lovejoy posed a crucial question for the future of AI design and the intimacy economy: How can we use AI to impart foresight? Humans need to experience things firsthand to learn, a truth every parent understands when a child learns the hard way despite parental advice. This is the essence of the human condition—agency comes with pain, failure, and wasted time. The challenge and opportunity lies in designing AI that enhances our capacity for foresight while preserving the invaluable lessons that come from personal experience.
Social learning is a cornerstone of human development and collective intelligence. When we learn with and from others, we gain new perspectives, deepen understanding, and achieve more than we could alone. However, this process also involves navigating complex social dynamics, including the constant awareness of how others perceive us.
AI tutors present both opportunities and challenges in this context. On the surface, they seem to offer a judgment-free environment where students can feel comfortable asking any question, without fear of being judged for asking something trivial or making mistakes. This kind of tutor won’t react negatively to a silly question or show disapproval for a wrong answer. But is this truly the case? Do AI tutors actually create this judgment-free space? And even if they do, when we consider the widespread adoption of AI tutors, is this the type of learning environment we ultimately want to encourage?
Human social learning is grounded in our "theory of mind"—our ability to understand and share the thoughts, feelings, and intentions of others. We don't just observe and mimic, and we don't just learn facts to repeat on demand; we intuit mental states and absorb values and ways of being.
As AI advances, it's developing a form of intentionality—the ability to infer a learner's intent. While this could enable highly personalized and effective learning guidance, it also risks shaping learners' choices and experiences in ways that may constrain their identity development. It’s crucial to balance the benefits of personalized guidance with the need to support students’ autonomous growth and exploration.
Consider two scenarios:
AI tutors face complex challenges. They must balance reading emotions, anticipating needs, and providing personalized support while also knowing when to push learners out of their comfort zones. The right approach isn't always clear and depends heavily on the individual learner, the subject matter, and the broader context of the learning moment. As AI becomes more integrated into education, these kinds of AI decisions are a source of bias and will play a crucial role in shaping students' learning experiences and their development as independent thinkers.
There is an obvious danger here: the overuse of AI tutors and personalized education journeys could lead to a kind of social deskilling—a generation that struggles with the messy, unpredictable reality of human interaction. A world full of AI tutors and no human teachers would be hideously asocial. If the transfer of knowledge is taken away from the teacher-student relationship, desire, pleasure, and energy are also stripped away. AI that is unable to support human energy and vibrancy isn't human-centered. This is the core challenge for AI assistants in the intimacy economy: they shouldn't take away from human intimacy.
Innovators like Mark Naufel envision a more holistic approach that goes beyond the attention economy: "AI could use its day-to-day interactions to understand the user—the projects they're working on, the passions they have, the interests—to drive learning that's right there and just-in-time relevant, tied to a long-term aspirational goal that the individual has in their life."
As AI systems tackle learning tasks, the teacher's role will evolve to take on new facets, such as:
For instance, in our work with clients, we have co-designed AI learning systems that, in concert with redesigned teacher roles, encourage more human-to-human interaction:
However, Michael Horn, an expert in educational innovation, cautions that AI adoption isn't straightforward. Current tools often require significant trial-and-error, and AI hallucinations could damage students' self-esteem. Teachers need effective, reliable AI tools for tasks like creating lesson plans and providing fast feedback. Teachers at all levels tell us that they are overwhelmed and struggle to find bandwidth to learn new technologies. And, of course, given the focus on efficiency from Big Tech’s AI offerings, teachers are justifiably wary of new technologies that are intended to get them to do more with less—especially since some tech leaders profess that AI can replace schools altogether. Schools at all levels face enormous challenges with new levels of data privacy, access disparities, embedded biases, and technical expertise. The theoretical opportunities may be compelling but the challenges are certainly daunting as well.
Perhaps the greatest opportunity is what we all knew going into this. People want more time with people. “What we need to do with the system is free up teachers' time to do the human piece more,” says Naufel. “I mean applied hands-on learning and taking walks across campus where you have conversations that change your life forever."
The "walk across campus that changes your life" is a luxury that many students simply can't afford — literally or figuratively. For many, education is a grueling marathon of balancing competing demands, not a leisurely stroll through the groves of academe. Any discussion of the future of education, whether AI-enhanced or not, needs to grapple with these realities.
A hopeful vision is that AI will help students of all sorts, from all backgrounds, understand themselves better, promoting deeper self-awareness and personal insight. Instead of just catering to their immediate desires or even practical needs, AI will challenge them in ways they might not always like in order for them to grow and develop both as individuals and as part of a society.
The current generation of learners embodies a series of striking paradoxes that will shape the emerging intimacy economy in education. These students have never been sheltered from the world's complexities and have always been connected to online knowledge. Combined with recent global events, this has created a cohort of learners unlike any before.
The most significant impact of this generation on institutions is in education. Recent events such as the Covid pandemic have fundamentally altered their expectations of learning. As Michael Horn observes, they "want the best of both worlds. They crave the flexibility of online learning, but also desire the community and in-person experiences of traditional education. They expect learning to be both flexible and socially engaging."
A crucial insight about these learners is their trust dynamics. Despite being digital natives, they show a strong preference for localized, personal connections. They trust small businesses and immediate supervisors far more than large corporations or government institutions. This local trust extends to their learning preferences, favoring in-person 'shadow' learning over fully digital experiences.
This selective trust exists within a broader context of skepticism, with two-thirds believing most people are untrustworthy and self-interested (Source: EY). Yet, paradoxically, they seek authentic connections. They describe themselves as self-aware, persistent, realistic, innovative, and self-reliant, descriptors that contrast with previous generations.
As the attention economy evolves into the intimacy economy, value stems from meaningful, personalized experiences. This shift is reflected in how these learners use social media: primarily for communication, not content consumption. Yet while 35% want to reduce social media usage (Source: EY), they remain the most digitally connected generation.
Learning platforms must transform from content delivery systems into interactive environments that help learners build deeper bonds. This could include AI-facilitated study groups that match learners based on complementary skills and goals, helping bridge digital and physical learning experiences.
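To make the matching idea above concrete, here is a minimal sketch in Python. Everything in it is hypothetical — the learner names, the skill sets, and the greedy scoring rule — and it illustrates just one naive way to operationalize "complementary skills": pair the learners whose skill sets overlap least.

```python
from itertools import combinations

def complementarity(a: set, b: set) -> int:
    """Score a pair by how many skills each member brings that the other lacks."""
    return len(a ^ b)  # size of the symmetric difference

def pair_learners(learners: dict) -> list:
    """Greedily form study pairs that maximize complementary skills."""
    unmatched = set(learners)
    pairs = []
    while len(unmatched) >= 2:
        # Pick the remaining pair with the highest complementarity score.
        best_pair = max(
            combinations(sorted(unmatched), 2),
            key=lambda p: complementarity(learners[p[0]], learners[p[1]]),
        )
        pairs.append(best_pair)
        unmatched -= set(best_pair)
    return pairs

# Hypothetical learner profiles represented as skill sets.
learners = {
    "Ana":   {"statistics", "writing"},
    "Ben":   {"programming", "design"},
    "Chloe": {"statistics", "programming"},
    "Dev":   {"writing", "design"},
}
print(pair_learners(learners))  # [('Ana', 'Ben'), ('Chloe', 'Dev')]
```

A production system would of course weigh goals, schedules, and trust preferences alongside skills; the sketch only shows the core pairing mechanic.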
These learners expect AI to integrate into the rest of their digital lives. Rather than standalone educational platforms, they anticipate learning opportunities embedded in their everyday digital interactions. For instance, an AI system that connects a news article they're reading to a relevant course concept, or suggests a short learning module based on a question they've just searched online.
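Concept-linking of this kind is often built on embedding similarity. The sketch below is a toy illustration under that assumption: the tiny hand-made vectors stand in for embeddings that a real system would produce with a text-embedding model, and the "course concepts" are invented for the example.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical 3-dimensional "embeddings" of course concepts; a real
# system would derive these from course material with an embedding model.
concepts = {
    "supply and demand": [0.9, 0.1, 0.0],
    "photosynthesis":    [0.0, 0.8, 0.2],
    "probability":       [0.1, 0.1, 0.9],
}
article_vec = [0.85, 0.15, 0.05]  # e.g., a news story about grocery prices

# Surface the course concept closest to what the learner is reading.
best = max(concepts, key=lambda c: cosine(article_vec, concepts[c]))
print(best)  # supply and demand
```

The same nearest-concept lookup could run against a search query instead of an article, which is how the "suggest a short learning module" behavior might be triggered.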
These students also place a high value on sustainability and social responsibility. They expect EdTech platforms to not only teach about these concepts but to embody them in their operations. This could involve highlighting the environmental impact of different career paths or integrating lessons on ethical decision-making into various subjects. They will also likely challenge AI companies on generative AI's high energy usage, questionable data practices, and other ethical issues.
Unlike previous generations, these learners rank meaningful work as a top reason for taking, keeping, or leaving a job, placing it above compensation. A remarkable majority say they would switch jobs to be more productive, indicating a strong drive for efficiency and impact. Yet productivity alone is not their ultimate goal. This apparent contradiction suggests a strong individual need to balance pragmatism and purpose, and they look to technology to help them achieve this.
The rigid structures of traditional education are becoming obsolete for these learners. They've been told that their future jobs might not even exist yet, and they don't believe that traditional institutions are preparing them adequately. When asked what they think will improve the education system, they overwhelmingly cite real-life work, professional mentorship, and projects. Unsurprisingly, "more lectures" sits at the bottom of their list.
If they are to succeed, AI tutors in the intimacy economy will have to meet the next generation of learners where they are.
This is the promise and the challenge of learning in the intimacy economy: creating personalized, meaningful experiences that balance efficiency with purpose, skepticism with connection, and individual needs with societal impact.
The shift from the attention economy to the intimacy economy represents a significant change in our approach to learning, technology, and human connection. This transition brings both promising opportunities and significant challenges.
Key Takeaways:
Looking Ahead:
The intimacy economy is already impacting learning, with EdTech startups developing advanced AI tutors and educational institutions reimagining their approaches. However, significant challenges remain:
Managing these challenges will shape the future of learning. The intimacy economy promises more personalized, engaging, and impactful learning experiences. However, it also risks creating filter bubbles, reducing human connection, and commodifying personal data.
Successfully navigating this shift will require embracing AI's potential while safeguarding human connection and agency. It means rethinking not just our tools, but our entire approach to education. The future of learning in the intimacy economy will need to balance intimate interactions with both AI and humans, creating a learning landscape that is both highly personalized and deeply connected.