The Artificiality Imagining Summit 2024 gathered an (oversold!) group of creatives and innovators to imagine a hopeful future with AI. Tickets for our 2025 Summit will be on sale soon!
This week we dive into learning in the intimacy economy as well as the future of personhood with Jamie Boyle. Plus: read about Steve Sloman's upcoming presentation at the Imagining Summit and Helen's Book of the Week.
Explore the shift from the attention economy to the intimacy economy, where AI personalizes learning experiences based on deeper human connections and trust.
Shift to Intimacy: The transition from the attention economy to the intimacy economy represents a fundamental change in how we approach learning and technology. This shift emphasizes deeper, more meaningful connections and personalized experiences over shallow engagement.
AI-Driven Personalization: AI will enable unprecedented levels of personalization in learning, considering not just a student’s academic performance, but their learning styles, interests, career goals, and emotional states.
Evolving Role of Teachers: As AI handles routine tasks, the role of teacher will evolve to focus more on emotional guidance, data navigation, and designing learning experiences that blend online and offline elements.
Balancing AI and Human Connection: While AI offers powerful tools for personalization and efficiency, maintaining human connection in education remains crucial. The challenge lies in developing AI to enhance rather than replace human relationships in learning.
Trust and Data Concerns: The intimacy economy relies on users sharing personal data, raising significant concerns about privacy, trust, and data security. Addressing these concerns is crucial for the widespread adoption of AI in education and for its embrace by schools and families.
Changing Learner Expectations: Modern learners, particularly younger generations, have paradoxical expectations. They demand both the flexibility of online learning and the community of traditional education, along with a strong emphasis on authenticity and meaningful work.
Ethical Considerations: The use of AI in education raises important ethical questions about bias, equity, and the potential for AI to shape learners' choices and experiences in ways that may constrain their development.
Thanks to our partners GSV and Dash Media for their help and support creating this report.
Listen to an NPR-style discussion of this article, created by Google NotebookLM.
Introduction
The attention economy, where companies monetize users' time and focus on digital platforms, has become a familiar concept. AI algorithms curate personalized content feeds and power the recommendation systems, keeping users engaged based on their online behaviors and inferred preferences.
This model of personalization is limited by the data we consciously provide, yet also relies on inferences drawn from information we might not even realize we are sharing. At times, the “people like you” style inferences accurately predict preferences by drawing on patterns of behavior observed in similar users. But often these inferences reveal how little current technology truly understands us, as we expose only a fraction of ourselves online. That is all about to change.
Conversational, generative AI changes our relationship with technology. By enabling genuine interaction, it allows us to form a relationship with technology. We can converse, learn, and co-create with AI. It can help us solve problems—even our most personal problems if we tell it enough about ourselves.
This shift opens the door to what we call the Intimacy Economy. In this new paradigm, humans trade intimate information for enhanced functionality: the more context you give an AI system about yourself, the better it can interpret the intent of your interactions and assist you in your current workflow. This deep understanding—and the trust it requires—will form the foundation of the Intimacy Economy, and it hinges on evolving concepts of trust and privacy.
Intimacy will center around our needs, goals, motivations, and desires, both explicit and implicit. As AI evolves to perform tasks and exhibit traits once considered "uniquely human," such as understanding complex emotions, creating art, and making decisions, it will reshape our interaction with technology. For instance, imagine an AI assistant that schedules your day. It starts by considering your explicit inputs—your work meetings, deadlines, and personal appointments. But over time, it begins to recognize implicit patterns, like when you're most productive, how you feel after long meetings, or when you tend to skip tasks. Eventually, the AI suggests not only when to schedule meetings but also recommends taking a break before difficult tasks, or even rearranging your schedule based on your mood and energy levels. As it becomes more attuned to your needs, the AI transitions from simply being a tool to feeling like a deeply personalized assistant, influencing how you structure your day in ways that blur the line between functionality and intimacy.
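The scheduling scenario above—an assistant moving from explicit inputs to learned implicit patterns—can be sketched as a toy model. Everything here (class name, method names, thresholds) is an illustrative assumption, not any real product's API:

```python
from collections import defaultdict

class ScheduleLearner:
    """Toy model of an assistant that learns implicit productivity
    patterns from explicit calendar events and observed task outcomes."""

    def __init__(self):
        # per hour of day: [tasks completed, tasks attempted]
        self.stats = defaultdict(lambda: [0, 0])

    def record(self, hour, completed):
        entry = self.stats[hour]
        entry[1] += 1
        if completed:
            entry[0] += 1

    def productivity(self, hour):
        done, tried = self.stats[hour]
        return done / tried if tried else 0.5  # neutral prior for unseen hours

    def suggest_slot(self, candidate_hours):
        # recommend the hour where demanding tasks have historically succeeded
        return max(candidate_hours, key=self.productivity)

learner = ScheduleLearner()
for hour, completed in [(9, True), (9, True), (14, False), (14, False), (16, True)]:
    learner.record(hour, completed)
print(learner.suggest_slot([9, 14, 16]))  # prints 9: mornings win for this user
```

Even this crude frequency count illustrates the intimacy trade-off: the suggestions only get useful once the system has observed enough of your behavior.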
Crucially, machine intelligence is also transforming our understanding of learning itself. AI-enabled learning technologies not only create educational tools but also provide insights into human cognition and intelligence. This symbiotic relationship between machine and human learning is particularly significant in the context of the emerging Intimacy Economy, where users expect deeply personalized experiences. The success of AI in enhancing and personalizing learning over time could be the proving ground for the Intimacy Economy's broader applications.
For instance, at the simplest level, education-oriented AI will automate tasks like grading quizzes or offering personalized practice exercises based on a student's performance. As it expands, AI-powered platforms may provide adaptive learning experiences, offering real-time feedback and tailoring lessons to individual learning styles. Looking further ahead, AI might evolve into a lifelong learning coach—tracking progress, suggesting new areas of study, providing mentorship, and offering personalized guidance throughout a person’s education and career.
However, as AI becomes more useful, the trade-off between intimacy and functionality becomes more problematic. To be effective, a lifelong learning assistant would need to deeply understand not just a student’s academic performance, but also their personal goals, habits, and emotional states. The more data it collects to enhance its usefulness, the more it challenges our concepts of privacy, raising concerns about how much personal information we're willing to share in exchange for such a tailored experience. For many users, who programs and controls the AI may shape how willing they are to trust it.
Mark Naufel is an education innovator as well as a Professor of Practice and Executive Director of the Luminosity Lab at Arizona State University. He has a vision of AI as a “K to Grey” lifelong learning partner, grounded in the principles of the intimacy economy. "My vision is a personal learning management system for life," he says. He believes that, in the near future, a comprehensive AI tool will revolutionize K-12 education by helping students identify their passions and dreams, teaching them, tracking their knowledge, and assisting with college applications or alternative career paths. This tool will continue to support individuals throughout their lives, helping them achieve fulfilling careers and providing day-to-day support in their roles. Learning is the ‘sharp end of the spear’ for the intimacy economy, he says. “If personal AI is not anchored first and foremost in learning and personal growth, it will fail.” Naufel’s approach to personal AI puts learning at the center of everything, a focus he believes is critical.
But what will it mean to learn in the intimacy economy? To answer this, we need to consider three angles: the nature of intimacy with machines, how this might change learning, and the emerging needs and preferences of a new generation.
Intimate Machines
The vision of an all-knowing AI assistant has been with us for some time, shaped by stories from science fiction, and probably needs no explanation. In reality, AI assistants have thus far been underwhelming. Most people use Siri for simple things like setting a timer and that's about it. Personalization has also fallen short and often feels invasive, with data sharing across platforms resulting in ads for things you might have glanced at in a store or for products you’ve already purchased.
Much of the partial, incomplete personalization of our online lives results from the decontextualization inherent in data collection practices. For decades, the tech industry has distilled rich, real-world experiences into quantifiable data—clicks, views, and time spent—which are then neatly categorized into rows and columns so interactions can be used for profit. This approach has undoubtedly boosted the bottom line of Big Tech but at the cost of stripping the context and meaning from our online behaviors. Machines can store every transaction, yet they fail to capture the 'why' behind our actions. Why do we choose a song? Sometimes it's about mood or company, or even to block out an earworm from a morning commute. These nuances matter because they represent the true sum of our experiences.
Context is everything. Machines currently lack the ability to understand this context, but generative AI, especially modern large language models, holds the promise of escaping this limitation. These models can process vast amounts of unstructured data, potentially capturing some semblance of the human context embedded within. How effectively they can recreate or understand our context remains to be seen, but the potential to do so is here.
We think of this exchange as the Intimacy Surface. We use the word "surface" in this context to mean a dynamic, multidimensional interface between humans and machines. It's not necessarily limited to the physical or digital interface, but a conceptual space that includes all of the contact between humans and AI. It's a malleable, responsive surface that can shift, allowing the user to choose the level of intimacy based on the nature of the interaction. The Intimacy Surface adapts to the user's level of trust and willingness to disclose as well as their needs and desires in context. This layer is highly attuned to emotional connection, context, and trust, enabling more meaningful interactions beyond just basic functionality. It fosters deeper, more personalized, and impactful exchanges.
The Intimacy Surface is composed of five key dimensions:
Connection: Natural engagement with users, understanding emotions and contextual cues. In learning contexts, an AI with a well-developed connection dimension might engage with students in an empathetic manner. It might recognize emotional cues in a student's voice, facial expressions, or typing patterns, adapting its approach as needed. For instance, if it detects frustration, it might offer encouragement, suggest a break, or connect the student with a human teacher or peer. This assistant might also understand the context of learning, recognizing when a student is grappling with a new concept versus reviewing familiar material, and adjust its support accordingly.
Metacognition: Enhancing our ability to think about our own thinking. An AI learning assistant might enhance students' metacognitive skills by helping them reflect on their learning process. It might prompt students to explain their thought process when solving a problem, or help them identify patterns in their learning habits. For example, it could help a student recognize that they tend to struggle with abstract concepts in the morning, encouraging them to adjust their study schedule by focusing on such topics in the afternoon, thereby enhancing their metacognitive awareness and improving their learning strategies. By fostering this self-awareness, the AI system might help students become more effective learners.
Mindfulness: Promoting user awareness of emotions, surroundings, and environment. In the realm of learning, mindfulness is crucial for deep understanding and retention. An AI assistant might promote mindfulness by guiding students through focused study sessions, or offering stress reduction and anxiety management techniques. It might also help students be more aware of their learning environment, suggesting changes to enhance focus and retention. For instance, it could remind a student to find a quieter study space if their current environment is too distracting.
Meaningfulness: Facilitating personal growth, creativity, and self-actualization. This dimension is particularly crucial in addressing the concerns raised about the attention economy's approach to education, which prioritizes task completion over knowledge retention. Instead of simply offering the next piece of information a student might want, an AI system with a well-developed meaningfulness dimension might help students connect their learning to their long-term goals and values, fostering a sense of purpose in their studies. For example, it might show a biology student how their current lesson on cellular respiration connects to their interest in developing sustainable energy solutions.
Trust: Building and maintaining trust through reliable performance, appropriate vulnerability, and data integrity. Trust is fundamental to the learning process and comes in two forms: trust in the AI as an effective tutor, and trust that it safeguards user data and avoids conflicts of interest. The system must be transparent about its capabilities and limitations, admitting when it’s unsure or when a human teacher’s input is needed. At the same time, it should uphold strict privacy standards, ensuring that student data is protected and not used for external purposes, while being clear about how information is used to personalize learning experiences.
These dimensions could work together to create an AI learning assistant that's not just a source of information, but a true partner in the learning process. It might adapt to each student's needs, promote deeper understanding and self-awareness, and help connect learning to personal growth and real-world applications.
Privacy and Trustworthiness
The effectiveness of AI in education relies on collecting vast amounts of personal data from students. This includes not just academic performance, but potentially sensitive information about learning styles, emotional states, and personal circumstances. The risk of data breaches, misuse, or surveillance is significant. The reality of AI systems is this: their usefulness increases with the amount of user data they have, making it essential to prioritize privacy and trustworthiness in their design—an area that remains underdeveloped. This ongoing challenge is a major hurdle in realizing the full potential of these technologies.
In the spirit of the saying, "the future is already here, it’s just not evenly distributed," we can glimpse what the future of learning might be in the intimacy economy through current AI innovations. Embedded AI systems in tools such as Apple Intelligence, Slack, Zoom, and Microsoft Office demonstrate a shift towards deeper contextual understanding. Wellness and journaling apps exemplify "walled gardens" for personal digital information, creating secure environments for managing intimate life details.
The demand for such personal apps is strong, particularly among younger generations. Surveys indicate a willingness to invest in services that enhance health and wellness, with many planning to spend significant amounts on new health and wellness subscriptions. These apps, which help users achieve their health and wellness objectives, illustrate how much people are willing to pay for services in the intimacy economy.
Yes, Big Tech may want to extend the negative side of social media and use these new intimate understandings to tell us what we want to hear (and sell us more stuff). But the potential exists for this intimate understanding to push us out of our comfort zones. Truly intimate AI might know what friction you need and won’t just be about efficiency or convenience or instant gratification.
Of course, none of this is possible unless the intimacy economy overcomes its biggest obstacle: establishing trust. As our chart shows, data privacy concerns are widespread among Americans, with a large majority viewing it as an increasing issue. This skepticism extends to AI companies and their handling of personal information. The younger generation, which might be expected to embrace new technologies, shows a notable lack of trust in large businesses. Even business leaders acknowledge that consumers have reason to be cautious. For the intimacy economy to thrive, it must overcome this pervasive mistrust and convince users that their personal data, as the foundation of intimate, personalized experiences, will be handled responsibly and ethically.
Corporate Takeover of Education
Tech giants like Microsoft and Google are poised to dominate the educational AI market. This raises serious concerns about the "hollow coring" of traditional learning institutions. Schools and universities risk having their core teaching and accreditation functions outsourced to AI platforms owned by these corporations. This shift could lead to a homogenization of education, where learning is shaped by corporate interests and an over-emphasis on skill development rather than diverse pedagogical approaches.
Learning in the Intimacy Economy
AI's potential to know us better than we know ourselves is both intriguing and unsettling, particularly if we value our agency. We often struggle to set meaningful goals for ourselves; are we ready to empower an AI to optimize our lives? This raises profound questions about our humanity and how much we should outsource to machines. Lovejoy recognizes AI's dual role in both fueling capitalist growth and presenting new challenges, balancing its potential for innovation with the ethical and economic dilemmas it creates. His passion is for human-centered AI that transcends simple predictions, building deeper relationships based on genuine long-term improvement.
When asked about this, Lovejoy posed a crucial question for the future of AI design and the intimacy economy: How can we use AI to impart foresight? Humans need to experience things firsthand to learn, a truth every parent understands when a child learns the hard way despite parental advice. This is the essence of the human condition—agency comes with pain, failure, and wasted time. The challenge and opportunity lies in designing AI that enhances our capacity for foresight while preserving the invaluable lessons that come from personal experience.
Social learning is a cornerstone of human development and collective intelligence. When we learn with and from others, we gain new perspectives, deepen understanding, and achieve more than we could alone. However, this process also involves navigating complex social dynamics, including the constant awareness of how others perceive us.
AI tutors present both opportunities and challenges in this context. On the surface, they seem to offer a judgment-free environment where students can feel comfortable asking any question, without fear of being judged for asking something trivial or making mistakes. This kind of tutor won’t react negatively to a silly question or show disapproval for a wrong answer. But is this truly the case? Do AI tutors actually create this judgment-free space? And even if they do, when we consider the widespread adoption of AI tutors, is this the type of learning environment we ultimately want to encourage?
Human social learning is grounded in our "theory of mind"—our ability to understand and share the thoughts, feelings, and intentions of others. We don't just observe and mimic; we intuit mental states. And we don't just learn facts to repeat on demand; we absorb values and ways of being.
Loss of Social Learning
While AI can provide personalized instruction, it can't replicate the complex social dynamics of a classroom. Group projects, debates, and peer-to-peer learning are crucial for developing soft skills like communication, empathy, and teamwork. Over-reliance on AI could produce a generation of learners who excel at absorbing information but struggle with real-world social interactions and collaborative problem-solving.
As AI advances, it's developing a form of intentionality—the ability to infer a learner's intent. While this could enable highly personalized and effective learning guidance, it also risks shaping learners' choices and experiences in ways that may constrain their identity development. It’s crucial to balance the benefits of personalized guidance with the need to support students’ autonomous growth and exploration.
Consider two scenarios:
A high school student using an AI learning platform that consistently recommends video content about marine biology based on her preferences. While this helps her excel in biology, it might narrow her exposure to other subjects and learning styles. Over time, she might come to believe that she can only learn effectively through videos about topics she already enjoys, potentially limiting her academic growth and curiosity about other fields.
A college student struggling with advanced calculus. His AI tutor must decide whether to soften the difficulty to maintain engagement or encourage perseverance by providing targeted hints and breaking down the problem into smaller steps. The AI's decision here is critical. If it consistently chooses to relax the degree of difficulty, it might create a more enjoyable learning experience, but potentially stunt the student’s growth. If it always chooses the latter, it might push the student to greater heights but risk burnout or disengagement.
AI tutors face complex challenges. They must balance reading emotions, anticipating needs, and providing personalized support while also knowing when to push learners out of their comfort zones. The right approach isn't always clear and depends heavily on the individual learner, the subject matter, and the broader context of the learning moment. As AI becomes more integrated into education, these kinds of AI decisions are a source of bias and will play a crucial role in shaping students' learning experiences and their development as independent thinkers.
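The ease-versus-challenge dilemma in the calculus scenario can be made concrete as a toy tutoring policy. The function name, thresholds, and `patience` parameter are illustrative assumptions for discussion, not a research-backed design:

```python
def next_move(recent_results, frustration, patience=3):
    """Toy policy for an AI tutor's ease-vs-challenge decision.
    recent_results: list of booleans, True = problem solved.
    frustration: estimated 0.0-1.0 signal (e.g., from tone or typing).
    Backs off only after a streak of failures AND high frustration;
    otherwise scaffolds with hints before reducing difficulty."""
    streak = 0
    for outcome in reversed(recent_results):  # count trailing failures
        if outcome:
            break
        streak += 1
    if streak >= patience and frustration > 0.7:
        return "ease"       # lower difficulty to head off burnout
    if streak >= 1:
        return "hint"       # scaffold: break the problem into smaller steps
    return "challenge"      # student is succeeding, keep pushing

print(next_move([True, False, False, False], frustration=0.9))  # ease
print(next_move([True, True, False], frustration=0.2))          # hint
print(next_move([True, True, True], frustration=0.1))           # challenge
```

The point of the sketch is that the policy's thresholds are themselves design choices with pedagogical consequences, which is exactly where bias can creep in.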
There is an obvious danger here: over-use of AI tutors and personalized education journeys could lead to a kind of social deskilling—a generation that struggles with the messy, unpredictable reality of human interaction. A world full of AI tutors and no human teachers would be hideously asocial. If the transfer of knowledge is taken away from the teacher-student relationship, desire, pleasure, and energy are stripped away with it. AI that cannot support human energy and vibrancy isn’t human-centered. This is the core challenge for AI assistants in the intimacy economy: they shouldn't take away from human intimacy.
Teacher Displacement
The promise of cost-effective AI tutors creates a strong incentive for educational institutions to reduce their teaching staff. Beyond the immediate economic impact, this also risks losing the invaluable human touch in education. Teachers provide mentorship, emotional support, and nuanced understanding that even the most advanced AI cannot replicate.
Innovators like Mark Naufel envision a more holistic approach that goes beyond the attention economy: "AI could use its day-to-day interactions to understand the user—the projects they're working on, the passions they have, the interests—to drive learning that's right there and just-in-time relevant, tied to a long-term aspirational goal that the individual has in their life."
As AI systems tackle learning tasks, the teacher's role will evolve to take on new facets, such as:
Emotional guides: Designing collaborative projects that require empathy and negotiation
Data navigators: Using AI-generated insights to tailor learning experiences
Game designers: Adapting challenges to skill levels, merging online and offline elements
For instance, in our work with clients, we have co-designed AI learning systems that, in concert with redesigned teacher roles, encourage more human-to-human interaction:
Pre-K: With the National Head Start Association, we explored the opportunity to create an AI teacher assistant that can transcribe and document a student accident. Today, teachers must choose whether to hold a child who has, say, bumped their head or to follow procedures to document the accident right away. With an AI assistant, the teacher can focus on caring for the child while also dictating an accident report.
K-8: In the post-pandemic era, current K-8 schools are dealing with a new dynamic of students who have underdeveloped social skills due to isolation. We’re exploring options for AI learning systems to work with students individually and as groups to address this shortfall through conversational interactions.
High School: AI systems have wide ranging possibilities for high school students as they begin to branch into individual learning paths. We’re particularly interested in ideas that help individual students follow—or even discover—their own curiosity whether that be finding a course or teacher in their school they don’t already know or finding resources and communities outside of the school. For instance, could an AI system help a student that expresses interest in engineering or mechanics or space to find a local rocket-building club?
Higher Education: The long-term opportunity for higher education is profound: from teaching assistants to scientific research tools to exploring the frontiers of our understanding of the world, a pursuit which could be greatly enhanced with AI. That said, higher education is on the brink of significant disruption due to AI. Traditional models of teaching, assessment, and credentialing are being challenged by AI-driven personalized learning systems, automated grading, and even AI-generated content. This technology could reshape the roles of educators, potentially reducing the need for certain human-led tasks while raising new ethical and practical questions about the value of human interaction in education. From R1 universities to community colleges, our research shows a wide range of expectations and experiences. Senior administrators are much more optimistic about AI’s potential than faculty—while everyone agrees that no one is well-prepared.
However, Michael Horn, an expert in educational innovation, cautions that AI adoption isn't straightforward. Current tools often require significant trial-and-error, and AI hallucinations could damage students' self-esteem. Teachers need effective, reliable AI tools for tasks like creating lesson plans and providing fast feedback. Teachers at all levels tell us that they are overwhelmed and struggle to find bandwidth to learn new technologies. And, of course, given the focus on efficiency from Big Tech’s AI offerings, teachers are justifiably wary of new technologies that are intended to get them to do more with less—especially since some tech leaders profess that AI can replace schools altogether. Schools at all levels face enormous challenges with new levels of data privacy, access disparities, embedded biases, and technical expertise. The theoretical opportunities may be compelling but the challenges are certainly daunting as well.
Perhaps the greatest opportunity is what we all knew going into this. People want more time with people. “What we need to do with the system is free up teachers' time to do the human piece more,” says Naufel. “I mean applied hands-on learning and taking walks across campus where you have conversations that change your life forever."
The "walk across campus that changes your life" is a luxury that many students simply can't afford — literally or figuratively. For many, education is a grueling marathon of balancing competing demands, not a leisurely stroll through the groves of academe. Any discussion of the future of education, whether AI-enhanced or not, needs to grapple with these realities.
A hopeful vision is that AI will help students of all sorts, from all backgrounds, understand themselves better, promoting deeper self-awareness and personal insight. Instead of just catering to their immediate desires or even practical needs, AI will challenge them in ways they might not always like in order for them to grow and develop both as individuals and as part of a society.
Hallucinations and Misinformation
AI systems, including large language models, are prone to "hallucinations": confidently stating false information as fact. In an educational context, this is disastrous. Imagine a student trusting an AI tutor's incorrect explanation of a complex concept. Not only does this spread misinformation, but it can severely damage a learner's confidence when they discover the truth. This erosion of trust can have long-lasting impacts on a student's academic journey and self-esteem.
Modern Learners and AI: Shaping the Intimacy Economy in Education
The current generation of learners embodies a series of striking paradoxes that will shape the emerging intimacy economy in education. These students have never been sheltered from the world's complexities, always connected to online knowledge. Combined with recent global events, this has created a cohort of learners unlike any before.
The most significant impact of this generation on institutions is in education. Recent events such as the Covid pandemic have fundamentally altered their expectations of learning. As Michael Horn observes, they "want the best of both worlds. They crave the flexibility of online learning, but also desire the community and in-person experiences of traditional education. They expect learning to be both flexible and socially engaging."
A crucial insight about these learners is their trust dynamics. Despite being digital natives, they show a strong preference for localized, personal connections. They trust small businesses and immediate supervisors far more than large corporations or government institutions. This local trust extends to their learning preferences, favoring in-person 'shadow' learning over fully digital experiences.
This selective trust exists within a broader context of skepticism, with two-thirds believing most people are untrustworthy and self-interested (Source: EY). Yet, paradoxically, they seek authentic connections. They distinguish themselves as self-aware, persistent, realistic, innovative, and self-reliant, descriptors that contrast with previous generations.
Gen Z Trust
When asked how much of the time they can trust the following to do what is right, Gen Zers respond:
Supervisor: 71%
Small Business: 71%
Local Government: 47%
State Government: 41%
Federal Government: 34%
Large Business: 34%
Source: EY Generational Dynamics Lab
As the attention economy evolves into the intimacy economy, value stems from meaningful, personalized experiences. This shift is reflected in how these learners use social media: primarily for communication, not content consumption. Yet while 35% want to reduce social media usage (Source: EY), they remain the most digitally connected generation.
Learning platforms must transform from content delivery systems into interactive environments that help learners build deeper bonds. This could include AI-facilitated study groups that match learners based on complementary skills and goals, helping bridge digital and physical learning experiences.
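The idea of AI-facilitated study groups matched on complementary skills and shared goals could be sketched as a simple greedy pairing algorithm. The scoring function and its weights are illustrative assumptions, not a real platform's method:

```python
from itertools import combinations

def complementarity(a, b):
    """Toy score: reward skills one learner has and the other lacks
    (symmetric difference), weight shared goals more heavily."""
    return len(a["skills"] ^ b["skills"]) + 2 * len(a["goals"] & b["goals"])

def match_pairs(learners):
    """Greedily pair learners by descending complementarity score."""
    scored = sorted(
        combinations(range(len(learners)), 2),
        key=lambda p: complementarity(learners[p[0]], learners[p[1]]),
        reverse=True,
    )
    used, pairs = set(), []
    for i, j in scored:
        if i not in used and j not in used:
            pairs.append((i, j))
            used.update((i, j))
    return pairs

learners = [
    {"skills": {"stats"}, "goals": {"ml"}},
    {"skills": {"coding"}, "goals": {"ml"}},
    {"skills": {"stats", "coding"}, "goals": {"writing"}},
    {"skills": {"writing"}, "goals": {"writing"}},
]
print(match_pairs(learners))  # [(2, 3), (0, 1)]
```

In practice a real matcher would need far richer signals (availability, learning styles, social dynamics), which is precisely the kind of intimate data the trust concerns above are about.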
These learners expect AI to integrate into the rest of their digital lives. Rather than standalone educational platforms, they anticipate learning opportunities embedded in their everyday digital interactions. For instance, an AI system that connects a news article they're reading to a relevant course concept, or suggests a short learning module based on a question they've just searched online.
These students also place a high value on sustainability and social responsibility. They expect EdTech platforms not only to teach about these concepts but to embody them in their operations. This could involve highlighting the environmental impact of different career paths or integrating lessons on ethical decision-making into various subjects. They are also likely to challenge AI companies on the high energy usage of generative AI, questionable data practices, and other ethical issues.
Unlike previous generations, these learners rank meaningful work as a top reason for taking, keeping, or leaving a job, placing it above compensation. A remarkable majority say they would switch jobs to be more productive, indicating a strong drive for efficiency and impact. Yet productivity alone is not their ultimate goal. This apparent contradiction suggests a strong individual need to balance pragmatism and purpose, and they look to technology to help them achieve this.
The rigid structures of traditional education are becoming obsolete for these learners. They've been told that their future jobs might not even exist yet, and they don't believe that traditional institutions are preparing them adequately. When asked what they think will improve the education system, they overwhelmingly cite real-life work, professional mentorship, and projects. Unsurprisingly, more lectures ranks at the bottom of their list.
If they are to succeed, AI tutors in the intimacy economy will have to meet the next generation of learners where they are.
This is the promise and the challenge of learning in the intimacy economy: creating personalized, meaningful experiences that balance efficiency with purpose, skepticism with connection, and individual needs with societal impact.
Bias and Inequality
AI systems often reflect and amplify existing societal biases. In education, this could exacerbate already stark inequalities. AI tutors might provide instruction of differing quality based on factors like a student's perceived socioeconomic status, race, or gender. Furthermore, the high cost of advanced AI educational tools could create a two-tiered system in which only wealthy students have access to the best AI-powered learning experiences.
The Intimacy Economy: Reshaping the Learning Landscape
The shift from the attention economy to the intimacy economy represents a significant change in our approach to learning, technology, and human connection. This transition brings both promising opportunities and significant challenges.
Key Takeaways:
Personalized Learning Experiences: AI-driven personalization will adapt not just to learners' academic needs, but to their values, emotional states, and life goals. AI tutors may soon understand and respond to a student's emotional state, adjusting lessons accordingly.
Crucial Human Element: Despite AI advancements, human teachers will remain indispensable. Their role will evolve from primarily delivering information to providing emotional support, facilitating complex problem-solving, and fostering real-world connections.
Bridging Digital and Physical: Successful educational platforms will need to seamlessly integrate online flexibility with authentic human connections, addressing the paradoxical desires of modern learners.
Trust and Transparency: In the intimacy economy, data is a valuable resource, but learners demand transparency. Educational technology companies must prioritize clear data policies and user control to build and maintain trust.
Dynamic Career Pathways: Traditional education-to-career pipelines are becoming obsolete. We can expect to see more AI-driven career guidance systems integrated into learning platforms, using real-time labor market data to inform skill development.
Looking Ahead:
The intimacy economy is already impacting learning, with EdTech startups developing advanced AI tutors and educational institutions reimagining their approaches. However, significant challenges remain:
Ensuring AI doesn't exacerbate existing educational inequalities
Creating AI systems that challenge learners rather than just catering to preferences
Preserving the essential human aspects of learning that lead to true growth
Addressing concerns about data privacy, AI bias, and the potential for corporate dominance in education
Managing these challenges will shape the future of learning. The intimacy economy promises more personalized, engaging, and impactful learning experiences. However, it also risks creating filter bubbles, reducing human connection, and commodifying personal data.
Successfully navigating this shift will require embracing AI's potential while safeguarding human connection and agency. It means rethinking not just our tools, but our entire approach to education. The future of learning in the intimacy economy will need to balance intimate interactions with both AI and humans, creating a learning landscape that is both highly personalized and deeply connected.
Dave Edwards is a Co-Founder of Artificiality. He previously co-founded Intelligentsia.ai (acquired by Atlantic Media) and worked at Apple, CRV, Macromedia, Morgan Stanley, Quartz, and ThinkEquity.
Helen Edwards is a Co-Founder of Artificiality. She previously co-founded Intelligentsia.ai (acquired by Atlantic Media) and worked at Meridian Energy, Pacific Gas & Electric, Quartz, and Transpower.