How Emerging Technologies Like VR and AI Could Shape a Kinder, More United Generation

Image Credits: Julia M Cameron from Pexels

Empathy isn’t merely a desirable soft skill—it’s a fundamental part of how both children and adults manage emotions, form relationships, and learn from each other.

Between ages 6 and 9, children start moving away from self-centered thinking and begin to recognize other people's feelings and viewpoints. This stage of childhood is a critical window for nurturing empathy and other social-emotional skills.

Pretend play has long served as a natural tool for developing empathy. Many adults recall playing roles like doctor and patient or using sticks and leaves as make-believe money. These playful experiences weren’t just fun—they were early exercises in understanding and stepping into someone else’s shoes.

As children spend more time with technology and less time engaging in pretend play, opportunities to develop empathy through imaginative interaction are decreasing. Some educators are concerned that this shift may be hindering social-emotional development. However, research in affective computing—technology that can recognize and even simulate emotions—indicates that digital tools might also offer new ways to support emotional learning.

Virtual reality, for example, can create immersive experiences where children interact with lifelike characters that express emotions realistically. As a researcher in human-computer interaction focused on social-emotional learning, I see potential in using VR and artificial intelligence—when applied thoughtfully—to transform how emotional skills are taught. These technologies could become powerful tools for creating “empathy classrooms” or “emotional regulation simulators.”

Emotional Adventure

As part of my doctoral research at the University of Florida, I began developing a framework for a VR Empathy Game in 2017. This work integrates principles from developmental psychology, affective computing, and participatory design involving children. While collaborating with the KidsTeam program at the Human-Computer Interaction Lab at the University of Maryland, I worked with children aged 7 to 11 who acted as co-designers, helping us envision the emotional experience of an empathy-driven VR game.

In 2018, I partnered with 15 master’s students from the Florida Interactive Entertainment Academy at the University of Central Florida to develop the first game prototype, Why Did Baba Yaga Take My Brother? Inspired by a Russian folktale, the game features four characters—each symbolizing a primary emotion: Baba Yaga represents anger, Goose stands for fear, the Older Sister conveys happiness, and the Younger Sister embodies sadness.

Unlike typical games that offer points or badges as rewards, this one takes a different approach. Progress is tied to how well children engage with the characters—by listening to their stories, understanding their perspectives, and practicing empathetic behaviors. For instance, players might see the world through a character’s eyes, explore their memories, or offer comfort with a hug, even to someone like Baba Yaga. This approach aligns with a central principle of social-emotional learning: empathy isn’t about earning rewards, but about taking time to understand and respond to others’ emotions.
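As a rough illustration of this reward structure, progression can be modeled as a gate on empathetic actions rather than a point total. The character names come from the game itself; the class and attribute names below are hypothetical, not the game's actual code.

```python
from dataclasses import dataclass

# Sketch of empathy-gated progression: the story advances only after the player
# has listened to, perspective-taken with, and comforted every character.
# Class and attribute names are illustrative assumptions.

@dataclass
class Character:
    name: str
    emotion: str
    story_heard: bool = False
    perspective_taken: bool = False
    comforted: bool = False

    def fully_engaged(self):
        return self.story_heard and self.perspective_taken and self.comforted

def story_unlocked(characters):
    # No points or badges: only empathetic engagement moves the story forward.
    return all(c.fully_engaged() for c in characters)

cast = [
    Character("Baba Yaga", "anger"),
    Character("Goose", "fear"),
    Character("Older Sister", "happiness"),
    Character("Younger Sister", "sadness"),
]
```

The design choice is the point: there is no score to maximize, so the only path through the game is attending to each character's emotional state.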

Since then, my colleagues and I have been continually refining the game and using it as a tool to explore how children develop empathy.

Various Routes to Empathy

We tested the game individually with elementary school children. First, we asked them general questions and administered an empathy survey, then invited them to play the game. During gameplay, we observed their behavior and later discussed their experiences with them.

Our key finding was that children engaged with the VR characters in ways that mirrored typical empathic behaviors people exhibit when interacting with others. Some showed cognitive empathy—they understood the characters’ emotions, listened carefully, gently tapped the characters to get their attention, and tried to offer help. However, they were not entirely immersed in the characters’ feelings.

Some children showed emotional contagion, directly reflecting the characters’ feelings, at times becoming so overwhelmed by fear or sadness that they paused the game. Meanwhile, a few others didn’t connect with the characters emotionally and focused primarily on exploring the virtual world. These three types of reactions also occur naturally when children interact with their peers in real life.

These observations emphasize both the potential and the challenges of using VR. While it can elicit strong empathic reactions, it also raises important questions about how to create experiences that accommodate different temperaments—some children require more stimulation, while others benefit from a slower, gentler pace.

An AI Perspective on Emotions

Our main challenge now is figuring out how to seamlessly integrate this kind of empathy-focused game into daily life. In classrooms, VR won’t replace real conversations or traditional role-playing activities but can serve as a valuable complement. For example, a teacher might use a brief VR scenario to initiate discussions, prompting students to reflect on their emotions and relate them to their own friendships. This way, VR acts as a catalyst for conversation rather than a standalone solution.

We’re also developing adaptive VR systems that respond in real time to a child’s emotional state, creating a responsive “empathy classroom” where children have a safe space to gradually build their emotional regulation skills.

This is where AI plays a crucial role. AI systems can interpret data gathered from VR headsets—such as eye movement, facial expressions, heart rate, and body language—and use this information to adapt the experience in real time. For instance, if a child seems anxious or avoids eye contact with a sad character, the AI might slow down the narrative, offer supportive prompts, or lessen the emotional intensity of the scene. Conversely, if the child appears calm and engaged, the AI could introduce more challenging scenarios to enhance their learning.
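The sensing-and-adjustment loop described above can be sketched as a simple control rule. This is a minimal illustration under stated assumptions, not the actual system: the signal names, heart-rate thresholds, and prompt wording are all hypothetical.

```python
# Minimal sketch of the adaptive loop: biometric signals in, scene adjustment out.
# Thresholds, signal names, and prompt text are illustrative assumptions.

def adjust_intensity(intensity, heart_rate_bpm, gaze_on_character):
    """Return (new_intensity, supportive_prompt_or_None) for the next scene update.

    intensity is a 0..1 scale for how expressively the characters emote.
    """
    if heart_rate_bpm > 110 or not gaze_on_character:
        # Possible anxiety or avoidance: soften the scene and offer support.
        return max(0.2, intensity - 0.2), "Take a breath. You can say hello when you're ready."
    if heart_rate_bpm < 90:
        # Calm and engaged: raise the emotional challenge gradually.
        return min(1.0, intensity + 0.1), None
    return intensity, None
```

A real system would smooth these signals over time and calibrate thresholds per child, but the shape of the loop is the same: observe, infer, then gently adjust.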

In our ongoing research, we are exploring how AI can directly assess empathy by monitoring emotional responses throughout gameplay, giving educators deeper insights into how empathy develops over time.
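In a similar spirit, logged gameplay signals could be aggregated into the three engagement styles described earlier. The feature names and cutoffs below are purely illustrative assumptions, not a validated empathy measure.

```python
# Illustrative sketch only: mapping per-session gameplay features onto the three
# engagement styles observed in the study. Features and cutoffs are assumptions.

def classify_engagement(mirrored_emotion_score, listening_time_s, time_near_characters_ratio):
    """mirrored_emotion_score: 0..1 estimate of how strongly the child mirrors
    the characters' emotions; the other features come from logged gameplay."""
    if mirrored_emotion_score > 0.7:
        return "emotional contagion"   # directly reflects the characters' feelings
    if listening_time_s > 60 and time_near_characters_ratio > 0.5:
        return "cognitive empathy"     # attends, listens, and offers help
    return "exploration"               # focuses mainly on the virtual world
```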

Research and Partnerships

While I see great potential in this work, it also raises important questions. Should VR characters display emotions at full intensity, or should we tone them down to accommodate more sensitive children? If kids perceive VR characters as real, how can we ensure that the lessons learned transfer to real-life situations like the playground or family dinner? And given the high cost of headsets, how do we prevent empathy technology from widening the digital divide?

These are not just research challenges but ethical responsibilities. Achieving this vision calls for collaboration among educators, researchers, designers, parents, and children themselves. Computer scientists develop the technology, psychologists ensure the experiences support emotional well-being, teachers tailor the content to fit curricula, and children help co-create the games to keep them engaging and meaningful.

Together, we can create technologies that do more than entertain—they can foster empathy, emotional regulation, and stronger connections in the next generation.


Read the original article on: Phys.Org

