Tag: Emotion

  • She Moves Naturally, Expresses Emotion, Maintains Eye Contact, and Feels Warm—Yet She’s a Robot


    At this point in the robotics race, it’s fair to say some of us are feeling a bit of humanoid fatigue. We’ve all seen the clips of robots fumbling plates from dishwashers, taking ages to open a fridge, and struggling—sometimes hilariously—to cook or play soccer.
    Image Credits: The company is planning to have Moya on the market later this year
    DroidUp/Shanghai Eye

    Image Credits: Moya, customizable humanoid robot, makes debut in Shanghai, powered by DroidUp’s latest tech

That said, Shanghai-based robotics startup DroidUp (also known as Zhuoyide) appears to have raised the bar by several notches. The company plans to bring its new lifelike humanoid to market this year, and unveiled the robot, Moya, at an event in Shanghai’s Zhangjiang Robotics Valley, a hub for China’s rising humanoid developers.

    At the launch, the company introduced what it describes as a “beautifully designed and expressive bionic robot,” claiming it to be “the world’s first highly bionic robot to deeply combine human aesthetics with advanced humanoid motion.”

    A Humanoid that Moves and Emotes Like a Human

Moya is best appreciated on video rather than in text; footage from Shanghai Eye, part of the Shanghai Media Group, shows her in motion.

    A DroidUp representative said Moya’s modular bionic platform lets users flexibly customize her gender and appearance. Moya’s adaptable bionic head conveys a wide range of emotions through expressive eye movements, while Zhuoyide’s cerebellar motor system enables fluid, graceful walking and turning beyond the rigid, metallic style of typical humanoid robots.

    This move away from the “steel robot” look includes human-like features such as temperature regulation, soft skin-like materials, and a rib cage.

    Image Credits: Not what you’d expect to see inside a robot
    DroidUp/Shanghai Eye

    Moya’s creators claim 92% human-like walking, but the remaining 8%—her slightly awkward gait—is noticeable. That said, walking isn’t where she truly shines. Her strength lies in engaging people—through eye contact, smiles, nods, and subtle, human-like facial expressions.

    Real-Time Interaction and Humanlike Warmth

    A hidden eye camera lets Moya use AI to respond in real time with human-like micro-expressions.

    Beyond her very human qualities, Moya is warm in the literal sense. Her skin is kept at 32–36 °C (90–97 °F), making her feel more approachable. Research shows humans instinctively use warmth and touch to bond, often unconsciously.

    “A robot serving humans should be warm… like a living being people can connect with,” said Li Qingdu, DroidUp’s founder.

    With all this considered, it’s hardly surprising that reactions to Moya have been divided. Many responses echo the “uncanny valley,” with some likening her to a Westworld android or a wandering ghost. Though Moya could be sexualized, DroidUp aims to use her in practical, high-need areas like elder care.

    Image Credits: One commentator said Moya looked like a living ghost – and we tend to agree
    DroidUp/Shanghai Eye

    That said, owning a Moya won’t come cheap: you’ll need roughly US$173,000 or more. When she launches later this year, the company expects her to be primarily placed in healthcare and educational settings.


    Read the original article on: New Atlas

    Read more: China is Relying on AI-Powered Robots to Improve Traffic Management

  • Wearable Device for Emotion Detection Works Like a Mood Ring for the Face


The device could be worn voluntarily by patients who know they are at risk of conditions such as depression and anxiety
    Yangbo Yuan / Penn State

    It’s common for patients to conceal their true feelings, either from their caregivers or even themselves. A new experimental facial “sticker” aims to help with this by detecting and transmitting information about the wearer’s emotional state.

    Sensors Measure Key Physiological Data to Track Emotions

    Developed by Assoc. Prof. Huanyu “Larry” Cheng and his team at Pennsylvania State University, this flexible and stretchable device uses sensors to measure mechanical strain in two directions, body temperature, humidity from sweat, and blood oxygen levels. The sensors are arranged in layers, with thin sheets of various materials separating them to prevent interference between their signals and measurements.

    Other components include a printed circuit board, wireless charging coil, 5-volt battery, and Bluetooth chip. These elements are all enclosed in a waterproof silicone cover, with the entire device measuring approximately 6 cm (2.4 inches) long.

    The team tested the device’s temperature- and humidity-monitoring capabilities (shown here with a US quarter-dollar coin for scale) not only on volunteers’ cheeks and foreheads but also on their arms and fingertips.
    Yangbo Yuan / Penn State

    Once the sticker adheres to the patient’s face, its strain sensors track the skin’s movement along two axes and wirelessly send this data to an app on a connected smartphone or tablet.

    AI Algorithms Accurately Identify Facial Expressions and Mood

    The app uses AI-driven algorithms to interpret the wearer’s facial expression, which correlates with their mood. In laboratory trials, the system has demonstrated over 96% accuracy in identifying six common facial expressions: happiness, surprise, fear, sadness, anger, and disgust.

However, people can sometimes fake facial expressions, even unconsciously. To address this, the app also draws on real-time readings from the temperature, humidity, and blood oxygen sensors. By combining these data points, the system identified true emotions, triggered by watching various video clips, with nearly 89% accuracy.
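To illustrate the general idea of that fusion step, here is a minimal, hypothetical sketch of how expression scores might be cross-checked against physiological readings. The sensor names, thresholds, and weights below are illustrative assumptions, not values published by the Penn State team.

```python
# Hypothetical sketch: cross-checking a detected facial expression against
# physiological readings to flag whether it likely reflects a genuine emotion.
# All thresholds and weights are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class PhysioReading:
    skin_temp_c: float   # skin temperature from the thermal sensor
    humidity_pct: float  # sweat-related humidity at the skin surface
    spo2_pct: float      # blood oxygen saturation


def arousal_score(p: PhysioReading) -> float:
    """Crude 0-1 arousal estimate averaged over the physiological channels."""
    temp = min(max((p.skin_temp_c - 32.0) / 4.0, 0.0), 1.0)
    sweat = min(max((p.humidity_pct - 40.0) / 40.0, 0.0), 1.0)
    oxy = min(max((98.0 - p.spo2_pct) / 5.0, 0.0), 1.0)
    return (temp + sweat + oxy) / 3.0


def fuse(expr_probs: dict[str, float], p: PhysioReading,
         threshold: float = 0.3) -> tuple[str, bool]:
    """Return the top-scoring expression and whether physiology backs it up.

    A strong 'emotional' expression paired with near-zero physiological
    arousal suggests the expression may be posed rather than felt.
    """
    label = max(expr_probs, key=lambda k: expr_probs[k])
    genuine = arousal_score(p) >= threshold
    return label, genuine


# Example: strain sensors score "fear" highest, and elevated temperature
# and sweat readings support it as a genuine emotional response.
probs = {"happiness": 0.1, "surprise": 0.05, "fear": 0.6,
         "sadness": 0.1, "anger": 0.1, "disgust": 0.05}
print(fuse(probs, PhysioReading(35.0, 70.0, 95.0)))  # ('fear', True)
```

In practice the reported system uses AI models trained on volunteer data rather than hand-tuned rules like these; the sketch only shows why combining modalities can catch posed expressions that a face-only classifier would accept.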

    Advancements Will Improve Accuracy and Remote Monitoring Capabilities

    As the technology advances, we expect this accuracy to improve. Moreover, because the system processes the data in the cloud, doctors could monitor patients’ emotional well-being remotely over the internet.

    The team trained the sticker’s expression-detecting algorithms on eight volunteers and then tested them on another three.
    Yangbo Yuan / Penn State

“This approach offers a more comprehensive understanding of our emotions by analyzing multiple body signals at once,” says Cheng. “People often hide how they truly feel, which is why we’re combining facial expression analysis with other important physiological indicators, ultimately providing better mental health monitoring and support.”


    Read the original article on: New Atlas

    Read more: Can the Use of Laptop Cause Infertility?