Apple’s Pixar-Style Lamp-Bot Showcases the Friendly Side of Machines
![](https://assets.newatlas.com/dims4/default/b923c21/2147483647/strip/true/crop/1800x1200+0+0/resize/840x560!/quality/90/?url=http%3A%2F%2Fnewatlas-brightspot.s3.amazonaws.com%2Fa6%2F13%2Fe93ee9384aaea7ecbd8b8afa0395%2Fapples-lamp-bot-can-depict-joy-and-sadness-and-even-do-a-little-dance.jpg)
Apple
Luxo Jr. has been a charming presence at the start of every Pixar feature since 1995's *Toy Story*, playfully hopping in and stomping on the studio's logo in the opening credits. The animated desk lamp, which first appeared in Pixar's 1986 short of the same name, has now inspired Apple researchers to explore ways to make robots more expressive and improve human-machine interactions—and the result is undeniably endearing.
A trio of researchers from Apple’s Machine Learning Research division showcased how a robotic desk lamp, capable of movement and gestures, can create a more engaging experience compared to a purely functional design. Their study, accompanied by a detailed video, presents the robot performing six tasks in both “Expressive” and “Functional” modes for direct comparison.
The video is available on Apple's website and was shared in a post on X. As shown, the robotic lamp features a camera, projector, and speaker alongside its LED light.
Expressive vs. Functional
In "Expressive" mode, the robot demonstrates lifelike behaviors, such as glancing out the window before giving a weather update, gently nudging a glass toward the researcher as a hydration reminder, and even dancing along when music plays. In contrast, in "Functional" mode, the lamp performs tasks with only the necessary movements, focusing purely on efficiency.
“Our findings indicate that expression-driven movements significantly enhance user engagement and perceived robot qualities. This effect is especially pronounced in social-oriented tasks,” the researchers stated. That sentiment rings true—when the robot lowered its head in disappointment after being told it couldn’t join a hike, I couldn’t help but ask out loud, “Why not?”
![](https://assets.newatlas.com/dims4/default/573d079/2147483647/strip/true/crop/1400x933+0+0/resize/800x533!/quality/90/?url=http%3A%2F%2Fnewatlas-brightspot.s3.amazonaws.com%2Fbe%2Fe1%2F2f0a32eb450a97748f7ee4a8e7ca%2Fthe-robot-lamp-responds-to-the-researcher-with-lifelike-gestures-of-its-own.jpg)
Apple
Apple’s Robotics Ambitions Align with Bloomberg Report
Beyond highlighting Apple’s machine learning expertise, this project supports an August 2024 Bloomberg report by journalist Mark Gurman, which claimed Apple was developing a robot with an articulating arm and an iPad-like display.
Designed to assist with smart home controls, video calls, and home security monitoring, the device was reportedly expected to launch between 2026 and 2027, with a projected price of around $1,000.
Apple’s interest in robotics is further evident in a research paper it published last month, outlining a framework for generating natural and expressive gestures in humanoid robots—such as giving a thumbs-up to a student solving a math problem on a chalkboard.
Whether these technologies will make their way into consumer products remains to be seen. Apple previously scrapped its self-driving car project, and according to Gurman, reassigned that program’s lead to this screen-equipped robot initiative. But if Apple is truly working on charming, expressive robots like this, it’s a vision worth getting excited about.
Read the original article on: New Atlas