Tag: Tasks

  • Robotic Hand Demonstrates Lifelike Skill in Intricate Laboratory Tasks

    AgileX Robotics has showcased a new demonstration that pairs its robotic arm with a dexterous hand to perform intricate manipulation tasks with fluid, human-like precision.
Image Credits: The duo showcases adaptability and precision, excelling in advanced lab automation and performing intricate manipulation tasks.

    A new video shows the PiPER arm with the flexible, high-DoF Linkerbot Chius hand operating complex lab tools with exceptional accuracy, highlighting its potential for advancing lab automation.

    “This demo showcases PiPER’s exceptional versatility and precision in advanced lab automation,” the company stated on YouTube.

    Recently, students at USC Viterbi created the MOTIF Hand, a robotic hand equipped with sensors that detect force, temperature, and movement, allowing for more natural, human-like interactions.

    Showcasing Advanced Robotic Synergy in the Lab

    In a recent demo, AgileX Robotics’ PiPER arm paired with Lingxin Qiaoshou’s Chius hand to demonstrate advanced lab-handling capabilities. The integrated system displayed a remarkable array of human-like movements with seamless, coordinated precision.

    The demonstration opened with the arm-hand pair adeptly manipulating a spoon, mimicking natural human gestures. It then moved on to pipetting—a task that demands steady control and precision—illustrating the system’s effectiveness for delicate, repetitive lab procedures.

    The duo also performed bottle capping and decapping with impressive finesse, requiring precise force regulation to prevent damage or spills. The system further demonstrated its versatility by shaking stoppered test tubes, showcasing its adaptability to a range of routine laboratory processes.

    Demonstrating Precision and Versatility in Lab Automation

    These tasks showcased the PiPER–Chius system’s versatility and precision in advanced lab automation. AgileX Robotics noted that the performance reflects its potential to enhance efficiency, safety, and consistency across scientific and industrial settings.

    AgileX Robotics’ PiPER is a fast, lightweight six-jointed robotic arm engineered for smooth, precise motion.

    Weighing 9.26 lbs (4.2 kg), PiPER lifts 3.3 lbs (1.5 kg), reaches 24.6 in (626 mm), offers ±0.1 mm accuracy, and operates reliably from –4 °F to 122 °F (–20 °C to 50 °C).

    Flexible Control and Intelligent Motion Capabilities

    Its integrated joint motors enable fluid movement and intelligent path planning. PiPER supports multiple control methods, including hand-guided drag-teaching, offline programming, Python scripting, PC operation, and ROS1/ROS2 integration. Communication runs via CAN, while a tablet-based interface ensures programming is straightforward and accessible.
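    Since PiPER exposes Python scripting over a CAN bus, a command typically ends up as a CAN frame with an arbitration ID and a small binary payload. The sketch below is a minimal illustration of that pattern only: the arbitration IDs, angle scaling, and payload layout are assumptions for demonstration, not AgileX’s actual protocol, which is defined in the vendor’s SDK documentation.

    ```python
    import struct

    # Hypothetical CAN frame layout for a single-joint position command.
    # The ID base and 0.001-degree scaling below are illustrative assumptions,
    # NOT the real PiPER protocol (see the AgileX SDK for the actual spec).
    JOINT_CMD_BASE_ID = 0x150  # assumed arbitration ID for joint 1

    def encode_joint_command(joint, angle_deg):
        """Pack a joint index and target angle into an 8-byte CAN payload.

        The angle is scaled to 0.001-degree units and packed as a signed
        32-bit little-endian integer, followed by 4 bytes of padding -- a
        common convention on CAN-controlled arms, assumed here.
        """
        if not 1 <= joint <= 6:
            raise ValueError("PiPER has six joints (1-6)")
        arbitration_id = JOINT_CMD_BASE_ID + (joint - 1)
        payload = struct.pack("<iI", int(angle_deg * 1000), 0)
        return arbitration_id, payload

    can_id, data = encode_joint_command(3, 45.0)
    print(hex(can_id), data.hex())
    ```

    On a live system, a frame like this would be handed to a CAN interface library (for example, the `python-can` package’s `Bus.send`) rather than printed; the encoding step is the same either way.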

    High-Precision Robotic Hands for Diverse Applications

    The Linkerbot Chius Hand line comprises advanced, highly dexterous robotic hands for research, industry, and rehabilitation. Engineered to closely replicate the functions of a human hand, they can execute intricate movements with exceptional precision.

    A key feature is force sensing, enabling gentle handling of fragile items and precise control in complex tasks. They also support wireless master-slave communication for smoother, more responsive operation. With adaptive grasping capabilities, the Chius Hands can effortlessly manage objects of different shapes, sizes, hardness levels, and weights.

    Models like the L20 offer 25 degrees of freedom for highly articulated movement, complementing the line’s adaptive grasping and smooth, coordinated motion. These features make the Linkerbot Chius Hand well suited to applications spanning scientific research, medical rehabilitation, and industrial automation.


    Read the original article on: Interesting Engineering

    Read more: Launch of ChatGPT-5 Marks the ‘Start of a New Era for Humanity’

  • Robot Masters Surgical Tasks Simply by Watching Videos

    Performing surgery takes years of training for humans, but a robot could learn the skill more easily with today’s AI technology. Researchers from Johns Hopkins University (JHU) and Stanford University have taught a robot to perform various surgical tasks just by watching videos of the procedures.
    With the help of an AI model trained on surgery videos, a robotic system has successfully carried out difficult surgical tasks as skillfully as a human.
    Intuitive

    The team used a da Vinci Surgical System, a robot typically controlled by a surgeon, which allows for precise movements like dissection and suturing. The system costs over $2 million, not including accessories or training.

    The da Vinci surgical system at work
    Intuitive

    Using imitation learning, the researchers trained the robot to perform tasks such as manipulating a needle, lifting tissue, and suturing. Remarkably, the robot could perform these tasks as well as humans and even correct its own mistakes, like picking up a dropped needle automatically.

    The AI model combines imitation learning with the architecture used in chatbots like ChatGPT. Instead of processing text, it outputs kinematics, a mathematical language that directs the robot’s arms. The model was trained on hundreds of videos filmed from wrist cameras on da Vinci robots during surgeries.
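    At its core, this kind of imitation learning is supervised learning: observations extracted from video are mapped to the kinematic commands the expert issued, and the policy is trained to minimize the gap between its predictions and the expert’s actions. The toy sketch below illustrates that recipe with a linear policy fit by gradient descent on synthetic data; the actual JHU/Stanford system uses a transformer trained on real surgical footage, so everything here is a simplified stand-in.

    ```python
    import numpy as np

    # Toy behavior-cloning sketch: learn a linear policy mapping
    # camera-derived features to end-effector kinematics (e.g. dx, dy, dz).
    # The real system is a transformer trained on surgical video; this
    # miniature version only illustrates the supervised imitation recipe.
    rng = np.random.default_rng(0)

    true_W = rng.normal(size=(8, 3))   # "expert" mapping, unknown to the learner
    obs = rng.normal(size=(500, 8))    # features extracted from video frames
    actions = obs @ true_W             # expert kinematic commands (demonstrations)

    W = np.zeros((8, 3))               # learned policy weights
    lr = 0.01
    for _ in range(2000):
        pred = obs @ W
        grad = obs.T @ (pred - actions) / len(obs)  # gradient of the MSE loss
        W -= lr * grad

    mse = float(np.mean((obs @ W - actions) ** 2))
    print(f"final imitation MSE: {mse:.6f}")
    ```

    The imitation error shrinks toward zero because the policy sees exactly the features the expert acted on; in practice, the hard part is that real observations (video frames) are far richer than the expert’s internal state, which is why the full system needs a large sequence model rather than a linear map.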

    Surgical Robot Transformer Demo

    New AI Method Could Accelerate Autonomous Surgery, Reducing Errors and Improving Accuracy

    The researchers believe their method could rapidly teach robots to perform any surgery, making the process far easier than the traditional approach of hand-coding each step. According to JHU’s Axel Krieger, this approach could accelerate the path to autonomous surgery, reduce errors, and improve accuracy.

    This innovation could be a major breakthrough in robot-assisted surgery. While some robotic systems, like Corindus’s CorPath, already assist with certain surgical steps, they lack full autonomy. Krieger noted that traditional coding for robotic tasks is slow, often taking years to model even one action.

    ‘Wrist’ cameras attached to the arms of the robot surgical system capture footage to help train the AI model
    Johns Hopkins University / Stanford University

    In 2022, Krieger’s team developed the Smart Tissue Autonomous Robot (STAR) at JHU, which performed suturing without human assistance. Now, the JHU team is working on teaching robots to perform complete surgeries, though it will likely be years before robots fully replace human surgeons. However, advancements like this one could make surgeries safer and more accessible worldwide.


    Read Original Article: New Atlas

    Read More: Scitke