Expanding Human-Robot Collaboration in Manufacturing by Training AI to Detect the Human Intention

Machines and robots certainly make life less complicated. They carry out jobs with precision and speed, and, unlike humans, they never tire or need breaks.

Consequently, businesses increasingly want to use them in their production processes to improve productivity and to take over dirty, dangerous, and dull tasks.

Nevertheless, there are still many tasks in the workplace that require human agility, adaptability, and flexibility.

Human-robot collaboration is an exciting prospect for future manufacturing, as it combines the best of both.

Such a partnership requires close interaction between humans and robots, and it would benefit greatly from the ability to anticipate a collaborative partner's next action.

Ph.D. student Achim Buerkle and a team of researchers from the Intelligent Automation Centre at Loughborough University have published promising results on 'training' robots to detect arm movement intention before humans carry out the movements, in the journal Robotics and Computer-Integrated Manufacturing.

“A robot’s speed and torque need to be well-coordinated, because they can present a danger to human health and safety,” said Achim.

“Ideally, for efficient teamwork, the human and the robot would ‘understand’ each other, which is difficult because the two are quite different and ‘speak’ different languages.”

“We suggest offering the robot the ability to ‘read’ its human companions’ intentions.”

The researchers aimed to achieve this by monitoring the activity of the frontal lobe of the human brain.

Every movement the body performs is planned and evaluated in the brain before it is executed. Detecting this signal makes it possible to transmit an ‘intention to move’ to a robot.

However, the brain is a highly complex organ, and detecting this pre-movement signal is challenging.

The Loughborough University researchers tackled this challenge by teaching an AI system to recognise pre-movement patterns in an electroencephalogram (EEG), a technology that records the brain’s electrical activity.

Proposed new system layout. Credit: Achim Buerkle.

Their latest paper reports the findings of an experiment performed with eight participants.

The participants sat in front of a computer that randomly displayed a letter from A to Z on the screen and pressed the matching key on the keyboard. From the EEG data, the AI system had to predict which arm each participant was about to use, and motion sensors were used to confirm these intentions.
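To make this classification step more concrete, here is a minimal sketch in Python of the kind of pipeline such a study might use: a linear classifier trained on per-channel band-power features extracted from short pre-movement EEG windows. The sampling rate, channel count, window length, feature choice, and the use of synthetic stand-in data are all illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: classify which arm a person intends to move (left vs right)
# from a short pre-movement EEG window. Synthetic data stands in for real recordings.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
FS = 250          # assumed sampling rate (Hz)
N_CHANNELS = 8    # assumed EEG channel count
WINDOW_S = 0.3    # ~300 ms pre-movement window, roughly matching the reported lead time

def band_power(window: np.ndarray) -> np.ndarray:
    """Mean log power per channel over the window (a common, simple EEG feature)."""
    return np.log(np.mean(window ** 2, axis=1) + 1e-12)

def synthetic_trial(label: int) -> np.ndarray:
    """Stand-in EEG window: noise plus a small label-dependent power increase on half the channels."""
    x = rng.normal(size=(N_CHANNELS, int(FS * WINDOW_S)))
    side = slice(0, N_CHANNELS // 2) if label == 0 else slice(N_CHANNELS // 2, None)
    x[side] *= 1.3  # pretend one side of the scalp carries slightly more power
    return x

# Build a toy dataset of left (0) vs right (1) arm-intention trials.
labels = rng.integers(0, 2, size=200)
features = np.stack([band_power(synthetic_trial(y)) for y in labels])

# A linear classifier is a common baseline for this kind of two-class EEG problem.
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, features, labels, cv=5)
print(f"Cross-validated accuracy on synthetic data: {scores.mean():.2f}")
```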

The experimental data show that the AI system can detect when a human is about to move an arm up to 513 milliseconds (ms) before they do, and on average around 300 ms before actual execution.

In a simulation, the researchers tested the effect of this time advantage in a joint human-robot scenario.

They found that productivity for the same task was higher with the technology than without it.

The completion time for the task was 8-11% faster, even when the researchers included ‘false positives’, cases where the EEG wrongly transmitted an individual’s intention to move to the robot.
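To illustrate how such a simulation could be set up, here is a rough Monte Carlo sketch in Python: the robot starts its preparation a few hundred milliseconds early when it receives an intention signal, and a false positive incurs a recovery penalty. All durations, rates, and the cycle model itself are assumptions chosen for illustration, not values from the paper.

```python
# Hypothetical Monte Carlo of the time-advantage idea: an early intention signal lets the
# robot begin preparing before the human finishes; a false positive costs recovery time.
import random

random.seed(1)

HUMAN_STEP_S = 2.0     # assumed duration of the human's action per cycle
ROBOT_PREP_S = 0.5     # assumed robot preparation time per cycle
EARLY_SIGNAL_S = 0.3   # ~300 ms average lead reported for the EEG signal
FALSE_POS_RATE = 0.1   # assumed probability of a wrongly transmitted intention
RECOVERY_S = 0.4       # assumed extra time to recover from a wrong preparation

def cycle_time(with_eeg: bool) -> float:
    if not with_eeg:
        # Robot only starts preparing once the human action is complete.
        return HUMAN_STEP_S + ROBOT_PREP_S
    # Robot starts preparing EARLY_SIGNAL_S before the human finishes.
    t = HUMAN_STEP_S + max(0.0, ROBOT_PREP_S - EARLY_SIGNAL_S)
    if random.random() < FALSE_POS_RATE:
        t += RECOVERY_S  # wrong guess: undo and redo the preparation
    return t

N = 10_000
baseline = sum(cycle_time(False) for _ in range(N)) / N
assisted = sum(cycle_time(True) for _ in range(N)) / N
print(f"Average cycle: {baseline:.3f}s without EEG, {assisted:.3f}s with EEG "
      f"({100 * (baseline - assisted) / baseline:.1f}% faster)")
```

Under these made-up numbers the improvement comes out in a similar range to the reported figures, but the point of the sketch is only to show how a time advantage and a false-positive rate trade off in a cycle-time model.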

Achim intends to build on this research and hopes to create a system that can anticipate where a movement is directed, such as reaching for a screwdriver or picking up a new workpiece.

Of the recent findings, he said: “We hope this research will achieve two things: first, that the proposed technology can help move us towards closer, more cooperative human-robot collaboration, which still requires a large amount of research and engineering work to be fully developed.”

“Secondly, we want to communicate that instead of seeing robots and artificial intelligence/machine learning as a threat to human labor in manufacturing, they can also be seen as an opportunity to keep humans as a vital part of the factory of the future.”

In a joint statement, Achim’s supervisors Dr. Thomas Bamber, Dr. Niels Lohse, and Dr. Pedro Ferreira said: “There is a need to change the nature of human work to create a truly sustainable world that is no longer based on arduous physical and cognitive human labor.”

“Human-Robot Collaboration (HRC) is beginning to transform factory shop floors. However, much closer cooperation between humans and robots is still needed.”

“True HRC will have a transformative impact on labor productivity, work quality, and wellbeing, and will establish a safer and more sustainable labor market, while also overcoming physical disadvantages caused by sex, gender, age, or disability.”

“Achim’s work using artificial intelligence and EEG brings us one step closer to true HRC.”


Originally published on Techxplore.com. Read the original article.

Reference: Achim Buerkle et al, EEG based arm movement intention recognition towards enhanced safety in symbiotic Human-Robot Collaboration, Robotics and Computer-Integrated Manufacturing (2021). DOI: 10.1016/j.rcim.2021.102137
