Psychology Theory May Improve the Way Robots Walk

It's the fruit of a rare collaboration between a clinical psychologist, robotics engineers and a robotics entrepreneur.

The self-balancing robot used to demonstrate how 'Perceptual Control Theory' can help robots to walk in a more human-like way.
University of Manchester

A psychological theory could kickstart improvements in the way robots are able to walk, thanks to a University of Manchester study.

The study - a unique collaboration between a clinical psychologist, robotics engineers and a robotics entrepreneur - is published in the Journal of Intelligent and Robotic Systems today.

It analysed what happened when the standard algorithms driving a self-balancing robot - made from simple Lego - were replaced with algorithms based on 'perceptual control theory'.

The theory was encoded into the little droid, allowing it to control what it sensed so that it moved around more effectively, just as humans and other animals can.
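In perceptual control theory terms, the quantity being controlled is the robot's perception - the tilt it senses - rather than the motor command itself. The sketch below is a minimal, hypothetical Python illustration of that idea; the gain, the update rate handling and the toy stand-in for the robot's dynamics are assumptions made for illustration, not the controller reported in the study.

    # Minimal sketch of a perceptual control loop (illustrative only).
    # The controlled variable is the robot's perception (the tilt it senses);
    # the motor output is whatever it takes to keep that perception at its
    # reference value.

    REFERENCE_TILT = 0.0   # desired perception: upright (radians)
    GAIN = 8.0             # loop gain (hypothetical value)
    DT = 0.01              # 100 Hz update, roughly the rate cited in the study

    def perceptual_control_step(perceived_tilt):
        """Compare the perception with its reference and return a motor
        output that acts to shrink the difference."""
        error = REFERENCE_TILT - perceived_tilt
        return GAIN * error

    # Toy usage: a disturbed tilt is pulled back towards the reference.
    tilt = 0.10                                    # start 0.1 rad off upright
    for _ in range(200):                           # two seconds of updates
        motor_output = perceptual_control_step(tilt)
        tilt += (0.5 * tilt + motor_output) * DT   # crude stand-in for the robot
    print(round(tilt, 4))                          # ~0.0: perception at reference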

Though the robot moves on two wheels, it is an 'inverted pendulum', which requires nimble balancing in a similar way to how our bodies are kept upright when we walk.

So, the better the robot can balance, the better prepared it will be for walking like a human.
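To see why that balancing has to be so nimble, a textbook linearised model of an inverted pendulum shows that, left uncorrected, even a tiny tilt grows rapidly. The short simulation below uses made-up parameters purely for illustration; it is not the Lego robot's actual dynamics.

    # Linearised inverted-pendulum sketch: with no corrective wheel motion,
    # a small tilt grows on its own (parameters are placeholders).

    G = 9.81        # gravity, m/s^2
    LENGTH = 0.15   # effective pendulum length in metres (made up)
    DT = 0.01       # simulation step, s

    def step(theta, theta_dot, wheel_accel):
        """One Euler step of the small-angle tilt dynamics."""
        theta_ddot = (G * theta - wheel_accel) / LENGTH
        theta_dot += theta_ddot * DT
        theta += theta_dot * DT
        return theta, theta_dot

    theta, theta_dot = 0.017, 0.0          # roughly a one-degree lean
    for _ in range(50):                    # half a second with no control
        theta, theta_dot = step(theta, theta_dot, wheel_accel=0.0)
    print(round(theta, 3))                 # the tilt is now many times larger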

In the study, the more lifelike robot balanced more accurately, more promptly and more efficiently than its rivals by assessing its environment at least 100 times a second.

It also moved to a new location, even when disturbed by sideways nudges, more effectively than its competitors.

By contrast, when the robot ran one of the standard algorithms, it appeared much less stable and wobbled excessively.

Though perceptual control theory has been used widely in psychological therapies, education and parenting interventions, this is the first time that using it in robots has been compared in a 'head-to-head' test.

The new study compared the same inverted pendulum robot programmed and tuned with three different software controllers.

Two of them, proportional control and LQR, are widely used by engineers to build the latest robots.

The third, perceptual control theory, was originally derived from engineering, but it takes the 'insider's perspective', specifying the 'desired inputs' or 'needs' of the robot.
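For readers curious how the three approaches differ in form, the schematic Python below contrasts them. The gains and the two-level arrangement are placeholders chosen for illustration; they are not the tuned controllers reported in the paper.

    # Schematic contrast of the three controller families compared in the study
    # (all gains are placeholder values).

    import numpy as np

    # 1. Proportional control: output driven by the tilt error alone.
    def proportional(tilt, reference=0.0, kp=5.0):
        return kp * (reference - tilt)

    # 2. LQR: a precomputed gain vector applied to the full measured state,
    #    e.g. (tilt, tilt rate, wheel position, wheel speed).
    K_LQR = np.array([30.0, 4.0, 1.5, 2.0])       # placeholder gains
    def lqr(state):
        return float(-K_LQR @ state)

    # 3. Perceptual control theory: a hierarchy of loops, each holding a
    #    perception at its reference. Here an outer loop controls perceived
    #    position by setting the tilt reference for an inner loop.
    def pct(perceived_position, perceived_tilt, target_position=0.0,
            k_outer=0.5, k_inner=8.0):
        tilt_reference = k_outer * (target_position - perceived_position)
        return k_inner * (tilt_reference - perceived_tilt)

In the first two, the designer specifies how outputs are computed from measurements; in the third, the designer specifies which perceptions the robot should hold at which values - the 'insider's perspective' described above.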

Dr Warren Mansell, Reader in Clinical Psychology at The University of Manchester said: "Although this is early work, it is tantalising to see how a scientific theory used to help people with mental health problems might actually help engineers to improve their designs of artificially intelligent devices.

"Robots are yet to match the abilities of those in science fiction hits like Star Wars and Blade Runner and none have mastered walking on two feet.

"But the use of the theory could really enable the transformation of robots into more lifelike machines."

Dr Simon Watson, Senior Lecturer in Robotic Systems at The University of Manchester said: "Nature has developed the most complex machines we know, so being able to implement algorithms inspired by them is an important step in our own creative development capability."

Thomas Johnson, the PhD student who built and tested the robot, said: "This work has demonstrated the success of controlling robots with perceptual control theory. This paper is a demonstration of how engineers in robotics can find inspiration from the living world."

Computer technologist Dr Rupert Young said: "This research is a peek at a radical new way of understanding how to build robotics systems that are dynamic and adaptive despite the chaotic, unpredictable nature of the real world. Based upon an elegant and natural approach, this paradigm holds the promise of developing far more sophisticated, autonomous robots."
