Legged Robots Learn to Hike Harsh Terrain - IEEE Spectrum
(January 20, 2022; IEEE Spectrum)
Description: "The research that the Robotic Systems Lab at ETH Zürich has published showcases a control system for a legged robot to evaluate how reliable the exteroceptive information that it's getting is. When the data are good, the robot plans ahead and moves quickly."
Robots, like humans, generally use two different sensory modalities when interacting with the world. There’s exteroceptive perception (or exteroception), which comes from external sensing systems like lidar, cameras, and eyeballs. And then there’s proprioceptive perception (or proprioception), which is internal sensing, involving things like touch and force sensing. Generally, we humans use both of these sensing modalities at once to move around, with exteroception helping us plan ahead and proprioception kicking in when things get tricky. You use proprioception in the dark, for example, where movement is still totally possible: you just do it slowly and carefully, relying on balance and feeling your way around.
For legged robots, exteroception is what enables them to do all the cool stuff—with really good external sensing and the time (and compute) to do some awesome motion planning, robots can move dynamically and fast. Legged robots are much less comfortable in the dark, however, or really under any circumstances where the exteroception they need either doesn’t come through (because a sensor is not functional for whatever reason) or just totally sucks because of robot-unfriendly things like reflective surfaces or thick undergrowth or whatever. This is a problem because the real world is frustratingly full of robot-unfriendly things.
- Humans and robots use two types of perception, exteroception and proprioception.
- Exteroception comes from external sensing: things like sight and sound in humans, cameras and lidar in robots.
- Proprioception comes from internal sensing: things like touch, balance, and force sensing.
Humans use both exteroception and proprioception when moving around. Robots have typically relied on exteroception. Now, though, the Robotic Systems Lab at ETH Zürich has published work in Science Robotics on integrating exteroception and proprioception into what they call a "belief state". Their technique relies on exteroception first, but falls back to proprioception when the exteroceptive data turn out to be untrustworthy -- in cases like walking on surfaces that are unstable or covered with snow or underbrush, for example. This lets the robot move quickly when exteroception is working, and keep moving reliably when it isn't.
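To make that fallback idea concrete, here is a minimal sketch in PyTorch of one plausible way to fuse the two modalities into a single belief state: a learned gate scales the exteroceptive features down when they look unreliable, so the fused state degrades gracefully toward proprioception alone. The module names, feature dimensions, and gating scheme are illustrative assumptions, not the architecture from the paper.

```python
# Minimal sketch (not the paper's architecture) of a gated belief state:
# fuse proprioceptive and exteroceptive features, with a learned gate that
# suppresses exteroception when it appears unreliable. All names and sizes
# here are illustrative assumptions.
import torch
import torch.nn as nn

class BeliefStateFusion(nn.Module):
    def __init__(self, proprio_dim=48, extero_dim=96, latent_dim=64):
        super().__init__()
        self.proprio_encoder = nn.Sequential(
            nn.Linear(proprio_dim, latent_dim), nn.ELU())
        self.extero_encoder = nn.Sequential(
            nn.Linear(extero_dim, latent_dim), nn.ELU())
        # The gate sees both modalities (e.g., noisy height scans plus foot
        # contacts) and outputs, per latent feature, how much to trust
        # exteroception, in [0, 1].
        self.gate = nn.Sequential(
            nn.Linear(proprio_dim + extero_dim, latent_dim), nn.Sigmoid())

    def forward(self, proprio, extero):
        g = self.gate(torch.cat([proprio, extero], dim=-1))
        # With g near 0 the belief state reduces to proprioception alone.
        return self.proprio_encoder(proprio) + g * self.extero_encoder(extero)

fusion = BeliefStateFusion()
belief = fusion(torch.randn(1, 48), torch.randn(1, 96))
print(belief.shape)  # torch.Size([1, 64])
```

In a full pipeline a gate like this would be trained end to end with the locomotion policy; the point of the sketch is that untrusted exteroception gets attenuated rather than discarded outright.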
The embedded video shows ANYmal in action making use of the integrated belief state. The belief state is implemented with a neural network, trained by having a "student policy" attempt to follow the actions of a teacher policy.
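Here is a hedged sketch of that teacher-student training scheme, often called privileged learning: in simulation a teacher policy sees ground-truth information, and the student, which sees only realistic noisy onboard sensing, is trained to reproduce the teacher's actions. The observation dimensions, network sizes, and the pretrained teacher are stand-in assumptions, not the paper's code.

```python
# Sketch of teacher-student (privileged) training: the student imitates a
# teacher that had access to privileged simulator state. Shapes and networks
# are hypothetical stand-ins.
import torch
import torch.nn as nn

teacher = nn.Sequential(nn.Linear(160, 128), nn.ELU(), nn.Linear(128, 12))
student = nn.Sequential(nn.Linear(144, 128), nn.ELU(), nn.Linear(128, 12))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

for step in range(1000):
    # Stand-ins for simulator outputs: privileged state (true terrain
    # geometry, contact states) for the teacher; noisy onboard sensing
    # for the student.
    privileged_obs = torch.randn(64, 160)
    noisy_obs = torch.randn(64, 144)

    with torch.no_grad():
        # In practice the teacher would be pretrained (e.g., with RL);
        # here it is assumed given and held fixed.
        target_actions = teacher(privileged_obs)

    # The student learns to match the teacher's actions from realistic input.
    loss = nn.functional.mse_loss(student(noisy_obs), target_actions)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```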
Read the rest from IEEE Spectrum: Legged Robots Learn to Hike Harsh Terrain
Check the #penny4thoughts tag to find other active conversations.
###### This markdown/html was auto-generated by the Java Steem Links Creator prototype application.