Early September will see the very first force-feedback-based teleoperation of a rover-based robotic arm system on Earth from the International Space Station, orbiting 400 km above our heads. Danish ESA astronaut Andreas Mogensen will take control of the Interact Centaur rover, which incorporates a pair of arms to perform precision operations. In the process, Andreas will make use of haptic control, which provides force feedback so that he can feel for himself when the robotic arms encounter resistance. In this way he can perform dexterous mechanical assembly tasks in the sub-millimetre range, remote-controlled from space.
Astronaut Andreas to try sub-millimetre precision task on Earth from orbit
“When we humans have to perform precision operations, for instance simply inserting a key into the lock of a door, we rely largely on the tactile and force receptors in our hands and arms – not on eyesight,” states André Schiele, principal investigator of the experiment, head of ESA’s Telerobotics and Haptics Laboratory and Associate of the Delft Robotics Institute.
“Visual information is of minor importance – these kinds of tasks can be done with our eyes closed. Now ESA is transferring this skill to remotely controllable robotic systems.
“Without haptic feedback, the operator of a robotic arm or rover must be very careful not to damage anything whenever the robot is in contact with its environment. As a result, a simple task in space often takes a very long time.
“Moreover, the tactile sensation derived from any task contains important information about the geometric relationship of the objects involved, and therefore allows tasks to be carried out more intuitively and thus significantly faster.”
The Lab team, working with students from Delft University of Technology, has developed a dedicated rover called Interact Centaur. The 4×4 wheeled vehicle combines a camera head on a neck system, a pair of highly advanced force-sensitive robotic arms designed for remote force-feedback operation, and a number of proximity and localisation sensors.
As currently scheduled, Monday 7 September should see the Interact rover driven around the grounds of ESA’s ESTEC technical centre in Noordwijk, the Netherlands, from the extremely remote location of Earth orbit, 400 km up.
Signals between the crew and the robot must travel a total distance of approximately 90 000 km, via a satellite constellation in geostationary orbit. Despite this distance, Andreas will feel exactly what the robot does on the ground – with only a very slight lag.
Andreas, due to launch to the ISS on 2 September, will first guide the robot to locate an ‘operations task board’ and then attempt to remove a metal pin and plug it into the board – a very tight mechanical fit with a tolerance of only about 150 micrometres, less than a sixth of a millimetre.
As André explains: “The task is very difficult with visual information alone but should be easy if force-feedback information tells you intuitively when the pin hits the board, or how it is misaligned.
“The rover, its robotic arms and the control methods involved are highly advanced. While robot arms are normally very rigid, Interact’s arms can be programmed to be soft and flexing, in order to comply in a controlled way with any active or passive environment.
“When they hit an object, the arms can flex in a similar manner to human arms and can provide the operator with force feedback to let them know the robot has encountered an obstacle but not damaged anything.
“The astronaut will use this force information to plan his following motions, as if he were there doing the task himself with his own arms and hands. This helps make remote robotic operation very intuitive, allowing operations to take place across very long distances, up to places that are 450 000 km apart.”
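As a rough illustration of the compliant behaviour André describes, the sketch below shows a simple Cartesian spring-damper (impedance) law in Python. It is not ESA’s actual controller: the function, gain values and example numbers are illustrative assumptions, chosen only to show how low stiffness lets an arm flex on contact while the measured contact force is echoed back to the operator.

```python
import numpy as np

def impedance_command(x_des, x, v_des, v, f_ext, k=300.0, d=30.0):
    """Illustrative Cartesian spring-damper (impedance) law.

    x_des, x : desired and measured end-effector positions (m)
    v_des, v : desired and measured velocities (m/s)
    f_ext    : external force measured at the wrist (N)
    k, d     : stiffness (N/m) and damping (N*s/m); modest values
               keep the arm 'soft and flexing' on contact.
    Returns the force commanded to the arm and the force echoed
    back to the operator's haptic device.
    """
    f_cmd = k * (x_des - x) + d * (v_des - v)  # compliant tracking
    f_feedback = f_ext                          # felt by the operator
    return f_cmd, f_feedback

# Example: an obstacle pushes the arm 5 mm off its setpoint.
f_cmd, f_fb = impedance_command(
    x_des=np.array([0.40, 0.0, 0.20]),
    x=np.array([0.395, 0.0, 0.20]),
    v_des=np.zeros(3), v=np.zeros(3),
    f_ext=np.array([-1.5, 0.0, 0.0]))
print(f_cmd, f_fb)  # gentle corrective force; contact force relayed to the operator
```

Lowering the stiffness k makes the arm yield more readily when it meets an obstacle – the ‘soft and flexing’ behaviour described above – while the relayed contact force is what lets the operator sense the collision.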
The signals between astronaut and rover during the experiment must travel via a system of geostationary satellites, covering a distance of nearly 90 000 km. The resulting two-way time delay approaches one second.
“Although ESA has developed smart software and control methods that enable astronauts to work even with longer time delays, research suggests that people can satisfactorily handle delays of only up to three seconds in hand-eye coordination tasks,” André adds. “That would still allow haptic control of rovers and robotic arms as far away as the Moon’s surface.”
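Taking the figures quoted above at face value, a back-of-the-envelope light-travel-time check (a sketch assuming the roughly 90 000 km refers to the one-way relay path, not ESA’s actual link budget) shows why the one-second and three-second numbers fit together:

```latex
% Round-trip light time via the geostationary relay (one-way path ~90 000 km assumed)
t_{\mathrm{GEO}} \approx \frac{2 \times 90\,000\ \mathrm{km}}{3 \times 10^{5}\ \mathrm{km/s}} \approx 0.6\ \mathrm{s}

% Earth--Moon round trip at a mean distance of about 384 400 km
t_{\mathrm{Moon}} \approx \frac{2 \times 384\,400\ \mathrm{km}}{3 \times 10^{5}\ \mathrm{km/s}} \approx 2.6\ \mathrm{s} < 3\ \mathrm{s}
```

The remaining fraction of a second in the ISS case would come from relay switching and onboard and ground processing rather than light travel alone, while the lunar round trip sits comfortably under the three-second threshold André cites.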
Remote-controlled rovers are very useful in any dangerous or inaccessible environment, not only in space. On Earth, the technology can enable dexterous intervention and operations in Arctic conditions, the deep sea or at nuclear disaster sites.
The Interact experiment represents a first step towards developing robots that provide their operators with much wider sensory input than what is currently available. In this way, ESA is literally ‘extending human reach’ down to Earth from space.
“For space, being able to transmit the sense of touch out to where humans cannot go is a mission-enabling technology,” says André. “On the far side of the Moon, for instance, such robotic systems could install telescopes and prepare a human base for a long-term scientific outpost.”
“Any assembly task required for connecting modules or electrical power lines, or for placing systems into the lunar surface regolith, would benefit from force-feedback telepresence. Operators could do the work from the comfort of Earth, or else perform similar operations on Mars while remaining safely in orbit around that planet.”
The date of the activity depends on the overall ISS schedule and that of the iriss mission, and has been allocated to take place between 6 and 9 September.