It has been suggested that an advance group of robots is needed if humans are ever to settle on other planets. These robots must be robust, adaptable, and recyclable to survive in the inhospitable cosmic climates that await them.
In collaboration with robotics and computer scientists, my team and I have been working on just such a set of robots. Our robots are 3D printed and assembled autonomously, and they evolve continuously so they can adapt quickly to the conditions in which they find themselves.
Our work represents the latest advance towards autonomous robotic ecosystems that could help build the future homes of humanity, far from Earth and far from human oversight.
Robots have come a long way since our first awkward forays into artificial motion many decades ago. Companies like Boston Dynamics today make highly efficient robots that load trucks, build pallets and move boxes around factories, doing tasks that you might think only humans can do.
Despite these advances, the development of robots for unknown or inhospitable environments – such as exoplanets or deep-sea trenches – still represents a significant challenge for scientists and engineers. What shape and size should the ideal robot in the cosmos have? Should it crawl or run? What tools does it need to manipulate its environment – and how will it withstand extreme pressure, temperature and chemical corrosion?
While this may seem an impossible mental exercise for humans, nature has already solved the problem. Darwinian evolution has produced millions of species that are perfectly adapted to their environment. And although biological evolution takes millions of years, artificial evolution – the modeling of evolutionary processes in a computer – can take place in hours or even minutes. Computer scientists have been harnessing its power for decades, producing designs, from gas nozzles to satellite antennas, that are ideally suited to their function.
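The evolutionary loop alluded to above can be sketched in a few lines of code. This is a toy illustration of selection, variation and survival acting on lists of numbers, not anything from a real robotics system; the target values and all parameters are invented.

```python
import random

random.seed(0)  # make the run reproducible

# Invented "ideal design" the population should converge towards.
TARGET = [3.0, 1.0, 4.0, 1.0, 5.0]

def fitness(design):
    # Higher is better: negative squared error against the target.
    return -sum((d - t) ** 2 for d, t in zip(design, TARGET))

def mutate(design, rate=0.3, scale=0.5):
    # Randomly nudge some genes with Gaussian noise.
    return [d + random.gauss(0, scale) if random.random() < rate else d
            for d in design]

def evolve(pop_size=30, generations=200):
    # Start from random candidate designs.
    population = [[random.uniform(-5, 5) for _ in TARGET]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]         # selection
        children = [mutate(random.choice(parents))    # variation
                    for _ in range(pop_size - len(parents))]
        population = parents + children               # survival
    return max(population, key=fitness)

best = evolve()
```

After a few hundred generations the best individual sits close to the target: the same select-vary-survive cycle, applied to robot bodies and brains instead of number lists, is what drives artificial evolution.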
However, the artificial evolution of moving physical objects still requires a great deal of human oversight and a tight feedback loop between robot and human. If artificial evolution is to design useful robots for exoplanetary exploration, we need to take humans out of the loop. In essence, evolved robot designs must be manufactured, assembled and tested autonomously, free from human supervision.
Any evolved robot must be able to sense its surroundings and have various means of locomotion – for example wheels, articulated legs, or a mixture of both. And to close the inevitable “reality gap” that opens when a design is translated from software to hardware, it is desirable that at least some of the evolution take place in hardware – within an ecosystem of robots evolving in real time and real space.
The Autonomous Robot Evolution (ARE) project is addressing just that, bringing together scientists and engineers from four universities in an ambitious four-year project to develop this radical new technology.
As shown above, robots are “born” by means of 3D printing. We are using a new kind of hybrid hardware-software evolutionary architecture for design, which means that every physical robot has a digital clone. Physical robots are tested for performance in real-world environments, while their digital clones enter a software program in which they undergo rapid simulated evolution. This hybrid system introduces a new kind of evolution: new generations can arise from a union of the most successful traits of a virtual “mother” and a physical “father”.
“Child” robots produced by our hybrid evolution are not only rendered in our simulator but also 3D printed and introduced into a real-world, crèche-like environment. The most successful individuals in this physical training center have their “genetic code” selected for reproduction and improvement in future generations, while less “fit” robots can simply be taken away and recycled into new ones as part of an ongoing evolutionary cycle.
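The “virtual mother, physical father” reproduction step can be illustrated schematically. Everything in this sketch (the genome encoding, the component names, the fitness scores) is invented for illustration; real evolved-robot genomes are far richer than a short list of part labels.

```python
import random

random.seed(1)  # make the run reproducible

def crossover(mother_genome, father_genome):
    # Uniform crossover: each gene is taken from either parent at random.
    return [random.choice(pair)
            for pair in zip(mother_genome, father_genome)]

# Hypothetical parents: one scored in simulation, one in the physical crèche.
virtual_mother = {"genome": ["wheel", "long_leg", "ir_sensor", "small_brain"],
                  "sim_fitness": 0.82}    # fitness from simulated evolution
physical_father = {"genome": ["leg", "short_leg", "camera", "large_brain"],
                   "real_fitness": 0.74}  # fitness from real-world testing

child_genome = crossover(virtual_mother["genome"],
                         physical_father["genome"])
# The child would then be both simulated and 3D printed for physical testing.
```

The point of the hybrid scheme is that each child inherits traits validated in two different worlds: fast, cheap simulation and slow, truthful physical reality.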
Two years into the project, significant progress has been made. From a scientific point of view, we have developed new artificial-evolution algorithms that have spawned a multitude of robots that can drive or crawl and learn to navigate complex mazes. These algorithms evolve both the robot’s body plan and its brain.
The brain contains a controller that determines how the robot moves, interpreting sensory information from the environment and converting it into motor commands. Once a robot is built, a learning algorithm rapidly refines the child’s brain to compensate for any mismatch between its new body and its inherited brain.
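As a rough illustration of that refinement step, the sketch below treats the brain as a tiny linear controller mapping sensor readings to motor commands, and uses simple hill climbing as a stand-in for the learning algorithm. The task, dimensions and parameters are all invented; real controllers and learners are far more sophisticated.

```python
import random

random.seed(2)  # make the run reproducible

SENSORS, MOTORS = 3, 2  # invented sizes for the sketch

def act(weights, sensor_readings):
    # Each motor command is a weighted sum of the sensor readings.
    return [sum(w * s for w, s in zip(row, sensor_readings))
            for row in weights]

def reward(weights):
    # Stand-in task: drive both motor outputs toward +1 for a fixed input.
    out = act(weights, [0.5, -0.2, 0.9])
    return -sum((o - 1.0) ** 2 for o in out)

def refine(weights, steps=500, scale=0.1):
    # Hill climbing: keep a random tweak only if it scores better.
    best, best_r = weights, reward(weights)
    for _ in range(steps):
        trial = [[w + random.gauss(0, scale) for w in row] for row in best]
        r = reward(trial)
        if r > best_r:
            best, best_r = trial, r
    return best

inherited = [[0.0] * SENSORS for _ in range(MOTORS)]  # "parent" brain
refined = refine(inherited)
```

The inherited brain starts out poorly matched to the task; a few hundred trial-and-error tweaks adapt it, which is the same role the learning algorithm plays when an inherited brain meets a newly printed body.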
From a technical point of view, we developed “RoboFab” to fully automate production. This robotic arm attaches wires, sensors and other “organs” chosen by evolution to the robot’s 3D-printed chassis. We designed these components to allow rapid assembly and to give RoboFab access to a large toolbox of robot limbs and organs.
The first major use case we would like to address is developing robots to clean up legacy waste in nuclear reactors – the kind depicted in the TV miniseries Chernobyl. Using humans for this job is both dangerous and expensive, and the necessary robotic solutions have yet to be developed.
Looking ahead, the long-term vision is to develop the technology far enough to enable entire autonomous robotic ecosystems that live and work in challenging, dynamic environments for long periods without direct human supervision.
In this radically new paradigm, robots are bred and born rather than designed and manufactured. Such robots will fundamentally change our concept of machines, presenting a new breed that can change its shape and behavior over time – just like us.
This article by Emma Hart, Chair of Natural Computation at Edinburgh Napier University, is republished from The Conversation under a Creative Commons license. Read the original article.