In 2020, scientists made global headlines by creating “xenobots” – tiny “programmable” living things made of several thousand frog stem cells.
These pioneering xenobots could move around in fluids, and scientists claimed they could be useful for monitoring radioactivity, pollutants, drugs or diseases. Early xenobots survived for up to ten days.
A second wave of xenobots, created in early 2021, showed unexpected new properties, including self-healing and a longer lifespan. They also showed a capacity to cooperate, for example by massing together into swarms.
Last week, the same team of biology, robotics and computer scientists unveiled a new kind of xenobot. Like previous xenobots, they were created using artificial intelligence to virtually test billions of prototypes, sidestepping the lengthy trial-and-error process in the lab. But the latest xenobots have a crucial difference: this time, they can self-replicate.
Hang on, what? They can self-replicate?!
The new xenobots are a bit like Pac-Man – as they swim around they can gobble up other frog stem cells and assemble new xenobots just like themselves. They can sustain this process for several generations.
But they don’t reproduce in the traditional biological sense. Instead, they gather loose frog stem cells and fashion them into the right shape using their “mouths”. In an odd coincidence, the recently extinct Australian gastric-brooding frog uniquely gave birth to its young through its mouth.
The latest advance brings scientists a step closer to creating organisms that can self-replicate indefinitely. Is this as much of a Pandora’s Box as it sounds?
Conceptually, human-designed self-replication is not new. The influential mathematician John von Neumann described “self-reproducing automata” decades ago, in work published posthumously in 1966.
Famously, Eric Drexler, the US engineer credited with founding the field of “nanotechnology”, referred to the potential of “grey goo” in his 1986 book Engines of Creation. He envisaged nanobots that replicated incessantly and devoured their surroundings, transforming everything into a sludge made of themselves.
Although Drexler subsequently regretted coining the term, his thought experiment has frequently been used to warn about the risks of developing new biological matter.
In 2002, without the help of AI, an artificial polio virus created from tailor-made DNA sequences proved capable of self-replication. Although the synthetic virus was confined to a lab, it was able to infect and kill mice.
Possibilities and benefits
The researchers who created the new xenobots say their main value is in demonstrating advances in biology, AI and robotics.
Future robots made from organic materials might be more eco-friendly, because they could be designed to decompose rather than persist. They might help address health problems in humans, animals and the environment. They might contribute to regenerative medicine or cancer therapy.
Xenobots could also inspire art and new perspectives on life. Strangely, xenobot “offspring” are made in their parents’ image, but are not made of or from them. As such, they replicate without truly reproducing in the biological sense.
Perhaps there are alien life forms that assemble their “children” from objects in the world around them, rather than from their own bodies?
What are the risks?
It might be natural to have instinctive reservations about xenobot research. One xenobot researcher said there is a “moral imperative” to study these self-replicating systems, yet the research team also recognises legal and ethical concerns with their work.
Centuries ago, English philosopher Francis Bacon raised the idea that some research is too dangerous to do. While we don’t believe that’s the case for current xenobots, it may be so for future developments.
Any hostile use of xenobots, or the use of AI to design DNA sequences that would give rise to deliberately dangerous synthetic organisms, is banned under the United Nations’ Biological Weapons Convention, the 1925 Geneva Protocol and the Chemical Weapons Convention.
However, the use of these creations outside of warfare is less clearly regulated.
The interdisciplinary nature of these advances, including AI, robotics and biology, makes them hard to regulate. But it is still important to consider potentially dangerous uses.
There is a useful precedent here. In 2017, the US National Academies of Sciences and Medicine published a joint report on the burgeoning science of human genome editing.
It outlined conditions under which scientists should be allowed to edit human genes in ways that allow the changes to be passed on to subsequent generations. It advised this work should be limited to “compelling purposes of treating or preventing serious disease or disability”, and even then only with stringent oversight.
Both the United States and the United Kingdom now allow human gene editing under specific circumstances. But creating new organisms that could perpetuate themselves was far beyond the scope of that report.
Looking into the future
Although xenobots are not currently made from human embryos or human stem cells, it is conceivable they could be. Their creation raises similar questions about making and modifying self-perpetuating life forms, and those questions call for regulation.
At present, xenobots do not live long and only replicate for a few generations. Still, as the researchers say, living matter can behave in unforeseen ways, and these will not necessarily be benign.
We should also consider potential impacts on the non-human world. Human, animal and environmental health are intimately linked, and organisms introduced by humans can wreak inadvertent havoc on ecosystems.
What limits should we place on science to avoid a real-life “grey goo” scenario? It’s too early to be completely prescriptive. But regulators, scientists and society should carefully weigh up the risks and rewards.
- is a Senior Research Fellow in Digital Ethics, Centre for AI and Digital Ethics, School of Computing and Information Systems, The University of Melbourne
- is an Honorary Senior Fellow, Department of War Studies, King’s College London
- This article first appeared on The Conversation