
In 1818, Mary Shelley invented a technology that has been used for good and ill ever since. It’s called science fiction.
You may not think of a literary genre as a technology, but science fiction stories have long been tools for anticipating and critiquing science. Shelley’s Frankenstein, considered by many to be the first science fiction novel, was powerful enough to be banned in South Africa in 1955. It established the formula, with a story that still serves as a warning about unintended consequences today.
The exact science used by the eponymous Victor Frankenstein to animate his creation is not, as far as we know, possible. But today’s researchers are able to bring dead human brains back to something resembling life. Experiments are under way to test the effects of treatments for conditions such as Alzheimer’s disease on cellular activity (but notably not consciousness) after death (see “Radical treatments that bring people back from the brink of death”).
It’s hard not to think of the many sci-fi stories about similar scenarios and imagine what might happen next. The same goes for the work reported in “AI simulations of 1000 people accurately replicate their behavior”, in which researchers are using the technology behind ChatGPT to reproduce the thoughts and behaviors of specific individuals, with surprising success.
In both cases, the teams behind this research, which blurs the lines between reality, fiction and what it means to be human, are acutely aware of the ethical concerns it raises, taking great care over what they do and making the details public from the outset. But now that the technology has been shown to work, there is nothing to stop more nefarious groups from attempting the same thing, unsupervised and with the potential to do great harm.
Does this mean the research should be banned, as Shelley’s book once was, for fear of it falling into the wrong hands? Far from it. Concerns about the technology are better addressed through appropriate, evidence-based regulation and swift penalties for transgressions. If we simply prohibit it, we lose not only the technology, but also the opportunity to critique and debate it.