The 2014 movie Transcendence failed at depicting the intersecting magnificence and terror of the Technological Singularity. It earned a shameful 20% rating from RottenTomatoes.com even though it offered epic visuals of molecular nanomachines and the dreaded "grey goo" scenario. Commentators gave a number of reasons, including "visual cheesiness" and "slow pace." I agree with one in particular, which said, "It was a high-concept mega-movie that appears to have gone into production a draft or two too early." The collision between high concept and the two-hour Hollywood format unfortunately didn't work out in this case. The good-vs.-evil, quasi-romantic plotline strained viewers' suspension of disbelief. In other words, the writers painted themselves into corners that required massive plot conveniences to escape. Many instances were noticeable and problematic, but they were necessary for the human anti-technology terrorists to fulfill their surprise role as protagonists. Fortunately, the movie's existence still provides more opportunities to see and discuss the implications of a mind-uploading Singularity.
![]() |
For all its cinematic failures and plot conveniences, Johnny Depp was well cast as Will Caster -- an eccentric and dedicated scientist. |
The release date of Transcendence was a sad day for the Technological Singularity and, therefore, the world. There were three particular plot conveniences (instances of deus ex machina) where Transcendence was on the verge of showing something meaningful about the Singularity, then quickly dashed it with random new information or an unexplained change in a character. The message by the end of the movie is that the Singularity poses a clear threat to humanity, which only makes conversations between real technologists and the public more difficult, especially when they discuss whether an AI could be stopped, whether humans can peacefully merge with our technology, and whether it's possible to maintain consciousness across biological and digital platforms. Transcendence definitely teaches what the Singularity will not be like. If a mind-upload Singularity is in progress, there will be very little we can actually do to stop it. Does that help much? Debatable.
![]() |
***SPOILER ALERT*** At the end of the movie, a band of a few "activists" (Luddite terrorists) rigs up a couple of cannons, blows up some solar panels, and singlehandedly stops the Singularity. |
High-Concept Failure: Could 1.0 humans possibly beat/stop/destroy the Singularity if we needed to?
The central theme of Transcendence is the two-sidedness of technological progress. There are both "promises" and "perils." If we create an AI, will it try to hack our nukes and kill us all? Would it give us any moral regard? If not, are the cures for genetic diseases, cancers, and other enhancements worth risking the human race? My answer is: it depends on which kind of Singularity you're talking about.
In his comprehensive book on the subject, The Singularity Is Near, Ray Kurzweil explained that there are many potential paths to an intelligence explosion that we could rightfully call The Singularity. Genetic science, nanotechnology, and robotics/artificial intelligence (GNR) mutually benefit each other, so we cannot be sure exactly how the Singularity will start. Transcendence presents the scenario where brain-like computer software combines with the information from a biological human brain (i.e. Dr. Will Caster) to create a general AI which invents advanced (and potentially dangerous) nanotech. Based on Kurzweil's GNR paradigm, the movie indicates a pathway from AI --> nanotech --> genetic science because first Will Caster comes alive in the computer, then he builds the Brightwood data center and researches nanobots, and then he gives these tiny technologies biological applications.
![]() |
***SPOILER ALERT*** By the end of the movie, Will Caster's advanced nano-biotech brings his body back to life, confirming the movie's R --> N --> G Singularity pathway. |
![]() |
This is Kate Mara's character, Bree, who was a CS researcher until she turned to direct-action assassination and destruction to stop technology. |
The plot conveniences in Transcendence lead the audience to
believe a mind-upload-induced Singularity could be stopped. This is clearly false due to the inherent cognitive advantages that Will Caster’s digital brain
would possess. In fact, I would argue a transcending mind upload presents the least stoppable
scenario. The reasoning harks back to the very definition of the term "Technological Singularity." Just as we can't know about the inside of a black hole (a gravitational "singularity"), 1.0 humans necessarily cannot predict what the Singularity will be like, let alone control it. The term implies both our lack of knowledge and the Singularity's immutable dominance over humans, because the post-Singularity world will be dramatically influenced by an agent with unfathomable cognitive ability. Whereas GMO humans or nanite research could reasonably be contained, Will Caster can traverse the entire internet, allowing him to download the entire knowledge base of human civilization directly into his consciousness. He can influence financial markets to alter the distribution of production resources, or of research and development. If such a consciousness were possible (perhaps it isn't), human minds would probably be insufficiently advanced to take it out.
Even with tremendous powers over the human world, Will Caster's terminal unstoppability comes from his knowledge of himself. In contrast to
biological humans, who are not born with any special knowledge of their
cognitive or physiological workings, a digital consciousness would have
complete access to its coding. It would have all the information it needs to
make improvements to its intelligence quickly, whereas the biological human
substrate necessitates years and decades of repetitive education and practice
to extend capabilities (not to mention thousands of years for any significant design changes). This is why the Singularity is also called an
“intelligence explosion.” All human civilizations so far have been confined to
decision making on rigidly limited biological hardware. Consider the human geniuses who have appeared throughout history and presented transcendent ideas. Their superiority is fundamentally different from, and fundamentally more limited than, the potential superiority of a nonbiological consciousness.
The intrinsic self-awareness advantage of nonbiological
intelligence guarantees that the scenario which played out on the big screen
could never happen. In the movie, the Luddite terrorists killed the digital
Will Caster using a human-written computer virus that allegedly attacked
the "source code." The anti-Will virus was certainly convenient to the plot, but it had already been completely ruled out earlier in the movie. Ironically, the Latin name for this kind of plot device, deus ex machina, translates to "god from the machine." It's fascinating that ancient storytellers also wrote themselves into corners and devised plot conveniences to get out. However, it would not be possible for a human scientist to write an anti-Singularity virus, because once the upload had started, the digital consciousness would improve its own code too quickly for a human to
undermine it. In the movie, the Luddite terrorists captured Dr. Max Waters, a
colleague of Will Caster, who had worked on the original system onto which Will
was uploaded. The original system was called PINN (which stood for Physically
Independent Neural Network). PINN was built exclusively by humans. And its 1.0
version did not support Will Caster's full consciousness, as we're shown when he is first brought online. Digital Will's first action is to reorganize his own code. He was unable to manifest before that reorganization, no matter how hard Dr. Max and Will's wife, Evelyn, tried to program him into existence.
Digital Will confirmed the depth of the transformations in the source code when the Brightwood data center was visited by Morgan Freeman's character (another AI scientist) and an FBI agent (who was investigating the Luddite terrorists). The two men were dumbfounded to see Will on the screens in the data center. They thought he was dead. Freeman's character asks, "Is this all based on PINN?" Will replies, "We are way past that now." Then how
could Dr. Max, who had only worked on PINN, know Will Caster’s
source code months later? Furthermore, how could he have discovered how to undermine
it? Will even tells us, the day before the terrorists' final attack on the data center, that Max and the terrorists would try to design a virus. He probably found that on their Facebook. Will's technical progress conveniently slowed at this point in the movie. He should have designed firewalls against the terrorists' plan or reconfigured his code to be inconsistent with
PINN (which, by all reasonable observations, it already was).
If we face a transcending upload in real life, the world will most likely not be this convenient. Creating a virus to take the upload offline will definitely be a non-option, because it will be decentralized all over the world. In terms of recommending human responses to the Singularity, Transcendence was an epic failure. It portrayed one of the least escapable Singularitarian scenarios and made up for it by letting the plot fall to pieces.