See Part 1 of our Speculative Fiction Crash Course series HERE
On August 21st, 1835, a strange headline appeared on the second page of The New York Sun. “CELESTIAL DISCOVERIES,” it announced in bold letters. “We have just learned from an eminent publisher in [Edinburgh, Scotland] that Sir John Herschel, at the Cape of Good Hope, has made some astronomical discoveries of the most wonderful description, by means of an immense telescope of an entirely new principle.” The newspaper promised to cover the story as it developed, and sure enough, on August 25th, the front page of the Sun proudly announced what it had hinted at four days earlier: famed English astronomer John Herschel had discovered life on the Moon.
And not just any life, either–complex, sentient life. The Sun described enormous red poppies blooming from crags of black rock, herds of miniature bison roaming through forests, goatlike creatures with midnight blue hides, bipedal beavers that lived in huts and built campfires, and most notably, winged orangutan-like humanoids that Herschel termed “man-bats.” The newspaper would reveal these discoveries over the course of the next five days, describing each one in lavish detail. It cited excerpts from the Edinburgh Journal of Science, explaining that these miraculous discoveries had been made possible by the creation of a new kind of telescope, some six times larger in diameter than the largest one built to date. Tragically, though, just after capturing these discoveries, the telescope was destroyed in a fire, making it impossible for any other scientists to observe what Herschel’s team had seen.
The story caused a sensation in New York City. The Sun began printing illustrated pamphlets dedicated solely to the Moon discoveries, and sold 60,000 copies within the month. Other publications, desperate to keep up, began printing the story as well, until most newspapers in New York (and many in other major cities across the globe) were carrying it. The Bowery Theatre put on a play based on Herschel’s discoveries. The Hannington Museum built dioramas depicting blue goats and lunar bison. A debate began among some religious sects about whether the church should plan for eventual trips to the Moon to convert the beaver- and bat-people to Christianity. It was an event with implications for science and humanity unlike any that had been seen before.
There was just one small problem: the Edinburgh Journal of Science had gone defunct two years prior. And the author of the report, Dr. Andrew Grant, didn’t exist. And there are no beaver people or man-bats on the Moon.
The real author of the story, editor Richard A. Locke, admitted the piece was a hoax a little over ten years later. But in creating his fabrication, Locke had ironically revealed an interesting truth: the literary device of fictional technology can be a believable gateway to imaginary worlds. And though it was intended to seem truthful, the “Great Moon Hoax” in many ways ended up being an early example of what we would one day call “science fiction.”
WHAT IS SCIENCE FICTION?
The definition of science fiction is a question so hotly debated that there exists an entire Wikipedia page dedicated solely to the topic. Rod Serling, host of TV’s The Twilight Zone, once famously described it as “the improbable made possible,” as opposed to fantasy, which was “the impossible made probable.” Robert Heinlein, who popularized the term “speculative fiction,” described sci-fi, in part, as a fantastical story in which “no established fact [is] violated,” adding that “when the story requires that a theory contrary to present accepted theory be used, the new theory should be rendered reasonably plausible and it must include and explain established facts as satisfactorily as the one the author saw fit to junk.” Author Isaac Asimov defined the genre simply as “that branch of literature which deals with the reaction of human beings to changes in science and technology.” And according to the Merriam-Webster Dictionary, science fiction is “fiction dealing principally with the impact of actual or imagined science on society or individuals.”
Probably the simplest way to describe science fiction is as a genre of speculative fiction where fantastical things occur as a result of scientific phenomena, rather than supernatural or magical means. In other words, the laws of sci-fi universes may not be exactly the same as our own, but the fantastical things depicted in them are explained in terms at least ostensibly connected to our own understandings of physics, chemistry, biology, and so on. The degree to which a work of sci-fi adheres to this standard is usually measured on a spectrum of “hard” to “soft” sci-fi, with “hard” sci-fi being based more on real-world scientific principles (like artificial intelligence, robotics, or space travel), and “soft” sci-fi having more flexible and/or imaginary rules (like mega-proton-gamma-blaster rays, time travel portals, or aliens from the planet Zorgon). These subcategories are often difficult to apply retroactively–what seemed plausible based on scientific knowledge at one time could be proven impossible just a few short years later–but they’re still frequently used to categorize sci-fi texts, and to differentiate pure science fiction from science fantasy (more on that later).
The proximity of sci-fi to reality also makes it a popular vehicle for commenting on real-life issues. Mary Shelley’s Frankenstein, for instance, addressed the ramifications of science manipulating human life. Ray Bradbury’s Fahrenheit 451 explored mass media’s potential to brainwash societies. And Stanley Kubrick’s 2001: A Space Odyssey expressed anxieties over the implications of artificial intelligence. While a work doesn’t need to be political to be a piece of science fiction, elements of social critique do play a large part in the shaping and defining of the genre, as we will soon see.
PRECURSORS TO MODERN SCIENCE FICTION
The debate about the origin of science fiction is contingent on one crucial question: when did humans start thinking of the world “scientifically?” It’s clear that early human civilizations were capable of conceiving of tools, structures, and technologies that had not been invented yet (that is, after all, how things usually get invented), but the origin of the kind of “imaginary” or “hypothetical” technology that defines science fiction is a bit harder to pin down. Generally, in early texts where characters can do things like travel through time or visit different worlds, they do so with the use of some form of magic, rather than with man-made technology. Take, for instance, the Vimana flying machines in ancient Indian literature, or the interplanetary travel in Lucian of Samosata’s 2nd Century Greek text A True Story (which happens via whirlwind rather than spacecraft).
But the idea of true imaginary technology–purely fictional inventions created to explore a hypothetical problem or phenomenon–did arise quite early on in recorded history. The Liezi, a Chinese Daoist text traditionally dated to around 400 BCE, includes a story about an inventor who creates an automaton indistinguishable from a real human being. When he presents it before the royal court, the king demands it be taken apart piece by piece, and finds that every component operates mechanically. “Is it then possible,” the king exclaims, “for human skill to achieve as much as the Creator?” In many ways, this question, even more so than the story’s early depiction of robotics and artificial intelligence, shows the first glimmerings of the ideas and concepts that would one day evolve into science fiction.
Unsurprisingly, early precursors to sci-fi flourished during the scientific and cultural renaissance of the Islamic Golden Age. The One Thousand and One Nights, a text we’ve already cited as influential to modern science fiction, is one famous example, featuring stories like The Ebony Horse, in which a mechanical horse can fly its rider wherever he wishes at fantastic speed. Another is the novel Theologus Autodidactus by Syrian philosopher Ibn al-Nafis. In the book, al-Nafis dedicates two full chapters to proposing scientific explanations for supernatural and divine phenomena like resurrection and the human soul, based largely on his real-life studies of the circulatory system. And although his theories are not themselves meant to be fictional, they nonetheless show the ways in which contemporary philosophy was beginning to think of technology and science as a means to explain the seemingly supernatural.
This sort of shift wouldn’t arrive in Europe until several centuries later, with the advent of the Renaissance (warning: warp-speed recap of Early Modern European history incoming). The invention of the printing press around 1440 drastically increased literacy rates across the continent, making both scientific and literary texts more accessible to the citizens of Europe than ever before. With this came broader dissemination of new ideas and discoveries, including Nicolaus Copernicus’s heliocentric model of the solar system (later famously championed by Galileo Galilei), whose publication is often said to mark the beginning of the “Scientific Revolution” of the late Renaissance. The Scientific Revolution also introduced Francis Bacon’s scientific method, a systematic approach to observation and experimentation, and Isaac Newton’s Principia Mathematica, which introduced concepts like the three laws of motion and the law of universal gravitation. This latter work is often credited with starting yet another stage in Europe’s transition into the modern age, that is, the Age of Enlightenment. As we explored in our last installment, the Enlightenment not only led directly to the industrialization of Europe and North America, but also to a widespread trend in art and literature of questioning dogma and superstition, and championing the power of human ingenuity and reason over imagination.
But with this new fixation on humans’ ability to master the world came questions about our limits in doing so. As the King in the Liezi inquired 2,000 years earlier, was it possible for human ability to rival that of God? And if it was, what did that mean for the world as we knew it?
It’s here that we meet up again with our old friend Mary Godwin, better known as Mary Shelley. Though hardly the first writer to express anxieties about the advancement of science and technology, her 1818 novel Frankenstein was by far the best-known early example of a fictional work exploring them. In recalling her inspiration for writing the novel, Shelley would later remark that she was inspired by the “frightful” ability of “any human endeavor to mock the stupendous mechanism of the Creator of the world.” So, drawing from both a fantastical phenomenon (the resurrection of the dead) and a real-world scientific advancement (the discovery of electrical currents animating dead tissue), Shelley crafted a hypothetical story in which the logical progression of existing technology could achieve what was once limited to myth and fantasy. Or, in other words, a work of science fiction.
THE BEGINNINGS OF MODERN SCI FI
Though there is some debate as to whether Frankenstein truly is the first science fiction novel, it is generally agreed that it is at least a very early example of the genre, and certainly the most popular of its early works. It also demonstrates some of the cultural changes that allowed sci-fi to take root in 19th century literature–namely, the rapid advancements of science and technology that put impossible things, like reanimation of the dead, plausibly within human reach. This radical expansion of the average person’s scientific imagination ultimately became a clear dividing point between science-themed literature before Frankenstein, and after it.
But while technological advancements throughout 19th century Europe brought their own anxieties, they also gave cause for excitement. French novelist Jules Verne made his literary debut in 1863 with the novel Cinq Semaines en Ballon, or Five Weeks in a Balloon, kicking off what would become known as the “Amazing Journeys” series. The Amazing Journeys were largely intended to be educational works–they contained real information about astronomy, geology, oceanography, and other sciences–but they also extrapolated this knowledge to propose new and fantastical possibilities for human achievement. Verne wrote about tunneling to the earth’s core, traveling miles below the surface of the ocean, journeying to the North Pole (a feat that wouldn’t be achieved for nearly half a century after the book’s publication), and perhaps most notably, flying to the surface of the Moon. Verne’s work was significant for two reasons: first, because his stories about space travel would end up influencing actual astronautics, and second, because he is widely considered the first novelist to write entirely within the genre of science fiction.
British author H.G. Wells was also an early pioneer of sci-fi genre writing, but he took a markedly different approach to his subject matter. Where Verne focused on the potential of technology in its own right, Wells used fictionalized technology to illustrate broader points about society and culture. His 1895 novel The Time Machine made little effort to propose a realistic working time travel device (the workings of the titular machine are left almost entirely unexplained), choosing instead to focus on the imaginary worlds the time machine could access, and using them as frameworks to explore real-world social issues like class struggle. His later novel The War of the Worlds continued this trend, using an alien invasion of Earth to comment on British imperialism and colonialism.
The massive popularity of Verne’s and Wells’s work is frequently credited with helping to define science fiction as its own distinct genre, but the differences in their writing styles were equally significant. Verne’s focus on the potential of technological achievement, contrasted with Wells’s focus on social critique, would not only help demarcate the divide between hard and soft sci-fi, it would also foreshadow the tension between hope and anxiety that the scientific advancements of the coming century would bring.
THE EARLY 20TH CENTURY
By the beginning of the 20th century, science fiction had reached widespread popularity not just in England and France, but around the world. The rapid modernization of Japan in the late 19th century had brought an influx of new technology to the country, which, paired with recently translated versions of foreign sci-fi literature like Verne’s Around the World in Eighty Days, laid the groundwork for a blossoming of Japanese sci-fi in the early 1900s. Authors like Shunrō Oshikawa and Jūza Unno pioneered the field, which would come to be known as the kagaku shōsetsu, or “scientific novel.” Sci-fi was also popular on the Indian subcontinent, particularly the writings of Bengali author Jagadish Chandra Bose. And, of course, the genre took the United States by storm, particularly in the form of the early-20th-century “pulp” magazines.
Pulp magazines, so named for the cheap wood pulp paper they were printed on, were the number-one medium for science fiction in the first few decades of the 20th century (It was in one of these magazines, Amazing Stories, that publisher Hugo Gernsback coined the term “scientifiction,” which soon evolved into “science fiction”). Their pages featured stories by authors like Philip Francis Nowlan and Edgar Rice Burroughs, usually depicting the epic adventures of dashing heroes like Buck Rogers and John Carter of Mars. This subgenre, the “space opera,” became perhaps the most defining style of science fiction in the 1930s. Though originally intended as a pejorative (a play on the term “soap opera”), space operas became a beloved form of sci-fi storytelling, translating from pulp magazine pages to radio plays and ultimately even TV. So it’s no surprise that the rise of the space opera is often associated with the beginning of what is now referred to as the Golden Age of science fiction.
It’s the Golden Age of sci-fi that brought us some of the most recognizable tropes of the genre–think rayguns and rocketships, epic interplanetary adventures, and talking robots. Editor John W. Campbell’s magazine Astounding Science Fiction was perhaps the most important platform for this style of sci-fi, featuring writers like Robert Heinlein, Arthur C. Clarke, and Isaac Asimov (the latter of whom was largely responsible for the rise of robots as a fixture of the sci-fi genre). And while the stories Campbell published were often far from great literature, they did bring a more human element to science fiction storytelling, focusing on character development and relationships as much as technology. As a result, the Golden Age usually took a generally optimistic tone toward science and technology, carrying with it a belief in the potential of human achievement and the inevitable triumph of good over evil.
Needless to say, it would not last forever.
WORLD WAR II AND THE ATOMIC AGE
The deployment of the first atomic bomb in 1945 put a sudden end to the uncritically utopian vision of sci-fi’s Golden Age. Though the potential of nuclear power continued to inspire “space-age” fantasies about the future throughout the 1950s (oftentimes as a marketing tactic to sell “futuristic” products like tools and appliances), the horrors that scientific innovation had wrought upon the people of Japan cast a dark shadow over a largely hopeful genre. But rather than making science fiction obsolete, the dawn of the Atomic Age made the genre seem more serious than ever. “Once the horror at Hiroshima took place,” commented Isaac Asimov in 1969, “anyone could see that science fiction writers were not merely dreamers and crackpots after all, and that many of the motifs of that class of literature were now permanently part of the newspaper headlines.”
Writers around the world responded to the potential of the atom in a number of ways. A commonly cited example is the divide between American and Japanese sci-fi in the 1950s, often expressed with the quip, “American radioactivity creates superheroes, and Japanese radioactivity creates monsters.” Though this is of course a simplification–the heroic Astro Boy (aka “The Mighty Atom”) was one of the hallmarks of postwar Japanese sci-fi, and mutant monsters were the bread-and-butter of 1950s American B horror movies–it does highlight the dual nature of the genre throughout the 1950s. Real-life science could send a man to the Moon or level a city, just as fictional science could turn a teenage boy into Spider-Man or summon Godzilla from the depths of the Pacific. The tension between optimism and anxiety was greater even than the divide between Verne and Wells nearly a century before.
It’s no surprise that the 1960s–a decade defined by its blend of optimism and anxiety–would continue this trend, though increasing social and cultural turmoil added a markedly political note to even the fluffiest sci-fi works. The television series Star Trek, which debuted on NBC in 1966, is one such example. Though Trek took place in a relatively utopian future, where advanced technology allowed for intergalactic exploration and diverse groups of humans lived together in harmony, creator Gene Roddenberry also used the series to comment on troubling issues of the 1960s. Racism, fascism, the Vietnam War–all were addressed allegorically (and occasionally hamfistedly) in episodes of Star Trek’s original three-season run. Though the series never reached mainstream success during its initial time on TV, it developed a cult following that would not only form the template for modern fandom culture (and fanfiction), but would lay the groundwork for the wave of avant-garde political sci-fi that was to come.
THE 1960s-1970s AND THE NEW WAVE
The rise of 1960s counterculture–and the “mind expanding” activities that came with it–left an indelible mark on art and literature, with science fiction being no exception. Among the earliest influences on this movement was William S. Burroughs, a member of the “Beat Generation” literary movement along with Jack Kerouac, Neal Cassady, and Allen Ginsberg. Though not strictly a sci-fi writer, Burroughs was noted for combining the postmodern, experimental modes of writing in his psychedelic Nova Trilogy with science fiction tropes like genetic modification and mind control. This new style of sci-fi writing, with its avant-garde conventions, darker tone, and rejection of Golden Age clichés, would become known as “New Wave” science fiction.
The New Wave de-emphasized the hard sci-fi elements of earlier science fiction, preferring instead to use softer sci-fi elements to shed light on human relationships to technology. Author J.G. Ballard, for instance, used grim dystopian settings to reflect upon the toll modernity takes on the human psyche, Philip K. Dick explored how technology can warp our ability to determine what is real and what is artificial, and Frank Herbert, author of the Dune series, focused largely on ecology and human survival in inhospitable future environments. The New Wave was also a time of massive growth for sci-fi that focused on social sciences, including feminist sci-fi. Authors like Margaret Atwood, Ursula K. Le Guin, Joanna Russ, and Octavia Butler (the last of whom is also considered the godmother of the Afrofuturist movement, a subgenre incorporating Black perspectives into science fiction) came onto the scene around this time, with work addressing gender, race, reproductive rights, and more. It’s not a stretch to say that the New Wave movement was what truly legitimized sci-fi as a genre, elevating it from pulp fodder to literature.
Another factor in bringing sci-fi into the mainstream was the rise of the sci-fi blockbuster in the 1970s. Though science fiction films had existed since the dawn of cinema, they were usually pigeonholed in the “B-movie” category, along with cheaply produced monster movies and exploitation films. Stanley Kubrick’s 1968 film 2001: A Space Odyssey is one of the earlier examples of prestige sci-fi cinema, incorporating new technology and visual effects to create a strikingly realistic science fiction universe. There were also more highbrow international sci-fi films released during this time, like Jean-Luc Godard’s Alphaville and Andrei Tarkovsky’s Solaris, though they never had mainstream success or wide international distribution.
Then came Star Wars.
George Lucas’s Star Wars was released at a pivotal time not only for science fiction, but for cinema in general. For years, Hollywood’s #1 moneymaker was the “roadshow picture,” a big-budget film (usually a musical or historical epic) given a limited release to promote hype and justify charging a premium on tickets. But with the massive success of films like Jaws in 1975, attention turned to the action-packed “blockbuster” as the new cash cow for film studios. Star Wars, released in May of 1977, just two years after Jaws, would become one of the earliest of these blockbusters, bringing in $307 million domestic in its original run (equivalent to $1.3 billion in 2020). And, what’s more, it did so as a classic Golden Age sci-fi pastiche–spaceships, rayguns, warrior princesses, and all. It was a success that would usher in a new wave of sci-fi blockbusters in the coming years, starting with Close Encounters of the Third Kind in November of that year, followed by Alien two years later, and E.T. the Extra-Terrestrial three years after that (not to mention both Star Wars sequels, in 1980 and 1983).
But not all sci-fi films of the ‘70s and ‘80s were crowd-pleasers like Star Wars. The 1982 film Blade Runner, an adaptation of Philip K. Dick’s Do Androids Dream of Electric Sheep?, presented a dystopian vision of the future, with an anti-heroic protagonist, a cynical tone, and a focus on the underworld of a high-tech futuristic society. It was far from a box office smash, but it would become known as one of the earliest iterations of the hallmark sci-fi subgenre of the 1980s: cyberpunk.
CYBERPUNK AND THE DIGITAL AGE
Cyberpunk, a term coined by author Bruce Bethke in his 1980 short story of the same name (first published in 1983), was a subgenre that grew largely from the New Wave movement of the 1960s and 70s, especially the works of Philip K. Dick and J.G. Ballard. Like much of New Wave literature, cyberpunk works took place in a dark and dystopian future, but unlike New Wave literature, they focused specifically on the outcasts and rebels that rapidly advancing society had left behind. Bethke himself described the genre as a response to the explosive growth of computer technology at the end of the 20th century, and the implications it would have on “the first generation of teenagers who grew up truly ‘speaking computer’.”
Though cyberpunk had roots reaching back into the late 70s and early 80s, it was William Gibson’s 1984 novel Neuromancer that would bring cyberpunk into sci-fi’s mainstream. Gibson is now widely recognized as the preeminent literary voice of the genre, along with Bruce Sterling, Pat Cadigan, and Richard K. Morgan, among others. But it was in Japan where cyberpunk really took root, particularly in the media of animation and graphic novels. Sci-fi had been a popular genre for anime and manga since the 1960s, but it was the cyberpunk movement in the 1980s and 1990s that not only popularized sci-fi within Japan, but also popularized Japanese animation throughout the rest of the world. Katsuhiro Otomo’s 1988 film Akira, a cyberpunk epic set in “Neo-Tokyo,” was an international hit, and is still consistently rated as one of the greatest animated films of all time. Along with this came Masamune Shirow’s manga Ghost in the Shell, Shinichirō Watanabe’s anime series Cowboy Bebop, and Yukito Kishiro’s manga series Battle Angel Alita, among many others. Japanese cyberpunk made such an impact on 1980s and 1990s global sci-fi that Japanese fashion, architecture, art, and writing have even become a signature aesthetic trope of the genre.
The anxiety about computer technology that gave birth to cyberpunk would be forever changed on August 6th, 1991, when the World Wide Web went live for the first time. The Web offered unfathomable potential for communication and transfer of information, but it also afforded an ominous degree of anonymity, loss of privacy, and, ironically, isolation. It’s no surprise that late- and post-cyberpunk works played upon this dynamic, as in Lilly and Lana Wachowski’s 1999 film The Matrix, which imagines a future where humanity is plugged into a vast computerized false reality from which the computer hacker protagonist is able to break free. The rise of the Web also gave more people an incentive to buy personal computers, a trend which evolved into purchasing laptops, smartphones, tablets, and computerized “smart” objects like refrigerators and TVs. The omnipresence of computer technology–essentially the future that cyberpunk predicted several decades prior–has changed the way humans live, work, and interact so radically that it’s almost impossible for science fiction to keep up.
Now, in the turbulent 2010s and early 2020s, a time when computers and the internet are more deeply interwoven into our daily lives than ever before, sci-fi has reached something of a crossroads. Understandably, much 21st century sci-fi leans to the dystopian side, with shows like Black Mirror depicting technology alienating and controlling humankind. But there’s also a hopeful streak emerging in much contemporary sci-fi. Films like Spike Jonze’s Her show a world where technology is incorporated into human life and can even provide counsel and companionship, rather than estrangement. There’s also been a notable diversifying of authorial perspectives in sci-fi, as with the Afrofuturist renaissance sparked by Ryan Coogler’s film Black Panther. And the environmentalist movement of the late 20th and early 21st century has inspired the emergence of “solarpunk,” a sci-fi subgenre focusing on the potential of futuristic technology to build a sustainable, green society.
As it has been since the time of Shelley, Verne, and Wells, science fiction remains a vehicle for our anxieties about what the future may bring. After all, the further technology advances, the fewer things seem truly out of reach, for better or for worse. But even in times of turmoil, science fiction is also a vehicle for our hopes. As Rod Serling said, sci-fi is the improbable made possible–and whether those possibilities are fearful or thrilling remains up to us.
On that note, what happens when, in Serling’s words, the impossible is made probable? You’ll have to wait for our next installment to find out.