THERE’S ONE THING that becomes clear reading John Wyndham’s apocalyptic novels: the apocalypse is never singular. A catastrophe might devastate a city, maybe a small country; an earthquake or an epidemic might qualify. But the apocalypse bundles together multiple forces and changes everything irrevocably. In Day of the Triffids (1951), it’s not enough to have an alien invasion. The meteors that carry alien plant life to Earth also blind everyone who watches them streak through the atmosphere. Landing on Earth, the aliens find the blinded humans easy prey and wreak havoc on the population. Only those protected from the alien light retain their vision, and it is up to them to rebuild human society on isolated islands cleared of the alien invaders. In The Midwich Cuckoos (2008), it’s not enough that the English village of Midwich is trapped under an invisible dome, an impermeable barrier that traps everyone and everything inside it. The newly born infants in Midwich all resemble each other, preternaturally blond and blue-eyed—and, it seems, capable of telepathy. What could possibly have made such a thing happen? In answering that question, the small village of Midwich—and the world—will never be the same again. Writing in the period after the Second World War, Wyndham captures the ways that a set of unpredictable events overlays already existing conditions to create a situation that changes how society is organized and how people think about the future. Alien invasions don’t occur in a social vacuum; they overlay social unpreparedness—captured in the triffid-induced blindness and the Midwich provinciality. It’s not just that no one sees the apocalypse coming but that the series of catastrophes that makes up an apocalypse is made more potent through this lack of foresight and compounded by a provinciality inspired by modern comfort.
Decades before Wyndham, H. G. Wells imagined alien invasions, too: The War of the Worlds (2005) features an invasion that has since become paradigm defining. An unsuspecting Earth is invaded by wily Martians, who wield technologies that far outstrip human achievements. All the machines of war that humans have invented barely scratch the Martian war machines, and all seems lost—until the Martians are brought low by Earthly viruses. The Martians, it seems, catch a cold and, before they know it, are all wiped out. Such is the alien invasion story we like to tell, and it comes with variations, as in Independence Day (Emmerich 1996), when aliens are brought low by a computer virus, thanks to a little human ingenuity. But War of the Worlds and Independence Day are also struck through with provincialism. Despite depicting global invasions, their stories focus almost exclusively on the plight of one very specific location, England and the United States, respectively. We take for granted that events are unfolding similarly elsewhere and that the tiny changes made in one environment will cascade through the whole of the alien fleet. If only we could always be so lucky as to be saved by pure accident or a little human ingenuity, and to emerge unscathed or better for the experience. Years later, Ian Edginton and D’Israeli’s Scarlet Traces (2003) imagines the aftermath of Wells’s invasion, with the alien technologies used by the imperial powers of the nineteenth century to entrench their hold on the globe in the years after the war. Things change, but they stay the same, reinforcing already-existing social relations. But if Wyndham’s rule holds, there’s no saving to be had in such simple fashion. The apocalypses that Wyndham imagines are more unsettling. They fundamentally change how society is constituted after they occur: there’s no going back to how things were before.
Human society might continue to exist alongside the triffids, but only on its isolated islands; life will continue after the cuckoos leave, but always with the possibility of their return. There is the terror of the event itself—the catastrophe that catches us off guard—but then there is the terror of the ongoing apocalypse: who knows what will happen next and what the ramifications will be? What can possibly be done to prepare for the unexpected? Speculative fiction and social theory both ask us to consider these questions, and in finding answers we make new futures possible.
The future is so hard to imagine these days. We know so many things that will happen: the weirding of weather patterns as a result of global warming; the rising of sea levels, which will flood cities and island nations; the acidification of the ocean, leading to mass extinctions of life in the sea and on land; drought, starvation, and famine as a function of changing weather patterns; the growing tensions between the haves and the have-nots, especially as the have-nots are increasingly subject to the worst of changing climate conditions and geographical displacement; the increasing numbers of the have-nots as a result of automation in the workplace and advances in artificial intelligence; growing animosities between nations based on access to needed resources, and complicated by increasing ethnonationalisms. That all seems to be a given, and how exactly it all plays out is impossible to guess. But then imagine some of the other possibilities—holding aside for a moment the possibility of an alien invasion, benign or otherwise: epidemic diseases gaining footholds due to changing climate conditions and spreading globally thanks to faster and more efficient travel technologies; global and local economic recessions—if not total collapses—as a result of corporate and governmental failures to anticipate changing local and global conditions; earthquakes, wildfires, tsunamis, hurricanes, heat waves, all claiming lives and destroying homes, if not whole cities; food chain disruptions due to the extinctions or displacements of species. Then there are things like errant asteroids striking Earth and rogue states unleashing nuclear or biological warfare against unsuspecting populations. And did I mention alien invasions?
Any one of these events would be a catastrophe in its own right; it would disrupt society and its easy operations and assumptions about social order. But taken together—and it seems likely that more than one of them will happen, and most of them might, all in quick succession—there’s an apocalypse looming. It’s no wonder, then, that we have a hard time imagining what happens next—any one of these situations might be modeled on its own, but how do we model them all together? How can we possibly imagine how they’ll intersect as emerging conditions laid on top of already-existing and slowly mutating social conditions, like racism, international relations, and the profit motives and class antagonisms of global capitalism? Wyndham’s rule suggests that we just can’t: there’s something about the apocalypse and its multiplicity that defies the imagination and any attempt to model it. And even Wyndham falls prey to this: he can imagine the apocalypse and its local iteration, but the world escapes him.
So, here’s Wyndham’s rule: The apocalypse is never singular; it is always multiple. In its multiplicity, the apocalypse is unimaginable.
What is to be done when the future eludes our capacities for imaginative play and scientific modeling? One possibility is to imagine catastrophes one by one, to play with catastrophes and apocalypses in the confines of a novel or film or study. Speculative fiction does this with aplomb, a tradition Wyndham contributed to, if not wholly shaped. Imagine a catastrophe—a natural disaster, a widespread epidemic, a stand-in alien invasion—and model what happens next across society, from individual lives and families, to cities and nations, to the globe. At their most dramatic, everything changes—but usually humans pull through, if only by the skin of their teeth, and maybe in modified form, like Lilith and her brood in Octavia Butler’s Dawn (1997); at their most modest, catastrophes are recovered from and the social order is restored, albeit altered. These attempts to imagine the apocalypse in its singular forms—and to imagine its aftermaths—produce models for thinking about society and how it might recover from devastating—if not ontology-shattering—events. The theory for the world to come lies in these experiments, individual attempts to imagine, to model, to conceive of a future. It also lies in very real experiments with life after catastrophe, from rebuilding communities in the wake of natural disasters, to individual and family attempts to recover from disease, to societies reconstituting themselves in the wake of settler colonialism. These projects don’t bring the future into being so much as make plain possible ways forward—and how they build upon the past.
Social theory and speculative fiction are two sides of the same coin. It is not the case that social theory is the sole province of academics, nor that speculative fiction is that of science fiction writers. Both traditions ask us to imagine worlds that can be described and depicted, and ask us as audiences to imagine the rules that undergird a society and its human and more-than-human relationships. What are the systems of belief of these people (whoever they are), and how do they impact the everyday experience of identity categories like race, ethnicity, disability, gender, sexuality, and class? What are the bases of these identity categories—is there even such a thing as race, ethnicity, disability, gender, sexuality, or class in a given society? What is the role of “human nature,” and how does it impact people’s relationships with each other and their relationships with nonhumans? What are the institutions that shape society and people’s everyday lives? Are they capitalism and representative democracy, or some other economic and political organization? Framing these questions should make it clear that the kinds of questions that speculative fiction writers ask also motivate social scientists and other academics interested in social theory. The very questions that anthropologists, sociologists, and psychologists have been pursuing since the nineteenth century have also been motivating speculative fiction writers, from Mary Shelley, Jules Verne, and H. G. Wells to our contemporaries (Collins 2008). But such speculation and theorization also underlie all description and depiction, even if muted and obscured from notice by social convention. Whenever we ask that someone imagine, we ask them to speculate—to theorize with us—the world that we weave through our descriptions. Speculation is a part of life, and theorization is as well (see Shaviro 2015).
Social theory, like speculative fiction, is always situated. It is situated in its time and place, in its historical moment. It is situated in the lives of the people who develop and implement the theory. And it is situated by its critiques, which themselves arise from particular life histories and social situations. Claims to objectivity in social theory—to universal generalization—are just as fraught as claims to objectivity more generally, whether in the hard or social sciences (Behar 1996; Clifford and Marcus 1986; Daston and Galison 2007; Haraway 1997; Hymes 1974; Latour 1987; Shapin and Schaffer 1989). As often as those points have been made since the 1970s—at least—claims to objectivity creep back in. In some cases, they never left, particularly in the laboratory sciences. What is often elided in critiques of objectivity, however, is the way that social situations and personal histories ensure that progress in social theory—in the sense that social theories are developed that capture experiences of the world rather than determine them—is short-circuited (Haraway 1997; Lutz 1995). That “legitimate” knowledge is produced largely by scholars trained at universities in the North Atlantic, or influenced by knowledge that has developed out of colonial relationships, has profoundly shaped the forms social theory has taken, as well as the categories relied upon to create theoretical knowledge. This has made social theory seem to be the property of academics, and speculative fiction the realm of nonacademics. But such an easy division of labor obscures how speculative knowledge in the academy is, how contingent its production has been, and what influence individual lives have on what gets thought and how it is articulated. This is evident in how disciplines have formed, how they have spread, and how they have shaped knowledge production.
Consider two very different examples: Anthropology developed in the nineteenth century as the study of Man, the assumption being that humans (and especially men) were distinct among animals, and that they could be studied apart from other animals. How different anthropology would have been if it had developed as a mode of thought in Papua New Guinea, where humans, yams, and pigs all are thought of as persons. Ancestral spirits and inanimate art count too, both of which are roundly dismissed in the North Atlantic through secular claims to rational objectivity (Gell 1998). Anthropology developed when and how it did because of the colonial situations that state powers in the North Atlantic had become invested in, paired with pseudo-Darwinian theories about the development of Man, evolving from a state of savagery through barbarism to civilization, which conveniently mapped onto the globe to explain the apparent development of societies and people toward a state of modernity determined by university cities in the North Atlantic (Trouillot 2003; Wolf 1982). How different anthropology would look if it had been developed in China, where modernity developed quite differently, where theories of rationality had a much longer history, and the conception of the self-determined individual a much shallower one (Rofel 1999).
Or consider how psychiatry might have developed had it arisen in Japan rather than in Western Europe. For many Japanese people, the site of personhood is the heart; for Western Europeans, it has long been the brain (Lock 2002). In psychiatry in the United States and elsewhere, this has led, over time, to the imputation of aberrant behaviors to problems in the brain. If the site of personhood had been elsewhere, as in the heart, psychiatry might have focused there instead, and its conception of the self would have been articulated very differently. Similarly, with the idea of the individual firmly rooted in North Atlantic philosophical, religious, and political traditions, behavior was seen in early psychiatry as rooted in the individual. “Hysterical” women were seen as individual problems, not as reactions to patriarchal power structures and social organization that systematically disempowered women from making the very decisions that they were simultaneously told were the marks of prized, liberal personhood: a double bind if there ever was one (Fanon 2008; Metzl 2003; Rose 1996; Wilson 2004). If psychiatry had developed elsewhere—in South Asia or Melanesia, for example—where personhood is seen as a relational process rather than an inherent quality of the individual (Strathern 1992), how psychiatric disorders are defined would be different. They would rely more on an understanding of social dynamics and relations between people than on organic causes isolated in the brain. What we think of as psychiatry and psychiatric problems are entirely indebted to the social origins of the discipline; if the discipline had arisen elsewhere and spread from that geographical locus to the rest of the world, it would have made the world in a different way.
Such fantasies of disciplinary alternative histories are important, not just as “what if?” stories but also as a means to denaturalize the latent conception that knowledge produced in any discipline is objective. Knowledge, theory, and models are all subject to the same situational histories that have led to the founding of disciplines, the same personal histories, the same contingencies, and the same historical processes. There’s nothing inevitable about the creation of anthropology, nor about its progress from theoretical paradigm to theoretical paradigm; harder to imagine is that physics or biology might not have arisen as disciplines, or that they might have developed in ways other than they have. But had Isaac Newton been killed in his crib by scarlet fever, or Charles Darwin thrown overboard by a rogue wave, both physics and biology might have developed quite differently. Gravity and natural selection might have been invented as concepts—eventually—or they might have been obscured by other, more dominant theoretical concerns. That Darwin’s theory of natural selection was informed by his social position is well documented (Desmond and Moore 1994); what if he hadn’t lived a life of relative privilege, watching those around him, better looking and richer, getting married while he remained a bachelor? What if, instead, he had been just a little handsomer and wealthier? He might never have boarded the Beagle, and, even if he had, he might not have conceptualized selection as a struggle or seen it as best expressed in reproduction over time. Contingency is critical to the dominance of the models and theories that develop, just as much as historical situations ensure that some ways of thinking take precedence over others. A successful theory works on the aggregate—it relies on making sense to a wide swath of people who employ the theory and use it to shape their interactions with the world.
We should be constantly skeptical of the theories that are developed, not only for the motives of their makers but for the ways that they reinforce already-existing conceptions of the world, of persons and institutions, of power relations, and of particular forms of speculating about the present and the future. That so much of how we think about the world comes from theories developed in the imperial, global North Atlantic should give us pause; there are other ways to think of the world, to build social theory, to speculate about our futures, and to imagine, critically, the failures that North Atlantic traditions have precipitated.
Because of these determinants of social theory, I have a suspicion that the theories that have developed and gained traction over the last half century are deeply suburban. Theory developed since the 1950s has emerged from a North Atlantic sense of comfort, a lack of hardship, an acceptance of global, national, and local power relations, an acceptance of a certain kind of inevitability inspired by a general level of prosperity. In this respect, consider the appeal of J. K. Gibson-Graham’s tripartite understanding of capitalist, alternative capitalist, and noncapitalist modes of exchange (Gibson-Graham, Cameron, and Healy 2013). For Gibson-Graham, the model of capitalism they put forward captures how some exchanges participate in the capitalist market—buying an industrially produced loaf of bread at the corporate-chain grocery store. And there are alternative capitalist forms of exchange, such as purchasing a loaf of bread from a locally owned baker who sources ingredients from family-owned organic farms. Alternative capitalist exchanges help to distribute wealth outside of corporate contexts, supporting laborers more directly without the need to appease corporate shareholders. Finally, there are noncapitalist modes of exchange: I make a loaf of bread from ingredients that I have grown and milled, and I give my loaf of bread to a friend in exchange for his or her labor. Such a conception of alternative capitalist exchange can exist at a moment in the history of capitalism when institutions that facilitate such forms of exchange are supported, institutions like farmers markets, which are available in some but not nearly all markets. But it is also true that capitalist institutions rely on alternative capitalist formations as the basis for expansions of the market: Whole Foods, a national grocery chain, trades on its appearance as an alternative to corporate chains, when it is just as corporate and dependent on capital as any of the other national chains.
Similarly, the understanding that there is an outside to capitalism is a very peculiar fantasy, one that depends on imagining the existence of property that hasn’t been captured by national or private claims to ownership; the seeds to grow wheat have increasingly become proprietary, as have most fuel sources that could conceivably bake a loaf of bread. And yet Gibson-Graham’s conception of capitalism is appealing; it’s compelling to imagine that one is participating in some alternative to the hegemony of capital by frequenting a farmers market, even if only in the microscopic monetary exchanges that support local growers and noncorporate farms. But the success of the theory depends in no small part on individuals extrapolating from these small exchanges to the entire basis of the economy and its formation.
Or consider the recent turn to multispecies ethnography and animal studies in the social sciences and humanities (Haraway 2003, 2008; Hartigan 2014; Kirksey 2015). That animals had been left out of scholarly consideration for the last century might easily be explained as a function of more pressing concerns—civil rights and the history of racism; gender, sexuality, and feminism; postcolonial politics; globalization and U.S. hegemony; the postmodernist critique of objectivity and knowledge production—and now we are rethinking the category of the human, human agency, and the role of animals in human society. The analytic worm turns, and focus becomes ever narrower in its objects. Anticipating what comes next is beside the point; the challenge is that when scholarly attention continues to work within North Atlantic traditions, drawing on the sources and disciplines that have shaped attention over the last century, the possibility of a rupture, of finding something truly new, is foreclosed. Instead, new objects are fit into already-existing theoretical matrices, taming them as sources of knowledge while reinforcing dominant ways of knowing the world. Take, for example, recent attention to the microbiome, that teeming mass of microbes that covers human bodies and fills our digestive systems (Yong 2016): yes, attention to the determinative powers of the microbiome has the potential to reshape how we think about human desires, particularly those related to food, but it also tames the microbiome into a knowable object, first in the laboratory and then in the humanities and social sciences. It becomes subject to already-existing ways of knowing it, of acting upon it, of conceptualizing it. Its alienness, its unexpected potentials and actions, will be translated into models, theories, and language that adhere in disciplines as they already exist.
Newness becomes the fetish of disciplinary knowledge production precisely because the new is always impossible to bring into being—because the new is always trapped in disciplinary modes of knowing.
The disciplines are ruled by their comfort and their suburban complacency. And I am no different. Raised in an upper-middle-class home, in a periurban part of metropolitan Detroit nearly thirty miles from the city center, I attended a private elementary school and an elite public high school. Unlike many of my peers, who attended larger state schools, I opted for a liberal arts education; where most of my peers got by on allowances provided by their parents, I chose to work throughout high school and college, in part to support my game-playing and comic-book-reading hobbies. When college was over, I spent time teaching elementary school, and then migrated into a series of graduate programs. Although there were precarious times, I could always rely on my parents and on student loans through the federal government for support. Over time, through work and good fortune, I’ve found myself gainfully employed, a homeowner, the parent of two children, blissfully free of debt. Maybe you see yourself in that description—or a version of yourself that you aspire to be or have left behind. Maybe you’ve been more fortunate than me or made different decisions. But if you’re reading this, chances are that we have more in common than we hold in difference. And maybe you too are stuck with a kind of suburban comfort, a complacency that seeks, discreetly, to reinforce already-existing modes of knowledge production and theoretical models. And maybe, like me, you’ve become a little dissatisfied with the futures that are being made for us and that are foreclosing the development of other possibilities. Maybe you too want to choose a different future.
Maybe it was that suburban comfort that propelled me into reading all the speculative fiction that I did throughout my teenage years and into my adulthood. In an era before the internet, I would scour bookstores for Philip K. Dick novels that had long been out of print, stumbling upon authors like Samuel Delany and Thomas Disch, Joe Haldeman and H. Beam Piper, Ursula Le Guin and Octavia Butler, Norman Spinrad and Cordwainer Smith, by diligently reading the backs of books in search of something compelling. I had long tired of Isaac Asimov and Robert Heinlein, both of whom seemed too out of touch with reality—a strange criticism of science fiction writers. I wanted fiction that helped me make sense of the world around me but that also unsettled me in productive ways. In essence, I wanted social theory before I knew that there was such a thing to be had, and speculative fiction supplied it. Is it any wonder, then, that I gravitated toward the discipline of anthropology, a social science fundamentally built upon the postapocalypse of global settler colonialism?
Arguing that fiction writers are products of their biographies, their historical situations, and their biases is less controversial than suggesting that the claims to objectivity that social scientists implicitly (and sometimes explicitly) make need to be interrogated. Inasmuch as fiction authors are taken to be exemplary thinkers of their time, they are also treated as symptoms—as expressions of cultural trends, some of which might be subtle in their shaping of individuals (Barthes 1977; Foucault 1998). Authors are important not merely for who they are but as one of many, as part of an aggregate that is considered a generation. Reading across the speculative fiction that emerged in the 1960s as a response to the hard science fiction of the immediate afterwar period, a growing concern about the suburban fantasies of the United States becomes clear: from Le Guin’s critiques of capitalism in The Dispossessed and of gender norms in The Left Hand of Darkness (1969), to Spinrad’s The Iron Dream (1972) and its critique of racial fascisms, to Disch’s meditations on urbanism and kinship in 334 (1999), to Dick’s unease with all the convenience and comfort of the afterwar period, from Ubik’s (2012b) talking smart devices to The Man in the High Castle’s (2012a) paranoid awareness of something having gone deeply wrong with society to lead to the Nazis and Imperial Japan carving up a defeated United States. In the speculative fiction of the 1960s, there was a palpable dis-ease with suburban comfort—a comfort largely predicated on consumerism, patriarchy, and implicit white supremacy.
Any social theory that arose from this matrix might rightly be treated with suspicion; any speculative fiction that arose from the same period might serve to put into relief the implicit attempts to normalize suburban hegemony, benefiting as it did from not existing within the university disciplinary system, which implicitly—and sometimes explicitly—sought to diagnose the contemporary moment as a unilineal result of the civilizing process. In so doing, social theory operated teleologically, diagnosing the present as the effect of an inevitably unfolding series of past events. But speculation works differently: authors whose biographies lent a critical stance to the theories they operated with were led to question the seemingly inevitable. What would happen if it continued in an extrapolative fashion? What would happen if it intensified, or if something truly strange and unpredictable happened?
This is all to suggest that social theory is a question of biography, social location, and institutional situation. Social theory and its production are a question of scale—of moving from the lives of individuals, to communities, to society at large, and ever outward spatially. But it is also to suggest—like Frederick Jackson Turner’s theory of the frontier (1998)—that social theory traffics in time. It is also to suggest—somewhat paradoxically—that, like Roland Barthes’s discussion of the “death of the author” (1977), biography is less important than social location. This is how a generation can occur, with a shared sensibility across different life experiences. It also explains why this book looks the way that it does. My assumption is that you, like me, are a product of a particular moment, and that my dissatisfaction with available theories, with the teleological diagnosis of white supremacy, with the resignation in the face of planetary collapse, is shared. My biography, represented in part in the diary entries that make up the spine of this book, may be my own, but it is not unique; it is singular, but the experiences I have had throughout my life echo in the lives of others. My sensibilities, shaped as they have been by upbringing, education, and my media environment, are the product of the world that I have inherited. My seeking out of alternatives to the dominant social theories handed down to me by my disciplines is presumably also shared by others—evident in the growing interest in alternatives to Eurocentric, diagnostic theories. One source for these alternative theories is speculative fiction in its proper sense, but the speculative impulse arises elsewhere, too—in music, in film, in countergenealogies of thought that resurface traditions and thinkers dismissed as not fitting into the dominant body of social theory.
We are at a moment when we need to choose our future. Diagnostic theories will only get us so far. Our imaginations can only get us so far, as well. Both are impoverished by their social locatedness. Accepting both of those claims might inspire resignation. But the purpose of this little foray into the speculative is to inspire the opposite: what sources might there be for rethinking the future? for dislodging the futures that we have been given and thinking something anew? for rethinking the past that has gotten us to this point? Articulating futures—imagining them and bringing them into being—is an active process, and rather than a posture of resignation, theory for the world to come needs to instill radical curiosity. That curiosity should be about sources—about texts and authors—as much as it is about practice. The disciplinary configurations that make up the terrain of social theory, embodied in the contemporary neoliberal university, are insufficient at best and harmful at worst. Choose your future: throw in with what has been, or try to find something that disrupts the futures we have been given.
There’s something appealing about desolation, about the radical reduction of society to its barest elements. Maybe it’s the sheer simplicity of it all: agrarian villages in the wake of nuclear holocaust, communitarian solidarity in the aftermath of global economic collapse, transformations in gender roles and sexual mores as a result of alien invasion. Wiping the slate clean makes imagining the future so much more possible. But the future we face won’t be based on a clean slate—instead, it will be Wyndham’s future, built upon the already-existing blindnesses that adhere in our societies, made possible by our provincialities, our comforts, our prejudices. The future is unfathomable. But in this openness, it becomes a space to play with theories of what might be, of what the future holds, how it will reshape human lives and society, and how the future will change too. Putting theory into the world makes new worlds possible—it creates new lines of flight that human action follows, unfolding in turn new possibilities. Speculative fiction—and social theory—that considers desolation and its aftermaths helps to point to ways forward, ways to live through the apocalypse, even if living through doesn’t manage to keep things the same as they were.
How the end is imagined changes over time, and so do the beginnings that the end spawns. As a child, raised during the tail end of the Cold War, the end always seemed to be nuclear. On Saturdays, after a morning’s worth of cartoons, one local affiliate would turn to showing horror and science fiction movies—sometimes Japanese kaiju movies, starring Godzilla and his nuclear-inspired ilk, sometimes films like Night of the Comet (Eberhardt 1984) that fictionalized nuclear holocaust through unexplained phenomena. Looking back, I wonder what sadist programmed these movies to air following Saturday morning cartoons; who in their right mind would think that it was appropriate to show a film featuring flesh-eating zombies after Elmer Fudd chasing Bugs Bunny? But then it strikes me that that question is disingenuous. Future making is about communicability (Briggs and Nichter 2009), about giving from one generation to the next a sense of the future they are to inherit. That’s how futures are made—not necessarily through deliberation but through infection.
That I would lie in bed as a child, thinking about nuclear holocaust, imagining that any passing airplane could be a Soviet bomber carrying a nuclear payload to devastate suburban Detroit was entirely the point of showing those films. I’m not sure that I could imagine what would happen next—I couldn’t imagine that we would return to some form of agrarianism after nuclear conflagration, since those weren’t the stories that made it into film. I’m not sure that I could imagine anything other than the event itself. There was something particularly debilitating about the nuclear future: there was nothing to be done to avoid it, nor was there any recovering from it. The movies I watched on TV—sometimes through a screen of fingers to block out the gore—stopped short of imagining what happened next. How, exactly, did human society rebuild itself? What, precisely, did people do to ensure that another nuclear holocaust didn’t happen? In effect, not imagining these possibilities ensured that viewers like me never learned a language to elaborate postapocalyptic possibilities.
Utopias are usually boring, even when they’re stories of crawling from the wreckage of the apocalypse. And that’s what so much of postapocalyptic literature attempts to skirt around: what’s so compelling about societies rebuilding themselves, or, worse yet, societies that have already rebuilt? There must be some threat to the easy life of utopia to make it worth putting into narrative. In a nutshell, that’s the motivation for so many of the apocalyptic stories I grew up with, although their utopia was the casual, comfortable utopia of 1980s American consumer capitalism. That American utopia was a partial one—it overwhelmingly favored decently educated, suburban white people, who were in professions protected from recessions or from the globalization of industrial labor; rural and urban communities, each in their own ways, were more exposed to the vagaries of internationalizing and financializing capitalism. But for those in their suburban idylls, utopia was at hand. Not in some ideal sense—it wasn’t a suburban heaven—but all needs were met, and wealth accrued. Life wasn’t perfect, but it was secure to the extent that it was unlikely that nuclear bombs would actually be dropped.
That utopia was easy to disrupt, and films like George Romero’s Dawn of the Dead (1978) sought to do just that. Set in a suburban shopping mall, the film follows protagonists who take refuge from a zombie outbreak that has seemingly overrun the rest of the world—or at least the immediate surroundings. Part too-obvious allegory for the blindnesses of consumer capitalism, part postapocalyptic anxiety fantasy, Dawn of the Dead doesn’t have anything to say about how to rebuild society, how to make good use of the clean slate that the zombies would provide once they were eradicated. That world would wait until Survival of the Dead (2009), in which an island-bound community has found refuge from the ever-present zombies—until the zombies begin to make their way to the island underwater. But the world of Survival is a sadly capitalist one, where the wealthy live in glass towers and think themselves immune from the recurrent waves of the zombie apocalypse. They find themselves unprepared for the teeming masses of zombies that find their way ashore. If Romero has an abiding rule, it is that wealth won’t protect you from catastrophe. If anything, wealth tends to make you especially blind to the world through its suburban provincialisms. Rebuilding the world and its consumerist vagaries won’t stop the next catastrophe—and the blindness that would seem to justify such rebuilding points to the problems that make our present catastrophes so devastating. There’s no way forward, and the route society is on is sure to make the next catastrophe more devastating for lack of planning.
Those apocalyptic Saturday films seemed to suggest that my forebears were resigned to the future that was coming to meet us. Resignation is a powerful force, and one that is infectious: living resignedly tends to infect those who are exposed to it. Resigned imaginations and catastrophic speculations tend to materialize their worlds through inaction—or, rather, actions that are too modest to affect a different future. Resignation makes sense of modest action, of being comfortable, of being maybe a little outraged, but not outraged enough to make a difference. That suburban complacency, that late capitalist utopia that favors some people and their lifestyles over others, is easily fostered through modest action, through a lack of imagination about what comes next. How do we see past our resignations? How do we begin to think past the looming apocalypse, one that will comprise the known and unknown? How do we begin to build a set of theories for the world to come that works past our blindnesses, our complacencies and resignations? How do we design a society that at once is prepared for the array of catastrophes that are on the horizon and will rebound into something new and unprecedented on the other side of the apocalyptic events?
In this book, I begin to think about the theories for the world to come—what theories there are that will help to build a sustainable, equitable world after collapse. Or, maybe, sets of theories that when put into play in the present will change the possible futures that have come to grip our imaginations. Can social theories, mined from specific local spaces and historical moments, inspire new lines of flight, new paths into the horizon that is the future? To answer these questions, I turn to a set of speculative texts—films, novels, television shows, comic books—to consider their embedded social theories: what are the worlds they build, how are they composed, and what kinds of lives do they enable? I think through these texts based on my own history, my own locations and fixations. I consider, in turn, my time in Michigan, in California, and in New York, and the texts that infected me in each of those personal moments. Those texts are largely local themselves, texts produced by or situated in Michigan, California, and New York. The chapters are autobiographical reflections and textual analyses, all in the service of thinking through speculations in their moment and their lingering effects. Throughout, I am interested in three forms of future historiography: intensification, extrapolation, and mutation, each of which helps to expose the problems we currently face in thinking about the future and its possibilities.
Not solely the province of speculative fiction, intensification, extrapolation, and mutation are also at the heart of many social theories. Extrapolation takes a force or institution or person and puts it into the future, relatively unchanged. Extrapolating in this way is a means to imagine what will happen to something as it is carried into the future: What will the future of capitalism be? Or the future of kinship relationships? Extrapolation allows for imagining how something like capitalism will respond to other changes, social and environmental. In doing so, extrapolation serves to make evident how fragile or resilient institutions, forces, and people are. Intensification imagines what will happen with the increase in quality of a force or the pervasiveness of an institution. This is not to imagine an institution or force unchanged, but to purposefully toy with making a force unrelenting or an institution more total. What if the world just keeps heating up? What if capitalism becomes so totalizing that all social interactions are subject to the market? Answering these questions is a foray into speculating about intensification. Mutation modulates forces and institutions in unpredictable ways. What if marriage were replaced by the corporation, allowing for more than two people to be married in a joint venture? Or what if an ice age suddenly began, despite all our preparations for global warming? Mutation is about surprise, about the unexpected, and how individuals and societies respond.
Each of these modes of thought, which might otherwise seem sufficient for predicting the future, runs up against Wyndham’s Rule: none can capture the multifarious forces that conspire to make our collective future. In the following, I turn from the question of life at the end of capitalism to the nihilism of deep time, to the social need for revolution and its relationship to conceptions of time. If intensification, extrapolation, and mutation are insufficient, might the disavowal of power, a graceful handing over of power to the generations in our wake, help to usher in a new set of theories for the world to come?