Observations of the natural world and the fossil record have revealed that the most basic form of life is a cell, yet how life comes to be is unknown. Perhaps the holiest grail of all scientific pursuit is to produce life from nonlife. Such is the goal of origin-of-life researchers and those in the field of artificial life who are pursuing the creation of the first “protocell,” the first living cell created from the “bottom up.” Researchers pursuing this goal combine knowledge of the chemistry of single-celled organisms, gained from studying such organisms today, with knowledge of the composition of the earth’s primordial sea, as ascertained from the geological and fossil record, to try to create a cell from scratch. Their attempts have led to the creation of vesicles, “constructs,” and “pre-protocells” (Plate 2), which possess some of the properties of living cells, but thus far no cell with all the properties of life has been created in such a manner (Figure 6.1).
Other scientists are pursuing the same problem with the opposite, “top-down” approach. They take single-celled organisms and remove their DNA bit by bit (or synthesize new “minimal” DNA from scratch) to determine what “minimal genome” still allows the cell to function with the basic properties of life. In March 2016, Craig Venter and his team at the J. Craig Venter Institute, who also produced one of the first sequences of the human genome, succeeded in creating a “minimal cell” with only 473 genes made from synthetic DNA. They believe that this version of the bacterium Mycoplasma mycoides, which they named JCVI-syn3.0, offers “a working approximation of a minimal cellular genome, a compromise between small genome size and a workable growth rate for an experimental organism.” Note that their approach is gene-centric. They are not removing parts of the cell—for example, features of the membrane, ribosomes, plasmids, or cytoplasm—but instead define a “minimal cell” solely by its having a “minimal genome,” retaining all the other parts of a prokaryotic cell. Yet, because they synthesize the DNA and implant it into a cell that has had its DNA removed, their techniques, and research aiming to create minimal cells in general, are classified as part of the relatively new field of synthetic biology, defined as “the attempt to engineer new biological systems.” Researchers at both ends of this spectrum of synthetic biology hope that their work will “meet in the middle” and thereby unlock the key to creating life from nonlife.
Although pre-protocells are not biological entities, since they are not living cells, they exhibit a number of interesting “programmable” physical and chemical properties that appear lifelike, as architect Rachel Armstrong states. Armstrong promotes the use of pre-protocells from synthetic biology as a means to grow “genuinely sustainable” buildings and cities from the bottom up. She describes pre-protocells as “chemically programmable” because the results are predictable depending on which combination of chemicals one adds to the solution, so long as it is confined to a beaker or other controlled environment. The chemicals that in combination can form pre-protocells, which must always be in water, include olive oil, various metal ions, calcium chloride, sodium hydroxide, ferrofluid solution, and copper sulphate, among others. Pre-protocell properties include the ability to move through liquid, to respond to light and vibration, to cluster together, and to selectively let different materials into and out of their bounding membrane while transforming some of that material into energy. While all of these properties are explainable with insights from chemistry and physics alone, Armstrong chooses to interpret them using language that imbues them with biological-sounding qualities. First and foremost, she consistently refers to her creations as “protocells” when they are in fact pre-protocells; this choice leads her followers to think that protocells have already been created. She says that pre-protocells form “populations” and that they have a “metabolism” and secrete “waste,” some of which takes the form of solid precipitates like calcium carbonate, the primary constituent of limestone.
Armstrong therefore has promoted their use for shoring up the foundations of Venice, asserting that the calcium carbonate precipitate will make rotting piers solid again: “The protocell system would be released into the canals, where it would prefer shady areas to sunlight. Protocells would be guided towards the darkened areas under the foundations of the city rather than depositing their material in the light-filled canals, where they would interact with traditional building materials and turn the foundations of Venice into stone.” She adds that “at the same time, a limestone-like reef would grow under Venice through the accretion and deposition of minerals” (Figure 6.2). These proclamations, backed up by the fact that she collaborates with scientists in Venice at the European Center for Living Technology who are trying to create protocells, have garnered her notable publicity. Subsequently, she was invited to exhibit her concept at the Canadian Pavilion at the Venice Biennale in 2010 in collaboration with architect Philip Beesley, professor at the University of Waterloo, Ontario. Together, they installed Beesley’s piece Hylozoic Ground (2010) (Plates 2 and 14), replete with pre-protocell flasks of two types and tubes connecting the flasks to the water in the Venetian Lagoon. As people visited the gallery and exhaled carbon dioxide, the gas was absorbed into the canal water in the tubes; the pre-protocells responded by changing color and “demonstrat[ing] a carbon fixation process where the waste gas was recycled into millimeter-scale building blocks. In this way metabolic materials turned products of human activity into bodily components for the construction of Beesley’s giant synthetic ‘life form,’” she summarized.
To clarify, the “metabolic materials” she refers to are the chemicals in the flasks, and the “bodily components” they produced in the form of millimeter-sized calcium carbonate particles are “bodily” only metaphorically, in the sense that she views Beesley’s sculpture as a form of artificial (synthetic) life. These “bodily” components do not “construct” any architectural or structural support in any way for the sculpture; they simply are part of it, in the water inside the flasks.
Although Armstrong and Beesley’s work offers the best example of “protocell” architecture/design/sculpture, it demonstrates art–science interdisciplinarity involving only the “bottom-up” protocell-research branch of synthetic biology. As Venter’s work demonstrates, the other branch of synthetic biology works with genetically manipulating cells, not only to ascertain a “minimal genome” but also to alter the functional properties of cells in order to have them produce designed outcomes such as the secretion of chemicals useful in other industrial processes. This second “top-down” branch, referred to as engineering synthetic biology (shortened here to engineering synbio), is also attractive to architects and designers who hope to use its techniques to create new bio-based materials or biofabrication methods for design. For example, the work of architect David Benjamin in collaboration with synthetic biologist Fernan Federici, for the Synthetic Aesthetics project and book, aimed to uncover growth algorithms using engineering synbio processes that could then be used for architectural design purposes.
Benjamin does not just envision using synthetic biology for novel conceptual and algorithmic approaches to design, however. According to an interview with William Myers, he is building a compendium, a Registry of Synthetic Biology Applications, to accompany the Massachusetts Institute of Technology’s Registry of Standard Biological Parts, a repository of “BioBrick” gene sequences created and built through annual iGEM (International Genetically Engineered Machine) competitions. Both of these—MIT’s registry and iGEM—were started by synthetic biologist Tom Knight in 2003, and they form core apparatuses for the “top-down” engineering branch of the field. Benjamin’s registry would contain puzzles and problems that could potentially be solved using synthetic biology, functioning as a database of future research ideas. He also claims he is working with a major software company to “explore the intersection of architecture, synthetic biology, and computation. We are looking to advance the use of software tools in synthetic biology and we think this might help both experienced synthetic biologists and non-expert designers—architects, artists, material scientists, computer scientists, and all types of students—to improve their capacity to deal with biology.”
Benjamin is a principal of the architectural firm The Living, which created Hy-Fi, the project that won the 2014 Young Architects Program competition at MoMA PS1 (Figure I.3). The structure was built from bricks grown by the biodesign company Ecovative from mushroom mycelia and corn stalks. In this case, as in some of the other design visions described in this chapter and in some of Phil Ross’s design work using mushroom mycelia, synthetic biology is not actually integrated into the process (Figure I.4): for Hy-Fi, as for Ross’s Walnut Legged Yamanaka McQueen (2012), it was not a necessary technology for realizing the design. But like Benjamin, Ross is researching ways that synthetic biology might be useful in the process of creating designs using mycelia, and he has been exploring the possibilities in synthetic biologist Drew Endy’s lab at Stanford University. In other biodesign examples included in Myers’s book, synthetic biology is referenced but not actually integrated into the design projects, because as a technology engineering synbio is not yet successful enough to realize the designers’ visions. In these cases—such as Michael Burton’s Nanotopia for the Future Farm project (2006–7) (Figure 6.3) or Alexandra Daisy Ginsberg’s Designing for the Sixth Extinction (2013–14)—the works function more as speculative design or “design for debate.” Yet, despite the fictional nature of the projects, a number of architects and designers express the opinion that synthetic biology should be integrated into architectural and design education to prepare designers for this imminent reality.
This chapter therefore explores the two branches of synthetic biology in relation to the approaches of each, and how some architects and designers envision using these technologies. After introducing the scientific approaches in a bit more detail, it elaborates on and critically analyzes proposals of architects and designers working in this area. It concludes by exploring four themes that run throughout both this chapter and the book, thereby serving as a conclusion to both. The first theme is that of complexism and complexity theory, which is engaged by “bottom-up” protocell researchers but largely ignored by “top-down” engineering synthetic biologists. In many ways this makes sense, since the first group is trying to put together materials that can “self-organize” to produce the “emergent property” of life, while the second group is trying to reduce biological complexity to a great extent in order to control it for standardized industrial outputs, or at least to bolster support for the idea that it can. The second theme addresses the lack of clarity both in the language used by generative architects working with pre-protocells and in their visual depictions or material realizations of their ideas. Ambiguity in terminology, like Armstrong’s choice to use the word “protocell” to describe what actually are pre-protocells, and ambiguity in visual works that may or may not be created as “design for debate,” can produce confusion about the actual reality versus the possibility of a concept or its technology. Third, in engineering synbiodesign (adapting Myers’s use of “biodesign”) projects especially, ideas of eugenic improvement surface with regularity, so these are reconsidered in relation to the other examples of eugenic thought offered throughout this book.
And finally, since “protocell” architects and designers promoting engineering synbio both claim that they pursue these technologies in order to create more “sustainable” solutions, this topic then frames the book’s end.
“Top-Down” Engineering SynBio and SynBioDesign
As stated above, synthetic biology as a scientific discipline generally claims two complementary territories of research that are at least partially viewed as different means to a shared goal: ascertaining the basic chemical ingredients of life. The first approach, protocell research, combines “nonliving matter” to try to create a living cell from the bottom up. Whether this cell will resemble the simplest single-celled organisms that we know of today, or whether it will be simpler or different, is not clearly specified or even known. But the phrase “minimal cell” is used to describe scientists’ aim to achieve life at its most basic level. The second approach, top-down engineering synbio, begins with “living matter.” This is a strategic term: it not only links “living matter” into a continuum with nonliving matter, as the diagram shows, but also desensitizes readers to the standard process of this branch of the field, namely the genetic manipulation of living organisms. By substituting the word “matter” for “organism,” life seemingly becomes less special and less autonomous.
Engineering synbio draws heavily from electrical and computer engineering in its conceptual logic and terminology. For example, designed gene combinations are described as “circuits” that have “toggle switches” and “logic gates”; multiple circuits can be combined into “devices.” The design of circuits and devices is thought through using mathematical modeling. Once designed, the circuits or devices are inserted as plasmids into host prokaryotic and eukaryotic cells, often using transformation through heat shock, followed by polymerase chain reaction and gel electrophoresis to ascertain whether the transformation was successful. The host cell is referred to as a “chassis” for its sole role of providing the necessary cellular support infrastructure for the inserted gene. In other words, it functions as a “machine” or a “workhorse” to transcribe, translate, and replicate the inserted DNA in order to produce its intended chemical output. The most common chassis by far is the bacterium Escherichia coli, owing to its rapid reproduction, relative success in being transformed, and properties known from decades of research. In general, bacterial host cells are fed with a sugar-rich nutrient media that offer the energy and supporting molecules necessary for biofabrication. Sometimes the goal of engineering synbio is not to make a chemical product but rather to design a new form of living organism that can perhaps then serve as a chassis—such is Venter’s claim regarding JCVI-syn3.0.
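The electrical-engineering metaphor described above can be made concrete with a toy Boolean model. This is an illustrative sketch only, not a description of any real genetic circuit: the function names, inducer chemicals, and states are hypothetical stand-ins for how engineering synthetic biologists conceptualize “logic gates” and “toggle switches” built from genes.

```python
# Toy Boolean abstraction of genetic "circuits" (illustrative only;
# all names and inputs are hypothetical, not real parts or protocols).

def and_gate(inducer_a_present: bool, inducer_b_present: bool) -> bool:
    """Toy genetic AND gate: the output gene (e.g., a fluorescent
    reporter) is expressed only when both chemical inducers are present."""
    return inducer_a_present and inducer_b_present

def toggle_switch(state: str, signal: str) -> str:
    """Toy bistable toggle switch: two mutually repressing genes hold
    one of two stable states until an external signal flips them."""
    if signal == "induce_A":
        return "A_on"   # repressor of A silenced; gene A expressed
    if signal == "induce_B":
        return "B_on"   # repressor of B silenced; gene B expressed
    return state        # no signal: the switch holds its state (bistability)

# Multiple such gates could be chained into a "device," just as
# logic gates combine into circuits in electronics.
print(and_gate(True, False))          # False: reporter not expressed
print(toggle_switch("A_on", "none"))  # A_on: state is held
```

The point of the sketch is the conceptual mapping itself: in the engineering branch of the field, gene regulation is reasoned about as if it were digital logic, which is precisely the reduction of biological complexity the chapter discusses.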
Two other defining features of engineering synbio play a major role in the approach and growth of the field. The first is the use of “standardized parts,” introduced by Knight, to design circuits and devices. He named the basic part a “BioBrick,” a genetic sequence that can serve a standardized function when inserted into another organism. This means that a sequence qualifies as a BioBrick only if it in fact produces the outcome it is claimed to produce, regardless of the organism into which it is inserted. Owing to the importance and complexity of genome architecture and the interactions that naturally occur within it, such as gene splicing and recombination, however, always producing the intended output is a tall order. A BioBrick can therefore be removed from the registry just as it can be added to it, as greater knowledge about the gene sequence is uncovered. Because it is difficult to define what a “gene” is, much less what it is as a “standardized part,” definitions of what constitutes a BioBrick have also changed rather consistently. BioBricks were originally housed at MIT and could be accessed and ordered through the Registry of Standard Biological Parts, although owing to growth in the field, other sources now exist. The second major feature, the iGEM competitions, has become the best-known mechanism for adding BioBricks to the registry and for recruiting students from around the world to study engineering innovations using synthetic biology.
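The registry logic just described, where a part is catalogued with a claimed standardized function and can later be delisted when new knowledge shows it fails to deliver that function, can be sketched as a simple data structure. This is a hypothetical illustration of the concept, not the actual registry’s software; the class name and methods are invented, though BBa_E0040 is cited in iGEM materials as a green-fluorescent-protein part.

```python
# Minimal sketch (hypothetical names) of a registry of standardized
# biological parts: parts carry a claimed function and can be delisted.

class PartsRegistry:
    def __init__(self):
        self._parts = {}  # part id -> claimed standardized function

    def add(self, part_id: str, claimed_function: str) -> None:
        """Catalog a part together with the outcome it is claimed to produce."""
        self._parts[part_id] = claimed_function

    def remove(self, part_id: str) -> None:
        """Delist a part, e.g., when it fails to produce its claimed outcome."""
        self._parts.pop(part_id, None)

    def lookup(self, part_id: str):
        """Return the claimed function, or None if the part is not listed."""
        return self._parts.get(part_id)

registry = PartsRegistry()
registry.add("BBa_E0040", "expresses green fluorescent protein (GFP)")
print(registry.lookup("BBa_E0040"))
```

The instability the text notes, that definitions of a BioBrick keep changing and parts come and go, corresponds here to the fact that the catalog is mutable: an entry’s presence records only a current claim, not a permanent guarantee.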
Interestingly, engineering synthetic biologists describe the history of their field as beginning in the early 1970s, with the discovery of restriction enzymes (what they call “molecular scissors”) as “a form of genetic ‘cut and paste’ in which genes could be removed from one organism and introduced into another.” Because much less was then known about genomes and molecular biology in general, “the power of restriction enzymes could not be harnessed much beyond the serendipitous.” Without any infrastructure for “an engineering industry, . . . this biotechnology version 1.0” persisted into the 1990s. With increased knowledge of genomics, particularly with regard to specific crop species, “genetic engineering” from the 1990s onward introduced genetic modifications that in a “hit or miss” fashion gradually took hold. Engineering synthetic biologists also tend to characterize “biotechnology” or “genetic engineering” as “a disappointment” that “grossly undershot its promise” precisely because it lacks the engineering theoretical basis and methodology of synthetic biology that supposedly allow standardized outputs. When not characterized as a disappointment, genetic engineering is considered to offer only “bespoke solutions” for individual challenges rather than an across-the-board platform for “cut and paste,” “drag and drop,” or “plug and play” genetic circuit design. The difference between “genetic engineering” and “synthetic biology” is thus characterized as comparable to the difference between artisan production by hand and mass production at the onset of the Industrial Revolution. Yet, continuing this analogy, both approaches aim for a similar goal even if they end up being realizable on different production and economic scales.
These fine-grained historical distinctions between what the general public may perceive as a single discipline (usually called “genetic engineering” or “biotechnology”) reveal strategies of internal posturing within synthetic biology. Such characterizations of others in the general field serve to mark significant conceptual distinctions in approach, in the hope of garnering financial support from those who can help grow the field of synthetic biology, whether granting agencies or biotech corporations. Within the discourses of generative architecture, earlier publications like Estévez’s Genetic Architectures series and Antonelli’s Design and the Elastic Mind refer mostly to genetic engineering without distinguishing it from synthetic biology. Since the publication of Myers’s BioDesign and Synthetic Aesthetics, more designers and architects have begun using the latter term.
Just to be clear, in this book the terms are distinguished by a few factors. Genetic engineering refers to altering DNA sequences in a particular organism’s genome in order to affect some aspect of its form or function while intending that essentially the same organism result, ideally with the added features. With regard to architecture, these are envisioned to be large-scale organisms, not single cells, and usually it is the form of the organism that is imagined to be changed. Synthetic biology, on the other hand, seems to be less interested in altering the form of an organism and more interested in altering its function. It works primarily with fast-reproducing bacteria and views them as machines that can produce chemicals. The genetic alterations can use synthetic DNA or BioBricks, whether synthesized or extracted from natural DNA. Engineering synthetic biologists speak of their field as the means to a Second Industrial Revolution, one that is bio-based but otherwise envisioned as similar in resulting in new, highly profitable innovations based on consistent, large-scale mass production.
Few of the ideas being put forward as synbiodesign have actually been realized, since the scientific field itself is in its “infancy,” as its supporters claim, implying a long and successful disciplinary life ahead. Top-down engineering synbio is still trying to garner enough successes to substantiate its methods and frequently exaggerated claims. Synbiodesign pieces therefore usually exist as images or concept prototypes that have been published in books or included in exhibitions. Ginsberg, one of the leaders of the Synthetic Aesthetics project, notes, “It is easy to forget that many of the outputs of the residencies are fictional. Will [Carey] and Wendell [Lim]’s packaging that builds its own contents is a computer-manipulated image, as are many of the images on these pages.” A number of works included in these recent exhibitions and publications are from designers in the United Kingdom working in the vein of Anthony Dunne and Fiona Raby’s “speculative design,” “critical design,” or “design for debate.” The Design Interactions Department at the Royal College of Art now includes synthetic biology among its research areas and counts among its contributors professors Oron Catts and Ionat Zurr, who are leading the study of “contestable design.”
For example, the Design and the Elastic Mind exhibition included a section on “Design for Debate” featuring the work of faculty and students from the Design Interactions Department. Two of these works were Burton’s Nanotopia for the Future Farm project (2006–7) and The Race (2006–7). In the first, he visualizes human bodies transformed through genetic engineering, synthetic biology, nanotechnology, and pharmaceuticals to grow novel body parts for harvesting for use by other people, within socioeconomic contexts of inequality that exploit the poor. The second depicts pets engineered to be extra hairy and human fingernails engineered to have cascading ridges, so that bacteria can be harbored in the hair and crevices in order to expose overly hygienic “first world” humans, living under the legacy of hygienic modern design, to more bacteria. If our immunity does not improve, the piece implies, bacteria as well as the humans and other species who live comfortably in dirty, bacteria-rich environments may survive “the race” (evolutionary competition). Burton’s work is thoughtful, questioning rather than promoting the implementation of this new technology for the ways it engages socioeconomic disparity and the injustice of an economic system that encourages those who need money to farm out their bodies, as for surrogacy. The works were also prescient, created before the recent revelations and new popular scientific knowledge concerning the extent of the human microbiome.
Similarly critical are Ginsberg’s essays and her recent creative work Designing for the Sixth Extinction. The latter depicts newly created organisms made by synthetic biology—ones that have never existed before—that are designed to “support endangered natural species and ecosystems” in the wild in line with current conservationist efforts, given that “the sixth great extinction in the history of biology is underway.” She asks, “If nature is totally industrialised for the benefit of society—which for some is a logical endpoint of synthetic biology—will it still exist for us to save?” Like Catts and Zurr, she is a master of irony. Her publications make clear that rather than reach this point, perhaps more countries should follow Ecuador’s lead in granting constitutional rights to nature. Ginsberg quotes Article 71 from Ecuador’s constitution, passed by public referendum: “‘Nature or Pachamama, where life is reproduced and exists, has the right to exist, persist, maintain, and regenerate its vital cycles, structure, functions, and its processes in evolution.’ Ecuador’s constitution charges its people not only with protecting nature, but also with the responsibility to ‘promote respect towards all the elements that form an ecosystem.’”
Other works of synbiodesign seemingly harbor less criticality toward the aims and means of synthetic biology, serving more as visionary applications of the foreseen technology even if their creators say they intend to provoke debate. For example, Natsai-Audrey Chieza’s Design Fictions, like Burton’s piece, also imagines future bodies as farms; she materializes this vision through the creation of a “very precious, very valuable Genetic First Aid Cabinet.” Whereas Burton’s text accompanying his piece highlights socioeconomic injustices that may accompany body-farming, Chieza’s piece simply validates bodily alterations as precious, valuable, and even common, like do-it-yourself (DIY) sculptural tattoos made from stem-cell alterations. Her website claims that the “project makes us reconsider the role of the designer whose manufacturing process is likely to take place in a laboratory in 2075.” Such a statement reads more strongly as an affirmation of the likelihood of this happening rather than as a “reconsideration” or a “debate” about making a choice in the first place not to pursue these technologies for the human body at all.
Both Chieza and Amy Congdon were students in the Textile Futures Program at Central St. Martin’s College of Art and Design, London, where designer Suzanne Lee used to teach. Congdon’s Biological Atelier, like Chieza’s work, imagines that “biotechnologically engineered high fashion . . . might be realized one day soon.” She imagines “growing objects in the lab from our own cells or those of animals” that could be used for “personalized and renewable fashion.” Her images depict bracelets, a brooch, and a collar that are “grown, not made,” using “developments in the fields of biotechnology to create materials” such as “cross-species fur” or “ethically-grown” “victimless ivory.”
A few designers working at the architectural scale are also promoting the use of synthetic biology. Marin Sawa is another student who passed through the Textile Futures Program and has since earned a Ph.D. from the Energy Futures Lab, Imperial College, London. Her project Microalgerium aims to create textiles interwoven with “hybrid algal species” engineered to secrete oil and release ethanol for use architecturally in “everyday spaces” to “prevent and eliminate pollution and waste.” Sawa was inspired by Rachel Armstrong’s “protocell” research, which Sawa summarizes as work “where an artificial cell was created and programmed with a basic behaviour.” The chemical reactions of pre-protocells differ from the top-down approach of genetically engineering hybrid algal species, but Sawa sees both as applications of synbiodesign. She astutely recognizes the current limitations she faces in realizing her project in architectural spaces outside a laboratory. “We realize that the creation of an engineered biological entity must be contained within the lab and not outside the lab because of its unverified synthetic biohazards to our ecosystem,” she states. “The idea of genetically encoding a biological logic of death in the case of unwanted leakage is great,” she says, referring to the idea of putting “kill switches” into genetically engineered species released into the wild. “But I think if we were to get new designs of synthetic biology out of the lab, it would be equally interesting and imperative to design secure containment and disposal systems in our physical world as a natural by-product. In this sense, this design tool actually contradicts my interest in creating an open metabolic relationship between ‘living’ textiles and the rest of the biosphere.”
Others, such as the Spaniards Eduardo Mayoral Gonzalez and Alberto Estévez, who are less troubled by such difficulties than Sawa is, imagine using trees engineered to bioluminesce to provide nighttime light in urban areas. Bioluminescent light is very dim, so this proposal is actually impractical for its stated purpose, yet the designers do not mention this fact. Lastly, Benjamin has been teaching a studio on synthetic biology and architecture at Columbia University’s Graduate School of Architecture, Planning, and Preservation, which has produced a number of interesting student projects included in Myers’s BioDesign (as were the works of Chieza, Congdon, Sawa, Mayoral Gonzalez, and Estévez). For example, Mike Robitz’s project Googol Puddles imagines using urban bodies of water to store data encrypted in the DNA of bacterial organisms living there. Another project, Algal Filter Machine by Benjamin, Nathan Williams Smith, Geoff Manaugh, Mark Smout, and Laura Allen, proposes a system to remove carbon dioxide from urban air to feed algae designed by synthetic biologists to create biofuels, the algae thereby “acting as engines and filters for the environment simultaneously.”
Ginsberg is one of the few designers who mention horizontal, or lateral, gene transfer as something rightly to be considered if synthetic biology comes to be practiced on a larger scale. Microorganisms, including bacteria, readily swap genes when they are in an environment where they need a function that other nearby bacteria have. Bacteria are the chassis of choice for the biofabrication of chemicals that could eventually be used for design purposes, although most designers currently imagine plant or animal cells being manipulated for their designs. With bacteria, however, how will the intended genetic modifications be stabilized, especially if they do their “work” outside the laboratory in the external environment, as imagined? Critics do note the possibility of engineered bacteria evolving—which to most people implies random mutation rather than lateral gene transfer—and so they propose mechanisms such as “kill switches” or “programmed cell death” that can be activated to terminate the engineered species, assuming one knows a problem exists. If engineered bacteria are in the wild, however, such knowledge would be unlikely, and theoretically the engineered genes could transfer into a non-engineered bacterium that does not contain a kill switch. Lateral gene transfer thus poses a fundamental problem for synbiodesign of this sort.
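The containment problem described above can be illustrated with a toy simulation. All of the parameters here are invented for illustration, and the model is far simpler than real microbial ecology: it merely shows, in miniature, why a kill switch that terminates every engineered cell still cannot recall gene copies that have already transferred horizontally into wild cells lacking the switch.

```python
import random

# Toy model (all parameters invented) of horizontal gene transfer
# escaping a kill switch: engineered donor cells can pass their gene
# to wild cells each generation; the kill switch later removes every
# engineered cell, but not the escaped copies.

def simulate(generations=50, transfer_rate=0.01, kill_at=10, seed=1):
    random.seed(seed)
    engineered = 100   # cells carrying the engineered gene plus kill switch
    wild = 1000        # wild cells with no kill switch
    escaped = 0        # wild cells that acquired the gene via transfer
    for gen in range(generations):
        if engineered > 0:
            # per-cell chance of acquiring the gene, scaled by donor count
            for _ in range(wild):
                if random.random() < transfer_rate * (engineered / 100):
                    escaped += 1
        if gen == kill_at:
            engineered = 0  # kill switch fires: all engineered cells die
    return escaped  # gene copies persisting beyond the switch's reach

# With no transfer, containment works; with any transfer, copies escape.
print(simulate(transfer_rate=0.0))
```

The design point is the asymmetry: the kill switch acts on cells, while lateral gene transfer moves the gene between cells, so once a copy crosses into the wild population the termination mechanism no longer applies to it.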
“Bottom-Up” “Protocell” Architecture
The protocell branch of synthetic biology focuses on very different questions than does engineering synbio. These concern the definition of life as constituted by criteria on which scientists more or less agree (Figure 6.4). This textbook diagram uses a cell-like circle to frame the “operational functionalities of living systems,” which are divided into three rectangles with different systemic qualities. This makes the system appear relatively simple, yet each of the three rectangles contains several distinct properties, allowing perception or interpretation of the “three” necessities with greater or lesser reductionism. Different scientists do in fact use different criteria to get at the essence of life. Some define life using biologists Humberto Maturana and Francisco Varela’s concept of “autopoiesis,” which focuses on cellular life. Cells are dissipative systems that possess a membrane, a semipermeable boundary that both separates them from and allows a selective connection to their environment. Within this boundary, a cell regenerates all it needs to live by drawing on materials from its environment. Chemist and synthetic biologist Pier Luigi Luisi describes life in fairly technocratic terms as “a factory that makes itself from within.” He holds that “a system can be said to be living when it is defined by a semipermeable chemical boundary which encompasses a reaction network that is capable of self-maintenance by a process of self-generation of the system’s components from within.”
Instead of autopoiesis, others use the “chemoton” concept first proposed by Tibor Gánti: “a minimal living system is a chemical supersystem comprising three systems: a metabolic network, template replication, and a boundary system,” where all three systems are “autocatalytic.” These are demonstrated in the inner triangle of the diagram as the primary protocell components. Alternately, Frank Harold, a cell biologist, proposes that “architecture is what ultimately distinguishes a living cell from a soup of chemicals of which it is composed”; “how cells generate, maintain, and reproduce their spatial organization is central to any understanding of the living state.” Others add to these criteria the need for genetic material (DNA or RNA) and homeostasis. Finally, protocells have been “defined in various ways, ranging from a plausible representation of a hypothetical precursor to the first biological cell, through to the description of a synthetic cell-like entity that contains non–biologically relevant components.” The latter is a goal of some synthetic biologists who aim to synthesize novel minimal cells from scratch either to avoid using current living organisms as technological machines or to design a minimal organism most efficiently toward a certain product outcome, without any “extra” stuff that evolution might have provided along the way. A similar move, in this case to create a novel genetic information system (XNA) that could be used as an alternative to the A, C, T, and Gs of DNA so as not to interfere with the evolution of organisms with DNA, has recently been made at Cambridge University by Philipp Holliger, Jason Chin, and others.
The most visible aspect of these characteristics evident in pre-protocells created in laboratories is the membrane bounding the vesicle (Plates 2 and 14). Because this membrane is what isolates the fluid and chemicals inside from the liquid outside, it is fundamental to the creation of a pre-protocell. In theories of evolution, however, the primordial sea is thought to have contained supramolecular chemical aggregates prior to their joining together to form a cell. For example, cell membranes comprise a bilayer of lipids that are made up of fatty acids “with a sufficiently long linear hydrophobic chain,” phosphate and glycerol; so, in order for a membrane to form, these constituents must already be present. It is not easy, by the way, to theorize the processes by which all the different chemical constituents came into being in the first place. Although the primordial sea may have contained supramolecular chemical aggregates such as DNA or RNA prior to the formation of the first cell, in the creation of pre-protocells today this material—the source of “template replication,” heredity, or “genes”—is not always included. This is because cells have so many important parts that also must “self-assemble” that research into the creation of protocells focuses on understanding each of these different facets. For example, the pre-protocells that Armstrong and Beesley create do not contain any genetic material—at least, this is not stated in the publications describing them. Rather, they are chemical vesicles or constructs that exhibit chemical reactions without having even the three basic features required for life, as defined by Gánti’s chemoton concept or by the textbook diagram (Figure 6.4).
Armstrong collaborates with some well-known protocell researchers at the European Center for Living Technology and at the Center for Fundamental Living Technology (FLinT) in Odense at the University of Southern Denmark. In 2011, when she and Neil Spiller guest-edited an issue of AD, titled “Protocell Architecture,” she was a visiting research assistant at FLinT, where Martin Hanczyc, Steen Rasmussen, and Mark Bedau are based. Rasmussen is also affiliated with the center in Venice, along with Norman Packard, and both also are connected to the Santa Fe Institute. Together, these scientists have edited the primary textbook on protocells, Protocells: Bridging Nonliving and Living Matter (2009), and authored numerous articles, including “Living Technology: Exploiting Life’s Principles in Technology” (2010). Hanczyc also published an article in Armstrong and Spiller’s “Protocell Architecture” (2011).
Armstrong and these researchers often use the words “living” or “living technology” to describe pre-protocells, which they characterize as having the lifelike qualities of being “robust, adaptive, self-repairing, self-optimizing, autonomous, intelligent, and evolvable.” “We deem technology to be living if it is powerful and useful precisely because it has the core properties of living systems,” they write, “including such properties as the ability to maintain and repair itself, to autonomously act in its own interests, to reproduce, and to evolve adaptively on its own.” They predict that “as our technologies increasingly embody such core properties of living systems, they will become increasingly powerful, natural, and sustainable,” although they offer no support for why the latter two qualities should follow. They state that in the past, humans harnessed oxen and horses as sources of power to do work, although with the invention of the internal combustion engine and the onset of the Carbon Era, animals were replaced by machines. “In the coming technological revolution, the technological systems themselves will become alive or very much more lifelike,” they state, “bestowing the advantages of life on the wider sphere of material and technical innovation.”
In contrast to Armstrong, in his own publications Beesley more cautiously uses the term “near-living” to refer to the artificial-life qualities of Hylozoic Ground, created in collaboration with Armstrong, as well as those qualities present in many of his other responsive and interactive installations. Although his architectural firm is named Living Architecture, his writings make it very clear that his research bridges the domains of architecture loosely defined and “hard” artificial life, which explores physical and material “implementations of lifelike systems” such as those used in robotics. Beesley uses techniques of generative design to create exquisite environments made from tens of thousands of laser-cut acrylic pieces that are connected into meshes, membranes, and webs and suspended from ceilings so that viewers can walk under and through them. The works have what might be called appendages, branches, fronds, and feathers that are embedded with tiny microprocessors, sensors, and lightweight actuators connected into a distributed communication and control system. Together, they function kinetically, moving slowly in response to the presence and motion of people in the room or other factors they are designed to sense. His works often evoke emotional, affective responses in viewers even without the addition of “protocell” flasks; when these are present, they add one more layer—a “wet” one—to the artificial lifelikeness of his work.
Armstrong communicated her concept of Future Venice not only through the “protocell” flasks in Hylozoic Ground but also through talks accompanied by visualizations, still and video, created by Christian Kerrigan, an artist in residence at the Victoria and Albert Museum around 2010 when Armstrong and Hanczyc were discussing protocell research. Armstrong and others describe the mixture that creates pre-protocells, as was demonstrated by the flasks in Hylozoic Ground, as “reminiscent of salad dressing.” Should large amounts of this actually be dumped into the lagoon, many people might worry about effects equivalent to an oil spill damaging the local ecology. Armstrong asserts that “protocells” form at the interface between the oil and mineral-rich water, but she does not discuss the fact that the lagoon is open water and not a scientifically controlled glass vessel. Currents or variations of the proper chemical composition certainly would hinder this predicted formation, even if the “protocells” are “chemically programmed” to move away from light (Figure 6.2). In fact, a number of Kerrigan’s renderings actually depict the opposite of what Armstrong claims will happen (Figures 6.5 and 6.6). They show the canals and lagoon filled with rock formations. Rather than omit his images or explicitly address their critique of her vision, Armstrong publishes them without a hint of recognition. Is her strategy similar to Matthias Hollwich and Marc Kushner’s, whose Econic Design can also be read two ways, straight and as farce, or is her work meant to be “design for debate”? Is her oil spill in a lagoon equivalent to their use of kudzu, since both are forms of humanly caused environmental devastation that now are being imagined to form the basic infrastructure of a new “sustainable” urbanism?
Armstrong and others following her lead imagine that “protocells” can form a revolutionary sustainable architecture. Paul Preissner, architectural professor at the University of Illinois at Chicago, asserts, “It only takes a few moments to be taken in by the utterly fantastic possibilities protocells offer the world; for example, these real and shapeable life forms promise to grow us limestone faster than limestone. Starting from oil and water and a few more things,” he explains, “the resulting calcification suggests a material residue that is not only agreeable, but also useful, essentially giving us the ability (not unlike our novelty plant-imal the Chia Pet) to grow our surrounds—although, instead of sheep or heads of hair, we can think about growing our buildings. Buy some land, mix up some salad dressing, sit back a couple of decades, and then move right in. Wild.” Armstrong believes that “ultimately metabolic materials will give rise to a whole new range of architectural forms that could potentially represent the architectural equivalent of the Cambrian Explosion, when according to fossil evidence, most major groups of complex animals suddenly appeared on the earth around 530 million years ago.” Wired magazine writer Bruce Sterling writes, “I really enjoy that rhetorical way in which Dr. Armstrong ‘talks architecture’ while saying some of the weirdest stuff imaginable.” Enjoyment is one thing; taking this vision seriously is an entirely different matter. Preissner calls protocell architecture “utterly fantastic” and uses the phrase “to be taken in,” implying gullibility. Are we supposed to find these visions credible?
For example, Armstrong seemingly forgets, but then remembers, that human beings do not thrive in aqueous environments, or perhaps she just forgets that architecture as a discipline has been and is intended for human occupation. Throughout her various publications, she predicts “protocell” cities of the future as the new sustainable architecture, equipped with “the principles of emergence, bottom-up construction techniques, and self-assembly,” albeit necessarily in a wet environment. In contrast to the “traditional architectural approach to meeting the challenges of hostile environments” by creating “the most effective possible barrier between nature and human activity, using durable and inert materials,” she prefers the ways that “algae, shellfish, and bacteria have claimed a construction process” within the harsh terrain at the edge of waterways by “accreting, secreting, remoulding, and sculpting the materials of their surroundings to create tailored micro-environments.” Human architecture “has worked sufficiently effectively for human development” in the past, she states, “but on an evolutionary timescale it’s not how the most resilient structures persist.”
Armstrong rarely acknowledges the need for persistent wet conditions for protocell action as a “design limitation” that requires a “troubleshooting” solution. To remove the necessary protocell “medium” of water from the space presumably occupiable by humans, she proposes at times the “creative design of water removal systems” that would still permit the “feeding” of “computational materials,” by which she means the “protocells,” referencing complexity science’s terminology of “natural computing.” One such system could use porous rock to offer structural rigidity to the building while functioning as a water supply for “chemical computers.” This might cause one to ask why the protocells are necessary at all, if a stone structure already exists—after all, stone lasts a very long time. To this, Armstrong might reply that “protocells” offer “self-repairing architecture.” Since they secrete calcium carbonate—what Preissner refers to as limestone—then presumably if the porous stone started to erode, perhaps owing to constant water motion inside it, the protocells could reinforce it. Yet without water being added, the stone is highly unlikely to erode.
The ambiguity of whether “protocell” architecture is actually imagined to be structural as opposed to just a “self-repairing” surface persists in other examples. Consider, for example, Lisa Iwamoto’s contribution to the AD issue on protocell architecture, whose title—“Dynamic Array: Protocells as Dynamic Structure”—implies that “protocells” create a structure, albeit a dynamic one. (This raises another interesting question: Is “protocell” architecture motile like water, or stable like limestone?) Yet the caption to the first image in Iwamoto’s article states, “Detail view showing aggregation of protocells along lines of structure,” calling into question whether the “protocells” are the structure or whether they are on it. Iwamoto writes, “A driving concern for Line Array (2010) was how to envision a protocell modality suitable for architecture that could be applied to a range of structural surface formations. Protocells are used here as a self-organising structural matrix,” but this is surely on the surface of another structure, since she asserts that “protocells” in their aqueous environments have the ability to “circumvent gravitational conditions as well as aggregate without concern for larger-scale, hierarchical structure.” Similarly ambiguous about the structural use of “protocells” given the rest of her explanations of “protocell” architecture, Armstrong envisions protocell paint for buildings that she is developing in collaboration with chemist Leroy Cronin at the University of Glasgow. “If buildings were covered in a layer of [protocells], they would act as a sort of smart paint, absorbing carbon dioxide from the atmosphere,” she states. “When the building got wet the mineral salt would dissolve, react with the carbon dioxide in the rain, and produce a deposit of mineral carbonate, which would strengthen the bricks.” Note the bricks. As “carbon dioxide would be removed from the atmosphere,” she asserts, “over time . . . 
the building would become more robust.” Yet elsewhere, she and architectural professor Neil Spiller refer to “protocells” as a “synthetic surface ecology.”
The contradictions over “protocell” use as structure versus surface deepen when Armstrong attempts to explain how “protocell” architecture is superior to “green building.” She claims that her approach is unlike green-bling, or “gling” architecture, which she criticizes for covering buildings with greenery when the building itself, made with the same normal structural materials, keeps “the fundamental unsustainability of modern architectural practices” unchanged. How is this different from using stone as the porous structure of a protocell building or applying protocell paint to a brick building? “Green walls and roofs require constant energy, water, artificial fertilizers, maintenance, and a high upfront cost to create the illusion of a mature and self-sustaining ecosystem,” Armstrong argues, adding, “Once installed, these systems are resource-intensive and require daily upkeep from external sources, which effectively outweighs any environmental benefit they offer.” One of these benefits is the removal of carbon dioxide from the atmosphere, the same benefit Armstrong extols for “protocell” architecture, which also requires energy in the form of light, chemical additives, and maintenance, even if the upfront cost of olive oil might be considered to be relatively cheap.
Armstrong states these points explicitly when describing how “protocell” “salad dressing” can replace traditional ready-mix concrete to hold up fence posts. “You take your spade of [new] ready-mix concrete and stir it into a bucket containing a greasy solution, reminiscent of salad dressing. The solution congeals as the chemistry of the concrete is taken up into the protocell droplets, and you pour the mixture into the hole.” She describes how “the mixture swells and almost instantly supports the pole with its turgor”; “it now resembles a large lump of jelly. Bubbles start to appear and are quickly turned into a precipitate as the released carbon dioxide from the reaction is absorbed into a solid form.” Time passes. “The sun comes out. . . . The world turns, the rain falls, the snow comes. . . . By the end of the year, it is time to add a new protocell material to the base of the post. This is a species of strengthening agent.” Yes, “each year, you come back to the post and make an assessment regarding what processes are required for the post to be kept in place, and each year a new protocell species is added.” Instead of “gling,” she desires a “new kind of biology [insert chemistry] for the built environment that is native to its context and . . . genuinely sustainable. In order for this to happen, the basic materials that underpin this system need to be developed using a bottom-up approach.”
What does it mean to use a “bottom-up approach” when humans are the ones developing a chemical system, picking its basic ingredients, and adding the new “protocell species” required each year? A fundamental contradiction in agency is at play here between humans designing something and putting all the parts together and something just making itself through self-assembly. The latter requires all the right materials being proximate to one another in the correct environment, which they would not be now without human design and action. Even if cells formed in the primordial ocean, at this point pre-protocells are being created in laboratories and in artworks by humans thinking very carefully about what molecules are necessary to produce the proper precipitates. This process is no different from what humans have always done—combining materials to produce desired effects—except now owing to the popularity of complexity theory and self-organization, some are strategically calling this “self-assembly” since chemistry happens.
The tiny scale—half of a millimeter—of a pre-protocell just exacerbates the bottom-up architectural problem, although truly building from the bottom up, molecule by molecule, is by definition always going to be very small. Its precipitates are even tinier. Just how much olive oil and mineral additives, floating in (a contained body of) seawater, does it take to “self-assemble” a human-scale building, presumably with rooms inside? How are “protocells” “chemically programmed” to create rooms? “Protocell Architecture” is full of blown-up images of micron-sized “protocells” allowing us to see and imagine this otherwise nearly invisible-to-the-naked-eye future. Cronin describes in his article in the issue, “Defining New Architectural Design Principles with ‘Living’ Inorganic Materials,” how his lab aims “to reduce the fundamental building block of building materials from the centimetre (real bricks, nails, concrete blocks) to the same dimensions as the building blocks of biology and to produce inorganic cells. Imagine the outcomes of establishing such a paradigm,” he writes. 
“Buildings would have a cellular structure with living inorganic components that would allow the entire structure to self-repair, to sense environmental changes, establish a central nervous system, and even use the environment to sequester water, develop solar energy systems, and regulate the atmosphere, internal temperature, and humidity using this decentralized approach.” The stratospheric air of his vision is brought back to earth by the caption for an image of a crystal tube that states, “The diameter of the tube is around 0.0001 millimetres.” Yet, his own description jumps from the micro to the global scale: “To be useful, to create systems with this degree of sophistication requires a robust chemical library of structures with embedded chemistries that are adaptive, resilient, environmentally compatible, and realisable on a global scale.” The word “global” can, of course, be used relatively to speak of the broad limits of a system, as it is in complexity theory and in generative architectural practice. With a slight bit of caution, he adds, “The global deployment of such a fundamentally new building platform, though, should probably not be permitted until we are able to get to grips with the concepts of artificial inorganic ‘living technology.’” This could take a while.
Full Circle. Stop.
Synthetic biology is the last scientific field discussed in this book, which has ranged through the founding of complexity theory in relation to cybernetics and its current ideological manifestation in complexism (self-organization, emergence, natural computation), to changing theories of biological evolution, morphogenesis, genetics, and epigenetics, to the applied practices of tissue and genetic engineering and synthetic biology. Fittingly, the two branches of the latter bring us full circle to a few themes that have surfaced multiple times across the different chapters.
The first recurring theme is the pervasive application of complexism, informed by materialism, as a primary interpretive tool for characterizing the processes of the three main systems explored here together: architectural, computational, and biological systems. Complexism provides the language of “bottom-up” versus “top-down.” The former is heavily favored now for its convenient and useful connotations of being untainted by human intervention and of matter having agency. Because the universe consists of energy and matter, which Einstein showed to be equivalent, and matter takes shape in the form of atoms, and because atoms combine into molecules and molecules constitute the physicochemical basis of cells, many scientists and philosophers of science consider life to be an emergent property of the self-organization of matter. Although the sciences of physics and chemistry have thrived under reductionism—the idea that wholes can be broken down into and sufficiently understood and described by their constituent parts—biology is posited by some to be a “special science,” one that cannot be reduced to the laws of physics and chemistry. This debate informs the earliest definition of self-organization by philosopher Immanuel Kant in 1790, who was attempting to identify what separates living organisms from other objects, as well as the degree to which science can help us understand what is unique about living organisms. Thus, the founding of the discipline of biology is intertwined with the philosophy of emergence and its related concept, self-organization. What are now facets of complexity theory are key framing devices by which even disciplines cordon themselves off from one another.
Interestingly, although the bottom-up protocell research branch of synthetic biology embraces complexity theory for its descriptions of self-organization and acquired hierarchies of order and complexity—which seem necessary for the move from self-assembled vesicles to protocells to actual cells, with their spatial architecture, organelles, et cetera—the top-down engineering branch of synthetic biology keeps complexity theory at a distance. This is because the former needs the molecules to do the hard work. Scientists can provide the right chemicals in the right environment, but the “life” is going to have to “emerge” through “self-organization” and “self-assembly.” On the other hand, top-down engineering synthetic biologists are striving for control over an already living system; they take living cells and alter their genetic makeup and hope for a precise outcome. The fact that as recently as 2014 students in a synthetic biology course that I took at UC Davis were still being taught the central dogma of molecular biology as the ruling theory reveals the extent to which theory is put to work to bolster the self-image of an engineer who can control, even though this theory is now known to describe, only partially, just one part of heredity, owing to new knowledge of the rest of the genome and epigenetics. Students were also not introduced to epigenetics, the rationale being that synthetic biology is already challenging enough without the addition of another system that complicates it. Critiques of the central dogma as oversimplistic given recent postgenomic discoveries are well recognized among many groups of scientists. Yet, computer scientists and most architects designing genetic algorithms and synthetic biologists engineering organisms still teach the dogma as the primary theory. Its simplicity, without the complications of postgenomics or complex systems interactions, is necessary for a relatively easy theorization of how genetic circuits work.
Without this, it becomes much more difficult to uphold the idea that synthetic biology can in fact produce standardized products. If other branches of biology are aware of systemic processes that are environmentally responsive and clearly affect gene regulation and heredity, then the fact that engineering synthetic biologists can willfully ignore this knowledge in order to shore up their nascent discipline simply shows how determined they are.
In contrast to bottom-up approaches that are all the rage, top-down engineering synbio carries associations of a bygone modernist paradigm of control, except for the fact that this paradigm is anything but gone. Scientists working in this field are unabashed about stating their desire to design, engineer, and control. Yet, even beyond the field of synthetic biology, natural or social or economic processes that are cast as “self-organizing” are still targeted for co-optation by scientists, social scientists, economists, and designers. These practitioners either want to predict and profit from the outcomes of these systems or intend to use “self-organization” to generate designed outcomes. This use or control of so-called self-organizing systems produces a problematic agency and fundamental definitional tension, owing to the “self” part. That aspects of the modernist paradigm are alive and well is made clear simply by the engineering mentality of synthetic biology (Figure 6.7). As this comparison of images first put forward by Ginsberg in one of her essays in Synthetic Aesthetics shows, the hand of the designer and the pipette referencing a scientist’s hand reveal the agency that creates the design, be it Le Corbusier’s ideal city Ville Radieuse (1930) or JCVI-syn1.0 (2010), the first self-replicating synthetic bacterial cells ever created. Similarly, Patrik Schumacher is not shy about his dislike of postmodern and deconstructivist architecture and favors a return to the control and homogeneity that modernism offered (Figure 1.14). Whereas biologists try to understand living organisms, synthetic biologists want to manipulate and use them. Bottom-up protocell research intends the creation of protocells for use in synthetic biology as artificial cells that can be engineered to produce desired products, as well as for knowledge gained about the creation of life—which after all, if it is achieved, will put humans in a position once occupied by the deities.
The few articles that address facets of complexity theory or systems biology in relation to top-down engineering synbio primarily pertain to the issue of “stochasticity” or “noise” in gene circuits. Gene circuits are mathematically modeled in computers, but when DNA plasmids are actually inserted into an organism, “very often inherent stochasticity of gene expression process strongly influences dynamic behavior of the network.” In fact, “stochastic fluctuations (noise) in gene expression can cause members of otherwise genetically identical populations to display drastically different phenotypes.” Therefore, “an understanding of the sources of noise and the strategies cells employ to function reliably despite noise is proving to be increasingly important in describing the behavior of natural organisms and will be essential for the engineering of synthetic biological systems.” Scientists therefore are exploring different methods to regulate the effects of “noise,” for example by designing transcriptional cascades of different lengths that can either “attenuate or amplify phenotypical variations depending on the system’s input conditions.” It may be that “noise” is due to epigenetic processes that most assuredly can produce unpredictable phenotypic outcomes in engineered cells.
To phrase this limited use of complexity theory in engineering synbio slightly differently, top-down synthetic biologists draw a narrower boundary around what they see as their “system” than do biologists working in the fields of cell biology or systems biology. Top-down synthetic biologists limit their view primarily to the engineered genetic circuit. They apply complexity theory within the “system” of the genome and the gene circuits they create and, after installation, to its phenotypic expressive variability. They do this instead of, from the outset, integrating elements outside the gene circuit—epigenetic markers such as methylation patterns or other important interactive molecules already present in the “chassis” cell—into their theoretical and mathematical models. As scientists Pengcheng Fu and Cliff Hooker summarize, “Systems biology is inherently a universe in which every ‘ome’—genome, transcriptome, proteome, metabolome, interactome, phenome, and so on, is another dimension. We have to reduce this dimensionality through integration in order to comprehend, evaluate, and make use of the information.” Or, as biologist Michael Bölker states, “The inherent complexity of biological systems renders any strict calculations impossible and thus poses an enormous challenge to synthetic biology.” Because of this, “two alternative strategies have been adopted by synthetic biologists to deal with this problem: (1) Reduction of complexity by applying engineering principles to biology like standardization and modularization and (2) orthogonalization through chemical or biological modification of synthetic cells to prevent genetic interactions with other organisms.” Selectively drawing one’s boundaries around a system so as to frame it as one needs to see it is common practice for many people in both science and daily life. Yet, doing so does not remove the actual interactive relations that extend beyond the edge of the imagined boundary.
Narrowly drawing system boundaries just provides a short-term conceptual coping tool. This is as true for how a synthetic biologist chooses to apply complexity theory—either for its concepts of self-organization and emergence, as protocell researchers do, or for its mathematical tools for analyzing stochasticity and contingency, as top-down synthetic biologists do—as it is for how synthetic biologists construct the history of their discipline in order to differentiate it from genetic engineering.
The second recurring theme in both branches of synthetic biology that echoes other instances throughout this book is the potential confusion caused by ambiguities of language and presentation. That Armstrong uses the phrase “protocell architecture” instead of “pre-protocell architecture” offers a prime example since a protocell has not yet been created and is the holy grail of protocell researchers. Granted, Hanczyc, the synthetic biologist with whom she collaborates, also talks misleadingly about pre-protocells that he makes as if they are protocells, or at least “simple protocells.” He also uses the present tense in talking about protocells: “Protocells are . . . made in a laboratory. . . . The protocell is motile.” (Sometimes, he more modestly uses “protocell-type structure”; this is closer to what other contributors to the Protocells textbook use, the term “pre-protocell.”) If scientists discuss pre-protocells as if they are protocells, then architects who collaborate with them may be excused, except for the fact that their use of the word “protocells”—not just Armstrong, but others who follow, work with, and promote her—gives readers who do not know better the impression that protocells already exist. This makes her pronouncements seem much more realizable and possible, and gives her vision more respectability, on top of the fact that she collaborates with scientists and is frequently cited as a medical doctor. Outside of synthetic biology, a similar problem occurs with use of the word “victimless” by scientists, journalists, and designers using processes of tissue engineering who either fail to realize or acknowledge that the primary nutrient medium used contains fetal calf serum (FCS) from slaughtered calves and cows. A different type of ambiguity in generative architecture occurs from the use of words that sound the same but refer to very different technologies or modes of production.
Usually the words or phrases refer to digital modes of production, but they sound biological—namely, “gene,” “genetic,” “genotype,” “DNA,” “morphogenesis,” “phenotype,” “organism,” “species,” and “phylogenesis.” Another group of similar-sounding terms, some of which confuse the subject/object position as to what is computing and what is computed, includes “natural computation,” “material computation,” “biomolecular computation,” “biocomputation,” “biosynthesis,” and “synthetic biology.”
The third theme that connects this chapter with the others is the presence of eugenics, although the word itself is never used. Its stand-ins are words such as “optimization,” “enhancement,” and “improvement” when used in tandem with any technique of genetic engineering. For example, in one “design for debate” project called Child Force by Marei Wollersberger that Marcos Cruz and Steve Pike included in “Neoplasmatic Design” (2008), the designer “explored the impact of gene technology and its ensuing ideology in relation to our current move towards heightened surveillance.” Anthony Dunne describes her work as a “cautionary tale.” Yet in the footnotes of his description of her research process, he writes that Wollersberger consulted “Horst Voithenleitner, psychologist and director of the International Social Service (ISS) in Vienna, about our current understanding of the social role of children and how optimisation of our genetic make-up could impact on this.” His use of the phrase “optimisation of our genetic make-up” shows transference of the conceptual coding process of optimization from the digital realm of design into and onto that of our hypothetical human future. Michael Weinstock and computer scientists Sathish Periyasamy, Alex Gray, and Peter Kile make a similar move when they state that optimization is part of actual evolution. The latter three write, “Evolution is an optimization process, where the aim is to improve the ability of a biological system to survive in a dynamically changing and competitive environment.”
Top-down synthetic biologists are also proclaiming the imminent creation of “enhanced” human beings through the use of CRISPR/Cas9. Such scientists are perhaps the most vociferous and respected proponents of eugenics today. Consider, for example, the words of Juan Enriquez, who chairs the Genetics Advisory Council at Harvard Medical School, and Steve Gullans, who was a professor there, in their 2015 coauthored book Evolving Ourselves: How Unnatural Selection and Nonrandom Mutation Are Changing Life on Earth: “CRISPR can be repurposed to cut, paste, and edit any DNA sequence into or out of any genome quickly and easily, not just bacteria. . . . CRISPR can effectively edit out any harmful DNA sequence (for example, a disease causing gene mutation) and replace it with a beneficial DNA code (a normal non-mutated gene).” Unlike other approaches in genetic engineering “such as gene therapy, which introduce one new gene into a genome with many complex and tedious steps, CRISPR is rapid, large-scale gene-editing technology.” They describe this as “transitioning from a mechanical typewriter, having to use Wite-Out, and retyping a word or phrase . . . to having a primitive word-processing program that allows one to swap whole paragraphs or pages in and out,” without retyping.
Enriquez and Gullans see CRISPR as an everyday technology, one that DIY biologists and high school kids alike will have access to; the possible uses for it are seemingly limitless. “But by far the most important impact of CRISPR,” they write, “will be on the modification and evolution of humans.” Though they acknowledge lightheartedly that this might be controversial, they are certain that our “broad ethical debate and education” on this technology will lead us to decide to use it. This pronouncement is similar to how “critical design” urges debate with the same expected outcome. “Reasonably soon we will find a safe way to engineer long-term changes into our descendants. When we choose to do so, we will begin to shape the species according to our own set of instructions and desires,” they state. “This is not just unnatural selection altering and shaping what already lives, this is nonrandom mutation rapidly creating and passing on something new. So let’s now look at the completely uncontroversial topic of altering future babies.” They begin their next chapter, “Unnatural Acts, Designer Babies, and Sex 2.0,” with the observation that the United Kingdom in 2015 “may become the first country to allow trans-generational genetic engineering” known as “germ-line engineering,” in order to “deal with mitochondrial diseases.” In fact, a law was passed in 2015 that “carve[d] out an exception to the prohibition on human inheritable genetic modification in the UK”; it allows “‘3-person IVF’ techniques without human clinical trials, and with no required follow up of any resulting children.” Enriquez and Gullans predict “a tidal wave of genetic upgrades in humans,” ranging from ones targeting the brain to those that can make us “Forever Young, Beautiful, and Fearless.” These qualities are almost identical to those desired and pursued by American eugenicists in the 1920s and 1930s.
This pursuit of being disease-free, ageless, even of evading death, is proposed by Armstrong for “protocell” architecture. She states that “protocell” architecture is “self-healing” and “self-repairing,” but fails to mention that it can get hurt, wounded, or sick, the necessary precondition for these qualities. Throughout the special issue of AD, whenever the word “death” is mentioned with regard to “protocells,” the sentence negates its presence. “Protocells inherently engage with the principles of design,” Armstrong and Spiller write. “They manipulate and can be manipulated to alter matter in their environment, reworking and repositioning this material in time and space—a strategy shared by life to avoid entropy and the decay towards equilibrium, in other words, death. . . . [Protocell architecture] resists the equilibrium since this constitutes death.” Realize that “protocells” can do this only if their environment provides the necessary molecules for their ongoing “metabolic” reaction. In essence, then, “protocell” architecture seemingly does not age, repairs and heals itself, and never dies. This is the same future that nano-bio-info-cognitive convergence (NBIC) supporters imagine for the human body, including Armstrong’s friend, British sociologist Steve Fuller.
Fuller dedicated his 2011 book Humanity 2.0: What It Means to Be Human Past, Present, and Future to Armstrong. Late in 2012, Armstrong and Fuller together presented lectures on “Architecture and Ecology” at the Victoria and Albert Museum in London. In his review of this event for Architectural Review, Robert Bottazi states that Armstrong agrees with Fuller’s predictions in Humanity 2.0 that “the role of human beings in a world increasingly governed by convergent nano- and biotechnologies will unavoidably fade for at least three reasons”: climate change, technologies that “level out differences between living beings and inorganic matter,” and other actors (inorganic and organic) playing a role in designing the environment. “The architect will recede into the background to become more of a designer of systems of interaction rather than fixed objects.” As Bottazi mentions, Fuller promotes NBIC convergence, whose combined technologies will become the tools for designing the future world as humans assume a role akin to a demiurge. Fuller also supports transhumanism, which promotes humans’ “evolving into something different, something better” and is serving as a foundation of the new eugenics. In fact, transhumanists cite eugenics supporters Julian Huxley and J. B. S. Haldane as their forefathers. The movement’s Wikipedia page describes transhumanism as “an international and intellectual movement that aims to transform the human condition by developing and creating widely available sophisticated technologies to greatly enhance human intellectual, physical, and psychological capacities.” Protocell researchers, who work on the nano-bio portion of this predicted convergence, state in the introduction to the textbook Protocells that both the National Science Foundation in the United States and the European Commission “believe that convergent technologies will have a very large socioeconomic impact in the next 25 years.”
The similarities between NBIC proponents and less critical synbiodesigners must be highlighted, lest we overlook their common acceptance and prediction of eugenics in the rush to meet the future. For example, Susana Soares’s New Organs of Perception (2007) and Genetic Trace (2007), which were included in Design and the Elastic Mind, depict new perceptual organs on the body, such as sensitive whiskers on the eyebrows or comblike tips on fingernails that allow individuals to collect other people’s biological data through interpersonal encounters, perhaps for “selective mating.” Her website asserts that “genetic screening technologies are enabling parents to design their babies and weed out undesirable diseases.” She supports this with quotes by evolutionary psychologist Geoffrey Miller—“Within a few generations market-based genetic technology will eclipse social-sexual selection”—and biologist Kate Douglas—“1,000 years from now, people will be much more beautiful, intelligent, symmetrical, healthy and emotionally stable, thanks to 40 generations of genetic screening.” In fact, the United Kingdom passed the Human Fertilisation and Embryology Act (2008), which prevents the implantation of embryos diagnosed with a “serious illness,” including “deafness,” into a mother’s womb. In this case, should parents desire a deaf child, their wish to “design their babies,” as Soares puts it, is overruled by national law, a situation that repeats and renews the aims of laws passed in many states and nations during the 1920s and 1930s in the name of eugenics.
Similarly, the technocratic thinking that was so prevalent in eugenics of the interwar period surfaces today in synbiodesign. Chieza’s Design Fictions adopts the idea of the body as a farm, which echoes but counters the dystopia of Burton’s Nanotopia for the Future Farm project. Her pieces reflect a broad acceptance of technocratic thinking based on industrial processes, showing humans using DIY genetics to grow new aesthetic or health-giving biological prosthetics that function in part as fashion. Whether human bodies are genetically engineered to be factories that grow body parts viewed as “products” for other humans, or whether humans use other living organisms as factories to grow materials or “products” for ourselves, the underlying approach is based on means–ends thinking and a valuation of living bodies as nothing special, as mere materiality, matter, “stuff.” This “engineering mindset” carries with it corollary industrial assumptions that products/bodies should be “well-designed,” engineered, managed, and profited from—or at least should not make companies or governments poorer, as when politicians argue that women on welfare should be paid to be sterilized. This mode of thought runs deep in synthetic biology today and is fundamental to the principles of NBIC convergence. It underlies the idea of social control through rational selection, assuming, of course, that one believes social factors result primarily from genetic bases, an idea that was and is prominent throughout facets of twentieth- and twenty-first-century psychology, sociology, and criminology. Technocratic logic was, according to historian of German eugenics Sheila Weiss and to my own analysis, the most fundamental, pernicious, and ethically perverse mode of thought behind eugenics and its policies during the interwar period and beyond, as demonstrated most prominently through its effects in the Holocaust.
The language of controlling evolution is virtually identical between the eugenicists of the interwar period and synthetic biologists today, although the means by which they intend to reach their goals rest on different knowledge bases and scientific methodologies. Both speak of replacing natural selection with “rational selection” (the interwar term) or with “unnatural selection and nonrandom mutation” (the current terms). Between the world wars, “positive eugenics” aimed to increase the population of the “fit” through “race betterment,” whereas today, “eugenic algorithms” “optimize” “fitness” according to parameters set by designers. With no obvious historical consciousness behind the choice of name, software called “Eugene” offers synthetic biologists a language for creating “composite devices from collections of parts.” In the interwar period, “negative eugenics” minimized the presence of those deemed “unfit” or “dysgenic” through reproductive sterilization. The first state sterilization law was passed in Indiana in 1907, with others added through the 1930s; some state sterilization laws remained on the books and in active practice into the 1980s, and over seventy thousand individuals across the United States were involuntarily sterilized. A few governors have apologized to their states’ citizens, but so far only North Carolina has voted to make financial reparations to living victims. More recently, California passed a bill banning forced sterilizations in its prisons, in response to revelations that nearly 150 women had been sterilized there illegally between 2006 and 2010. Even worse, in the interwar and wartime period, doctors, scientists, politicians, and citizens “removed” the “unfit” through “euthanasia” or outright murder—primarily in Germany, but also in eugenically motivated race-based murders in the United States, mostly across the South.
Today, computer scientists design kill strategies in eugenic algorithms while synthetic biologists design kill switches for rogue species.
Along these lines, in Synthetic Aesthetics Ginsberg describes a “lease of life” concept for engineered products, which would legally and physically make them finite even if they could still be useful, according to the economically useful idea of planned obsolescence. She recounts how Oxitec Ltd. has designed and engineered a variety of male mosquito (RIDL mosquitoes) “whose progeny are designed to never live. . . . Already on trial in the Cayman Islands and Brazil, factory-grown RIDL mosquitoes are sorted by sex, and then released into the wild by the million.” After the mosquitoes mate, their “offspring will never hatch. . . . Mosquitoes are not killed as such; they are just never born. . . . Mosquitoes that never live seem to be good design.” Methods such as these, she proposes, “could offer an economic and social safety mechanism beyond the kill switch, as we continue to seek reliable ways to design a good death for our things”; a “good death” is the literal meaning of the word “euthanasia.” “While biotechnological obsolescence may become a fact of life and a functional design reality, it also marks the ultimate instrumentalization of life.” The lines blur between the past and the present, except for the fact that in the publications on synbiodesign and synthetic biology, the words “eugenics” and “euthanasia” are never used.
This historical amnesia is inexcusable. Does it stem from ignorance, hubris, or both? Whatever the underlying reason for its being so widespread, these continuities of and resonances with the eugenics of the past force us to face questions of justice and injustice, including asking who now is wielding power over “the other,” how so, and what and who constitutes “the other.” Philip Beesley wisely intended to raise these questions through the works he created for the Hylozoic series, including Hylozoic Ground, on display in Venice in 2011. The responsive, kinetic environments that hover in space, enveloping humans beneath and between their parts, are intended to make humans perceive themselves as the potential “other” in relation to “near-living” architecture. Beesley’s worthy goal is motivated by and aimed at the creation of humility, which is a fabulous position from which to begin facing questions of “othering” and injustice. Yet, efforts by some of his collaborators unfortunately, and perhaps unwittingly, work against Beesley’s intent. Dana Kulic, Rob Gorbet, and Ali-Akbar Samadani observed human responses to the sculpture in Venice, literally noting expressions and hand gestures, and then algorithmically scripted these very shapes into the movements of the fronds (“fingers”) of later sculptures in the Hylozoic series (Figure 6.8). The group wanted to heighten future visitors’ experiences by designing the sculpture’s actions to produce affect, making humans think that the “near-living” architecture is empathetic and responsive to them when, in fact, the sculpture had simply been created in their own image. What could have been “other”—and would be legitimately “other” were it actually a living organism—has merely become a scripted representation of human beings themselves.
Alternatively, Burton’s Nanotopia (Figure 6.3) addresses these important questions by directing us toward the broader contexts of socioeconomic inequalities within which future practices of human genetic engineering, or NBIC convergent technologies, might occur. What he observes is not much different from the way surrogacy is being outsourced today. In other words, new technologies often integrate themselves into existing patterns of injustice implemented through political economy, economic production, and socioeconomic stratification. In a very different sense from how Jenny Sabin and Peter Lloyd Jones address it, we return to the idea that “context matters.”
The final theme apparent in synthetic biology and throughout the book is the way the technologies discussed here, especially when implemented by generative architects or synbiodesigners, can presumably contribute to “sustainability.” In her critical thinking about the potentials and pitfalls of synthetic biology as it takes shape as a field, Ginsberg notes that industrialization and design under the mass-production and mass-consumption mentality of the twentieth century have both proven themselves unsustainable. They have also demonstrated that the profit models with which they are allied tend toward mono-cropping and homogeneity rather than diversity. Synthetic biology touts its foundational role in what promoters predict will be the Second Industrial Revolution, one that permits unabated consumption and economic growth for the twenty-first century. This is one of the primary rationales for modeling top-down synthetic biology on the principles of the original Industrial Revolution.
Yet, Ginsberg critiques this mentality that pertained to twentieth-century industrialization and to design: “Faced with the individuality of living things, does the uniform engineering vision, eliminating diversity in favor of a controllable uniformity, seem as desirable?” She describes how the “unique properties of biology are often overlooked in discussions of the industrialization of synthetic biology. Instead,” she writes, “we are presented with a vision of ‘drop-in’ replacements, which use bacteria as a mechanism to produce more of what we already have, rather than doing something more interesting that draws on the particular characteristics of biological systems.” Rather than copying techniques of mass production of design through standardization and overproduction, as these “design practices are themselves unsustainable,” or becoming obsessed with gene-centrism, she argues that we need to think anew, learning from biology itself rather than from past modes of production. “Life is more than DNA. Fetishizing DNA is limiting, and analogies with the digital world prevent us from seeing biology in full.”
A number of issues at stake in these visions of design and design education integrating synthetic biology are not being discussed by many of their proponents. Ginsberg is one of the few synbiodesigners thinking critically from within while asking excellent questions about which futures are being envisioned and why. She critiques the limitations of current work in synthetic biology. These include limitations of vision, where the Second Industrial Revolution is imagined to be almost exactly like the first despite the problems created by that original incarnation. They include limitations of practice as well. She critiques the design of “bacteria that pump out non-biodegradable acrylic acid for plastic or isoprene for tires. Once they leave the factory, these plastics and tires may be no less polluting than conventionally made ones.” Similarly, some companies are producing synthetic biofuels whose use emits greenhouse gases at levels similar to or higher than those produced by burning fossil fuels. Such uses do not work toward “sustainability” but rather shift economic profit from the fossil fuel–based economic sector toward the “glucose economy.” “The bioeconomy is sold as a sustainable glucose-powered future, a sweet medicine to remedy our dirty carbon and dangerous nuclear habits,” she writes. The “rush to build sugar-powered biofuel infrastructure is often underscored by geopolitical or economic pressures around energy, rather than a desire to maintain biodiversity or seek out good design.” Is there even “enough land to feed planes and cars and products as well as people and animals, and can such large-scale monoculture be sustainable?”
This returns us to considerations of spatial and material limits, given that the size of our planet and the material formations it offers are finite. The materials used in the production of microprocessors and computers—not to mention many other products—are being “plundered,” to use Steve Pike’s word choice, at a rapid rate. Yet, architectural visions of the future—such as Weinstock’s “self-aware” cities, artificially living environments such as Beesley’s sculptures were they adapted to an architectural scale, and even an abundance of plain old “smart” “green” buildings—rely on microprocessors, sensors, actuators/motors, and distributed communication and control systems. These require both materials and energy, not only in their operation or useful life but throughout their full life cycle, and the processes that produce them, along with the materials and energy that power those processes, generate by-product pollution and waste. Until architects, designers, manufacturers, consumers, and politicians integrate life cycle analysis into their everyday decision-making, claims of “sustainability” remain unsubstantiated. If molecules can self-assemble using only the energy of chemical bonding, such that a designer or architect growing something from the bottom up might claim that no energy is added and the process is therefore “sustainable,” then what energy and materials went into isolating those molecules in the first place so that they could be combined, assuming a technologically controlled process? Skylar Tibbits’s deployable structures, designed to be dropped from helicopters, use only gravitational energy to open, but this says nothing at all about the energy used to design, manufacture, and transport them, the materials from which they are made, and so forth.
By drawing a narrow frame around a product—looking only, say, at its immediate creation and operational use—and not looking beyond the frame to see the rest of the embedded energy, material extraction, and pollution required and produced by the whole life cycle, one can more easily call a product “sustainable” when it may well not be. Strategies of framing—whether for life cycle analysis or for delineating the boundaries of a complex system—allow us to see or not see the unsustainability or the social and economic inequities that our actions, designs, or consumer choices may promote. Framing strategically allows us to imagine and teach that we can engineer something based on a simple but incomplete theory, in the face of contrary knowledge. We can frame something so that it appears to be “bottom-up” even though it may also function “top-down.” These strategies of framing work as a process of expulsion, as Saskia Sassen notes in her study of the environmental and social brutalities being enacted under complexity in our current neoliberal global economy. For all these reasons, asking how something is framed and what that framing reveals is perhaps the most important question of all. The heavy reliance in discourses of generative architecture on the rhetoric of complexism and on current biological theory and practice both naturalizes and distracts from its many modes of instrumentalization: of life and of materiality, through ever bigger data and ever bigger digital infrastructure. These are the strongest, though not the only, reasons to question the value of the contribution of generative architecture and design as currently practiced. While some biodesign approaches hold great potential for rethinking materials as grown and biodegradable, lower-tech rather than higher-tech approaches hold more promise for future environmental and generational health and well-being.