Conclusion
“Now you have seen the temporal and the eternal fire, and you have reached the place where on my own I can discern no further.”
—Virgil’s last words to Dante in the Purgatorio
THIS JOURNEY THROUGH PANDAEMONIUM NOW ENDS. Throughout the book, I have explored the history of internet daemons, their present work, and the conflicts among and against them. Though contested, they are a constant presence online. Daemons run on the switches, routers, gateways, and other middleboxes that the internet’s infrastructure comprises. Networking happens through their constant work. Daemons inspect, queue, route, and coordinate the sending of packets across the internet. Their distributive agency enacts what I have called “flow control.” Through this control, daemons realize the conditions of transmission that govern networks online, enacting a certain optimization.
Internet daemons have come to antagonize pirates, hackers, and gamers, as well as regulators and publics. This book has also focused on the limits of daemonic control. Pirates elude daemons through cunning tactics that outrun and hide from their pattern recognition and control. Publics, conversely, try to reveal how daemons malfunction, bringing their unruly behavior to account. Yet daemons continue to confound both pirates and publics.
Within these conflicts are two very different visions for communication. The internet has many kinds of optimization, but this book has focused on two: nonsynchronous and polychronous. The latter is currently more prominent. The goal of polychronous optimization is to consolidate authority in the infrastructure in hopes of achieving an optimal metastability for the network of networks. As a result, the unruly daemons of the early internet now fall increasingly under the tempo of a common conductor, creating a stable, predictable optimization. Thus far, these optimizations have favored networks that are predictable, knowable, and desirable, as seen in the many advertisements that celebrate speed and reliable performance.
These optimizations reflect the early divides between Claude Shannon and Norbert Wiener over entropy. To recall the discussion from chapter 1, Shannon had a more positive view of entropy than Wiener. Shannon believed that entropy could contribute to information, whereas Wiener sought to avoid it altogether. Seventy years later, this debate plays out in these two optimalities. Wiener’s concerns resemble the daemonic desire for order and managed networks. Allot Communications, a prominent supplier of internet equipment, contrasts managed and unmanaged infrastructure in a sales video for its “Smart Pipe” service. “When the pipe is unmanaged,” the video explains, “bandwidth allocation is chaotic.” Managed and unmanaged easily stand in for polychronous and nonsynchronous optimization. By providing a managed service, “Allot Solutions allows service providers to gain full visibility and control of their broadband networks.” Greater flow control allows an ISP to structure traffic into tiers. The video continues: “For example, basic service plans give low-volume broadband consumers a package tuned to their needs and their budget. Advanced subscribers make up the majority of consumers who use internet services regularly but not excessively, while premium subscribers get the throughput quota priority and optimization services they need.” The managed service, which is the polychronous optimization, translates messy uncertainty and disorder into a known, organized internet. That might not be an optimal outcome in the end. Reflecting on the history of cybernetics, N. Katherine Hayles ends with a provocation: “As chaos theory has taught us, disorder is not necessarily bad, and the void is not always empty.”[1] Shannon’s optimism about entropy might be the better choice when the alternative is to “optimize, monetize, personalize with pipes managed by Allot.”[2]
These optimizations matter because the internet hosts a collision of political visions, alters the circulation of cultures, and sparks ruptures of production such as free software and user-generated content. The internet facilitates new forms of social coordination and cooperation organized as networks. The stakes of flow control are more than sending and receiving, more than faster or slower. Flow control keeps people and software in communication, allowing the possibility of being networked together in cultures, economies, and political movements, but also frustrating the success of networks in achieving efficiency.
Too often, the definitions of the optimal remain concerned with efficiency, cost, and reliability without considering the diversity of the internet (as Allot exemplifies). I hope my history has raised questions about how the governance of bandwidth is itself governed. As much as daemons have become more intelligent and more capable, they have not become more governable. Instead, definitions of the optimal remain calculations best left to economics or engineering. Though daemons are vital to the proper functioning of the internet, ignoring the risks of optimization leaves flow control unchecked. The future of flow control threatens network diversity as internet service providers (ISPs) optimize their networks to remove disruptions and inefficiencies, even at the expense of creative and democratic expression.
Matters of optimization and its intents exceed the scope of this book, but I hope it offers a beginning for studies of other daemonic media. My intent has been both to analyze the daemonic internet and to guide such studies. In what follows, I would like to summarize some questions and approaches that arose on my journey through Pandaemonium. Finally, I offer some speculation on what I discern on the horizon and how daemonic media studies might be put in service of understanding even larger problems in media theory.
Lessons for Other Daemonic Media Studies
Throughout the book, I have shared my fascination with daemons. My many technical descriptions delight in their workings. My study offers some key questions for other daemonic media studies. What daemons inhabit other infrastructures? How do they acquire information? What are their gazes? How do they work? What are their grasps? How do these daemons coordinate? There was no one clear path to finding answers to these questions: I have relied on historical documents, technical manuals, hacks, experimental observation, and policy proceedings. Each method offers a unique insight into the daemon, from being able to situate its development in a larger technical history to understanding how daemons function (and malfunction). Future daemonic media studies might find these pathways useful, but it is likely that new methods will also have to be found to study daemons hidden in proprietary code, purposefully obfuscated or simply obtuse. Intersections between computer science and media studies, discussed in the appendix, provide a good foundation for developing these new methods.
In tandem with studies of daemons themselves, I use the concept of the diagram to explore their arrangements. In chapter 3, I described a few of the diagrams associated with various states of the internet. These diagrams prefigure the daemons that come to be in their hubs and spokes. Diagrams also illustrate the actual work of daemons and their flow control. Comcast’s submission to the Federal Communications Commission (FCC), discussed in chapter 4, included a diagram that elaborated how it managed Peer-to-Peer (P2P) networks. With the rise of software-defined networking and AI, new diagrams will better elaborate the changing ways daemons enact flow control. Where else can diagrams be found? How might a search for diagrams inform other studies of media infrastructure?
Finally, daemonic media studies question optimization. The internet is one great Pandaemonium, filled with daemons from its core to the end user, where one daemon provides the input for another to form patterns of continuous variation and control. Daemons cooperate to realize programmed optimalities. I have shown a particular distributive way to enact the optimal, but I believe this to be only one kind of optimization at work today. Future studies must first understand the techniques of optimization, how things other than daemons and diagrams manage social and communicative practices. Future research will also have to balance the intents of programmers with the unruliness of daemons. What is the optimal? Who (or what) defines it? When is an optimization working according to plan? When have daemons done something unexpected? Discovering optimizations as practices and goals will be a difficult but necessary project of daemonic media studies.
Daemons and Operating Systems
With these contributions of daemons in mind, I wish to return to one lingering question. What of the UNIX purist who complains that the daemon is a term best reserved for operating systems alone? Should the term daemon be reserved for programs managing printer queues? Do my internet daemons muddle clear technical language? To this hypothetical objection, I say that my expanded vision of daemons might be the first step to understanding the operating systems at work today. “Operating system” typically refers to the basic program that puts a computer in use, such as Windows or Apple OS.
My expansion of the term “daemon” is perhaps a symptom of a declining interest in formal research into operating systems.[3] In many ways, operating systems have been overshadowed by the term “platform,” a concept used early on by Scott Lash and Adrian Mackenzie to think about operating systems as modes of participation and coordination in digital society.[4] Platforms, Lash writes, allow people to “participate in various forms of technological life.”[5] While the platform highlights the user agency found on today’s internet, I wonder about the value of returning to the operating system and the daemon as a way to capture the scale and function of the distributive agency that coordinates social media, apps, or ubiquitous internet access.
Indeed, daemons inspired now-forgotten but much more ambitious operating systems.[6] After Bell Labs developed UNIX, researchers there continued to work on other operating systems. Started in the late 1980s, Plan 9, Bell Labs’ next operating system, attempted to move beyond the desktop to create a hybrid of a centralized time-sharing system and a decentralized personal computer. The operating system, in short, was bigger than one computer. The goal of Plan 9 was “to build a system that was centrally administered and cost-effective using cheap modern microcomputers as its computing elements”:
The idea was to build a time-sharing system out of workstations, but in a novel way. Different computers would handle different tasks: small, cheap machines in people’s offices would serve as terminals providing access to large, central, shared resources such as computing servers and file servers.[7]
Plan 9 was not an operating system in today’s conventional sense, since it was not located on a device, but rather across devices. In contemporary media studies, Plan 9 seems to resemble a platform, a kind of technological phenomenon operating on a massive scale, distributed across many devices.
Comparing Plan 9 to a platform is not far off from its own development. Java was the primary competitor to Plan 9. Java’s designers attempted to turn the internet into one universal platform by writing code that could run on any computer no matter its local operating system, as Mackenzie presciently analyzed.[8] This possibility of a universal platform inspired Bell Labs to shift its work on Plan 9 to focus on a new competitor to Java, Inferno. The metaphor of the daemon once reserved for UNIX became the speculative foundation for a new way to imagine operating systems.
Dante’s famous work inspired not only Inferno’s name but also the names of its programming language, Limbo, and its communication protocol, Styx. I cannot tell whether Dante inspired the actual design as well as the names, but Inferno did use a number of layers to isolate its different processes and daemons, not unlike the circles of Dante’s hell. The engineers at Bell Labs, by then part of Lucent Technologies, imagined Inferno as creating a vast interconnected system of many devices, platforms, and locations not unlike the vast, interconnected cosmos toured in Dante’s Inferno, Purgatorio, and Paradiso. It extended the operating system well beyond the desktop, much like the daemons encountered in this book. The operating system was
designed to be used in a variety of network environments—for example, those supporting advanced telephones, handheld devices, TV set-top boxes attached to cable or satellite systems, and inexpensive Internet computers—but also in conjunction with traditional computing systems.[9]
Inferno could be run across these different systems, uniting different technologies, devices, and users into one system. Where Java aimed to be a universal platform for development, Inferno sought to create order in a world in which “entertainment, telecommunications, and computing industries converge and interconnect.”[10] With Inferno, the idea of the operating system was abstracted from the lone desktop computer or the central time-sharing system. Inferno was a “network operating system for this new world.”[11]
Inferno was used in some smart telephone terminals and some of Lucent’s firewalls, but it never replaced Java. In turn, Java never succeeded in creating a universal platform. These failures, however, should not let us forget the scope of these pervasive systems. What if Inferno signals an overlooked change in operating systems from being local to being networked? What if the operating system now operates at a much larger scale, across many computers, phones, and tablets in many locations? Where I have used Pandaemonium as a concept to analyze the internet, perhaps it is Inferno that might inspire studies of global networked operating systems.
In the next part of this conclusion, I would like to imagine internet daemons along with other daemons and mechanisms of control at work in these networked operating systems. The internet might be seen as a place of competing operating systems, host to a few global, overlapping, heterogeneous systems composed of wires, devices, and protocols, as well as daemons. Google, Apple, Facebook, Amazon, and Microsoft likely control these operating systems. While the nature and the logics of these operating systems are beyond what I can speculate here, a daemonic media studies helps explore their operation. Daemonic flow control is, in my estimation, one of five controls at work in these systems: connecting, standardizing, mediating, securing, and transmitting. Connecting gives an infrastructure its space, creating a shared physical connection. Standards and protocols constitute a language for the computational components of networks and, at times, their human coders. Platforms and websites are mediators that enable access to digital life and the circulation of shared desires, feelings, relations, and labor. Securing, perhaps the most secret control, assesses the risks and threats in network behavior. Amid these other potential daemons and controls, internet daemons enact the fifth control, transmitting, as they control the flows of information within these operating systems. The following sections introduce these different controls and give examples of their operation and their limits.
Connecting: A Network of Networks
Operating systems vary in their connectivity, depending on where they connect, whom they connect with, and how they filter their connections. As evidenced by metaphors like “cloud computing,” the internet usually appears in the popular imagination as ephemeral, but these out-of-sight, buried physical connections can have profound effects on our digital lives. Homes have “tails” (slang for a fiber connection) or connect over repurposed coaxial cable or twisted-wire copper lines. These media alter how networks can send signals and the amount of available bandwidth. Comcast daemons, as discussed in chapter 4, had to make do with shared cable lines. Mediums of communication have a spatial influence on networks. Fiber backbone often follows long-established rights-of-way along railway lines. These fiber lines frequently converge at Internet Exchange Points and Carrier Hotels that establish the physical connections between autonomous infrastructures. Often, these data centers are strategically located near to cheap water to reduce cooling costs.[12] Undersea cables wind around the globe, connecting nations and corporations. For example, internet access degraded in all of eastern Africa after a boat anchor dropped off the coast of Kenya accidentally cut the East African Marine Systems cable.[13] Global connectivity depends largely on these kinds of international, often undersea cables.[14] These cables determine the routes packets take as they travel internationally.
Points of infrastructure function as important points of control. When President Hosni Mubarak faced popular unrest, his regime tried to disconnect Egypt by turning off these sites of interconnection.[15] These interconnection points may also be monitored or filtered by daemons to prevent particular connections without completely disconnecting from the internet, such as the frequent bans of domains related to the Falun Gong in China, the blocks of The Pirate Bay (TPB) by the United Kingdom, and the CleanFeed program in Canada, which aims to block child pornography.[16]
Interconnection involves a control that influences who or what might be in communication. Robert Latham describes this control as the network relations or logics “whereby computer networks would form and then connect or not connect (and the consequences of such formation and connection).”[17] Network relations attend to the consequences of connection and disconnection and to the forces driving connections in and between networks. Who can talk to whom? Which systems connect?[18] Which nodes connect first or are avoided altogether?
The political economy of internet peering perhaps best demonstrates the control exerted by mechanisms of connecting.[19] “Peering” refers to how different infrastructures on the internet connect to one another. Few networks now exchange data without economic compensation. These few, known as “Tier 1,” have agreed to settlement-free interconnections across which data passes unconditionally. This form of interconnection is closest in spirit to the internet engineers and “netheads” who valued free interconnection as a way to create a global or even intergalactic computer network.[20] However, networks now increasingly agree to settlement-based interconnections that attach cost recovery to data exchanged. Settlement-based interconnections, known as “Tier 2” and “Tier 3,” create asymmetrical data flows, with one network paying to send its traffic to the other. Recently in the United States, the ISPs Verizon and Comcast have been in disputes with the content-distribution networks Level 3 and Netflix over who should pay to upgrade the links between their networks.[21] Who should be responsible for maintaining peering links? Should the distribution networks bear the cost, since they send the bulk of the traffic, or should ISPs be the ones to ante up because their business model is founded on connecting to the internet?
Canada provides a good example of the political consequences of peering. Most major ISPs in Canada peer outside of the country. As a result, traffic between Canadians often travels through the United States in a process of “boomerang routing,” in which data exits Canada, travels on American networks, and then returns to its final, Canadian destination. Boomerang routing has been accused of undermining Canada’s information sovereignty, since many network decisions happen outside its borders, which means that Canadian data might be lawfully intercepted by American surveillance agencies.[22]
Connectivity as a control has its own limits and exploits. As much as states or other network administrators hope to control connectivity, the intense complexity of the system creates opportunities to elude this control. If their network blocks a location online, users might connect to a proxy, a server that acts as an intermediary between blocked content and the affected user. Often, networks cannot physically disconnect a problematic site and instead block its domain. The U.S. Department of Homeland Security has taken to seizing the domain names of piracy websites whose servers reside outside the country. In other words, it prevents the domain name from locating the server when it cannot disconnect the server itself.[23] Seizing the domain name, however, does not disconnect the site from the internet, and affected users can circumvent this maneuver by using an alternative domain name server or connecting to a proxy server that connects to the internet from a different location. When the United Kingdom blocked TPB’s domain, the pirates enlisted hundreds of servers to act as proxies to forward traffic to their servers without having to rely on the blocked domain names.[24] These examples demonstrate the limits of connectivity as a control, though with less optimism than when John Gilmore claimed, “the Net interprets censorship as damage and routes around it.”[25]
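To make the circumvention concrete, here is a minimal sketch of the resolver switch described above, written in Python with the third-party dnspython library. The resolver addresses are placeholders, and the reserved domain example.org stands in for a blocked site; nothing here reproduces the configuration of any of the cases discussed.

```python
# A minimal sketch of routing around a DNS block: if the ISP's resolver
# refuses, lies, or times out, ask an outside resolver instead.
# All addresses are placeholders; example.org stands in for a blocked domain.
import dns.exception
import dns.resolver

BLOCKED_DOMAIN = "example.org"    # stand-in for a domain the ISP blocks
ISP_RESOLVER = "192.0.2.53"       # placeholder address for the ISP's resolver
OUTSIDE_RESOLVER = "9.9.9.9"      # a public resolver, used here for illustration

def resolve_via(nameserver_ip: str, domain: str) -> list[str]:
    """Ask one specific nameserver for a domain's A records."""
    resolver = dns.resolver.Resolver(configure=False)  # ignore the system default
    resolver.nameservers = [nameserver_ip]
    resolver.lifetime = 3.0                            # give up quickly on dead servers
    answer = resolver.resolve(domain, "A")
    return [record.address for record in answer]

try:
    # The ISP's resolver may return nothing at all for the blocked domain...
    addresses = resolve_via(ISP_RESOLVER, BLOCKED_DOMAIN)
except dns.exception.DNSException:
    # ...so the user simply asks an outside resolver, sidestepping the block.
    addresses = resolve_via(OUTSIDE_RESOLVER, BLOCKED_DOMAIN)

print(addresses)
```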
Gilmore’s optimism seems ever more dated to me amid the growing reality that the internet infrastructure increasingly relies on centralized players administering a few international content-distribution networks. The online circulation of videos, websites, and apps largely depends on a physical infrastructure of servers mirroring data. Akamai, Amazon, Google, Level 3, and Netflix are all major players in the business of content distribution. These companies own servers near consumer ISPs, or even, as in the case of Netflix’s OpenConnect program or Google’s Cache, within an ISP’s infrastructure. Proximity and local peering lower connection and transit costs while boosting performance, a clear win for those with the economic power to participate in this infrastructural competition. Josh Braun calls for a distribution studies to explore the infrastructures that enable online video streaming.[26] Others have called for new internet policy to address the growing influence of these privatized infrastructures.[27]
With the rise of content-distribution networks, daemons’ flow control may at first seem to have a waning influence, but these networks actually increase the importance of local daemonic optimizations. Capitalizing on local installations of content-distribution networks requires daemons able to route and prioritize their traffic. Indeed, “zero-rating” programs proposed by ISPs in Canada and the United States leverage daemons to increase the discoverability of certain networks by advertising that their packets do not count toward customers’ data caps. If these trends continue, ISPs will increasingly use a combination of zero-rating and content-distribution networking to privilege networks with low transit costs or those with a working relationship with the ISP. Daemons, in short, are vital to managing the interconnections between parts of a networked operating system.
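The accounting decision behind zero-rating can be sketched in a few lines: a daemon counts a packet against a subscriber’s cap only when its source lies outside a zero-rated partner’s address block. The prefixes, packet sizes, and partner labels below are invented for illustration; actual ISP billing and policy-control systems are far more involved.

```python
# A toy sketch of zero-rating: traffic from partner address blocks is
# invisible to the data cap, while everything else draws it down.
# Prefixes and values are invented for illustration only.
from ipaddress import ip_address, ip_network

ZERO_RATED_PREFIXES = [
    ip_network("198.51.100.0/24"),   # hypothetical partner CDN cache inside the ISP
    ip_network("203.0.113.0/24"),    # hypothetical zero-rated streaming service
]

def count_against_cap(src_ip: str, size_bytes: int, used_bytes: int) -> int:
    """Return the subscriber's updated usage after one packet."""
    src = ip_address(src_ip)
    if any(src in prefix for prefix in ZERO_RATED_PREFIXES):
        return used_bytes                 # zero-rated: the cap is untouched
    return used_bytes + size_bytes        # ordinary traffic: the cap is debited

usage = 0
usage = count_against_cap("198.51.100.7", 1500, usage)   # partner traffic
usage = count_against_cap("192.0.2.40", 1500, usage)     # ordinary traffic
print(usage)  # 1500: only the second packet counted
```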
Standardizing: Distribution of Conduct
Moving from connecting to the next set of controls that may be part of networked operating systems, standardizing, it is helpful to remember the Domain Name System (DNS) used to disconnect TPB from the British internet. The DNS acts as a bridge between the mechanisms related to connecting, on the one side, and those related to standardizing, on the other. As the de facto standard for addressing, the DNS has the power to connect and disconnect different parts of the internet.[28] The DNS wields this tremendous influence because the daemons of the internet have mutually agreed to use it to locate resources.
DNS is just one standard used to interconnect technical systems. Protocols, standards, and formats make up the second mechanism of control that allows parts of the internet, particularly daemons, to understand each other. Generally speaking, “standards” refers to agreements about measurement, quality, or safety, but digital media depend on a variety of standards to ensure technical compatibility and interoperability.[29] Different pieces of hardware can interoperate so long as they abide by common standards (not a small feat, as anyone who grew up trying to share floppy disks between Macintosh and IBM computers can attest). International organizations like the Internet Corporation for Assigned Names and Numbers (ICANN), the Internet Engineering Task Force (IETF), and the International Telecommunication Union (ITU) develop, debate, and administer protocols.[30] Information also circulates through shared, standardized file formats, such as a Microsoft Word document or a Hypertext Markup Language (HTML) file, that specify how to store information.
Standards are mechanisms of control precisely because they regulate the conduct of communication: free communication through strict standards. Alexander Galloway describes this control as “protocological”[31] and argues that it functions by distributing and embedding the same rules across the network, regulating the conduct of communication at the node. To join a network, nodes must obey the rules of a protocol. By defining the rules of networking at the nodes, protocols maintain control in decentralized networks and keep conduct consistent among diverse and dispersed nodes.
Standards have political and economic ramifications.[32] The unforeseen consequences of protocols have also had significant impacts in the domains of intellectual property and ownership. For example, competing hypertext formats widely differed in their approach to attribution: the World Wide Web provided far less than the fabled Xanadu hypertext system.[33] The former allowed for rapid growth, but since no formal rules were established to attribute original sources, situations such as spammers repurposing Wikipedia content as their own easily arose.[34] Formats matter too. The small sizes of the MP3 audio format and the DivX video format facilitated online file sharing and an explosion in piracy.[35]
Formats, standards, and protocols all exemplify the ways daemons communicate. Most often these standards are human-readable as well as machine-readable, but that might not be the case in the future. Google recently conducted a machine-learning, or “deep learning,” experiment in which neural networks developed their own encrypted language.[36] While only an experiment, it points to a time when daemonic language might be entirely unintelligible to humans. This scenario presents a different kind of problem than the question of open standards.[37] Some outputs of deep learning are difficult for even their programmers to understand; the output is effectively a black box. Where proprietary, closed standards might at least be human-readable to those with access, daemons could someday create network patterns and optimizations that will be impossible for human network engineers and administrators to fully comprehend.
This matter of legibility might play out in the protocols of the Internet of Things (IoT), one term used to describe the expansion of the internet. The future of digital communication has at least two discernible courses ahead of it today: operating systems using Internet Protocol version 6 (IPv6) or ones filled with daemons speaking in tongues foreign to any human mind. The former would perpetuate some public oversight through standards, whereas the latter would turn the protocols of the internet into proprietary code. The laws of cyberspace might be compiled into a private language, finally breaking with the social imaginary that once dreamed of an open internet. Moves like Google developing its embedded operating system Brillo suggest that the internet is probably headed toward a fragmentation in which what I see today as overlapping operating systems will disaggregate into even more distinct systems.[38]
Mediating: Participation in Technological Forms of Life
Operating systems may also control the points of entry for users through what are often called “platforms” (in a much narrower sense than discussed above with, e.g., Bell Labs’ Plan 9 platform). Where protocols emphasize the conduct of the network, platforms emphasize the integration of protocols and standards.[39] Platforms are a “convergence of different technical systems, protocols and networks that enable specific user practices and connect users in different and particular ways.”[40] The concept of the platform helps explore the operations of control in heterogeneous systems with horizontal and vertical factors. What follows discusses the ways platforms mediate, as a third mechanism of control, and the ways their daemons mediate user inputs in global operating systems.[41]
The platform is a particular technological stage on or from which the user can operate, a stage shared by a common user base across the web. We tweet, check in, or pin depending on the affordances and functions of platforms.[42] Facebook’s website and apps allow participation in its virtual community; its technical work simplifies converting a text entry into a “status” update distributed to friends. Web browsers (often overlooked as platforms) dictate website functionality depending on how they implement web standards. The linguistic dimensions of platforms, alongside their technical functions, also influence user behavior. The term “platform” itself “fits neatly with the egalitarian and populist appeal to ordinary users and grassroots creativity, offering all of us a ‘raised, level surface.’”[43] Platforms encourage users to participate in the internet only after agreeing to terms of use, privacy policies, and codes of conduct. These legal documents attempt to control user behavior with threats of legal action while also granting platforms broad access to use uploaded data. The breadth of the work that platforms do complicates their influence. YouTube’s economic interest in becoming a professional streaming video platform conflicts with the demands of its users, so it has to “play host to amateur video culture and provide content owners the tools to criminalize it.”[44]
Platforms mediate user input.[45] Mediation is a kind of control in its purest form, since it processes user input toward particular outputs. This influence varies from Twitter limiting all messages to 140 characters to Facebook’s much more subtle forms of mood manipulation. All across the internet, web developers run constant experiments on platform users to discover the optimal layout and ensure that surfers click the right button (or donate the right amount in political campaigns).[46] Platforms also influence user behavior by reflecting users’ activities back in particular ways and providing personalized vantage points into the digital world, what Eli Pariser calls “filter bubbles.”[47] How Netflix recommends movies influences film culture,[48] just as news apps reconfigure the work of newspaper editing through their dynamic promotion of content.[49] Facebook conducted trials that promoted content using sentiment analysis to vary the mood of a user’s “news feed”: a positive news feed led to a somewhat increased probability of happier posts.[50] Google News has also admitted to favoring positive stories, so when disappointed Brazilians visited the site after their team lost in the 2014 World Cup, they found fewer stories that mocked their team’s poor showing.[51]
Mediation often influences user behavior to maximize an operating system’s profit. Most social media companies depend on user-generated content to create commodities such as data profiles, usage behavior, or simply viewers.[52] Twitter has developed a secondary business of selling access to its data firehose, the flow of users’ activity.[53] Since social media depends on advertising, its platforms encourage users to post more content to create better profiles and cybernetic commodities to sell to advertisers. Facebook’s news feed algorithms score users’ posts to encourage them to share more and more often. Inactivity diminishes the score (at least in its 2012 iteration): the lower a score, the less likely a post will be seen by others.[54] More activity drives traffic, which leads to increased advertisement views and more advertisement sales, and in turn, those ads can be better-targeted using the profiles Facebook has built from user activity.
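A toy model can illustrate the kind of scoring described above, in the spirit of the publicly discussed 2012-era “EdgeRank”: affinity multiplied by an interaction weight and a time decay. The weights, decay rate, and example values below are invented; Facebook’s actual ranking system is proprietary and far more elaborate.

```python
# A toy feed-scoring model: higher affinity and richer interactions raise a
# post's score, while age erodes it. All constants are invented for illustration.
import math

TYPE_WEIGHTS = {"photo": 1.5, "status": 1.0, "like": 0.5}   # assumed weights

def post_score(affinity: float, post_type: str, age_hours: float) -> float:
    """Combine affinity, interaction weight, and time decay into one score."""
    weight = TYPE_WEIGHTS.get(post_type, 1.0)
    time_decay = math.exp(-0.1 * age_hours)   # older posts fade from the feed
    return affinity * weight * time_decay

# An inactive friend's old status barely registers, while a close friend's
# fresh photo floats to the top of the feed.
print(post_score(affinity=0.2, post_type="status", age_hours=48))  # ~0.002
print(post_score(affinity=0.9, post_type="photo", age_hours=1))    # ~1.22
```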
Users find their own uses for platforms’ affordances. Cultural adoption recontextualizes platforms’ features in a fashion similar to Andrew Feenberg’s model of the two-stage process of technological influence from its design to its actual use.[55] MySpace, one of the early social networks, gave users the ability to pin their top eight friends to their profile page. The feature created great social conflict, as users suddenly had to justify which friends appeared in the list. Platforms’ mediating constraints can be maneuvered around, or “gamed,” in many ways, especially when platforms can provide revenue for users.[56] Some female content producers on YouTube earn income by gaming its recommendation algorithms. “Reply girls,” as they are known, create videos in response to other highly popular videos. They edit their response so that YouTube selects a sexually suggestive thumbnail when recommending what users should watch next. Reply girls appeal to a viewer’s erotic curiosity. A link is clicked and the reply girl receives another view.[57] As a result, YouTube has adjusted its recommendation system to cut down on the success of reply girls.[58] Mechanisms of control on a platform thus have to be seen as constantly reconfigured through systems of feedback to remain effective against these unruly mis-users.[59]
My interest in daemons complements studies of social media platforms that have questioned the role of algorithms and bots in creating, ranking, editing, and moderating content online.[60] Algorithms and bots resemble daemons in their infrastructural role. Like a daemon, algorithms run as the background processes of platforms. Likewise, on mobile phones, background algorithms communicate the device’s location to advertisers in new programs like Facebook’s Local Awareness[61] or Rogers Alerts[62] that enable geographically targeted ads. These are just a few examples, but they link the intelligence in the internet’s infrastructure with broader trends in technology and society.
AI, bots, and daemons also operate in the production of content on platforms, in contrast to the daemons that influence the transmission and circulation of this content. Wikipedia depends on bots to protect against vandalism.[63] Political bots on Twitter engage in what communications scholars Sam Woolley and Philip N. Howard call “computational propaganda.”[64] The political sphere as mediated by social media platforms has failed the Turing test, with bots tweeting to amplify political messages, demobilize supporters, and artificially inflate the numbers of followers online.[65] These political bots are just a few of the machines of everyday life. Most mobile operating systems have on-demand artificial intelligence companions like Apple’s Siri or Microsoft’s Cortana. While distinct from the daemons encountered in this book, a daemonic media studies will help analyze these new programs.
Flow control can further help us understand the influence of platforms by emphasizing the subtle nudge of responsiveness. Programs like Google’s AMP (Accelerated Mobile Pages) and Facebook’s Instant Articles exemplify the emerging power of platforms. Both programs aim to improve how third-party stories load on their respective platforms. Facebook hosts Instant Articles so that the content of its program partners loads faster than other stories on its platform. Google’s AMP sets standards for web code. The project includes specifications for a webpage’s HTML code, known as AMP HTML, a new AMP JavaScript library, and a Google-owned content delivery network that “is a cache of validated AMP documents published to the web.”[66] In return for coding to AMP standards, content creators are promised that their pages will load more quickly, presumably when accessed from Google Search. These slight boosts in page-loading speed promise to have a familiar affective influence—a nudge, so to speak—that might be another way platforms function as gatekeepers. Nothing will be as simple as “normal” versus “lag” mode, but rather a diverse ecosystem of content designed to nudge users into keeping within the boundaries of an operating system. These ensuing and subtle differences in performance will need the same attention as questions about trending and popularity.
Bandwidths of the Acceptable: Securitization of the Internet
Operating systems have also become involved in a fourth control: securitization. Just as desktop operating systems have been secured, networked operating systems likely will require greater investment in security. Mechanisms of security may be understood through Michel Foucault’s more historical writings on security. He discusses security as a means to influence “a reality in such a way that this response cancels out the reality to which it responds—nullifies it, or limits, checks, or regulates it.”[67] Securitization has three parts: a gaze, a calculation of risk, and mechanisms of intervention. This calculation evaluates reality between “an average considered as optimal on the one hand, and, on the other, a bandwidth of the acceptable that must not be exceeded.”[68] “Bandwidth of the acceptable” is a particularly apt term for examining network securitization. Certain activities on the network might exceed a limit of acceptable risk, requiring mechanisms of security to mitigate the probability of some outcomes and ensure the perpetuation of other realities.
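To see what such a calculation might look like in its most reduced form, here is a toy sketch that treats an average of past traffic as the “optimal” and a band around it as the “bandwidth of the acceptable.” The sample values and the width of the band are invented; real security appliances model risk in far richer ways.

```python
# A toy illustration of the calculation described above: an average of past
# traffic stands in for the optimal, and a band around it marks the
# "bandwidth of the acceptable." All values are invented.
from statistics import mean, stdev

def acceptable_band(history: list[float], width: float = 3.0) -> tuple[float, float]:
    """Return (low, high) bounds: the optimal average plus or minus a tolerance."""
    optimal = mean(history)
    tolerance = width * stdev(history)
    return optimal - tolerance, optimal + tolerance

history = [12.0, 14.5, 13.2, 15.1, 12.8, 14.0]   # past traffic in Mbps (made up)
low, high = acceptable_band(history)

for sample in (13.9, 55.0):   # the second sample exceeds the band
    verdict = "within the acceptable" if low <= sample <= high else "flag for intervention"
    print(f"{sample} Mbps: {verdict}")
```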
Intellectual property requires much of this investment in security. Unauthorized copying is one symptom of a technological problem that Tarleton Gillespie explains as “how to control the way that someone uses something that is freely handed to them.”[69] How do you stop a user from actually using that record button? Copying sheet music, home taping, and bootlegging each represent the vulnerability of open technologies when it comes to the security of intellectual property.[70] Where copyright has long attempted to regulate behavior, Digital Rights Management (DRM), filtering software, and trusted computing embed restrictions into computers and other platforms to prevent unauthorized uses.[71] As platforms have moved from the desktop to the web, they have developed new mechanisms to secure their content and user activity. YouTube, for example, has to check seventy-two hours of uploaded video per minute for infringement. Not unlike Charon, YouTube has developed its own guard to manage this data flow. The company’s ContentID system automatically monitors uploaded videos and compares them to a database of known copyrighted works (or reference files). A match causes ContentID either to block the video or to keep it up but redirect its revenue to the copyright holder.[72] Platforms’ attempts to protect intellectual property point both to the influence of platforms when mediating user input and to the next mechanisms related to security that will attempt to neutralize threats, risks, and unacceptable behavior.
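A drastically simplified sketch can convey the matching logic described above. ContentID relies on perceptual audio and video fingerprints; the sketch below substitutes exact hashes, and its reference works and rights-holder policies are invented for illustration.

```python
# A drastically simplified sketch of reference matching: an upload is scanned
# chunk by chunk against a database of fingerprints, and a match triggers the
# rights holder's chosen policy. Names and policies are invented.
import hashlib

def fingerprint(chunk: bytes) -> str:
    """Stand-in for a perceptual fingerprint: here, just a hash of the chunk."""
    return hashlib.sha256(chunk).hexdigest()

# Reference database: fingerprint -> (rights holder, chosen policy).
REFERENCE_DB = {
    fingerprint(b"copyrighted-song-sample"): ("Example Records", "monetize"),
    fingerprint(b"copyrighted-film-sample"): ("Example Studio", "block"),
}

def review_upload(chunks: list[bytes]) -> str:
    """Scan an upload and apply the first matching policy."""
    for chunk in chunks:
        match = REFERENCE_DB.get(fingerprint(chunk))
        if match:
            owner, policy = match
            if policy == "block":
                return f"blocked: matches work owned by {owner}"
            return f"published, but ad revenue redirected to {owner}"
    return "published normally"

print(review_upload([b"home-movie-footage", b"copyrighted-song-sample"]))
```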
Mechanisms of security imbricate with the three prior mechanisms discussed above. America’s National Security Agency (NSA) programs like FAIRVIEW and STORMBREW partner with major internet backbone providers to tap submarine cables.[73] Commercial ISPs worry about the health of the network and filter out threats using devices similar to those used to monitor interconnection. Network security appliances scan for bugs, worms, and viruses and stop these threats from circulating on their networks—a kind of inoculation. New security protocols like DNSSEC secure the DNS by creating an international system of distributed trust to authenticate domain names and prevent man-in-the-middle attacks that allow a rogue server to hijack a legitimate domain name.[74] Platforms, as discussed above, also work to prevent risks. Google, for example, removed 345 million links at the request of copyright holders in 2014.[75]
Mechanisms of security have often relied on surveillance to elicit self-censorship in a population,[76] but more advanced programs have tended to avoid public attention altogether. Singapore, early in its adoption of the internet, opted for a “light-touch regulatory approach” using symbolic bans of websites to cultivate self-discipline in internet users.[77] Unfortunately, these national security measures pale in comparison to the covert militarization of cyberspace by the Five Eyes intelligence agencies.[78] Leaks from Edward Snowden and Mark Klein revealed that these agencies inspect most internet traffic through interception points installed in key locations of the internet backbone. Mark Klein described NSA “splitter rooms” installed in AT&T’s network to siphon data.[79] In these rooms, a deep-packet-inspection (DPI) device built by Narus (the STA 6400 traffic analyzer) inspected AT&T customers’ packet streams.[80]
Snowden, among his many leaks, disclosed how the Five Eyes collect most internet traffic: through upstream programs such as FAIRVIEW[81] and TEMPORA[82] that tap major undersea data cables; through targeted programs such as PRISM, which collects data from major firms like Apple, Facebook, and Google;[83] and through LEVITATION, which mines data on 102 cyberlockers like SendSpace and MegaUpload.[84] Analysts then use programs such as XKEYSCORE to query the vast quantity of data collected.[85] Their efforts construct a data flow to inform their vision of reality, calculate the bandwidth of the acceptable, and regulate or nullify possible futures. These robust efforts align more with Foucault’s concept of security than with his panoptic model of surveillance.[86] The Five Eyes seem to prefer to observe and intervene clandestinely rather than discipline internet users with the threat of being watched. In fact, the Five Eyes agencies could be said to encourage, rather than suppress, online communication so that they can better identify possible threats. Able to respond when necessary without a need for spectacle, the Five Eyes have an array of cyberweapons to disable and disrupt online threats (best seen in the leaks related to the Cyber Activity Spectrum of the Communications Security Establishment of Canada, or CSEC),[87] as well as offline weapons to discredit and destroy threats.[88]
While calculations of the bandwidth of the acceptable might be difficult to observe deep in the headquarters of the Five Eyes, the industry of human commercial content moderation demonstrates how all threats require a certain degree of deliberation. Dick pics, beheading videos, and images of animal cruelty clog the inputs of user-generated content. All social media platforms require users to abide by terms of service that regulate conduct, but automating acceptable use has proven more difficult. Instead, up to 100,000 human content moderators patrol social media platforms. These low-paid laborers, usually in the global South, watch all flagged content and decide whether it violates the acceptable use policies.[89] Facebook has a “user operations” team that monitors behavior, a “safety” team that watches for self-harm, and an “authenticity” team looking for fake accounts. No matter the content, a response requires calculation. Moderators have a few moments to decide if the content is inappropriate. Guidelines, habit, and acceptable use policies inform how a moderator judges the content and how, in turn, to respond. Many firms are located in the Philippines because its colonial history provides moderators with better sensitivities to American values.[90] Bots and algorithms (mentioned above) have also begun to automate this work of securing the platform. Google and Facebook, for example, have begun to automate the human work of content moderation with new algorithms designed to flag and remove extremist videos.[91]
Throughout this book, internet daemons have had a close relation to processes of securitization. The traffic-management industry discussed in chapter 4 does double duty as a way to both manage bandwidth and secure infrastructures, and it is sometimes called the cybersecurity industry. Arbor Networks’ ATLAS program (Active Threat Level Analysis System), a global system of observation and detection of threats, exemplifies how daemons are tasked with solving the problem of cybersecurity. Yet the reach of daemons extends beyond policing bandwidth. Andreas Kuehn and Stephanie Michelle Santoso join internet-governance expert Milton Mueller in arguing that DPI is increasingly used for copyright enforcement.[92] Indeed, copyright protection drives the production of new daemons to manage copyright better within infrastructures. As online piracy has moved to streaming sites and cyberlockers, Cisco Systems intends to sell a new device that inspects packets for copyrighted content and blocks them.[93] This control does not occur within the platform or in the protocol, but during transmission through the work of internet daemons. Thus, networked operating systems may be able to use daemons to nullify certain possibilities in addition to controlling the flows of information.
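As a final illustration, a toy sketch of in-transit inspection shows where this control sits: a daemon checks each packet’s payload against content signatures and drops any match during transmission rather than at the platform. The signatures are invented, and real DPI appliances work on reassembled flows and statistical features rather than simple substring checks.

```python
# A toy sketch of in-transit copyright enforcement: drop any packet whose
# payload matches a content signature, forward everything else.
# The signatures are invented placeholders.
COPYRIGHT_SIGNATURES = [b"EXAMPLE-WATERMARK-0001", b"EXAMPLE-WATERMARK-0002"]

def forward_or_drop(payload: bytes) -> str:
    """Decide a packet's fate during transmission, not at the endpoints."""
    if any(signature in payload for signature in COPYRIGHT_SIGNATURES):
        return "drop"        # the packet never reaches its destination
    return "forward"         # unremarkable traffic passes through untouched

stream = [b"ordinary web traffic", b"...EXAMPLE-WATERMARK-0001..."]
for packet in stream:
    print(forward_or_drop(packet))   # forward, then drop
```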
Together these daemons and other mechanisms of control create an ecology of control that coordinates new Infernos—networked operating systems. This broad overview contextualizes the internet daemons of this book. Flow control coordinates with mechanisms of connecting, standardizing, mediating, and securing. For networked operating systems to function, they require internet daemons to coordinate their data flows and systems of feedback across their vast distributed operations.
Daemonic media studies should prove helpful to understanding these new Infernos and their daemons as they become the internets and become more embedded in everyday life, expanding to connect more infrastructures, more devices, and more people. The term “Internet of Things” has been popularized to capture this expansive vision of the internet’s reach. Phil Howard, in his book on the concept, argues for the term to be broadened from marketing lingo to an understanding that “nation-states, polities, and governments need to be thought of as socio-technical systems, not representative systems.”[94] A renewed interest in operating systems might be a means of analyzing this wider theorization of the IoT, one that draws on my daemonic media studies.
Living with Our Daemons
As daemons multiply, it helps to remember their mythic origins and the optimism therein. Dante, in his journey through heaven and hell in the Divine Comedy, encountered the demon Charon, who ferried souls across the river Styx into hell: “Charon the demon, with eyes of glowing coal, beckoning them, collects them all; smites with his oar whoever lingers.”[95] Charon hints at the power of the daemon as a mythical spirit whose presence explains how something works, who repeats a task endlessly toward a goal. James Clerk Maxwell thought of his demon as being capable of what was impossible for mortals. His demon was conjured out of hope for a technical solution to human fallibility. Such optimism endures in the internet today as Maxwell’s descendants champion artificial intelligence, which promises to solve what humans cannot and conceive of new optimizations to resolve issues mired in politics and policy.
This daemonic optimism raises a final question. What do daemons desire? What futures do they hope for? Rendering these daemonic desires public might be the primary task of a daemonic media studies. Only through coming to terms with our daemons will future studies be possible. Daemonic media studies embrace the volatile mixture of lively daemons, bots, and algorithms, of wires, cables, and processors, and of a multitude of humans. Daemonic media studies are fascinated by daemons because they are presently important to media policy and power. Daemons allow the internet to be a multimedia medium by managing its various networks. Where I have argued that daemonic flow control guides the operation of the internet and underlies key policy debates like net neutrality, future studies must look to other instances of daemonic influence. In an era of algorithms, AI, and bots, I expect their manifestations to be legion.