A Crescendo of Online Interactive Debugging?
Gamers, Publics, and Daemons
RENÉ DESCARTES AND COMPUTER GAMERS faced a similar problem. A mischievous creature haunted both of them. Symptoms varied. Descartes worried that some malicious demon was manipulating him, generating illusions that meant he could not trust his hands, eyes, flesh, and blood. Everything his senses detected might be one of these illusions caused by an evil demon (or evil genius in other interpretations). Canadian gamers had less existential woes. Beginning in 2010, gamers who subscribed to Rogers Internet experienced trouble enjoying online games like World of Warcraft and Call of Duty. The ouroboros symbol of lag haunted their games until, suddenly, something broke their connection to these virtual worlds. Neither Descartes nor these gamers had proof of their suspicions. Where Descartes adopted a philosophical approach, gamers, specifically customers of Rogers Internet in Canada, had to develop their own methods to explain their connection issues. How could gamers identify the true cause from among the possibilities? Could it be their modems or their home routers? Or maybe a problem with a recent update to the game? These questions led the gamers deep into the infrastructure of their internet service provider (ISP) and into a Pandaemonium, that home of its daemons, named for the capital city of hell in John Milton’s Paradise Lost.
The World of Warcraft gamers discussed in this chapter faced a situation common to many caught up in the distributive agency of infrastructure. What is their relation to a technical infrastructure? What is the proper response when it fails? Reflecting on the subject’s position in these assemblages, Jane Bennett writes: “Perhaps the ethical responsibility of an individual now resides in one’s response to the assemblages in which one finds oneself participating: Do I attempt to extricate myself from assemblages whose trajectory is likely to do harm? Do I enter into the proximity of assemblages whose conglomerate effectively tends toward the enactment of nobler ends?” Gamers could easily switch to a better ISP (though media concentration limits their choices). They could have extricated themselves from the assemblage, to borrow Bennett’s phrasing. But they had another option: to fix Rogers. This choice did not so much address harm (beyond the frustration of losing in the game) as make an infrastructure accountable, specifically to Canada’s regulations for net neutrality, should a violation of the neutrality principle prove to be the cause of their connection issues.
By choosing to address the issue, Canadian gamers exemplified how publics address the influence of daemons in the internet. As we have seen, “daemons” refers to those programs running in the internet’s infrastructures that control communication. “Publics” refers to strangers bound together by a common issue, and they are a vital force that keeps the internet running and accountable. Chris Kelty stresses the importance of internet-focused recursive publics in creating the internet’s critical infrastructure. The free software movement, as Gabriella Coleman compellingly argues, is vital to preserving internet freedom. Publics have also had an important role in internet regulation, as was seen in the Comcast case discussed in the introduction. Canadian gamers have had a role too. Before net-neutrality legislation had even passed in the United States and the European Union, Canadian gamers had prompted an early case of net-neutrality enforcement.
The two-year struggle between Rogers Internet and Canadian gamers presents a critical case through which the role of publics in revealing the operations of daemons and holding them accountable can be understood. The obstacles these gamers faced exemplify the challenges of rendering daemons public. Gamers lacked proof of a problem; only a sense that something unusual was happening prompted them to investigate. This feeling of being affected is a critical step in the formation and resolution of problems by publics. Daemons, however, thwart the development of publics. Their intangibility and the invisibility of their operations make wider reflection difficult. Publics affected by daemons have to bootstrap their own convening through mediators and reflexive apparatuses. In other words, the public comes to know itself and its unifying issue only through a cycle of positive feedback in which research leads to a better, wider definition of the public. This chapter and the appendix give special attention to internet measurement. Since ARPANET, researchers have had to find tools to study the internet’s running code. These tools enable a sort of public research into distributed systems like the internet (or what might be called an early algorithmic audit). The appendix elaborates the history and technologies of internet measurement that help to reveal the work of daemons, whereas this chapter reflects more on the role of finding mediators for daemons. The story of these mediators, together with the work of publics, provides a fitting end to the book, revealing how advocacy and public engagement offer perhaps the only enduring response to the influence of daemons.
“How Does This Affect You?” Rogers Internet and World of Warcraft
Canadians were having a hard time playing World of Warcraft (WoW) in 2010. The massively multiplayer online (MMO) game was at the height of its popularity. The fantasy game had just released its third expansion, “Cataclysm,” with a fire-breathing dragon named Deathwing the Destroyer gracing its box cover. But daemons, not dragons, were bothering gamers. Lag, disconnections, and difficulty joining a game plagued gamers from sea to sea. Explanations were scattered across blogs and internet news websites, as well as in the support forums for the game and for local ISPs. In the WoW forums, a gamer using the alias “Shifthead” started a thread on December 18, 2010, entitled “Rogers ISP, WoW, and you!” Shifthead frequently suffered from high latency, the time a packet takes to reach its destination (usually measured in milliseconds). The post asked “How does this affect you?” Shifthead’s post generated twenty-three pages of replies, and discussion lasted until late 2011.
No one believed that the disconnection issues were random. Shifthead blamed Rogers Internet and linked to stories from consumer-oriented internet news sites Torrentfreak and DSLReports. These websites reported that Rogers Internet had changed its internet traffic management practices sometime after September 2010. Customers had reported numerous connection issues that adversely affected WoW and other applications. Replies to Shifthead’s post echoed these stories.
Gamers had a reason to suspect Rogers Internet. The investigations into Comcast by eMule and Vuze users, mentioned in the introduction, had also discovered that Rogers, like most Canadian ISPs, actively managed peer-to-peer (P2P) traffic. Gamers knew that Rogers continued to manage upstream traffic, stopping BitTorrent networks from consuming too much of its limited capacity, and so they suspected that WoW traffic was mistakenly being throttled.
By the time of Shifthead’s post, these practices were common knowledge, thanks in part to a public inquiry. In 2009, citizen and industry concern prompted the Canadian regulator, the Canadian Radio-television and Telecommunications Commission (CRTC), to enact one of the first net-neutrality rules. The rulings resulted from a complaint by internet resellers who bought wholesale internet access from Canada’s established players, an arrangement mandated by national policy to promote competition in the market. The Canadian Association of Internet Providers (CAIP), an association of fifty-five small ISPs in Canada, submitted a complaint that Bell had begun throttling their wholesale internet connections. Even though it denied CAIP’s initial request to stop Bell, the CRTC launched formal regulatory hearings on throttling and other internet traffic management practices in April 2008. The commission heard from ISP representatives, telecommunications experts, and public-interest groups over the summer of 2009. During the hearings, Canadian ISPs disclosed that they did manage P2P applications, configuring their daemons to find and limit these networks.
The CRTC released its policy on internet-traffic management practices on October 21, 2009. The Telecom Regulatory Policy CRTC 2009-657 permitted traffic management so long as it met four conditions. First, all practices had to be publicly disclosed (ISPs could comply by simply stating their practices on their websites). Second, the traffic management had to meet a defined need. Traffic management was a fundamental tool for ISPs, but it had to be used properly. The last two conditions concerned fairness and competitive neutrality. The CRTC prevented ISPs from using traffic management practices anticompetitively or preferentially. This included prohibiting ISPs from using Deep Packet Inspection for anything other than traffic management. Prominent advocates of net neutrality, such as Michael Geist, Canada Research Chair of Internet and E-commerce Law at the University of Ottawa, and Milton Mueller, scholar of internet governance, cautiously embraced the framework.
WoW gamers knew that, if Rogers Internet had deliberately throttled WoW, then it had broken these new rules. No one suspected that Rogers was throttling WoW traffic for anti-competitive reasons. Posters in Shifthead’s thread speculated about numerous explanations, including the throttling being the result of error. This would be an acceptable excuse, but posters debated whether Rogers Internet knew about the issue. If the company did know that WoW had connection issues, then why had it not released a statement? Why had it not updated its traffic management disclosure? Gamers debated how best to bring the issue to the attention of Rogers Internet. Could they find some evidence to convince Rogers or to justify a complaint to the CRTC?
These puzzled gamers revealed a common criticism of the CRTC’s traffic-management policy, and indeed of net-neutrality legislation in general. While regulations limited the work of daemons, they lacked oversight. The onus rested on the complainant to provide evidence of violations of these principles (a common problem for ex post facto rules). As WoW gamers were learning, finding answers was difficult. In a blog post about the same issues affecting WoW gamers, Christopher Parsons, an expert in telecommunications and privacy now at the University of Toronto’s Citizen Lab, called for third-party oversight to watch for misapplications of traffic management and to alert the public. Without such a third party in place, WoW gamers were left discussing the problem with each other, guessing at explanations.
These affected gamers offer a first step toward understanding a public response to flow control and to daemonic media. As the WoW gamers replied to each other, they formed part of a public dedicated to proving daemonic effects. “Publics” is a term used by the pragmatists Walter Lippmann and John Dewey. Publics, to Dewey, are “all those who are affected by the indirect consequences of transactions to such an extent that it is deemed necessary to have those consequences systematically cared for.” Though Dewey predates affect theory, it is telling that he defined publics as the affected, a suggestion that publics initially did not know exactly what was bothering them. No sense of being a public or knowledge of their problems prefigures the transaction, but once formed, publics function both as a means to acquire knowledge and as a political resolution (not unlike the advertisements discussed in chapter 5). Indirect consequences demand a response. Affected strangers become drawn into participating in a collective understanding. As people become more aware of the consequences, they become more aware of their problem and their role in a solution. Democracy, to Dewey, succeeds in resolving the complexities of life through this process of affected persons systematically caring for the indirect consequences of transactions that got them involved in the first place.
Publics are a conceptual and normative way to address technical controversies and open black boxes, such as through the work of Noortje Marres and Bruno Latour, as well as that of Jane Bennett, on whom I draw more frequently. According to Bennett, a public “is a contingent and temporary formation” that forms after being “provoked to do so by a problem, that is, by the ‘indirect, serious and enduring’ consequences of ‘conjoint action.’” There are many problems that could cause people to form a public: scientific controversies, events, debates, and problems like the disconnection issues in WoW. To those studying infrastructure, problems might be seen as an outcome of infrastructural inversion: when the system breaks, its workings become more apparent. Digital technology, however, has often been seen to complicate the formation of publics. In his studies of networked-information algorithms, Mike Ananny argues that algorithmic sorting convenes publics, though these computational associations are rarely apparent to those affected. Ananny gives the example of computational analysis of Facebook data sorting people by sexual orientation. Targeted advertising also aggregates people into fluid and self-correcting probable categories. As John Cheney-Lippold argues, these algorithmic calculations of relevance become a feedback loop in which people come to identify with their calculated demographics. Daemons also convene publics, as WoW gamers were learning.
World of Warcraft Gamers and Their Problems
Posters in Shifthead’s thread tried to solve the problem, searching for proof that connected their issues to Rogers’s traffic management practices. In doing so, these publics formed affective bonds and relationships similar to Zizi Papacharissi’s descriptions of “networked structures of feeling.” Gamers had difficulty finding answers linking their connection issues to Rogers Internet’s infrastructure. The only evidence was a post by a Rogers employee in a DSLReports forum on October 28, 2009. The employee, using the handle “RogersKeith,” admitted that changes in the company’s traffic management had disrupted some non-P2P networks, but he did not directly mention WoW. RogersKeith promised to respond to the issue as quickly as possible. No updates had been made to Rogers’s traffic management disclosure on its website.
Conversations in the thread turned to discussions of how best to raise awareness of the problem. While niche-oriented, internet-focused outlets had covered Rogers’s traffic management in 2010, coverage had been largely absent since. Posters in the thread believed they made up only a small portion of those affected by Rogers Internet’s throttling. Other games and other gamers might also be affected. Furthermore, their experience of being throttled was only one aspect of the wider problem of violations of net neutrality that affected all of Rogers’s customers. Could those other, regular internet users be enlisted in the cause? The WoW posters, for their part, cited issues like net neutrality and privacy that implicitly connected their own concerns with a larger, more inclusive public.
The gamers debated how to address this lack of publicity. A poster by the name of “Demonomania” suggested creating a petition. Other posts linked to different forums where, it turned out, others were discussing the issues as well. To troubleshoot similar issues, “Goldmonger” then opened a thread on the game’s forums on January 26, 2011. He asked people to run a test (technically a traceroute) and post the results, including their location, operating system, and ISP, on the thread. He hoped that a running list of this technical information would help the game’s owners and the Canadian ISPs fix the issue. On the Rogers support forums, a customer using the alias “Ressy” asked Rogers to explain the issue. Her post generated fifty-eight pages of replies. She posted a link to Shifthead’s thread. These threads pointed both to the scope of the issue and to the fragmentation of these affected strangers.
These threads illustrate the convening of a public. If publics begin as “a relation among strangers,” according to Michael Warner, then some aspects of the transaction must cause people to think they might be affected by a common issue. He calls this cause the reflexive apparatus, and it plays a crucial part in the forming of a public by allowing people to think of themselves as part of something collective. Looking at climate-change activism, for example, Marres contends that domestic appliances have the potential for “dramatizing connections between practices in here and changing climates out there.” A new dishwasher arriving in the home, Marres suggests, brings with it a political opportunity for its users to reflect on water use and its consequences for the environment. Drawing these kinds of connections suggests an approach to climate-change activism that seeks to raise awareness of the links from the private space of the home out to the environment, instead of attempting to inject an environmental awareness into a detached domestic sphere.
The reflexive apparatus is usually obvious. Traditional media have prominent reflexive apparatuses. Benedict Anderson suggests that newspapers created reading publics integral to early nationalism. As he writes, “the date at the top of the newspaper, the single most important emblem on it, provides the essential connection—the steady onward clocking of homogeneous empty time.” The date allowed the public to imagine they existed in a common time, and thus could relate to issues as nations rather than as mere individuals or families. Warner argues, much as Anderson does, that newspapers and television programs have a punctual temporality of circulation that produces a routine capable of fostering a subjectivity, or publicness, from their audience of strangers. The reflexive apparatus allows the possibility of systematically caring about an issue.
What WoW gamers were learning is that daemons frustrate the convening of publics. As forum poster “Haakonii” noted, traffic management “is transparent, undetectable and beyond the technical comprehension of the man-on-the-street.” Daemons leave little trace of their influence. Screens depict only the outputs of calculations. Ganaele Langlois argues that the web includes both what users see represented on screen and a-semiotic encodings “that work through the transformation of human input (meaningful content and behavior) into information that can then be further channeled through other informational processes and transformed, for instance, into a value-added service.” Much of the internet (algorithms and software processes) functions a-semiotically, or without signification. Daemons, like algorithms and software, also function in the microtemporalities of computing, a scale imperceptible to humans. Calculations occur too quickly for a user to notice, and daemons, by default, do not leave a record. Moments of reflection evaporate even though their implications endure.
However, it is more than just that daemons are unrepresented and untraceable; often, difference is the only commonality of daemonic publics. Daemons function dynamically, which often privatizes affects by network or by user. A user experiences a particular array of affects composed through their own way of being online. Gilles Deleuze describes this type of subjectivity as “dividuality.” He writes: “We no longer find ourselves dealing with the mass/individual pair. Individuals are ‘dividuals,’ and masses, samples, data, markets, or ‘banks.’” Individuals dissolve into a variety of profiles and network classes. One user might have some of their traffic throttled while others experience acceleration. These experiences appear unique or individual, a product of targeting and redlining. Dividuality increases differences and fragmentation as it dissects users into dividuals. The body public, in other words, is thus constantly dissected and reassembled. Marco Deseriis expresses this condition well:
By breaking down the continuity of the social bios into dividual sessions and transactions, the engineer of control produces what Franco Berardi calls a “cellularized” info-time, an abstract time that is no longer attached to the body of any specific individual but generated by the automated recombination of dividual fragments of time in the network.
People remain dividualized. Daemons seemingly destabilize the traditional subjectivity of media and publics. This suggests that publics simply cannot form because the myriad of dividual sessions thwarts the necessary reflexive apparatus. Gamers, then, faced an uphill battle in trying to convene a public out of these disparate experiences of disconnection and lag.
Will This Problem Fix Itself?
Posters did try to find solutions. Some tried contacting Rogers. One poster described the results of such efforts: “Just called Rogers, complained, and they said ‘we do not throttle people.’ Called back, got a different service rep, and he said ‘we have not had other complaints about this issue.’” Through calls and posts, gamers began to compose themselves as they assembled more and more evidence about this public problem. By the second page of the thread, comments had begun to discuss Deep Packet Inspection (DPI), concerns about this technology raised by the Canadian privacy commissioner, and the role of the CRTC in regulating internet traffic management practices. Although posts did not solve the issue, they did help these gamers understand themselves as a public.
But this nascent public had difficulty finding a resolution. Some posters suggested avoiding the problem rather than solving it. One user experienced better connections by tunneling to WoW servers using a paid service called WoWtunnels. Not unlike IPREDator, discussed in the previous chapter, subscribers paid $3.95 (USD) per month to connect directly to WoWtunnels’ server, which was located closer to the WoW servers. WoWtunnels extricated gamers from the problem (to recall Bennett’s question, mentioned at the start of the chapter, of what publics should do with broken infrastructures). But this solution carried a risk: Blizzard, the game’s manufacturer, was forever looking to stop cheaters and routinely banned players with suspicious account activity. One replier, seemingly frustrated, summed up the issue: “So wait, are we Rogers users all SOL from now on or will this fix itself?” Did gamers actually have a way to resolve their affliction?
Customers weren’t out of luck, Shifthead replied. People needed to raise the issue. Shifthead posted, “if more people call in about it, the better the chance Rogers will revert the changes.” Calling in, to Shifthead, meant talking with Rogers and contacting the CRTC. Indeed, Shifthead wrote that the CRTC had already received one complaint. Could others write more? How could they provide evidence of the issue? How could they enlist others to help them prove it? Shifthead was calling for those in the forum to systematically care for the issue. The challenge, however, was to find ways to investigate the hidden world of daemons. His call can be heard as part of a larger refrain on the internet: how can publics embrace their daemons?
The Demos and the Daemon
According to Dewey, publics are an “immense intelligence.” This intelligence is active; it comes from members of the public being participants rather than spectators: “If we see that knowing is not the act of an outside spectator but of a participator inside the natural and social scene, then the true object of knowledge resides in the consequences of directed action.” One cannot see without being an active part of the world. Yaron Ezrahi quotes the above passage from a commentary on Dewey and adds that “seeing is always an aspect of acting and interacting, of coping with problems and trying to adapt and improve, rather than just contemplate, mirror, or record.” Ezrahi’s interpretation inverts the relationship between publics and information, positing that publics do not just receive information, but produce it. Knowledge results from experience and process, not just witness and spectacle. Through the stories and conversation found in venues like the WoW forums, people come to know themselves as forming a public. These formational moments offer an opportunity to create new knowledge about the world, creating the possibility of publics becoming aware, becoming engaged, and developing into a tangible political force capable of addressing the provoking issue.
Public research works as a kind of recursion. Douglas Engelbart, ARPA researcher and early architect of personal computing, adapted the term “bootstrapping” (generally meaning developing something without assistance) to describe a form of positive feedback: a loop, or recursion, in which scientists, through their research, improve their ability to do research. Building a computer was both research and a better way to do research, as the computer could aid future research. Geoffrey Bowker later uses bootstrapping to describe how infrastructure needs to exist in order to exist. Chris Kelty describes a process like bootstrapping in his work on recursive publics, which are “publics concerned with the ability to build, control, modify and maintain the infrastructure that allows them to come into being in the first place.” Bootstrapping emphasizes this recursive move when publics come to know themselves, the issues, and the solution simultaneously.
Research helps a public understand its relation to daemons. Though Dewey emphasized the human side of publics, Bennett argues that the conjoint action of a public includes more than humans: “For is it not the case that some of the initiatives that conjoin and cause harm started from (or later became conjoined with) the vibrant bodies of animals, plants, metals, or machines?” Could daemons not be added? The formation of a public then involves seeing itself as part of a larger system. Bennett gestures to Latour’s idea of a parliament of things as a way to imagine a heterogeneous public putting daemons and demos (the people) on equal footing, although she rejects his tendency for horizontalism. The forming of a public, then, involves more than just humans becoming aware; it requires a new sense of the world.
Such a task is well-stated by Michel Callon, Pierre Lascoumes, and Yannick Barthe in their writings on sociotechnical democracy. They argue that publics, through their formation, compose a common world. Publics begin as what they call “uncertainties of groupings” that lack a collective understanding. “Composition” designates a collective that seemingly creates a new grouping, but it also casts the identity of its members in flux. Embracing daemons is not an orientation to reality, but a composition of a common world that involves new understandings of the issues and senses of self. It is a recursive process similar to bootstrapping, or as they write, compositions “simultaneously define (or redefine) the significant entities.” Controversies like the nascent one facing WoW gamers are “powerful apparatuses for exploring and learning about possible worlds.” The challenge is to develop methods to compose the world that are inclusive of both gamers and daemons.
In what ways could the gamer publics better understand their relation to daemons and their larger infrastructures? The challenge involved a kind of public research, finding methods that could reveal the work of daemons. Shifthead began: “Are there any definitive tests that can prove this is happening to you?” In other words, what tools might be able to translate the effects of flow control, or the feelings of frustration, into something more tangible? Shifthead did not have an answer and dispelled hopes that simple measures could detect the issue: “I’m afraid not. Because the throttling is only dropping certain packets, pings and [traceroutes] are completely unaffected.” Pings and traceroutes are popular—and very old—tools for internet measurement, as discussed in the appendix. They might be counted among the first instruments for public research into flow control. They were not the only ones.
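Shifthead’s claim that pings and traceroutes are “completely unaffected” follows from how signature-based throttling works. The sketch below is a purely hypothetical model (its ports, packet sizes, and thresholds are invented for illustration, not Rogers’s actual configuration): a daemon drops only packets matching a crude P2P signature, so small diagnostic probes pass untouched.

```python
# Hypothetical sketch of signature-based throttling. Ports, sizes, and
# thresholds are invented for illustration, not Rogers's configuration.

def classify(packet):
    """Crude heuristic in the spirit of DPI middleboxes: large payloads
    on unrecognized ports get tagged as P2P-like traffic."""
    if packet["port"] in (80, 443):   # recognized web ports pass as web traffic
        return "web"
    if packet["size"] > 1200:         # big payloads on odd ports look like file sharing
        return "p2p"
    return "other"

def link(packets, throttled=("p2p",)):
    """Forward every packet except those in a throttled class."""
    return [p for p in packets if classify(p) not in throttled]

pings = [{"port": 0, "size": 64}] * 100       # small ICMP-style probes
game = [{"port": 3724, "size": 1400}] * 100   # bulk game-state updates

print(len(link(pings)), "of 100 pings delivered")        # all 100 arrive
print(len(link(game)), "of 100 game packets delivered")  # none arrive
```

Because the probes never match the signature, latency tests report a healthy connection even while the game flow is being dropped, which is precisely why the gamers’ familiar tools failed them.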
On the third page of the thread, “Haakonii” mentions a tool named “Glasnost” that was created by one of the leading internet-measurement projects, the Measurement Lab (M-Lab). The tool simulates different packet flows (Flash, BitTorrent, and HTTP) and compares their performance. Hypothetically, all flows should perform equally. If not, then the Glasnost results provide some evidence of traffic management. Glasnost, however, lacked the ability to simulate WoW packets, and Shifthead replied, “this test is fairly useless for the type of throttling Rogers does.” A fair point, but one that did not remedy the public’s inability to understand its problem.
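Glasnost’s comparative logic can be sketched in a few lines. The code below is my reconstruction of the idea, not M-Lab’s implementation, and the transfer figures are invented: send the same payload framed as two different protocols, measure each flow’s throughput, and flag a large gap as evidence of differentiation.

```python
# A reconstruction of Glasnost-style flow comparison, with invented numbers.

def throughput_kbps(bytes_delivered, seconds):
    """Throughput of a measured transfer, in kilobits per second."""
    return bytes_delivered * 8 / seconds / 1000

def differentiated(rate_a, rate_b, tolerance=0.2):
    """Flag traffic management when one flow runs markedly slower than the other."""
    slow, fast = sorted((rate_a, rate_b))
    return (fast - slow) / fast > tolerance

http_rate = throughput_kbps(6_000_000, 10)  # 6 MB in 10 s framed as HTTP
bt_rate = throughput_kbps(1_500_000, 10)    # the same bytes framed as BitTorrent

print(f"HTTP: {http_rate:.0f} kbps, BitTorrent: {bt_rate:.0f} kbps")
print("evidence of differentiation:", differentiated(http_rate, bt_rate))
```

As Shifthead observed, such a comparison can only implicate flow types the tool knows how to emulate; since Glasnost could not frame its probes as WoW traffic, the gamers’ case fell outside its reach.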
Shifthead and others were searching for what I call “mediators,” a term borrowed from Deleuze and by which I mean tools and methods to include daemons in publics. Deleuze writes:
[Mediators are] not so much a matter of winning arguments as of being open about things. Being open is setting out the “facts,” not only of a situation but of a problem. Making visible things that would otherwise remain hidden.
Note his use of “making [the French rendre, to render, to make, or to return] visible,” as opposed to “finding” or “revealing.” Broadly speaking, internet-measurement tools can function as a mediator: they return the hidden work of daemons to the public. Mediators help publics know the transaction that caused them to be affected. Mediators in this sense are both dynamic, in that they become active projects to observe flow control, and static, since they endure as databases and logs of evidence. In doing so, mediators help publics better understand themselves and their problems. Mediators function to bootstrap the reflexive apparatus and convening of publics. By revealing the modulations of flow control, internet measurements publicize the dividual effects and allow daemons to be part of the conjoint action of publics. I discuss more mediators in the appendix.
Not all mediators are technical or simply concern daemons. As mentioned earlier, blog posts and forum threads also help publics understand themselves. What the posters sought were ways to convene those members of the public not already reflexively aware of their association. Mediators, from blog posts to forums to internet measurements, become a way to convene these publics. This process of bootstrapping leads to a bigger public, more capable of composing a common world.
Cataclysm: Teresa Murphy and Rogers Internet
Publics did eventually convene to resolve the WoW controversy. The desperate threads and theories began to connect through the work of Teresa Murphy. She had been a WoW gamer since 2006 and noticed a strange issue when visiting her sister: Murphy could not connect to the game from her sister’s Rogers Internet connection. She told the blog The Torontoist: “Mostly, I thought it was weird . . . you just couldn’t see a cause for the problem.” Her attempts to diagnose the problem using internet-measurement tools failed. She could connect to the internet, but something between her and the WoW servers disrupted that connection. Using her alias “Ressy,” she started a thread on the Rogers Internet technical support forums on January 17, 2011. Her initial post explained the issue and asked “Who can I talk to [in order] to get this fixed?” That turned out to be complicated.
Rogers employees replied two days after Murphy’s first post, assuring her that an investigation was under way. Her first replies to the support agent were hopeful, as she expected the matter to be fixed soon. Her tone deteriorated as a week passed without any resolution. Ten days after her first post, she reported hearing from a WoW employee who said they had not been contacted by Rogers Internet. Rogers support staff continued to post in the forum to state that the matter was still being resolved. The official replies ignored new evidence posted in the thread.
As Murphy waited for a clear answer from Rogers, she began to connect the threads between the WoW forums and the Rogers forums. She actively posted in Shifthead’s threads in the WoW forum, as well as in others dedicated to the Rogers issue. She responded to questions from other gamers, helping them understand the issue and correcting rumors. In her posts, she explained her conversations with both Blizzard and Rogers Internet. She also learned what others discovered in their own complaints and investigations. One poster in the Rogers forum shared a link to another thread on DSLReports. Another poster shared the news that Starcraft 2 players were also experiencing problems and posted a copy of a chat log with Rogers Internet technical support. Another poster, using the alias “irix,” wrote that Rogers uses Cisco SCE traffic management devices. Irix explained that these devices use “a combination of packet content inspection, number of connections, connection establishment / teardown rate, packet size and other traffic characteristics to classify traffic.” Irix continued, writing that changes in WoW’s code meant that “the SCE can sometimes, especially under higher traffic conditions in WoW, mis-categorize WoW traffic and cause it to be rate limited / throttled.” Irix’s post proved decisive, offering a clear alternative technical explanation of the problem. Technical support did not reply to these comments.
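Irix’s description suggests the general shape of such a classifier. The following is a minimal, hypothetical sketch, not Cisco’s actual logic: the thresholds, field names, and scoring are invented for illustration. It shows how heuristics tuned to spot P2P swarms could, under heavy play, misfire on game traffic that happens to use many small packets and frequent connections.

```python
# Hypothetical sketch of heuristic traffic classification, loosely based on
# irix's forum description of the Cisco SCE. All thresholds and scoring
# rules here are invented for illustration; they are not Cisco's policy.

from dataclasses import dataclass

@dataclass
class FlowStats:
    concurrent_connections: int   # connections open from the same subscriber
    setup_teardown_rate: float    # connection churn per second
    avg_packet_size: int          # bytes
    payload_matches_p2p: bool     # deep packet inspection hit

def classify(flow: FlowStats) -> str:
    """Label a flow 'p2p' or 'other' using crude heuristics."""
    if flow.payload_matches_p2p:
        return "p2p"
    score = 0
    if flow.concurrent_connections > 20:   # P2P swarms open many peers
        score += 1
    if flow.setup_teardown_rate > 5.0:     # rapid churn resembles a swarm
        score += 1
    if flow.avg_packet_size < 200:         # many small packets
        score += 1
    return "p2p" if score >= 2 else "other"

# Under heavy play, a game flow can show swarm-like statistics: many small
# packets plus frequent connection churn push the score over the threshold,
# so the game traffic gets misclassified and rate limited.
wow_under_load = FlowStats(
    concurrent_connections=25,
    setup_teardown_rate=6.0,
    avg_packet_size=120,
    payload_matches_p2p=False,
)
```

The point of the sketch is that nothing in such a classifier inspects what the traffic *is*; it only scores what the traffic *looks like*, which is why a change in WoW’s traffic pattern could flip its label.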
Murphy’s activities culminated on February 14, 2011. She posted on the Rogers Internet forum that she had sent a complaint to the CRTC outlining the problem affecting WoW gamers and her difficulties getting a response from Rogers. The CRTC responded to Murphy’s letter on February 23, 2011. She wrote to her fellow WoW gamers: “I think I love the CRTC. They accepted my complaint against Rogers throttling gaming, stating it’s P2P traffic.” The CRTC addressed its letter to both Murphy and Rogers Internet and asked Rogers to explain the issue. Thus, the burden of evidence shifted to Rogers, but only briefly. The complaint remained unresolved for the rest of 2011.
Rogers responded on March 22, 2011. Their letter admitted that an update to its traffic management practices interfered with WoW traffic, but they claimed that the issue occurred only when a customer connected to a P2P network while playing the game. This meant that if a P2P network was running on any connected device (say, another computer in a household sharing a connection among a few computers), WoW gamers might experience connection issues. Rogers also claimed that they had known about the issue and tried to fix it. Their initial solution did not work, but they promised a new one by June. Until then, Rogers suggested gamers turn off any P2P applications while running the game, including disabling the official P2P network that Blizzard used to share updates for the game.
Murphy rejected Rogers’s explanation, drawing on what she learned in the forums. P2P had nothing to do with WoW disconnection issues. The theory had already been rejected in the forum threads. Weeks before Rogers’s reply, on February 9 to be precise, Murphy had responded to one poster to explain that she used P2P only during patching, an activity too infrequent to explain the issue on its own. Brianl, a WoW employee, replied the same day to corroborate Murphy’s explanation:
Your game connection is not on p2p, so . . . you’re welcome? :)
If your ISP thinks that is what is causing the issues, I humbly request that you ask them to contact us directly. We will be more than happy to speak with them and discuss why their customers may be seeing these issues.
Furthermore, irix’s post citing issues with the Cisco SCE appliance cast doubt on Rogers Internet’s explanation. These competing explanations marked a point of collision between two worlds: Rogers Internet’s public face and the gamer publics.
Murphy responded to Rogers Internet via a letter to the CRTC on March 29, 2011. No better record of the controversy likely exists than her letter. It included data collected by gamers, records of interactions with Rogers, and complaints posted in the threads mentioned above. She asked, drawing on her knowledge of Rogers Internet, why WoW traffic suffered even when no P2P networks were running. The letter documented her concerns that Rogers had misinformed its customers of the issue. Murphy also explained the technical side of WoW patching and provided a timeline that raised questions about why Rogers had not admitted the error sooner. Murphy offered a competing explanation of the issue: she argued that Rogers Internet used Cisco devices to shape traffic and that these devices misapplied rate limits to WoW traffic (similar to irix’s claim).
Murphy also mentioned that similar issues with WoW connection had been reported and resolved quickly in the United States. Indeed, “Brianl,” the same technical support officer helping Shifthead and others, had addressed similar disconnection issues in a thread from November 2010, explaining that Blizzard “changed [its] traffic pattern, and this is what is triggering traffic management systems to throttle individual connections.” Later, on November 18, Brianl explained that Cisco had to change its policy maps (discussed in chapter 5) to correctly classify WoW traffic and expected a patch in late November. His comments corroborate the belief among WoW gamers that their connection issues had nothing to do with P2P networks. The questions for Murphy (and, by extension, the CRTC) were why Rogers Internet had not applied these patches and why it cited P2P traffic as the cause of the misclassification.
For all my previous talk of publics, this public response to flow control was initially something of a one-person operation. Teresa Murphy had been the sole representative of the WoW gamers to the CRTC for most of this complaint process. Her submissions alone had kept the CRTC connected to the discussions happening on WoW forums. Her letters passed on the complaints and findings of her gaming peers. She also provided an important counterargument to Rogers Internet’s descriptions of their infrastructure, which conflicted with the explanations refined in the discussion threads. Like her peers in the forums, she still had difficulty publicizing the issue. Her activity had attracted no mainstream press attention and only scant coverage in sympathetic blogs until late July. OpenMedia, Canada’s digital policy advocacy group, had blogged about her case back in March. Michael Geist, Canada’s foremost internet law expert, had blogged multiple times, mostly to update his readers about the status of the complaint. Yet she had also been making connections with her peers that led to a change in tactics.
Through Twitter, Murphy met another concerned party, Jason Koblovsky. He had participated in the CRTC internet-traffic-management hearings. Koblovsky mentioned to Murphy that he wanted to create an organization to represent Canadian gamers. Together they cofounded the Canadian Gamers Organization (CGO) on July 26, 2011. In many ways, the group became a mediator both for gamers to understand their common issues and for the wider public concerned with the state of the internet in general. Indeed, OpenMedia allowed the group to guest blog on its website, connecting these gamers with one of the largest internet-oriented advocacy lists in Canada. This attention proved an important push to increase pressure on Rogers Internet and the CRTC to resolve the issue.
It also helped that CGO filed a new complaint with the CRTC on July 31, 2011, claiming that Rogers Internet interfered with another popular online game, Call of Duty: Black Ops. Their press release included a description of another test—a mediator—that showed the game suffered due to traffic management by Rogers Internet. The advocacy group also expanded their policy intervention by filing complaints with the Ontario Ministry of Consumer and Business Affairs. Their advocacy soon attracted more press attention, with the Huffington Post running a story on the issue on August 22, 2011.
More than just expanding concern beyond a single game, CGO sustained the issue long enough for it to be resolved. On September 2, Rogers admitted that the issue affected non-WoW gamers but downplayed the problem, again claiming that it rarely happened and could be avoided if gamers turned off P2P networking. This time, Rogers Internet’s response attracted more attention. The Canadian Broadcasting Corporation (CBC) reported on the issue. Internet throttling again became of interest to the press.
A different mediator also made a timely intervention in Canadian media coverage. Milton Mueller, a leading expert on internet governance, used the Glasnost internet measurement project, the same tool Shifthead dismissed in the forums, to gather data about the global use of traffic management. Canada and Rogers appeared at the top of the list. Though Glasnost did not help gamers solve their problem, it helped journalists understand the matter. The CBC subsequently reported that “Rogers throttles file-sharing traffic from BitTorrent more than any other Internet provider in North America.” Mueller’s findings reinforced a narrative in Canadian press coverage that Canadian ISPs had a problematic relationship with internet traffic management.
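Glasnost’s basic method is to run an application-like flow (emulated BitTorrent) and a control flow of random bytes over the same path, then compare their throughputs; a consistent, large gap suggests the ISP is treating the application differently. The comparison at the heart of that method can be sketched in a few lines. The sample throughput numbers and the 0.8 threshold below are invented for illustration and are not Glasnost’s exact parameters.

```python
# Toy sketch of a Glasnost-style differentiation test: compare the median
# throughput of application-like flows against control flows of random
# bytes sent over the same path. Numbers and threshold are illustrative.

from statistics import median

def detect_throttling(app_kbps, control_kbps, threshold=0.8):
    """Flag possible differentiation if the application flows' median
    throughput falls well below the control flows' median."""
    return median(app_kbps) < threshold * median(control_kbps)

# Invented sample runs: emulated BitTorrent flows vs. random-byte controls.
bittorrent_runs = [80, 75, 82, 78]     # kbit/s
control_runs = [950, 940, 960, 955]    # kbit/s
throttled = detect_throttling(bittorrent_runs, control_runs)
```

A tool like this cannot say *why* the gap exists, only that flows carrying one kind of payload fare worse than otherwise identical flows, which is exactly the kind of symptom journalists could report.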
As much as the work of publics led to the enforcement of the CRTC’s net neutrality framework, the complaint was resolved outside of public purview. The CRTC passed the complaint on to its Compliance and Enforcement Sector on October 27, 2011, which meant that the regulator finally accepted that there was an issue warranting penalties. Gamers did not hear from the CRTC until the next year. On January 20, 2012, Andrea Rosen, Chief Compliance and Enforcement Officer, wrote to Rogers Internet to explain that its investigation had found that its Cisco equipment “applied a technical ITMP to unidentified traffic using default peer-to-peer (‘P2P’) ports.” The CRTC, in other words, found that Rogers Internet was wrong, though not entirely. P2P networks might interfere with WoW traffic, but only because Rogers daemons misclassified game-related packets. More importantly, the CRTC found that Rogers Internet had implemented a controversial policy to throttle any unknown traffic on P2P ports. Just as net neutrality advocates had warned, Rogers’s polychronous optimization had foreclosed the unknown or the unpredictable. The perceived need for a manageable network outweighed the risk of unknown applications causing upstream congestion.
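The rule the CRTC described can be stated in a few lines. In this sketch the port list and the treatment labels are illustrative assumptions, not Rogers’s actual configuration; what matters is the shape of the policy, in which failure to identify traffic becomes grounds for throttling it.

```python
# Sketch of the policy the CRTC described in its January 2012 letter:
# traffic the classifier cannot identify, arriving on default P2P ports,
# gets rate limited. Port list and labels are illustrative assumptions.

from typing import Optional

DEFAULT_P2P_PORTS = {6881, 6882, 6883, 6884, 6885}  # common BitTorrent ports

def treatment(protocol: Optional[str], port: int) -> str:
    """Return the treatment for a flow: 'limit' or 'pass'.
    protocol is None when deep packet inspection fails to identify it."""
    if protocol is None and port in DEFAULT_P2P_PORTS:
        return "limit"   # unidentified traffic on a P2P port is throttled
    return "pass"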
The CRTC ended its letter by asking Rogers either to rebut its evidence or to explain how it planned to comply with its regulation. The conversation ended in February 2012 when Rogers agreed to phase out any use of internet traffic management for P2P applications. This effectively ended the company’s attempt to optimize its infrastructure. Today, Rogers Internet’s traffic management policy simply states: “Rogers uses a variety of network management techniques. These techniques have evolved as the Internet has changed. We continue to manage the network to limit spam, viruses and other security threats.” It remains for the next public to hold that statement accountable.
Daemonic Media Policy
The controversy above raises important questions for policy and regulatory responses to daemons. This response will require reconsideration of each party’s role in the affair and of accountability for distributive agency. Bennett, in her own study of infrastructural failure in the North American blackout of 2003, wonders how to assign blame after breakdowns caused by distributive agency. Formulating a policy response to the blackout was frustrating, as there was nobody to blame. Her theory of agency “does not posit a subject as the root cause of an effect” because “there are instead always a swarm of vitalities at play.” With this in mind, what should be done? How should Rogers Internet respond to their unruly daemons? What could the regulator do to acknowledge this distributive agency? Were gamer publics essential to the complaint’s success or merely spectacle? Answers to these questions require regulatory and policy principles concerning the accountability and management of optimizations. These concerns have their closest affinities to the emerging debates around regulating algorithms and bots.
Certainly a daemonic media policy would mean a reorientation of accountability in internet infrastructures. Ethical debates about algorithms frame the problem as an accountability gap between “the designer’s control and algorithm’s behaviour.” Rogers discovered this gap when its daemons misclassified WoW traffic. This gap is even wider in an optimization enforced by many daemons working in tandem, and in these ethical debates, “insufficient attention has been given to distributed responsibility, or responsibility as shared across a network of human and algorithmic actors simultaneously.” In the case above, accusations of fault could be made against Rogers Internet, Cisco Systems, and perhaps Blizzard. All three parties faced the dilemma of discovering the problem and assigning blame. This is called a problem of “traceability.” Forum posts, technical reports, and independent audits all had to identify the culprit. These lines of investigation collectively pulled the Cisco router into public view—for a time.
Distributed agency should not lead to an abdication of accountability. Bennett, for her part, plays with that idea, finding two political possibilities: “It is ultimately a matter of political judgment what is more needed today: should we acknowledge the power of human-nonhuman agencies and resist a politics of blame? Or should we persist with a strategic understatement of material agency in the hopes of enhancing the accountability of specific humans?” Rogers Internet, to her point, could be seen either as at fault or as a smaller player in a bigger system. They were accountable, but not necessarily to blame. They were slow to respond, but without clear intent. The ISP’s unexpected or ill-advised network management policy caused as much trouble for the company as for gamers. Their daemons simply did not behave the way they planned. Is internet regulation now hopeless?
Perhaps it is more a matter of responsibility than of blame. No matter how distributed the agency, Rogers Internet remains the responsible party. The company’s daemons malfunctioned, and they had the authority to fix them. The resolution of the issue ultimately depended on them. Though their motivations seem ambiguous, their responsibility is clear. My point reflects one of Bennett’s comments about the electrical blackout she uses to discuss distributive agency:
Though it would give me great pleasure to assert that deregulation and corporate greed are the real culprits in the blackout, the most I can honestly affirm is that corporations are one of the sites at which human efforts at reform can be honestly applied, that corporate regulation is one place where intentions might initiate a cascade of effects.
Given the humanism of media policy, a response must apply to Rogers first, before daemons, if only to ensure a cascade of effects to resolve gamers’ issues. Beyond this case, more thought should be given to the traceability problem in distributed systems, as called for in algorithm studies. Perhaps causality should be abandoned in favor of probability (a nod to the roots of the daemon). What if there were a threshold of traceability, an indication that a certain party holds a majority or large stake in the matter? Such a turn might require a reconsideration of accountability, and perhaps a sense of forgiveness for mischievous daemons.
ISPs could also be more proactive in minimizing complexity in their infrastructures. Rogers Internet installed a daemon that they did not fully understand or control. That might hold a warning to network administrators when they install new daemons in the future. Do they know all that a daemon may do? ISPs need to better acknowledge which daemons they allow to share their infrastructure. Perhaps ISPs should aim to limit the unforeseen consequences of complexity by removing daemons. Lessened complexity might also be an added benefit of network neutrality rules that restrict the types of daemons that can be installed on a network.
In any case, ISPs need to admit they are not the only ones to speak on behalf of their daemons. Instead, they should listen more to others who understand and interpret their networks. That could be difficult for a company accustomed to representing its infrastructure, but Murphy and her fellow gamers knew a different side of that infrastructure. They too represented it publicly in the end. It is unclear how an ISP could admit not fully understanding its infrastructure in a regulatory context. Perhaps a trade-off could be made between responsibility and culpability in which ISPs are allowed to listen and admit mistakes with lower penalties. That might require more of a change in the ISPs’ public relations than in the regulatory context. Even now, the consequences of violating Canadian telecommunication law are low enough to allow domestic ISPs to be more publicly engaged.
The unruly daemons of Rogers Internet pose some new challenges for traditional media regulators. They can no longer avoid studying the deep materiality of infrastructure. The CRTC, for its part, prefers to be technologically neutral, avoiding infrastructural detail in its decisions. But daemons demand more attention, and not just their active configuration but their possible uses as well. Regulations might require an ISP to disable a daemon’s more advanced packet inspection features now, but once installed, these capabilities are ready to be used in the future. Reckoning with daemons might require drawing on new areas such as robot law. Sandra Braman, writing about the development of the internet through a review of the Requests for Comments (RFCs) archive, reflects:
To those responsible for ensuring a network that offers all the capacities the Internet offers when unconstrained, the first pass at building a network (while simultaneously conceptualizing and reconceptualizing just what that network should be) very quickly yielded a keen sense of the importance of both daemon and human citizens; further research, conceptualization and theorization in this area would be helpful for those currently building out the domain of robot law and in other areas of the law in which machinic liability is a matter of concern.
Braman’s reference to robot law highlights one important pathway to daemonic media policy. Machinic liability might offer another way to frame the net neutrality debate as a question of daemonic autonomy and the risk of out-of-control daemons.
Regulators, journalists, and academics all have an important role in the formation of publics that could speak for and with daemons. The WoW gamers succeeded in large part because the CRTC responded to and validated their concerns. Institutions can be mediators for publics. Hearings by the U.S. Federal Communications Commission (FCC) and the Canadian CRTC convene publics around a shared issue and give them a chance to speak for themselves. However, the formality of the complaint process at the CRTC clearly created barriers for participation. Murphy succeeded in spite of the formal complaint letters she was asked to produce. How could regulators clarify their expectations for public participation? Policy scholar Jonathan Obar has argued that form petitions and other tools created by advocacy groups enable better public participation. Though often accused of being a kind of astroturfing or faked grassroots support, these tools help translate public opinion into a language more accessible to the regulator.
Regulators might look to validating mediators (like those discussed in the appendix) that help the public diagnose and document issues. Such a task differs from calls for greater transparency or more data. Calls for transparency often assume that seeing how things work inevitably leads to understanding, trust, and regulatory outcomes. But data can be opaque, and open data has attracted critiques similar to those made against a lack of data. The idea of openness often stands in for the type of participation and accountability it hopes to inspire. Open data, further, requires intermediaries and experts to actually use it. Indeed, the CRTC already required ISPs to disclose their traffic-management practices on their websites and to submit information for its annual monitoring reports, but neither effort had any bearing on the WoW complaint. By engaging more in the field of internet measurement, regulators could at least legitimate new mediators for publics to diagnose their issues. Or regulators could be even more involved, creating their own tools or establishing legislative conditions for effective disclosure. Could policy be created to facilitate Freedom of Information Act requests about daemons (similar to calls for an FDA for algorithms)?
Internet measurement tools do not simply provide data about the internet: they provide means for improving and encouraging public participation. Mediators, in this case, provide what Jennifer Gabrys, Helen Pritchard, and Benjamin Barratt call “good enough data”:
[This data] might fall outside of the usual practices of legitimation and validation that characterise scientific data (which also has its own processes for determining if data is good enough). However, it could be just good enough to initiate conversations with environmental regulators, to make claims about polluting processes, or to argue for more resources to be invested in regulatory-standard monitoring infrastructure.
Internet measurement tools and other mediators might not conclusively diagnose a problem (a formidable challenge in a distributed system), but they might indicate enough symptoms to warrant further investigation.
By improving channels of public feedback, a regulator acts as an important check on the public’s legitimacy. Some publics have recently taken a darker turn. Conspiracies, racism, and a deep suspicion of public institutions have affectively charged online publics. The story of the 2013 subreddit about the bombing of the Boston Marathon may mark a key turning point. An attempt to crowdsource the investigation ended with two people being falsely accused and Reddit apologizing for “online witch hunts and dangerous speculation.” What’s more troubling is that Reddit had already banned disclosing personal information on the site in part to prevent false and racialized allegations in the aftermath of the bombing. Community standards were ignored. Moderation could not keep up. The threat of these witch hunts and conspiracies will continue to haunt the legitimacy of publics. Reddit witch hunts are a reminder that publics do not necessarily act in the public interest. The effects of indirect consequences, to recall Dewey’s phrase, vary by person and by class, race, and gender (terms noticeably absent from Dewey’s writing). The freedom to feel upset or frustrated can be a privilege not afforded to all. Regulators and corporations might act as a check on publics, holding their public-mindedness accountable.
The success of Murphy, CGO, and OpenMedia is an important reminder that media advocacy can translate popular concerns into regulatory change. Groups like OpenMedia and CGO might be seen as an example of what Danny Kimball calls “wonkish populism.” They employ a discursive strategy that “entails public participation in arcane administrative procedures, with rhetoric antagonistic to establishment structures, but steeped in policy minutia.” Part of this activism involves finding new means of public participation.
Murphy deserves the most credit for collecting and making sense of all the forum posts. Her work is like a “data story” that composed data and experience into a narrative that could be read by the CRTC. Gabrys, Pritchard, and Barratt suggest these data stories might be another source of “good enough data,” helping publics engage with environmental problems. In the Rogers case, Murphy’s intervention might be seen as one data story generated by the WoW gamers.
As they tell their stories, publics have to question how much to adopt regulatory and corporate discourses about an issue like the internet. As much as checks and balances might address accountability and equity questions with publics, they raise problems about how to define the scope of a public and its means of systematically caring for an issue. Publics risk forming themselves around institutions unwilling or unable to address the issue. Some matters might not translate into demands easily heard by regulators. Publics then have to be aware of that which cannot translate. The WoW gamers succeeded, in part, by turning the international conversations happening on forums into a national issue. This national formation perhaps limited a broader global public concerned with internet optimizations.
Publics finally need to be more aware of themselves. Becoming upset about game performance is a privilege, one not available to those living in rural and remote communities with less reliable internet access. Relying on customer complaints for regulation creates blind spots based on who has the privilege to complain. Big data sets in public life create similar biases. Who has the luxury to generate data? Publics face a test of whether caring for their indirect consequences leads to greater public benefit. Dewey, in some ways, signals this need. An indirect consequence must be systematically cared for. Systematic care, in the case of the WoW gamers, meant drawing the issue out beyond their game performance to the rights of all Canadians to internet service without accidental or undisclosed discrimination. Indeed, calls for communication rights or a right to internet service might be one way for publics, especially ones with the privilege of being prioritized, to demand that their experience be universal.