6. Interrupting Shareveillance: New Cuts
There is no need to fear or hope, but only to look for new weapons.
—Gilles Deleuze, “Postscript on the Societies of Control”
THE SHAREVEILLANT SUBJECT, then, is rendered politically impotent from (at least) two, not necessarily distinct, directions. In the face of state and commercial dataveillance, the subject’s choices (with whom to communicate, what to circulate, what to buy) are compulsorily shared to feed an evolving algorithm that makes advertising, say, or governmentality more efficient, targeted, and precise. The public is configured as rich big data rather than as a network of singularities with resistant potential. Open government portals share the state’s data with subjects and, in doing so, responsibilize and isolate individuals, thus disavowing the legitimacy of collective power. In addition, this form of accountability produces a limited relation with the information provided. In monitoring the granular transactions of government—in the form of U.K. MPs’ expenses, for example, now available online after the scandal of 2009,[1] or the White House’s visitor logs, financial disclosures, and annual reports to Congress, also offered online during the Obama administration[2]—the armchair auditor is only permitted to spot anomalies or aberrations in a system he or she must otherwise acknowledge as fair. This form of sharing, of openness, anticipates a limited form of engagement and response. And, as I have outlined, even this armchair auditor able to engage with “raw” data is largely a fiction produced by the rhetoric of open government; the crucial role that datapreneurs and app developers play in mediating data means that the state’s sharing, and the citizen’s share of the state, are subject to market forces.
I want to reiterate that I am not imagining a once fully agential political subject who has now been supplanted by this shareveillant version, compromised by marketized, securitized, and neoliberal apparatuses such as algorithmic governmentality and open data portals. Political agency has always been limited by structural and relational conditions, as well as by the fluidity, fragmentation, or fracture of psyches and subjectivities. Nevertheless, the particular discursive–material conditions that curtail agency alongside those other inescapable metaphysical limitations matter. For it is from here that we can more fully understand the singularity of the particular distribution we face.
It is one thing, of course, to diagnose a condition and quite another to prescribe a remedy. If one accepts that shareveillance supports a political settlement not conducive to radical equality, and that a more equitable distribution is something to strive for, how might shareveillance be interrupted? I would like to offer one possible strategy, while recognizing that there will be, and need to be, others. The conceptual framework for my interruption is inspired by the etymology of share. From the Old English scearu (“a cutting, shearing, tonsure; a part or division”), the root of share in the sense of “portion,” to the term scear, with respect to plowshare, meaning simply “that which cuts,” cutting clearly resonates within the concept and practice of sharing.
This focus is certainly supported by Rancière’s (2004a, 225) framing of the distribution of the sensible, at least in certain translations:
I understand by this phrase the cutting up [découpage] of the perceptual world that anticipates, through its sensible evidence, the distribution of shares and social parties. . . . And this redistribution itself presupposes a cutting up of what is visible and what is not, of what can be heard and what cannot, of what is noise and what is speech.
What share we have of resources, as well as the mode of sharing, falls along the lines of a particular distribution or cut. The way we share, the conditions and decisions underlying how and what we share, what I am calling the “cut,” create a certain distribution. My focus on the term cut here is, as I’ve confessed, inspired by the etymological roots of share, but I am also mindful of Sarah Kember and Joanna Zylinska’s (2012) productive use of it in Life after New Media. Thinking about mediation as a “complex and hybrid process” that is “all-encompassing and indivisible” (Kember and Zylinska 2012, xv), the authors draw on a range of thinkers, including Henri Bergson, Karen Barad, Jacques Derrida, and Emmanuel Levinas, to imagine cuts (into the temporality of mediation) as creative, ethical incisions and decisions. Thus photography, to take their most potent example, is
understood here as a process of cutting through the flow of mediation on a number of levels: perceptive, material, technical, and conceptual. The recurrent moment of the cut—one we are familiar with not just via photography but also via film making, sculpture, writing, or, indeed, any other technical practice that involves transforming matter—is posited here as both a technique (an ontological entity encapsulating something that is, or something that is taking place) and an ethical imperative (the command: “Cut!”). (xvii–xix)
This leads Kember and Zylinska to ask what it means to “cut well” (xix). It is a question, they argue, that every artist must ask of himself or herself and of his or her practice. This imperative to cut well extends to all acts of mediation (“any other technical practice that involves transforming matter”), including the kinds of practices that mediate data that I engage with in this book. Obviously, not everyone who works with data is an “artist” in the way we would traditionally understand that term. But if we draw on aesthetics in the Rancièrean sense—as a distributive regime that determines political possibilities—then we can begin to see different decisions being made as to how and when to cut into data, and what to reveal or conceal about that decision-making process itself, as ethical or unethical.
When we are cut off from our data (as is the case with closed data), we are not given the opportunity to make our own cuts into it. Equally, if the cut of data is such that we can only engage with the data in ways that support a political settlement we might not agree with—if what might appear as an ethical provision of data (through transparency measures, for example) in fact supports or makes more efficient an unethical system—then our cuts are determined within strict parameters. To cut (and therefore share) differently, to cut against the grain, we have to interrupt the strictures of shareveillance.
Many interruptive cuts deserve to be mentioned. First, by not abiding by the rules of privatized and securitized access and copyright laws, some acts of hacking can highlight the unidirectional sharing of closed data and systems. Bracketing off those hackers who are recruited by intelligence agencies and cybersecurity firms, independent hackers could be characterized as performing guerrilla sharing (of source code, access, software, recovered data), taking a share where none was offered. Second, decentralized data storage facilities like Storj[3] and SAFE[4] offer distributed object storage across the network, utilizing excess hard drive space, as an alternative to traditional cloud storage solutions (what Geert Lovink [2016, 11] calls “centralized data silos”). By employing peer-to-peer technologies to distribute data across nodal points, this new kind of cloud computing shares out encrypted data in a way that interrupts the flow of data upstream to surveillant third parties. Third, following on from the previous example, encryption technologies delimit sharing communities by requiring keys for decryption; sharing is, therefore, more controlled. Far from being an unconscious default, sharing becomes a purposeful and targeted action. Take, for example, the “off-the-record” encryption used by the free and open source instant messaging application Adium, favored by journalists and nongovernmental organization workers. According to the Off-the-Record Development Team, this technology offers not only encryption of instant messaging and authentication of your correspondent’s identity but also deniability, because the messages one sends do not carry digital signatures that can be checked by a third party, and what they call “forward secrecy,” which means that if the user loses control of his or her private keys, no previous conversation is compromised.[5]
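To make the second of these cuts concrete: a client encrypts data locally, splits the ciphertext into shards, and hands each shard to a different peer, so that no node upstream ever holds readable content. The following is a minimal, purely illustrative Python sketch of that “encrypt, shard, distribute” pattern; the XOR pad, the naive slicing, and the function names are stand-ins of my own, and systems like Storj use audited ciphers and erasure coding rather than anything resembling this toy.

```python
import hashlib
import os

def encrypt_and_shard(data: bytes, n_shards: int = 4):
    """Toy sketch of client-side 'encrypt, shard, distribute' storage.

    Illustration only: a one-time XOR pad stands in for a real cipher,
    and naive slicing stands in for erasure coding.
    """
    key = os.urandom(len(data))  # the pad never leaves the data's owner
    ciphertext = bytes(a ^ b for a, b in zip(data, key))
    size = -(-len(ciphertext) // n_shards)  # ceiling division
    shards = [ciphertext[i * size:(i + 1) * size] for i in range(n_shards)]
    # Each shard would be addressed by its hash and sent to a different
    # peer: no single node holds readable data, and none holds the key.
    return key, [(hashlib.sha256(s).hexdigest(), s) for s in shards]

def reassemble(key: bytes, shards) -> bytes:
    """Rejoin the shards and strip the pad; only the owner can do this."""
    ciphertext = b"".join(s for _, s in shards)
    return bytes(a ^ b for a, b in zip(ciphertext, key))

if __name__ == "__main__":
    key, shards = encrypt_and_shard(b"meet at noon")
    assert reassemble(key, shards) == b"meet at noon"
```

What matters politically is where the cut falls: encryption and reassembly happen only at the owner’s edge of the network, so sharing out shards across peers never amounts to sharing content with them.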
Some of the most creative cuts into shareveillance can be encapsulated by the term data obfuscation. In their book Obfuscation: A User’s Guide for Privacy and Protest, Finn Brunton and Helen Nissenbaum (2015, 1) identify a number of different obfuscation strategies that demonstrate a “deliberate addition of ambiguous, confusing, or misleading information to interfere with surveillance and data collection.” Brunton and Nissenbaum consider, among other technologies, the onion router (Tor), which allows for online anonymity through the combined tactics of encrypting communication and relaying it via several nodes on the Internet to obscure the source and destination; TrackMeNot, a browser extension that floods search engines with random search terms to render profiling algorithms ineffective; and the privacy plug-in FaceCloak, which encrypts genuine information offered to Facebook so that it can only be viewed by other friends who also use FaceCloak.
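The logic of a tool like TrackMeNot can be stated in a few lines: issue plausible machine-generated queries at irregular intervals so that any profile assembled from the search log is mostly noise. What follows is a deliberately crude sketch of that principle rather than TrackMeNot’s actual code (the extension seeds and evolves its query terms from RSS feeds); the endpoint and word list here are placeholders of my own.

```python
import random
import time
import urllib.parse
import urllib.request

# Placeholder vocabulary; TrackMeNot itself draws and evolves query
# terms from RSS feeds to keep its decoys plausible.
DECOY_TERMS = ["weather", "recipes", "train times", "film reviews",
               "gardening tips", "exchange rates", "chess openings"]

def issue_decoy_query() -> None:
    """Send one random decoy search, burying genuine queries in noise
    that profiling algorithms cannot cleanly separate out."""
    term = " ".join(random.sample(DECOY_TERMS, k=2))
    url = ("https://search.example/q?term="  # placeholder endpoint
           + urllib.parse.quote(term))
    try:
        urllib.request.urlopen(url, timeout=5)
    except OSError:
        pass  # a failed decoy is harmless; only the volume of noise matters

def run(decoys_per_hour: int = 20) -> None:
    """Fire decoys with exponentially distributed gaps, roughly
    mimicking the irregular timing of a human user."""
    while True:
        issue_decoy_query()
        time.sleep(random.expovariate(decoys_per_hour / 3600))
```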
We could also add to their examples tracking blockers like Ghostery, which intervene in consumer dataveillance by alerting users to, and in some cases disabling, cookies, tags, and beacons, and search engines like DuckDuckGo and StartPage, which allow for online searching without the tracking and profiling that accompany services like Google Search, whose business model relies on the accumulation of consumer and user profiles and browsing habits. While driven by concerns over privacy and corporate confidentiality, the Blackphone developed by Silent Circle offers mobile users a mode of communication built on a concept other than the form of sharing figured by shareveillance. Aptly, one of the professed unique selling propositions of the phone is that it is “built on a fundamentally different protocol.”[6] The online promotional video[7] consists of a series of interviews with mobile users (or actors posing as mobile users) who are asked to read out the terms and conditions of use of the apps on their mobile phones. One woman stumbles on the fact that she has agreed to an app being able to change her call log. A man realizes he has given an app permission to record audio at any time without his confirmation. A woman is incredulous that an app can “modify calendar events and send e-mails without [her] knowledge.” Yet another mobile user looks concerned that an app can read his text messages and modify his contacts. To back away from what Google’s Eric Schmidt called “the creepy line” and prevent leaky data, Blackphone uses its own operating system, offers compartmentalization between work and social life in a way that goes against the grain of Facebook’s integrated philosophy and “real-name” policy, and preloads apps to “put you in control of what you share.”[8] Crucially, the Blackphone, like the other technologies described earlier, interrupts and asks us to question default modes of digital sharing.
Owen Campbell-Moore offers a playful cut into shareveillance in the form of a Chrome extension he devised during his time at Oxford University. Using a technique known as JPEG steganography, Secretbook enables users to hide messages in photos on Facebook by making many visually imperceptible changes to encode the secret data.[9] Steganography tools are traditionally complex; Campbell-Moore simplifies and thereby democratizes them. Here the act of sharing a photograph on and “with” Facebook belies another, more targeted sharing: one that requires any receiver to have a password. Messages are thus hidden not only from other Facebook users but also from Facebook’s scanning algorithms and profile-building analytics. Whereas cryptography can flag up encoded communications to surveillant entities, steganography (within a platform like Facebook, which has to deal with more than 350 million photos being uploaded every day) has more chance of slipping secret messages through unnoticed.[10]
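The principle is easy to see in simplified form. Secretbook itself embeds its payload in JPEG coefficients so that the hidden bits survive Facebook’s recompression, and protects them with a password; the sketch below shows only the cruder least-significant-bit variant on raw pixels, to convey how “many visually imperceptible changes” can carry a message. It assumes the Pillow imaging library and, unlike Secretbook’s approach, would not survive recompression.

```python
from PIL import Image  # Pillow; assumed available

def embed(cover_path: str, message: str, out_path: str) -> None:
    """Hide `message` in the least significant bits of an image's pixels."""
    bits = "".join(f"{b:08b}" for b in message.encode()) + "0" * 8  # NUL end
    img = Image.open(cover_path).convert("RGB")
    flat = [channel for pixel in img.getdata() for channel in pixel]
    assert len(bits) <= len(flat), "message too long for this cover image"
    for i, bit in enumerate(bits):
        flat[i] = (flat[i] & ~1) | int(bit)  # a +/-1 change: imperceptible
    img.putdata([tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)])
    img.save(out_path, "PNG")  # lossless format, so the low bits survive

def extract(stego_path: str) -> str:
    """Read bytes back out of the low bits until the NUL terminator."""
    flat = [c for p in Image.open(stego_path).convert("RGB").getdata() for c in p]
    out = bytearray()
    for i in range(0, len(flat) - 7, 8):
        byte = int("".join(str(flat[i + j] & 1) for j in range(8)), 2)
        if byte == 0:
            break
        out.append(byte)
    return out.decode()
```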
As a particularly decisive cut that utilizes obfuscation to show the perils of sharing qua open data, I will briefly outline a project published in 2016 by artist Paolo Cirio called Obscurity.[11] In the United States, the publication of police photographs, or “mug shots,” of arrestees is legal under Freedom of Information and transparency laws in most states. Websites scrape mug shots that have been published elsewhere, mostly on sites belonging to law enforcement entities, and republish the photographs, requesting money from the arrestee to remove the picture and details. In Obscurity, Cirio and his collaborators have developed a program to clone and scramble the data available on mug shot industry websites, such as MugShots.com, JustMugShots.com, and MugShotsOnline.com. Using domain names almost identical to those of the original sites, Cirio’s clones show hazy faces that are impossible to identify and names that have been changed. Although Cirio is most concerned with the right to be forgotten, as the issue has come to be referred to in the EU after the landmark case in 2014 that ensured search engines like Google are subject to the existing EU data protection directive, we can also read this project as one that exposes the risks (of abuse and exploitation) inherent to “sharing” and the limits and failures of some open data or transparency initiatives. In addition, with the concerns of the current book in mind, the mug shot industry can be thought of as aping, cynically and darkly, the work undertaken by datapreneurs to transform open data into profitable forms. After all, the websites Cirio is protesting against do indeed take an entrepreneurial, creative approach to repurposing open data.
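Cirio has not, to my knowledge, published the program itself, so any reconstruction is a guess at the principle rather than at his method; but the core gesture of the clone sites, keeping the shape of a record while destroying its power to identify, can be suggested in a few lines. The name scrambling and blur radius below are illustrative choices of my own, and the face hazing assumes the Pillow imaging library.

```python
import random
from PIL import Image, ImageFilter  # Pillow; assumed available

def obscure_name(name: str) -> str:
    """Shuffle the interior letters of each name part: the record still
    reads as a name but no longer points to a person."""
    parts = []
    for part in name.split():
        interior = list(part[1:])
        random.shuffle(interior)
        parts.append(part[0] + "".join(interior))
    return " ".join(parts)

def obscure_face(path_in: str, path_out: str) -> None:
    """Blur a mug shot well past the point of recognizability."""
    img = Image.open(path_in)
    img.filter(ImageFilter.GaussianBlur(radius=12)).save(path_out)
```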
By cutting into shareveillance, Cirio demands that incarceration be seen not as a decontextualized, individualized problem but as a collective, social issue for which we all have responsibility. He writes (Cirio, n.d.), “Obscurity proposes a democratic judicial system that would help to understand crime as a community-related issue, bringing attention to the victims of mass incarceration in the US and the unscrupulous criminal justice system and law enforcement agencies that created this situation.” The project exposes the unethical cut of shareveillance with respect to a particular sociopolitical issue: how, in this case, mug shot websites share data in such a way that presents incarceration as an asocial issue, while in the process performing a second tier of punishment (shaming and extortion) on top of any lawfully imposed penalties. The project asks us to see incarceration in terms of the political economy as well as the stratified and stratifying nature of the carceral state. It cuts into this particular distribution to share anew. Creative interruptions of shareveillance can make ethical cuts and, in the process, show up the incisions that have constructed the neoliberal securitized settlement of which shareveillant subjectivity is a part.
As well as the digital and aesthetic experiments with obfuscation outlined earlier, cutting into or interrupting shareveillance might include
- imagining forms of transparency that do not simply make already inequitable systems more efficient
- not using the morally inflected language of sharing when it comes to personal data (see Prainsack 2015)—it’s not always “good” to share (despite what we tell children)
- acknowledging the unconditional secret and insisting on a right to opacity rather than privacy
To help with the last of these, I turn to the thought of Jacques Derrida and Édouard Glissant.
Derrida (2001) professed a “taste for the secret.” Rather than the common, contextual secret that hides somewhere waiting to be revealed (the secret that is, in principle at least, knowable), the secret of Derrida’s (1992, 201) formulation is the unconditional secret: “an experience that does not make itself available to information.” It is unknowable not because it is enigmatic but because knowledge, an event, a person, or a thing is never fully present. That is, in any communication, any expression of knowledge, something is always “held back.” What is “held back” is in no way held in a reserve, waiting to be discovered. Rather, there is a singular excess that cannot fully come forth. In this sense, there will always be something secret. It is an undepletable excess that defies not only the surface/depth model and its promise that truth can be revealed but also the attendant metaphysics of presence. Eschewing the hermeneutic drive and circumventing attempts to anticipate revelation, the unconditional secret within a text should be thought of as an encounter with the Other through which a responsibility of reading is made possible (and, it is important to note, if we are to take proper account of Derrida’s aporia, impossible).
The secret here is best understood within the realm of ethics. Extending this ethical concern, the role of the secret in democracy leads Derrida (2001, 59) to defend the secret qua singularity, seeing it as an alternative to “the demand that everything be paraded in the public square.” “If a right to the secret is not maintained,” he writes, “we are in a totalitarian space” (59). This is also the logic underpinning Byung-Chul Han’s (2015) recent book The Transparency Society, in which he claims, “Transparency is an ideology. Like all ideologies, it has a positive core that has been mystified and made absolute. The danger of transparency lies in such ideologization. If totalized, it yields terror” (viii). For Derrida (1996, 81), real democracy has to allow for secrets/singularities. If democracy is to be meaningful, it must play host to multiple singularities, including those who do not wish to respond, participate in, or belong to the public sphere. More than this, democracy is nothing but the play between openness and secrecy, between belonging and nonbelonging, sharing and not sharing. In taking account of singularities in this way, democracy, for Derrida, is always “to come.” It is an impossible project: true democracy would create belonging among people who will never belong.
In light of such a formulation, we should be concerned for those who do not want to adhere to the dominant doctrines of digital democracy, including the ascendant principles of transparency, veillance, and sharing. The subject of democracy is not simply one who is asked to be transparent to the state and act on transparency. He or she is also, in the guise of Derrida’s non-self-present subject, one who is constituted by a singularity, an unconditional secret, that prevents full capitulation to the demands of transparency and sharing.
In a very different context from the one I am engaged with, but drawing on Derrida’s thought, Glissant coined the term “right to opacity.” Glissant was writing about an ontological condition of minoritarian subjectivity and r(el)ationality that resists the demand to be knowable, understood, and rendered transparent by the dominant, Western, filial-based order (Glissant 2011): readable within the racialized terms already set by the dominant group. This means not settling for an idea of “difference” as the basis of an ethical relation to the Other but pushing further toward recognition of an irreducible opacity or singularity (Glissant 1997, 190). For Glissant, opacity is the “foundation of Relation and confluence” (190). What is important for the current study is the way in which the ethical subject is more aligned with secrecy than transparency in Glissant’s writing. Such a configuration offers us an alternative to the idea of the “good” shareveillant subject of neoliberalism.
While respecting the origins of these concepts in philosophical work on democracy and literature with respect to Derrida, and race with respect to Glissant, they can provide inspiration for thinking through the concerns of this book. Derrida’s unconditional secret highlights the unethical violation (against singularity) at the heart of shareveillant practices, while a right to opacity in the context I am concerned with would mean the demand not to be reduced to, or understood as, data, and not to consume and share data in ways defined by state or consumer shareveillance. Rather than with acts of publicity, such as legal marches or online petitions, I want to argue that we need to meet the pervasive protocols of inequitable dataveillance employed by the securitized state and the logic of shareveillance with forms of illegibility: a reimagined opacity. Such reformulations of the politics of the secret and opacity enable us to begin to rethink the role of sharing in a data ecology that demands visible, surveillable, quantifiable, and entrepreneurial subjects.
The identity of the shareveilled data object–neoliberal data subject cum data set is not one that is allowed to interact with data in the creation or exploration of radical collective politics. A right to opacity could be mobilized to refuse the shareveillant distribution of the digital sensible. It might offer an opportunity to formulate a politics based not on privacy but on opacity; rather than a permanent and wholesale rejection of, or retreat from, the idea and practice of sharing (data, for our concerns), opacity used in this context would only ever be desirable if it allowed space to develop community-forming openness based on the principle of “commons” rather than its shareveillant manifestation. The commons is a multifaceted term that “can be seen as an intellectual framework and political philosophy; . . . a set of social attitudes and commitments; . . . an experiential way of being and even a spiritual disposition; . . . an overarching worldview” (Bollier and Helfrich 2012). But in all of these framings of commons and commoning, questions of what is shared and how come to the fore (de Angelis 2007, 244) in a way that places into question the logic of the shareveillant settlement. In other words, it is not only the enclosure, commodification, and privatization of previously communal or community-owned natural resources, land, property, goods, services, or relations that the concept of commons challenges but also the particular shape, place, and role sharing is given in any society. As an alternative to the shareveillant subject, then, we could propose a “common subject.” As Silvia Federici (2012) emphasizes,
if commoning has any meaning, it must be the production of ourselves as a common subject. This is how we must understand the slogan “no commons without community.” But “community” has to be intended not as a gated reality, a grouping of people joined by exclusive interests separating them from others, as with communities formed on the basis of religion or ethnicity, but rather as a quality of relations.
The common subject and the opaque subject are not in opposition: both, rather, interrupt the (at least in the contemporary conjuncture) pervasive interpellation of shareveillance. They offer other vantage points from which to make cuts into the distribution of the sensible and salvage a concept of sharing. For Geert Lovink (2016, 12), this is one of a series of revisions that needs to take place if we are to achieve a “cooperative renaissance of the Internet” revolving around “organized networks” that can allow us to think outside of “the ‘like’ economy and its weak links. Mutual aid outside the recommendation industry. Sharing outside of Airbnb and Uber.”