When we began this project, our intention was to find clear examples illustrating why one should care about surveillance and its ability to cause harm. In our research we quickly discovered that there are many ways to understand the role surveillance plays, and has historically played, throughout the world, and many lenses through which we can view and begin to unpack its primary modes.
Our main goal with this project is to address the many implications of the more prominent form that surveillance takes today: the capture model. We will treat the capture model as the point at which all of our concerns as researchers are rooted. We will also cover the history of the word surveillance and its place in the world before the capture model came into existence. The audience of this project will be those who participate in networked technologies. We would especially like to be in conversation with people like ourselves who sometimes take on what we will call a “so what” attitude. While this attitude is commonplace and symptomatic of the opaqueness with which surveillance operates, we think it is worth pushing back against in order to raise big questions and generate greater awareness.
While our intention is to move past the privilege of saying “so what,” we understand that this is no easy task. We are still thinking about how to take abstract concepts and distill them into digestible information; this could be a first step toward having more meaningful and critical conversations around surveillance. We also want to move past the notion of Big Brother and dig deeper into the mechanics and consequences, good or bad (and in many cases both), of surveillance models and tools.
This is where, for us, the bigger questions start. Early in our research into the more tangible case studies of harmful surveillance, we found people being affected by things like technical glitches: errors that caused them to be denied access to services that treat a database record as verbatim truth. But then there is the obvious question of distinguishing the possibility of a “technical glitch” from the possibility of a “human glitch.” Does this speak to surveillance, to capture, or to the opaqueness of algorithmic decision making?
From here we decided that we would research surveillance with a three-layer approach. We are sticking to an analog visualization of the research for now. Before moving to a digital view, we want to ensure that we have the correct framework and have made meaningful connections, so that the design of the reader reflects its content. We also hope that the reader can become a web platform which each of us can easily expand based on future research and feedback.
The first layer is dedicated to defining and contextualizing surveillance. It includes the word’s etymology, its history, and its two primary forms: optics and capture.
The second layer attempts to uncover some of the primary modes of surveillance. So far, we are interested in digging deeper into big data, security, autonomy, prediction markets, efficiency, and privacy. We see each of these as intersecting with the others, but also as a distinct lens through which we can begin to examine the mechanics by which surveillance operates.
Big data is perhaps the most significant symptom of the capture model. Through the constant capture of data created by internet-connected devices, the world has been met with a kind of database that demands a new kind of system for analysis. While big data can be used for good in many arenas, it is often used in irresponsible ways that lead analysts to false or overstated results. Big data and machine learning are often considered to go hand in hand; machine learning needs big data. In an essay titled “Physiognomy’s New Clothes,” Blaise Agüera y Arcas, Margaret Mitchell, and Alexander Todorov write: “In an era of pervasive cameras and big data, machine-learned physiognomy can also be applied at unprecedented scale. Given society’s increasing reliance on machine learning for the automation of routine cognitive tasks, it is urgent that developers, critics, and users of artificial intelligence understand both the limits of the technology and the history of physiognomy, a set of practices and beliefs now being dressed in modern clothes.”
In this era we can also safely say that in addition to the capture of data external to humans, there is the capture of what is internal to humans. Large social networking sites like Facebook, regardless of their intentions, have this kind of internal behavior at their fingertips. In an interview with the Harvard Gazette, Shoshana Zuboff notes, “What is abrogated here is our right to the future tense, which is the essence of free will, the idea that I can project myself into the future and thus make it a meaningful aspect of my present. This is the essence of autonomy and human agency. Surveillance capitalism’s “means of behavioral modification” at scale erodes democracy from within because, without autonomy in action and in thought, we have little capacity for the moral judgment and critical thinking necessary for a democratic society.”
The notion of security in regard to networks and the digital is interesting in that there are many levels to what can be considered secure, and even then there is the question of whether anything, once it is connected to the internet, can be 100% secure. This week, Facebook announced that it wants to create a privacy-focused social network, a great departure from what Facebook is and has been since its inception in 2004. It is one thing to know that your data is secure and encrypted; it is another to know in whose hands it actually lives at the end of the day. This raises questions of safety and security: who maintains these ideals, and who is at risk when they are broken or mishandled? Jasbir Puar, in a Cosmologics Magazine interview, discusses surveillance as being more than the act of seeing or screening: “Regimes of security also entail corralling greater numbers of populations into a collective project of surveillance. We have seen, and continue to see, many examples of this post September 11th. The If You See Something, Say Something campaign on NYC public transit interpellates the general public into service of the “greater good”; the NSEERS list impelled pre-emptive repatriation (and sometimes migration to a country of origin that one had never been to) to South Asia and the Middle East; the Turban Is Not a Hat campaign sought to educate Americans about the differences between Muslims and Sikhs by regulating the distinctions between headwear, turbans, headscarves.”
Most of what internet-connected users do is captured and then used to build proprietary products and systems that can, in the end, be worth billions of dollars. These highly profitable products are then marketed back to us as free, as systems into which we have absolutely zero insight. Prediction markets are now the driving force behind surveillance capitalism. Companies like Google collect more behavioral data than they ever need to serve their customers with convenience and tailored content; the surplus is what is exchanged in the new markets of behavioral data.
All of the tools of surveillance and data capture create extreme efficiency. Whether it is efficiency in the workplace or having your laundry detergent shipped as soon as it runs low, the capture of data from interconnected devices and systems makes sure to deliver. Efficiency also means convenience, and it might be worthwhile to ask: at what cost does one get to experience such conveniences?
Security and privacy are two different things that constantly overlap and exchange meaning. It is interesting to ask: what is privacy to you? If privacy requires security, what does security need to look like in 2019? Answers might differ greatly depending on whom you ask. Nonetheless, ownership of one’s privacy should be considered a human right. Bruce Schneier, a security expert and fellow at the Berkman Klein Center, argues that privacy is a means to human progress: “A few years ago we approved gay marriage in all 50 states. That went from ‘It’ll never happen’ to inevitable, with almost no intervening middle ground.” Things considered immoral at one time need the space privacy provides so that they can exist within and against the reigning “morals” of the powers that be. Being private does not necessarily mean you have things to hide, but that you have something which, if it came into the hands of a government or corporation, could be wrongfully used against you in any number of hypothetical situations.
Finally, the third layer of the critical surveillance reader will be a listing of tools and techniques, many of which we covered in class, with the intention of demystifying some of these technologies in a tangible and digestible way.
The reader is very much a work in progress, and we consider our research thus far to be only scratching the surface of something much greater. We will never be able to fully unpack and analyze surveillance in its current state, but we hope at least to learn, to demystify, and to educate ourselves and our peers so that we, as technologists, can begin to ask the necessary and critical questions required by the things we are using or making.