Surveillance and Data Aggregation: The ‘Nothing to Hide’ Fallacy, Manipulation, and Human Reduction

1. Introduction

After the September 2001 terrorist attacks, the Bush administration and Congress enacted the Patriot Act, which expanded surveillance in the form of wiretapping and data aggregation in order to combat terrorism. The act allowed government agencies like the NSA to operate largely unchecked for years, until Edward Snowden blew the whistle in 2013 on just how much data the government was accumulating. The Snowden scandal, which involved the leak of classified NSA documents, raised both awareness of and concern about the government’s mass surveillance. However, many individuals remain unaware of this intrusion into their personal privacy and consequently approach the issue with a cavalier attitude, relying on the “I’ve got nothing to hide” argument. In this report, I examine the privacy issue in its many forms, how it affects individuals, and the consequences of the “nothing to hide” argument, and I consider possible solutions at both the national and the personal level.

2. Literature Review

2.1 The Problematic Approach to Privacy

In his law review article ‘I’ve Got Nothing to Hide’ and Other Misunderstandings of Privacy, Daniel Solove (2007) examines the prevalent attitude people have regarding privacy – “I’ve got nothing to hide” – and, to counter it, develops a pluralistic conception of privacy through his taxonomy of privacy. Viewed as a pyramid, surveillance resides at the top with data aggregation directly below it; together they chill civil liberties by interfering with one’s ability to make decisions (i.e., decisional interference). Along the same lines as Solove’s taxonomy, danah boyd (2013) explains in her blog post “Where ‘Nothing to Hide’ Fails as Logic,” drawing on her own experiences as well as those of others, how easy it is to create suspicion in this emerging surveillance state, which has put at risk essential civil liberties such as the presumption of innocence. She therefore argues against the “nothing to hide” argument: to be so shortsighted about one’s own privacy ultimately diminishes the safety of others, who may become subject to discrimination. boyd’s article parallels Solove (2007), who argues, “By saying ‘I have nothing to hide,’ you are saying that it’s OK for the government to infringe on the rights of potentially millions of your fellow Americans, possibly ruining their lives in the process… [T]he “I have nothing to hide” argument basically equates to ‘I don’t care what happens, so long as it doesn’t happen to me’” (p. 751).


2.2 Surveillance Produces Suspicion

The suspicion that comes so readily in the modern surveillance state, which danah boyd (2013) mentions, is explored more fully in the documentary Terms and Conditions May Apply by director Cullen Hoback (2013). Supported by many examples, the film argues that when privacy is forgone, assumptions become unavoidable. For instance, an Irish man tweeted that he and his friend were going to “destroy America,” meaning party, but airport security read it as a threat. Similarly, a young boy’s Facebook post was misinterpreted as a threat against the president. Other examples include: a writer for Cold Case suspected of murdering his wife; a SWAT team responding to a man’s joke posted on Facebook; a zombie-themed wedding shut down without warrant or explanation; and civilians arrested for merely contemplating protest during the royal wedding. The documentary ultimately shows that such assumptions lead to the suspension of constitutional rights, falling under the subcategory of “distortion” in Solove’s (2007) taxonomy, as well as the category of “invasion,” under which decisional interference falls (p. 758).

2.3 Humans Reduced to Data

Profiling individuals, oftentimes incorrectly as the examples above show, can be traced back to Michel Foucault’s (1995) description of the panopticon in Discipline and Punish. The panopticon functioned as an “instrument of permanent, exhaustive, omnipresent surveillance, capable of making all visible, as long as it could itself remain invisible” (Foucault, 1995, p. 214). This constant surveillance does not simply split society in two according to some universal standard of normality; rather, it distributes individuals along a curve, creating a hierarchy of individuals “in relation to one another, and if necessary, disqualify and invalidate” them (Foucault, 1995, p. 223). As the panopticon creates individual profiles, it also conducts experiments catered to each specific profile in order to alter behavior. In sum, this kind of profiling and manipulation reduces humans to data used to inform, and thereby strengthen, the governing body – a dynamic evident today, where the controlling apparatus is the Internet and users are mere data.

2.4 Humanity’s Static Condition

Eli Pariser (2011) addresses this panoptic reduction and manipulation in The Filter Bubble, when he writes, “Behavior is now a commodity, a tiny piece of a market that provides a platform for the personalization of the whole Internet” (p. 60). Individuals are becoming increasingly reified. Once reduced to data, individuals are then behaviorally altered by the Internet through personalization filters, which interpellate individuals (a concept introduced by Louis Althusser that means to “hail” an individual into the major social and political institutions) into a certain ideology: their own. In Pariser’s (2011) own words, “personalization filters serve up a kind of invisible auto-propaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown” (p. 27). By removing serendipity, personalization ensures that individuals are never exposed to ideas outside their own bubbles, so they never grow.

Ultimately, personalization is a form of “informational determinism,” an idea reminiscent of Daniel Solove’s (2007) “decisional interference.” Essentially, this technodeterminism dictates an individual’s future based on their past, constraining their ability to make decisions. Naturally, there is a balance between what one wants in the present and what one should do for the future, but personalization shifts this paradigm because it only understands the present self that interacts with the Internet (Pariser, 2011, p. 135). In the end, identity remains static because it is difficult to “both systematize and to appeal to the fullness of human life” (Pariser, 2011, p. 194). This difficulty leads to assumptions about people’s behavior that fail to encompass their entire being. Not only is serendipity lost; with the kind of attribution personalization relies on, the nuances that make up an individual are gone as well.
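The self-reinforcing loop described above can be illustrated with a small simulation. The sketch below is not any real platform’s algorithm; the topics, the click model, and the proportional weighting are assumptions made purely to show how a feed that recommends whatever a user has already clicked tends to amplify early preferences.

```python
import random
from collections import Counter

# Toy model of a personalization feedback loop: the feed shows topics in
# proportion to past clicks, and the user engages with whatever is shown.
# Topics and weights are illustrative assumptions, not a real recommender.

TOPICS = ["politics", "sports", "science", "cooking"]

def simulate_feed(rounds=200, seed=1):
    rng = random.Random(seed)
    clicks = Counter({topic: 1 for topic in TOPICS})  # start unbiased
    for _ in range(rounds):
        # Personalization: show a topic weighted by how often it was clicked.
        shown = rng.choices(TOPICS, weights=[clicks[t] for t in TOPICS])[0]
        clicks[shown] += 1  # engagement feeds back into future weighting
    return clicks

if __name__ == "__main__":
    print(simulate_feed().most_common())
```

Because every click raises the odds of seeing the same topic again (a rich-get-richer dynamic), the final counts are typically far from uniform even though the simulated user starts with no preference at all.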

3. Problem

As detailed above, surveillance and data aggregation have larger effects on an individual’s psyche than people realize, which is why the “nothing to hide” response is not a valid argument in today’s society. The issue of privacy transcends whether someone has something to hide. As previously mentioned, people are interpellated by their own selves: they become a never-ending loop of themselves, with no exposure to anything new or challenging, confined to a static realm in which nothing changes. Furthermore, privacy is now a matter of control and manipulation in which the government holds all the power. This asymmetrical power relationship enables the government to propagate its ideology and, in so doing, interpellate individuals into what it deems acceptable. In other words, individuals conform and self-censor to avoid government suspicion.

Glenn Greenwald (2016) makes a similar argument in his article “New Study Shows Mass Surveillance Breeds Meekness, Fear and Self-Censorship,” published in the online magazine The Intercept. Greenwald (2016) cites a study by Jon Penney which found that, after the Snowden scandal in 2013, there was “a 20 percent decline in page views on Wikipedia articles related to terrorism.” Penney remarked of this data that “If people are spooked or deterred from learning about important policy matters like terrorism and national security, this is a real threat to proper democratic debate” (Greenwald, 2016). Moreover, scholars and librarians in Europe fear, with good reason, that the research material and books they possess may connect them to terrorism, as some have already been accused of being terrorist supporters. Yet despite the prevalence of these issues and the available evidence, people remain unaware of the chilling effects of surveillance and data aggregation. Ignorance, in the end, is the main obstacle regarding privacy, so I ask: how do we raise awareness in a way that makes people care?

In addition to self-censorship and conformity, humans are ultimately reduced to data, and the data is all that really matters to the government agencies and corporations that profit from data aggregation and mining. As this objectification continues to grow, the Technological Singularity mentioned by Pariser (2011) may very well occur. With the Internet as today’s panopticon – the possessor of all knowledge and power – humanity is, as Pariser (2011) argues, approaching the point of merging with computers, or worse, being surpassed by them in the hierarchy of intelligence. Clearly, the ultimate dilemma of the filter bubble is its disregard for idiosyncrasies and its suppression of new material, both of which inhibit individual growth. The only way to overcome this imbalance is to enact change at both the individual level and the corporate/political level. We need a technological revolution, but are we too locked in to our current lifestyle to change? Does the convenience of the Internet win in the end?

4. Solutions

4.1 Striking the Balance between Too Little and Too Much Content

Though Pariser (2011) concedes that privacy issues, so inherent in society, will be difficult to overcome, he does propose some solutions. Rather than ghettos, in which an individual is confined to their own subculture, or a heterogeneous city, where individuals may become alienated by the overwhelming presence of so many differences, Pariser (2011) suggests a “mosaic of subcultures” (p. 246). He calls for communities that support individual idiosyncrasies, but he also believes people should be introduced to “lots of ways of living in order to choose the best life for [one]self” (p. 246). In other words, Pariser wants to strike a balance between relevance and serendipity. To create this balance, “collaborative filtering” would retain personalization but also give people control over how information is personalized. Another solution Pariser (2011) proposes is “falsifiability” (p. 258): by continually doubting one’s online identity, falsifiability introduces new content so as to flesh out a person’s character. Lastly, Pariser (2011) mentions the “Fair Information Practice,” which grants individuals more control over how their personal data are used and which falls under the category of information processing on Solove’s taxonomy (p. 263). These broader solutions attempt to compromise between comfort and protection.
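Pariser’s relevance-versus-serendipity balance can be sketched as a tiny recommender that reserves a fixed share of each recommendation list for items outside the user’s profile. The catalog, the keyword-overlap scoring, and the serendipity share below are all assumptions for illustration, not a description of any real system.

```python
import random

# Toy sketch of balancing relevance with serendipity in a recommendation list.
# Relevance is naive tag overlap; the serendipity share reserves slots for
# items sharing nothing with the user's profile. All names are hypothetical.

catalog = {
    "Tax policy explainer": ["politics", "economics"],
    "Campaign finance deep dive": ["politics"],
    "Sourdough starter guide": ["cooking"],
    "Intro to birdwatching": ["nature"],
    "Market outlook": ["economics"],
}

def relevance(tags, user_tags):
    """Score an item by how many tags it shares with the user's profile."""
    return len(set(tags) & set(user_tags))

def recommend(items, user_tags, k=4, serendipity=0.25, rng=random):
    """Return k items: mostly relevant picks, plus a reserved share of items
    entirely outside the user's profile, chosen at random."""
    n_outside = int(k * serendipity)
    ranked = sorted(items, key=lambda name: relevance(items[name], user_tags),
                    reverse=True)
    picks = ranked[: k - n_outside]
    outside = [name for name in items
               if relevance(items[name], user_tags) == 0 and name not in picks]
    picks += rng.sample(outside, min(n_outside, len(outside)))
    return picks

if __name__ == "__main__":
    print(recommend(catalog, ["politics", "economics"]))
```

Raising the serendipity parameter widens exposure at the cost of relevance, while setting it to zero reproduces a pure filter bubble.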

4.2 Raising Awareness in Computer-Based Education

In her article “The Invisible Digital Identity: Assemblages in Digital Networks,” Estee Beck (2015) proposes integrating lessons on privacy and surveillance, especially into computer-based classes. While providing a heuristic that teachers may use, Beck urges educators to ask not only their classes but also themselves how to address the issue of digital identities and online surveillance. Furthermore, she suggests teachers have their students explore websites like the BlueKai Registry so they can see who is tracking them and how much of their personal information is on the Internet. Unless individuals see for themselves the extent to which they are being watched, they may never believe that surveillance and data aggregation are of any concern or that they have the ability to shape the way they think.

4.3 Strategies for the Individual

Hopefully, as awareness increases, so will adoption of the tools and software already available. Individuals need to know that there are free and simple ways to obfuscate, defined as the “deliberate addition of ambiguous, confusing, or misleading information to interfere with surveillance and data collection” (Brunton & Nissenbaum, 2015, p. 1). For instance, browser plug-ins like Ghostery let users decide which companies, if any, can track them, shifting the balance of power (“What are Cookies & Trackers?”, 2016). There are also services like TrackMeNot and AdNauseam, which flood trackers with decoy activity. Switching to DuckDuckGo as an alternative search engine is also worth considering, as it lessens personalized filtration. Finally, individuals should check the privacy settings on their social media accounts, since the default settings more often than not leave an account more open than intended.
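The obfuscation idea behind a tool like TrackMeNot can be sketched in a few lines: hide a real search among machine-generated decoys so that logged queries no longer reveal intent. This is a minimal illustration of the principle only; the decoy list and mixing ratio are invented here and bear no relation to TrackMeNot’s actual implementation.

```python
import random

# Minimal sketch of query obfuscation: mix a real search query with decoys so
# an observer logging the stream cannot tell which query reflects intent.
# The decoy topics below are arbitrary placeholders, not TrackMeNot's corpus.

DECOYS = [
    "weather forecast", "banana bread recipe", "local hiking trails",
    "used bicycles for sale", "history of jazz", "houseplant care tips",
]

def obfuscate(real_query, n_decoys=3, rng=random):
    """Return the real query hidden among n_decoys decoys, in random order."""
    stream = [real_query] + rng.sample(DECOYS, n_decoys)
    rng.shuffle(stream)
    return stream

if __name__ == "__main__":
    for query in obfuscate("privacy law"):
        print(query)
```

Every emitted list still contains the real query, but its position and its neighbors change on each call, which is exactly the ambiguity Brunton and Nissenbaum describe.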

5. Conclusion

Despite Snowden’s revelations in 2013, Beck (2015) observes that “web-tracking technologies are not going to disappear, and with rapid developments in technology and marketing tactics, we can be sure that we will be tracked more as the years pass by.” Since tracking, surveillance, and personalization will only accelerate over time, actions to educate and protect oneself, as well as others, must be taken now. However, considering how little awareness or concern has resulted from the Snowden revelations, something more impactful than individuals modifying their Facebook privacy settings must occur, such as the implementation of a law, or some disaster that awakens the attention of every individual.

6. References

Beck, E. N. (2015). The Invisible Digital Identity: Assemblages in Digital Networks. Computers and Composition, 35, 125-140. doi:10.1016/j.compcom.2015.01.005

Boyd, D. (2013, June 10). Where ‘Nothing to Hide’ Fails as Logic [Web log post]. Retrieved May 5, 2016.

Brunton, F., & Nissenbaum, H. (2015). Obfuscation: A user’s guide for privacy and protest. Cambridge: MIT Press.

Foucault, M. (1995). Discipline and punish: The birth of the prison (2nd ed.) (A. Sheridan, Trans.). New York, NY: Vintage Books.

Greenwald, G. (2016, April 28). New Study Shows Mass Surveillance Breeds Meekness, Fear and Self-Censorship. The Intercept. Retrieved May 6, 2016.

Hoback, C. (Director). (2013). Terms and Conditions May Apply [Motion picture on Netflix]. USA: Hyrax Films.

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. New York: Penguin Press.

Solove, D. (2007). ‘I’ve Got Nothing to Hide’ and Other Misunderstandings of Privacy. Retrieved May 6, 2016.

What are Cookies & Trackers? (n.d.). Retrieved May 6, 2016.

Published by: andreahektner

