Thesis: Literature review

The following is a first draft of a literature review for my thesis project, which will look at how online algorithms shape user behavior and how user beliefs about a platform recursively shape its algorithms.

Algorithms as biopower

Foucault reminds us that power is not static, nor does it emanate from a center of origin; rather, power exists in an enmeshed network. In other words, power is not applied to individuals—it passes through them.

The digital era of online advertising has ushered in a new type of data collection aimed at maximizing profits by serving advertisements based on modular, elastic categories. In the past, consumers were categorized using the demographic and geographic data available in the census. As marketers moved online over the past two decades, however, they began using data from search queries to build user profiles on top of these basic categories. The resulting construction of “databases of intentions” helps marketers understand general trends in social wants and needs and, consequently, influence purchase decisions (Cheney-Lippold, 2011).

Through her use-patterns online, an individual may be categorized according to her gender, race, age, consumption patterns, location, peers, and any number of other relevant groupings. Online users are categorized through “a process of continual interaction with, and modification of, the categories through which biopolitics works” (Cheney-Lippold, 2011). Medical services and health-related advertisements might then be served to an individual based on this categorization, meaning that those who are categorized as Hispanic, for instance, might not encounter the same advertisements and opportunities as those categorized as Caucasian.

In order to govern populations in the way Foucault describes, biopower requires modular categories that can adapt to the dynamic nature of human populations. In this system, the personal identity of individuals matters less than the categorical profile of the collective body. Cheney-Lippold argues that soft biopower works by “allowing for a modularity of meaning that is always productive—in that it constantly creates new information—and always following and surveilling its subjects to ensure its user data are effective” (2011).
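
To make this categorization process concrete, the sketch below simulates how a user might be assigned a statistical gender category from browsing behavior alone. The signals, weights, and category labels are my own invented assumptions for illustration; they come neither from Cheney-Lippold nor from any actual ad platform.

    from collections import defaultdict

    # Hypothetical mapping from observed behaviors to category evidence.
    # The events and weights below are invented for demonstration only.
    SIGNALS = {
        "visited_sports_site":   {"male": 0.7, "female": 0.3},
        "searched_for_makeup":   {"male": 0.2, "female": 0.8},
        "watched_cooking_video": {"male": 0.4, "female": 0.6},
    }

    def infer_category(clickstream):
        """Assign a statistical category from browsing behavior alone."""
        scores = defaultdict(float)
        for event in clickstream:
            for category, weight in SIGNALS.get(event, {}).items():
                scores[category] += weight
        # The user *is* whatever category currently scores highest; new
        # behavior can re-categorize her at any time (the "modulation" above).
        return max(scores, key=scores.get) if scores else "uncategorized"

    print(infer_category(["visited_sports_site", "watched_cooking_video"]))

Note that the category is never fixed: a single new event can tip the scores and silently re-sort the user, which is exactly the modularity of meaning the quote above describes.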


Foucault argues that surveillance exerts a homogenizing, “normalizing” force on those who are being monitored. When algorithms are employed in systems of selective surveillance, this normalizing pressure is applied at the level of the categorical profile rather than the individual person. It is this “normalizing” effect that I am most interested in understanding on the individual level.

Algorithms as interface

In recent years, researchers in the social sciences have worked to understand how Facebook users engage with the News Feed algorithm, which dictates what content appears in their feeds. Many researchers have studied the degree to which people become aware of such algorithms, how people make sense of and construct beliefs about them, and how an awareness of algorithms affects people’s use of social platforms.

Much research has been done on the question of ‘algorithm awareness’ – the extent to which people are aware that “our daily digital life is full of algorithmically selected content.” Eslami et al. (2014) raise several questions, including: How aware do users need to be of the algorithms at work in their daily internet use? How visible should computational processes be to users of a final product?

To answer the first question, several studies have attempted to gauge how aware Facebook users are of the News Feed algorithm. In one study, Eslami et al. (2015) found that the majority of participants were not aware that their News Feed had been filtered and curated. The authors created a tool, FeedVis, that allowed users to see how their News Feed was being sorted. Many participants disclosed that they had previously made inferences about their personal relationships based on the algorithm’s output and were shocked to learn that the output did not reflect those relationships. The authors suggest that designers consider ways to give users more autonomy and control over their News Feeds without revealing the proprietary workings of the algorithm itself.
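
As a rough illustration of the shown/hidden split that FeedVis made visible, the toy sketch below divides a set of friends’ posts into what a curated feed surfaces and what it withholds. The posts and the stand-in curation rule are invented for demonstration and bear no relation to Facebook’s actual ranking.

    # All posts a user's friends actually made during some window (invented data).
    all_posts = [("alice", "photo"), ("bob", "status"), ("carol", "link"),
                 ("dave", "status"), ("erin", "photo")]

    def curated_feed(posts, k=3):
        """Stand-in for the curation step: keep only the first k posts."""
        return posts[:k]

    shown = curated_feed(all_posts)
    hidden = [post for post in all_posts if post not in shown]

    print("Shown: ", [author for author, _ in shown])
    # Friends whose posts never surface -- the ones a user might wrongly
    # infer have gone quiet or drifted away.
    print("Hidden:", [author for author, _ in hidden])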

A different study, by Rader and Gray (2015), concluded that the majority of Facebook users were, in fact, aware that they were not seeing every post from their friends. The authors were interested in how user beliefs about the News Feed – accurate or not – shape the way users interact with the platform. “Even if an algorithm’s behavior is an invisible part of the system’s infrastructure,” they write, “users can still form beliefs about how the system works based on their interactions with it, and these beliefs guide their behavior.” Furthermore, such beliefs about how the system works “are an important component of a feedback loop that can cause systems to behave in unexpected or undesirable ways.” The authors argue that the design process should account for use cases in which the user’s goals and the algorithm’s goals conflict. They also suggest that designers rethink the impulse to make the algorithm’s mechanisms seamless or invisible, for instance by leaving clues within the interface that indicate how the system is working.
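
A minimal simulation can make this feedback loop concrete. The sketch below is entirely my own construction, not anything from Rader and Gray: it models a feed that ranks friends by accumulated engagement, so that, because the user can only interact with what the feed shows her, early winners keep winning and everyone else disappears.

    import random

    random.seed(42)

    N_FRIENDS = 20   # friends whose posts compete for attention
    ROUNDS = 50      # successive visits to the feed

    # Every friend starts with the same engagement score.
    engagement = {friend: 1.0 for friend in range(N_FRIENDS)}

    def curate(scores, k=5):
        """Stand-in curation rule: surface the k highest-scoring friends."""
        return sorted(scores, key=scores.get, reverse=True)[:k]

    for _ in range(ROUNDS):
        visible = curate(engagement)
        for friend in visible:
            if random.random() < 0.5:      # the user likes about half of what she sees
                engagement[friend] += 1.0  # the interaction feeds back into the ranking

    print("Surfacing after 50 rounds:", curate(engagement))
    print("Friends frozen at their starting score:",
          sum(1 for score in engagement.values() if score == 1.0))

Running this shows the “undesirable” behavior the authors warn about: friends outside the initial top slots never earn engagement, so the feed narrows without anyone intending it to.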

Martin Berg’s research tracks the ways in which personalized social feeds are shaped by the experienced relationship between the self and others (2014). He conducted a study in which participants wrote daily self-reflexive diaries about their own Facebook use. The study found that participants expressed a certain insecurity or strangeness in seeing their social boundaries collapse on Facebook. Berg argues that the algorithm acts at once as an architecture, a social space, and a social intermediary, with Facebook posts functioning as a social meeting point for friends. Furthermore, the harvesting of “personal and interactional data on Facebook” forms the basis of a “virtual data-double” in which the self is “broken into distinct data flows.” His research supports the idea that the user both shapes and is shaped by the Facebook algorithm.

Building on the concept of algorithmic awareness, social scientist Taina Bucher seeks to map the emotions and moods of the spaces in which people and algorithms meet. She develops the notion of “the algorithmic imaginary”: ways of thinking about what algorithms are, what they should be, and how they function (2017). Since such ways of thinking ultimately mold the algorithm itself, she argues that understanding how algorithms make people feel is crucial to understanding their social power. In a recent study, she examines personal stories about the Facebook algorithm through tweets and interviews with regular users of the platform. In her own words, she looks at “people’s personal algorithm stories – stories about situations and disparate scenes that draw algorithms and people together” (2017). By taking an episodic, qualitative approach, Bucher constructs a picture of the disparate emotions generated by interactions with algorithms.

References:

Agamben, G. (1998) Homo Sacer: Sovereign Power and Bare Life. Stanford: Stanford University Press.

Agamben, G. (2005) State of Exception. Chicago: The University of Chicago Press.

Berg, M. (2014) ‘Participatory trouble: Towards an understanding of algorithmic structures on Facebook’, Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 8(3), article 2.

Bucher, T. (2017) ‘The algorithmic imaginary: exploring the ordinary affects of Facebook algorithms’, Information, Communication & Society, 20(1): 30-44.

Bucher, T. (2012) ‘Want to be on the top? Algorithmic power and the threat of invisibility on Facebook’, New Media & Society, 14(7): 1164-1180.

Cheney-Lippold, J. (2011) ‘A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control’, Theory, Culture & Society, 28(6): 164-181.

Eslami, M., Rickman, A., Vaccaro, K., Aleyasen, A., Vuong, A., Karahalios, K., Hamilton, K. and Sandvig, C. (2015) ‘“I always assumed that I wasn’t really that close to [her]”: Reasoning about invisible algorithms in news feeds’, CHI 2015, ACM Press.

Eslami, M., Hamilton, K., Sandvig, C. and Karahalios, K. (2014) ‘A Path to Understanding the Effects of Algorithm Awareness’, CHI 2014, ACM Press.

Foucault, M. (1977) Discipline and Punish: The Birth of the Prison. London: Penguin.

Foucault, M. (1990) The History of Sexuality: The Will to Knowledge. London: Penguin.

Foucault, M. (2003) Society Must Be Defended: Lectures at the Collège de France, 1975-1976. New York: Picador.

Hier, S. (2003) ‘Probing the Surveillant Assemblage: On the Dialectics of Surveillance Practices as Processes of Social Control’, Surveillance & Society 1(3): 399-411.

Monahan, T. (2010) Surveillance in the Time of Insecurity. New Jersey: Rutgers University Press.

Rader, E. and Gray, R. (2015) ‘Understanding User Beliefs About Algorithmic Curation in the Facebook News Feed’, CHI 2015, ACM Press: 173-182.

Rader, E. (2016) ‘Examining User Surprise as a Symptom of Algorithmic Filtering’, International Journal of Human-Computer Studies.

Schmitt, C. (1922) Political Theology: Four Chapters on the Concept of Sovereignty. Chicago: University of Chicago Press.
