Thesis statement & research framework.


How are our online behaviors being interpreted and understood by machine learning algorithms? How do we adjust our behavior when we know it’s being surveilled and categorized? To what extent do we come to see and identify ourselves through the ‘eyes’ of the algorithm?


With my project, I want to take several different approaches to addressing the same set of questions.

First, I plan to do user research in the shape of individual anecdotes and broader surveys. I want to understand how prediction or recommendation engines, powered by increasingly accurate machine learning algorithms, are shaping our behaviors online. More importantly, I want to gain insights into how these mechanisms make us feel when we encounter them. I plan to send out an initial survey next week that gets to the heart of some of these questions.

Second, I intend to build a tool that gives users greater visibility into how algorithms are constructing a portrait of them based on their online behavior. What advertisements did they click? Who are their friends? What did they last purchase? I’m still not sure what form the tool itself will take but I plan to continue researching and referencing the work done by other researchers and activists.

Third, I want to gather all my findings – both qualitative and quantitative – and present them in an engaging, exploratory way. I will likely write a research paper summarizing what I’ve discovered, but I also want to make that research accessible and educational to the average internet user.

Research Approach

First, the literature. I’ve started reading a number of books and academic articles that are relevant to this topic. Wendy Chun’s books Programmed Visions and Updating to Remain the Same have already been central to my research. I also plan to read Alexander Galloway’s Protocol, Patrick Hebron’s Learning Machines, and Cathy O’Neil’s Weapons of Math Destruction. I’m making my way through Microsoft Research’s summary of academic articles related to critical algorithm studies. One article that’s been helpful in understanding user anecdotes is Taina Bucher’s “The algorithmic imaginary: exploring the ordinary effects of Facebook algorithms,” which takes an ethnographic approach to understanding how users interact with algorithms online.

Second, the user research. I’m going to conduct my own user research in the form of surveys and the collection of individual anecdotes. I want to pinpoint specific interactions that users find particularly unnerving, creepy, benign, or invisible. I also want to understand how knowing that their news feed is filtered affects the way they interact with the platform.

Third, the experts. I want to get in touch with several researchers and artists who are already making strides in this field. I’m planning to reach out this week.

Personal Statement

Many people remember the day they first logged online or the day they got their first Gmail account. I remember the exact day Facebook introduced its News Feed, a feature that allowed users to see what their friends were talking about on the platform. I remember going to high school that day and talking with my friends about the strangeness of it all, the experience of seeing what other people were commenting on and liking. And yet within days we had accepted and embraced the changes to the platform.

Since that day, Facebook has rolled out a number of changes to its platform, many of which we don’t notice because they are minor tweaks to the algorithm that dictates what information we see and what information is rendered invisible. Most recently, machine learning tools have thrown a whole new set of problems into the mix, as such algorithms become increasingly nebulous and less transparent. I’m interested in understanding how algorithms – not just on Facebook, but on every platform – make us feel when we notice them. I also want to understand how users adjust their behavior in dialogue with such algorithms.

Much of my work at ITP has been focused on data privacy, surveillance culture, and the blurring of public and private spaces. I intend my thesis to be a continuation of past research and projects.
