algogossip is an exploratory project I’m currently developing, which was recently tested out at the 2022 Internet Yami-Ichi.
Last month I called up my friend, a full-time content creator. I’ve been reading a lot about the tactics people use to increase the chances of their posts being seen on social media, so I asked: where did she go for advice and tips on how to post?
When you consider how many people rely on algorithmic visibility on social media for financial stability, questions about how social media algorithms work become critical. According to a recent SignalFire report, roughly 2 million people work as full-time, professional content creators on platforms like YouTube, Instagram, and Twitch. These platforms do not provide detailed information about how their algorithms work, for fear of being gamed. At the same time, when changes to the algorithm are rolled out without warning, content creators are left scrambling to make sense of them.
Over the past few years, I’ve been thinking about the stories people tell each other about their everyday interactions with social media algorithms. As part of my grad school thesis research in 2016, I asked people to share their stories about Facebook’s ad targeting: What kinds of strange experiences did they have with targeted ads? During my time as a Ford-Mozilla fellow in 2017, I collaborated with Coding Rights to explore this question further: What experiences were women having on the platform? What constitutes algorithmic harm? We talked to women who had had unsettling or confusing experiences with Facebook’s targeted ads. Most recently at Mozilla, I’m leading qualitative research that aims to get at the heart of people’s experiences with YouTube’s user control mechanisms: Do they feel like they have meaningful control over the system? How do they change their behavior in an attempt to exert control?
My research has been informed by scholars writing about how people engage with social media algorithms. Taina Bucher has written extensively about the “algorithmic imaginary” (Bucher 2017), the animating force of social media algorithms: ways of thinking about what algorithms are, what they should be, and how they function. She argues that the stories people offer up about algorithms are important because they have real social impact. Michael Ann DeVito and others argue that the folk theories people hold about social media algorithms serve as frames through which we can understand their reactions to change (DeVito et al. 2017). They say that by looking seriously at the complaints people make about algorithms, we can better understand the nature of “expectation violations.”
A paper about how users exercise control over social media algorithms (Burrell et al. 2019) says that the complaints people make about these algorithms are important feedback signals. Citing Sara Ahmed’s writing on complaint as a feminist tactic, the authors write that “the act of complaint itself can be a way for people to record their grievances and build solidarity in the face of limited recognition by those with organizational power.” (Ahmed has since published a book titled Complaint! that looks at how complaints are made and what they can do, specifically through a Black feminist and feminist of color lens.)
My thinking on this project has been most shaped and inspired by Sophie Bishop’s excellent scholarship on the concept of “algorithmic gossip,” a term she defines as “communally and socially informed theories and strategies” about social media algorithms that people share with one another in order to boost financial stability and visibility on social media platforms (Bishop 2019). She says that “gossip is productive” and that it is an “important and under-studied form of knowledge production.”
Gossip, especially in its association with women, has historically been looked down upon and treated as trivial, intimate, and dangerous. It’s also a tactic that’s wielded in situations where a power asymmetry exists, and most often it’s wielded by marginalized groups. I think about the whisper networks at universities or at companies that have warned newcomers about problematic individuals, or have allowed people to quickly share important knowledge. Most importantly, gossip serves to subvert power: In the absence of good, accurate information about how a system works, people rely on one another to make sense collectively.
Back to the question I asked my friend: where did she go for advice and tips on how to post? She told me that she was part of a group text with other friends who were content creators, where they shared tips and offered support. Many of them subscribe to industry newsletters or work with agencies that give them advice about how and what to post. Others seek out internet forums for answers to their questions.
I started looking into some internet forums where these conversations take place. There are a number of subreddits dedicated to answering people’s questions about how the TikTok algorithm works, advice for boosting visibility under Instagram’s algorithm, avoiding or appealing shadowbans, and similar topics. The posts in these forums range from the didactic (“Post at least 6 times a day. Upload history like 2 of those. HASHTAGS VERY important.”) to the supportive (“Why don’t you try some challenges ? Like challenge people to do ex: 5 pushups everyday and have your own hashtag.”). There is a real sense of camaraderie, with posts expressing frustration (“I have been banned for more than a month now and they are not reviewing my appeal”) and affirmation (“Yeah this has been happening to my videos as well.”). I decided to use these posts as a starting point from which I could explore further.
Coding the project
After setting up Reddit API credentials, I scraped posts and their comments from these subreddits using PRAW (the Python Reddit API Wrapper), filtered by specific flairs (e.g. “Algorithm Question / Shadowbanned”). I imported a Python module written by Prakhar Rathi, then wrote a script that scraped the posts and comments and saved the dataset as a CSV. Finally, I combined the data and converted it into a JSON file.
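A minimal sketch of that pipeline, for anyone curious about the mechanics. The subreddit name, flair string, field names, and output filenames here are illustrative placeholders rather than the ones I actually used, and the credentials come from registering an app in Reddit’s preferences:

```python
# Sketch of the scraping step: pull comments from flair-filtered posts,
# save them as CSV, then combine everything into a JSON file.
import csv
import json

FIELDS = ["post_title", "post_body", "comment_body"]


def scrape_flaired_posts(reddit, subreddit, flair, limit=500):
    """Collect one row per comment on submissions tagged with `flair`."""
    rows = []
    for submission in reddit.subreddit(subreddit).new(limit=limit):
        if submission.link_flair_text != flair:
            continue
        submission.comments.replace_more(limit=0)  # flatten "load more comments"
        for comment in submission.comments.list():
            rows.append({
                "post_title": submission.title,
                "post_body": submission.selftext,
                "comment_body": comment.body,
            })
    return rows


def save_dataset(rows, csv_path, json_path):
    """Save the scraped rows as a CSV, then as a combined JSON file."""
    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(rows)
    with open(json_path, "w", encoding="utf-8") as f:
        json.dump(rows, f, indent=2)


if __name__ == "__main__":
    import praw  # pip install praw

    reddit = praw.Reddit(
        client_id="YOUR_CLIENT_ID",          # placeholder credentials
        client_secret="YOUR_CLIENT_SECRET",
        user_agent="algogossip-scraper",
    )
    rows = scrape_flaired_posts(
        reddit, "Instagram", "Algorithm Question / Shadowbanned"
    )
    save_dataset(rows, "gossip.csv", "gossip.json")
```

Keeping the CSV alongside the JSON made it easy to skim the raw material in a spreadsheet while working with the JSON programmatically.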
I thought about what I wanted to do with this dataset of ‘algorithmic gossip’, especially in an art gallery setting. I considered curating a selection of the data in a book or zine. I also considered building an ML model to generate new advice from this dataset.
I thought about some of the previous explorations of voice technology I had done with my collective tendernet, and considered the ways we think of gossip as spoken: it has an aural quality. I got really excited about the creative potential of a voice interaction: could you call a phone number and hear a voice message? Pick up an object, put it to your ear, and a message plays? tendernet collaborator Zoe Bachman and I brainstormed some ideas and agreed on an aesthetic: y2k tech girlie. Another collaborator, Katrina Peterson, took the aesthetic concept and iterated on some cute logo designs.
Testing it out
Testing out the piece at the Internet Yami-Ichi was a lot of fun. The energy of the event was very much a cross between an art book fair, a bazaar, and an art gallery. I talked to lots of people who came through about the concept behind the project and got some great ideas. I was also inspired by hearing more about Angie Waller’s work with Unknown Unknowns; she is also working with comments and pictures scraped from the internet.
I want to continue refining and exploring this dataset through different creative explorations.
Sara Ahmed. 2021. Complaint! Duke University Press, Durham.
Sara Ahmed. 2018. Refusal, Resignation and Complaint. feministkilljoys. Retrieved May 7, 2022 from https://feministkilljoys.com/2018/06/28/refusal-resignation-and-complaint/
Sophie Bishop. 2019. Managing visibility on YouTube through algorithmic gossip. New Media & Society 21, 11–12 (November 2019), 2589–2606. DOI:https://doi.org/10.1177/1461444819854731
Sophie Bishop. 2020. Algorithmic Experts: Selling Algorithmic Lore on YouTube. Social Media + Society 6, 1 (January 2020), 205630511989732. DOI:https://doi.org/10.1177/2056305119897323
Taina Bucher. 2017. The algorithmic imaginary: exploring the ordinary affects of Facebook algorithms. Information, Communication & Society 20, 1 (January 2017), 30–44. DOI:https://doi.org/10.1080/1369118X.2016.1154086
Michael A. DeVito, Darren Gergle, and Jeremy Birnholtz. 2017. “Algorithms ruin everything”: #RIPTwitter, Folk Theories, and Resistance to Algorithmic Change in Social Media. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17), Association for Computing Machinery, New York, NY, USA, 3163–3174. DOI:https://doi.org/10.1145/3025453.3025659
Michael Ann DeVito. 2021. Adaptive Folk Theorization as a Path to Algorithmic Literacy on Changing Platforms. Proc. ACM Hum.-Comput. Interact. 5, CSCW2 (October 2021), 1–38. DOI:https://doi.org/10.1145/3476080