Special Issue Article
Exploring user agency and small acts
of algorithm engagement in
everyday media use
Media International Australia
2022, Vol. 183(1) 16–29
© The Author(s) 2022
DOI: 10.1177/1329878X211067803
Patrick Heiberg Kapsch
University of Copenhagen, Denmark
Abstract
Based on participant-driven media use tracking and self-reflexive media use Vlogs, this article explores how young adult media users make sense of their user agency vis-à-vis algorithms in digital media and how they try to actualize it through reflexive and mundane enactments of algorithmic systems. The article proposes to adapt the concept of ‘small acts of engagement’ to grasp the productive and agentic potentials of how users enact algorithms purposively in daily media use. By engaging research participants actively in reflections to better understand, and possibly respond to, the influence of algorithmic power in daily media use, the study unfolds common boundaries of users’ reflexive capabilities, showing how exercising user agency in a datafied age is increasingly complex and prospective, yet not merely limited by algorithmic power. As a result, the article discusses the methodological implications and potentials of engaging media users in reflections and actions to shape their communicative agency, which might be a possible step towards mobilizing algorithmic literacy.

Keywords: algorithms, everyday media use, small acts of engagement, user agency
In the wake of today’s datafication of society, algorithms play an ever-increasing role in how we use
and experience digital media. As a central part of the data-driven infrastructures of digital and
platform-based media, algorithms are encoded to do work on users and their data traces, continuously optimizing the user experience by filtering, sorting, recommending, and curating personalized
content and user options. Accordingly, algorithms serve a reduced set of daily options, helping us
navigate the infinity of options available to us and, thus, help shape our everyday mediated reality
(Couldry and Hepp, 2018).

Corresponding author:
Patrick Heiberg Kapsch, Department of Communication, University of Copenhagen, Karen Blixens Plads 8, building 16, 1st floor, Copenhagen, Denmark. Email: patrick.kapsch@hum.ku.dk

In recent years, a sizable body of algorithm studies has raised attention especially to the encoding of algorithms, focusing on the implications of how algorithms are designed and put to work in different sectors of society, stressing substantial problems of discrimination, accountability, and data ethics inherent to algorithmic operations (e.g. Crawford, 2021;
Eubanks, 2018; Noble, 2018; Pasquale, 2015). In addition, within media and communications
studies, the work of algorithms on people and their data traces is often said to create new tensions between media power and user agency (e.g. Hutchinson, 2021), sometimes indicating a troubling condition of passive and uncritical users getting looped into algorithmic systems. However, as
argued by Livingstone (2019), there are vital lessons to be learned from the history of audience
research when questioning users’ capabilities at navigating the changing conditions of media:
“audiences are not so gullible as popularly feared, precisely because they are neither homogeneous nor
unthinking. (…) it is time to end the binary formulation that pits media power against audience power,
instead recognizing that the circulation of meanings includes not only encoding but also decoding and,
today, audience encoding too.” (Livingstone, 2019: 4–5)
While examining and critiquing what algorithms do to people is crucial, correspondingly, we
need to study and recognize how people also do things to algorithms as they entail ‘force relations’
that might create new tensions and give people a ‘reason to react’ (Bucher, 2017). Indeed, algorithms and their databased operations influence our actions and possibilities, especially in digital
media use. Arguably, however, we need a better understanding of how ordinary people experience
and respond to this influence (Couldry et al., 2016) to qualify and delineate the possible tensions of
personal agency in an age of algorithm-driven media. Thus, instead of merely stressing the black-boxed power of algorithms over people and their data, this study inquires actively into people’s
lived experiences of datafication (Kennedy, 2018; Markham, 2019) by probing the ordinary
ways media users identify and respond to the influence of algorithms in their daily media use.
Hence, the article seeks to contribute to recent debates on the boundaries and potentials of user
agency and user resistance to algorithmic power (see Velkova and Kaun, 2019).
As Lomborg and Kapsch (2020) suggest, we can (and should) study people’s experiences with
algorithms: how they make sense of and possibly respond to them based on affective valuations. By
centering the affective relationship between people and algorithms we gain a productive steppingstone for exploring a key question of how diverse media users experience their agency to be challenged by datafication and algorithmic systems (Ytre-Arne and Das, 2020). However, as recent
qualitative studies have relied on interviews to capture how people experience and make sense
of algorithmic operations (e.g. Bucher, 2017; Lomborg and Kapsch, 2020), this article aims to demonstrate how participatory methods might be mobilized to better grasp the situated details of users’
engagement with algorithmic systems in everyday life. Thus, comparable to Swart (2021), who uses
qualitative interviews combined with the walk-through and think-aloud method, I attempt to
capture and explore people’s concrete encounters with algorithms. However, instead of prompting for intuitive algorithm experiences in an interview setting, this study relies on workshops, self-tracking exercises, and Vlog making as reflective modes of engaging participants. Consequently,
the study engages a group of young Danish students in a self-reflexive exploration of how they
identify and respond to the influence of algorithms. By engaging users in, first, tracking and recording their daily encounters with algorithms for 48 h, and second, making them create self-reflexive
media use Vlogs, the article systematically analyzes the ordinary and often tacit ways users navigate
and negotiate media use through interactions with algorithmic systems.
In the analysis, I adapt the concept of ‘small acts of engagement’ (Picone et al., 2019) to sensitize
how media users make sense of their personal agency vis-à-vis algorithms in digital media and how
they try to actualize it through reflexive and tactical engagement. As a result, the article unfolds
common boundaries of the participants’ reflexive capabilities to understand and respond to algorithmic systems, showing how exercising user agency in a datafied age is increasingly complex and
prospective (Ytre-Arne and Das, 2020), yet not solely limited by algorithmic power. Finally, the
article discusses the methodological potentials of engaging media users in self-reflexive actions
to shape their personal agency, which might be a productive step towards mobilizing algorithmic
literacy (Bruns, 2019).
Algorithms and user agency
In academic discourse, when we talk about algorithms in digital media, we often refer to systems of
computation that contain more than just one algorithm. Following Nick Seaver (2019), algorithmic
systems might be described as “heterogenous sociotechnical systems, influenced by cultural meanings and social structures” (Seaver, 2019: 419). The relative power of these systems to control and
structure our communication and possibilities of action permeate our social worlds and shape our
cultural lives. Furthermore, due to their black-boxed nature at the backbone of digital life, users are
offered only limited ways of altering the outputs of algorithms (Gillespie, 2014; van Dijck et al.,
2018). Nevertheless, while it is not a given that people know about the role and technical underpinnings of algorithms in digital media (Eslami et al., 2015), recent studies suggest that ordinary users
might develop a general sense of how algorithms influence their media use, and that some users do
things to algorithms purposely in an effort to shape and alter their outputs (e.g. Bucher, 2017; Cotter,
2019). In addition, a developing body of literature focuses on “folk theories” to understand people’s
conceptions about algorithms and how their theories might provide them with means to interact
strategically with algorithmic systems (Siles et al., 2020; see also Ytre-Arne and Moe, 2021).
Fundamentally, a central point to the socio-technical ontology of algorithmic systems is how
they (also) rely on users’ interpretations and their data inputs to function properly and remain relevant, making negotiation and user engagement a powerful influence in its own right – at least in
theory (see Seaver, 2019). Thus, while the black-boxed nature of algorithmic systems might in
many ways transform media power, likewise, people may find new ways to exercise their user
agency and respond to this power, for example, through practices of tactical and subversive
media engagement (Lomborg and Kapsch, 2020: 14). In consequence, the agency of users might
come to rely on their capabilities to “effect power potentials through interpretative engagements
in everyday processes of communication, in relation to structures that take part in the same communicative processes” (Ytre-Arne and Das, 2020: 7).
Small acts of algorithm engagement
To better understand the boundaries of how users effect their agency vis-à-vis algorithms in digital
media, I adapt the framework of Small Acts of Engagement (SAOE) (Picone et al., 2019). Building
on the tradition of audience and reception studies, SAOE offers an agentic understanding of small
and ordinary acts such as liking, commenting, and sharing, recognizing these as productive acts that
are not reducible to merely content or data traces. Picone et al. (2019) describe SAOE as personally
motivated, socially situated practices within the ordinariness of everyday media use that require
only little investment relative to one’s capacities. Thus, SAOE entails productive and interpretive
actions on a micro-level.
As a conceptual framework, SAOE highlights two central dimensions of media engagement –
investment and intention. Investment describes small engagements with digital media by introducing a sense of scalability. For example, liking a photo on Instagram does not include the same
amount of investment as posting a tweet on Twitter. Accordingly, SAOE involves acts that require
only little investment, “those productive acts that we feel comfortable performing and do not
require us stepping out of the comfort zone of our daily routines.” (Picone et al., 2019: 2018).
Importantly, investment is not simply defined by any measurable amount of user input, as the
investment for the same act may vary significantly based on users’ capacities and dispositions.
Thus, when studying users’ engagement with algorithmic systems through the lens of SAOE, a
focus on investment requires empirical analysis of people’s lived experiences and dispositions in
context. Consequently, an empirical focus on investment enables the study to grasp people’s everyday experiences, situated dispositions, and reflexive capabilities, e.g., when does tactical engagement with algorithms become relevant? How do people implement tactics of algorithm
engagement into everyday media use?
The second dimension of SAOE, intention, describes how SAOE are often “driven by the ambition not to create content but to either present oneself to others, or as a traceability tool for a digital
flaneur” (Picone et al., 2019: 2016). Hence, intention highlights how small engagements might be
used to shape one’s digital identity or as a strategic ‘steering mechanism’ to shape one’s media
experience. For example, different liking practices might be used to show support and affinity
with others or to create affinity with significant digital objects. Correspondingly, small engagements
might function as a way for users to shape and navigate their media experience, for example, by
using tactical ‘liking’-practices to amplify and curate specific types of content in a newsfeed.
Certainly, these types of small steering acts might demonstrate how users try to shape algorithmic
operations, by engaging them in active and strategic ways. Thus, central to this study, the notion of
intention implicates a focus on the mundane ways users micro-manage their media presence, social
identity, and media experience, which, in the datafied realm of today, naturally entails interacting
with algorithmic systems and digital infrastructures. Consequently, the notion of intention sensitizes the study to people’s reflexive capabilities to navigate, shape, and alter the output of algorithmic operations in digital media.
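The ‘steering mechanism’ described above can be caricatured in code. The sketch below is my own schematic illustration, not anything drawn from the article or from any real platform: the class, the affinity weights, and the post structure are all invented assumptions, meant only to show how small acts such as liking or hiding can feed back into what a toy ranker surfaces next.

```python
# Toy sketch (illustrative assumptions only, not any platform's real logic):
# "small acts" such as likes feed back into what gets ranked highest next time.
from collections import defaultdict

class ToyFeed:
    def __init__(self):
        # per-user affinity scores, built up from small acts of engagement
        self.affinity = defaultdict(float)

    def register_like(self, creator):
        """A single 'like' slightly raises the creator's affinity score."""
        self.affinity[creator] += 1.0

    def register_hide(self, creator):
        """Hiding a post lowers affinity -- but it is still a signal
        that teaches the system about the user."""
        self.affinity[creator] -= 2.0

    def rank(self, posts):
        """Order candidate posts by the user's accumulated affinity."""
        return sorted(posts, key=lambda p: self.affinity[p["creator"]], reverse=True)

feed = ToyFeed()
feed.register_like("creator_a")
feed.register_like("creator_a")
feed.register_hide("sponsor_x")

posts = [{"creator": "sponsor_x", "id": 1},
         {"creator": "creator_a", "id": 2},
         {"creator": "unknown", "id": 3}]
ranked = feed.rank(posts)
# posts from the liked creator now surface first; the hidden one sinks to the bottom
```

Even in this caricature, the tactical ‘liking’-practices the article describes act as a steering mechanism: the user’s micro-investments directly reshape the ranking they will encounter.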
In summary, I adapt the framework of SAOE with a specific focus on what might be labeled
‘small acts of algorithm engagement.’ This concept sensitizes the study to the dynamics of how
media users respond to algorithmic operations in daily life – through mundane interpretative acts
of tactical media use – to actualize their personal agency.
Research design and empirical material
To address questions of personal agency in media users’ engagement with algorithms in daily life, I
undertook a participatory study of how young Danish students identify and respond to algorithms
through their use of digital media. The research was carried out in the welfare-state of Denmark, a
heavily digitized country with almost total internet diffusion. In addition, there are comparably high
educational levels, and thus presumably some amount of digital literacy.
The research design was informed by Annette Markham’s (2019) critical pedagogy and longitudinal work on self-reflexive research methods as a response to datafication. Thus, the research
framework was designed to make participants “dive deep into self-reflexive ethnographic analysis
of their own social media experiences” (Markham, 2019: 2). In practice, this approach entails
letting participants track and document their media use for a duration of time through a mix of techniques such as screenshots, screen recordings, and notetaking, followed by making them choose
different methods to give reflexive accounts of their experiences. The purpose of choosing this
co-exploratory research approach is twofold: First, it provides fine-grained material about digital
media use and everyday lived experience in digital-saturated social contexts (ibid.). Therefore,
the method enables the study to gain an in-depth look at how participants experience, reflect on, and
respond to algorithms in their daily media use. Second, and perhaps more promising, the method
has shown potential for raising digital literacy by making the participants more consciously
aware of platform logics and their daily media use habits: “These techniques enable participants
to reflect on their feelings and relations with platforms in a way that reveals some of the continual
dialogical processes natural to human sensemaking” (Tiidenberg et al., 2017: 9). Essentially, the
method might benefit participants in making sense of and becoming more aware of the influence of
algorithms in everyday life, which could be a potential step towards building algorithmic literacy
(Bruns, 2019).
With a specific focus on algorithms in digital media, I purposively recruited a group of 25 Danish
students (13 females and 12 males) to partake in self-reflexive explorations of their engagement
with algorithmic systems in daily media use. The students were all between 20 and 25 years old,
enrolled in a 3-year bachelor’s degree in information science. In addition, the research was designed
as a series of workshops to take place during small exercise classes following the participants’
weekly lectures. Consequently, the sample naturally comprises a relatively cohesive and homogenous group of people – heavy media users with a high educational level, a shared interest in digital
media, and presumably high levels of digital skills and digital literacies. Accordingly, I expected the
participants to demonstrate at least some degree of algorithm awareness (Gran et al., 2020) and
some capacity to verbalize their experiences with algorithmic operations throughout the
co-exploratory research process. Even though this particular group might not represent a typical
population of societal concern, especially regarding digital skills and digital literacies, they do
make an interesting case for exploring the boundaries of how heavy media users with presumably
high levels of digital skills and digital literacy navigate and respond to the influence of algorithmic systems.

Mobilizing media use tracking and media use Vlogs as a reflective device
Throughout the last semester of 2019/20, I met with the participating students five times in small
workshop-based sessions. In the first session, after filling out a media use questionnaire, we touched
upon their general media use habits and talked about their initial knowledge of algorithms, from
which they reported that they had not read about or worked with the subject before in their studies. Nevertheless, from this outset, most of them conveyed being somewhat familiar with the
term and to have a general sense of how algorithms influence the content and offerings served to
them, for example, by regulating what gets filtered out from their newsfeeds on social media.
Hence, the participants showed no significant signs of difficulty when tracking their daily encounters with algorithms in digital media for 48 h through screen- and video recordings, screenshots, and
daily notetaking. The purpose of the tracking was for the participants to gather as much material as
possible while taking note of all daily algorithmic encounters.
In the following sessions, the participants had to work with their captured material, trying different methods of self-reflexive investigation: drawings, situational maps, and vignettes. Finally,
they got introduced to the exercise of producing reflection Vlogs, in which they were asked to
share their experiences and thoughts on algorithms in digital media by narrating and visually
showing key examples from their media tracking. Here, the participants were given three tasks:
(1) To describe what is going on in the media use situation at hand and to identify and describe
the algorithmic operations herein. (2) To reflect on how algorithms influence the media use situation, and what they do (or can do) themselves to respond to this influence – how they might try
to shape or alter the output of the algorithmic system. (3) To reflect on what they discovered
about algorithms in digital media (if anything new) from the different exercises. These three tasks
helped structure the format of the Vlogs, making them relevant to the study’s focus on users’ reflexive capabilities and their efforts at actualizing personal agency through ‘small acts of algorithm engagement.’

Thus, in summary, media use tracking was used to capture the participants’ habitual and
mundane encounters with algorithms – as devised by the SAOE framework (Picone et al., 2019)
– providing them with vast material from daily life to work and think with. By extension,
Vlog-making was used as a tool for the participants to reflect on their captured material and communicate their experiences and findings in an organized way, pushing towards deep reflection and
purposive (tactical) interaction to arise as expressions of user agency.
Importantly, the participating students were made aware of the research-oriented purpose of their
involvement, and my goal as a researcher to facilitate and scope a fruitful co-exploratory process.
Following the five sessions, I received 25 Vlogs (7 to 15 min in length), which are unfolded and
used as a primary focal point in the analysis below.
Analysis: identifying and responding to algorithmic operations
In the following analysis, based on empirical data from self-reflexive Vlog accounts, I unpack the
participants’ efforts at identifying and responding to algorithmic operations in their daily media use.
Finally, the analysis highlights the participants’ main takeaways and discoveries from partaking in
an exploratory media use investigation.
Identifying algorithmic operations
Although the participants were given no limits regarding what media (and algorithms) to track and
focus on, their reflection Vlogs turned out somewhat similar in terms of media use situations and
interactions shown. Thus, they predominantly show the participants’ use of popular and well-known media platforms, such as Instagram, Facebook, YouTube, Spotify, Snapchat, and Netflix.
Essentially, the reflection Vlogs all involve examples from those media platforms that the participants report spending most of their time on in their daily lives.
Overall, the Vlogs show a strong pattern of how almost all participants are aware of several algorithmic operations in their daily media use. In fact, they all visibly demonstrate capabilities at identifying typical examples of algorithmic influences on targeted ads, newsfeed filtering, and
recommendations on YouTube, Netflix, Spotify, etc. For example, Louis (23) included in
his Vlog different situations of him using the Facebook app on his smartphone: “If I ‘like’ this
picture or click on a sponsored link, it will influence what I get to see. The algorithm also predicts
who my closest friends are and suggests that I invite them to join the same Facebook groups that
I’m in” (Louis, 23). Louis’ Vlog resonates with how most participants seem to be highly aware of
the presence of algorithmic operations, as they find algorithms to influence most of what they get to
see, especially on social media.
The participants’ general high awareness of algorithmic operations echoes key results from Gran
et al. (2020), who conducted a representative survey of the population of Norway and found a significant
association between self-reported high levels of algorithm awareness and both educational level and
age (see also Swart, 2021). Hence, as the participants in this study comprise young well-educated
adults, their capabilities to identify algorithmic influences in digital media are perhaps not a surprise. However, by letting the participants themselves show and demonstrate how they
are aware, we arguably get a more detailed foundation for assessing the boundaries of their
algorithm awareness and their reflexive capabilities of interpreting and engaging algorithmic
systems. More than just identifying the structuring work of algorithms in daily media use, Louis
(23) and several of his fellow participants also demonstrate awareness of how their own
actions play a central role in how algorithmic operations in digital media unfold. Essentially,
most participants display a general understanding of how they get profiled and categorized by algorithms that target them with personalized content and ads, by providing the systems with behavioral
data: “Facebook is tracking my behavior constantly, also on other sites that I go to, to provide me
with relevant content and ads. But in that sense, I think they do rely on me, and my data, to sell
sponsored ads and post” (Victoria, 22). As exemplified in the Vlog from Victoria (22), algorithms
are often described to not function “properly” or “accurately” without user data, as they “rely” on
users’ behavioral data and media interaction to stay relevant. Furthermore, like Victoria, most participants naturally relate data tracking and the algorithmic outputs of targeted ads to the economic
relationship between users and platform services.
In summary, by analyzing the respondents’ first task of Vlog reflections, the large majority show
a generally high awareness of algorithms, as they visibly demonstrate reflexive capabilities to identify specific algorithmic operations in their daily media use. In addition, most participants also seem
to recognize how algorithms rely on user data and user interaction to provide personalized and
accurate content and options.
Responding to the influence of algorithms
In their second round of Vlog reflections, a central observation is how several participants describe
specific ways to actively shape and alter the outputs of algorithmic systems, primarily through different interactions with the user interface. Hence, their reflections on how they respond to and possibly alter algorithmic operations predominantly involve engagement with visible “buttons” and
in-built functions in the digital user interface. To exemplify, in her Vlog reflections on the algorithmic influences on Instagram, Meriem (21) recognizes how algorithms help filter and shape her
experience of using Instagram while describing different ways to influence these outputs herself:
“I influence what profiles are shown in my newsfeed by interacting with them. That could be by liking,
commenting, or texting over DM [direct message]. I can definitely help manage what I don’t want in
my feed. That could be by manually blocking people, for example. (…) Users who interact a lot with
the interface get a favor, as they get shared more and are more likely to be featured in the feeds of
others” (Meriem, 21).
As indicated by the quote, Meriem demonstrates reflexive capabilities when describing how different user interactions might influence the algorithmic outputs on Instagram. Furthermore, she displays several practices of deliberate engagement with algorithmic systems by applying popular
hashtags on her Instagram pictures, hoping to gain more followers, and occasionally flagging or
reporting unwanted content to help maintain a relevant flow of content in her newsfeeds.
Turning to the concept of SAOE (Picone et al., 2019), Meriem has arguably become comfortably
invested in actively responding to the influence of algorithms in her daily media use. Her familiarity
with and reflexive capabilities to understand typical algorithmic mechanisms in digital media have
enabled her to interact purposively with specific algorithmic systems. Additionally, her main intention of tactically engaging these systems mainly involves gaining visibility and a relevant flow of
content and recommendations. In other words, by way of small acts of algorithm engagement,
Meriem tries to actualize her personal agency to maintain a sense of control over her media experience and media presence.
To be clear, most participants only rarely find themselves actively trying to shape what algorithms provide them in their daily media use. However, most of them do report having
engaged in such deliberate acts before, predominantly intending to maintain control of the curation
and flow of content, especially concerning newsfeed filtering and recommender systems. Thus,
while not being entirely commonplace, most participants do provide different tactics they
believe will help alter the output of algorithmic systems. For example, Sif (22) demonstrates
how the user might gain control by clicking the three-dotted options-button on posts on
Instagram, to block or report the post at hand and to prevent similar content from showing up:
“It’s a way for Instagram to get some feedback to better moderate my feed, but I think it is nice,
and I actually feel that I’m able to take back some control of what posts Instagram are feeding
me.” (Sif, 22). Like Meriem (21) and Sif (22), for most participants, the main intention when occasionally engaging algorithms tactically seems to involve gaining a sense of control over the daily
flows of content and to maintain a relevant and attractive media use experience.
While most of these deliberate engagements with algorithmic systems through visible
interface-affordances might seem utterly mundane, they do testify to how users might try to exercise
their agency through small acts of algorithm engagement. Indeed, visible interface functions are
designed to be engaged by media users. Nevertheless, actively using in-built functions of algorithm-driven media to shape one’s media use might be described as an ordinary way of ‘speaking back to
the system’ (Lomborg and Kapsch, 2020), consequently influencing the datafied communication
loops between users and algorithms. Essentially, the participants’ accounts of how to respond to
the work of algorithmic systems, no matter how ordinary, show how relatively skilled heavy
media users make sense of complex mechanisms of computation and how they – when relevant
and feasible to them – invest in deliberate acts of algorithm engagement. Thus, the Vlogs demonstrate how the participants try to actualize their personal agency in a process of reflexive interpretation and SAOE with algorithmic systems. As noted by Ytre-Arne and Das (2020), “To be agentic,
audiences need some degree of capability to evaluate the complex and opaque communicative conditions of the datafied age, simply to grasp the implications of seemingly small and mundane communicative acts becoming metrified and aggregated.” (Ytre-Arne and Das, 2020: 13). By following
Ytre-Arne and Das’ (2020) notion of communicative agency, the participants might be described as
showing fundamental agentic potentials through their reflexive capabilities to evaluate the different
implications of situated user interactions. Most of the participants recognize how each user interaction might become aggregated and influence not just the situation at hand, but also future
media use situations. Effectively, they come to negotiate their media use through a process of interpretation and active engagement, sometimes responding deliberately to algorithmic systems in purposive ways to gain or maintain a sense of control. A substantiating example is Jonas (21), who
shows in his Vlog how to avoid unwanted content from appearing in the newsfeed on Facebook,
while at the same time stating that he chooses not to use this function:
“By clicking here, you can tell the algorithm not to show similar things again. However, by doing that you
also help teach it to categorize and target you with posts and ads even better. That’s why I personally don’t
do that myself” (Jonas, 21).
Like most participants, Jonas (21) demonstrates reflexive capabilities to identify and evaluate
how specific interactions with algorithmic systems might affect his media use. Furthermore, as
he finds it desirable not to help the platform target and categorize him too precisely, he tries to
avoid feeding the system excessively with data. Intentions like these arguably require some reflexive capabilities, whether accurate or not, to evaluate the general mechanisms of algorithmic
systems, which, in the case of Jonas (21), results in him refraining from using specific in-built platform
functions. Of course, these types of tactical engagement might not do much on a large-scale infrastructural level, let alone avoid data-tracking, algorithmic targeting, and profiling. However, none
of the participants expressed any strong subversive intentions when engaging algorithm-driven
media in daily life. Instead, their reflections and actions suggest that media users also try to actualize
their personal agency by engaging algorithms in seemingly ordinary (non-subversive) ways.
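The trade-off Jonas articulates can be made concrete with a small sketch. Again, this is my own illustration under invented assumptions (the function, the profile structure, and the topic labels are all hypothetical), not the article’s model or any platform’s actual behavior: hiding a post cleans the feed immediately, but the very same click adds a data point to the profile used for targeting.

```python
# Illustrative sketch of Jonas's dilemma (assumptions of this rewrite, not the article):
# clicking "don't show again" improves the feed now, but also sharpens the profile.

def hide_post(profile, feed, post):
    """Remove a post from the feed AND record the signal in the profile."""
    feed = [p for p in feed if p != post]
    # the system learns a dislike for this topic -- one more data point about the user
    profile[post["topic"]] = profile.get(post["topic"], 0) - 1
    return profile, feed

profile = {}
feed = [{"topic": "ads"}, {"topic": "friends"}]
profile, feed = hide_post(profile, feed, {"topic": "ads"})
# the feed is cleaner, but the profile now carries one more entry about the user
```

Declining to click, as Jonas does, is itself an exercise of agency: it withholds exactly the training signal that `hide_post` would otherwise record.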
Participant takeaways
By adopting a co-exploratory research approach, the study instructed research participants to
capture (media use tracking) and disseminate their daily responses to algorithmic operations (self-reflexive Vlog accounts), allowing them to choose what media use situations to focus on and
include in their final Vlog product. Consequently, this twofold process functioned as a reflective
device, purposively encouraging the participants to dive deep into the details of their daily
media use (Markham, 2019) and to reflect at length on the influence of algorithms in digital
media. As a result, the third round of Vlog reflections – covering personal takeaways – touches
upon many topics, ranging from participants gaining newfound knowledge on personal media
use habits, e.g., “I didn’t really know how much time I spent on these ads until I recorded my
media use and found out” (Tine, 25), to participants tapping into broader questions of how platform
logics might influence society, for example, how targeted political ads might influence public
opinion. Due to the broad range of topics, the section below is limited to the main patterns
that have specific relevance to the article’s focus on user agency and engagement with algorithms in
digital media.
Overall, the most common takeaway is that most participants report having become even more
aware of the presence of algorithms in their daily media use:
“My awareness towards algorithms has raised. Because before, I didn’t think much about what is going on
in my feed. I was aware that ads and other things are organized to me specifically, but most times you just
go with the flow. (…) I have also learned that I submit to these algorithms. I submit because I want to follow
my friends, and then I also have to see these ads that algorithms provide” (Martin, 24)
As Martin (24) reports, besides a heightened awareness of algorithms, he finds himself willingly
“submitting” to algorithmic operations on social media platforms because it is easy and convenient
to “just go with the flow.” That is, making algorithms automate and streamline his user experience.
However, he also notes the option of leaving social media altogether to avoid algorithms and the influence of targeted ads – an option he deems irrelevant, as the benefits of being present on social media with his peers are more important to him. Essentially, Martin's takeaways resonate with how most participants conveyed that they only rarely invest in deliberate acts of algorithm engagement. Relatedly, a pivotal observation across the participants' takeaways is a widespread acceptance
of algorithmic influences on daily media use, as the majority find algorithms instrumental in providing a relevant, seamless, and joyful user experience. As described by Kasper (24): “If
[Instagram] gave me full control, I think it would defeat the purpose of the algorithms, because
they want to control what gets shown to you. You don’t really want to think about what to do.
That’s why you have algorithms”.
Furthermore, several participants also report having gained a better understanding of how user
interactions help shape the outputs of algorithmic systems:
“I think I have become more aware of how my interactions with platforms influence my future media use. It
has become super clear to me how I influence the algorithms myself, and how I help to reinforce them to get
more of what I’m interested in” (Meriem, 21).
Returning to the Vlog of Meriem, we get a sense of how the self-reflexive media use investigation has made her more aware of how she helps co-shape and reinforce the communicative loops of
algorithmic systems. However, while several participants report having become positively aware
of how they help reinforce algorithm-driven content loops by default, not everyone conveys the
same confidence in actively changing or altering them:
“I’m still trying to figure out how I can influence the algorithm on Instagram. It seems like I have to more
actively do something to change what kind of content Instagram throws at me” (Marie, 23).
Marie, who does demonstrate reflexive capabilities in interpreting the general mechanisms of
algorithms in popular media platforms, describes how the outputs of algorithmic systems seem
somewhat difficult to change and how she finds it easy to “get stuck” in the same loops of
content. Interestingly, the feeling of “getting stuck” in specific loops of content is a common
topic amongst the participants, sometimes leading to moments of slight frustration, and for
some, leading to small acts of algorithm engagement as an effort to alter these loops, gaining
back a sense of control.
By extension, the analysis shows a mix of takeaways concerning the perceived potentials of
actively shaping the outputs of algorithmic systems. Here, some participants report having gained confidence in how to engage algorithmic systems purposively, yet others find it somewhat difficult, frustrating, or simply not relevant. For example, Kasper (24) describes how he sometimes tries to "fight" the algorithms through different means – e.g., tactical use of clicks and likes – but ultimately reports that "it is hard to know how exactly it is affecting the algorithm" (Kasper, 24).
Conversely, other participants in the study report having found new specific ways of engaging algorithmic systems to alter their outputs, once again predominantly through in-built functions in the
user interface:
“Before this exercise, I did not know that Instagram actually gives me options to block content that I don’t
want to see. Even though it’s not a way to select content on my page, it becomes a possibility to take back
some control of my own profile and a way of interacting with the Instagram algorithm” (Line, 24).
Finally, having engaged in self-reflexive investigations, some participants find themselves more critically minded towards the influence of algorithms and how they work. These takeaways have less to do with active user engagement than with reflections on the importance of critically questioning the underlying mechanisms and implications of how algorithms shape everyday media use. As expressed by Marcus (26):
“People might not be aware how these algorithms work, and it is also almost impossible to find out entirely,
but it’s important to be aware of their presence and to be able to identify the different things, the different
aspects that lead to the content you are shown. It enables you to understand some of the consequences of
that” (Marcus, 26)
Marcus’ example echoes how several participants consider it important for media users, in general, to be attentive to and critically aware of how algorithms work in order to better assess their influence and possible implications for one’s personal media use.
In summary, the participants’ self-reported takeaways indicate that most of them have become
more aware of the influence of algorithmic operations in daily media use and how these systems
rely on user interaction to function accurately. Furthermore, some participants report having learned new ways of engaging algorithms deliberately through newfound functionality in the user interfaces of platform media. Last, some participants report having become more critically minded towards how algorithms work.
Concluding discussion
The study has demonstrated how a group of young Danish students make sense of and try to exercise their personal agency through reflexive engagement with algorithms in digital media. The
results show how the dominant response of countering the influence of algorithmic operations
involves engaging purposively with in-built platform functions that are visible in the user interface
of popular media services. For example, by flagging and reporting specific types of content, using
‘not interested’-buttons and engaging in tactical ‘liking’ and ‘hashtag’ practices to help co-shape
and alter how algorithms filter and circulate personalized content, targeted ads, and user options.
Effectively, these small and seemingly mundane acts of algorithm engagement function as the
primary way for the participants to exercise their personal agency, with the common intention to
gain or maintain a sense of control over their media experience and media presence. However,
while these types of ordinary acts constitute the dominant form of deliberate engagement with algorithmic systems, the most commonplace practice in the participants’ everyday life is the mode of “going with the flow,” not investing much thought and time in tactical media engagement as such.
The study suggests (at least) two possible explanations for this. First, the participants hold a shared
belief that algorithms are instrumental in providing a seamless, effortless, and attractive media user
experience, which is why they generally embrace their influence. Thus, to most of them, the different mechanisms of algorithmic filtering, recommendation, targeting, and personalization have become an essential and inseparable part of using popular platform-based media services.
Second, while most of the participants only occasionally invest deliberately in tactical engagement
with algorithmic systems, a few also report never trying, due to a lack of knowledge of how to
succeed, or simply because they find it somewhat irrelevant to their personal media use.
Agentic boundaries
By looking more specifically at the boundaries of users’ perceived agentic potentials, the study
speaks to recent debates on user agency and responses to algorithmic power (see Velkova and Kaun, 2019). Interestingly, most participants do not seem to find their agency to be critically challenged by algorithmic power. While they do recognize that algorithms exercise a strong influence over what is shown and offered to them in digital media, the vast majority do not find this influence unmanageable, let alone problematic. However, their agentic potentials
face boundaries, as they commonly report finding it practically impossible to know for sure how
these systems function and are affected, making their tactical engagements with algorithms characterized by varying amounts of speculation and uncertainty, although sometimes leading to a provisional feeling of being in control. Thus, given the distributed nature of control between human and
non-human entities in digital media platforms, users’ feelings of being in control might be characterized as both paradoxical and temporary (Markham et al., 2019).
These results relate to the notion of agency as prospective (Ytre-Arne and Das, 2020), describing
how agency in an increasingly complex datafied age requires capabilities of interpreting and
weighing situated actions based on their possible future outcomes. Consequently, for the participants in this study, prospection comes to function as a way to deal with the uncertainty of how
media interaction and data aggregation influence algorithmic operations and future media use situations. In practice, most of the participants demonstrate reflexive capabilities to understand the
general mechanisms of algorithmic systems in popular media platforms. They seem to recognize
the relational dynamic of how these systems rely on user input and data to function accurately
(as intended), and they occasionally invest in ‘small acts of algorithm engagement,’ trying to
co-shape their outputs. Of course, these capabilities do not represent the average media user but
merely describe the capabilities of a group of well-educated young Danish adults with a shared
interest in digital media.
Co-shaping algorithmic literacy?
Finally, the study speaks to the potential for researchers to mobilize participatory paths to help
media users shape their personal agency and critical engagement with algorithmic systems. As a
research framework, the co-exploratory process of doing participant-driven media use tracking followed by self-reflexive media use Vlogs provided a relatively detailed audio-visual look into how
media users make sense of and navigate algorithm-driven media; how they identify and respond to
algorithmic operations in an everyday context. Perhaps more noteworthy, this two-fold exploratory
process became a reflective device, helping the participants think about their relationship with algorithms in new and productive ways, arguably shaping their personal agency.
As a result, the participants reported four main takeaways from partaking:
1. A better understanding of personal media use habits.
2. General heightened awareness of the influence of algorithms in digital media.
3. New specific ways to deliberately engage algorithmic systems in digital media, to shape or
counter their outputs.
4. A more critically conscious mindset towards algorithmic systems and how their implemented
logics might influence personal media use and society at large.
While these results are ultimately tentative, due to the small sample size and homogeneous composition of participants, they do show how co-exploratory approaches might be mobilized to raise critical knowledge and reflexive capabilities to engage algorithmic systems tactically, and thus serve as a potential stepping stone for building algorithmic literacy (Bruns, 2019).
Essentially, by mobilizing participatory paths of actively engaging media users in reflections and
actions to shape their knowledge and personal agency, we might enable them to better understand
and verbalize felt tensions concerning algorithms and datafication.
By extension, participatory studies on algorithms might help us give better accounts of the users we want to understand and ultimately empower, while enabling more people to join public debates
on the futures of algorithmic work and datafication of society.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
Note
1. Pseudonymized in the article.
References
Bruns A (2019) Are Filter Bubbles Real? Cambridge, UK: Polity Press.
Bucher T (2017) The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. Information, Communication & Society 20(1): 30–44.
Cotter K (2019) Playing the visibility game: How digital influencers and algorithms negotiate influence on
Instagram. New Media & Society 21(4): 895–913.
Couldry N, Fotopoulou A and Dickens L (2016) Real social analytics: A contribution towards a phenomenology of a digital world. The British Journal of Sociology 67(1): 118–137.
Couldry N and Hepp A (2017) The Mediated Construction of Reality. Cambridge, UK: Polity Press.
Crawford K (2021) The Atlas of AI. New Haven, CT: Yale University Press.
Eslami M, Rickman A, Vaccaro K, et al. (2015) “I always assumed that I wasn’t really that close to [her]”. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Republic of Korea, 153–162. New York, NY: Association for Computing Machinery.
Eubanks V (2018) Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New
York, NY: St Martin’s Press.
Gillespie T (2014) The relevance of algorithms. In: Gillespie T, Boczkowski PJ and Foot KA (eds) Media
Technologies. Cambridge, MA: MIT Press, 167–194. https://doi.org/10.7551/mitpress/9780262525374.
Gran A-B, Booth P and Bucher T (2020) To be or not to be algorithm aware: A question of a new digital
divide? Information, Communication & Society 24: 1779–1796. https://doi.org/10.1080/1369118X.
Hutchinson J (2021) Digital intermediation: Unseen infrastructures for cultural production. New Media &
Society: 146144482110402. https://doi.org/10.1177/14614448211040247.
Kennedy H (2018) Living with data: Aligning data studies and data activism through a focus on everyday experiences of datafication. Krisis: Journal for Contemporary Philosophy 1: 18–30.
Livingstone S (2019) Audiences in an age of datafication: Critical questions for media research. Television & New Media 20(2): 170–183.
Lomborg S and Kapsch PH (2020) Decoding algorithms. Media, Culture & Society 42(5): 745–761.
Markham A, Stavrova S and Schlüter M (2019) Netflix, imagined affordances, and the illusion of control. In:
Plothe T and Buck AM (eds) Netflix at the Nexus. New York, NY: Peter Lang, 29–46.
Markham AN (2019) Critical pedagogy as a response to datafication. Qualitative Inquiry 25(8): 754–760.
Noble SU (2018) Algorithms of Oppression. New York, NY: New York University Press.
Pasquale F (2015) The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge, MA: Harvard University Press.
Picone I, Kleut J, Pavlíčková T, et al. (2019) Small acts of engagement: Reconnecting productive audience practices with everyday agency. New Media & Society 21(9): 2010–2028.
Seaver N (2019) Knowing algorithms. In: digitalSTS: A Field Guide for Science & Technology Studies. Princeton, NJ: Princeton University Press, 412–422. https://doi.org/10.1515/9780691190600-028
Siles I, Segura-Castillo A, Solís R, et al. (2020) Folk theories of algorithmic recommendations on Spotify:
Enacting data assemblages in the global south. Big Data & Society 7(1): 205395172092337.
Swart J (2021) Experiencing algorithms: How young people understand, feel about, and engage with algorithmic news selection on social media. Social Media + Society 7(2): 205630512110088.
Tiidenberg K, Markham A, Pereira G, et al. (2017) “I’m an addict” and other sensemaking devices. In: Proceedings of the 8th International Conference on Social Media & Society (#SMSociety17), 1–10.
van Dijck J, Poell T and de Waal M (2018) The Platform Society: Public Values in a Connective World. New York, NY: Oxford University Press.
Velkova J and Kaun A (2019) Algorithmic resistance: Media practices and the politics of repair. Information,
Communication & Society 24(4): 523–540.
Ytre-Arne B and Das R (2020) Audiences’ communicative agency in a datafied age: Interpretative, relational and increasingly prospective. Communication Theory 0(C): 1–19.
Ytre-Arne B and Moe H (2021) Folk theories of algorithms: Understanding digital irritation. Media, Culture & Society 43(5): 807–824. https://doi.org/10.1177/0163443720972314
