In an era dominated by algorithmic culture, the promise of personalized experiences has become increasingly prevalent. From social media feeds to music streaming platforms, algorithms curate content based on user data, ostensibly to enhance user satisfaction and engagement. However, this curated reality raises critical questions about user agency, the nature of “fun,” and the subtle coercion embedded within these seemingly benign systems. The recent executive order attempting to ban TikTok in the U.S. brought these concerns into sharp focus, not just regarding data privacy, but more fundamentally, about the influence of algorithms themselves.
Tech analyst Ben Thompson, in his analysis of the TikTok situation, pinpointed the algorithm as the primary concern. It’s not merely data collection, but the way TikTok’s algorithm shapes user feeds without explicit user direction. This system infers user preferences from viewing habits and engagement, creating a feedback loop designed to maximize platform usage. TikTok’s own explanation emphasizes this personalized, joy-inducing aspect, suggesting a decluttered media consumption experience where “every video sparks joy.” But this raises a crucial question: is this algorithmic curation simply delivering personalized entertainment, or does it delve into more manipulative territory?
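To make the mechanism concrete, here is a toy sketch (in Python, with invented topics and numbers; nothing here reflects TikTok’s actual system) of the feedback loop described above: watch time alone updates an inferred interest profile, and that profile re-weights what gets served next.

```python
# Toy sketch of the engagement feedback loop: purely illustrative,
# not TikTok's actual system. Topics, weights, and numbers are invented.
import random
from collections import defaultdict

VIDEOS = [
    {"id": 1, "topic": "dance"}, {"id": 2, "topic": "cooking"},
    {"id": 3, "topic": "dance"}, {"id": 4, "topic": "pets"},
    {"id": 5, "topic": "cooking"}, {"id": 6, "topic": "pets"},
]

# Inferred interest per topic; note there is no explicit user direction.
interest = defaultdict(lambda: 1.0)

def next_video():
    # Serve videos in proportion to inferred interest: the more a topic
    # is watched, the more of it appears -- the loop that maximizes usage.
    weights = [interest[v["topic"]] for v in VIDEOS]
    return random.choices(VIDEOS, weights=weights, k=1)[0]

def record_watch(video, completion):
    # completion in [0, 1]: fraction of the clip actually watched.
    # Passive viewing is the only signal; no like or follow is required.
    interest[video["topic"]] += completion

# Simulate a user who lingers on dance clips and skips everything else.
for _ in range(50):
    v = next_video()
    completion = random.uniform(0.8, 1.0) if v["topic"] == "dance" else random.uniform(0.0, 0.2)
    record_watch(v, completion)

print(dict(interest))  # "dance" dominates without a single stated preference
```

The point of the sketch is how little the user has to do: the profile converges on a preference the user never articulated, which is exactly the inference-from-viewing-habits dynamic at issue.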
The notion of algorithmic manipulation isn’t far-fetched. Paul Dabrowa, an AI and social media expert, argues that TikTok is a “program that develops predictive behavioral models” using extensive user data. His concern extends beyond mere entertainment, suggesting a potential for “brainwashing.” Unlike platforms based on social networks, TikTok’s AI builds behavioral profiles to populate feeds, even predicting desired social connections. Dabrowa posits a scenario where initial positive content subtly transitions into propaganda, leveraging positive feedback loops to condition users, drawing a disturbing parallel to dog training. While Thompson doesn’t explicitly endorse this extreme view, he does caution about TikTok’s potential as a propaganda tool, its algorithm capable of promoting content unchecked by social or professional filters.
[Image: artwork by Tyler Spangler]

The anxieties surrounding TikTok might seem like xenophobic paranoia, particularly when framed as preventing the spread of “Marxism.” However, the underlying concern about algorithmic conditioning remains pertinent across various platforms. Eugene Wei, another tech analyst, highlights the “eerily perceptive” nature of TikTok’s algorithm, emphasizing its efficiency in building user interest profiles through passive consumption. The short-form video format accelerates data collection, making the “training process” feel effortless, even enjoyable. This “passive personalization” is a double-edged sword. It caters to the user, creating a sense of individual recognition, while simultaneously shaping the user within the algorithmic environment. Wei aptly notes, “When you gaze into TikTok, TikTok gazes into you,” underscoring this reciprocal shaping.
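Wei’s point about efficiency can be put in rough arithmetic. A back-of-envelope sketch, using illustrative clip lengths rather than any platform’s actual data: if each clip watched to the end or swiped away counts as one labeled preference sample, short-form video collects far more training data per hour of viewing.

```python
# Back-of-envelope arithmetic with assumed, illustrative item lengths:
# each item watched or skipped is treated as one preference signal.
def signals_per_hour(avg_item_seconds):
    return 3600 / avg_item_seconds

for label, seconds in [("~20s short-form clip", 20),
                       ("~8 min video", 480),
                       ("~40 min episode", 2400)]:
    print(f"{label}: ~{signals_per_hour(seconds):.0f} preference signals/hour")
# ~20s short-form clip: ~180 preference signals/hour
# ~8 min video: ~8 preference signals/hour
# ~40 min episode: ~2 preference signals/hour
```

Two orders of magnitude more signal per hour is what makes the “training process” feel effortless: the data collection is simply the act of watching.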
This user-algorithm dynamic transcends simple programming or service delivery. It operates through the tension between activity and passivity: algorithmic control is experienced as liberation, and censorship coexists with hypervisibility. The algorithm’s attempt to define us paradoxically heightens our awareness of individuality, albeit within the confines of the platform.
The primary imperative for platforms like TikTok is user retention. This goes beyond simply meeting existing demand; it involves actively creating an audience receptive to the platform’s offerings. This audience creation is a form of subtle “mind control,” akin to the broader seduction of consumerism. The link between our desires and our online viewing habits is inherently weak; algorithms like TikTok are designed to manage this gap, appearing to bridge it while actually maintaining it.
The Coercion of “Fun” in Algorithmic Culture
The pervasive description of platforms like TikTok as “fun” warrants closer scrutiny. The concept of “fun,” as articulated by Baudrillard and interpreted by critics like Sianne Ngai, transcends mere enjoyment. It becomes a “fun morality,” a compulsory enjoyment within consumer society. “Fun” becomes an aesthetic category, reflecting how capitalism shapes our capacity for pleasure. It marks the extent to which consumerism has subsumed our ability to experience genuine joy, effectively producing us as the ideal audience.
“Fun” often aligns with the “experience economy,” blurring retail with tourism and emphasizing manufactured “authenticity.” It extends to media “experiences,” delivered in endless feeds. Pleasure becomes commodified, abstracted from its source, and presented as a prefabricated chunk of attention to fill a presumed void of boredom. Boredom itself becomes a prerequisite for “fun.”
Günther Anders, in his 1956 essay, “The World as Phantom and as Matrix,” connects “fun” to a disavowal of cultural indoctrination. He argues that the most effective depersonalization methods are those that appear to preserve individual freedom. Conditioning, disguised as “fun” and delivered privately within homes through media like radio and television, becomes incredibly potent. Users are not told they are sacrificing anything; the illusion of privacy reinforces the conditioning. The home, once a sanctuary, becomes valuable to the “caterers” of media, the providers of our “daily fare.”
Anders’s “conditioning” isn’t about specific content, but the continuous flow of programming and the individualized sense of control derived from private viewing. Wei’s observation that users “enjoy” TikTok’s algorithmic training aligns with this. Conditioning happens algorithmically, individually, within the privacy of a phone. Users are encouraged to perceive this as self-directed, believing they are shaping their own content consumption – “I’m programming the programmers!” However, the platform owners, the “caterers” of content, ultimately capture data and time, capturing not minds, but will.
The Broadcast Mentality and Algorithmic Self-Perception
Anders sees video consumption as creating a passive self. Social media, where users are both broadcasters and audience, doesn’t fundamentally alter this. “Broadcasting” the self becomes less about expression and more about engaging with oneself as a media object. Attention shifts from a condition of communication to its product. The “broadcast mentality” isn’t about audience size, but the ability to be one’s own audience.
Algorithms intensify this self-as-media-object dynamic. Users interact with themselves as defined by recommendations. The stream of “For You” videos projects a forward-moving self, simultaneously erasing past experiences. TikTok allows users to see themselves reflected in the interface itself. While videos vary in topic, the sequence becomes inherently about the user.
Anders argues that mass consumption is a “sum of solo performances,” where individual consumers are “unpaid homeworkers” producing the “mass man.” While consuming alone, individuals collectively create the idea of “the audience,” shaping their perception of content and their place within it. Deleuze, in “Postscript on the Societies of Control,” updates this, suggesting we’ve moved beyond “mass/individual” to “dividuals” and data samples. The “production of mass man” evolves into the production of the “dividual.” Individual actions on apps aggregate into a new, disseminated selfhood.
The Scarcity of Scarcity and Algorithmic Desire
Eric Drott’s analysis of music streaming recommendation algorithms in “Why the Next Song Matters” illuminates this process further. Streaming services initially marketed their infinite databases – “listen to whatever you want!” This shifted to promising the “perfect song” – “here’s something you didn’t know you could want this much!” This parallels TikTok’s promise of desired content. Streaming platforms, Drott argues, sell the fulfillment of a lack of lack, solving the “scarcity of scarcity” created by digital abundance. This “consumer surplus” necessitates algorithmic spoon-feeding of “good” content.
Choice overload becomes a genuine problem, particularly when traditional frameworks for taste are eroded. Musical taste isn’t innate; it’s socially and culturally constructed. Novelty itself is a learned behavior, linked to economic and social rewards. This context makes systems that “desire for us” appealing. Algorithms don’t reflect existing wants; they instill new ones. Platforms like TikTok and Spotify become places to turn when one wants to want something. Algorithms create a sense of “satisfaction” without prior deprivation, eliminating the effort of focused desire. Desire becomes a readily available consumer good, enjoyed on demand. Consuming algorithmic feeds positions users as constantly “winning at wanting.”
Drott notes a nostalgia for curators who “knew better,” like record store clerks or musically knowledgeable older siblings. Streaming services aim to replace these figures with algorithms, presented as superior servants that organize pre-existing proclivities.
The marketing shift in streaming services reveals a “bait and switch”: users sought more music, but received algorithmic surveillance. “Recasting abundance as scarcity is performative,” Drott argues, fabricating a need it pretends to discover. This applies to algorithmic “discovery” in general – producing recognizable desires to stabilize a sense of self with legible tastes. “The gap between individual and the subject position… appears to collapse,” Drott explains, “the streaming service apparently interpellates us as ourselves and as nothing else.” I am music and I write the songs.
Algorithmic curation sells not just music, but “subject positions,” relieving users of the “burden of having to fabricate such subjectivities themselves.” The “range” of positions highlights the instability of self fostered by recommendations. Algorithms promise an accessible self while assuming users are alienated from their own desires, needing algorithmic prosthetics.
The Capitalist Imperative of Desire and Deskilled Consumption
Why would individuals seek algorithmic self-fashioning? This relates to capitalism’s crisis of overproduction: realizing profits requires constant consumer demand. “Without the production of desire, there is no continuous self-transformation of capital,” Drott notes, referencing Marx. Capitalism necessitates ceaseless expansion of consumer demand, translating into individual compulsion to want. Consumerism isn’t “natural”; it’s a systemic expression of banishing subsistence. It rationalizes inequality by portraying consumers as liberated and autonomous, compelled to constantly desire more. Consumerism becomes “hard work,” reproducing desire and keeping up with perceived desirability.
Alienation from desire is a socioeconomic condition, reflected in culturally available desires and “fun,” all built on contradictions. Capitalist culture interpellates us as both insatiable and easily satisfied by commodities like TikTok videos. This contradiction intensifies the need to consume more. If TikTok fosters “Marxism,” it’s by exposing the intolerable logic of capital, not CCP propaganda.
Managing consumer demand requires deskilled consumption. We unlearn self-satisfaction on our own terms. Fashion, valorizing change for change’s sake, exemplifies this. Deskilling isn’t entirely top-down; it’s effectuated by practices perceived as “fun,” “convenient,” or “trending.”
Anders’s “unpaid workers” producing the “mass man” anticipates Stiegler’s concept of consumer “proletarianization.” 19th-century capitalism proletarianized production labor; 20th-century capitalism proletarianizes consumers, deskilling desire into abstract libidinal energy. Stiegler links this to media consumption. Video synchronizes viewers’ experience of time to the interface’s rhythm, transforming “relations of consumption.” This “synchronization of consciousness” erodes individuation, echoing Anders’s “proud of being a nobody.”
Animalization and Database Consumption
Hiroki Azuma, in Otaku: Japan’s Database Animals, also addresses this deskilling, using Kojève’s Hegel interpretation to argue postmodern consumers are “animalized.” Human desire, unlike animal wants, involves sublimation and higher purpose. Human desire is “desire for the desire of the other.” When consumers find satisfaction without this social element, they are “animalized,” plugged into stimulus-response circuits, experiencing pleasure devoid of social communication. This is not mere conditioning, but “blue-pilling” oneself, choosing immediate gratification over collective meaning. It manifests in “convenient” consumerism that bypasses social friction and algorithms that replace social communication. Algorithms provide “the desire of the other” without reciprocal interaction. Azuma sees this as societal “animalization.”
Algorithms replace social interaction with data processing. Others’ actions are presented as content we “should” want. Stiegler views this content as mediatized “exteriorization of memory and knowledge,” controlling consumers through cognitive and cultural industries. Media products replace lived experience, reprogramming memory and pleasure, standardizing and “grammatizing” them, hindering individuation. This leads to “systemic stupidity,” or what we call “fun.”
In Azuma’s terms, “systemic stupidity” is animalized consumption detached from grand narratives, driven by compulsive fulfillment of base needs. He sees otaku as emblematic consumers, detaching “form” from “content” to confirm the self as a “pure idle spectator.” They consume endless trope configurations from emotional databases, foreshadowing porn sites, Pandora, Netflix, and TikTok’s content tagging.
Algorithmic culture allows experiencing the self as “pure form,” atomized in a networked society, becoming a “pure idle spectator” consuming the self, not social experience or reality. The feed is “just for you,” reinforcing a closed loop of self-centrality.
Proletarianization solves consumer demand by stripping “transindividuality,” rendering individuals as atomized, affectively waned, and susceptible to perpetually induced desire. Consumers resolve overconsumption by obsessively adopting new technologies, creating a “crisis of subjectivity,” a breakdown of individuation and responsibility. Emotional activities become “processed” nonsocially, in solitude, “animalistically,” in a postmodern “database-model society” lacking grand empathy. TikTok videos can be seen as tools for this “animalistic ‘processing.'”
Endless consumption dissolves the self, but algorithms become both the disease and the cure, externally reconstituting the self and solving the “crisis of subjectivity” they facilitate. Algorithms teach “fun,” reinforcing lessons from other entertainment forms, not through content affiliation or subcultural branding, but through a time orientation. Time becomes something to be “consumed” to realize recognizable interests and produce a “self” through consumption. The specific content is secondary to the continuous algorithmic recommendation.
Algorithms teach us to locate ourselves not in social relations, but in “next-ness,” a foreshortened self-horizon. Within this narrow timeframe, the self dissolves into subject positions offered by streaming services, broken down into configurable subcomponents. “You want to be the song that you hear in your head,” as Bono says.
The “normative listener” lacks a fixed identity, unlike subcultural identities of the late 20th century. Tastes “cohere… by virtue of a steadfast refusal of any positive principle of coherence,” fluctuating with context and affect. The “mass man’s” conformity lies not in shared tastes, but in being broken into configurable subcomponents, identity signifiers voiced by feeds. This is “database consumption”—not choosing from options, but living as part of a spreadsheet, algorithmically processed in interchangeable cells of content.
Algorithms serve capitalist consumer production, instilling desire for standardized culture. Isolation and control are achieved by granting permission to consume once affordability is no longer a barrier. Through algorithmically selected media, users internalize patterns that prize convenience and efficiency and condemn interpersonal complexity. Media commodify emotional experience, decontextualizing it from social relations and inducing a compulsive passivity that simulates autonomy without responsibility.
The Gimmick of Algorithmic Personalization
While critiques of “animalization” and “proletarianization” are insightful, their terms can be alienating. Ngai’s Theory of the Gimmick offers a less moralizing perspective on deskilling consumer demand, focusing on pervasive ambivalence. In a world engineered to appeal to pre-shaped desires, “how can there not be… uncertainty at the heart of the aesthetic evaluations through which we process the pleasures we take in it?” This uncertainty is the “gimmick,” not just novelty, but how commodities seem to “lie.”
“All subjects in capitalism find something gimmicky.” Responses to “why” something is gimmicky are similar: “trying too hard,” “not working hard enough,” “unconvincing promises,” “instructing me how to consume.” Gimmicks seem like cheats or shortcuts, exploitative yet also clever “hacks.” They foreground obviousness, demanding judgment – “a judgment in which skepticism and enjoyment coincide.” Judging something a gimmick completes, not negates it. It incites reaction. Calling something a gimmick communicates “falseness of a thing’s promises… without disavowing their appeal or social effectivity.”
Tech platforms heavily rely on gimmicks, evident in memes, interface features, and algorithmic feeds. Algorithms “try too hard,” make mistakes, dictate consumption, flaunt labor-saving capacity – instilling ambivalence indexing our dependency. Algorithmic feeds offer a gimmicky self, saving the “labor” of self-creation through media consumption – a false identity revealing both surveillance and “hidden truth” under capitalism.
Like marketing, gimmicks let us “see through” them, feeling superior, yet lodging their framing in our consciousness. We simultaneously believe and disbelieve. Gimmicks allow holding contradictions, examining and disavowing capitalist desires, displacing induced desire onto gimmicks, enjoyed through debunking and enchantment.
“The conjoining of enigma and transparency in the gimmick points to a key shift in… illusions,” Ngai notes, applying to algorithms. It reflects “our simultaneous recognition of what we can but also cannot grasp about a productive process… as well as a double-sided gestalt: ‘work’ conjoined to an equivocal ‘zero’ or disappearance of work.” We understand algorithms generally, know they can’t “really” define us, yet find ourselves trapped, predicted and controlled. They provide a proxy for “desire of the desire of the other,” distorting sociality – social distancing without social connection. Within algorithmic media, we perceive others, even “connect,” but veiled by perpetual processing. All we know of others’ desires is a faint self-reflection, limited by that horizon.
Drott argues streaming companies invest in predictive analytics for business reproduction, not user-music matching. Whether listeners are truly “taken in” remains open, but platforms strive to “fabricate a desire” for their curation. TikTok seems to succeed, selling waning affect as a service, aiming to be the “master gimmick,” resolving capitalist desire’s ambivalence into a ceaseless stream of subjectivizing content. The next video, in this algorithmic prophecy, will reveal who you are, or perhaps who you will become.