“We must cease once and for all to describe the effects of power in negative terms: it 'excludes', it 'represses', it 'censors', it 'abstracts', it 'masks', it 'conceals'. In fact, power produces; it produces reality; it produces domains of objects and rituals of truth. The individual and the knowledge that may be gained of him belong to this production.” - Michel Foucault, Discipline and Punish
(This paper was written for a class, but it was one of my favorite assignments and pieces of writing I have done, so I decided to upload it here, works cited and all. I hope you enjoy, and I hope to return to writing more here now that school is almost over.)
The internet and the technologies produced alongside it signal the most significant change in media consumption since the widespread adoption of television in the late 20th century. Social media in particular is a radical new technology: for the first time in human history, ordinary people can both produce content and have it widely seen by others. The speed at which we can share information has upended traditional forms of news and information consumption. The convenience of social media has allowed many Americans to outsource their news consumption to online influencers, where misinformation is rampant. This revolution has created concern within our society about the value of the information we consume and how it impacts us. While many researchers have delved into the social networks and structures that spread misinformation online, there is a dearth of studies about how the form the content takes impacts our everyday beliefs. To fill this void, I focus on online Short-Form Video Content (SFVC) and how it radically impacts our ability to distinguish truth from inaccurate information. To do this, we will first examine the history of widespread Short-Form Video Content, then look at the ways SFVC hooks users through algorithms and the effects this has on our social interactions, and finally go over some strategies for combating the misinformation present online.
The internet did not begin in the early 2000s, but radical improvements in audio and visual technology at the time allowed it to take on a new form. Wei and Wang, writing in 2022 about the history of SFVC on social media apps, denote four distinct periods, the first beginning in the early 2000s. 2005 saw the creation of platforms like YouTube, where users could contribute content to the site on their own (Wei and Wang, 2022). For the first time, it became possible to record a video in your home and put it online to be viewed. These websites were a far cry from the apps we are familiar with today, with access limited to those who had personal computers. The next stage in the development of widespread Short-Form Video Content came in 2011 with the rise of smartphones and wireless technology, which allowed users to access this content from devices in their pockets (Wei and Wang, 2022). This period saw the first SFVC app, Viddy, launched in 2011, which allowed users to shoot a 15-second video and apply filters and edits to the content they produced (Wei and Wang, 2022). Twitter followed suit with the creation of Vine, and Snapchat added stories to its platform, which allowed users to upload SFVC that their friends could view. The third stage began in 2016, when these apps gained massive capital influxes from investors and competing apps developed similar technologies to stay relevant in the market (Wei and Wang, 2022). The last stage they note begins in 2018 and runs to the present day, characterized by the explosive popularity during the COVID pandemic of apps such as TikTok (Wei and Wang, 2022). It is in this last stage that we currently exist, where SFVC is the dominant mode of communication between individuals. Showcasing its dominance within the cultural zeitgeist, TikTok was the most downloaded app of 2024, and around 34 million videos are posted to it daily (Shepard, 2025). This explosion is no accident, as the social and technical aspects of the app provide the user with a positive experience, promoting future use and a potential for addiction (Zhang, 2019). This massive amount of content, coupled with users' positive experiences, has cemented TikTok and other SFVC apps squarely within American culture.
The content featured in these videos is as diverse as the user base that engages with these apps daily. Upon downloading an app, users can choose topics they are interested in, and since most SFVC lives on social media sites, they will more than likely follow their friends and the influencers they like. Apps like TikTok and Instagram have a home feed where users can scroll through a popular selection of content the algorithm puts in front of them. The algorithm provides constant access to SFVC on these sites, allowing users to scroll endlessly while consuming videos that interest them. This is a defining feature of SFVC: the user is not in control of the videos put before them. In an analysis of the Chinese social media app Douyin, which operates similarly to TikTok, researchers found that “the algorithm increasingly takes the dominating place in the participatory media platform” (Liang, 2022). This denotes a shift from the user-to-user interactions that once defined online life to a social media landscape dominated by algorithmic recommendations. Instead of thinking through and meaningfully engaging with their content, users often scroll and consume whatever the algorithm puts on their feed.
The people who design these apps and algorithms have modified their approach to better retain viewers. Companies like Netflix switched from algorithms that aided the user's quest to find information to algorithms that sought simply to capture a user's attention by providing content the user would watch (Seaver, 2018). While Netflix pioneered recommendations based on what viewers actually keep watching rather than on predictions of their enjoyment, the approach became even more potent on social media. Apps like YouTube, which became popular for their multi-minute videos, added a dedicated section for SFVC and are rewarding the creators who make it (Duffy, 2024). To be clear, when talking about short-form video content, we are focusing on content that is less than a minute in length. While YouTube and TikTok provide options to upload longer content, the algorithm usually does not recommend it, as longer videos struggle to retain the viewer for their full length. If the algorithm values user retention, it should recommend short content that users can watch in full before they lose interest. This might be why the average TikTok is 35 seconds long and the average YouTube Short hovers around the same length (Ceci, 2024; Caliskan, 2023).
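To make the contrast concrete, here is a minimal, purely hypothetical sketch in Python; the videos, field names, and scoring rules are my own invention for illustration, not the actual logic of Netflix, TikTok, or YouTube. It simply shows how ranking by average fraction watched, rather than by predicted enjoyment, favors the shortest clips.

```python
# Hypothetical illustration only: invented data and scoring, not any platform's real ranker.
videos = [
    {"id": "explainer", "length_s": 240, "predicted_rating": 4.6, "avg_fraction_watched": 0.35},
    {"id": "meme_clip", "length_s": 22,  "predicted_rating": 3.1, "avg_fraction_watched": 0.95},
    {"id": "news_take", "length_s": 45,  "predicted_rating": 3.8, "avg_fraction_watched": 0.80},
]

def rank_by_enjoyment(items):
    # Older recommender framing: surface what the user is predicted to rate highly.
    return sorted(items, key=lambda v: v["predicted_rating"], reverse=True)

def rank_by_retention(items):
    # Attention-economy framing: surface whatever keeps people watching to the end.
    return sorted(items, key=lambda v: v["avg_fraction_watched"], reverse=True)

print([v["id"] for v in rank_by_enjoyment(videos)])  # ['explainer', 'news_take', 'meme_clip']
print([v["id"] for v in rank_by_retention(videos)])  # ['meme_clip', 'news_take', 'explainer']
```

Under the retention objective, the 22-second clip rises to the top even though viewers would rate the four-minute explainer more highly, which is consistent with feeds skewing toward sub-minute videos.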
Higher retention leads to more time spent on the site, which means more ads can be shown to the user and more profit. There is a key difference here between SFVC and long-form video. SFVC is often consumed at the behest of an algorithm, appearing before users who did not choose to view it. Longer content on YouTube, for example, requires that a viewer remain interested in the same piece of content for some time, while SFVC allows the user to scroll rapidly and consume content that demands far less sustained attention. The most compelling sign of the growth of short-form video is the number of companies embracing SFVC to advertise their products and services. Ad agencies highlight the impact that SFVC and visual mediums have on consumers (Hubspot, 2025), and the average amount of time spent consuming SFVC is trending up (Ceci, 2024). The algorithms that promote SFVC are incredibly effective at capturing consumers' attention on social media platforms, and companies are embracing these strategies to garner viewers with little regard for the consequences for society.
The consequences of this blossoming SFVC for the American public take two significant forms. The first is more abstract, passive, and sinister. In a classic 1964 chapter from his book Understanding Media: The Extensions of Man, Marshall McLuhan argues that the medium of a message contains a message within itself, or, simplified, the medium is the message (McLuhan, 1964). He argues that the message of any medium is “the change of scale or pace or pattern that it introduces into human affairs” (McLuhan, 1964). This basic argument was applied and extended in another classic book, Neil Postman's Amusing Ourselves to Death. Writing about another drastic change in media habits, namely the newly established dominance of television, Postman laments print culture's decline in America. He points out that a radical shift in media forms “changes the structure of discourse; it does so by encouraging certain uses of intellect, by favoring certain definitions of intelligence and wisdom, and by demanding a certain kind of content–in a phrase, by creating new forms of truth-telling” (Postman, 1986). Many of Postman's critiques of TV apply to SFVC, and SFVC invites some powerful new critiques of its own.
Like TV, short-form video content is a rapid form of communication that does not prioritize abstract thought and deliberation. Unlike SFVC, however, television programs had to be consumed when they were on air; viewers did not have constant access to all shows at their fingertips. TV watchers made a conscious decision to consume shows they liked instead of being at the mercy of an algorithm that fed them captivating content. Finally, the pace of the digital age is radically faster than that of TV, and the messages of short-form content are far quicker. There is no time to think in the digital world: as soon as one piece of content ends, the user scrolls to the next. As far as forms of intellect are concerned, short-form video content usually appeals to the lowest common denominator so that it is easily digestible. Instead of worrying about the accuracy of the content created, the concern becomes what message viewers can grasp in a short time frame. Misinformation is free to spread like wildfire in this environment, often because consumers are not making a conscious choice about their consumption. They can take claims from the videos they consume and share them easily while avoiding in-depth critical thinking about their messages.
There are greater stakes than individual pieces of misinformation, as the nature of truth in a digital society undergoes heavy contestation. To avoid philosophical confusion, truth in this sense does not refer to an accurate understanding of factual events, but rather to the ideas that hold the most sway within a society because they are largely agreed upon and understood. To tie in the epigraph from Foucault’s Discipline and Punish, whatever message has the most power in a society is seen as the most true. For example, in a community whose standard value is that men should be the providers, the community will hold that as a “true” fact of life. SFVC dominates the media consumption of our society, and so it has the power to determine what we collectively believe. Worse yet, the truth that one user encounters online might look radically different from another's simply because of the variety of content the algorithm serves. No two users see the same content, so forming a social consensus from which to discuss ideas or current events becomes challenging. The result is not just inaccurate information but distinct differences in worldviews that emerge over time. One way of thinking about it is that we have been “hosted” by these sites and the content their algorithms recommend. To be hosted is to be “provided with conditions for existence that facilitate activity while constraining it” (Seaver, 2018). These sites allow us to interact digitally by consuming the SFVC they provide, and they are a powerful determinant of truth and knowledge. At the same time, the variance in content between users leads to less social interaction and more dependence on the algorithms to serve us content and think for us. We have outsourced our decisions about how to act to a medium that leaves no time for deliberation, and that can only spell disastrous consequences.
A second and more active harm these algorithms inflict on our society stems from the massive amount of content uploaded daily to these sites. While most of it is harmless, fringe and extremist groups have been able to use the internet to their advantage. At best, they blatantly spread misinformation online; at worst, they encourage their members to enact real-world violence on those who disagree with them. The algorithm plays a significant role in radicalizing online consumers; users might receive recommendations for increasingly extreme content as they interact with it. A review of extremism online found that four factors lead users down a path of extremism: interactivity, the ability of the user to interact with extreme content; demassification, the ability to receive messages targeted specifically at one's niche; hypertextuality, the ability to access similar content through links; and asynchronicity, the ability to access content at any time regardless of when it was published (Kuncoro and Hasanah, 2024). Users can discover and meaningfully participate in dangerous communities as they never could before. Tied to the above discussion of truth, these factors become more dangerous still: users might be pushed into an online ecosystem where the truth is defined by videos promoting violent and antisocial behavior.
Radicalization can happen without the user being consciously aware of it. For example, liking a seemingly silly video by an influencer like Andrew Tate or Adin Ross can lead users to be served increasingly anti-woman content. When the formation of ideological opinions depends on the algorithm, users can end up holding inconsistent ideological beliefs. Finlayson writes of the emerging extreme online ideologies: “What might have seemed historically, culturally and rationally distinct is bound by the force and fire of algorithmic, affective and aesthetic congruence” (Finlayson, 2021). The SFVC recommended to users presents them with a world inconsistent with their beliefs, but online that does not matter. These videos leave the user no time to think, so most participate online in a state of emotional reaction. Videos that blame women for the modern dating market resonate emotionally with many young men who might have faced rejection. These emotional ties and reactions, coupled with the algorithm, lead users to extreme places. The manipulation worsens when you consider who is producing SFVC online. Researchers have found that a small number of bots infiltrating a group can effectively suppress the quality of the group's information (Menczer and Hills, 2020). Since SFVC can be produced anonymously, it can be hard for users to tell whether an account is a bot. Groups can also boost the popularity of their posts through bot networks, leading to increased algorithmic recommendation and online presence. A hostile bot network can anonymously produce SFVC that plays on the emotions of social media users, leading them to dark places.
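To illustrate the boosting dynamic, here is a small toy example of my own (not the model from Menczer and Hills, and not any platform's real system): a feed that ranks posts purely by share count, and a handful of coordinated bot accounts that all share the same misleading clip.

```python
# Toy illustration only: invented posts and share counts, assuming a feed that
# ranks purely by total shares. Not the Menczer and Hills model or a real platform.
organic_shares = {"careful_report": 48, "decent_summary": 37, "misleading_clip": 15}

def ranked(shares):
    # Highest share count first, i.e. what the hypothetical feed surfaces most.
    return sorted(shares, key=shares.get, reverse=True)

print("organic ranking:    ", ranked(organic_shares))
# ['careful_report', 'decent_summary', 'misleading_clip']

# Forty bot accounts each share the misleading clip once.
boosted = dict(organic_shares)
boosted["misleading_clip"] += 40

print("bot-boosted ranking:", ranked(boosted))
# ['misleading_clip', 'careful_report', 'decent_summary']
```

A ranker that cannot tell bots from people promotes the boosted clip to the top of the feed, and from there the algorithm recommends it to users who never chose to see it.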
Because research into the form of SFVC as a threat to the quality of information online is sparse, solutions proposed by academics are rare. One paper, which studied the influence of short-form video content on college students in China, recommended that the government play an active role in ensuring that the content is appropriate (Yu, 2020). This solution is untenable in the United States, where government intervention on social media sites is frowned upon. To limit the influence SFVC has on their thinking, individuals should consider regulating their time spent on these apps or deleting them altogether. Actively engaging with and thinking about the short-form video content you consume also lessens its effects. While that works at the individual level, it can feel isolating in a society where SFVC has become so dominant. Individuals should also work to build offline communities that share information without relying on short-form video to do so. Hopefully, further research into this topic will lead to a better understanding of how short-form video content is radically altering our daily lives and will help find a way forward toward a well-informed and digitally literate society.
Works Cited
Caliskan, Mert. “YouTube Shorts Explained!” INFLOW Network, 26 Apr. 2023, inflownetwork.com/youtube-shorts-explained/.
Ceci, Laura. “TikTok Video Duration by Followers 2024.” Statista, 14 Aug. 2024, www.statista.com/statistics/1372569/tiktok-video-duration-by-number-of-views/.
Ceci, Laura. “U.S. Minutes Spent Watching Social Video 2028.” Statista, 9 Dec. 2024, www.statista.com/statistics/1349972/us-minutes-spent-daily-watching-social-video/.
Duffy, Clare. “Here’s Why YouTube Is Shelling out to Get Creators to Make Short Videos | CNN Business.” CNN, Cable News Network, 28 Mar. 2024, www.cnn.com/2024/03/28/tech/youtube-shorts-one-year-paying-creators/index.html.
Finlayson, Alan. “Neoliberalism, the alt-right and the intellectual dark web.” Theory, Culture & Society, vol. 38, no. 6, 6 Sept. 2021, pp. 167–190, https://doi.org/10.1177/02632764211036731.
Foucault, Michel. Discipline and Punish: The Birth of the Prison. Vintage Books, 1979.
Hubspot. “2025 State of Marketing Report.” HubSpot, 2025, www.hubspot.com/state-of-marketing.
Kuncoro, Hestutomo Restu, and Khuswatun Hasanah. “How social media algorithms potentially reinforce radical views.” Insignia: Journal of International Relations, vol. 11, no. 2, 12 Nov. 2024, p. 126, https://doi.org/10.20884/1.ins.2024.11.2.11505.
Liang, Meng. “The end of social media? How data attraction model in the algorithmic media reshapes the attention economy.” Media, Culture & Society, vol. 44, no. 6, 13 Mar. 2022, pp. 1110–1131, https://doi.org/10.1177/01634437221077168.
McLuhan, Marshall. Understanding Media: The Extensions of Man. Routledge, 1964.
Menczer, Filippo, and Thomas Hills. “The attention economy: Understanding how algorithms and manipulators exploit our cognitive vulnerabilities empowers us to fight back.” Scientific American, 20 Oct. 2020, https://warwick.ac.uk/fac/sci/psych/people/thills/thills/2020menczerhills2020.pdf. Accessed May 2025.
Postman, Neil. Amusing Ourselves to Death. Penguin, 1986.
Seaver, Nick. “Captivating algorithms: Recommender systems as traps.” Journal of Material Culture, vol. 24, no. 4, 29 Dec. 2018, pp. 421–436, https://doi.org/10.1177/1359183518820366.
Shepard, Jack. “25 Essential TikTok Statistics You Need to Know in 2025.” Social Shepard, 26 Mar. 2025, thesocialshepherd.com/blog/tiktok-statistics.
Wei, Tao, and Xiaohong Wang. “A historical review and theoretical mapping on short video studies 2005–2021.” Online Media and Global Communication, vol. 1, no. 2, 1 June 2022, pp. 247–286, https://doi.org/10.1515/omgc-2022-0040.
Yu, ChunMei. “Research on the innovation and integrated development of college ideological and political work based on short video recommendation model.” Journal of Physics: Conference Series, vol. 1533, no. 4, 1 Apr. 2020, p. 042038, https://doi.org/10.1088/1742-6596/1533/4/042038.
Zhang, Xing, et al. “Exploring short-form Video application addiction: Socio-technical and attachment perspectives.” Telematics and Informatics, vol. 42, Sept. 2019, p. 101243, https://doi.org/10.1016/j.tele.2019.101243.