TikTok is one of the most popular and fastest-growing apps in the U.S. and worldwide. It boasts millions of users, including A-list celebrities. Despite its great success, its content is not well regulated: everything from gory videos to sexually explicit content to misinformation can pop up on the app’s home feed.
This is a problem because the videos are often unsettling and traumatizing. And this is not the first time an app has made it difficult for users to control the damaging content they see.
In 2013, Vine could be found in any app store, letting users produce six-second videos. In 2016, however, the app was shut down because of disorganization and minimal control over what creators were posting. Bottom line: the company could not manage its app or monitor the content its users produced.
Soon after, in 2016, TikTok was launched by China’s ByteDance. In 2018 its popularity grew in the U.S., and now it is one of the most-used apps among teenagers around the world. TikTok is similar to Vine in that users create short 5-to-30-second videos, but it is also similar in being disorganized and unmoderated.
So why is TikTok still being used when Vine was shut down?
TikTok was created at the perfect moment. By 2016, my generation was neck-deep in social media. If you did not have social media, you did not exist.
When I first downloaded TikTok in 2018, it was embarrassing to make videos; people posted them mostly for laughs. But in 2019, teen influencers began to populate the app. Influencers are users who are typically good-looking, do something unusual, or are talented dancers. Without Vine, there was a gap in the social media market for short video content, though no one predicted TikTok would become what it is today.
A Faulty System
While the app grew around teens making content, millions of users can post hateful and inappropriate material with minimal consequences. TikTok claims to monitor and remove explicit content, yet my own experience suggests this process is unreliable. For example, I once posted a video of my friends and me dancing, and it was taken down as a “violation of community guidelines.” The guidelines prohibit nudity, hate speech, threats, and the like; our video contained none of these.
Another example comes from December 2020, when I participated in a TikTok trend where people posted selfies from the past and present. My selfies looked like school photos, so nothing violated the community rules, and yet the post was taken down. As a final example, I participated in the Corona-cation trend with a video of my friends knitting, riding bikes, and skateboarding; that video was removed as well.
For me, this raises two pressing questions. First, why is appropriate content being taken down when there is so much graphic content on the platform? And second, who do we believe is ultimately responsible for content control?
In response to my first question, I think the app privileges users who are registered, legal adults. Why do I think this? Because after I turned 18, TikTok stopped taking down that kind of content. This means that innocent, fun content made by teens for other teens is being removed while more explicit content remains on the app. While I cannot prove it, my own experience points that way.
As for my second concern: well, you can’t control people who do not want to be controlled. But you can silence them or make them face consequences for what they say.
Apps like Instagram regularly remove content that does not meet their standards, which means the company takes some responsibility for what users see. TikTok, on the other hand, offers a “Restricted Mode” that shifts the burden onto the user, and it does a poor job of monitoring its community’s behavior.
For example, influencers such as Jojo Siwa and Charli D’Amelio are trying to make a positive impact on the younger generation. The content they create is family-friendly and offers kids good role models. With millions of followers, these users should be what appears on a child’s feed rather than gory videos. So why isn’t that the case?
It’s simple: Hashtags.
TikTok’s algorithm collects user data through two main signals: hashtags and sounds. If a user likes a video with a certain hashtag or sound, more videos using the same hashtag or sound will appear on their feed. Creators who want more views often attach a popular trending sound or hashtag, because the more people view and use one, the more attention the videos under it get. Since these sounds and hashtags are trending, influencers use them the most. But when children like these videos, the algorithm can redirect them to something entirely different.
Very often, inappropriate and gory videos will be posted under a popular sound or hashtag. If a user likes videos under trending hashtags or sounds, unmoderated videos with these hashtags and sounds will inevitably show up on their feed.
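To make the feedback loop concrete, here is a toy sketch of how a tag-overlap recommender like the one described above might work. This is purely illustrative; TikTok’s actual system is not public, and every name and data point below is made up.

```python
# Toy sketch (NOT TikTok's real algorithm): rank candidate videos by how
# many hashtags/sounds they share with videos the user has already liked.
from collections import Counter

def build_interest_profile(liked_videos):
    """Count how often each hashtag/sound appears in the user's likes."""
    profile = Counter()
    for video in liked_videos:
        profile.update(video["tags"])
    return profile

def rank_feed(candidates, profile):
    """Sort candidates by total overlap with the user's interest profile."""
    def score(video):
        return sum(profile[tag] for tag in video["tags"])
    return sorted(candidates, key=score, reverse=True)

# Hypothetical data: a child who likes dance videos under a trending sound.
liked = [
    {"id": 1, "tags": {"#dance", "trending_sound_1"}},
    {"id": 2, "tags": {"#dance", "#comedy"}},
]
candidates = [
    {"id": 3, "tags": {"#cooking"}},
    {"id": 4, "tags": {"#dance", "trending_sound_1"}},  # reuses the liked tags
]
profile = build_interest_profile(liked)
feed = rank_feed(candidates, profile)
# Video 4 ranks first because it reuses the liked hashtags and sound.
```

The point of the sketch is the vulnerability: the ranker only sees tags, not content. Any video, however inappropriate, that borrows a trending hashtag or sound inherits the reach of everything else under that tag.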
So what do we do?
Well, for starters, I do not think TikTok should shut down.
When Vine shut down, our generation was devastated. All our humor, inside jokes, and pop culture came from Vine, and the same goes for TikTok. With 80 million users, if TikTok were shut down, I’m sure there would be quite a commotion within our generation. I think TikTok is aware of its pop-culture influence, and that is one of the main reasons it has not been shut down yet. But TikTok must begin to moderate the quality of its content.
So what do I mean by this?
The goal of a video should be to make someone laugh or to make a positive impact on their life, not to traumatize users with sexually explicit content and graphic violence, especially when those users are often children. For example, I know a child who, at 10 years old, saw videos on TikTok that taught them how to curse, exposed them to nudity, and normalized drug use. The content affected the child’s behavior, leading them to make light of highly dangerous situations. At one point the child was stalked by someone who sent sexually explicit comments, many of which the child did not even understand. The stalker eventually sent a sexually explicit image; the child showed it to their parents, who ended the situation.
Regulating TikTok isn’t all up to the government; it’s up to us, the users.
As creators, no matter how big or small our presence is on the app, it is our generation’s job to make sure that users, especially children, are safe. This means parents need to seriously consider what they are giving their children access to when they give them a phone. It is not the government’s job, or a company's job, to protect children from every single thing. Of course, we need people in charge to agree on what is good or right for everyone’s safety, and we need support enforcing it. But if we only expect others to take responsibility, then how can we expect real change?
Change begins with the self.
Make content that spreads positivity and laughter, not sadness and hate. Make content that teaches something new, not content that brings someone down. To create a community is to join together under one cause. That means we need to raise our voices about the misuse of hashtags and bring awareness to how many underage or unsupervised users are on TikTok. When people want to be part of a community, they should be able to trust it: they should be able to trust that a family-friendly hashtag isn’t hiding explicit content waiting to traumatize them. We can change this.
TikTok’s unique community is what makes it so great. We laugh together, we cry together, and we come together, all through this app’s content. If we want a happy and safe TikTok community, we must begin to educate ourselves about the kinds of videos creators are putting out there. TikTok should be a space for everybody, and we must make our voices heard to see change. If we can come together as a community, then maybe the standards will be raised for everyone’s psychological benefit.