Are TikTok’s new settings enough to keep kids safe?

TikTok, the now widely popular social media platform that allows users to create, share, and discover amateur short clips—usually something akin to music videos—has enjoyed explosive growth since it appeared in 2017, and that growth has only accelerated during the current pandemic. Although the latest statistics continue to show that in the US the single biggest age group (32.5 percent, at the time of writing) is users between 10 and 19 years of age, older users (aged 25 to 34 years) in countries like China, Indonesia, Malaysia, Saudi Arabia, and the UAE are quickly overtaking their younger counterparts.

Suffice it to say, we can no longer categorize TikTok as a “kids’ app”.

This, of course, reinforces the many concerns parents already have about the app. We’re not even talking about the possibility of young children, tweens, and teens seeing dangerous challenges and trends, pre-teens lip-synching to songs that make grown-up eyes go wide, or other generally inappropriate content. We’re talking about potential predators befriending your child, cyberbullies who are capable of following targeted kids from one social media platform to another, and a stream of unrestricted content from users they don’t even follow, or aren’t even friends with.

Limitations and guardrails

Eric Han, TikTok’s Head of Safety in the US, announced last week that all registered accounts of users aged 13 to 15 have been set to private. This means that anyone who wants to follow those accounts must be approved first, before they can see that user’s videos. It’s a way for TikTok to give young teens an opportunity to make informed choices about who they welcome into their account.

Furthermore, TikTok will be rolling out more changes and adjustments, such as:

  • Limitations to video commenting. Users within this age group will be able to decide whether they want their friends, or no one, to comment on their videos. Currently, anyone can comment by default.
  • Limitations to the availability of Duet and Stitch. In September last year, TikTok introduced these two editing tools, which are available only for content created by users aged 16 and above. For 16- to 17-year-old users, the default setting for who can use their video clips is Friends.
  • Limitations to video downloads. Only videos created by users aged 16 and above can be downloaded within TikTok’s app. Downloads are turned off by default for videos created by 16- to 17-year-old users, but they have the option to enable them.

  • Limitations to suggested accounts. Accounts belonging to users under 16 will no longer be suggested to other users.
  • Limitations to direct messaging and live streaming. Users under 16 are not allowed to live stream, and can’t be messaged privately by anyone.
  • Limitations to virtual gifting. Only users who are 18 and over can purchase, send, and receive virtual gifts.

Growing pains

This isn’t the first time TikTok has tried to prove that it’s serious about making and implementing changes for the benefit of its user base. Here is a rundown of the social media platform’s security and privacy milestones and challenges over the past couple of years.

  • After reaching a $5.7 million settlement with the Federal Trade Commission (FTC) in 2019 for violating the Children’s Online Privacy Protection Act by failing to seek parental consent for users under the age of 13, TikTok set out to delete the profiles of users in that age bracket.
  • In April 2019, TikTok introduced account linking for parents and guardians. Called Family Pairing, the feature lets responsible grown-ups connect their TikTok accounts with their teen’s, enabling them to remotely modify that account’s settings.
  • In December 2019, TikTok teamed up with the Family Online Safety Institute (FOSI) to host internet safety seminars. The aim was “to help parents better understand the tools and controls they have to navigate the digital environment and the resources FOSI offers through its Good Digital Parenting initiative.”
  • In January 2020, TikTok updated its community guidelines to clarify how it moderates harmful or unsafe content. It said it wanted to “maintain a supportive and welcoming environment”, so that “users feel comfortable expressing themselves openly”.
  • In February 2020, the company partnered with popular content creators in the US to create videos reminding users to, essentially, stop scrolling and take a break—in true TikTok fashion. This is part of its “You’re in Control” initiative, a user-centric series of videos that informs users of TikTok’s “safety features and best practices”.
  • At the same time, TikTok was also trying to curb online misinformation (which is rampant on social media platforms) by working with third-party fact-checking and media literacy organizations, such as the Poynter Institute.

Are TikTok’s changes enough?

Tools provided by social media platforms like TikTok can be helpful, but these companies can only do so much for their users. Parents and guardians should never expect their child’s favorite social network to do all the heavy lifting when it comes to keeping young users safe. More than anything, grown-ups should be involved in their children’s digital lives, not just as observers but as active participants in one form or another.

There is no substitute for educating yourself about social media. Look into the pros and cons of using it, and then educate your kids about it.

Tell them it’s okay to say “no” and not follow the herd, and that although something may look fun and cool, they should stop and think about it first before reacting (or doing).

Everything starts in the home, and choosing security and privacy is no different. You, not those default settings, are your child’s first line of defense. So take up that mantle.