TikTok has made a number of changes to its app that will make accounts for teenage users between the ages of 13 and 15 private by default, and tighten privacy protections for anyone under 18, the company said on Wednesday — a month after federal regulators ordered it to disclose how its practices affect children and adolescents.
TikTok has added a number of new privacy policies for underage users.
Accounts of users between the ages of 13 and 15 will be set to private by default starting Wednesday. This means that they will have to manually approve any new followers who can view their videos, the company said in a press release.
For these younger teen accounts, the “Suggest Your Account” setting is also disabled by default, and comments on their videos can only be made by people on their “Friends” list.
Users between the ages of 16 and 17 will see changes to Duet and Stitch – two features TikTokers can use to collaborate on or remix videos. Those functions are now limited to their friends by default.
The platform only allows downloads of videos created by users aged 16 and over, and for 16- and 17-year-olds the download setting is disabled by default. Videos created by users under 16 can no longer be downloaded at all, and direct messages and live streams are also restricted for this younger age group.
TikTok also announced a partnership with the nonprofit Common Sense Networks to provide guidance to parents on media content, focused on “TikTok for Younger Users,” the limited app experience for users under the age of 13.
In late 2019, amid concerns about the app’s impact on younger users, TikTok launched “TikTok for Younger Users,” a limited version of the app for children under 13. Similar to YouTube Kids, it offers stronger privacy protections and a curated library of video content deemed age-appropriate. TikTok’s new partner will help provide additional guidance on the nature of content for children under 13. Common Sense typically rates media such as movies and video games on parameters like violence, drug use, language, and positive messages, then assigns an appropriate age rating.
Last month, the U.S. Federal Trade Commission asked ByteDance, the parent company of TikTok, along with several other social media companies including Facebook and Twitter, to provide detailed information on how they collect and use consumers’ personal information and how their practices affect children and adolescents. The companies had 45 days – ending early next month – to respond to the orders, which can then be used to create guidelines or recommend legislation. Like Facebook and Twitter, TikTok asks users to provide their date of birth when signing up; however, like its competitors, it has no way of verifying that information. Last July, the U.S. Department of Justice began investigating whether TikTok had violated a 2019 settlement intended to protect children’s privacy. The platform was also threatened with a ban unless ByteDance sold it to a U.S. company, which it has not yet done as the matter remains in court.
TikTok Makes Young Teenagers’ Accounts More Private By Default (The Verge)