TikTok hasn’t exactly had the best year.
Near the end of 2018, the ByteDance-owned company beat YouTube, Facebook, and Instagram in total mobile downloads. In the U.S. alone, downloads of the short-form video-sharing app grew 25%.
It seemed things would get better for TikTok in 2019. At the start of the year, the company’s ‘lite’ mobile app quietly reached 12 million installs.
Then, things quickly went south.
The Federal Trade Commission (FTC) fined the app’s parent company $5.7 million. In a complaint filed on its behalf by the Department of Justice (DOJ), the FTC found that TikTok and ByteDance had knowingly violated the Children’s Online Privacy Protection Act (COPPA).
In the U.S., apps and websites directed at children under 13 must obtain parental consent before collecting or sharing their personal information.
Speaking about the violation, Joe Simons, Chairman of the FTC, explained,
“The operators of Musical.ly — now known as TikTok — knew many children were using the app but they still failed to seek parental consent before collecting names, email addresses, and other personal information from users under the age of 13.”
Soon thereafter, lawmakers in India called for a ban on TikTok. The app had apparently caused “cultural degradation.” People in the country had used the app to bully others, leading to a jump in suicides.
One lawmaker, Tamimum Ansari, took the issue to the Tamil Nadu Legislative Assembly.
Slamming content readily found in the app, he wrote,
“I raised an issue forwarded to me by community welfare workers that the mobile application (TikTok) was acting as a platform for heated debates inimical to law and order, and sharing of sexually-explicit material.”
In April, the country banned the app. The move followed the death of a college student as well as a suicide.
Last October, a 24-year-old took his own life. V. Kalaiyarasan had used TikTok to post videos of himself dressed in drag. Bullied and ridiculed by strangers, and even by his own friends, Kalaiyarasan jumped in front of a train. Then, in February, a college student died while filming a TikTok video on a motorcycle. The driver lost control and rammed into the back of a bus; the student’s two friends were severely injured.
Following these and other controversies, the Madras High Court in India ordered Google and Apple to remove TikTok from their respective app stores. The court also said the app encouraged pornography and other illicit content.
According to ByteDance, India’s ban resulted in “financial losses” of up to $500,000 a day. The company pointed to a serious drop in the value of its investments, as well as the loss of commercial revenue.
Weeks later, the Madras High Court reversed the ban.
Now, after two months, TikTok and its parent company face another high-profile investigation.
Why hasn’t ByteDance done more to protect underage users?
Speaking with The Guardian, UK Information Commissioner Elizabeth Denham confirmed authorities have launched an “active investigation” into TikTok.
She added that the investigation began in February, following the FTC’s fine against the social media app.
Stating that the UK will now examine whether the company violated child protection laws, Denham explained,
“We’re looking at the transparency tools for children [and] looking at the messaging system, which is completely open, we’re looking at the kind of videos that are collected and shared by children online. We do have an active investigation into TikTok right now, so watch this space.”
The commissioner added that ByteDance may also have violated the EU’s General Data Protection Regulation (GDPR). According to Denham, the law “requires the company to provide different services and different protections for children.”
Acknowledging the investigation but remaining silent on the specifics, a TikTok representative merely said,
“We cooperate with organizations such as the [Information Commissioner’s Office] to provide relevant information about our product to support their work. Ensuring data protection principles are upheld is a top priority for TikTok.”
Featured image by TikTok.