FTC Hits Company Once Known as Musical.ly With Record Fine Over Children’s Privacy


When children from across the country were signing up in huge numbers for an app to create their own lip-syncing and dance videos, they were also being asked to turn over personal information without their parents knowing about it, in violation of the law, according to federal officials.

Now the operator of that social-networking platform, TikTok—formerly known as Musical.ly—is paying a record amount to settle a complaint brought by a consumer-protection agency over allegations that it collected email addresses, phone numbers, profile pictures, bios, and other information without proper consent.

TikTok agreed to pay the Federal Trade Commission a $5.7 million fine, in what the agency said is the largest civil penalty it has ever obtained in a children’s privacy case.

The company “knew many children were using the app but they still failed to seek parental consent before collecting names, email addresses, and other personal information from users under the age of 13,” FTC Chairman Joe Simons said in a statement.

“This record penalty should be a reminder to all online services and websites that target children: We take enforcement of COPPA very seriously, and we will not tolerate companies that flagrantly ignore the law.”

Musical.ly, now TikTok, has a massive audience: Since 2014, it has been downloaded by more than 200 million users worldwide, and 65 million accounts have been registered in the United States, the FTC said.

The FTC’s complaint, filed by the Department of Justice on behalf of the consumer agency, alleged that Musical.ly violated COPPA, the Children’s Online Privacy Protection Act, by not notifying parents about the app’s collection and use of personal information from users under 13, failing to obtain parental consent before that collection and use, and not deleting personal information at the request of parents.

User accounts for Musical.ly were public by default, meaning that a child’s username, picture, video, and profile bios could be seen by other users, the FTC said.

And while the site allowed users to switch their default settings from public to private, so that only approved users could follow them, children’s profile pictures and bios stayed public, and other users could still send them direct messages, the agency said in a statement. The FTC added that there were “public reports” of adults trying to contact children through the Musical.ly app.

In addition, until October 2016, the app included a “my city” feature that allowed users to view others on Musical.ly within a 50-mile radius of their location.

Separating Out Young Users

The operators of Musical.ly knew that a significant portion of their users were younger than 13 and received “thousands of complaints from parents” of children who were registered users, the FTC alleged.

The 25-page settlement requires TikTok to adhere to COPPA and remove all videos made by children younger than 13. It also mandates that TikTok delete personal information it collects from children upon parents’ requests and keep records on children’s usage, among other steps.

COPPA governs websites and online services that are directed to children and that collect personal information, as well as those that target a general audience but have “actual knowledge” they’re taking information from kids, the FTC says. In either case, the law requires parental consent before personal information can be collected from children.

In response to the FTC’s announcement, TikTok said it was adding additional privacy protections and segmenting its app so that younger users were placed in “age-appropriate” environments, with new safeguards.

“It’s our priority to create a safe and welcoming experience for all of our users,” the company said, “and as we developed the global TikTok platform, we’ve been committed to creating measures to further protect our user community.”

TikTok said it will create a “separate app experience” for young users of the platform that includes additional privacy and safety measures. The app will not allow the sharing of personal information, and it will put broad limitations on content and interactions among users.

Both existing and new users will be directed to the age-appropriate platform, the company said. The policy, said TikTok, is in line with guidance issued by the FTC for mixed-audience apps.

U.S. Sen. Edward Markey, a Massachusetts Democrat who has sponsored federal legislation focused on children’s privacy, said that the FTC’s penalty, while a record, is “not high enough for the harm that is done to children and to deter violations of the law in the future by other companies.”

“This FTC ruling underscores what we have long known: companies do not consider children’s personal information out of bounds,” the lawmaker said in a statement. “But the clock is ticking on companies that don’t follow the law and protect the privacy of children. TikTok knowingly collected children’s data in order to reap profits with blatant disregard for the Children’s Online Privacy Protection Act.”
