The Justice Department sued TikTok and its parent company ByteDance on Friday for allegedly violating children’s online privacy law.
The agency accused the popular social media app of allowing children under 13 to create accounts, collecting data about those children, and failing to respond to parents’ requests to delete the accounts and information.
TikTok’s actions would violate the Children’s Online Privacy Protection Act (COPPA) as well as a 2019 agreement with the app then known as Musical.ly, according to the lawsuit.
“To put an end to TikTok’s large-scale unlawful invasions of children’s privacy, the United States files this lawsuit seeking injunctive relief, civil penalties, and other relief,” the lawsuit says.
The Justice Department alleges that TikTok “knowingly allowed children under the age of 13 to create accounts” on the platform and “collected extensive personal information” without notifying their parents or obtaining their consent.
When parents asked TikTok to delete their children’s accounts and associated data, the lawsuit alleges that the company obstructed and failed to comply with these requests.
“Parents must navigate a complicated process to figure out how to request deletion of their child’s account and information,” the DOJ alleged, adding: “Even if a parent was able to submit a request to delete their child’s account and information, [TikTok] many times did not honor this request.”
The lawsuit followed an investigation by the Federal Trade Commission (FTC), which in 2019 obtained a consent order against TikTok over alleged past violations of COPPA.
“TikTok knowingly and repeatedly violated children’s privacy, threatening the safety of millions of children across the country,” said FTC Chair Lina M. Khan. “The FTC will continue to use the full scope of its authorities to protect children online – especially as companies deploy increasingly sophisticated digital tools to surveil children and profit from their data.”
In a statement, TikTok said: “We disagree with these allegations, many of which relate to past events and practices that are factually inaccurate or have been addressed.”
“We are proud of our efforts to protect children and will continue to update and improve the platform. To that end, we provide age-appropriate experiences with strict safeguards, proactively remove users suspected of being underage, and voluntarily roll out features like default screen time limits, family pairing, and additional privacy protections for minors,” the company continued.
Updated at 2:18 p.m.
This story originally appeared on thehill.com.