TikTok app described as ‘hunting ground for paedophiles’ slapped with £4.3m fine

A video sharing app that has been described as a "hunting ground" for paedophiles has been slapped with a record-breaking fine for violating children’s privacy.

TikTok, which allows users to create short lip-syncing videos overlaid with digital filters and stickers, was one of the most downloaded apps of 2018, beating both Instagram and Snapchat.

Previously known as Musical.ly, TikTok has an estimated 1 billion users worldwide, a large percentage of whom are thought to be children.

Now the Federal Trade Commission (FTC) in the United States has ruled that the app "illegally collected" sensitive data from underage users without obtaining parental consent.

TikTok has been ordered to pay $5.7 million (£4.3 million) – the largest ever fine for a US case involving children’s data privacy – and delete all content uploaded by users under the age of 13.

Users in the US will be asked to verify their age when they next open the app, although no proof of age is required. Users in the UK and other countries will not be asked to do this.

Those who admit to being under 13 will be directed to a new "separate app experience that introduces additional safety and privacy protections designed specifically for this audience," according to TikTok.

"We care deeply about the safety and privacy of our users," the company said in a statement.

"This is an ongoing commitment, and we are continuing to expand and evolve our protective measures in support of this."

TikTok’s terms of service already state that users must be at least 13 years old to use the app, but the app asks for no proof of age.

The app also includes age-gating measures at signup and carries a 12+ App Store rating, which allows parents to block it from their child’s phone using device-based parental controls.

However, the FTC said this was not good enough, after receiving thousands of complaints from parents.

"Just because you say it’s intended for over-13 doesn’t mean that it is," said Andrew Smith, director of the FTC’s Bureau of Consumer Protection.

Children’s charity Barnardo’s recently issued a warning about the app, claiming that young people were being encouraged to engage in sexual activity by online predators.

The charity said its child sexual exploitation services had supported 19 children as young as eight in a year. In previous years, the youngest children needing help were aged 10.

Some schools in the UK have also issued alerts directly to parents, with one claiming the app was being used to "stalk teenage girls".

"Without the right security settings, children broadcasting live video of themselves over the internet could be targeted by abusers in their bedrooms," said Barnardo’s Chief Executive Javed Khan.

"It’s vital that parents get to know and understand the technology their children are using and make sure they have appropriate security settings in place.

"We are also calling for a legal duty on technology companies to prevent children being harmed online."
