Facebook policy strictly prohibits anyone under the age of 13 from operating an account. As you’re probably well aware, plenty of kids under the age of 13 operate Facebook accounts. That’s because people are allowed to lie on the internet, which must be shocking to you, I know.
That doesn’t mean that Facebook just lets it happen, however. Although some reports have estimated that 40% of Facebook users under the age of 18 are actually under the age of 13, Facebook continues to remove accounts belonging to underage kids every day. Some estimates put the number of daily removals at more than 20,000.
But it’s a huge game of whack-a-mole. When one underage account is terminated, a hundred pop up in its place. And Facebook knows they’re impotent.
Do you think Facebook should abolish the age limit? Do you have kids under the age of 13? Do they operate Facebook accounts? Let us know in the comments.
Speaking at the Oxford Media Convention recently, Simon Milner, Facebook’s policy director for the U.K. and Ireland, discussed the social network’s problem policing its no-kids-under-13 rule.
“We haven’t got a mechanism for eradicating the problem [of underage users],” he said. He went on to call the problem “tricky.”
“Facebook does have a rule that users have to be over 13, as does YouTube, which not a lot of people know. It is not because we think that Facebook is unsafe but because of a US law about children’s online privacy. So we have it as a global rule.”
Milner is of course referring to the Children’s Online Privacy Protection Act (COPPA), an old law that details how minors’ personal data can be accessed and shared. The FTC recently announced some additions to COPPA, which they say will strengthen the law. Since the law was enacted way back in 1998, it makes sense that they would feel the need to update it for the digital age where social networks, apps, and other internet properties are snatching information at every turn.
The FTC’s proposal is a whopping 169 pages long and makes a couple of significant changes to the law. In our previous coverage of the FTC’s announcement, Zach Walton described the changes as follows:
The first is a definition change that files geolocation information under a child’s personal information. The change means that services can not track a child across various Web sites and other online services.
In the same vein, the second update extends privacy protections to modern Web applications – apps, games, and Web site plug-ins. The latter is the most interesting because some Web sites appeal to people both young and old. These plug-ins can be used to track the adults, but what about the children? How will a Web site know who’s a child and who isn’t?
Of course, Facebook is one social provider that has taken issue with the “plug-ins” addition. Their ubiquitous “like” button, which appears on pretty much every website you would ever visit, could be affected. They claim such regulations could “chill innovation.”
But back to Milner, who went on to say that the most obvious mechanism, an age check, is impractical:
“It is increasingly difficult to know what to do. You can’t make everyone prove their age – that would get privacy advocates up in arms.”
He’s right. Facebook’s real names policy catches enough flak – can you imagine what kind of hell Facebook would catch for some sort of true age verification system? Let’s say they attempted something like that anyway – damn the dissidents. It would be pretty much impossible, or at the very least a resource-hogging nightmare that would infuriate everyone. So, short of that, what’s Facebook going to do?
One idea that’s been thrown around is to simply open up the site to kids under the age of 13 – but with a load of restrictions. Those restrictions, in theory, would allow parents to control their young children’s accounts and would do more to make sure their info stayed private on the site. The rumor first started floating around back in June of 2012, which led privacy groups to demand that Facebook give parents ultimate control over privacy if it chose to let in sub-13-year-olds.
Facebook responded, saying,
“Enforcing age restrictions on the Internet is a difficult issue, especially when many reports have shown parents want their children to access online content and services. We welcome today’s recommendations by consumer, privacy, health and child groups as we continue our dialogue with stakeholders, regulators and other policymakers about how best to help parents keep their kids safe in an evolving online environment.”
Two congressmen joined the party, sending Facebook a pointed letter.
“At this point, we have made no final decision whether to change our current approach of prohibiting children under 13 from joining Facebook,” said the company nearly six months ago.
Currently, Facebook still requires members to be at least 13 years old, and there are still plenty of 10-, 11-, and 12-year-olds on the site. The Guardian cites a study that says 34% of 9- to 12-year-olds in the U.K. have Facebook accounts.
And those kids face the same kinds of dangers that older kids and teens face on social media – scammers, bullies, criminals. Just yesterday, a U.S. Appeals Court ruled that convicted sex offenders cannot be barred from operating Facebook accounts, as it’s unconstitutional to deny them such a ubiquitous form of communication. I happen to agree with the ruling, but I’m sure there are plenty of parents out there who, upon hearing a headline like that, immediately imagine their children being preyed upon.
The bottom line seems to be that young kids are going to find a way onto Facebook, Facebook is currently powerless to stop it, and the only real option is to let them in officially and try to give parents control over their experience on the site. You know, if you can’t stop them, at least try to make it super safe.
Do you have any ideas? Just let them join in an official capacity? Age checks? It appears that Facebook is kind of stumped. Let us know in the comments.