Facebook has published two posts today about Instagram and younger users. Specifically, they address two issues: creating a safer experience for teens on Instagram, and age verification to ensure that users are actually old enough to be on the platform.
In recent months, several organizations have criticized Facebook for planning to create an Instagram for children. In a letter to Mark Zuckerberg, the Campaign for a Commercial-Free Childhood, an international coalition of 35 children's and consumer groups, warned that "a children's version of Instagram could entangle even younger users in endless routines of photo scrolling and body image shaming." The letter goes on to state:
The real audience for a kids' version of Instagram will be much younger children who don't currently have accounts on the platform. While collecting valuable family data and cultivating a new generation of Instagram users may be good for Facebook's bottom line, it will likely increase the app's use by young children, who are particularly vulnerable to the platform's manipulative and exploitative features.
Facebook: "There is no foolproof way to stop people from misrepresenting their age"
Despite the criticism, Facebook believes it is better to create a tool for these children than to ban them from being online altogether. In today's post, the company writes:
We're also looking at ways we can reduce the incentive for people under 13 to lie about their age. The reality is, they're already online, and since there's no foolproof way to stop people from misrepresenting their age, we want to create experiences specifically designed for them that are managed by parents and guardians. This includes a new Instagram experience for kids. We believe encouraging them to use an experience that is age-appropriate and managed by parents is the right way to go.
Improving the Instagram experience for 13- to 17-year-olds: This is what Facebook plans to do
Facebook is focusing on three main pillars to provide young people with a safer, more private experience on Instagram:
- Putting young people on private accounts by default
- Making it harder for potentially suspicious accounts to find young people
- Limiting the opportunities advertisers have to reach young people with ads
For example, teens under 16 (or under 18 in certain countries) will be given a private account by default when they join Instagram. This means that people who don't follow them will no longer be able to comment on their content or see it in areas like Explore or hashtags.
Certain targeting options will no longer be available to advertisers
In the US, Australia, France, the UK and Japan, Instagram has also developed new technology to help the company find accounts that have shown potentially suspicious behaviour and then block them from interacting with teens' accounts. By "potentially suspicious behaviour" Instagram means accounts belonging to adults that have, for example, been recently blocked or reported by a teen. Facebook is also changing the way advertisers can engage with children:
In a few weeks, we will only allow advertisers to target ads to people under 18 (or older in certain countries) based on their age, gender, and location. This means that previously available targeting options, such as those based on interests or on activity on other apps and websites, will no longer be available to advertisers. These changes will take effect globally and apply to Instagram, Facebook, and Messenger.
Facebook also provides some insight into how the company determines whether a minor is using its platforms: it relies on AI, collaboration with industry partners and experts, and reports from other users who flag people under the age of 13 on Facebook and Instagram. Whether Facebook's plan will ultimately work in this form remains to be seen, of course; opposition from the relevant organizations is certainly to be expected.
(Photo by Unsplash / Solen Feyissa)