Protecting Children’s Privacy Online


In today’s digital world, where children are increasingly tech-savvy, ensuring their privacy online is a crucial but challenging task for parents and guardians. Family Law Solicitor, Fiona Moffat, explores some of the latest statistics on protecting your children’s privacy online…

Over the last few years there has been significant concern about how children can live and learn in the digital world while being protected from the dangers of unfettered access to the internet.

The Information Commissioner’s Office (ICO) is calling on social media and video sharing platforms to improve their practices so that children can be protected when using their platforms.

The Children’s Code of Practice was rolled out in September 2021, and since then a number of improvements have been made:

• Facebook and Instagram limited the information used in targeted advertising for under 18s to age and location. They committed not to personalise adverts for under 18s based on their on or off-app activity. Both Facebook and Instagram started asking for people’s date of birth at sign up, preventing them from signing up if they repeatedly enter different dates, and disabling accounts where people can’t prove they’re over 13. Instagram also launched parental supervision tools, along with new features like Take A Break to help teens manage their time on the app.

• YouTube turned off autoplay by default and turned on Take a Break and bedtime reminders by default for Google Accounts belonging to under 18s.

• Google enabled anyone under 18 (or their parent/guardian) to request the removal of their images from Google image search results; location history cannot be enabled for Google accounts of under 18s; and Google has expanded safeguards to prevent age-sensitive ad categories from being shown to these users.

Social Media Rise: Increased Engagement with Age

Research has shown that 96% of children aged 3 to 17 watch videos on video sharing platforms. The proportion of children using social media platforms increases with age: 30% of children aged 5-7, 63% of those aged 8-11, 93% of those aged 12-15, and 97% of those aged 16-17 use social media platforms.

The ICO is therefore focussing on social media and video sharing platforms from 2022 to 2025. Children are unlikely to be aware that their personal data is being harvested to generate algorithmic content feeds, and these feeds may create pathways to less suitable content. The ICO’s Children’s code strategy seeks to ensure that social media and video sharing platforms comply with data protection law and conform to the standards of the Children’s code. This means that they need to design data protection safeguards into their online services.

As parents, and as people working directly with children, I am sure we would all welcome a safer environment for children and young people in this digital age. Too often children are exposed to inappropriate content and become engaged in toxic activity without realising what they are being drawn into. Hopefully some of the steps being taken will help to protect our children and young people until they can discern for themselves how to avoid the loss of their personal privacy.

In recent news, children as young as nine have been added to malicious WhatsApp groups promoting self-harm, sexual violence and racism.


Fiona Moffat

April 2024
