
Instagram gives parents more control over teens’ accounts


[Getty Images: stock image of three young people using their smartphones]

Instagram is changing the way it works for teens, promising to offer more “built-in protections” for young people and increased controls and peace of mind for parents.

The new “Teen Accounts” will be rolled out from Tuesday in the UK, US, Canada and Australia.

Social media companies are under pressure around the world to make their platforms safer, with concerns that not enough is being done to protect young people from harmful content.

The NSPCC called the announcement a “step in the right direction” but said Instagram owner Meta appeared to “emphasise the need for children and parents to keep themselves safe”.

Rani Govender, NSPCC child safety online policy manager, said Meta and other social media companies needed to do more themselves.

“This must be supported by proactive measures to prevent harmful and sexually abusive content from spreading on Instagram in the first place, so that all children benefit from comprehensive protections across the products they use,” she said.

Meta describes the changes as a “new, parent-led experience for teens” and says they will “better support parents and give them peace of mind that their kids are safe with the right safeguards in place”.

However, media regulator Ofcom raised concerns in April about parents’ willingness to intervene to keep their children safe online.

“One of the things we found… was that even when we built these controls, parents weren’t using them,” Meta’s senior managing director Nick Clegg said in a speech last week.

Ian Russell, whose daughter Molly viewed content about self-harm and suicide on Instagram before taking her own life at the age of 14, told the BBC it was important to wait and see how the new policy was implemented.

“We can only know whether these measures are effective when they are implemented,” he said.

“Meta is great at engaging in public relations and making important announcements, but they also have to be good at being transparent and sharing how effective their measures are.”

How will it work?

Teen accounts will change much of how Instagram works for users between the ages of 13 and 15, with some settings enabled by default.

These include strict controls on sensitive content to prevent recommendations of potentially harmful material, and muted notifications overnight.

Accounts will also be set to private rather than public — meaning teens will have to actively accept new followers and their content won’t be able to be viewed by people who don’t follow them.

These default settings can only be changed by adding a parent or guardian to the account.

[Instagram infographic: some teens will be prompted to add a parent if they try to change the default settings on their teen account]

Instagram will display a message asking users under 16 to get parental permission when trying to change important default settings in their teen accounts.

Parents who choose to monitor their child’s account will be able to see who their child is messaging and what topics they’re interested in — although they won’t be able to see the content of the messages.

Instagram said it will begin transitioning its millions of existing teen users to the new experience within 60 days of notifying them of the changes.

Determining age

The system relies heavily on users being honest about their age — although Instagram already has tools to verify a user’s age if it suspects they’re not telling the truth.

Starting in January in the US, artificial intelligence (AI) tools will also begin to be used to proactively detect teens using adult accounts, with the aim of redirecting them back to teen accounts.

The UK’s Online Safety Act, passed in 2023, requires online platforms to take action to keep children safe or face large fines.

Ofcom warned social media sites in May that they could be named and shamed — and banned for under-18s — if they fail to comply with new online safety rules.

Social media industry analyst Matt Navarra describes the changes as significant — but says they depend on implementation.

“As we’ve seen with teenagers throughout history, in situations like this they will find a way out, if they can,” he told the BBC.

“So I think Instagram will need to make sure that the protections can’t be easily bypassed by tech-savvy teenagers.”

Questions for Meta

Instagram isn’t the first platform to introduce such tools for parents — and the platform claims to have more than 50 tools aimed at keeping teens safe.

In 2022, the company introduced a family hub and monitoring tools for parents, allowing them to see which accounts their kids follow and who follows them, among other features.

Snapchat has also introduced its own family hub, which lets parents aged over 25 see who their children are messaging and limit their ability to see certain content.

In early September, YouTube said it would limit recommendations of some health and fitness videos for teens, such as those that “idealise” certain body types.

Instagram uses age verification technology, including video selfies, to check the age of teens who try to change their age to over 18.

This raises the question of why, despite Instagram’s many safeguards, young people are still exposed to harmful content.

A study by Ofcom earlier this year found that every child interviewed had viewed violent material online, with Instagram, WhatsApp and Snapchat being the most frequently mentioned services.

While these are also among the biggest platforms, it is a clear sign of a problem that has yet to be solved.

Under the Online Safety Act, platforms will have to demonstrate they are committed to removing illegal content, including child sexual abuse material (CSAM) and content that encourages suicide or self-harm.

But the regulations are not expected to come into full effect until 2025.

In Australia, Prime Minister Anthony Albanese recently announced plans to ban children from social media by introducing a new minimum age for using the platforms.

Instagram’s latest tools put more control in the hands of parents, who will now be more directly responsible for deciding whether to give their kids more freedom on Instagram and monitoring their activity and interactions.

Of course, parents will also need to have their own Instagram account.

But ultimately, parents aren’t the ones running Instagram and can’t control the algorithms that deliver content to their kids or what’s shared by billions of users around the world.

Social media expert Paolo Pescatore said it was “an important step in protecting children from exposure to the world of social media and fake news”.

“Smartphones have opened up a world of misinformation, inappropriate content, driving behavioral change in children,” he said.

“More needs to be done to improve children’s digital lives and that starts with giving parents back control.”

