Protecting minors from certain digital content

Content that normalises bulimia or anorexia, pornography, the glorification of violence and the promotion of suicidal behaviour are among the inappropriate materials that minors can access if a series of measures is not put in place.

Telefónica

Since the birth of the Internet and over the decades, its use has changed in many ways.

At first, taking the first website in history as an example, information was simply shared statically, with limited interaction between Internet users and the published content.

The arrival of Web 2.0 at the turn of the 21st century – to be succeeded by Web 3.0 – was accompanied by a greater degree of interaction with the creation and dissemination of content developed by the users themselves.

An evolution towards a more interactive and collaborative network which, although it presented notable benefits, was not free from challenges, such as the absence of filters to control material that could be harmful, offensive, inappropriate or directly illegal.

Legislation has advanced over the years, although challenges remain, one of them being the exposure of minors to this type of content.

With this in mind, Telefónica has drawn up a position paper for 2025 – Building a safe digital environment for minors – in which the company analyses issues such as these.

Types of offensive, harmful or inappropriate content

This document identifies four types of inappropriate or offensive content. Let’s take a look at what they are and their main characteristics.

Body image distortion

Issues that are not exclusive to the digital environment can be exacerbated online, for example through content that normalises or idolises bulimia or anorexia by presenting them as lifestyles.

Although these may be extreme examples, the promotion of unrealistic physical standards can also lead to a distortion of young people’s perception of their body image.

Exaltation of violence

Although it is true that violent situations in school environments, such as fights, have been a reality throughout history, the difference that the digital environment has brought is the viralisation of this content.

We are faced with a vicious circle: the curiosity this type of content generates leads those who publish it to keep doing so, given the high number of views it brings to their social media profiles.

Pornography

Reports warn that the usual age at which minors start consuming pornography is 12, and some studies place the age at which some minors first come across this kind of material as early as six.

Minors report that access to pornography occurs either accidentally (through social networks or pop-ups) or through links sent by friends or others in their environment.

Promotion of suicide

Although it is not an issue unique to the digital environment, and its volume is lower than that of the cases mentioned above, the promotion of suicide is another type of harmful content to which minors may be exposed.

A 2022 study from the United Kingdom warned that one in four suicides among young people could be related to the consumption of online content in which suicide is normalised or even encouraged.

Proposals for a safe environment for minors

In view of these and other situations in which minors are confronted with inappropriate content, Telefónica proposes a number of measures to move towards a safe digital space for children.

  • Commitment of the entire digital ecosystem. Promoting measures aimed at ensuring the protection of minors from certain content is the responsibility of the entire digital ecosystem, without exception.
  • Parental control. Although this measure can be effective in limiting viewing time, it cannot by itself filter out the content covered in this article: that which is harmful, offensive or inappropriate.
  • Age labelling. Video or social media platforms should provide the option of labelling content by age when it is published.
  • Avoid asymmetries between actors in the digital and audiovisual ecosystem. For example, operators offering television content already provide both content specifically for minors and verification mechanisms, obligations that not all digital actors share.
  • Effective age verification. Effective age verification solutions act as a filter to prevent access to inappropriate content.

All these measures concerning certain digital content progress in parallel with other issues related to the online ecosystem, such as the relationship between social networks and the protection of minors.
