Australia has positioned itself as one of the leading countries in promoting a safe digital environment for children and teenagers. Its commitment began in 2015 with the creation of the eSafety Commissioner and the adoption of the Enhancing Online Safety Act.
Recognising the rapid pace of technological change and the emergence of new platforms and services, in 2019 the Australian government proposed legislative reform to update online safety safeguards. As a result, the Online Safety Act was adopted in 2021, repealing the 2015 law mentioned above.
One of the most relevant elements of this legislation is its explicit recognition of the risks that children and teenagers face from online interactions and content. In this context, the implementation of age assurance systems began to be explored, resulting in the adoption of the Roadmap for Age Verification in 2023.
More recently, in 2024, the Australian government undertook a review of the Online Safety Act, which concluded with the adoption of the Online Safety Amendment (Social Media Minimum Age) Act, setting a minimum age of 16 for accessing social media.
Australia’s approach to age assurance systems
To define the criteria that age assurance systems must meet before being implemented by pornographic content (18+) and social media (16+) platforms, the Australian government launched a tender for an Age Assurance Technology Trial. The contract was awarded to the Age Check Certification Scheme (ACCS), a conformity assessment body accredited by the UK Accreditation Service and composed of auditors, certification specialists and data protection experts.
ACCS's main task was to conduct an independent assessment of the technologies available in Australia for age verification, age estimation, age inference and parental control. Since Australian law prevents platforms from requiring users to present official identity documents, these age assurance technologies also include methods that confirm age from signals such as biometric markers or digital behavioural patterns.
Late last month, ACCS released a report with preliminary findings, which were presented as twelve key conclusions. Broadly speaking, the conclusions are as follows:
- Age assurance can be done privately, robustly and effectively.
- No technological limitations have been identified that would prevent implementation of the age eligibility requirements set by the Australian authorities.
- Solutions with a high technology readiness level (TRL 7 or above) can be integrated into the user experience.
- There is no one-size-fits-all solution. Technology should be chosen according to the specific purpose or characteristics of the platform.
- The age assurance industry is in a period of innovation, transitioning from research to viable products.
- Age assurance technologies comply with privacy policies: data is collected, stored, shared and disposed of in accordance with the principle of privacy by design, and only the data that is strictly necessary is processed, with explicit consent.
- In general, age assurance systems show no significant deviations in accuracy across race or gender (the exception being Indigenous Australian populations, owing to a lack of representative data).
- Despite advances in the field, there is room for improvement in age assurance technologies.
- There is no evidence that parental control systems can adapt to children's evolving digital skills and capabilities, enhance their digital rights, or manage their digital footprint in a fully effective and secure way.
- Due to rapid technological advancement, age assurance systems need to be continuously monitored, updated and improved.
- In the absence of specific guidelines for the design of these systems, several providers have over-anticipated potential regulatory requirements, increasing the risk of privacy breaches through unnecessary or disproportionate data collection and retention.
- The standards-based approach adopted in the trial provides a solid basis for the development of accreditation and certification schemes.
What are the next steps?
Adult content and social media platforms will have to implement age assurance systems by December 2025. Thereafter, the Australian government will conduct a review to ensure that platforms are taking reasonable steps to prevent access by users below the minimum ages stipulated by the legislation. In case of non-compliance, platforms may face financial penalties of up to A$50 million (approximately €27.7 million).