Morning, Tuesday, 14th January 2025
Online
This conference focuses on next steps for child online safety regulation, policy and practice in the UK.
Key stakeholders and policymakers will discuss the development of new regulation under the Online Safety Act, including assessing Ofcom’s recently published Illegal Harms Codes and guidance.
Delegates will assess the content of the new Codes and enforcement practicalities, as well as priorities for future updates to guidance after the omission of measures for dealing with suicide and self-harm material. The roles and responsibilities of online service providers will also be discussed, as well as support needed for smaller platforms in undertaking mandatory risk assessments by March 2025 and demonstrating subsequent compliance.
The conference will also be an opportunity to examine the way forward for protecting against child sexual abuse material, with a further consultation to expand the Codes, looking at account banning, hash matching, and the use of AI in tackling CSAM expected in spring 2025.
We expect discussion on the strategic priorities for online safety laid out by the Secretary of State for Science, Innovation and Technology in November 2024, and consideration of issues raised during Ofcom’s consultation on Protecting Children from Harms Online. Discussion will focus on the role of education providers, new approaches to improving children’s media literacy, and the relationship between offline and online spaces.
The agenda looks at the latest developments in addressing the practicalities of implementing age verification requirements, with reports that a potential future social media ban for under-16s in the UK is being considered. Identifying and classifying content, and issues around attention-retaining features that may negatively impact children, will also be discussed. Sessions will assess next steps for safeguarding against harmful content, including embedding safety by design, future regulation and responsibilities relating to livestreaming sites, and challenges around moderating livestreams and homepage feeds.
Considerations around the balance between freedom of expression and online safety will be discussed, particularly in the context of identifying, classifying and moderating journalistic content. Attendees will also discuss privacy considerations around the provision of information for age verification purposes, to emergency services, and through the legal system.
The potential positive contributions of emerging technologies such as AI to online safety, as well as the threats they pose, will be explored, looking at the adequacy of current legislation in regulating AI’s effects on internet safety. We also expect discussion on next steps for innovation to support child protection, the development of automated detection, and areas for further research, following Ofcom’s Evaluation in Online Safety: A Discussion of Hate Speech Classification and Safety Measures, published in March 2024.
Further sessions will focus on international cooperation and enhancing collaboration to protect children online, with recent commitments to international regulatory consistency, looking at opportunities for sharing expertise, and exploring next steps for international private-public collaboration.
We are pleased to be able to include keynote sessions with: Michael Tunks, Principal, Online Safety, Ofcom; Owen Bennett, Head, International Online Safety, Ofcom; Martin Harris Hess, Head, Sector for the Protection of Minors Online, European Commission; and Prof Sonia Livingstone, Professor, Department of Media and Communications, LSE.
Overall, areas for discussion include:
- policy and regulation: implementation of Ofcom’s Illegal Harms Codes and guidance - key considerations for the Government’s strategic priorities for online safety - Ofcom’s consultation and next steps - options for strengthening the Online Safety Act - addressing key stakeholder concerns
- responsibilities: roles of social media platforms, content providers and ISPs in tackling online harms and misinformation - risk assessment and liability - practicalities and challenges for sector stakeholders - transparency and accountability
- regulatory balance: challenges in moderating journalistic content - considerations for freedom of speech - ensuring child protection
- specific issues: impacts of screen-time-extending features - recommender systems - body-image and depressive content - mitigation strategies
- child media literacy: new educational approaches - roles of industry, families, schools and policymakers
- AI and emerging technologies: opportunities and threats in online safety - adequacy of current legislation - innovations in automated detection
- implementation: best practice and practical steps for stakeholder joint-working - addressing technology-amplified threats - moderating livestreams and feeds
- privacy and data sharing: considerations in providing information to authorities - balancing privacy with safety - legal and ethical implications
- international cooperation: building key global relationships - opportunities for UK leadership - regulatory consistency - private-public partnerships
All delegates will be able to contribute to the output of the conference, which will be shared with parliamentary, ministerial, departmental and regulatory offices, and more widely. This includes the full proceedings and additional articles submitted by delegates. As well as key stakeholders, those due to attend include officials from DBT; DfE; DSIT; HM Treasury; DoE, ROI; GLD; Home Office; Ofcom; DCEDIY, ROI; The Scottish Government; and the Welsh Government.