December 2019
Starting from:
£99 + VAT
Format: DOWNLOADABLE PDF
The seminar began by addressing industry responsibility, highlighting the focus on this issue in the Online Harms White Paper - including the proposed legal duty of care that online companies would owe to their users. This raised questions about what duties and liabilities should fall on social media platforms and tech companies.
It followed the Science and Technology Committee’s 2019 report on the mental health impacts of social media - which found that social media exacerbated online harms, including negative body image and cyberbullying - and reported evidence of social media platforms being used as a marketplace for the illegal sale of drugs, particularly among young people.
As the government looks to make tech companies more accountable, delegates heard directly from industry experts on what more can be done to protect children online.
The seminar also assessed regulation and media literacy, bringing out the latest thinking on the future of online regulation with respect to children.
Discussion also revolved around the effectiveness of current public and private sector initiatives aimed at encouraging children to think more critically online.
Delegates also had the opportunity to consider areas for improvement - with the white paper announcing plans to develop a new online media literacy strategy.
Additionally, the seminar examined privacy and data protection, measuring the progress that has been made towards protecting children’s privacy online and what further support might be needed.
Discussion focused on the implications of increased user privacy protections for the ability of industry and investigators to identify and intercept online communications that could threaten children’s online safety. The ICO’s Age Appropriate Design Code featured prominently in this discussion.
Finally, the seminar scrutinized age verification and freedom of expression legislation. The government has announced that Part 3 of the UK’s Digital Economy Act - which includes provisions for age verification on pornography sites - will not go ahead; instead, the objective of preventing adult content from being easily accessed online will be delivered through the regulatory framework proposed in the Online Harms White Paper.
Delegates also considered the implications this has for the implementation of the Design Code, and assessed the feasibility and workability of age-verification technology in future policy and regulatory design.
Discussion also explored how to define harmful content - especially where that content is not illegal - alongside questions of enforceability and where responsibility should lie, as well as concerns that age-verification technology could restrict access to information and thereby stifle freedom of expression.