Tough new internet laws to protect young people, uphold free speech and make sure there are no safe spaces for criminals online return to Parliament for their second reading this week.
My Parliamentary colleagues and I will debate the government’s groundbreaking Online Safety Bill, which will require social media platforms, search engines and other apps and websites that allow users to post content to improve the way they protect their users.
Ofcom, the regulator, will have the power to fine companies failing to comply with the laws up to ten per cent of their annual global turnover, force them to improve their practices and block non-compliant sites. Crucially, the laws have strong measures to safeguard children from harmful content such as pornography and child sexual abuse.
Ahead of Tuesday’s debate, the government is launching the next phase of its Online Media Literacy Strategy. It aims to help vulnerable and ‘hard-to-reach’ people, such as those who are digitally excluded or from lower socio-economic backgrounds, navigate the internet safely and spot false and misleading information online.
The Department for Digital, Culture, Media and Sport (DCMS) will spend £2.5 million over the next year to advance the plan, including on training, research and expert advice. This includes a new Media Literacy Taskforce featuring experts from a range of disciplines, and a boost to the Media Literacy Fund, which gives teachers and local service providers the skills they need to help people think critically about what they see online.
Thinking critically online has never been more important. There was a rise in misinformation and disinformation on social media and other online platforms during the global pandemic and the Kremlin continues to use disinformation to target UK and international audiences to justify its actions in Ukraine.
Ofcom research shows adults are often overconfident in their ability to detect disinformation and only 32 per cent of children aged 12 to 17 know how to use online flagging or reporting functions.
Forty per cent of adult internet users do not have the skills to assess online content critically and children up to the age of 15 are particularly vulnerable.
A new Media Literacy Taskforce of 18 experts from a range of relevant organisations, including Meta, TikTok, Google, Twitter, Ofcom and the Telegraph as well as universities and charities, will work with the government as part of its strategy to tackle disinformation. It will help hard-to-reach and vulnerable groups think critically about what they see on the web, and improve their ability to protect their data and privacy.
The taskforce will look at new ways to identify and reach people most in need of education. This could include working through local authorities or coordinating support offered by local services to roll out training.
The Media Literacy Fund will expand a pilot ‘Train the Trainer’ programme, which ran last year, to equip teachers, library workers and youth workers to help boost people’s critical thinking.
New research will be commissioned to understand the root causes of poor media literacy and to assess the effectiveness of different methods that aim to build people’s resilience to misinformation and disinformation.
The fund will also have a broader scope, with a focus on improving media literacy provision for people who are particularly vulnerable online, such as children and people with mental health issues.
Since it launched in July 2021, the Online Media Literacy Strategy has provided £256,000 in grant funding to five organisations to adapt media literacy resources for teachers working with disabled children, run a successful awareness campaign to promote Safer Internet Day and empower LGBTQ+ young people with tools to deal with online abuse.