The Online Safety Bill was passed by Parliament this week, meaning technology companies will have to abide by more stringent laws regarding our children's online safety.
The new legislation includes tougher rules on age restrictions for social media platforms such as TikTok and Instagram, and requires platforms to remove content that is harmful to young people.
HELLO! sat down with Michelle Donelan, the Secretary of State for Science, Innovation and Technology, who has worked to get these new laws through parliament.
We asked Michelle how and when the new legislation will come into practice, what more needs to be done to protect children online and how the government is supporting parents as they navigate the online world with their kids.
Read the interview below…
Hello Michelle, it's fantastic news that the Online Safety Bill has been passed. What does this mean for keeping our children safe?
Michelle: The Internet, and social media in particular, offers some fantastic opportunities, especially in tackling things like loneliness, but huge risks as well, especially for our young. And the problem we've got at the moment is that the sector is largely unregulated, and platforms are not accountable for some of the problems that we're seeing.
Where we are now in terms of that legislative journey is, it's really exciting because we're just about to finish the process so it can become a law and that will be transformational.
I call it a game changer because it really will make a difference, particularly for children, who are at the heart of this piece of legislation. It will mean the UK becomes the safest place in the world to be on the Internet.
Parents will have the confidence of knowing that when their child is using a social media platform, they're doing so in a safer way: they won't be able to see the illegal content they can see now, nor content that is really not age appropriate - just as you would expect with a movie or anything else of that nature.
We do treat children differently, and we should be protecting them when they go online.
How will the new online age restriction law work in practice?
Michelle: The situation that we've got at the moment is the law says that children under the age of 13 shouldn't be using these platforms because of data protection issues.
But we know that there are sometimes nine-year-olds and seven-year-olds using these platforms and seeing content that is deeply disturbing and will have an impact on their mental health and wellbeing.
And so we're saying that, in future, platforms have to enforce their age restrictions, which will be a minimum of 13 - some have higher. If they don't, they're going to face potentially humongous fines of up to billions of pounds.
So there is a real incentive, and indeed a deterrent, to adhere to that piece of legislation.
And how that will work in practice will depend on the platform and how it operates. It may be that platforms introduce an age verification technique, but it's more likely that they'll introduce what we call 'age assurance' or 'age estimation' techniques, utilising lots of different tools to assess the age of their users.
If they're not confident of a user's age, they can go back to that user and ask for further information, at which point it might rise to the level of age verification.
So it's a little bit like when a young person tries to buy alcohol in a supermarket and we apply Challenge 25 - platforms will aim above the threshold to make sure they're definitely implementing that age restriction.
That's still a huge leap forward, isn't it?
Michelle: It's a massive leap forward. When I took over the bill, many people said to me, 'You can't make it any stronger for children,' and I felt passionately that this bill at its heart does need to be about children. They need to be protected.
And some of the additional things we've done in the last year include adding that age restriction, making sure the voice of children comes through via the children's commissioner, and injecting some common sense: if we all believe things should be illegal, let's make them illegal.
So going forward, the encouragement and promotion of self-harm - not just for children but for adults online - will be made illegal. Cyber flashing will be made illegal, intimate image abuse will be made illegal, and that's all in this one piece of legislation, which is world-leading.
What we're doing here is being looked at across the globe and there's a movement now to ensure that the Internet and social media is safer, especially for our children.
Let's talk about harmful online content on the apps. As a parent, how will my child be protected going forward?
Michelle: Once the bill becomes law and starts to be implemented, those social media companies will have a legal duty to ensure that your child, and my child when he's older, doesn't see that kind of content - videos, images, words and so on - because we know the ramifications are horrendous.
We've worked with campaigners and bereaved families throughout this process, people like Ian Russell, who know only too well the devastating consequences that this can have. It's easy to overlook how important this piece of legislation is. It really is a game-changer in that respect.
So social media platforms will have to introduce lots of systems and processes, utilising tools to ensure that kind of content doesn't appear on their platform. And they will have to be proactive as well as reactive, so that if something does slip through the net, they react very quickly.
The regulator will expect them to produce public risk assessments detailing all of that, which will also be available to parents and teachers to help them better understand the issue.
They will also have to be transparent about the types of technology they're using, so that everybody can have confidence in what they are doing, how they are doing it, and that they're actually protecting our children.
In terms of timelines, when can we expect to see the changes happening?
Michelle: We're expecting royal assent, which is the process where it becomes a law, within weeks. And then from day one, these platforms will be working with the regulator.
In fact, they already have been doing that. So, I've been working with the platforms to make sure that this bill is doable and that they can actually adhere to the things within it, and we've seen their behaviour change to date.
If you look at TikTok and Instagram, both have introduced changes. TikTok has introduced the policy that under 16s can no longer have direct messaging facilities. Instagram has also introduced changes, and that is in response to this bill and them getting ready for it.
So, from day one of implementation, they'll work with the regulator who will be setting out their expectations and how they can start to implement those, and we'll see some changes quite quickly.
Within two months, Ofcom, which will be the regulator, will take on the duty of acting as an information gatherer on behalf of coroners, so that they can obtain information and data from social media companies for inquests in cases that tragically end in a death.
We've seen those very public cases like the Molly Russell case, where there were difficulties in acquiring some of the information to ascertain what exactly happened. We want to try and stop that and make that process easier.
Then over the course of 18 months, you'll see a lot of those changes happening.
There will be consultations with different groups in society: people like the children's commissioner, the domestic abuse adviser, different charities, groups and organisations and social media companies so that the rules and guidance can be written in a way that's really clear, and that enables the law to be fully implemented in the best way.
If parents do come across harmful content, who should they report it to?
Michelle: One of the things that we've introduced in this bill is boosting the complaints process for individuals, but also for groups, so that they can flag that kind of material and concerns directly to the platforms.
The regulator will be proactive, not just reactive. I've worked with the regulator in the build-up to this, and it's fully envisaged that they will be actively keeping an eye and an ear out for the concerns people have.
So if journalists, for instance, were raising concerns, the regulator could look into that proactively rather than just waiting for complaints to come in.
What are your views on how some of these apps are designed?
Michelle: The way the bill works is that platforms have to adhere to all of the requirements within it. So when it comes to children, they cannot be shown illegal content, and they can't be shown content that's going to be harmful to them. And when it comes to adults, we're calling it a triple shield, really.
First of all, adults can't see illegal content because what is offline should be the same as online. It's all one world.
Secondly, they have to enforce their own terms and conditions, because they have to be transparent - if they say they're going to do something, they have to do it. For too long, we've seen some of these social media platforms treating different members and groups of society differently from others. That's not acceptable.
And thirdly, they will also have to offer the ability for users to have more empowerment and choice over the content that they see. And as I said, also making things illegal that should be illegal.
So social media platforms will need to consider all of that and make sure they can comply from the moment they design their apps - whether they are new companies entering the market or platforms that already exist.
But this bill doesn't do everything that everybody would ever want when it comes to the Internet or app design, and there will be future bills in this space.
But this is a massive leap forward in terms of where we're coming from and where we're going.
This is the world's leading piece of legislation in this space. No other country has produced such a comprehensive bill, and we've managed to overcome many challenges on the way. Many people said we couldn't do it and we fought through that because it's so important to protect our children when they go online.
Is the government doing anything more to support parents with their children's online safety?
Michelle: I think it's a massive support knowing that this bill is coming and knowing that we're going to take this zero-tolerance approach when it comes to children.
Because, like you say, children at the moment are almost two clicks away from adulthood. That's really impeding the ability of a parent to parent - and a parent can't be with their child all hours of every day.
We do need to enable parents and support parents by changing the law in this really radical way, but also, we need to support parents and children with media literacy.
There is a provision in the legislation that I've added around media literacy, with Ofcom progressing work in this area. I've also been working with the Department for Education on media literacy in our schools, because this is an important part of the answer.
I think personally, having spoken to those parents and parent groups, that there is a sense of relief: this legislation will give them the comfort of knowing that their children won't be exposed to some of the most horrific images. Who knows the untold mental health damage that exposure is causing our children?
One of the areas that we've introduced in this bill is making sure that children can't access porn.
I almost feel gobsmacked saying it, because the situation we've got at the moment is awful. Studies show that the average age at which a child first sees pornography is now 13 years old. That's the average. No wonder things like porn addiction are on the rise, and children's view of healthy relationships is being distorted.
That's why it's important that we protect our children and protect their innocence for as long as possible.
Let's talk about end-to-end encryption on social media messaging services. How are we going to help young victims who are moved off apps onto private messaging by abusers?
Michelle: There's been a great deal of confusion on this. We haven't actually taken that provision out of the bill - it's still there.
The situation is that this government and I personally as well, wholeheartedly support encryption. We believe in privacy. Everybody deserves the right to privacy when they go onto a social media platform.
But there is a line, and I think that the line should be drawn when it comes to talking about child abuse and sexual exploitation.
What we're saying is this: if a platform were encrypted and the Home Office or the police had a concern that there was child abuse and sexual exploitation on that encrypted platform, they would work with the regulator to put mitigations in place - to try to prevent, remove and eradicate that abuse and predatory action on the platform.
If those mitigations weren't enough, then as a last resort, the regulator could say to that platform: we need you to research technology that will enable encryption to be protected whilst allowing child abuse and sexual exploitation material to be detected.
Now, we have already run a fund to look at this, the Challenge Fund, and it came up with a proof of concept, so we know it is possible.
The work now just needs to be done to actually develop that specific technology, and we would then ask the platform to introduce that technology.
It may be that it never needs to be used, but I think it's morally the right thing to do to put that safety net in the legislation in case we ever have to deploy it, because we cannot turn a blind eye to paedophilia.
We cannot turn a blind eye to predators hunting down innocent children on social media platforms. That's just not okay.
And that's something that I as a minister personally am convinced we need to tackle and work with the platforms to find solutions that will enable us to protect encryption whilst also not protecting paedophiles.
We are the first generation of parents dealing with these issues. We can't ask grandparents: 'What did you do?' It's a huge thing for children and parents to deal with…
Michelle: It really is a huge thing. We also know that predators try to utilise some of these social media platforms directly to target children, and that's why we have to be aware that there are risks on these platforms. We have to make sure that we are stamping out those risks, even if that means being radical.
That's why we're producing this piece of legislation that's world-leading. When I speak to my counterparts in other countries, they tell me they're waiting for this legislation to go through so that they can replicate it, because it's not an easy bill to pass. This took us a very long time. There are lots of thorny issues in it, though a lot of it is common sense.
I think most people would agree that protecting children online should be a priority, but some of it is quite tricky, and we've navigated through it.
We've stuck with it, and we're now delivering a really comprehensive piece of legislation that will go further than was ever intended because we think it's the right thing to do.
HELLO!: We're celebrating it at HELLO! and looking forward to the changes over the coming months.