The regulator would also be able to fine Facebook and other tech giants billions of pounds and require them to publish an audit of efforts to tackle posts that are harmful but not illegal. The government is to include the measures in its Online Harms Bill.
The proposed law would not introduce criminal prosecutions, however. Nor would it target online scams and other types of internet fraud.
This will disappoint campaigners, who had called for the inclusion of both. One campaigner nonetheless welcomed the requirement for messaging apps to use technology to identify child abuse and exploitation material when directed to do so by the regulator.
"However, much will rest on the detail behind these announcements, which we will be looking at closely," she added.
The TechUK trade association said "significant clarity" was needed about how the proposals would work in practice, adding that the "prospect of harsh sanctions" risked discouraging investment in the sector.
The government claims the new rules will set “the global standard” for online safety. Plans to introduce the law were spurred on by the death of 14-year-old Molly Russell, who killed herself after viewing online images of self-harm.
In 2019, her father Ian Russell accused Instagram of being partly to blame, leading ministers to demand social media companies take more responsibility for harmful online content.
Under the proposals, Ofcom would be able to fine companies up to 10% of their annual global turnover or £18m – whichever is greater – if they refused to remove illegal content or failed to satisfy its concerns about posts that were legal but still harmful.
Examples of the latter might include pornography that is visible to youngsters, bullying, and dangerous disinformation, such as misleading claims about the safety of vaccinations. In addition, Ofcom could compel internet service providers to block devices from connecting to offending services.
The regulator would be given the ultimate say over where to draw the line and which offences would warrant its toughest sanctions.
In theory, it could fine Instagram’s parent company Facebook $7.1bn and YouTube’s owner Google $16.1bn based on their most recent earnings.
It will, however, allow Ofcom to demand that tech firms take action against child abuse imagery shared via encrypted messages, even if the apps in question are designed to prevent their makers from viewing the content.
Digital Secretary Oliver Dowden is due to address the House of Commons on Tuesday, and his department will also set out what it expects companies to do to tackle online child abuse and terrorist activity before the new laws are passed.
Mr Dowden will make a commitment to bring the bill before parliament in 2021, but it might be 2022 or later before it comes into force.
The Children’s Commissioner for England, Anne Longfield, said there were signs that new laws would have “teeth”, including strong sanctions for companies found to be in breach of their duties.