
Porn, abuse, depravity - and how they plan to stop it

Part one: Strangling content

Policing the Internet

Contrary to popular belief, the government and police forces have hitherto not exerted a great deal of direct control over content. But, after a decade of growth in self-regulation and filtering by the industry to avoid government intervention, that may be about to change.

Current UK law on content is a mish-mash. The first and main stop-off must be the Obscene Publications Act 1959 (pdf), which made it an offence to publish material likely to "deprave and corrupt". This has been followed over the years by various laws brought in to deal with specific media and moral panics.

In the late 1950s, for example, a law was passed to regulate children's comics - and was then used just once in the next 50 years. And in 1984, panic over "video nasties" led to the Video Recordings Act (pdf), which has been used rather more frequently.

Child protection has been a favourite wedge for politicians, and this has introduced a new principle to UK law over the last couple of decades, making possession of specific images an offence. Recently, this principle has been extended from child abuse into the areas of porn and terror.

We are not immune...

The first sign that the net wasn't entirely above the law came in 2000 with the settlement of the Godfrey v Demon case. Dr Laurence Godfrey had sued Demon for hosting a libel in the form of a Usenet posting, and Demon settled out of court for £15,000 plus £250,000 in costs. This was a wake-up call to ISPs, who suddenly grasped that - in the UK at least - they might bear some responsibility for the content they hosted.

The late 90s also saw the emergence of the one body most of us associate with internet policing - the Internet Watch Foundation (IWF).

As public concern about the availability of paedophilic material on the net grew, one of the UK's more influential net figures, Peter Dawe (then CEO at Pipex), realised that unless ISPs took action they would be forced to do so by government. He was therefore a key figure in the creation of the IWF, which was founded in 1996 by the ISPA, the trade body for Internet Service Providers, and other leading players in the internet industry.

The IWF model probably comes closest to what we might think of as an internet policeman. It runs a hotline through which the public and IT professionals may report any inadvertent exposure to potentially illegal content online.

The IWF then investigates, and its staff liaise closely with the police in order to align their standards with what the law says. The organisation then has two possible avenues of response. If a site deemed to be hosting illegal content is found to be operating within the UK, the IWF issues a take-down notice, which usually results in that site coming down within 24 hours. For content hosted outside the UK, the IWF sends details to the relevant authorities for investigation.

The IWF also maintains a list of potentially illegal URLs, updated twice daily, which is provided to all major UK ISPs. At any one time, the list contains between 800 and 1,200 live child abuse URLs, with around 50 added daily.

In theory, owners of websites blocked or taken down by the IWF have a right of appeal. But according to the IWF this circumstance has never arisen.

The IWF's remit is slightly peculiar. Its focus - and the vast majority of its work - is child abuse material hosted anywhere in the world. But it also polices UK-hosted sites whose content is deemed to be criminally obscene or to incite racial hatred (racial hatred only - no other form of hatred is covered).

Is there any question mark over where the IWF might next extend its empire? Talking to The Register, the IWF suggested that the clampdown on extreme porn was a refinement of its existing criminal obscenity remit, and said it was consulting its board and the online industry about its potential future role.
