In Depth News Feature: Why the UK doesn't moderate UGC

Although they may be hotly debated, many of the Select Committee's findings on how children are kept away from harmful content on the internet are sensible and well-measured.

Using the common-sense findings of Tanya Byron as its starting point, the recommendations on a whole host of subjects make a lot of sense, but perhaps the most important is the declaration that sites should not be penalised for actively moderating their content.

To understand why this is relevant to the UK internet industry, you have to consider the confusion that has reigned in the past over the way companies approach user-generated content (UGC) such as forums and uploaded pictures and videos.

The way it is

The primary reason that many of the UK's major internet companies - Microsoft's MSN and our own Future Publishing's websites, for instance - have adopted their current policy on UGC is the EC E-Commerce Directive.

This directive deals with the responsibility of companies for content published on their websites of which they do not have 'actual knowledge'. As the Committee report explains:

"Under regulation 17 of the Electronic Commerce (EC Directive) Regulations 2002 (which transpose the Directive into UK law), companies that transmit Internet content on behalf of others (such as a user's profile page on a social networking site) cannot be held liable for anything illegal about the content if they did not initiate the transmission, select the receiver, or select or modify the information contained in the transmission.

"Nor is a service which hosts Internet content liable for damages or for any criminal sanction as a result of that storage if they do not have "actual knowledge" of unlawful activity or information and if, on becoming aware of such activity, they act "expeditiously" to remove or to disable access to the information."

In other words, if you don't know about it then you can't be held responsible. This has led many companies to take the stance that if they actively moderate their UGC, they could feasibly be considered to have 'actual knowledge' of all content posted, which would make them legally responsible not just for the posting of unsuitable material but also for libellous comments.

As a result, the majority of major companies have taken a passive 'report and take-down' approach to ensure that they can use regulation 17 as a defence.
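
To make the distinction concrete, here is a minimal, hypothetical sketch of that passive workflow, assuming a simple in-memory store. The class and method names are illustrative assumptions, not anything prescribed by the Directive or the report:

```python
# A minimal, hypothetical sketch of the passive 'report and take-down'
# workflow described above. Names and structure are illustrative only.

class ReportAndTakeDown:
    def __init__(self):
        self.live = {}      # content_id -> content, published unreviewed
        self.reports = []   # queue of (content_id, reason) pairs

    def publish(self, content_id, content):
        # No screening on upload: the host does not 'select or modify'
        # the material, so it can argue it lacks 'actual knowledge'.
        self.live[content_id] = content

    def report(self, content_id, reason):
        # A user report is the first point of 'actual knowledge'.
        self.reports.append((content_id, reason))

    def process_reports(self):
        # On becoming aware, act 'expeditiously' to remove access.
        while self.reports:
            content_id, _reason = self.reports.pop(0)
            self.live.pop(content_id, None)

host = ReportAndTakeDown()
host.publish("vid42", "an uploaded video")
host.report("vid42", "unsuitable material")
host.process_reports()  # "vid42" is no longer live
```

The key point is that nothing is inspected until a report arrives - which is precisely what keeps the regulation 17 defence intact.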

Head in the sand

Tanya Byron's response to this was to suggest that the approach "is a bit like saying that it is unfair to ask companies to survey their premises for asbestos in case they find some but fail to remove it safely", adding that "on this issue, companies should not hide behind the law."

That is a fair comment, but it does not salve the fears of the companies that take the passive stance. Under the current system, those that actively moderate could well be found to have prior knowledge through moderation, so you can't blame those who choose the 'head in the sand' approach - at least until the law is clarified.

And the committee appears to appreciate this as well, sensibly going so far as to suggest that the government should 'seek amendment to the Directive if it is preventing ISPs and websites from exercising more rigorous controls over content'.

Public interest

The report says: "We do not believe that it is in the public interest for Internet service providers or networking sites to neglect screening content because of a fear that they will become liable under the terms of the EC E-Commerce Directive for material which is illegal but which is not identified.

"It would be perverse if the law were to make such sites more vulnerable for trying to offer protection to consumers. We recommend that Ofcom or the Government should set out their interpretation of when the E-Commerce Directive will place upon Internet service providers liability for content which they host or to which they enable access."

It's a commendable stance, albeit one that needs to be backed up in actual law in order to convince companies that they should switch to an active moderation system.

Sheer volume

However, as you have probably noted, there is another problem with active moderation: it is a costly and time-consuming process, and when your UGC is of the volume of a site like YouTube or Flickr, the financial burden is potentially massive.

Indeed, Google - the owners of YouTube - have expressed doubts that it would even be feasible to proactively moderate every post.

"We have strict rules on what's allowed, and a system that enables anyone who sees inappropriate content to report it to our 24/7 review team and have it dealt with promptly," a spokesman told the BBC.

"Given the volume of content uploaded on our site, we think this is by far the most effective way to make sure that the tiny minority of videos that break the rules come down quickly."

Not an excuse

The committee does not accept that volume is an excuse saying: "We found the arguments put forward by Google/You Tube against their staff undertaking any kind of proactive screening to be unconvincing.

"To plead that the volume of traffic prevents screening of content is clearly not correct: indeed, major providers such as MySpace have not been deterred from reviewing material posted on their sites.

"Even if review of every bit of content is not practical, that is not an argument to undertake none at all."

So, essentially, the Committee is saying that companies should be seen to try to moderate their content even if they can't do so with 100 per cent effectiveness - which seems entirely reasonable.
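
For illustration, partial screening of the kind the Committee describes could look something like the sketch below: reviewing a random sample of uploads plus anything a cheap filter flags, rather than attempting to review everything. The sample rate and blocklist are assumed figures, not anything the report sets:

```python
import random

# Hypothetical sketch of partial proactive screening. The sample rate
# and blocklist are illustrative assumptions only.

SAMPLE_RATE = 0.05                     # review 5% of uploads at random
BLOCKLIST = {"example-banned-term"}    # stand-in for a real classifier

def needs_human_review(text: str) -> bool:
    flagged = any(term in text.lower() for term in BLOCKLIST)
    sampled = random.random() < SAMPLE_RATE
    return flagged or sampled

uploads = ["a harmless clip", "contains example-banned-term"]
review_queue = [u for u in uploads if needs_human_review(u)]
```

Even a scheme this crude reviews some content proactively, which is the Committee's point: imperfect screening is still screening.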

The WORLD wide web

Of course, regulating UK sites is one thing, but the laws don't apply to much of the rest of the internet - a truly global medium.

However, Committee chair John Whittingdale MP told TechRadar that this shouldn't lead to a laissez faire attitude.

"Just because people can get around the rules doesn't mean that there should be no rules," said Whittingdale.

"We want the industry to self-regulate and produce its own list of standards that people comply to.

"The sites that are prepared to comply should want to advertise this to their users. It should be something that companies are proud of saying: 'we will keep your kids safe from harmful content'."
