In Depth News Feature: Why the UK doesn't moderate UGC
Although their findings may be hotly debated, much of what the Select Committee found about the way in which children are kept away from harmful content on the internet is sensible and well-measured.

Using the common-sense findings of Tanya Byron as its starting point, the recommendations on a whole host of subjects make a lot of sense, but perhaps the most important declaration is that sites should not be penalised for actively moderating their content.

To understand why this is relevant to the UK internet industry, you have to consider the confusion that has reigned in the past over the way that companies approach user-generated content (UGC) such as forums and uploaded pictures and videos.

The way it is

The primary reason that many of the UK's major internet companies - Microsoft's MSN and our own Future Publishing's websites, for instance - have adopted their current policy on UGC is the EC E-Commerce Directive.

This directive deals with the responsibility of companies for content published on their websites of which they do not have 'actual knowledge'. As the Committee report explains:

"Under regulation 17 of the Electronic Commerce (EC Directive) Regulations 2002 (which transpose the Directive into UK law), companies that transmit Internet content on behalf of others (such as a user's profile page on a social networking site) cannot be held liable for anything illegal about the content if they did not initiate the transmission, select the receiver, or select or modify the information contained in the transmission.

"Nor is a service which hosts Internet content liable for damages or for any criminal sanction as a result of that storage if they do not have "actual knowledge" of unlawful activity or information and if, on becoming aware of such activity, they act "expeditiously" to remove or to disable access to the information."

In other words, if you don't know about it, you can't be held responsible. This has led many companies to take the stance that if they choose to actively moderate their UGC, they could feasibly be considered to have 'actual knowledge' of all content posted, which would make them legally responsible not just for the posting of unsuitable material but also for libellous comments.

As a result, the majority of major companies have taken a passive 'report and take-down' approach to ensure that they can use regulation 17 as a defence.

Head in the sand

Tanya Byron's response to this was to suggest that the approach "is a bit like saying that it is unfair to ask companies to survey their premises for asbestos in case they find some but fail to remove it safely", adding that "on this issue, companies should not hide behind the law."

That is a fair comment, but it does not salve the fears of the companies that take the passive stance. Under the current system, those that actively moderate could well be deemed to have prior knowledge through moderation, so you can't blame those who choose the 'head in the sand' approach, at least until the law is clarified.

The committee appears to appreciate this as well, sensibly going so far as to suggest that the government should "seek amendment to the Directive if it is preventing ISPs and websites from exercising more rigorous controls over content".

Public interest

The report says: "We do not believe that it is in the public interest for Internet service providers or networking sites to neglect screening content because of a fear that they will become liable under the terms of the EC E-Commerce Directive for material which is illegal but which is not identified.

"It would be perverse if the law were to make such sites more vulnerable for trying to offer protection to consumers. We recommend that Ofcom or the Government should set out their interpretation of when the E-Commerce Directive will place upon Internet service providers liability for content which they host or to which they enable access."

It's a commendable stance, albeit one that needs to be backed up in actual law in order to convince companies that they should switch to an active moderation system.

Sheer volume

However, as you have probably noted, there is another problem with active moderation: it is a costly and time-consuming process, and when your UGC reaches the volume of a site like YouTube or Flickr, the financial burden is potentially massive.

Indeed Google - the owner of YouTube - has expressed doubt that it would even be feasible to proactively moderate every post.

"We have strict rules on what's allowed, and a system that enables anyone who sees inappropriate content to report it to our 24/7 review team and have it dealt with promptly," a spokesman told the BBC.

"Given the volume of content uploaded on our site, we think this is by far the most effective way to make sure that the tiny minority of videos that break the rules come down quickly."

Not an excuse

The committee does not accept that volume is an excuse, saying: "We found the arguments put forward by Google/You Tube against their staff undertaking any kind of proactive screening to be unconvincing.

"To plead that the volume of traffic prevents screening of content is clearly not correct: indeed, major providers such as MySpace have not been deterred from reviewing material posted on their sites.

"Even if review of every bit of content is not practical, that is not an argument to undertake none at all."

So, essentially, the Committee is saying that companies should be seen to try to moderate their content even if they can't do it with 100 per cent effectiveness - which makes a lot of sense.

The WORLD wide web

Of course, regulating UK sites is one thing, but the laws don't apply to much of the rest of the internet - a truly global product.

However, Committee chair John Whittingdale MP told TechRadar that this shouldn't lead to a laissez-faire attitude.

"Just because people can get around the rules doesn't mean that there should be no rules," said Whittingdale.

"We want the industry to self-regulate and produce its own list of standards that people comply to.

"The sites that are prepared to comply should want to advertise this to their users. It should be something that companies are proud of, saying: 'we will keep your kids safe from harmful content'."

More: http://www.techradar.com/439142
[QUOTE="The Feedster, post: 537352, member: 259515"] [IMG]http://mos.techradar.com//images/youtube-logo-200-200.jpg[/IMG] Although their findings may be hotly debated - much of the Select Committee findings on the way in which children are kept away from harmful content on the internet is sensible and well-measured. Using the common-sense findings of Tanya Byron as its starting point the recommendations on a whole host of subjects make a lot of sense, but perhaps the most important declaration is that sites should not be penalised for actively moderating their content. To understand any this is relevant to the UK internet industry you have to consider the confusion that has reigned in the past over the way that companies approach UGC content like forums and uploaded pictures/videos. [B]The way it is[/B] The primary reason that many of the UK's major internet companies - Microsoft's MSN and our own Future Publishing's websites for instance - have adopted there current policy on UGC is because of the EC-E-Commerce Directive. This directive deals with the responsibility of companies for content published on their website of which they do not have 'actual knowledge.' As the Committee report explains: "Under regulation 17 of the Electronic Commerce (EC Directive) Regulations 2002 (which transpose the Directive into UK law), companies that transmit Internet content on behalf of others (such as a user's profile page on a social networking site) cannot be held liable for anything illegal about the content if they did not initiate the transmission, select the receiver, or select or modify the information contained in the transmission. "Nor is a service which hosts Internet content liable for damages or for any criminal sanction as a result of that storage if they do not have "actual knowledge" of unlawful activity or information and if, on becoming aware of such activity, they act "expeditiously" to remove or to disable access to the information." In other words - if you don't know about it then you can't be held responsible. Which has led to many companies taking the stance that if they choose to actively moderate their UGC then they could feasibly be considered to therefore have 'actual knowledge' of all content posted which would make them legally responsible not just for the posting of unsuitable material, but libellous comments. Which means that the majority of major companies have taken a passive 'report and take-down' approach to ensue that they can use regulation 17 as a defence. [B]Head in the sand[/B] Tanya Byron's response to this was to suggest the approach: "is a bit like saying that it is unfair to ask companies to survey their premises for asbestos in case they find some but fail to remove it safely", adding that "on this issue, companies should not hide behind the law." Which is a fair comment, but does not salve the fears of the companies who take the passive stance. Inside the current system those that actively moderate could well be found guilty of having prior knowledge through moderation, so you can't blame those who choose to take the 'head in the sand' approach. At least until the law is clarified. And the committee appears to appreciate this as well - sensibly going so far as to suggest that the government should seek to 'seek amendment to the Directive if it is preventing ISPs and websites from exercising more rigorous controls over content." 
[B]Public interest[/B] The report says: "We do not believe that it is in the public interest for Internet service providers or networking sites to neglect screening content because of a fear that they will become liable under the terms of the EC E-Commerce Directive for material which is illegal but which is not identified. "It would be perverse if the law were to make such sites more vulnerable for trying to offer protection to consumers. We recommend that Ofcom or the Government should set out their interpretation of when the E-Commerce Directive will place upon Internet service providers liability for content which they host or to which they enable access." It's a commendable stance, albeit one that needs to be backed up in actual law in order to convince companies that they should switch to an active moderation system. [B]Sheer volume[/B] However, as you have probably noted - there is another problem to active moderation. It is a costly and time consuming process; and when your UGC content is of the volume of a site like YouTube or Flickr's then the financial burden is potentially massive. Indeed Google - the owners of YouTube - have expressed their doubt that it would even be feasible to pro-actively moderate every post. "We have strict rules on what's allowed, and a system that enables anyone who sees inappropriate content to report it to our 24/7 review team and have it dealt with promptly," a spokesman told the BBC. "Given the volume of content uploaded on our site, we think this is by far the most effective way to make sure that the tiny minority of videos that break the rules come down quickly." [B]Not an excuse[/B] The committee does not accept that volume is an excuse saying: "We found the arguments put forward by Google/You Tube against their staff undertaking any kind of proactive screening to be unconvincing. "To plead that the volume of traffic prevents screening of content is clearly not correct: indeed, major providers such as MySpace have not been deterred from reviewing material posted on their sites. "Even if review of every bit of content is not practical, that is not an argument to undertake none at all." So, essentially, the Committee are saying that companies should be seen to try to moderate their content even if they can't do it with 100 per cent effectiveness - which makes a lot of sense. [B]The WORLD wide web[/B] Of course, regulating UK sites is one thing, but the laws don't apply to much of the rest of the internet - a truly global product. However, Committee chair John Whittingdale MP told TechRadar that this shouldn't lead to a laissez faire attitude. "Just because people can get around the rules doesn't mean that there should be no rules," said Whittingdale. "We want the industry to self-regulate and produce its own list of standards that people comply to. "The sites that are prepared to comply should want to advertise this to their users. It should be something that companies are proud of saying: 'we weill keep your kids safe from harmful content'. 
[IMG]http://rss.feedsportal.com/c/669/f/8513/s/18f2f0a/mf.gif [/IMG][URL="http://res.feedsportal.com/viral/sendemail2.html?title=http://www.techradar.com/439142&link=In Depth News Feature: Why the UK doesn't moderate UGC"][IMG]http://rss.feedsportal.com/images/emailthis2.gif[/IMG][/URL][URL="http://res.feedsportal.com/viral/bookmark.cfm?title=http://www.techradar.com/439142&link=In Depth News Feature: Why the UK doesn't moderate UGC"][IMG]http://rss.feedsportal.com/images/bookmark.gif[/IMG][/URL] [URL="http://da.feedsportal.com/r/14855459216/f/8513/c/669/s/26160906/a2.htm"][IMG]http://da.feedsportal.com/r/14855459216/f/8513/c/669/s/26160906/a2.img[/IMG][/URL] [url=http://rss.feedsportal.com/c/669/f/8513/s/18f2f0a/story01.htm]More...[/url] [/QUOTE]