Social media sites, and those that host user-generated content, need to do more to screen the content on their sites and protect users—particularly children—from videos that could be considered harmful, according to a UK parliamentary committee. The House of Commons' Culture, Media and Sport Committee released its tenth report today, titled "Harmful content on the Internet and in video games," which examines "the Internet's dark side" and what should be done to keep users safe. The Committee feels that social media sites need to adopt stricter policies, implement more content filtering, and make it easier to report abuse.
The Committee starts off by describing the Internet as a place "where hardcore pornography and videos of fights, bullying or alleged rape can be found, as can websites promoting extreme diets, self-harm, and even suicide." Because of this, websites like MySpace, Facebook, and YouTube need to take a more active stance against offensive or illegal content than they do currently. The Committee expressed distress that there appeared to be an industry standard of 24 hours to remove content that contains child abuse, for example, and strongly recommended making such important issues higher-priority.
Another area of concern was the revelation that videos uploaded to YouTube go through no filtering (human or computer) before being posted to the site. Google argued that the task of doing so would be nearly impossible, as some 10 hours of video are uploaded to the site every minute of the day, but the Committee was having none of it. "To plead that the volume of traffic prevents screening of content is clearly not correct: indeed, major providers such as MySpace have not been deterred from reviewing material posted on their sites," reads the report. It urges YouTube and user-generated content sites in general to implement technology that can screen file titles for questionable material (since we all know that people uploading illegal content always make sure that the filename is specific and accurate).
Other recommendations included making terms of service more prominent and easier for users to find, enabling all possible privacy controls by default, requiring users to deliberately and manually make their profiles more public, and implementing controls that make it easy for users to report instances of child porn directly to law enforcement. The Committee encouraged the industry as a whole to come up with standards, and recommended that a minister be appointed to oversee these developments.
Of course, the nature of the Internet means that those who are interested in spreading illegal content—whether it's copyrighted material or child porn—will always try to remain a step ahead of filters and law enforcement, and their sheer numbers make success likely. The Committee seems to realize this to some degree, but argues that perfection should not be the enemy of the good. Child safety should be of utmost priority, says the Committee, and any costs or technical limitations should be considered second to protecting children when it comes to the Internet.
Further reading:
- Harmful content on the Internet and in video games (PDF)
- Harmful content on the Internet and in video games, oral and written evidence (PDF)