Taking full advantage of the seventh anniversary of 9/11, YouTube announced changes to its community guidelines last week, prohibiting the upload of videos inciting others to commit violent acts. The change comes several months after Senator Joe Lieberman pressured YouTube to remove videos not only inciting violence, but also content "that can be readily identified as produced by Al-Qaeda or another [Foreign Terrorist Organization]," through the organizations' logos.
Senator Lieberman issued a press release praising the move and claiming it as a "direct response to the Senator’s complaints about violent Islamist videos that have been posted on the popular website." This certainly is a partial victory for the Senator, but it is important to note that YouTube did not go quite as far as Lieberman suggests. Its new guidelines prohibit incitement to violence, which is not protected speech under the U.S. Constitution (although deciding exactly what constitutes direct incitement to violence is a difficult task, as this post on Ars Technica points out). The new guidelines do not engage in the kind of broad viewpoint discrimination suggested by Lieberman's targeting of videos bearing Islamist logos.
As I argued in a previous post, this kind of viewpoint discrimination is inconsistent with our nation's free speech norms (not the law -- remember YouTube is a private actor) and makes us look like hypocrites in the eyes of the world. Thankfully, YouTube doesn't appear willing to follow Lieberman down that road.
For more detailed coverage, see The Washington Post, Wired, and TechCrunch.