Sep 29, 2013

This past week California passed a law requiring website owners to allow minors (who are also residents of California) to delete any postings they may have made on the website. There is plenty to criticize about this law, including that it is yet another example of a legislative commandment cavalierly imposing liability on website owners with no contemplation of how compliance is even technically feasible.

But such discussion should be moot. This law is precluded by federal law, in this case 47 U.S.C. Section 230. By its provisions, Section 230 prevents intermediaries (such as websites) from being held liable for content others have posted on them. (See Section 230(c)(1)). Moreover, states are not permitted to undermine that immunity. (See Section 230(e)(3)). So, for instance, even if someone were to post content to a website that might be illegal in some way under state law, that state law cannot hold the website hosting the content liable for it (nor can it require the website to delete it). But that is, in essence, what this law proposes to do: make websites liable for content others have posted to them.

As such, even aside from the other constitutional infirmities of this law, such as the compelled-speech problems raised by forcing website owners to either host or delete content at someone else's behest (see a discussion from Eric Goldman about this and other constitutional problems here), it is also constitutionally preempted by a prior act of Congress.

Some might argue that the intent of the law is important and noble enough to forgive these problems. Unlike in generations past, kids today truly do have something akin to a “permanent record,” thanks to the ease with which the Internet collects and indefinitely stores the digital evidence of everyone’s lives. But such a concern requires thoughtful consideration of how best to ameliorate those consequences, if it is even possible to do so, without injuring the important free speech principles and values the Internet also supports. This law offers no such solution.

Jul 28, 2013

I was asked to write the “Posts of the Week” for Techdirt this past weekend and used it as an opportunity to convey some of the ideas I explore here to that audience. The post was slightly constrained by the contours of the project — for instance, I could only punctuate my greater points with actual posts that appeared on Techdirt last week — but I think they held together coherently, and I appreciated the chance to reframe some of the issues Techdirt was already exploring in this way.

In any case, I’ve decided to cross-post my summary here, partly because I always like to host a copy of my guest blog posts on one of my sites, and partly because it gives me a chance to update and annotate those ideas further. Please do go visit Techdirt though, which was kind enough to ask me to do this, to read more about the items described below.

Feb 06, 2012

While I was working on this post, Eric Goldman beat me to the punch and posted something similar. But great minds and all that… Intermediary liability is also such a crucial issue related to the criminalization of online content that I want to make sure plenty of discussion of it takes place here.

In addition to the First Amendment, free speech on the Internet in the US is also advanced by 47 U.S.C. Section 230, an important law that generally serves to immunize web hosts from liability for user-generated content. (See Section 230(c)(1): “No provider . . . of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”). Note that this law does not absolve the content itself of any intrinsic liability; it just means that the host cannot be held liable for it. Only the user who posted it can be.

This small rule has a big impact: if hosts could be held liable for everything their users posted, they would be forced to police and censor it. True, the effect of this immunity means that sometimes vile content can make its way online and linger there, potentially harmfully. But it also means that hosts, not being forced to act as censorious middlemen, are not tempted to kill innocuous, or even abjectly good, content. As a result all sorts of vibrant communities and useful information have been able to take root on the Web.

But for this immunity to be really meaningful, it is not enough that it protect the host from a final award of damages. It is extremely expensive to be dragged into court at all. If hosts legitimately fear having to go through the judicial process to answer for users’ content, they may find it more worth their while to become censorious middlemen with respect to that content, in order to ensure they never need to go down this road.

Which brings us to Fair Housing Council of San Fernando Valley v. Roommates.com, both the seminal piece of Section 230 jurisprudence and its more recent epilogue, each part of the attempted civil prosecution of a web host for Fair Housing Act violations.