Jul 12, 2017

The following was also posted on Techdirt.

It’s always hard to write about the policy implications of tragedies – the last thing their victims need is the politicization of what they suffered. At the same time, it’s important to learn what lessons we can from these events in order to avoid future ones. Earlier Mike wrote about the chilling effects on Grenfell residents’ ability to express their concerns about the safety of the building – chilling effects that may have been deadly – because they lived in a jurisdiction that allowed critical speech to be easily threatened. The policy concern I want to focus on now is how copyright law also interferes with safety and accountability both in the US and elsewhere.

I’m thinking in particular about the litigation Carl Malamud has found himself facing because he dared to post legally enforceable standards on his website as a resource for people who wanted ready access to the law that governed them. (Disclosure: I helped file amicus briefs supporting his defense in this litigation.) A lot of the discussion about the litigation has focused on the need for people to know the details of the law that governs them: while ignorance of the law is no excuse, as a practical matter people need a way to actually know what the law is if they are going to be expected to comply with it. Locking it away in a few distant libraries or behind paywalls is not an effective way of disseminating that knowledge.

But there is another reason why the general public needs to have access to this knowledge. Not just because it governs them, but because others’ compliance with it obviously affects them. Think for instance about the tenants in these buildings, or any buildings anywhere: how can they be equipped to know if the buildings they live in meet applicable safety standards if they can never see what those standards are? They instead are forced to trust that those with privileged access to that knowledge will have acted on it accordingly. But as the Grenfell tragedy has shown, that trust may be misplaced. “Trust, but verify,” it has been famously said. But without access to the knowledge necessary to verify that everything has been done properly, no one can make sure that it has. That makes the people who depend on this compliance vulnerable. And as long as copyright law is what prevents them from knowing if there has been compliance, then it is copyright law that makes them so.

Jul 06, 2017

The following was originally posted on Techdirt.

Sunday morning I made the mistake of checking Twitter first thing upon waking up. As if just a quick check of Twitter would ever be possible during this administration… It definitely wasn’t this past weekend, because waiting for me in my Twitter stream was Trump’s tweet of the meme he found on Reddit showing him physically beating the crap out of a personified CNN.

But that’s not what waylaid me. What gave me pause were all the people demanding it be reported to Twitter for violating its terms of service. The fact that so many people thought that was a good idea worries me, because the expectation that when bad speech happens someone will make it go away is not a healthy one. My concern inspired a tweet storm, which has now been turned into this post.

Jul 05, 2017

The attached paper is a re-publication of the honors thesis I wrote in 1996 as a senior at the University of California at Berkeley. As the title indicates, it was designed to study Internet adoption among my fellow students.