Public Knowledge today released a landmark analysis showing that the filtering of Internet content advocated by big media companies will not work and would harm the Internet. The full paper is here.
Gigi B. Sohn, president and co-founder of Public Knowledge, said: “Our study, 'Forcing the Net Through a Sieve: Why Copyright Filtering is Not a Viable Solution for U.S. ISPs,' examines for the first time the complex topic of content filtering from the technical, economic and legal perspectives. Content filtering fails in all of these tests. Filtering will not be the 'magic bullet' that the media moguls want, but it could degrade and alter the Internet for everyone while invading the privacy of every Internet user. There is no reason that any Internet Service Provider or media company should even think about engaging in such activity.”
The report was submitted with Public Knowledge's reply comments to the Federal Communications Commission (FCC) in the proceeding asking for information on how to structure a national broadband plan. Several content-related parties suggested content filtering should be part of such a plan. PK’s reply comments, other than the report, are here.
According to the PK report: “The content industry would like to convince policymakers and the general public that copyright filtering is the most effective means by which to combat online copyright infringement and protect America's creative economy. This could not be further from the truth. In practice, copyright filtering is likely to harm innovators, end users, online service providers and Internet service providers alike. What's more, it will compromise the privacy of all American Internet users for the perceived benefit of one industry. As such, copyright filtering will discourage investment in the Internet economy—our most promising engine for economic growth—and will harm American competitiveness in the global market.”
The study is divided into four sections. The first, “Technological Analysis: The Anatomy of a Copyright Filter,” shows how filters work and how complex such a filter would have to be, given the intricacy of U.S. copyright law and the volume of Internet traffic involved. According to the technological analysis, “depending on the technology used to identify copyrighted works, copyright filters will be underinclusive, overinclusive or both. The filter will fail to identify all copyrighted works that pass through it, will filter out legal, legitimate content or, as is the case with most filtering technologies currently on the market, the filter will fail on both counts.”
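The under- and over-inclusion trade-off the analysis describes can be illustrated with a minimal sketch. The blocklist, payloads, and exact-hash matching below are illustrative assumptions for this example, not the report's actual filter design:

```python
import hashlib

# Hypothetical blocklist of fingerprints of known copyrighted files
# (assumption: the filter identifies works by exact SHA-256 hash).
BLOCKLIST = {hashlib.sha256(b"copyrighted song data").hexdigest()}

def exact_filter(payload: bytes) -> bool:
    """Return True if the payload matches a fingerprinted work and should be blocked."""
    return hashlib.sha256(payload).hexdigest() in BLOCKLIST

# The exact copy is caught...
assert exact_filter(b"copyrighted song data") is True
# ...but the same work altered by a single byte (or re-encoded) slips
# through unrecognized: the filter is underinclusive.
assert exact_filter(b"copyrighted song datA") is False
```

Loosening the match to catch altered copies (e.g., fuzzy or perceptual hashing) pushes the error in the other direction, flagging lawful content that merely resembles a fingerprinted work, which is the overinclusive failure the report describes.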
The second section, “Limitations and Consequences of Copyright Filtering,” discusses how the protocol used to transmit traffic and the type of media are poor indicators for determining which traffic should be filtered. In addition, the section shows how filtering would slow Internet traffic, create security risks, and set off a “technological arms race” as users try to evade filters, as has happened in Iran and in France. The losers in that arms race would be everyone using the Internet.
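Why protocol is a poor proxy for infringement can be shown with a minimal sketch. The classifier, protocol names, and filenames here are illustrative assumptions, not taken from the report:

```python
# Hypothetical protocol-based filter: block everything on a "suspect"
# protocol, regardless of what the traffic actually contains.
def protocol_filter(protocol: str) -> bool:
    """Return True if traffic on this protocol should be blocked."""
    return protocol == "bittorrent"

# Overinclusive: a lawful Linux distribution distributed via BitTorrent
# is blocked along with everything else on that protocol.
assert protocol_filter("bittorrent") is True
# Underinclusive: infringing content sent over plain HTTP passes untouched.
assert protocol_filter("http") is False
```

The same payload judged differently depending only on its transport is exactly the failure mode the section identifies: the protocol says nothing about whether the content is lawful.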
The third section, “Economic Analysis,” asks who would pay for the filtering and examines the lack of economic incentives ISPs have for installing them, and the harm to the Internet economy that would result as a consequence of deploying network-level filters.
The final section, “Legal Analysis,” argues that copyright filters would be an “unconstitutional burden on free expression,” would undermine the “safe harbor” provisions of the Digital Millennium Copyright Act, which protect ISPs and online service providers from liability under copyright law, and may violate the Electronic Communications Privacy Act.
Members of the media may contact Communications Director Shiva Stella with inquiries, interview requests, or to join the Public Knowledge press list at email@example.com or 405-249-9435.