Public Knowledge opposes the use of network filters as a means to curb widespread copyright infringement on the Internet. These are technologies that analyze network traffic and selectively block it, either because the content being transmitted is itself determined to be infringing, or because the traffic is being sent by a protocol or application pre-determined to be “illegitimate.” The first kind of filter looks at your traffic and blocks it if it is, for example, Spider-Man 3. The second kind of filter looks at your traffic and blocks it if it is, for example, BitTorrent (regardless of whether you're transmitting Spider-Man 3 or Ubuntu ISOs). While we agree that copyright infringement is a problem that needs to be addressed, we've argued that network filtering technologies (such as deep packet inspection) come at too high a cost for too little gain.
However, a fair point to raise in response to our opposition to the use of these technologies is that network filters aren't supposed to stop 100% of all illegal copying. Rather, they are a technology that is “good enough” – one that can solve maybe 80% of the problem. This notion of achieving the “80% solution” has some appeal. As Voltaire said, “Le mieux est l'ennemi du bien”: the perfect is the enemy of the good. One shouldn't allow a quest for the perfect technology to detract from a good-enough solution that will solve most of the problem today.
A related concept is what Paul Ohm, a professor at my law school, calls “The Myth of the Superuser”: the idea that we shouldn't let our policy and technology choices be driven by what we suppose some über-hacker might do. Instead, we should treat outliers like outliers and base our decision-making on the most common use cases.
Back to network filters. Let's set aside the problem of overinclusiveness for now – a technology that actually achieved an 80% solution at the cost of blocking a large number of noninfringing uses would, of course, still be unacceptable. Instead, we should look at another “80% principle” – the Pareto Principle. This well-known business guideline states that 80% of a business's sales will come from only 20% of its customers – and, more generally, that 80% of effects come from 20% of causes. These rule-of-thumb distribution ideas can be compelling; related ideas are Chris Anderson's Long Tail (minority tastes, in the aggregate, can outweigh majority tastes) and Sturgeon's Law (90% of anything is garbage). Although you have to be very cautious when applying these rules of thumb, a careful look at the facts shows that the idea of network filters as an 80% solution to copyright infringement is undermined by the Pareto principle's own 80/20 rule.
According to the content companies and ISPs, copyright infringement (and its proxy, super-high bandwidth use) on the Internet follows a Pareto-type distribution: most infringement is committed by a minority of users. NBC Universal lays the blame on a small minority of “bandwidth hogs.” Comcast claims that its recently instituted bandwidth caps affect only 0.01% of users.
So, if you really want to achieve the 80% solution to copyright infringement, it's not enough to block 80% of infringers. If you want to stop 80% of the infringement, you need to block the (say) 20% of users who account for the majority of the problem. A quick back-of-the-envelope sketch makes the arithmetic concrete.
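Here is a minimal sketch of that arithmetic in Python. The numbers (20 “superusers” responsible for 80% of infringing traffic) are hypothetical, chosen only to illustrate the Pareto-style split described above; they are not drawn from any actual measurement.

```python
# Hypothetical Pareto-style split, for illustration only:
# 20 "superusers" generate 80% of infringing traffic;
# the other 80 casual users generate the remaining 20%.

total_users = 100
superusers = 20                     # tech-savvy minority
casual_users = total_users - superusers

superuser_share = 0.80              # fraction of infringement from superusers
casual_share = 1 - superuser_share  # fraction from casual users

# Suppose a filter catches every casual user but none of the superusers,
# who route around it (encryption, proxies, protocol obfuscation, etc.).
infringement_blocked = casual_share
infringement_remaining = superuser_share

print(f"Infringers blocked: {casual_users} of {total_users} users")
print(f"Infringement stopped: {infringement_blocked:.0%}")
print(f"Infringement remaining: {infringement_remaining:.0%}")

# Result: blocking 80% of infringers stops only ~20% of the infringement.
```

The point is purely arithmetical: so long as the filter catches only the users who do the least infringing, the percentage of infringers blocked tells you very little about the percentage of infringement stopped.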
And this is the problem with the 80% solution idea when applied to network filters. The minority of users who do the most infringing are, relatively speaking, “superusers.” Users who know how to set up BitTorrent, install PeerGuardian, and configure port forwarding are also the very same users who can figure out how to bypass traffic shaping: the information is only a Google search away.
Network filters have no hope of stopping the few, determined pirates who are technologically savvy and who account for the majority of copyright infringement on the Internet. Rather, at best, they might inconvenience the average user, who might only turn to “piracy” in order to check out the latest hot single or the episode of a favorite TV show she forgot to TiVo the night before. Then, of course, there is the fact that the determined infringers will help their less technologically knowledgeable friends bypass filtering technology, further limiting the effectiveness of the filters.
Network filters are the least effective and most intrusive means of curbing copyright infringement on the Internet. They might have a temporary, minor impact on the problem at the high cost of stifling innovation, invading privacy, blocking legitimate uses of both copyrighted and public domain works, and cementing centralized control over Internet applications.
Network filters are only the camel's nose under the tent. Their end point is the destruction of the end-to-end principle that made the Internet great. They are not the 80% solution to copyright infringement.