Not so Smart: The SMART Copyright Act’s Dangerous Approach to Online Copyright Protection

    For expanded analysis of the “SMART Copyright Act” and designated technical measures, and a deeper dive on standard technical measures, view our latest white paper, “Consensus, Not Command: A Smarter Approach to Standard Technical Measures,” by Public Knowledge Policy Counsel Nicholas P. Garcia.


    Recently, Senators Thom Tillis and Patrick Leahy introduced the "Strengthening Measures to Advance Rights Technologies (SMART) Copyright Act of 2022," a bill that would threaten freedom of expression, creativity, and competition. The proposed legislation vests the Copyright Office with the authority to mandate the adoption of new "designated technical measures," or DTMs, for monitoring and enforcing copyright. Simply put, the bill could force every digital platform or website that hosts user-generated or uploaded content to use content monitoring software designated by the Copyright Office, on penalty of statutory damages. This bill would reshape the internet as we know it and threatens its long-standing values of freedom, creativity, and innovation.

    Protecting creative works online is an ongoing challenge, especially for small or independent creators who lack the deep pockets of major media companies, but government-mandated monitoring is a dangerous approach and is unlikely to be fair or effective.

    On top of these considerable issues, the DTM designation process envisioned by the bill—relying on the Librarian of Congress and the Copyright Office—raises serious procedural concerns. DTMs will rely on cutting-edge technology and have deep impacts on how platforms and websites are able to function. Fully considering these decisions demands a level of technical expertise and stakeholder trust that frankly does not exist at the Copyright Office.

    DTMs Are Not STMs

    In some ways, DTMs look a lot like their cousins, "standard technical measures," or STMs, which have been part of the Digital Millennium Copyright Act, or DMCA, since its inception. Both are technical measures "used by copyright owners to identify or protect copyrighted works," and the two share some structural similarities, but DTMs are explicitly intended to be different from STMs.

    The DMCA requires websites to accommodate STMs to keep their "safe harbor" liability shield intact, but STMs have to be agreed upon and developed through consensus, and they have never gained much traction. DTMs, on the other hand, would be just as mandatory as STMs, if not more so, since noncompliance is backed by statutory damages, yet they lack that organic, consensus-driven adoption model. Instead, DTMs would be selected by the Copyright Office, and sites would have to implement and adopt them regardless of whether there is broad agreement that such measures are beneficial. This top-down method of imposing technical measures is a dangerous idea.

    STMs have never taken hold, and for good reason: the technical measures that could be used to identify or protect copyrighted material have never achieved the level of consensus required to become "standard," because they have proven to be ineffective, harmful, or unduly burdensome. The SMART Copyright Act, instead of accepting these facts, is an end-run around the original STM provisions, one that would force tried-and-failed, or new and untested, technical measures onto the whole of the internet. These technical measures will undoubtedly take the form of content filters that are both dangerous and ineffective.

    DTM Mandates Are Dangerous

    DTM mandates are an alarming prospect. The internet has prospered under the current set of rules, resulting in a vibrant, creative, free-flowing exchange of culture and ideas. The digital landscape has always been difficult to police when it comes to copyright, but protecting and enforcing copyright is an important component of the incentive structures that allow for a healthy internet ecosystem. After all, everyone who makes content on the internet relies on copyright to protect their work. But DTM mandates threaten to upset this delicate balancing act.

    Though there are no widely adopted standard technical measures used to enforce copyright, we have a good idea of what DTMs will look like and what effects they will have. The corporate content industry's favored enforcement tool is the automated content filter: software, powered by content-recognition algorithms and proprietary databases of content "fingerprints," that scans material as it is uploaded and actively combs through what is already hosted. These filters cannot accurately accommodate fair use, will chill free speech, and will hinder the creativity of content makers. The very existence of DTM mandates will throw up barriers to entry that harm competition, limit innovation, and drive further consolidation and centralization on the internet.
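
    To make concrete how such filters operate, here is a minimal, illustrative sketch of the basic decision logic. It is not any real product: the fingerprint, similarity, and screen_upload functions, the hashed byte-window approach, and the REFERENCE_DB of rights-holder fingerprints are all hypothetical stand-ins. Real systems use far more sophisticated audio, video, and image fingerprinting, but the structure is broadly similar: compare an upload against a database of reference fingerprints and block anything above a similarity threshold.

```python
# Toy sketch of an upload-time content filter (illustrative only, not any real system).
# A "fingerprint" here is just a set of hashed byte windows; production systems use far
# more sophisticated fingerprinting, but the decision logic is structurally similar:
# compare against a reference database and block above a similarity threshold.

import hashlib


def fingerprint(data: bytes, n: int = 8) -> set[int]:
    """Hash overlapping n-byte windows of the input into a set of integer features."""
    return {
        int.from_bytes(hashlib.blake2b(data[i:i + n], digest_size=8).digest(), "big")
        for i in range(max(1, len(data) - n + 1))
    }


def similarity(a: set[int], b: set[int]) -> float:
    """Jaccard similarity between two fingerprints, from 0.0 (disjoint) to 1.0 (identical)."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0


# Hypothetical stand-in for a proprietary database of rights-holder fingerprints.
REFERENCE_DB = {
    "example_work": fingerprint(
        b"All the world's a stage, and all the men and women merely players."
    ),
}


def screen_upload(upload: bytes, threshold: float = 0.4) -> str:
    """Block or allow an upload based purely on resemblance to reference works.

    Note what is missing: there is no input for the purpose of the use, the amount
    taken, commentary or parody, or market effect, i.e. the qualitative questions
    a fair use analysis requires. The filter only sees how similar the bytes are.
    """
    fp = fingerprint(upload)
    for work, ref_fp in REFERENCE_DB.items():
        if similarity(fp, ref_fp) >= threshold:
            return f"BLOCKED: resembles '{work}'"
    return "ALLOWED"


# A quotation used for criticism or commentary still trips the filter,
# because the filter measures only overlap, not context.
print(screen_upload(
    b"All the world's a stage, and all the men and women merely players. "
    b"(quoted for commentary)"
))
```

    Notice what the decision function never sees: why the material was used, how much was taken relative to the whole, whether the use is commentary or parody, or what effect it has on the market for the original. The filter acts on resemblance alone, which is exactly why it cannot perform the fair use analysis discussed next.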

    DTMs Threaten Free Speech

    The power to enforce copyright is written into the Constitution, but—as a government enforced limitation on speech—it exists in inherent tension with the First Amendment. Congress, in the Copyright Act of 1976, enshrined in statute the common law doctrine of "fair use," and the Supreme Court has clarified that fair use is a safeguard and safety valve that prevents copyright law from overtaking our right to free speech. Without fair use, and other limiting doctrines, copyright enforcement would be incompatible with the First Amendment.

    DTMs undermine fair use's ability to protect free speech because they cannot assess and accommodate instances of fair use. Fair use is a particularly complex area of law; weighing its factors requires qualitative and normative analysis that simply cannot be performed by automated systems. Many lawyers and judges struggle to apply these factors, experts can disagree, and it is unrealistic to expect software to do any better. And without the ability to account for fair use, any technical measure takes copyright enforcement beyond its constitutionally permissible bounds and infringes on free speech.

    DTMs Threaten Creativity

    One critical way creativity advances is by absorbing existing ideas, then copying, transforming, and recombining them into "new" ones. This process is common to visual artists, musicians, and writers, and even to those who work in disciplines such as history or science. Automated content filters, which are designed to seek out similarity and target "infringing" content for deletion, or to prevent it from being uploaded at all, therefore introduce unprecedented obstacles to creative work.

    Ubiquitous content-filtering DTMs would deter individuals from incorporating, iterating on, or recontextualizing existing ideas for fear that their efforts would run afoul of the DTM filters. To be clear, this extends beyond the fair use and free speech concerns discussed above; the concern is that DTMs selected by the Copyright Office would affect the actual creative process itself by disincentivizing the pattern of iteration and adaptation required to develop new work. 

    Besides the chilling effect on content creators, DTMs would affect the diversity and openness of creative spaces themselves. When faced with the prospect of needing to adopt DTMs to monitor user-generated or submitted content, many service providers may instead opt for other, more limited, content models. This translates directly to fewer creative spaces and communities, or platforms with stricter access requirements, each of which would result in fewer people having the opportunity to create, connect, and build on the internet. Such changes will only reinforce existing power dynamics, favoring large corporate content creators, while implicitly silencing and suppressing opportunities for smaller, independent creators and for diverse and marginalized voices to find a place to be heard.

    DTMs Threaten Competition and Innovation

    In addition to the threats to free speech and creativity, DTM mandates also carry serious economic and competitive costs. Requiring the adoption and implementation of specific technological copyright protection measures serves as a barrier to new entrants. Even "reasonable" royalty terms may, at current market rates, be impractically high for startups. Furthermore, accommodation requires more than just off-the-shelf licensing; DTM implementation and compliance carry additional up-front and maintenance costs.

    Mandating specific technical measures will also put a damper on innovation in copyright-intensive platform development. Accommodating DTMs requires service providers to build platforms and services that are compatible with the Copyright Office-designated technical measures, and these compatibility requirements will necessarily narrow the design space for new platforms and services by anchoring them to whatever the DTMs support. Developing new ways to create and share information is an essential component of internet innovation, and hamstringing it for the sake of copyright enforcement ultimately undermines the core purpose of copyright itself: to promote progress and creativity.

    Rather than reward innovation and competition, DTM mandates will thus become yet another factor that entrenches incumbent platforms and drives consolidation.

    DTM Mandates Are Ineffective

    The dangers that DTM mandates pose to free expression, creativity, and competition are reason enough to object to the concept. But even setting those concerns aside, an even more basic flaw emerges: DTMs simply will not provide fair or effective protection for copyrighted works. Automated identification and enforcement of copyrighted material is far from perfect, and the technical challenges involved are one of the primary reasons no STM has ever achieved the level of consensus required for broad adoption.

    For example, it is widely reported that YouTube spent over $100 million developing ContentID, its proprietary copyright enforcement technical measure. Even with all of that money, Google's vast pool of talent, and its huge market power, ContentID remains a problematic mess. It has drawn criticism from the content industry, YouTube creators, and digital rights groups alike. This is not because YouTube has taken some Solomonic, split-the-baby approach that is fair but leaves everyone unhappy; ContentID simply does not work well enough to provide fair, accurate, and consistent protection for any of the site's stakeholders.

    All of that said, ContentID is still the premier example of what could be called a copyright identification and enforcement technical measure, and YouTube is unlikely to share it. So if even YouTube has not managed to get this right, what kind of half-baked and flawed technologies would the DTM designation process force upon the internet? Whatever they turn out to be, they will not even deliver on the promise of providing copyright protection, so why accept these mandates at all?

    Designating Technical Measures Through the Copyright Office is the Wrong Approach

    Designating technical measures, then, is harmful to free expression, creativity, competition, and innovation, and it will not deliver fair and effective copyright enforcement. On top of all that, the designation process itself is fundamentally flawed. The SMART Act gives the rulemaking authority to the Copyright Office, which lacks both the technical expertise and the stakeholder confidence the task demands, and it calls for obligatory triennial rulemaking that will be hugely disruptive.

    When calling on a federal agency to review and adopt technical measures, it is critical that the agency have the internal technical expertise and experience to assess whether the measures endorsed by third-party stakeholders, particularly those with a monetary interest in the outcome, are valid and will perform as intended. That expertise does not reside at the Copyright Office, which has traditionally played a more legal and administrative role. Under the SMART Act, the Copyright Office would be designating technologies that must be broadly adopted and that could have disastrous effects on the internet ecosystem. This is an awesome responsibility that requires an intimate understanding of the technologies up for designation, the technical functioning of the internet, the structures and designs of the service providers the mandates would apply to, and the economic impacts of accommodation and implementation. Ultimately, this process demands far more expertise in technology and economics than in copyright, which makes the Copyright Office the wrong choice to oversee the imagined designation process.

    Further, the Copyright Office has, in the past, demonstrated a willingness to make decisions favoring overly expansive or draconian interpretations of copyright law and to side with the corporate content industry. The Copyright Office has also had difficulty getting input and participation from technology companies, even when those companies had devices under review in Copyright Office rulemakings. As recently as 2020, the Copyright Office issued reports that discount the perspectives of users, retaining a narrow focus on the intellectual property aspects of its decisions. While the Copyright Office has made strides toward rectifying these entrenched biases, this troubling history hinders stakeholder engagement and creates doubt about, and resistance to, proceedings led by the Copyright Office.

    Additionally, the SMART Act calls for a triennial rulemaking proceeding. This is a bad model for these kinds of rules for three key reasons. 

    First, the technology does not need to change that often. We’d be going from an internet that developed without the need to adopt any STMs for decades to requiring an active decision-making process for changing or adopting new DTMs every three years; this would be a massive and unnecessary policy shift. 

    Second, such periodic rulemaking would make compliance hugely burdensome, if not impossible, because the required technical measures could change so rapidly. Periodic reviews would create uncertainty about the durability of the mandates even as the law punishes noncompliance harshly.

    Third, the unrelenting periodicity of the proceedings would result in cyclical battles that favor industry over the public. The routine nature of these proceedings will lead to less participation from the public and make it harder for independent creators, non-profit groups, and other stakeholder organizations to mobilize and be meaningfully heard. Corporate interests with deep pockets, by contrast, will be able to wage a continual war of attrition, pushing for ever more restrictive technologies that secure licensing fees or otherwise favor large corporate content companies over smaller, independent creators.

    Overall, the fundamental idea of designating technical measures to enforce copyright law is severely flawed. The SMART Act's procedural choices then make a dangerous idea worse, placing the authority with the Copyright Office and requiring an endless gauntlet of triennial rulemaking proceedings. Not so SMART.