Fair Use Week 2025: Day Two With Guest Expert Brandon Butler

I am delighted to host Day Two of the 12th Anniversary of Fair Use Week with a write-up from Brandon Butler, copyright expert and Executive Director at Re:Create. While fair use is a crucial safeguard for free expression, Brandon argues, its effectiveness depends on access to platforms and technologies, which are protected by copyright safe harbors like Section 512 of the DMCA. As new tools like generative AI introduce stricter content filters or “guardrails,” there is a growing risk that fair use will be unintentionally restricted, making it essential to maintain strong safe harbors to support creative expression. – Kyle K. Courtney

Safe Harbors, Guardrails, and 21st Century Fair Use

by Brandon Butler

A. J. Liebling, “The Wayward Press: Do You Belong in Journalism?,” The New Yorker, May 14, 1960, p. 105

In a 1960 New Yorker story on the problem of monopoly in the newspaper business, A.J. Liebling wrote that “Freedom of the press is guaranteed only to those who own one.” You could say the same thing about fair use, which the Supreme Court has described as one of the First Amendment safeguards built into the copyright system. Fair use may give everyone the freedom to make parodies, critiques, and other transformative uses without worrying about copyright, but fair use doesn’t mean much if you have no way to make your work or share it with the world.

Liebling’s aphorism is true as far as it goes, but it doesn’t go nearly as far in 2025 as it did in 1960. Back then, its implication was dark and cynical: freedom of the press was a privilege for newspaper owners and an empty abstraction for the rest of us. Not so much in 2025.

Today there are a dozen online platforms I can use for free to reach more people with my laptop in a few minutes than a 1960 newspaper magnate could reach in a week with a dozen presses. My phone’s camera shoots better video than the best camera at the BBC in 1960. I can record a song with Garageband on my iPad that has more simultaneous tracks than the Beach Boys’ Pet Sounds and the Beatles’ White Album put together. It won’t be any good, but that’s not the point! The point is that the freedom of the press that Liebling mocked as an empty abstraction in 1960 has probably never been more real than it is today. We all have easy access to the tools to make and share songs, videos, essays, or whatever else strikes our fancy.

The technologies and platforms that make this possible would not exist without the copyright safe harbors that shield platforms from liability for the acts of their users. Chief among these is the well-known Section 512 of the DMCA, which applies to online service providers like YouTube and SoundCloud as well as broadband providers like Verizon and Comcast. If technology makers and platform providers were held liable for every infringement by one of their users, none of them could afford to operate. The notoriously high statutory damages in copyright law, combined with the massive scale of copying the internet makes possible, would create potentially ruinous liability exposure. (The Section 230 safe harbor for platforms, which protects them against non-IP-related liability for user speech, has the same purpose and effect; shockingly, at a time when we’ve never needed more leeway for free and independent speech, some in Congress are once again threatening to neuter Section 230.)

The kinds of creativity that involve fair use—critique, parody, commentary—are especially in need of safe harbor protection. Without safe harbors, technology platforms would be under strong pressure to filter and block any use that includes material recognized as in-copyright. And fair uses are, by definition, uses that include in-copyright material: a clip from a film to illustrate a critique, the backing music from a song to serve as the basis for a parody, and so on. Fair uses are the dolphins inevitably caught up in the shoddy tuna nets of online copyright enforcement. Without safe harbors, we get clumsy filters, and filters are especially bad for fair use.

I was reminded of this issue recently at the midwinter meeting of the Copyright Society, during a discussion about who should be held liable when someone uses generative AI tools to make an infringing image, song, or literary work. The most commonly used examples are characters like Nintendo’s Mario the plumber, or Snoopy, or Spider-Man, whose images are so ubiquitous that even the earliest AI image generators could readily produce them when prompted, including with vague terms like “Italian plumber.” While panelists Joe Gratz and Josh Simmons (star litigators for the technology and content industries, respectively) debated the technicalities, there seemed to be agreement that, in any event, responsible companies have increasingly effective filters that prevent their generative tools from outputting works that contain protected elements of copyrighted material, like characters, lyrics, or melodies. These filters are often called “guardrails,” designed to keep users from wandering off the straight and narrow path as they use generative AI. I worry, though, about how these guardrails might affect fair use, which is well within the straight and narrow of copyright law but much harder for a machine to detect.

Every year my law firm represents dozens of filmmakers who rely on fair use to tell their stories. From the serious and scholarly to the deeply personal, from punk rock to high art, these films wouldn’t be complete without fair use. These creators rely on sophisticated digital technologies to assemble their films, and many of them distribute trailers and even full films on Vimeo and other platforms. These tools and platforms could of course be used for infringing purposes just as easily as for fair ones. (That’s why people hire us: to help them stay on the right side of the line!) Thankfully, the safe harbors mean the tools and platforms our clients rely on need not worry unduly about being held liable for what a few bad-faith users might do, so they have not built in filters or “guardrails” that would threaten fair use.

As new technologies emerge to help creators make and share new works, robust safe harbors will continue to be important to ensure that the companies building the technology that makes creativity possible aren’t pressured to create guardrails so strict that they short-circuit fair use.

Brandon Butler is the executive director of the Re:Create Coalition and a partner at the fair use law firm Jaszi Butler PLLC. Brandon has worked on copyright and fair use issues his entire career, with a focus on their impact on research and creative reuses of archival material.