
On AI, a Rush to Legislate Is a Bad Idea

On the surface, a “responsive” government seems like a good idea.
 
But the Founders recognized that the tendency of elected politicians to chase the approval of voters could lead to bad policy and the trampling of rights, which is why they designed a system of checks and balances to slow down the responsiveness of government.
 
My response has often been: “We don’t want a responsive government; we want a limited government.”
 
A “rush to legislate” is almost always a bad idea. It’s better to let new innovations and technologies play out, and then only regulate or legislate if discrete problems have become obvious. Otherwise, we run the risk of letting precaution get in the way of innovation.
 
We see this rush to legislate in many areas of public policy, but today perhaps the best example is the concerns and fears about artificial intelligence (AI). At both the state and federal levels, legislators are rushing to “get out in front” of the issue and show voters they are being responsive.
 
Of course, protecting ordinary Americans and public figures against abuses facilitated by generative AI is a worthy goal. But many of the so-called “digital replica” bills currently moving in state legislatures go too far, creating unintended legal liability for innocent bystanders. They could also violate First Amendment free speech protections.
 
Furthermore, existing law is already adequate to address the majority of these concerns. States and the federal government have laws against deceptive trade practices and fraud, as well as laws governing name, image and likeness (NIL). Misappropriating someone’s likeness is already punishable under existing law.
 
But that hasn’t stopped states from moving legislation that could, in some cases, make it illegal for you to use color correction on a photo of a performer that you post on social media. Tennessee has already passed its ELVIS Act, though thankfully the bill was significantly amended before passage to address these concerns and to narrowly target performers’ rights. But many state efforts are far broader, as is the proposed federal No AI Fraud Act.
 
Imagine the minefield these digital replica bills would have created for the movie “Forrest Gump,” for instance, in which actor Tom Hanks was digitally edited into a host of historical footage. Without First Amendment and fair-use protections, much of the film might have been subject to legal liability under such legislation. The same could be said of the current streaming hit “For All Mankind,” an alternate history of the Space Age in which, for instance, John Lennon is not murdered, lives into the Reagan administration, and is portrayed criticizing President Reagan.
 
Again, it’s eminently reasonable to protect performers, as well as ordinary Americans, against abusive uses of AI. But in their rush to legislate on AI, lawmakers should be careful not to rule out tools that can be used for innovation and creativity, especially when existing law covers most concerns.