Hypothetically speaking, I can see no obvious reason copyright law could not be extended to exclude from "fair use" the ingestion of images into a machine learning engine without the consent of the image's copyright owner.

This would need to be done carefully: if the goal is to throw a wrench into Stable Diffusion and its siblings, that wrench can easily ricochet into, e.g., banning Google from creating the ContentID fingerprints that protect artists from copyright abuse. Some specific dimensions to pin down:

- what is a machine learning engine?

- what does it mean to train one on an artist's work?

- what does 'consent' look like? How explicit must it be?

Implementation will be messy, but implementation of copyright is always messy; there is no other kind. "...other nations have thought that these monopolies produce more embarrassment than advantage to society" (Thomas Jefferson). Still, it may be a good idea if we have no alternative that protects the livelihoods of artists in a world where everyone is now a mediocre visual artist.

... And, of course, US copyright law has no bearing on China's law. Nor Russia's. Nor those of dozens of other countries. So we would have to be prepared for our entertainment industries, paying top dollar for human labor, to compete on the international stage with a sea of mediocre artists who cost electrons and little else.

@mtomczak keep in mind that the doctrine of fair use is not a right; it is an affirmative defense. It applies to infringements, and whether an infringement is excused under the doctrine is determined only in court. So people declaring they are making fair use are essentially admitting infringement; they seem to forget the "throwing themselves on the mercy of the court" part. We have already seen a prominent example of this in Google v. Oracle.

@orcmid Very true. Regarding infringement, the relevant question, I think, is whether creation of the recognizer itself is an infringement. Once the recognizer has been created, the act of using it to synthesize art smells a lot like "creating novel work in the style of the artist," and (unlike Google v. Oracle, where there was no precedent on the copyrightability of APIs) the precedent here is clear: a style cannot be copyrighted.

If creation of the recognizer is infringement, that opens a whole can of worms: ContentID (which protects artists) is also a recognizer, and I don't think artists in general want it to become illegal overnight to create a ContentID fingerprinting algorithm.
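To make the "recognizer" point concrete, here is a toy sketch of the kind of perceptual fingerprint such systems are built on. This is an illustrative average-hash, not ContentID's actual algorithm (which Google has not published): it reduces an image to a few bits that survive small edits, so near-duplicates can be matched without storing the image itself.

```python
# Illustrative only: a toy "average hash" perceptual fingerprint.
# NOT ContentID's real algorithm -- just the general idea of a recognizer.

def average_hash(pixels, size=8):
    """Downscale a grayscale image (list of rows of 0-255 values) to
    size x size by block averaging, then emit one bit per cell:
    1 if the cell is brighter than the overall mean, else 0."""
    h, w = len(pixels), len(pixels[0])
    cells = []
    for i in range(size):
        for j in range(size):
            # Average the block of source pixels mapped to cell (i, j).
            r0, r1 = i * h // size, max((i + 1) * h // size, i * h // size + 1)
            c0, c1 = j * w // size, max((j + 1) * w // size, j * w // size + 1)
            block = [pixels[r][c] for r in range(r0, r1) for c in range(c0, c1)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return ''.join('1' if v > mean else '0' for v in cells)

def hamming(a, b):
    """Number of differing bits; a small distance suggests the same image."""
    return sum(x != y for x, y in zip(a, b))

# A synthetic 16x16 gradient "image" and a slightly brightened copy:
img = [[(r * 16 + c) % 256 for c in range(16)] for r in range(16)]
tweaked = [[min(255, p + 5) for p in row] for row in img]

print(hamming(average_hash(img), average_hash(tweaked)))  # -> 0
```

Note that nothing here "contains" the original image; the fingerprint is 64 bits. Whether building such a fingerprint database over copyrighted works is itself infringement is exactly the can of worms above.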
