Image: nitpicker / Shutterstock.com

Artificial intelligence is consuming art – and no one is stopping it

A photo is uploaded to the internet – and suddenly it ends up inside an AI. No consultation, no payment, no consent. That is exactly what happened to Robert Kneschke. The professional photographer from Germany discovered that one of his images had been included in a gigantic AI training dataset compiled by the Hamburg-based association LAION e.V. As a result, his image became part of the training material for AI models such as Stable Diffusion and Midjourney, which can now generate photorealistic images from keywords. Photografix Magazin reports on the case.

The scandal: The court says it was allowed

 

A ruling with explosive consequences for all creative professionals

In December 2025, the Hanseatic Higher Regional Court of Hamburg dismissed Kneschke's lawsuit. The reasoning: LAION is a non-commercial research institution, so the use of the data falls under exceptions in German copyright law. Sections 44b and 60d of the Copyright Act (UrhG) permit so-called "text and data mining" – the automated analysis of large amounts of data – even on copyright-protected works, provided the rights holder has not declared a machine-readable objection.

And this is precisely where the problem lies: the photo agency that distributed Kneschke's photo had prohibited automated analysis in its terms of use, but that prohibition was not expressed in a machine-readable format – i.e., one that programs can automatically interpret. For the court, that was not enough.

 

"Machine-readable" or powerless?

The fact that legislators insist on machine-readable opt-outs may be technically understandable, but for authors it is unreasonable. For many, the question of how to technically "lock out" a photo on a platform such as Unsplash or Flickr is simply unanswerable. The court also left open what a valid opt-out would actually have to look like. A note in the terms and conditions? Not enough. A robots.txt file? That only works if the crawler reads it – which LAION says its crawler does not. The ruling leaves authors in a gray area where they have rights on paper but cannot enforce them in practice.
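For readers wondering what a "machine-readable opt-out" even looks like: the most common mechanism is a robots.txt file at a website's root. A minimal sketch might look like the one below – the user-agent names are examples of real, publicly documented crawlers, not the ones at issue in this case, and as the ruling illustrates, such a file only has effect if a crawler voluntarily reads and honors it.

```text
# robots.txt – must sit at the site root, e.g. https://example.com/robots.txt
# Each block asks a specific crawler (identified by its User-agent string)
# not to fetch any pages on this site.

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```

Whether such a file satisfies the "machine-readable objection" of Section 44b UrhG is exactly the kind of question the court left open.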

Kneschke has therefore lodged an appeal. The case will now go before the Federal Court of Justice (BGH) in Karlsruhe. It is the first time Germany's highest civil court has had to rule on the rights of creatives in the age of AI.

 

Comment: When "non-profit" becomes a free pass

What is happening here is not an unfortunate isolated case, but a systemic failure that was bound to happen. As long as AI companies hide behind supposedly "non-profit" associations to supply billion-dollar companies with training data, the concept of non-profit status is a farce.

And the ruling? It is a gift to anyone who treats intellectual property as a free resource. Whoever is clever enough to route the data vacuum cleaner through a non-profit association apparently no longer has to worry much about copyright. That is not the technology of the future – it is a legal loophole at the expense of those who create content.

Creative professionals need to be able to protect their work – not with legal clauses that are technically impossible to implement, but with clear, enforceable rules. Anyone who wants to train AI should pay. Anyone who uses content should ask first – or, at the very least, not take it behind the creator's back.

Source: photografix-magazin.de
