Business news

London AI company claims Getty copyright lawsuit is “existential threat” to the generative AI industry

A landmark legal battle has begun at the High Court in London that could redefine the boundary between copyright law and artificial intelligence innovation.

Stability AI, the London company behind the popular image generator Stable Diffusion, has warned that Getty Images’ copyright and trademark lawsuit poses an “existential threat” to the future of the generative AI industry.

Getty Images, one of the world’s largest and most influential photography agencies, claims Stability illegally used its extensive database of copyrighted photos to train its AI models – a claim the tech firm strongly denies. At the heart of the case are alleged Stability outputs that still bear the Getty watermark, which led the agency to complain that its trademarks were being attached to pornographic imagery and to dismiss the results as “AI rubbish”.

Getty’s lawyers argue that this is not a fight between art and innovation, but one about fair payment and responsible technological development. “The problem is when AI companies such as Stability want to use those works,” said Lindsay Lane KC, representing Getty. “This is a group of tech geeks who are so excited about AI that they are indifferent to any of the dangers or problems it presents.”

Stability AI, whose board includes film director James Cameron, claims Getty’s legal arguments are “fanciful” and says the agency is spending over £10m to derail what Stability describes as an existential threat to its business model. The company has also strongly rejected Getty’s separate allegation that its model was trained on a database containing child sexual abuse material (CSAM), calling the claim “offensive” and insisting it has strong safeguards in place to prevent abuse.

The case opens amid rising tensions between generative AI companies and the creative industries, as photographers, musicians and writers warn that AI tools are increasingly trained on their work without consent or compensation. A recent campaign backed by stars including Elton John and Dua Lipa has called for stricter copyright protection and regulatory reform.

In the UK, the debate has reached Westminster, where a proposed government policy would require copyright owners to opt out of having their material used to train AI models – a move widely criticized by creators and rights holders who believe that opting in should be the default.

“Of course, Getty Images recognizes that the AI industry overall may be a force for good,” Lane said. “But that does not justify those developing AI models riding roughshod over intellectual property rights.”

The trial is expected to last for weeks and will involve testimony from leading academics and AI experts, including specialists from the University of California, Berkeley, and the University of Freiburg in Germany. More than 78,000 pages of evidence have been submitted, including examples of Getty-owned images allegedly used in the training process, such as portraits of Donald Glover, Jurgen Klopp and Christopher Nolan.

With the outcome set to establish precedents not only in the UK but globally, technology companies, artists and lawmakers are watching the case closely. The stakes are high: a ruling in Getty’s favor could impose new restrictions on how AI companies obtain training data, while a victory for Stability could allow the industry to continue training on publicly accessible materials.

Either way, the result will help shape the future of content creation – and the fine line between inspiration, imitation and intellectual property in the age of AI.


Jamie Young

Jamie is a senior business journalist with more than a decade of experience reporting on UK SMEs. Jamie holds a degree in business administration and regularly attends industry conferences and workshops. When not covering the latest business developments, Jamie is passionate about mentoring emerging journalists and entrepreneurs, inspiring the next generation of business leaders.
