You will remember that European writers’ groups used the AI Safety Summit as a chance to call for stronger legal protection for creative artists against AI. This week it is the turn of UK groups, including the Society of Authors, the Publishers’ Association, and the Authors’ Licensing and Collecting Society.
The groups have written to the UK government. Their open letter distinguishes, as we are seeing more and more, between AI tools that are seen as a benefit to the creative process and generative AI platforms that are taken to be less benign. ALLi News Editor Dan Holloway reports.
A Call for AI Protection and Copyright
In particular, the letter focuses on copyright. It demands retrospective compensation for the use of copyrighted works without consent, citing the Books3 database used to train large platforms, and an end to such unconsented use in future. The groups also seek full credit for the creators of any work, along with fair mechanisms for permission and payment whenever people’s work is used.
There have been occasional stories over the past few months of creatives taking matters into their own hands to counteract the perceived harms from AI. You might remember that Omegaverse writers took to dropping key words and phrases into their work in an attempt to “out” any AI trained on it.
A Data Poisoning Tool
But now a tool has been developed to outfox anyone looking to train an AI on your art. Nightshade, developed by a team at the University of Chicago, uses a more sophisticated form of obfuscation than simply filling your work with code words or nonsense. It subtly manipulates the pixels in a work of digital art, so that the human eye still sees what the artist intended.
But any AI that tries to interpret the work will “see” something entirely different and utterly misleading, and so cannot generate meaningful works from it. The idea is that the whole stream of training data will be muddied by the presence of such poisoned images, because it will be impossible to untangle the clean images from the poisoned ones.
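For the technically curious, here is a toy sketch of the underlying principle only: pixel values can be nudged by amounts too faint for a viewer to notice, yet fully present in the numbers a model would train on. This is not Nightshade’s actual method, which carefully optimises the changes so that a model learns the wrong concept, and the function and file names here are purely illustrative.

```python
# Toy illustration only: nudge every pixel by a tiny amount so the image
# looks unchanged to a human viewer while the underlying numbers differ.
# Nightshade's real technique optimises these changes to mislead a model;
# this sketch only demonstrates the "invisible to people, present in the
# data" principle.
import numpy as np
from PIL import Image

def add_hidden_perturbation(path_in: str, path_out: str, strength: int = 2) -> None:
    """Shift each RGB value by at most `strength` (out of 255) in a fixed pseudo-random pattern."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    rng = np.random.default_rng(seed=42)  # fixed seed: the same pattern every run
    noise = rng.integers(-strength, strength + 1, size=img.shape, dtype=np.int16)
    perturbed = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(perturbed).save(path_out)

# Hypothetical usage:
# add_hidden_perturbation("artwork.png", "artwork_protected.png")
```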