
Nightshade: poisoning AI image models

Seamus Byrne

Meet Nightshade: a tool for collective, offensive artist action against image scraping for AI. Nightshade adds an interference pattern, invisible to the human eye, to images; as these poisoned images accumulate in scraped datasets, they increase the cost of training on unlicensed data.
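The article doesn't describe Nightshade's algorithm in detail, but as a rough illustration of what "an imperceptible per-image perturbation" means in practice, here is a minimal Python sketch that adds low-amplitude noise to an image before it is published. The function name, file paths, and random noise model are placeholders for this example only; Nightshade's actual perturbations are optimised against a generative model's feature space, not random.

```python
# Illustrative sketch only: this is NOT Nightshade's method, just a toy
# example of adding a change too small for the human eye to notice.
import numpy as np
from PIL import Image


def add_imperceptible_noise(src_path: str, dst_path: str, strength: int = 2) -> None:
    # Load as RGB and widen to int16 so the additive noise can't overflow uint8.
    img = np.asarray(Image.open(src_path).convert("RGB"), dtype=np.int16)

    # Small random per-pixel offsets in [-strength, +strength], well below
    # what is visible at 8-bit colour depth.
    noise = np.random.randint(-strength, strength + 1, size=img.shape, dtype=np.int16)

    shaded = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(shaded).save(dst_path)


if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    add_imperceptible_noise("artwork.png", "artwork_shaded.png")
```

The point of the sketch is the shape of the workflow: the artist keeps the visible image unchanged for viewers while every published copy carries a hidden perturbation that degrades its value as training data.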

AI, Art & Design

Seamus Byrne

Founder and Head of Content at Byteside. Brings two decades of experience covering tech, digital culture, and their impacts on society.

