
Tech News

Artists can use a data poisoning tool to confuse DALL-E and corrupt AI scraping

Image: AI-generated "banana bed" art (OpenAI)

The fight over data used to train AI models has taken a poisonous turn.

A new tool called Nightshade lets artists apply it to their creative work so that, if the art is scraped for training, it corrupts — or poisons — the training data. Over time, it can damage future versions of AI image generators like DALL-E, Stable Diffusion, and Midjourney, degrading their ability to create images.

Nightshade makes invisible changes to the pixels of a piece of digital art. When the work is ingested by a model for training, the "poison" exploits a security vulnerability that confuses the model, so that it might, for example, no longer read an image of a car as a car and instead produce a cow.
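The core idea of a bounded, human-invisible pixel perturbation can be illustrated with a toy sketch. Note this is not Nightshade's actual algorithm (which optimizes perturbations against a model's feature space); the function name, the random-sign noise, and the epsilon bound below are illustrative assumptions only.

```python
import numpy as np

def poison_image(image: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Toy stand-in for a data-poisoning perturbation.

    Adds a pixel-level perturbation bounded by +/- epsilon (on a
    0-255 scale), small enough to be invisible to a human viewer.
    A real poisoning tool would compute the perturbation against a
    target model; here a random sign pattern merely illustrates the
    "small, bounded change" idea.
    """
    rng = np.random.default_rng(seed)
    # Random +1/-1 pattern per pixel, scaled by epsilon.
    perturbation = epsilon * rng.choice([-1.0, 1.0], size=image.shape)
    # Clip so the result stays a valid 8-bit image.
    poisoned = np.clip(image.astype(np.float64) + perturbation, 0, 255)
    return poisoned.astype(np.uint8)

# Example: perturb a dummy 4x4 grayscale "image".
original = np.full((4, 4), 128, dtype=np.uint8)
poisoned = poison_image(original)
max_change = int(np.abs(poisoned.astype(int) - original.astype(int)).max())
print(max_change)  # no pixel moves by more than epsilon
```

The point of the sketch is only that each pixel changes by at most a couple of intensity levels, far below what the eye can notice, while a model training on many such images can still be steered.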

The MIT Technology Review reported that Ben Zhao, a professor at the University of Chicago and one of the…

