According to Cointelegraph, researchers from the University of Chicago have developed a tool that allows artists to “poison” their digital art. The tool is intended to stop developers from training artificial intelligence (AI) systems on artists' work without permission.
Sources revealed that the tool subtly modifies images so that, when they are included in a training set, they contaminate the data used to train AI models with incorrect information. The tool is reportedly named “Nightshade,” after the family of plants known for their poisonous berries.
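The article gives no technical detail on how the images are altered, so as a purely illustrative sketch, the general idea of an imperceptible pixel-level perturbation can be shown with random bounded noise. Note this is an assumption for demonstration only: Nightshade's actual method involves carefully optimized, targeted perturbations, not random noise, and the `poison_image` function below is hypothetical.

```python
import numpy as np

def poison_image(image: np.ndarray, epsilon: float = 8.0, seed: int = 0) -> np.ndarray:
    """Toy illustration: add a small, bounded perturbation to an image.

    Hypothetical sketch only. Nightshade's real perturbations are
    optimized to mislead model training, not random noise like this.
    """
    rng = np.random.default_rng(seed)
    # Perturbation bounded by +/- epsilon per pixel channel (8-bit scale),
    # small enough to be invisible to a human viewer.
    delta = rng.uniform(-epsilon, epsilon, size=image.shape)
    poisoned = np.clip(image.astype(np.float64) + delta, 0, 255)
    return poisoned.astype(np.uint8)

# Example: a synthetic 4x4 RGB "artwork" of uniform gray pixels.
art = np.full((4, 4, 3), 128, dtype=np.uint8)
poisoned = poison_image(art)
# Each pixel value changes by at most epsilon, so the picture looks
# unchanged, yet its raw data differs from the original.
max_change = np.abs(poisoned.astype(int) - art.astype(int)).max()
print(max_change <= 8)
```

The key property this illustrates is that the altered file remains visually indistinguishable to people while its underlying pixel data differs from the original, which is what lets such images slip into training sets unnoticed.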
Furthermore, “The researchers don’t yet know of robust defences against these attacks, the implication being that even robust models such as OpenAI’s ChatGPT could be at risk,” Vitaly Shmatikov, a professor at Cornell University, concluded.
(With insights from Cointelegraph)