Nightshade works by subtly altering images so that models trained on them learn to miscategorize what they depict, rendering the scraped data useless. Though it only affects newly trained models and carries a risk of escalation, Nightshade represents creators' effort to impose costs on AI firms that scrape work without consent and to push them toward proper licensing, as the rapid growth of generative models threatens artists' ability to control their work.
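To make the idea concrete, here is a toy sketch of image poisoning in the same spirit, not Nightshade's actual algorithm: nudge an image's pixels toward a "target" image of a different concept, capping the change so a human barely notices, so a model trained on the still-original label picks up misleading features. The `poison` function, the pixel lists, and the `eps` bound are all hypothetical for illustration.

```python
def poison(pixels, target, eps=0.03):
    """Nudge each pixel of `pixels` toward `target`, capping the change at ±eps.

    Pixels are floats in [0, 1]. The small cap keeps the edit visually
    imperceptible while still shifting the image's learned features.
    """
    out = []
    for p, t in zip(pixels, target):
        delta = max(-eps, min(eps, t - p))  # clip the per-pixel shift
        out.append(min(1.0, max(0.0, p + delta)))
    return out

# Hypothetical example: a "dog" image nudged toward "cat" pixel values,
# while its caption still says "dog".
dog = [0.10, 0.90, 0.50]
cat = [0.80, 0.20, 0.50]
poisoned = poison(dog, cat)  # each pixel moves at most 0.03 toward `cat`
```

A real attack operates in a model's feature space rather than raw pixel space, but the core trade-off is the same: maximize the shift in what the model learns while minimizing the visible change.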
Interesting way to stop “AI theft”. Hoping government bodies step in here so that we don’t have to resort to tricking models.