drahardja, to ai
@drahardja@sfba.social avatar

Are there any studies that actually show the efficacy of Nightshade in preventing models from getting trained on your artwork? The authors of Nightshade of course claim that it works, but have there been independent studies to verify this?

I’ve only found some Reddit posts that talk about it. Glaze and Nightshade don’t seem to be effective at fooling CLIP (text-description extraction from an image), only at fooling models into mislearning the style (Glaze) or the label-to-subject correlation (Nightshade) of the image during training. While these sound pretty good, they don’t seem to be silver bullets. How many of us would have to consistently misdirect training before models actually get fooled? How effective are these techniques in practice?

beach, to random
@beach@illo.social avatar

I understand the importance of alt text for images. But as I was drafting my image description just now a question occurred to me – in adding alt text to images am I also inadvertently making it easier for the AI bots to scrape my artwork?

NatureMC,
@NatureMC@mastodon.online avatar

@beach At least in the Fediverse there shouldn't be any scraping for AI (?) Does anybody know more? @FediTips @feditips?

Scraping bots don't need ALT-texts for scraping (they have image recognition).

At the moment, the only ways to protect your work are Glaze (https://glaze.cs.uchicago.edu/index.html) and Nightshade (https://nightshade.cs.uchicago.edu/), and banning AI bots on your website via robots.txt.
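A minimal robots.txt along these lines might look like the following. The user-agent tokens are the ones OpenAI, Common Crawl, and Google publish for their AI crawlers; compliance is entirely voluntary, so this only deters well-behaved bots:

```
# Disallow known AI-training crawlers site-wide.
# These tokens are published by their operators; bots can ignore them.
User-agent: GPTBot
User-agent: CCBot
User-agent: Google-Extended
Disallow: /
```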

janellecshane, to random
@janellecshane@wandering.shop avatar

If you try to scrape a protected seafloor, the spiky art will mess up your tools.

https://www.wired.com/story/underwater-sculptures-stopping-trawling/

kerfuffle, to Kotlin
@kerfuffle@mastodon.online avatar

For a hackathon this weekend, we built a small application in #Kotlin, #Javalin and #Lit that uses #Tensorflow to detect if you're about to upload something you might want to reconsider, and then allows stripping Exif metadata for privacy.

We also looked at distorting the image to make it unusable for training an #AI. In one day we could only garble the image beyond human recognition; a better option would be integrating #Glaze to distort it for the AI but not for the human eye.
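The Exif-stripping step can be sketched without an imaging library, since Exif metadata lives in a JPEG's APP1 segment. This is only an illustrative sketch, not the hackathon project's actual code (and their stack was Kotlin; Python is used here for brevity): it walks the JPEG marker segments and drops any APP1 segment carrying an Exif header.

```python
import struct

def strip_exif(jpeg: bytes) -> bytes:
    """Return the JPEG byte stream with any Exif APP1 segments removed."""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG"
    out = bytearray(b"\xff\xd8")  # keep the SOI marker
    i = 2
    while i < len(jpeg) - 1:
        if jpeg[i] != 0xFF:
            out += jpeg[i:]        # unexpected bytes: copy through verbatim
            break
        marker = jpeg[i + 1]
        if marker == 0xDA:         # SOS: entropy-coded data follows, copy the rest
            out += jpeg[i:]
            break
        (seglen,) = struct.unpack(">H", jpeg[i + 2:i + 4])
        segment = jpeg[i:i + 2 + seglen]
        # Drop APP1 (0xE1) segments whose payload starts with the Exif header
        if not (marker == 0xE1 and segment[4:10] == b"Exif\x00\x00"):
            out += segment
        i += 2 + seglen
    return bytes(out)
```

Real files can also carry XMP (also in APP1) and vendor metadata in other APPn segments, so a production tool, or an imaging library, should handle those cases too.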

naynay, to fanfiction

was just about to try glazing and nightshading a #fandom x nature pop #art series of mine to see how it goes but https://glaze.cs.uchicago.edu/ is down D:

anyone know what's up? it says it's forbidden-?

#glaze #nightshade #AI

emmetoneill, to ai
@emmetoneill@mas.to avatar

Does anyone know of an existing open source project working on AI model poisoning or style cloaking, in the vein of Glaze and Nightshade?

I'm interested in this tech but they both seem to be proprietary, and I'd like to see if there is any work being done on the open source side of things.

jilleduffy, to random
@jilleduffy@mastodon.social avatar

Pottery drop!

My "seaweed" series (the glaze is Slate Blue over Spearmint on Miller 50 clay)
#pottery #glaze


lps, to random
@lps@masto.1146.nohost.me avatar

@aenderlara Looks great! Also, I'd never heard of Cara... great! Is that an alternative to ArtStation? I see they have #Glaze to help artists protect their work from AI :)

kate, to random
@kate@federatedfandom.net avatar

okay, glaze and nightshade for graphic art, but are there any tools going for audio?

shalien, to random French
@shalien@projetretro.io avatar

I wish tools like #glaze and #nightshade existed to protect open source code. Guess I will have to close my GitHub account and #selfhost #gitlab

Woolier, to art

Artists! Are you tired of generative AI models training on your art despite your being clear that your stuff shouldn't be trained on?

Poison your art.

https://nightshade.cs.uchicago.edu/index.html

https://glaze.cs.uchicago.edu/

I've been following the development of a neat little tool called Nightshade, which supposedly alters an image in imperceptible ways that affect how generative models learn from your art, effectively "poisoning" their training data. There is a second tool called Glaze by the same developers.

In short, Nightshade distorts what the AI learns about a given picture, and Glaze protects against style mimicry, an awful practice I've already come across often, where some insensitive individual trains an AI on a specific artist's work.

Using both of these tools we can push back against companies and individuals that scrape our art without our consent.


kegill, to ai
@kegill@mastodon.social avatar

ALL of us should use Glaze on artwork we post online. It’s a defense against “style mimicry attacks.”

Nightshade is offensive. It “turns any image into a data sample that is unsuitable for model training. [It] transforms images into ‘poison’ samples, so that models training on them without consent will see their models learn unpredictable behaviors …”

https://nightshade.cs.uchicago.edu/whatis.html

abucci, to midjourney
@abucci@buc.ci avatar

Nightshade 1.0 is out: https://nightshade.cs.uchicago.edu/index.html

From their "What is Nightshade?" page:

Since their arrival, generative AI models and their trainers have demonstrated their ability to download any online content for model training. For content owners and creators, few tools can prevent their content from being fed into a generative AI model against their will. Opt-out lists have been disregarded by model trainers in the past, and can be easily ignored with zero consequences. They are unverifiable and unenforceable, and those who violate opt-out lists and do-not-scrape directives can not be identified with high confidence.

In an effort to address this power asymmetry, we have designed and implemented Nightshade, a tool that turns any image into a data sample that is unsuitable for model training. More precisely, Nightshade transforms images into "poison" samples, so that models training on them without consent will see their models learn unpredictable behaviors that deviate from expected norms, e.g. a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space.
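The "cow in space → handbag" example can be given some intuition with a toy model. To be clear, this is only an analogy I'm adding, not Nightshade's actual technique (which perturbs images imperceptibly in a model's feature space): treat a model's concept of a label as the average of the feature vectors it was trained on under that label, and watch mislabeled poison samples drag that concept away.

```python
# Toy data-poisoning analogy (NOT Nightshade's real method).
# The "model" here is just the mean 2-D feature vector per label;
# handbag-like samples mislabeled "cow" drag the concept of "cow"
# toward handbag territory.

def centroid(points):
    """Average of 2-D feature vectors: a stand-in for what the model associates with a label."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

cow_samples = [(0, 0), (1, 0), (0, 1), (1, 1)]   # cow-like features
poison = [(8, 8), (9, 8), (8, 9), (9, 9)]        # handbag-like features, labeled "cow"

print(centroid(cow_samples))            # (0.5, 0.5): "cow" still means cow
print(centroid(cow_samples + poison))   # (4.5, 4.5): "cow" drifts toward handbag
```

The real attack is far subtler: the poisoned images still look like cows to a human, but their features resemble handbags to the model.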

-E

Norobiik, to llm
@Norobiik@noc.social avatar

#Nightshade is an offensive #DataPoisoning tool, a companion to a defensive style protection tool called #Glaze, which The Register covered in February last year.

Nightshade poisons #ImageFiles to give indigestion to models that ingest data without permission. It's intended to make those training image-oriented models respect content creators' wishes about the use of their work. #LLM #AI

How artists can poison their pics with deadly Nightshade to deter #AIScrapers
https://www.theregister.com/2024/01/20/nightshade_ai_images/

dansup, to random
@dansup@mastodon.social avatar

Nightshade results:

Before + After + Settings

"To achieve maximal effect, please try to include the poison tags below as part of the ALT text field when you post corresponding images online.

morteza-khalili-6vbX0f32tv0-unsplash-nightshade-intensity-LOW-V1.jpg: island"


aral, to ai
@aral@mastodon.ar.al avatar

Hmm, here’s an idea: what if tools like Glaze and Nightshade were integrated into fediverse servers and/or clients and automatically applied whenever any images are posted… 🤔

Thoughts @dansup?

https://nightshade.cs.uchicago.edu

#ai #selfDefense #glaze #nightshade #fediverse #pixelfed #mastodon

frankel, to ai
@frankel@mastodon.top avatar

is a system designed to protect human artists by disrupting style mimicry

https://glaze.cs.uchicago.edu/what-is-glaze.html

Crazypedia, to random
@Crazypedia@pagan.plus avatar

An anti-AI image-scraping tool that serves up random-noise BMPs with Nightshade applied. When an AI web bot asks for images, the server just gives it junk instead of the images it thinks it's getting.

Poison the Algorithm

lanIka, to fediverse Portuguese

Throwing this out on the #fediverse: with Meta scraping #artworks from billions of accounts between Facebook and Instagram to train their AI, I'm seeing a strong feeling of helplessness, of "where do I go from here and how do I protect myself?"

There's a growing move toward image-sharing platforms with #Glaze and #Nightshade options.

How is the @pixelfed crowd dealing with this? Is #pixelfed safe from scraping and crawling while showcasing artists?

itnewsbot, to machinelearning

University of Chicago researchers seek to “poison” AI art generators with Nightshade

On Friday, a team of researcher... - https://arstechnica.com/?p=1978501 #largelanguagemodels #universityofchicago #adversarialattacks #foundationmodels #machinelearning #aitrainingdata #imagesynthesis #datapoisoning #nightshade #aiethics #benzhao #biz #google #metaai #openai #aiart #glaze #meta #ai

manisha, to ai
@manisha@neuromatch.social avatar

Heck yeah!! This new data poisoning tool lets artists fight back against generative AI

https://www.technologyreview.com/2023/10/23/1082189/data-poisoning-artists-fight-generative-ai/

#Nightshade #Glaze #AI #GenerativeAI #Punk

jmcrookston, to random
@jmcrookston@mastodon.social avatar

https://www.nature.com/articles/s41746-023-00939-z

Wait you're telling me if we train LLMs on human knowledge it ends up racist?

Why, that would mean we humans are racist!

I'm somehow not shocked.

RunRichRun,
@RunRichRun@mastodon.social avatar

@jmcrookston The tech described here won't address that flaw (of either people or LLMs being trained on human knowledge/dialogue), and I'd prefer a phrase carrying fewer negative connotations than "data poisoning." If LLMs and generative AI are inevitable (and they are, in medicine and elsewhere), it behooves owners/operators to make sure they "behave" ethically:

"...data poisoning tool lets artists fight back against generative AI" https://www.technologyreview.com/2023/10/23/1082189/data-poisoning-artists-fight-generative-ai/

Deus, (edited) to bangalore
@Deus@charcha.cc avatar

Friend in #Bangalore has been taking these #pottery classes, thought I’d share. So satisfying to watch. She’s happy with :instagram: , so I don’t bother pulling her to join the Fediverse. 🎧 On.

#Ceramics #Glaze #Clay #Art #ArtistsofMastodon #MastoArt #MastodonArt #India
