matthewskelton,
@matthewskelton@mastodon.social avatar

LLMs and GPT-based AI systems should pay IP fees to all creators and authors whose work provides the AI's abilities.

What's that you say? They wouldn't be financially viable? Then these AI systems are basically stealing. 💰

#ChatGPT #AI #LLMs

ktaylor,

@matthewskelton This is sure to be addressed at some point, when content creators, including journalism outlets and academic publishing companies, start fighting against this massive misappropriation of their intellectual capital.

sheldrake,

@matthewskelton

In one way, yes.

In another way, no.

After all, you've used words here that you didn't coin (how dare you!) to communicate an idea that you would be hard-pressed to attribute other than to say with confidence that it wasn't a so-called original thought.

matthewskelton,
@matthewskelton@mastodon.social avatar

@sheldrake but I am human, and AI systems rely on industrial content slurping. Not equivalent at all.

sheldrake,

@matthewskelton by your logic, are you saying that human intelligence can "steal" but the machine-intelligence fruits of our intelligent labours cannot?

I guess what I'm trying to say is that your reference to "stealing" is a very modern idea. Just a few centuries old. For the hundred millennia before that, it may have been called riffing.

sheldrake,

@matthewskelton Just for the avoidance of any doubt, I'm not agreeing with you, but I'm not setting out to disagree either. Just trying to see if there are other ways to think this through.

matthewskelton,
@matthewskelton@mastodon.social avatar

@sheldrake I am saying that the "riffing" or "inspiration" by humans is different from the industrial slurping by AI systems.

The taking of things of value without any appreciable giving back is the stealing bit.

matthewskelton,
@matthewskelton@mastodon.social avatar

Regarding blogging: "Do I just ignore the fact that I’m helping train the generative pap that tech hopes will replace us all?" -- @baldur

https://www.baldurbjarnason.com/2023/tech-broke-the-webs-social-contract/

jchyip,
@jchyip@mastodon.online avatar

@matthewskelton Makes me wonder if the rise of LLMs might help restore the idea of public domain.

matthewskelton,
@matthewskelton@mastodon.social avatar

@jchyip as long as the corporations are paying a fair amount for the value, I think all kinds of financial models might work. It's the "value without compensation" which is the stealing bit.

thirstybear,
@thirstybear@agilodon.social avatar

@matthewskelton I am now idly wondering whether steganographic signing of code is possible…
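One minimal (and fragile) way to sketch that idea in Python is to treat trailing whitespace as a covert bit channel: a space encodes 0, a tab encodes 1, one bit per line, leaving the code's behaviour unchanged. Everything here, including the function names, is an illustrative assumption rather than an established scheme:

```python
# Hypothetical sketch of steganographic code signing: hide a signature,
# bit by bit, as trailing whitespace (space = 0, tab = 1), one bit per line.
# The code runs identically, but the mark survives plain-text copying
# (though not an editor that strips trailing whitespace).

def embed_signature(code: str, signature: str) -> str:
    """Hide one signature bit per line as trailing whitespace."""
    bits = "".join(f"{ord(c):08b}" for c in signature)
    lines = code.splitlines()
    if len(bits) > len(lines):
        raise ValueError("not enough lines to hold the signature")
    padded = list(bits) + [""] * (len(lines) - len(bits))
    marked = [
        line + (" " if bit == "0" else "\t") if bit else line
        for line, bit in zip(lines, padded)
    ]
    return "\n".join(marked)

def extract_signature(code: str) -> str:
    """Read the trailing-whitespace bits back out as characters."""
    bits = ""
    for line in code.splitlines():
        if line.endswith("\t"):
            bits += "1"
        elif line.endswith(" "):
            bits += "0"
    return "".join(
        chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits) - 7, 8)
    )
```

A real watermark would need to survive reformatting and minification, which this does not; more robust approaches hide bits in semantics-preserving code transformations instead.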

pigsaw,

@matthewskelton You're right. But of course they should ask first.

ids,
@ids@c.im avatar

@matthewskelton I think it’s quite likely that, due to the huge numbers involved, the contribution of value from any one artist (or the amount by which the LLM would be less valuable without it) would be so tiny as to be pointless.

Surely trying to do this would be a massive amount of admin and cost to almost no benefit?

Perhaps there are other ways the companies could recognise the contributions of all those creators - sponsoring young writers and artists or something similar?

matthewskelton,
@matthewskelton@mastodon.social avatar

@ids If the contribution is recognized - and paid for - then that's the key thing. IP fees are just one option.

It's the industrial scale extraction of value with no payment that is the problem. That's the "stealing" part.

Maybe GPTs/LLMs pay governments a citizen contribution aka tax? 🤷

ids,
@ids@c.im avatar

@matthewskelton Now steady on... you can't just make large tech companies go around paying tax. It'd be the end of civilisation as we know it! Or something.

Andylongshaw,

@matthewskelton @ids what about an equivalent of performing rights? https://en.m.wikipedia.org/wiki/Performing_rights

Dogzilla,

@matthewskelton Wouldn't that set a precedent that could conceivably destroy all creative work? I'm not sure what the legal difference is between training an AI/ML system on others' work and training a human artist on the works of previous artists (as art schools do every day).

Juro,

@matthewskelton Should artists who get inspired by your art do the same?

I mean, it sort of happens if you charge for access to your IP.

It would be stealing if each of an AI's creations were very similar to the data it was trained on. Then the model would contain that whole dataset, instead of being a summary of it.

Since the whole art process is summarizing, replicating, and remixing other art, creating a model (i.e. an LLM) follows this same process and should be OK.

matthewskelton,
@matthewskelton@mastodon.social avatar

@Juro this is a false comparison, because LLMs, and GPTs in general, can do this at massive scale. That is not true of individual artists.

The automated ingestion of content is wildly different from "being inspired by", imo.

Juro,

@matthewskelton But the rules of looking at other creations to make your own are not limited to art, to maintaining the art form, or to staying in the art regime at all.

I can write texts, draw pictures or do statistical analyses on any art I come across

The training algorithm for a GPT is just a tool helping to do an analysis, and compresses it into a form that makes it easy to construct other examples

The training, model, and inference are not stealing

matthewskelton,
@matthewskelton@mastodon.social avatar

@Juro when done on an industrial scale, automated and without attribution, it is stealing. (IMO)

Juro,

@matthewskelton Okay, you may hold that opinion, but I'd like to understand what makes it stealing once it becomes automated. The other two criteria are not sufficient, and humans do them too:

  • Industrial scale achieved by a human is just "yearlong dedication" or "a big team of humans"
  • Working without attribution is commonplace in art as well. If you 'steal' from enough people, it becomes original and you don't have to name all the people you stole from
matthewskelton,
@matthewskelton@mastodon.social avatar

@Juro the worldwide slurping of publicly accessible digital content by AI engines is so far from the human-scale approach that you cannot be serious.

Pretending these things are "the same" is dehumanizing (imo).

Juro,

@matthewskelton But I, as an artist, am allowed to construct tools that help me with my artistic process, right?

I, as a scientist, can construct tools that help me with my scientific and statistical evaluations, right?

Maybe the source of our disagreement is just the person behind the AI art:

Are you thinking of big corporations, while I am thinking of individuals using and fine-tuning modern tools?

Then we can agree on open-source AI models, like stablediffusion or gptneo

matthewskelton,
@matthewskelton@mastodon.social avatar

@Juro yep, the entity or power behind the use is always important. And the large GPT/LLM systems are owned and run by massive corporations with budgets that are way out of reach for any artist collective.

Individual artists paying a modest subscription to an open AI tool so it can pay IP fees seems healthy, right?

Juro,

@matthewskelton That would be nice, but that's not how open-source AI tools work.

The model's weights can always be accessed by anyone for free, while the developers behind it rely on donations.

That's an important pillar of open source, because it ensures that people from poorer backgrounds are not left behind, while richer people can support the development.

If you charged IP fees to those teams, open-source AI would have no chance to compete with large corporations.

Juro,

@matthewskelton The models are already out there, and they are very good.

If you charged these teams IP fees now, they would simply go bankrupt. Big corporations would gain a large advantage, but the open models would still exist and still be free to use.

Better still, you couldn't differentiate between people who paid the IP fees and people who didn't.

In the end, the free nature of open-source models is what makes them fair.

Juro,

@matthewskelton Instead of clinging to the aged idea of IP, we should push for UBI, to make sure the basic needs, also known as human rights, are covered for everyone.

The concept of intellectual property cannot survive if I have a model that can take a text and change it to a sufficient degree that no one can call it copyright infringement.

This model does not require what you call "art theft", but in its light IP becomes a lost cause.

matthewskelton,
@matthewskelton@mastodon.social avatar

@Juro Okay, but that seems a fair way in the future. In the meantime, we may need some more practical measures.
