@matthewskelton This is sure to be addressed at some point when content creators including journalism outlets and academic publishing companies start fighting against this massive misappropriation of their intellectual capital.
After all, you've used words here that you didn't coin (how dare you!) to communicate an idea that you would be hard-pressed to attribute other than to say with confidence that it wasn't a so-called original thought.
@matthewskelton by your logic, are you saying that human intelligence can "steal" but the machine-intelligence fruits of our intelligent labours cannot?
I guess what I'm trying to say is that your reference to "stealing" is a very modern idea. Just a few centuries old. For the hundred millennia before that, it may have been called riffing.
@matthewskelton Just for the avoidance of any doubt, I'm not agreeing with you, but I'm not setting out to disagree either. Just trying to see if there are others ways to think this through.
@jchyip as long as the corporations are paying a fair amount for the value, I think all kinds of financial models might work. It's the "value without compensation" which is the stealing bit.
@matthewskelton I think it’s quite likely that due to the huge numbers involved the contribution of value from any one artist (or the amount by which the LLM would be less valuable without it) would be so tiny as to be pointless.
Surely trying to do this would be a massive amount of admin and cost for almost no benefit?
Perhaps there are other ways the companies could recognise the contributions of all those creators - sponsoring young writers and artists or something similar?
@matthewskelton Now steady on... you can't just make large tech companies go around paying tax. It'd be the end of civilisation as we know it! Or something.
@matthewskelton Wouldn't that set a precedent that could conceivably destroy all creative work? I'm not sure what the legal difference is between training an AI/ML system on others' work and training a human artist on the works of previous artists (as art schools do every day).
@matthewskelton Should artists who get inspired by your art do the same?
I mean, it sort of happens if you charge for access to your IP
It would be stealing if each of an AI's creations were very similar to the data it was trained on. Then the model would contain that whole data, instead of being a summary of it.
Since the whole art process is summarizing, replicating and remixing other art, creating a model (i.e. an LLM) follows exactly this process and should be ok
@matthewskelton But the rules of looking at other creations to make your own are not limited to art, or to maintaining the art form, or staying in the art regime at all
I can write texts, draw pictures or do statistical analyses on any art I come across
The training algorithm for a GPT is just a tool that helps do an analysis and compresses it into a form that makes it easy to construct other examples
The training, model, and inference are not stealing
@matthewskelton Okay, you may hold that opinion, but I'd like to understand what makes it stealing once it becomes automated. The other two are not sufficient on their own and are done by humans too
Industrial scale achieved by a human is just "yearlong dedication" or "a big team of humans"
Without attribution is commonplace in art as well. If you 'steal' from enough people, it becomes original and you don't have to name all the people you stole from
@Juro yep, the entity or power behind the use is always important. And the large GPT/LLM systems are owned and run by massive corporations with budgets that are way out of reach for any artist collective.
Individual artists paying a modest subscription to an open AI tool so it can pay IP fees seems healthy, right?
@matthewskelton That would be nice, but that's not how open-source AI tools work
The model's weights can always be accessed by anyone for free, while the developers behind it rely on donations
That's an important pillar of open source, because it makes sure that people from poorer backgrounds are not left behind, while richer people can support the development
If you charged IP fees to those teams, open-source AI would have no chance to compete with large corporations
@matthewskelton The models are already out there, and they are very good.
If you charged these teams IP fees now, they would simply go bankrupt. Big corporations would gain a large advantage, but the open models would still exist and still be free to use
Better still, you couldn't differentiate between people who paid the IP fees and people who didn't
In the end, the free nature of open source models is what makes them fair
@matthewskelton Instead of clinging to the aged idea of IP, we should push for UBI, to make sure the basic needs, also known as human rights, are covered for everyone
The concept of intellectual property cannot survive if I have a model that can take a text and change it to a sufficient degree that no one can call it copyright infringement
This model does not require what you call "art theft", but in light of it, IP becomes a lost cause