tarkowski,
@tarkowski@101010.pl avatar

. @mweinberg recently wrote a short post titled "Licenses are Not Proxies for Openness in AI Models".

The title basically says it all, and is spot on.

https://michaelweinberg.org/blog/2024/03/26/ntia-open-ai/

Mike writes that as long as we don't have consensus on what "open" means in the #AI space, "any definition of open should require a more complex analysis than simply looking at a license".

Good point! Funnily enough, the European AI Act drafters did exactly what Mike suggests should be avoided: they defined open source AI as a pile of AI stuff released under a "free and open-source license".

(I wrote about it on our blog: https://openfuture.eu/blog/ai-act-fails-to-set-meaningful-dataset-transparency-standards-for-open-source-ai/)

Mike also distinguishes hardware from software. Hardware is more complex, and it is therefore in the hardware space that licensing will not serve as a simple proxy for openness.

I would argue that this point can also be made for software, and for other types of content. Mike is right that licenses grew to be powerful proxies of openness. But there have always been other factors - less visible, and not as easily standardized: for example, collaborative practices, or standards for platforms that support open sharing.

There seems to be a growing sense that we need to look beyond license proxies and identify other factors as core to open frameworks. The #AI space is where such thinking is most visible, but I expect spillover beyond the AI debates.
