jakob,

LLMs are brute-forcing their way through absurd amounts of data to generate, for any given input, an autocomplete output that approximates what a human might say instead.

They lack a few distinct properties of human cognition, language among them, that more brute force alone cannot compensate for, because they can only ever internalize and compute intratextual context.

Incidentally, humans need far less input(!) to learn language, probably because they can contextualize across domains.
