obrhoff,

The amazing thing about LLMs is how much knowledge they possess despite their small size. The llama3-8b model, for instance, weighs only 4.7GB yet can still answer questions about almost anything (hallucinations aside).
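A quick back-of-envelope check of those numbers (a sketch, assuming decimal gigabytes and a round 8 billion parameters; the actual model is closer to 8.03B) suggests the 4.7GB file stores roughly 4.7 bits per weight, i.e., a ~4-bit quantized copy rather than full fp16:

```python
# Back-of-envelope: bits per weight implied by a 4.7 GB llama3-8b file.
# Assumptions (not from the thread): decimal GB, exactly 8e9 parameters.
file_bytes = 4.7e9           # 4.7 GB download
n_params = 8.0e9             # "8b" = ~8 billion parameters
bits_per_param = file_bytes * 8 / n_params
print(f"{bits_per_param:.1f} bits per parameter")   # roughly 4.7

# For comparison, an unquantized fp16 copy (2 bytes per weight):
fp16_bytes = n_params * 2
print(f"fp16 size: {fp16_bytes / 1e9:.0f} GB")      # roughly 16 GB
```

So the small download is largely a quantization story: the same weights at 16-bit precision would be over three times larger.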

noplasticshower,
@noplasticshower@zirk.us avatar

@obrhoff being DEAD WRONG is not really a "hallucination"...but your point is well taken. Cramming information into a smaller space is amazing.

When you re-represent and compress information, whatever lives in the long tails of the gradient Gaussians disappears.
